A Neuroscience Approach in User Satisfaction Evaluation in Maritime Education

D. Papachristos, N. Nikitakos & M. Lambrou
University of the Aegean, Greece

ABSTRACT: The evaluation, with the use of neuroscience methods and tools, of a student's satisfaction/happiness when using an e-learning system (e-learning platforms, e-games, simulators) poses an important research subject. The present paper presents research conducted in the Marine Training Centre of Piraeus. In particular, this research uses neuroscience tools (a gaze tracker and voice recording with lexical analysis) to investigate the degree of satisfaction of students using an engine room simulator (ERS 5L90MC-L11, Kongsberg 2003 AS), by monitoring the users' eye movements and speech in combination with qualitative and quantitative methods. The ultimate goal of this research is to find and test the critical factors that influence the educational practice and usability of marine e-learning systems and the ability of the marine crew to conduct full-time system control.

http://www.transnav.eu
the International Journal on Marine Navigation and Safety of Sea Transportation
Volume 7, Number 3, September 2013
DOI: 10.12716/1001.07.03.01

1 INTRODUCTION

In Marine Education (ME), the use of non-conventional methods and tools (biometric tools) is a useful contribution to its improvement. ME follows specific education standards (STCW'95) for each specialty (Captain, Engineer) and for each level (A', B', C'). Its scope is the acquisition of basic scientific knowledge and of execution skills (navigation, route plotting, administering the engine, etc.), as well as protecting the ship and crew (safety and environmental protection issues). Specifically, the STCW'95 standard defines three competency levels: management, function and support, and at the same time it defines the related skills. Every skill level specifies the totality of the learning goals, and goal definition is the basic characteristic of training. The simpler competences make up the more complex ones. This hierarchical increase in the level of skill imposes a strict framework on the educator designing lessons in each marine school. The introduction of simulators and other modern training tools raises an important research question: to what degree can they fulfil all the expectations set forth by STCW'95 (IMO, 2003; Papachristos et al., 2012; Tsoukalas et al., 2008)?

We propose a research framework for the educational and usability evaluation of marine e-learning systems that combines a neuroscience approach (biometric tools of gaze tracking and speech recording for measuring emotional user responses, plus lexical analysis) with usability assessment. Certainly, the proposed approach may require further adaptations to accommodate the evaluation of particular interactive e-learning systems. The main elements of the proposed approach include (Papachristos et al., 2012):
1 Registration and interpretation of user emotional states
2 Gaze tracking and interpretation
3 Speech recording and lexical analysis (sentiment processing)
4 Usability assessment questionnaires
5 Wrap-up interviews.
This procedure is a primary effort to research the educational and usability evaluation, with emotion analysis (satisfaction), of the user-students in marine e-learning environments.
2 THEORETICAL BACKGROUND
In the field of psychology, the English word affect is very popular; it usually covers a plethora of concepts such as emotions, moods and preferences. The term emotion tends to be used for the characterization of rather short but intense experiences, while moods and preferences refer to experiences of lower intensity but greater duration. In general, we could note that psychology considers the emotional mechanism a deterministic mechanism that requires a stimulus (cause), incited in the brain via the neural and endocrine (hormonal) systems, with emotion as the response (Malatesta, 2009; Papachristos et al., 2012).
The modern scientific community holds differing views on how the emotional mechanism should be understood. One view is that emotion is defined by the natural reactions caused in the body (sweating, pulse increase, etc.), while other researchers believe it is a purely mental process; there are also hybrid views that define, each to a varying degree, the participation and the manner in which human functions are involved in the emotional experience (Vosniadou, 2001).
Many psychologists have claimed that the only way to interpret the totality of emotions is to suggest that there is a common evolutionary base in the development of facial emotional expressions. But the biological approach cannot explain all the facets of a human's emotional behavior (Vosniadou, 2001).
During the last 25 years, psychology has focused again on the sequence of events involved in the "creation" of an emotion. Zajonc considers that experiencing an emotion often happens before we have time to assess it, while, on the contrary, Lazarus considers that thought precedes the emotional experience, assessing that instantaneous cognitive assessments of situations can happen at the same time as the emotional experience (Lazarus, 1982; Zajonc, 1984). The speed with which we assess a situation is influenced by our previous experiences. Age also seems to influence the creation of emotions. It must be noted that there are also emotions that do not require cognitive processes (thought), for example reactions to loud noises or to seeing a lion. Such emotional reactions can be important for the survival of the species and are related to certain stereotypical facial expressions which have global meaning (Vosniadou, 2001).
Another factor that can be investigated in relation to the emotional experience is language processing. Psychological research into language production, comprehension and development grew mainly after 1960, as a result of linguist N. Chomsky's research on generative grammar (Pinker, 2005). Psycholinguistic research showed that language comprehension and production are influenced not only by factors related to linguistic complexity, but also by the speaker's/listener's existing knowledge of the world around him/her, as well as by the information included in the extra-linguistic environment (Vosniadou, 2001).
Investigating the emotional gravity of the words spoken by a speaker, and determining the speaker's emotional state (current or past) from them, constitutes a state-of-the-art issue. Most of the suggested emotional state categorizations concern the English language. To overcome this problem, studies have been conducted that approach the matter cross-culturally and study the assignment of the categories to various languages. This assignment has conceptual traps, since the manner in which an emotional state is apprehended is influenced by cultural factors as well. In a rather recent cross-cultural study by Fontaine et al. (2007), 144 characteristics of emotional experiences were examined and then categorized according to the following emotional "components": (a) event assessment (arousal), (b) psychophysiological changes, (c) motor expressions, (d) action tendencies, (e) subjective feelings, and (f) emotion regulation.
The international bibliography contains various approaches and techniques (sorting algorithms) for linguistic emotional analysis, based mainly on the existence of word lists or dictionaries with labels of emotional gravity, along with applications in marketing, cinema, the internet, political discourse, etc. (Lambov et al., 2011; Fotopoulou et al., 2009). There are also studies on sorting English and French verbs that express emotions, based on conceptual and structural-syntactical characteristics. For the Greek language there is a study on Greek verbs that express emotions, based on the theoretical framework of "Lexicon-Grammar", which is quite old and does not contain data from real language use; there are also some studies concerning Greek adjectives and verbs that express emotions, and comparisons with other languages (French, Turkish) from the viewpoint of structural-syntactical plus conceptual characteristics. More recent studies in Greek have systematically examined noun structures based on the "Lexicon-Grammar" framework and established conceptual and syntactical criteria for distinguishing and sorting nouns according to the conceptual-syntactical characteristics of the structures in which they appear (Papachristos et al., 2012).
The observation of eye movement, as well as of pupil movement, has been an established method for many years now, and technological developments in both hardware and software have made it more viable as a practical measurement approach. Eye movements are supposed to depict the level of cognitive processing a screen demands and, consequently, how easy or difficult it is to process. Usually, the optical measurement concentrates on the following: the eyes' focus points, the eyes' movement patterns and the pupil's alterations. The measurement targets the identification of computer screen areas that are easy or difficult to understand. In particular, eye movement measurements focus on attention spots, where the eyes remain steady for a while, and on quick movement areas, where the eye moves quickly from one point of interest to another. Moreover, research interest is focused on the interaction of gaze tracking during the presentation of information and content (internet) in a natural environment (Dix et al., 2004; Kotzabasis, 2011).
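As an illustration of how the "attention spots" (fixations) and "quick movement areas" (saccades) mentioned above can be separated in practice, the following is a minimal dispersion-threshold sketch. The sample data, thresholds and function names are our own illustrative assumptions, not part of any tool used in this study:

```python
# Minimal dispersion-threshold fixation detection sketch.
# Gaze samples are (x, y) screen coordinates at a fixed sampling rate.
# The thresholds below are illustrative assumptions, not tool defaults.

def _dispersion(window):
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=25.0, min_length=5):
    """Group consecutive gaze samples into fixations.

    A window of at least `min_length` samples is a fixation while its
    dispersion (width + height of its bounding box, in pixels) stays
    below `max_dispersion`. Everything else is treated as saccade.
    """
    fixations = []
    i = 0
    while i + min_length <= len(samples):
        window = samples[i:i + min_length]
        if _dispersion(window) <= max_dispersion:
            # Grow the window while the samples stay tightly clustered.
            j = i + min_length
            while j < len(samples) and _dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            xs = [p[0] for p in samples[i:j]]
            ys = [p[1] for p in samples[i:j]]
            # Centroid = fixation point; j - i = duration in samples.
            fixations.append((sum(xs) / len(xs), sum(ys) / len(ys), j - i))
            i = j
        else:
            i += 1
    return fixations

# Example: a steady fixation followed by a jump to another point.
gaze = [(100, 100), (102, 101), (99, 103), (101, 100), (100, 102),
        (400, 300), (402, 301), (401, 299), (399, 300), (400, 302)]
print(detect_fixations(gaze))  # two fixations, one around each cluster
```

Real eye-tracking software uses refined variants of this idea (and velocity-based alternatives), but the clustering principle is the same.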
Gaze interaction through eye tracking is an interface technology with great potential. Eye tracking is a technology that provides analytical insights for studying human behavior and visual attention (Duchowski, 2007). Besides that, it is an intuitive human-computer interface that especially enables users with disabilities to interact with a computer (Nacke et al., 2011). Infrared-monitor eye gaze tracking, although limited by restrictions on the user's head movement, frequent calibrations, etc., is an important Human-Computer Interaction (HCI) method (Cheng et al., 2010; Hansen and Qiang, 2010). Measuring the effect of personalization with this method could relate users' actual behaviour in a hypermedia environment to theories that raise the issue of individual preferences and differences (Tsianos et al., 2009). The notion that there are individual differences in eye movement behaviour in information processing has already been supported at a cultural level (Rayner et al., 2007), at the level of gender differences (Mueller et al., 2008), and even in relation to cognitive style (verbal-analytic versus spatial-holistic) (Galin and Ornstein, 1974).
The most common applications for eye tracking today are either in marketing (e.g., Maughan et al., 2007) or in usability research (e.g., Schiessl et al., 2003). Yet using eye trackers as devices for HCI has started to become a focus of research in recent years, and the field is slowly starting to come of age (Cournia et al., 2003; Jacob, 1990). However, the use of eye tracking in digital games is still new (Isokoski and Martin, 2006), as it is for gaze interaction in virtual worlds (Istance et al., 2009) and for gaze visualizations in three-dimensional (3D) environments (Stellmach et al., 2010).
In the field of learning and instruction, eye tracking used to be applied primarily in reading research, with only a few exceptions in other areas such as text and picture comprehension and problem solving (Holsanova et al., 2009; Hannus and Hyona, 1999; Hegarty and Just, 1993; Hyona and Niemi, 1990; Just and Carpenter, 1980; Rayner, 1998; Van Gog and Scheiter, 2010; Verschaffel et al., 1992). However, this has changed over the last years: eye tracking is starting to be applied more often, especially in studies on multimedia learning (Van Gog and Scheiter, 2010). Because eye tracking provides insights into the allocation of visual attention, it is well suited to studying differences in attentional processes evoked by different types of multimedia and multi-representational learning materials (Van Gog and Scheiter, 2010; Holsanova et al., 2009).
3 RESEARCH METHODOLOGY
The research methodology must fulfil all three requirements of cognitive neuroscience: (a) experimental verification, (b) operational definition, and (c) repeatability.

The main purpose of this research activity is the analysis of emotional state and the investigation of the patterns that connect the user's satisfaction/happiness, via eye/head movement and oral text (as the basis for the situation), in the basic dipole: happiness (satisfaction) versus sadness (non-satisfaction).
We use a research protocol called PRAS. It is defined as detecting, recognizing and interpreting emotional information in conjunction with other information created during the execution of a scenario in an electronic marine learning system (simulators or training software). The emotional information comes from the user's emotional state before, during, and after the scenario/exercise. Its structure concerns the following sections (Papachristos et al., 2012):
1 the mood/emotion before the scenario/exercise (oral text),
2 behavioral action (head movement, gaze) during the scenario, and
3 the emotional post-experience satisfaction (oral text).
Measuring the emotional information is realized using the following processes:
1 natural parameters' measurement: movement parameters (head movement, gaze movement) and oral text as text, and
2 registering the user's opinion/viewpoint.
The suggested protocol (Protocol for Research of Affect Situation, PRAS) is comprised of the following sections (Fig. 1) (Papachristos et al., 2012):
Figure 1. Structure of PRAS
Influence Sector: it is based on Action Tendency Theory (concern view) and on Practical Reasoning Theory. This theoretical processing is characterized as a framework for the user's innate stimuli. The influence sector consists of the following measurements, which take place before the scenario execution by way of questionnaires: (a) profile (learning, medical), (b) personality, (c) expectations/interest and (d) personal background (education, professional experience, computer use).
Emotion Measurement Sector: the measurements concern the happiness-sadness (emotion-mood) dipole, in combination with the degree of activation assessed by the user within this framework, i.e., the measurement of dynamics in relation to the stimuli (sound, animation, schemas, etc.) received in total from the software scenario (virtual relationship), considering that the user is always in a core emotional state (core affect), and the specific satisfaction with the scenario and software (evaluation of the educational use of the software and scenario/exercise against the degree of satisfaction of the trainee-user). This adopts the Oatley approach: when the (personal) goals have been achieved there is a sense of joy, while failure is followed by sadness and despair; this is connected with the emotion of satisfaction. At the same time, the natural parameters comprising the protocol's core are registered. These are the visual parameters (head movement, gaze tracking with the "Face Analysis tool") and voice recording (emotional reasoning) (Asteriadis et al., 2009). This is based on the use of tools for recording head movement (distance from the monitor, left-right head movement, left-right head rolling), gaze tracking (x, y coordinates) and voice recording of spoken words (as reasoning of the meta-emotional experience: lexicalization of emotional gravity). Additionally, the researcher records observations related to the physiological and non-physiological attitude of the user (mistakes, time of execution, execution success, the user's psychological state).
Appraisal Sector: in this section, the satisfaction recording takes place, along with commentary related to the day the measurement is taking place and to the experience in total up to that moment, as far as the software tool is concerned, after the experimental conduct of the scenario/exercise (usability); it also covers personal self-evaluation and scenario evaluation (benefits), in combination with the weighted usability assessment tool (DEC SUS Tool) (Brooke, 1996).
Data processing concerns the composition of all the above sectors of PRAS, so that patterns of the natural parameters in relation to emotional states (happiness) and the Satisfaction Scale can be found.
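To make the composition of the three PRAS sectors concrete, a session record can be sketched as a simple data structure. The field names below are our own illustrative choices for readability, not terms from the published protocol:

```python
# Illustrative sketch of a PRAS session record combining the three sectors.
# All field names are assumptions chosen for readability, not protocol terms.
from dataclasses import dataclass, field

@dataclass
class InfluenceSector:            # measured before the scenario (questionnaires)
    profile: dict                 # learning/medical profile
    personality: dict             # e.g. Five-Factor scores
    expectations: str             # expectations/interest
    background: dict              # education, experience, computer use

@dataclass
class EmotionSector:              # registered during the scenario
    head_distance: list = field(default_factory=list)  # distance from monitor
    head_roll: list = field(default_factory=list)      # left-right rolling
    gaze_xy: list = field(default_factory=list)        # (x, y) coordinates
    speech_transcript: str = ""                        # for lexical analysis

@dataclass
class AppraisalSector:            # measured after the scenario
    satisfaction_scenario: int    # 1 (very low) .. 5 (very high)
    satisfaction_simulator: int
    sus_answers: list = field(default_factory=list)    # 10 SUS items

@dataclass
class PRASSession:
    influence: InfluenceSector
    emotion: EmotionSector
    appraisal: AppraisalSector
```

Pattern mining then operates over a collection of such records, relating the natural parameters in the emotion sector to the satisfaction labels in the appraisal sector.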
4 PARTICIPANTS
The first (random) sampling was carried out between May and June 2012, in the Marine Engine System Simulator (MESS) Laboratory of the Marine Training Centre of Piraeus. The sample consisted of 13 professionals (Merchant Marine officers) who were subjected to a specific experimental procedure in the engine room simulator ERS 5L90MC-L11 (video recording of ~23 minutes per student), completed the questionnaires and gave interviews (research methodology).
5 DATA ANALYSIS
The data of the experiment come from three sources:
- questionnaires,
- optical data, and
- interviews (voice recording).
The sample consisted of 13 professionals (Merchant Marine officers, male) who were subjected to a specific experimental procedure (Tab. 1).
Table1.StructureofSample
_______________________________________________
MerchantMarineOfficer
OrderA’B’
(%)(%)
_______________________________________________
Officers38.5(5)*61.5(8)*
Experience
Sum(totalofyears)3746
Mean(years)7.45.75
Max(years)915
Min(years)64
_______________________________________________
*
(frequency)
In the sample's age profile, shown in Table 2, the younger group (ages 24-35) prevails.
Table 2. Sample's age profile
_______________________________________________
Age scale                 24-35      36-45      >45
                          (%)        (%)        (%)
_______________________________________________
Merchant Marine Officers  53.8 (7)*  30.8 (4)*  15.3 (2)*
_______________________________________________
* (frequency)
The sample's medical and personality profiles (Five-Factor model) are shown in Table 3 and Table 4. The personality profile presents homogeneity (high-medium), and the medical profile shows that a proportion of the sample has eye conditions (myopia, astigmatism, etc.).
Table 3. Medical profile
_______________________________________________
Medical Profile           Eye diseases  Eye operation
                          (%)           (%)
_______________________________________________
Merchant Marine Officers  46.1 (6)*     7.6 (1)*
_______________________________________________
* (frequency)
Table 4. Personality profile
_______________________________________________
Merchant Marine Officers (13 male)
                   very high  high   medium  low   very low
                   (%)        (%)    (%)     (%)   (%)
_______________________________________________
Extraversion       -          23     61.5    7.7   -
Agreeableness      23         46.1   30.7    -     -
Conscientiousness  15.4       76.9   7.7     -     -
Neuroticism        -          7.7    38.4    7.7   23
Openness           15.4       38.4   30.7    15.4  -
_______________________________________________
The next table shows the educational and simulation background (Tab. 5).

Table 5. Educational and Simulation background
_______________________________________________
Merchant Marine Officers (13 male)
                        Positive  Negative
                        (%)       (%)
_______________________________________________
Education in Computers  54        46
Simulation experience
 Education              61.5      38.5
 Home                   15.4      84.6
 Job                    54        46
_______________________________________________
The next table shows the results of the motivation model (based on the Vroom model) (Tab. 6).

Table 6. Motivation model results
_______________________________________________
Merchant Marine Officers (13 male)
                            Positive  Negative
                            (%)       (%)
_______________________________________________
Performance-Outcome Expectancy
 Job search                 69        21
 Payment                    38.5      61.5
 Professional development   84.7      15.3
Valence
 Professional value         84.7      15.3
 Social value               7.6       92.4
Effort-Performance Expectancy
 Professional performance   77        23
 Job security               100       -
Interest
 New technologies           100       -
 Educational benefits       100       -
 Personal needs             7.6       92.4
_______________________________________________
The next table shows the results of the evaluation of the training program (marine engine system simulator training) and of the simulator as a software tool (Tab. 7).

Table 7. Training program evaluation
_______________________________________________
Merchant Marine Officers (13 male)
                   very high  high   medium  low   very low
                   (%)        (%)    (%)     (%)   (%)
_______________________________________________
Training Program
 Educational goal  23         54     23      -     -
 Time schedule     38.5       30.7   23.1    7.7   -
 Total assessment  15.4       38.6   23      23    -
Simulator
 Navigation        38.5       46.1   15.4    -     -
 Interface         23         61.6   15.4    -     -
 Multimedia        23         53.9   15.4    7.7   -
_______________________________________________
The next table shows the gradation of satisfaction, on a 5-point scale, with the scenario and the marine simulator (Sat_Scen, Sat_Sim), from the users' answers (Tab. 8).
Table 8. Simulator and Scenario Satisfaction
_______________________________________________
Merchant Marine Officers (13 male)
Satisfaction scale       very high  high   medium  low   very low
                         (%)        (%)    (%)     (%)   (%)
_______________________________________________
Scenario in Engine room  30.7       61.5   7.7     -     -
Simulator ERS            46.1       46.1   7.7     -     -
_______________________________________________
The following tables present the statistical measures of the optical data (face analysis tool) for Gaze (vertical y), Dist (distance from the monitor) and Head Roll, per satisfaction level (simulator & scenario):
Table 9. Gaze tracking parameter (satisfaction: simulator)
_______________________________________________
Merchant Marine Officers (13 male)
        very high  high     medium
        (4 male)   (8 male) (1 male)
_______________________________________________
Satisfaction Simulator
 Mean   6.75       7.8      2.73
 Max    225.06     332.6    137.7
 Min    -252.2     -144.07  84.1
 STDEV  5.2        4.1      17.9
_______________________________________________
Table 10. Dist parameter (satisfaction: simulator)
_______________________________________________
Merchant Marine Officers (13 male)
        very high  high     medium
        (4 male)   (8 male) (1 male)
_______________________________________________
Satisfaction Simulator
 Mean   1.11       1.06     1.03
 Max    22.8       5.26     2.14
 Min    0.05       0.13     0.32
 STDEV  0.1        0.04     0.12
_______________________________________________
Table 11. Head Roll parameter (satisfaction: simulator)
_______________________________________________
Merchant Marine Officers (13 male)
        very high  high     medium
        (4 male)   (8 male) (1 male)
_______________________________________________
Satisfaction Simulator
 Mean   0.58       1.45     -2.4
 Max    89.1       88.9     55.2
 Min    -86.1      -89.08   -28.0
 STDEV  1.7        3.5      7.42
_______________________________________________
Table12.Gazetrackingparameter(satisfactionscenario)
_______________________________________________
MerchantMarineOfficers(13male)
 veryhigh high medium
(4male) (8male) (1male)
_______________________________________________
SatisfactionScenario 
Mean8.287.09 0.21
Max225.06 332.6‐203.4
Min‐252.2‐131.4144.07
STDEV6.62.95 20.8
_______________________________________________
Table 13. Dist parameter (satisfaction: scenario)
_______________________________________________
Merchant Marine Officers (13 male)
        very high  high     medium
        (4 male)   (8 male) (1 male)
_______________________________________________
Satisfaction Scenario
 Mean   1.12       1.07     0.99
 Max    22.8       17.04    2.47
 Min    0.09       0.05     0.17
 STDEV  0.02       0.09     0.16
_______________________________________________
Table14.HeadRollparameter(satisfactionscenario)
_______________________________________________
MerchantMarineOfficers(13male)
veryhigh high medium
(4male) (8male) (1male)
_______________________________________________
SatisfactionScenario 
Mean1.90.49‐2.16
Max89.182.9 88.9
Min‐83.6‐86.1‐89.08
STDEV3.42.44 6.68
_______________________________________________
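The per-group statistics reported in Tables 9-14 (mean, max, min and standard deviation of each optical parameter per satisfaction level) can be reproduced with a short routine like the one below; the readings are invented for illustration:

```python
# Descriptive statistics of an optical parameter grouped by satisfaction
# level, in the style of Tables 9-14. The readings below are invented.
from statistics import mean, stdev

def grouped_stats(readings):
    """readings: dict mapping satisfaction level -> list of parameter values."""
    out = {}
    for level, values in readings.items():
        out[level] = {
            "mean": mean(values),
            "max": max(values),
            "min": min(values),
            "stdev": stdev(values) if len(values) > 1 else 0.0,
        }
    return out

# Invented head-roll readings for two satisfaction groups:
head_roll = {
    "very high": [0.5, 1.2, -0.3, 0.9],
    "high": [1.0, 2.1, 0.8, 1.9, 1.5],
}
stats = grouped_stats(head_roll)
print(stats["very high"]["mean"])  # 0.575
```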
The following table presents the measures of the lexical data (sentiment & opinion analysis):
Table 15. Lexical Analysis
_______________________________________________
Merchant Marine Officers (13 answer texts)
                                   very high  high
_______________________________________________
Satisfaction Simulator
 Using Modifier (1)                83.3%      66.6%
 Using Comparison degree (2)       -          16.6%
 P_top (3)                         1: 16.6%   1: 16.6%
                                   2: 0%      2: 16.6%
                                   0: 83.4%   0: 66.8%
 Total words (all texts), TotN_w   250        129
 Mean (all texts)                  41.6       21.5
 IndexWord_Satisf (4)              0.19       0.12
 IndexWord_NonSatisf (5)           0.02       0.1
Satisfaction Scenario
 Using Modifier (1)                75%        75%
 Using Comparison degree (2)       -          25%
 P_top (3)                         1: 25%     1: 37.5%
                                   2: 25%     2: 12.5%
                                   0: 50%     0: 50%
 Total words (all texts)           181        236
 Mean (all texts)                  45.25      29.5
 IndexWord_Satisf (4)              0.16       0.17
 IndexWord_NonSatisf (5)           -          0.02
_______________________________________________
(1) Lexical phrase or word with sentiment volume
(2) Positive, Comparative, Superlative
(3) Topology of sentiment phrases in the text: 1 = in the first half of the text, 2 = in the second half of the text, 0 = homogeneous across the whole text
(4) IndexWord_Satisf = Σ(W_S+ / N_w)_text_i / TotN_w
(5) IndexWord_NonSatisf = Σ(W_S- / N_w)_text_i / TotN_w
for W_S+, W_S-: Σ of words with sentiment or opinion load per text (positive polarity + or negative polarity -)

These results are based on a Greek lexicon of emotions (Vostantzoglou, 1998, 2nd edition, revised).
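The two footnoted indices can be computed directly from the per-text counts. The sketch below assumes a simple word-list lookup for sentiment-loaded words; the word lists and example answers are invented for illustration (the study itself used a Greek lexicon of emotions):

```python
# Satisfaction word indices from Table 15's footnotes:
#   IndexWord_Satisf    = sum over texts of (W_S+ / N_w), divided by TotN_w
#   IndexWord_NonSatisf = sum over texts of (W_S- / N_w), divided by TotN_w
# where W_S+ / W_S- count sentiment-loaded words per text and N_w is the
# text's word count. The word lists and answers below are invented examples.

POSITIVE = {"good", "satisfying", "excellent", "useful"}
NEGATIVE = {"bad", "confusing", "difficult"}

def word_indices(texts):
    total_words = sum(len(t.split()) for t in texts)   # TotN_w
    idx_pos = idx_neg = 0.0
    for t in texts:
        words = [w.lower().strip(".,") for w in t.split()]
        n_w = len(words)
        idx_pos += sum(w in POSITIVE for w in words) / n_w   # W_S+ / N_w
        idx_neg += sum(w in NEGATIVE for w in words) / n_w   # W_S- / N_w
    return idx_pos / total_words, idx_neg / total_words

answers = ["The simulator was good and useful.",
           "A bit confusing at first but excellent overall."]
pos, neg = word_indices(answers)
```

In the reported data this relation holds as IndexWord_NonSatisf < IndexWord_Satisf for the satisfied groups, which the conclusions pick up.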
The following table presents the results from the System Usability Scale (DEC SUS) usability assessment tool:

Table 16. SUS score results (1)
_______________________________________________
Merchant Marine Officers (13 male)
        full sample  very high  high
_______________________________________________
Satisfaction Simulator
 Mean   73.2         79.1       69.1
 Max    92.5         92.5       82.5
 Min    62.5         62.5       62.5
 STDEV  10.3         10.8       7.3
 Mode   62.5         -          62.5
_______________________________________________
(1) 100-80: high score; 80-60: satisfactory rating; <60: low usability
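The SUS scores in Table 16 follow Brooke's standard scoring rule: each of the 10 items is answered on a 1-5 scale, odd items contribute (answer - 1), even items contribute (5 - answer), and the sum is multiplied by 2.5 to yield a 0-100 score. A direct sketch (the example answer set is invented):

```python
def sus_score(answers):
    """Brooke's SUS score for 10 answers on a 1-5 scale.

    Odd-numbered items are positively worded (score = answer - 1);
    even-numbered items are negatively worded (score = 5 - answer).
    The summed item scores are scaled to 0-100 by multiplying by 2.5.
    """
    assert len(answers) == 10
    total = 0
    for i, a in enumerate(answers, start=1):
        total += (a - 1) if i % 2 == 1 else (5 - a)
    return total * 2.5

# Invented example answer set:
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```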
The Total Satisfaction Index (TSI) is calculated as follows:

TSI = (Sat_Sim + Sat_Scen) / 2   (1)

The scale with the weights of the users' Total Satisfaction Index (TSI) is shown below:

Figure 2. Climax of Total Satisfaction Index

The following table presents the results of the Total Satisfaction Index (TSI):
Table 17. Results of TSI
_______________________________________________
Merchant Marine Officers (13 users)
      Mean  STD  Mode
_______________________________________________
TSI   1.3   0.5  1.5
_______________________________________________
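Equation (1) averages the two weighted satisfaction ratings per user. The weight mapping below is our reading of the scale in Figure 2 (very high = 2 down to very low = -2); it is an assumption, though it reproduces the reported mean TSI of about 1.3 for this sample:

```python
# Total Satisfaction Index (Eq. 1): TSI = (Sat_Sim + Sat_Scen) / 2.
# The weight mapping is assumed from Figure 2's scale (very high = 2
# ... very low = -2); it is consistent with the reported mean TSI ~1.3.
WEIGHTS = {"very high": 2, "high": 1, "medium": 0, "low": -1, "very low": -2}

def tsi(sat_sim, sat_scen):
    """Average of the weighted simulator and scenario satisfaction ratings."""
    return (WEIGHTS[sat_sim] + WEIGHTS[sat_scen]) / 2

# Example: a user rating the simulator 'very high' and the scenario 'high'.
print(tsi("very high", "high"))  # 1.5
```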
Finally, the next figure shows the range of the TSI across the sample:

Figure 3. Total Satisfaction Index range in sample
6 CONCLUSIONS
From the processing of the experimental data so far, it is established that:
- Visual attention (VA) from the "Face Analysis tool" shows:
 - attention growing as scenario satisfaction increases (the mean grows from high to very high) in the Dist parameter (distance from the monitor; >1 means close to the screen);
 - attention growing as scenario satisfaction increases (the mean grows from high to very high) in the Head Roll parameter (rolling of the head-eye angle from the horizontal level; <10 means attention depending on the scenario, >10 means high mobility);
 - attention growing as scenario satisfaction increases (the mean grows from high to very high) in the Gaze tracking parameter (Gaze vertical parameter >1 means viewing the screen).
- Engine room simulator training increases job security and performance, has professional value and helps professional development (motivation model).
- In the lexical analysis, we observe that the total word count of the users' answers depends on satisfaction (the mean of total words grows from high to very high satisfaction), and that IndexWord_NonSatisf < IndexWord_Satisf (from high to very high satisfaction).
- The SUS score has a satisfactory rating, and the score grows from high to very high satisfaction.
- High usability (easy to use, easy to learn).
- Finally, the Total Satisfaction Index (TSI) is high in the sample (mean TSI: 1.3, corresponding to a characterization of 'high').
The connection between all the above elements resulted from the processing of the optical registration data and the users' interviews and questionnaires.
The approach is general in the sense that it can be applied to various types of marine e-learning systems. It is also pluralistic in the sense that it provides the evaluator with complementary sources of data that can reveal important aspects of the user experience during ship control. Certainly, the proposed approach may require further adaptations to accommodate the evaluation of particular interactive systems.
REFERENCES
Asteriadis, S., Tzouveli, P., Karpouzis, K., Kollias, S. 2009. Estimation of behavioral user state based on eye gaze and head pose: application in an e-learning environment, Multimedia Tools and Applications, Springer, Vol. 41, No. 3, February, pp. 469-493.
Brooke, J. 1996. SUS: A "quick and dirty" usability scale. In: Jordan, P. W., Thomas, B., Weerdmeester, B. A., McClelland (eds.), Usability Evaluation in Industry, Taylor & Francis, London, UK, pp. 189-194.
Cheng, D., Zhao, Z., Lu, J., Tu, D. 2010. A Kind of Modelling and Simulating Method for Eye Gaze Tracking HCI System, Proceedings of the 3rd International Congress on Image and Signal Processing (CISP 2010), IEEE, EMB, pp. 511-514.
Cournia, N., Smith, J. D., Duchowski, A. T. 2003. Gaze- vs. hand-based pointing in virtual environments, in: CHI '03 extended abstracts on Human factors in computing systems, ACM Press, Ft. Lauderdale, Florida, USA, pp. 772-773.
Dix, A., Finlay, J., Abowd, G. D., Beale, R. 2004. Human-Computer Interaction, UK: Pearson Education Limited.
Duchowski, A. T. 2007. Eye tracking methodology: Theory and practice, NY: Springer.
Fontaine, J. R., Scherer, K. R., Roesch, E. B., Ellsworth, P. C. 2007. The world of emotions is not two-dimensional, Psychological Science, 18(2), pp. 1050-1057.
Fotopoulou, A., Mini, M., Pantazara, M., Moustaki, A. 2009. "La combinatoire lexicale des noms de sentiments en grec moderne", in Le lexique des emotions, I. Navacova and A. Tutin (eds.), Grenoble: ELLUG.
Galin, D. and Ornstein, R. 1974. Individual Differences in Cognitive Style I: Reflective Eye Movements, Neuropsychologia, Vol. 12, pp. 367-376.
Hannus, M., Hyona, J. 1999. Utilization of illustration during learning of science textbook passages among low- and high-ability children, Contemporary Educational Psychology 24, pp. 95-123.
Hansen, D. W., Qiang, Ji 2010. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze, IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 32, No. 3, pp. 478-500.
Hegarty, M., Just, M. A. 1993. Constructing mental models of machines from text and diagrams, Journal of Memory and Language 32, pp. 717-742.
Holsanova, J., Holmberg, N., Holmqvist, K. 2009. Reading information graphics: the role of spatial contiguity and dual attentional guidance, Applied Cognitive Psychology 23, pp. 1215-1226.
Hyona, J., Niemi, P. 1990. Eye movements during repeated reading of a text, Acta Psychologica 73, pp. 259-280.
IMO International Maritime Organization, 2003. Issues for training seafarers resulting from the implementation of on-board technology, STW 34/INF.6.
Isokoski, P., Martin, P. B. 2006. Eye Tracker Input in First Person Shooter Games, in: Proceedings of the 2nd Conference on Communication by Gaze Interaction (COGAIN 2006): Gazing into the Future, Turin, Italy, pp. 78-81.
Istance, H., Vickers, S., Hyrskykari, A. 2009. Gaze-based interaction with massively multiplayer online games, in: Proceedings of the 27th international conference extended abstracts on Human factors in computing systems, ACM, Boston, MA, USA, pp. 4381-4386.
Jacob, R. 1990. What you look at is what you get: eye movement-based interaction techniques, in: CHI '90: Proceedings of the SIGCHI conference on Human factors in computing systems, ACM, Seattle, Washington, United States, pp. 11-18.
Just, M. A., Carpenter, P. A. 1980. A Theory of reading: From eye fixations to comprehension, Psychological Review 87, pp. 329-355.
Kotzabasis, P. 2011. Human-Computer Interaction: Principles, methods and examples, Athens: Kleidarithmos (in Greek).
Lambov, D., Pais, S., Dias, G. 2011. Merged Agreement Algorithms for Domain Independent Sentiment Analysis, Pacific Association for Computational Linguistics (PACLING 2011), Procedia - Social and Behavioural Sciences, 27, pp. 248-257.
Lazarus, R. S. 1982. Thoughts on the Relation between Emotion and Cognition, American Psychologist, 24, pp. 210-222.
Malatesta, L. 2009. Human-Computer Interaction based on the analysis and synthesis of optical data, PhD Thesis, NTUA, Athens (in Greek).
Maughan, L., Gutnikov, S., Stevens, R. 2007. Like more, look more, look more, like more: The evidence from eye tracking, The Journal of Brand Management 14, pp. 335-342, doi:10.1057/palgrave.bm.2550074.
Mueller, S. C., Jackson, C. P. T. and Skelton, R. W. 2008. Sex Differences in a Virtual Water Maze: An Eye Tracking and Pupillometry Study, Behavioural Brain Research, Vol. 193, pp. 209-215.
Nacke, L. E., Stellmach, S., Sasse, D., Niesenhaus, J., Dachselt, R. 2011. LAIF: A logging and interaction framework for gaze-based interfaces in virtual entertainment environments, Entertainment Computing 2, pp. 265-273.
Papachristos, D., Alafodimos, K., Nikitakos, N. 2012. Emotion Evaluation of Simulation Systems in Educational Practice, Proceedings of the International Conference on E-Learning in the Workplace (ICELW 12), 13-15 June, NY: Kaleidoscope Learning, www.icelw.org.
Pinker, S., Jackendoff, R. 2005. The faculty of language: what's special about it?, Cognition, 95, pp. 201-236.
Rayner, K. 1998. Eye movements in reading and information processing: 20 years of research, Psychological Bulletin 124, pp. 372-422.
Rayner, K., Xingshan, L., Williams, C. C., Kyle, R. C. and Arnold, W. D. 2007. Eye Movements during Information Processing Tasks: Individual Differences and Cultural Effects, Vision Research, Vol. 47, pp. 2714-2726.
Schiessl, M., Duda, S., Tholke, A., Fischer, R. 2003. Eye tracking and its application in usability and media research. "Sonderheft: Blickbewegung", MMI interaktiv Journal, 6.
Stellmach, S., Nacke, L., Dachselt, R. 2010. Advanced gaze visualizations for three-dimensional virtual environments, in: Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA), Austin, Texas, pp. 109-112.
Tsianos, N., Lekkas, Z., Germanakos, P., Mourlas, C., Samaras, G. 2009. An Experimental Assessment of the Use of Cognitive and Affective Factors in Adaptive Educational Hypermedia, IEEE Transactions on Learning Technologies, Vol. 2, No. 3, July-September 2009, pp. 249-258.
Tsoukalas, V., Papachristos, D., Mattheu, E., Tsoumas, N. 2008. Marine Engineers' Training: Educational Assessment of Engine Room Simulators, WMU Journal of Maritime Affairs, Vol. 7, No. 2, pp. 429-448, ISSN 1651-436X; Current Awareness Bulletin, Vol. XX, No. 10, Dec. 2008, IMO Maritime Knowledge Centre, p. 7.
Van Gog, T., Scheiter, K. 2010. Eye tracking as a tool to study and enhance multimedia learning, Learning and Instruction 20, pp. 95-99.
Verschaffel, L., De Corte, E., Pauwels, A. 1992. Solving compare word problems: An eye movement test of Lewis and Mayer's consistency hypothesis, Journal of Educational Psychology 84, pp. 85-94.
Vosniadou, St. 2001. Introduction in Psychology, Vol. I, Athens: Gutenberg (in Greek).
Zajonc, R. B. 1984. On the Primacy of Affect, American Psychologist, 39, pp. 117-123.