Towards a Pluralistic Epistemology: Understanding Human-Technology Interactions in Shipping from Psychological, Sociological and Ecological Perspectives

Y. Man, M. Lundh & S. N. MacKinnon
Chalmers University of Technology, Gothenburg, Sweden

http://www.transnav.eu
the International Journal on Marine Navigation and Safety of Sea Transportation
Volume 12, Number 4, December 2018
DOI: 10.12716/1001.12.04.20

ABSTRACT: In the shipping domain, many innovative technical systems have been designed and developed in the past decades, aiming to enable maritime users to achieve the goals of safety, efficiency and effectiveness. The introduction of advanced technologies into workplaces has also created unprecedented challenges. Human users frequently find themselves in a supporting role serving the technology, being held responsible for automation issues and blamed for "human errors" that sometimes lead to tragic outcomes. These challenges are closely associated with the design and use of technologies. Human-technology interaction has become an important multidisciplinary research topic for shipping. This article reviews theoretical concepts relating to the dimensions of psychology, sociology and ecology in Human-Computer Interaction (HCI) in order to form a deeper understanding of human-technology interactions. The paper also discusses the theoretical constructs' practical relevance by showing how a few cases exemplifying ongoing development sectors in shipping, such as energy efficiency optimisation, supervisory control of autonomous unmanned ships, and ecosystems in engine control rooms, can be understood from these theoretical perspectives. By presenting multidisciplinary understandings of human-technology interaction, this paper aims to derive knowledge pertinent to the methodological approaches and philosophical stances of future maritime human factors and HCI research.

1 INTRODUCTION

1.1 Digitalisation and Automation in Shipping

While safety, economic and environmental concerns have grown considerably in the shipping industry in response to the challenges of growth and transport sustainability demands [1], rapid advances in technologies appear to provide solutions for this competitive industry [2]. Shipping has been involved in the global wave of computerisation in an attempt to enhance the safety, effectiveness and efficiency of sea transportation operations, to save energy and to increase profitability and competitiveness [3-5]. From paper chart and compass in the analog world to all kinds of computerised navigational and monitoring systems in a digital world, shipping has always been undergoing a transition. Today, many automated systems and Information Technology (IT) applications have been designed, developed and deployed in the maritime domain. The rapid advance of technologies is revolutionising the way people work: these tools provide both shipboard and shore-based users unprecedented opportunities to enhance operational practices and access information in pursuit of higher efficiency, effectiveness and safety.

The two keywords that may best characterise the ongoing transition in the shipping domain are digitalisation and automation. According to the Gartner IT glossary, digitalisation is the use of digital technologies to change a business model and provide
new revenue and value-producing opportunities [6]. For example, the introduction of virtual and augmented reality provides new platforms to support training, retrofitting and design solutions [7]; the adoption of cloud computing reveals great opportunities from scalable computation, security improvement and resource management [8]. Increasing interest in, and utilisation of, the big data concept [9, 10] and blockchain [11] creates huge potential for shipping to become more effective and safer. One of the major ongoing international developments is e-Navigation, which proposes to use electronic means to harmonise the "collection, integration, exchange, presentation and analysis of marine information on board and ashore, enhance navigation and related services for safety and security at sea" [12]. Many innovation projects regarding information and communication technology (ICT) have been carried out in the European Union under the umbrella of e-Navigation. As examples of European Union funded programmes, Maritime Unmanned Navigation through Intelligence in Networks (MUNIN) studied the feasibility of autonomous unmanned ships, Sea Traffic Management (STM) focused on efficient data exchange between ships and ports, and EfficienSea2 aimed to create and implement solutions for efficient, safe and sustainable traffic at sea through improved connectivity for ships. It is noteworthy that, although digitalisation appears to have a profound impact on navigational safety, cargo management, ship/terminal integration, customer service, etc., only 30.4% of shipping companies have implemented or are currently implementing a digitalisation strategy, according to the Shipping Industry Survey [13]. At the same time, autonomy points to what the shipping industry wants to achieve in the future [14, 15], as the prevalence of automation onboard has been increasingly desired by shipowners to reduce manning levels in order to cut costs [16]. Besides MUNIN, many other projects regarding autonomous ships have been launched in the past several years [17, 18]. All of these characterise the modern shipping industry as being in a transition phase: a slowly evolving but highly tech-driven industry.
1.2 Challenges in Human-Technology Interactions

However, the way digitalisation and autonomy are being incorporated into the maritime domain is questionable. The rapid introduction of IT solutions without full recognition of the human element and adaptation [4, 19, 20] has been predominantly technology-centric. Technology introduction onboard moves much faster than regulatory development; given this reality, the shipping industry lags behind other safety-critical industries. Rapid technological advancements create unprecedented challenges and gaps: these systems often have little consideration for integration with, or the cognitive capabilities of, their users [21-23]. The lack of equipment standardisation (e.g., in the Engine Control Room) and poor usability have brought about information overload issues [4]. The fast introduction of technology is also influencing the crew's traditional learning experience [24], affecting how workers communicate and coordinate with each other [25, 26]. From a more macroscopic perspective on the digitalisation development strategy, the slow regulatory progression of the International Maritime Organisation (IMO), shipping's highest regulatory body, has also compounded the complexity [27, 28]. The typical reaction to an accident has been training and the changing of procedures, whereas systemic evaluations or proactive policies have rarely been developed [29]. Therefore, what we see today are "human errors" (regarding humans) and automation issues (regarding machines) in human-technology interactions, revealed in many maritime accident reports and studies [30-32]. "Clumsy automation", a form of poor coordination between humans and machines, is commonly believed to account for incidents [33]. What we see is ill-designed technology that makes users deploy and use systems in ways not expected by regulators and manufacturers [34, 35]. The systems may also behave in ways not expected by users, e.g., in the form of "automation surprise" [23, 36], where automated systems act autonomously and users may find it hard to understand the machine's true state based on limited feedback (lack of transparency).

In the maritime domain, shipping, logistics and management are going to be performed in a completely different way in the future as advanced and yet-to-be-described technologies are introduced. These problems and gaps in shipping are largely concerned with human capabilities, fallibility and characteristics, which are highly relevant research areas in human factors or Human-Machine Interaction (HMI) research (Karwowski, 2005).
The concept of HMI can be traced back to the first time a human interacted with a "device" to make sense of its surroundings (e.g., a hammer or a compass). HMI research traditionally focused on human behaviours in interacting with computing technology of some sort [37] or on usability engineering in a dyadic closed-loop human-machine system [38]. Research topics in HMI have drawn significant attention in the last several decades, such as Human-Computer Interaction (HCI) [39-41], Human-Automation Interaction [42-45] and Human-Robot Interaction [46-48]. Spawned from the human factors community (which derives from the problems of designing controls for humans during World War II) and many other older disciplines, HCI, as a field of study, emerged as a focal area of both computer science and applied psychology in the '80s (e.g., the first conference on Human Factors in Computing Systems was held in 1982, which later turned into the annual ACM SIGCHI conference) [40, 49]. Today the principles and perceptions of HCI are influencing the ways we understand the sociotechnical system of shipping. In the white paper of an international project about autonomous shipping funded by the Finnish Funding Agency for Technology and Innovation, it is mentioned that "in addition to the interactions of the various components and subsystems in the technology, human operators and human-technology interaction remain even more important elements in this implementation" [50]. In a textbook of human factors in the maritime domain, a few "examples of HMI problems" on ships are listed, e.g., lack of equipment usability and standardisation, over-reliance on technology, rapid changes in technology, and ignoring human factors in design and development [4]. Nevertheless, how HCI (or human-technology interaction in a more general sense) is perceived and understood in its theoretical senses, and its relevance to the shipping domain, remains to be explored.
1.3 Purpose of the Study

Modern HCI has evolved into a very comprehensive multidisciplinary research field with a plethora of theoretical constructs, frameworks and academic works (Karwowski, 2005), particularly in the schools of psychology, sociology and ecology. Psychology, the study of the human mind and its functions [51], may provide guidelines for developers and verify the usability of systems [40] based on basic cognitive constraints and essential decision-making processes [52]. Sociology, the study of human society including social relationships and social interaction [53], may help us to expand the lens to the socio-cultural aspects of work in the context of technology use [54-57]. Ecology, originating from biological studies dealing with the relations of organisms to one another and to their surroundings [58], may help us to scrutinise fundamental phenomena of human-technology interaction with concerns for human experience, interface design and the ecology of the whole system [59, 60]. This conceptual paper first reviews some important theoretical concepts relative to psychology, sociology and ecology to gain a deeper understanding of the evolving themes of HCI and the nature of human-technology interaction. Then the paper connects these theoretical lenses to a few maritime applications to synthesise the relevant models, theories and knowledge. A few maritime cases exemplifying ongoing development sectors in shipping, such as Energy Efficiency (EE) optimisation, supervisory control of autonomous unmanned ships, and ecosystems in engine control rooms, are then discussed in the light of these theoretical perspectives to explore the impacts of advanced technologies in shipping (digitalisation and automation), experienced gaps and issues, as well as potential design opportunities. By presenting multidisciplinary understandings of human-technology interaction, this paper hopes to derive knowledge pertinent to methodological approaches and philosophical stances as to where future maritime human factors and HCI research is heading.
2 THEORETICAL FRAMEWORK

2.1 Psychological perspective

The fundamental interactive formality between humans and artefacts, "the interaction and communication between human users and a machine, dynamic technical system, via a Human Machine Interface" [61], remains essentially unchanged over the years. Semiotics here is a dyadic framework (i.e., the machine presents signs and humans act upon them) that originated with Saussure [62]. The conventional belief is that the user's mind is the only cognitive substance in the dyadic relationship and thus that meaning can only be structured from the inside of the human agent. This has led HCI or HMI to be fundamentally recognised as an information processing task [63]. Historically, humans were viewed as systems with limited capacity for information processing tasks, in which stimuli compete for resources [64]. The human's cognitive systems and the mechanisms of information processing have been extensively explored via different systematic approaches [40, 63, 65], looking at vision and attention [66-68], mental models [69, 70], memory functions [71], workload [72, 73], situation awareness [74, 75] and other human performance constructs in different human information processing stages [40, 52, 63].
Card, Newell [37] argued that it would be natural for an applied psychology of HCI to be based theoretically on information processing psychology. To some extent the HCI discipline can be seen as "a type of applied cognitive science, trying to apply what is known from science to the design and construction of machines" [76]. With the goal of making the interaction more efficient, a plethora of research has been dedicated to exploring the usability issues of ill-designed computer interfaces in terms of how they failed to support human information processing capabilities or accommodate the human's intrinsic limitations [77-79]. One well-known issue here is "the Gulf of Execution" and "the Gulf of Evaluation" [80], which respectively reflect "the difference between what the user wants to do and what can actually be done using controls that are available" and "the mismatch between the user's intention and expectation and the actual state of the system" [77].
Attention is very important in information processing [52]. Many theories have tried to account for attention and elucidate what attention is: it has been conceptualised as a spotlight that spatially disengages from the current location, moves to the target location and engages at the cue [67]; as a zoom lens that can allocate attention over a variable area so that saccades are directed to the geometric centre of the cues [68, 81]; characterised by a selective filter in the early selection model [64]; related to perceptual memory in the late selection model [82]; developed into hybrid models [83]; or framed by Feature Integration Theory, which assumes that the features of a stimulus are coded into a feature map for visual search tasks [84]. Regardless of the metaphors or models being developed or used, attention has been widely recognised as a selective process that brings stimuli into consciousness or selects parts of visual items for further detailed analysis [52, 85, 86]. The core idea of this selectivity is that it reduces information [87] while providing "energy" to various information processing stages [52], given that the supply of attentional resources is limited [66].
Another notion relative to the information processing paradigm that has drawn significant attention is Situation Awareness (SA), which has seen extensive use and theoretical discussion in the maritime sectors, the aviation industry, military training, teamwork, education and so on [88-90]. There are various definitions and explanations of SA terms and their orientation context [91-93], but Endsley's model has been the most widely referenced SA model, describing SA as an operator's knowledge of the environment at a given point of time [74, 75]. There are three levels of concept in her model, i.e., the perception of the elements in the environment within a volume of time and space (level 1 SA, perception), the comprehension of their meaning (level 2 SA, comprehension) and the projection of their status into the future (level 3 SA, projection/anticipation). SA certainly has some roots in the everyday language used to explain what an operator is not aware of that makes him/her lose the "big picture". A critical assumption of this understanding is that operators must be aware of particular information at critical moments to make critical decisions [94]. This is still the predominant approach to references to SA in accident reports, in which the notion of SA is tightly connected to cognitive aspects such as workload and perceptual factors [95, 96].
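As a loose illustration of the three-level structure (this sketch is not from the cited works; all sensor names, units and numbers are invented), Endsley's levels can be read as successive transformations of raw observations: perceive the elements, comprehend their meaning, project their status forward.

```python
# Hypothetical sketch of Endsley's three SA levels as a pipeline.
# All sensor names, units and thresholds are invented for illustration.

def level1_perception(raw):
    """Level 1 SA: perceive the elements in the environment."""
    return {
        "own_speed_kn": raw["speed_log"],
        "target_range_nm": raw["radar_range"],
        "closing_speed_kn": raw["radar_closing"],
    }

def level2_comprehension(sa):
    """Level 2 SA: comprehend the meaning of the elements."""
    sa["is_closing"] = sa["closing_speed_kn"] > 0
    return sa

def level3_projection(sa, minutes=12):
    """Level 3 SA: project status into the near future (crude linear projection)."""
    future = sa["target_range_nm"] - sa["closing_speed_kn"] * minutes / 60.0
    sa["projected_range_nm"] = max(future, 0.0)
    return sa

raw = {"speed_log": 14.0, "radar_range": 6.0, "radar_closing": 20.0}
sa = level3_projection(level2_comprehension(level1_perception(raw)))
print(sa["projected_range_nm"])  # target range in 12 minutes at constant closing speed
```

The point of the sketch is only the ordering: level 2 consumes level 1's percepts, and level 3 consumes level 2's understanding, mirroring the perception-comprehension-projection chain.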
In the classic information processing cycle proposed by Neisser [65], the human mind creates a cognitive scheme of the world and directs its actions to look for the anticipated aspects of the information. The sampled results from the world in return modify and update the internal cognitive map. The cognitive map is framed as a mental model representing external reality [69], or conceptualised as a result of physiological perception [97], or as a representation of the world and its meanings and knowledge [70]. Endsley [98] believed that the mental model is a systematic, dynamic understanding of how the world works, directing interactive cycles of bottom-up and top-down information processing. Therefore Endsley [74]'s SA model is essentially a model based on the information processing mechanism. The information processing cycle has also inspired much other research regarding SA conceptualisation [89, 99-102]. For example, instead of conceptualising SA as a product, the Distributed SA proposed by Stanton, Stewart [103] describes SA as an emergent system property residing in all of the involved agents, which goes beyond the psychological level.
If we still adopt the psychological school of SA conceptualisation, then sufficient SA is required for decision making [104]. Per traditional cognitive science, rationality is the implicit motivator behind this process. Simon [105] argued that three activities are involved in decision making: humans need to find or identify the situation, then invent, develop and analyse possible alternatives, and finally select a specific path from those available. Seemingly segmented stages have been used to describe this "rational" information handling (i.e., sensory processing, perception, decision selection, response execution), affected by memory and attentional resources [52]. The point is that logically sound decisions are what a rational decision maker seeks, though the degree of rationality can vary [106]. The core assumption about rationality in decision making is that objective data and a formal process of analysis, empowered by the laws of probability, expected utility theory or Bayesian statistics, yield optimal judgements [107]. However, there are many criticisms of the rational decision-making approach, such as cognitive resource limitations and behavioural biases [108]. Heuristics and experience are frequently mentioned as useful approaches that constrain search [63, 109, 110] and enable rapid responses in decision making, which indicates that humans do not necessarily adhere to the classic optimal decision-making process driven by rationality or probability in many situations. Much research focus shifted from laboratory settings to dynamic natural settings in the '80s to understand how people actually make decisions in complex tasks, leading to the emergence of situated cognition [55], distributed cognition [57] and recognition-primed decision making under the framework of naturalistic decision making [110, 111], as examples. One well-known example is that firefighters were found not to build algorithmic strategies or compare alternatives at critical moments; they simply recognised the situation through experience and used adaptive ways to put out fires [110]. This recognition of the situation is seemingly consistent with the proposition of "learning by doing" [112] or the dynamics of assimilation and accommodation [113].
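The expected-utility assumption behind the rational model can be made concrete with a toy calculation (an illustration of the general idea only; the routes, probabilities and utilities below are invented, not taken from the cited works).

```python
# Toy expected-utility calculation. A "rational" decision maker weighs
# each outcome's utility by its probability and picks the alternative
# with the highest expected utility. All numbers are hypothetical.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs for one alternative."""
    return sum(p * u for p, u in outcomes)

# Hypothetical alternatives for a ship's routing decision.
route_a = [(0.9, 100.0), (0.1, -500.0)]  # usually fast, small risk of heavy weather
route_b = [(1.0, 60.0)]                  # slower but certain

best = max([("A", route_a), ("B", route_b)], key=lambda r: expected_utility(r[1]))
print(best[0])  # -> B
```

Even though route A is better nine times out of ten, the weighted calculation prefers the certain route B; the critiques cited above argue that real operators rarely decide by running such a calculation at all.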
If there is a failure of rationality, then errors could occur and propagate like a domino effect following a seemingly linear causal chain. Reason [114]'s "Swiss Cheese Model" is probably one of the most well-known models for understanding accident analysis and human errors. The "defensive planes", or the series of "barriers" (i.e., decision makers, line management, preconditions, productive activities and defences), represented as slices of cheese, illustrate how accidents might occur when a trajectory of accident opportunity through the multiple defences is created. Although the notion of "active failures" triggered by "latent failures", and the queries into organisational factors and management issues [114, 115], show a tendency to go beyond cognitive psychology, Reason's approach may still be considered a linear view of accident trajectories [116]. Decisions made at a given moment usually made perfect sense in that particular context, and only hindsight can reveal their vulnerability [117].
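The linearity the critics point to can be shown with a minimal numerical sketch (the barrier names follow Reason's five defensive planes as listed above; the per-barrier "hole" probabilities are invented for illustration): if barrier failures are treated as independent, an accident trajectory succeeds only when every defence is breached, so its probability is a simple product.

```python
# Hypothetical sketch of the Swiss Cheese Model's linear chain logic.
# Each value is an invented probability that the corresponding barrier
# has a "hole" at the critical moment; independence is assumed.

barriers = {
    "decision makers": 0.05,
    "line management": 0.10,
    "preconditions": 0.20,
    "productive activities": 0.30,
    "defences": 0.02,
}

p_accident = 1.0
for p_hole in barriers.values():
    p_accident *= p_hole  # trajectory must pass through every hole

print(f"{p_accident:.1e}")  # -> 6.0e-06
```

This multiplicative, chain-of-holes reading is precisely what makes the model "linear" in the sense criticised in [116]: it has no place for interactions between barriers or for decisions that were locally sensible.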
2.2 Sociological perspective

Although HCI's origin has been based on cognitive psychology, which tends to study individuals in isolation from the surrounding environment and context, there has been a growing consensus in academia since the '80s that the cognitive paradigm has its own limitations and that there should be more room for social and contextual orientation [40]. Activity Theory (AT) provides an alternative for studying human consciousness based upon the anthropological/psychological theories of Vygotsky [118] and Leont'ev [119], which concentrate on the interactions between humans and artefacts in natural everyday life settings [54, 120]. In AT, consciousness is reflected and manifested by what we do in the social environment [54]. The subjects and objects are people and entities developed in culture and are essentially social; therefore social activities are chosen as the unit of analysis. This was believed to be a radical deviation from the typical empirical studies in psychology that at the time focused on subjective and objective phenomena in observations and controlled experiments [54]. If the classic analytical approach is about studying the subjects and objects separately in an attempt to find some relationship in between, then Vygotsky's theories are looking at the interactions, or the "acting of the world". These concepts were further expanded into AT by Aleksey Leontiev, who focused on the evolution of mind and introduced the concept of "activity" as an analytical tool to understand the fundamental subject-object interaction [121]. The basic representation of activity is the existence of a subject (a person or a group), an object (an objective that motivates the activity) and the purposeful interactions between them [119]. Vygotsky and Leontiev's research has shaped a new perspective in which activities are not treated as a collection of linear movements but should be analysed through a dynamic lens grounded in cultural and historical developments. Later the scope was further expanded: Engeström [122] used the notion of "community" and its relations to subject and object to illustrate the relationship between individual and group (e.g., social norms, culture, rules, conventions, etc.) and the relationship between group and organisation (e.g., division of labour).
These studies on collective activities and socio-cultural perspectives have significantly enriched the content and scope of AT, enabling it to be a tool for understanding problems in an organisational context [123]. Exclusively focusing on information processing tasks in many real-life design projects is meaningless if researchers ignore the gaps between organisations and system design. AT can be helpful for considering all aspects of activity, from motivation to operations, from the physical to the social conditions of an activity, to address issues in organisational change and systems design [123]. Two notions are central to this evolution of HCI. First is context, which is constituted through the enactment of an activity involving people and artefacts; context is both something internal (e.g., motivation) and something external (e.g., other people, artefacts, environment, settings) [124]. Second is the notion of mediation [125, 126]. Artefacts can shape how we interact with the world and they may also represent how we understand the world. Culturally developed artefacts may become parts of what humans are [127, 128]. Technologies are "the fundamental mediators of purposeful human actions that related human beings to the immediately present objective world and to human culture and history" [54]. The subject uses technological artefacts in a certain context with attentions and motives, and the artefacts' role is to mediate the relationship between the subject and the object of that activity. By considering the human use of technology within a much wider context of human interaction with the world, the sociological perspective substantially expands the scope of classic information-processing-based HCI [54, 125] (Table 1).
The meanings of a behaviour in an activity are embodied in the use of mediating artefacts in real-world settings. This can be well explained by the approaches of situated action [55, 129] and distributed cognition [57, 130], which advocate the value of studying actual behaviours in real-life settings and emphasise the importance of context. Situated action is "the activity of persons-acting in setting" [56]. It argues that the context can influence the activity, so people can improvise and innovate based on the specific situation, whereas the traditional information processing paradigm asserts that problem solving is a process characterised by rationality [55, 56]. For example, learning at work is deemed a culturally and socially situated activity [131]. In Lave and Wenger [129]'s concept of communities of practice, groups of workers share a common concern and learn how to improve their ways of doing as the interaction between and within the groups proceeds on a regular basis. Knowledge development is seen as a collective and collaborative achievement in communities of practice [129, 132, 133]. The emphasis is that knowledge development is achieved by increased participation, which refers to the process in which a "newcomer" immerses himself or herself in the socio-cultural practices of a community, so that his/her competence grows as he/she becomes more knowledgeably skilful through more interactions [129]. The characteristics of situatedness can breed collaborations and innovations through this increased participation [134, 135]. Distributed cognition advocates directing the analysis away from individual properties or knowledge to a system level, such as the distributed collection of people and artefacts and the functional relationships of the system [130], which is also opposed to the classic view of the information processing paradigm.
2.3 Ecological perspective

The limitation of the man-machine dichotomy is that, within the traditional information processing paradigm, it usually treats the elements in isolation without considering the human-environment relationship [136]. How a person perceives the environment and forms a representation of the world has suggested the value of ecological perspectives in psychology [58, 137, 138]. Gibson [58] developed an ecological physics to describe a worldview for studying perceptual experience that is no longer independent of an observer, as opposed to classical physics [139]. The ecological approach asserts that the laws relating the observer to the environment are tightly connected to functional aspects of the environment, i.e., affordances [139]. Gibsonian concepts contend that the behaviour of the human is inseparable from the work domain, which is a radical departure from the traditional views of psychology as a study of the human organism [140].
Table 1. Traditional HCI with the information-processing-based psychological approach vs. modern HCI with the social-cultural approach.
_______________________________________________
                 Traditional HCI      Modern HCI
_______________________________________________
Dominating       Psychological        Social and cultural
factors          factors              factors
Unit of          Low-level analysis   High-level analysis on
analysis         on user interface    meaningful activity
                 interaction
Context          User-Tool            User-Tool-Environment
Main focus       Tasks (typically     Mediating artefacts in the
                 individual)          subject-object relationship
                                      (typically collective)
Methods          Laboratory studies   Ethnographic studies
                                      focusing on practices in
                                      real life
Approaches       Normative, such as   Descriptive, such as
                 task analysis        Activity Theory
Inclusive        Affordance, etc.     Computer Supported
notion                                Cooperative Work (CSCW),
                                      etc.
_______________________________________________
One well-known ecological approach is Ecological Interface Design (EID), which was proposed to guide interface design for complex sociotechnical systems [141-143]. A dominating factor in undesired human performance was found to be a missing functional relationship of the controlled process, such that the interface provided an incomplete problem representation [141, 144]. This is essentially a question of 1) how to find a suitable language to describe the complexity of the domain, to reveal the constraints within it and to unravel the intricate relationships of the variables in the controlled process [143]; and 2) how the interface can communicate with the operators [143]. EID proposes to create a mapping relationship between the invariants of the functional systems in the work domain and the interfaces on the display, in order to make the abstract properties of the controlled processes visible to the operators [60, 141]. Such an ecological perspective expands the scope of analysis to the work domain, forming a triadic HCI model (human-interface-ecology) to address the concerns of context and the functional constraints of the work ecology [62, 145]. It suggests that good design cannot be achieved without adequate knowledge and a thorough understanding of the domain [60, 146]. The ecological approach promises a basic but important foundation that allows us to understand sociotechnical systems via a human-environment system perspective [147]. For example, much research regarding Cognitive Work Analysis (CWA) has emerged in recent years [148-153] to identify technological and organisational requirements and to model intrinsic work constraints to inform design [154].
In addition to interface design, ecological insights have evolved modern HCI studies by encouraging research to "think big" (i.e., to take a systems perspective to understand the human-automation relationship and focus on the global eco-system) [60, 155-158]. The Human-Automation System considers that "human operators are intermittently programming and receiving information from a computer that interconnects through artificial sensors and effectors to the controlled process or task environment" [159]. Over the past four decades, human supervision of automated systems has essentially been developing a formality of coupling and communication between humans and machines [159, 160]. While automation has great advantages for quality control and performance efficiency in handling routine tasks, it might provide the least help to human operators solving problems when unexpected events occur. This is known as the "ironies of automation" [161]. Woods [162] also described automated systems as "strong, silent, clumsy, and difficult to direct". There is a considerable research literature concerning automation issues and their connection to system effectiveness, such as the strengths and weaknesses of humans and machines (see Table 2). Although many of the cited works focus on the psychological processes impacted by the introduction of automation, some of them already appreciated the systems perspective for understanding automation issues.
With the trend towards using the systems perspective to understand the human-automation relationship, some taxonomies proposed to describe how tasks are distributed in a human-automation system, such as the concepts of level of automation (LOA) or degree of automation (DOA) [45, 174-177], have been criticised for encouraging reductive thinking [173] and for providing little help in addressing the need to make humans and machines work together to accomplish complex tasks [23]. Automated systems are becoming more capable and truly autonomous, but unexpected events and automation failures can hardly be eradicated, representing huge complexities in the actual field [178]. There is a growing need for the human-automation system to work in teams with common goals [23, 179]. In a human-automation system, what is mostly required for improving effectiveness is not necessarily enhancing each agent's capability, but rather their integration in collaboration [23, 157] in sociotechnical systems.
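To make the criticised concept concrete, an LOA-style taxonomy is essentially an ordinal scale of who does what. The sketch below is a condensed paraphrase of Sheridan-style scales; the cited taxonomies differ in their number of levels and exact wording, so the names here are illustrative only.

```python
from enum import IntEnum

# Condensed, paraphrased level-of-automation scale (illustrative only;
# the taxonomies cited in the text use more levels and different wording).
class LevelOfAutomation(IntEnum):
    MANUAL = 1      # human does everything
    ADVISORY = 2    # computer suggests alternatives
    CONSENT = 3     # computer acts only if the human approves
    VETO = 4        # computer acts unless the human vetoes in time
    AUTONOMOUS = 5  # computer acts on its own, possibly without informing

def needs_human_approval(loa: LevelOfAutomation) -> bool:
    """The kind of reductive either/or question such scales invite."""
    return loa <= LevelOfAutomation.CONSENT

print(needs_human_approval(LevelOfAutomation.VETO))  # -> False
```

The one-dimensional ordering is exactly what the critics object to: it frames the human-machine relationship as a single slider of authority rather than as coordination within a team pursuing common goals.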
Table 2. Studies that describe automation issues.
__________________________________________________________________________________________________
Study      Context       Negative Outcomes                               Primary Analytical Approach
__________________________________________________________________________________________________
[163]      Aviation      "Physical isolation" (isolated from the        Psychological perspective
                         physical structure of the airplane and ship)
                         and "mental isolation" (isolated from the
                         system state)
[44, 104]  Aviation      Low situation awareness, errant mental         Psychological perspective
                         models and out-of-the-loop syndrome
[164, 165] Aviation and  Automation bias and complacency (the           Focusing on the cognitive processes
           Maritime      operator's purported behaviour of not          involved (psychological perspective)
                         conducting necessary system checks but
                         assuming "all was well" while the dangerous
                         situation actually evolves); misuse or
                         disuse of automation
[166, 167] Aviation      Trust towards automation, automation           Psychological perspective
                         complacency
[168-171]  Aviation      "Error of omission" (the operator failed to    Focusing on the psychological impact
                         respond, or delayed responding, to the         of automation on human operators
                         system's irregularities) and "error of         (psychological perspective)
                         commission" (the operator trusted the
                         prominent parameters on one display despite
                         contradictory information on another) of
                         automation bias
[172]      General       Over- and under-reliance or trust on           Analysed from organisational,
                         automation                                     sociological, interpersonal,
                                                                        psychological and neurological
                                                                        perspectives
[173]      General       Over- and under-reliance or trust on           Analysed from the system's perspective
                         automation (over-trust results from high       (the system is modelled with
                         self-directedness and low self-sufficiency,    "self-sufficiency" and
                         while under-trust results from high            "self-directedness", freedom from
                         self-sufficiency and low self-directedness)    outside control)
__________________________________________________________________________________________________
It has been observed that in many accidents where safety was obviously compromised, human error has been considered one dominating factor [31, 180, 181], understood as an individual's or a team's behaviours exceeding certain limits of a system's threshold or deviating from the expected norms. The focus on human fallibility [114] or individual limitations makes it incapable of understanding the system in holistic terms [117, 182]. Based on the ecological approach, variability is manifested in the process during which individuals express their own degrees of freedom to adapt to local constraints [183]. Safety depends on the ability of the system to remain within the boundaries [116] while the system might be drifting all the time [158]. The drift is not because of operators' evil desires, inadequate vigilance or knowledge; instead, the deviation becomes part of the norm in high-risk operations. Risks could accumulate during the "incubation period", a period in which the incremental changes that may later contribute to a system-wide collapse go unnoticed [184, 185]. On the one hand, this shows the importance of building adaptive systems that allow the operators to respond flexibly within an increased margin of manoeuvrability [186]. On the other hand, it suggests shifting the focus from individual capabilities to how the system functions as a whole. It is no longer just about human-machine interaction but about couplings between human, technology and ecology.
3 TOWARDS A PLURALISTIC EPISTEMOLOGY IN MARITIME HUMAN FACTORS RESEARCH
This section will discuss the aforementioned theoretical constructs' practical relevance by presenting a few applied maritime projects. The projects include shore-based control centre development in the autonomous unmanned ship project (Maritime Unmanned Navigation through Intelligence in Networks, MUNIN), a tablet-based service for bridge opening information coordination (GOTRIS, Göta älv River Information Services), an Energy Efficiency (EE) optimisation project, and a maritime connectivity platform (EfficienSea2 project). The projects had their own specific contexts but they all involved advanced digitalisation and/or automation development and the human element in complex maritime sociotechnical systems. They are development examples of a few very representative sectors in shipping, such as EE, autonomous shipping, and digitalisation of tools and equipment used onboard. The projects were highly technology-driven and developed mainly by people with technical backgrounds, who usually are more tech-savvy and less concerned with human-machine interaction. Perceiving human-technology interaction through different lenses may set up an important foundation for reflecting on and synthesising links between the required disciplines in pursuit of a coherent whole and a pluralistic epistemology.
3.1 Supervisory control of autonomous unmanned ships

The shipping industry is at full speed towards full autonomy and a high degree of digitalisation [187]. MUNIN is a European Union project that examines the feasibility of autonomous unmanned vessels and their automation governance from shore-based facilities during intercontinental deep-sea voyages [188]. In the MUNIN paradigm, an autonomous unmanned dry bulk carrier is controlled by an automated autonomous ship controller that is concurrently monitored by an operator at a Shore Control Centre (SCC). A data communication channel connects the SCC control system and the controlled process in the autonomous unmanned vessel, forming an essentially remote supervisory control system [159].
One observed human factors issue is that the participant might fail to attend to new alarms if he/she got "trapped" in dealing with an existing alarm [15, 189]. This is exactly what Endsley described as attentional tunnelling, where operators "lock in on certain aspects or features of the environment…will either intentionally or inadvertently drop their scanning behaviour" [104], which is the most common situation leading to an SA failure [190]. Although adjacent phenomena such as information overload can influence attention [191-193], to explore the possible underlying causal factors we need to go deeper into attention's role in information processing [52] and understand how different mechanisms fundamentally influence the allocation of selective attention [194, 195]. During the diagnostic process, the operator's goal-directed attention takes the top-down form, depending on the task goals. Mental models play a crucial role in directing attention because they provide a critical mechanism to interpret the significance of information and make sense of the world [70]. In the SCC, all participants were experienced master mariners, but they had never worked as SCC operators monitoring and controlling multiple vessels prior to the experimental scenario runs. The dominant mindset is to maintain the safety of one's own vessel during navigational activities. With six vessels to be managed, the participants had reasons to be stressed in this unfamiliar situation, which is usually associated with selective attention [142].
The MUNIN studies also suggest that reduced information and transparency in the distributed context can create out-of-the-loop syndrome, introduce automation bias and undermine SA significantly, even when highly reliable technical components are utilised [15]. When a navigator conducts shiphandling on a ship bridge, the "gut feeling" is of paramount importance for the navigator to understand the ship status and subtle dynamics within the environment in a more efficient way compared to traditional electronic displays [196]. For instance, the kinetic movement and vibrations could imply both the internal ship status (e.g., fully loaded cargo) and external environmental effects on the manoeuvrability of the vessel. Being present onboard allows the subjects to actively pay attention to the dynamics to develop sufficient SA. However, it was observed in the MUNIN project that the operators could only rely on the information on the display: the SCC's interface design did not support the human operators well in the early detection of irregularities [15, 189].
The common problems accompanying SA degradation, such as "vision tunnelling", "out of the loop" and "errant mental models", were identified in the supervisory control of autonomous ships in the MUNIN project. Human limitations are the central issues. This indicates that the design of future supervisory control systems needs to explicitly consider how to match the information processing of the computer to the mental decision processes of an operator [142]. That is to say, HMI design needs to concern itself with the cognitive attributes of the operator and communicate the results to the operator in a form that is compatible with the operator's perception-action cycle and decision-making strategy [104, 142, 190]. More importantly, the design must consider how the interfaces can accommodate contextual change: not only reduce the visual demands of displays, but also support the operator's adaptive performance in dynamic task allocation [197]. The SA requirements analysis as a part of SA-oriented design [104] may be appreciated for its value in coupling human capabilities/limitations and interface design to generate concrete design guidelines and solutions. For example, a specific design instance could be making the manoeuvrability of the vessel salient as cues at the proper time as the dynamic situation evolves, so as to improve the operators' SA, which could become a hypothesis for future laboratory studies.
However, the findings are not bound to psychological perspectives but reach sociological and ecological dimensions as well. The organisational hierarchy and regulations caused critical problems in the work at the SCC [15, 189]. Based on the current chain of command in an unchanged organisational structure, the captain (a role that we assume still exists in the SCC) became the weakest and most vulnerable link in the team-SA chain when he had difficulty developing SA about the situation. The organisational factors and the "COLREGs for unmanned ships" (i.e., the International Regulations for Preventing Collisions at Sea as the "rules of the road" at sea) were relatively downplayed in this technology-centric project. The participants simply got confused and took different navigational actions when facing the same situation of identifying an unknown object at sea [15]. The vacuum of legal and regulatory policies has created a huge gap for the development of the concept of autonomous unmanned shipping and remote control, as manifested by the MUNIN project.
In MUNIN, the shore-based bridge system is prone to serve "navigational mental models" under the guidance of current COLREGs regulations. It simply assumes all SCC operators have navigation backgrounds. Therefore, the SCC supervisory control systems were configured in an "old wine in a new bottle" fashion in the project. This design approach may be very efficient in technical system development, directly moving the bridge to the shore and turning navigators into computer system operators, but it ignores the changes in the context and in the work itself. Besides the sociological perspective, what apparently is missing is an ecological query into the nature of this new work and the competence required. A perfect airplane pilot does not make a perfect air traffic controller, as they receive different training suited to their work. As more projects are established and developed in the area of autonomous shipping [14], just as Porathe [198] wondered, do we need a navigating navigator onboard or a monitoring operator ashore? These concerns are beyond the considerations of a pure cognitive approach; they require a multidisciplinary approach to fathom the emerging issues in the work domain.
3.2 Energy efficiency optimisation onboard

A predominant direction in the shipping industry is to focus on the functional development of technological artefacts and their relevance to human cognitive limitations [199-204], such as examining the operator's SA performance [205, 206] or a usability study in a laboratory environment [207]. Although these constructs are useful to describe the information processing stages and to ground design solutions based on this "cognitivist approach" [154], they might be incapable of explaining what actually happens in the field. Here we present a case that goes beyond the "cognitivist approach". It takes a sociocultural approach to understand the issues in the maritime EE sector.
A project regarding EE optimisation onboard ships was conducted [208-210]. The project concerns a state-of-the-art performance monitoring system called ETApilot, which had been installed on modern ferry vessels to aggregate huge amounts of energy-consumption-related data to supposedly inform the crews about EE. Yet it was found that this decision-making support tool was "not used by the crew members" [210]. Some navigators claimed that by disconnecting ETApilot and navigating manually they could contribute more fuel saving [208]. In addition, there is a social boundary between the engine and bridge departments when it comes to sharing knowledge with each other. For example, the engineers did have many thoughts on EE and exchanged ideas over lunch, but they seldom spoke to the bridge about their opinions [208]. In Kataria and Holder [211]'s study, the division between the engine and deck was framed as a "huge Berlin wall".
In order to investigate if and how the identified gaps in the EE practices can be mitigated, and to explore opportunities for interaction design, a focus group study was conducted [209]. The study examined the practitioners' interactivity towards EE as well as explored the potential of this EE monitoring tool. The results suggested that building common ground is a crucial step towards joint activity, as the engineers and navigators must understand each other's concerns in order to plan adaptively and perform efficiently for improved EE. Knowledge sharing for a mutual understanding onboard ships is very crucial. The findings also suggest there is an opportunity for interaction design [209]: by taking the distributed cognition perspective, the tool may shape a discussion space between the bridge and the Engine Control Room (ECR) to facilitate their collaborative learning activities. It employed a social-cultural perspective to perceive human-technology interaction. Collaborative learning can provide opportunities to improve, evolve, reinforce, and even innovate practices [134] and invite mutual learning as a collectively shaped activity [132]. In this specific project, it was found that the post-journey reflection among the ship's crews is one practice example yet to be undertaken. Interaction design in such a context may likely influence communication and the existing social boundaries between different divisions of labour in an organisational context. This means that the goal of interaction design then becomes to buttress communication and the participation of practitioners, i.e., "shaping a communication space" [212]. If the energy monitoring tool could better evaluate EE performance, the engineers and navigators may be able to go beyond the social boundary and have some concrete materials to address each other's concerns. Informed by Vygotsky [118]'s theory, our cognition is distributed and our knowledge is maintained and improved by such interaction. Learning could be conceptualised as situated action that is inherently integrated with human activities in a certain context [129, 134], so navigators could obtain more knowledge about energy consumption from engineers and engineers could understand navigators' concerns in ship manoeuvring. This collaborative learning may lead both departments to innovate together to improve their future performance in joint activities.
A social-cultural approach should be appreciated to understand activities among communities of practice and human-technology interaction. Employing technologies to facilitate knowledge mobilisation and social interaction is a key aspect of interaction design in complex sociotechnical systems [132], which has not been addressed much in maritime human factors research or industrial development. Another article based on the same energy efficiency project [187] has extensively discussed learning and knowledge mobilisation in maritime activities and their connections to future maritime development. The social-cultural perspective can enable us to conceptualise knowledge from a new perspective, i.e., as "the result of everyday interaction" [213], in contrast with the traditional view of treating knowledge as merely an object. With this perspective, interaction design is associated with knowledge transfer per se.
The human-technology interaction issues in the EE context are not only relevant to sociological constructs but also to ecological concerns. One major problem with many shipping organisations is that management does not have sufficient knowledge or an effective monitoring mechanism [214]. The fuel management system used onboard serves as one crucial medium to link design and management. With data-driven decision-support/post-voyage analytical systems, not only could the ship's crew have concrete platforms to self-evaluate performance, but the management could also have the possibility to see what is really going on and prepare any organisational adaptation. Big data is essentially concerned with new ways of management [10, 215]. This is to probe the values of data-based management practices in the newly emerging big-data shipping ecology. From the ecological perspective, the context of the maritime sociotechnical systems will be influenced dramatically by artificial intelligence and advanced intelligent applications. When the local knowledge (e.g., how to conduct eco-driving on one ship) is institutionalised with proper organisational support mediated by technologies, what emerges is a collaborative synergy between humans (practitioners at the sharp end and the management team at the blunt end) and technologies (data-driven decision support systems) on a global level. The impact can even reach the industry and society level, e.g., exploring new approaches for the IMO to optimise regulations and policies regarding industrial EE practices in a more adaptive manner, or possibilities to optimise standards like the Ship Energy Efficiency Management Plan (SEEMP), in which problems related to management system standards have been identified [216].
3.3 Managing unruly technologies in the ECR

Today much attention has been paid to technological possibilities to address human limitations or satisfy local needs in technology-driven projects. These projects largely ignored what kind of work and ecosystem they would evolve towards. There was a case study regarding the ecosystem development in ships' ECRs [217]. In ECRs, although the functionality of overview is found to be important for monitoring tasks, the displays and controls are increasingly distributed [28]. This is partly because each manufacturer has its own standards regarding design; none would normally be concerned about interoperability or the user experience of practitioners working with dozens of heterogeneous products and services onboard. With more technologies introduced to ECRs, local problems might get solved but the overarching ecology is becoming more complex. The situation is wracked not only with technical hardship but also with organisational factors, compounded by the regulatory vacuum which allows a "wild" growth of the intertwined ecosystems in the ECRs [27, 217].
The paper suggests the importance of holistic thinking about the shipping industry's ecology (such as the overall architecture) [217]. This is not about how we tackle some usability problem or introduce some "intelligent" IT applications locally, but how we should perceive the issue globally (within an organisation, across organisations, or even in a larger context). A global-level solution was mentioned: the Maritime Connectivity Platform (MCP, formerly known as the Maritime Cloud) developed in the EU project "EfficienSea 2" [218] in the e-Navigation framework [12]. With more technological applications and services being created, the infrastructure development of the MCP or Maritime Cloud aims to connect all maritime stakeholders for information exchange and shape a more sustainable ecosystem in shipping [219, 220]. Although the MCP is ad hoc for ship bridges, the engine and bridge departments share the same problem that growing technologies need to be managed at a higher level in the infrastructure. The key to obtaining a cross-platform overview with high consistency, accessibility, readability and discoverability lies in interoperability and standardisation. By introducing the MCP, there would be an "app store" for various kinds of "apps" from a wide scope of service providers and manufacturers. Although the MCP directly concerns the development work in the back end, i.e., how we manage unruly technologies with the design, deployment and governance of standardisation, such efforts will also likely influence the user experience in the front end.
What matters most might be not a solution as such, but the need to take a holistic systems perspective to understand human-technology-ecology issues in shipping. Ship engineers set up new screens or put up new post-it notes in the ECR to make local adaptations to meet their emerging needs. The shipping industry seems to be developing predominantly in this fashion, with more technical solutions being introduced onboard to adapt to individual local circumstances. Human adaptations actually manifest themselves in error mechanisms: errors associated with adaptation are in nature a behavioural process of safety boundary seeking, thus adaptations or human errors cannot be eliminated per se [59, 221]. That is to say, local adaptations might also be creating information management problems and leading to human errors at the same time. A critical point in the systems perspective is that things go right in the same way as things go wrong [222, 223]. "The technological flexibilities that simply create burdens on the practitioners" could prevail over "the technological flexibilities that are used to increase the range of practitioner adaptive response to the variability resident in the field of activity" [224].

For the tech-driven shipping industry, it is an ecological lesson for us to reflect upon what this industry is evolving towards and to find ways to manage unruly technologies for human use. It is important to understand how this shipping system is "transforming", or in Dekker [158]'s words, "adapting" or "drifting into failure" in the specific context (i.e., the slow incremental process during which small deviations that are usually taken for granted as "norms" reverberate, proliferate, and propagate through the web of complex interactions). The situation in ECRs seems to be a manifestation of successful local adaptations, but local adaptiveness can lead to an "illusion of assistance" or "miscalibration" of the dynamic situation, as it might lead the decision makers to ignore the maladaptiveness on a global scale [185]. The reflections from the ecology study of the ECR accentuate the importance of having a holistic view and systems thinking in terms of human-technology interaction.
Complexities are the result of a large problem space, large numbers of interacting elements, social-cultural perspectives, heterogeneous perspectives, distributed and dynamic processes, nonlinear interactions, and highly coupled elements with various interactions (e.g., the elements evolve with one another and with the environment) that are hard to forecast or predict [154, 225-227]. With the prevalence of imperfect automation today and maybe in the far future, the more exact the decisions made for the user, the higher the risk of automation bias and the more "human errors" are introduced. It is important to integrate human and technology into a sociotechnical system, in which their capabilities supplement each other to provide improved system performance: "effectiveness in sociotechnical systems will often depend on whether the technologies function as collaborative system team players" [157]. The most valuable ecological perspective is to consider the system as a whole (human-technology-ecology), i.e., shifting from structuralism, which primarily focuses on what happens inside an agent, to functionalism, which primarily focuses on what happens inside a system.
3.4 Towards a pluralistic epistemology

Some of the maritime study cases presented in this paper adopt the psychological school of SA interpretation to "zoom in", while other studies employ social-cultural and ecological lenses to "zoom out". To some extent this constitutes a dialectical dialogue between somewhat contradictory theories and views, between mind and matter, between the old school and new school of the HCI research paradigm introduced in section 2. Here we can take a more philosophical attitude to understand the "controversy" and the value of pluralism in research on human-technology interaction in general.
The debate on metaphysical theories or ontologies informs us that, actually, there can be multiple ways of understanding being. Materialism advocates that matter exists independently of thoughts [228]. The world is out there waiting to be "discovered", and the observer has the possibility to take an objective distance between him/herself and the observed phenomena so that he/she can take an objective stance towards the nature of matter; this is described as the "discoverer epistemology" by Flach and Voorhorst [128]. The discoverer could be considered a "detached observer" [229]: the objective reality exists in the rationalistic view, in which a cognitive agent perceives cues and forms mental representations (i.e., knowledge) that can be manipulated (i.e., thinking or cognitive activity). This philosophical ontology with the "discoverer" attitude dominates the mindset of many scientific researchers across multiple disciplines such as chemistry, biology, physics and even computer science. Positivism asserts that there is absolute truth about knowledge and that it is gathered from the natural and social world only through rigorous unbiased empirical methods, methods based on hypotheses and empirical fact gathering [230]. This epistemology indicates that we should already have assumptions and knowledge. We formulate a falsifiable hypothesis and conduct the controlled experiment to test and observe, so the study becomes "scientific" per Popper [231]'s hypothetico-deductivism (falsification) principles.
But is there always absolute truth of knowledge? Is the study of our mind always subject to laws and theory? The answers are probably positive for neuroscience, as it is still trying to seek the objective reality in the human brain's structures, but in other cases findings might no longer be independent from the observers. Flach and Voorhorst [128] describe the concept of "inventor epistemology", in which observers immerse themselves intimately in the studied phenomenon, knowing that it can only be "invented from inside". Different observers may produce different observational results, similar to the "observer effect" in quantum mechanics [232]. From the epistemological theory's perspective, this is congruent with post-positivism, which rejects the assumption that there is an absolute truth about knowledge [230]. The world is rather complicated and hard to predict, thus authentic accounts of the reality can only be constructed through our interaction with the world itself (e.g., see constructionism in much social science research to understand the socially constructed nature of real-world phenomena [133, 233]).
The problem in our scientific research concerning the topic of human-technology interaction is that researchers usually take a dualistic ontology for granted, in which mind and matter are deemed two distinctive systems and therefore we have two kinds of realities or even sciences [128]. The consequence is that we might end up observing and understanding the world exclusively through one lens. Many academic debates have unfolded around this, e.g., the "mind versus matter" topic in cognitive science, especially surrounding well-known constructs such as situation awareness. While some argued the importance of studying cognitive constructs [75], others believed in the importance of focusing on the "system" [158].
However, there is an alternative: looking at the actual consequences and considering the practical effects of the objects (i.e., knowledge, concepts, meanings, beliefs, etc.). Pragmatism, which derived from the work of Peirce [234], James [235], Mead [236] and Dewey [237], argues that values and truths are determined by utility and experience, addressing the problem of dualism. For example, the radical empiricism constructed by James (1976) highlights the notion of experience to eliminate the substantiality of substance and consciousness. Bennett and Flach (2011) contended that human experience is essentially the "joint function of mind and matter" (p. 460), something that has been echoed in Pirsig [127]'s discourse on the experience of driving the motorcycle (a metaphor for technologies). Pragmatism advocates pluralistic approaches to derive knowledge about the problem because knowledge and truth are plural and contextual: "it is not ideal or a fixed conception of reality but a means for dealing with it effectively" [230]. Winograd and Flores [229] contended that cognition should be viewed as "a pattern of behaviour that is relevant to the functioning of the person" (p. 71) and that "knowledge is always the result of interpretation, which depends on the entire previous experience of the interpreter and on situatedness in a tradition" (p. 75), so knowledge is both subjective and objective. The philosophy of pragmatism invites multiple methods and worldviews to discuss the meaning of interaction and understand the nature of human-technology interaction. Perhaps there are no separate realities or sciences but one unified science ("science of experience") in which human experience is shaped in the interaction of mind and matter [128]. Pure objectivity can never be achieved [238] and we are part of what we aim to change [128]. What matters for researchers and designers in HCI, in order to understand the complexity of the domain (e.g., shipping), is probably a pluralistic epistemological framework in which we are not necessarily "bound to a stance" but learn to "appreciate each stance".
This article explores the values of epistemological pluralism by using the cases in shipping as examples. As these cases suggest, the sociological and ecological concerns are as crucial as the psychological concerns. We should have a pluralistic epistemology to study and understand human-technology interactions in shipping, i.e., recognising the value of both "thinking small" and "thinking big", both "zooming in" and "zooming out". Multidisciplinary endeavours are required to achieve the pluralistic epistemology. In addition, the presented cases have suggested that there are interdisciplinary understandings between bodies of disciplinary knowledge. For instance, when we realise the limitations of the cognitivist approach, we might already be in the realms of socio-ecological dimensions, such as queries about how a decision support tool could facilitate collaborative learning, management practices in the emerging data-driven era, what the human-machine system in the ECR would evolve towards, etc. The crucial value of a multidisciplinary approach is perhaps helping us to set a foundation to form an interdisciplinary synergy as an emerging language to understand complex sociotechnical systems. This view might benefit the maritime human factors and HCI research community as well as the shipping industry.
4 CONCLUSIONS

This article reviews theoretical concepts related to the dimensions of psychology, sociology and ecology in order to form a deeper understanding of human-technology interactions. It also discusses the theoretical constructs' practical relevance by showing how a few cases exemplifying ongoing development sectors in shipping are understood with these theoretical perspectives. Today's maritime human factors and HCI research has significantly widened its scope to something situated at the intersections of computer science, social science, engineering design and psychology. The boundaries between the traditional disciplines are diminishing [155]. The shipping industry is at full speed towards digitalisation and automation, but the transition period is, as substantiated by various studies presented in the paper, characterised by fragmented needs of all kinds, a lack of human factors concerns, contextual considerations and regulatory support, and a wild growth of unruly technologies. With more and more advanced technologies being introduced to the shipping domain, used with people, by people and for people, it is of significant importance to take an attitude of pluralistic epistemology for future maritime development and HCI research. In the authors' opinion, it is not only the apparatus for conducting multidisciplinary/interdisciplinary research, but also an emerging language for us to learn and appreciate the complexity of the world, both industrial and societal.
REFERENCES
1. Bhattacharya, Y., Employee engagement in the shipping industry: a study of engagement among Indian officers. WMU Journal of Maritime Affairs, 2015. 14(2): p. 267-292.
2. Stopford, M., Maritime Economics (3rd edition). 2009, Milton Park, Abingdon, Oxon, United Kingdom: Routledge.
3. Allen, P., Perceptions of technology at sea amongst British seafaring officers. Ergonomics, 2009. 52(10): p. 1206-1214.
4. Grech, M.R., T. Horberry, and T. Koester, Human Factors in the Maritime Domain. 2008, Boca Raton, FL: CRC Press.
5. Ljung, M. and M. Lützhöft, Functions, performances and perceptions of work on ships. WMU Journal of Maritime Affairs, 2014. 13(2): p. 231-250.
6. Gartner. Gartner IT Glossary - Digitalization. 2018 [cited 2018 June 21]; Available from: https://www.gartner.com/it-glossary/digitalization/.
7. Lukas, U.F.v., Virtual and augmented reality for the maritime sector - applications and requirements. IFAC Proceedings Volumes, 2010. 43(20): p. 196-200.
8. Gentzsch, W., A. Purwanto, and M. Reyer. Cloud Computing for CFD based on Novel Software Containers. in 15th International Conference on Computer and IT Applications in the Maritime Industries - COMPIT'16. 2016. Lecce, Italy.
9. Koga, S., Major challenges and solutions for utilizing big data in the maritime industry. 2015, World Maritime University: Malmö, Sweden. p. 91.
10. Rødseth, Ø.J., L.P. Perera, and B. Mo. Big Data in Shipping - Challenges and Opportunities. in 15th International Conference on Computer and IT Applications in the Maritime Industries - COMPIT'16. 2016. Lecce, Italy: Technische Universität Hamburg.
11. Park, K. Blockchain Is About to Revolutionize the Shipping Industry. 2018 [cited 2018 June 21st]; Available from: https://www.bloomberg.com/news/articles/2018-04-18/drowning-in-a-sea-of-paper-world-s-biggest-ships-seek-a-way-out.
12. IMO. E-Navigation. 2014 [cited 2017 March 6]; Available from: http://www.imo.org/en/OurWork/safety/navigation/pages/enavigation.aspx.
13. RINA, Digitisation Industry Survey. 2018, Royal Institution of Naval Architects.
14. Rylander, R. and Y. Man, Autonomous safety on vessels - an international overview and trends within the transport sector. 2016, Lighthouse: Gothenburg, Sweden.
15. Man, Y., et al., Human factor issues during remote ship monitoring tasks: An ecological lesson for system design in a distributed context. International Journal of Industrial Ergonomics, 2018. 68: p. 231-244.
16. Hummels, D., Transportation costs and international trade in the second era of globalization. Journal of Economic Perspectives, 2007. 21(3): p. 131-154.
17. Aplin, J. Rolls-Royce launches project to design unmanned ships. 2015 [cited 2018 June 21st]; Available from: https://www.forumforthefuture.org/blog/rolls-royce-launches-project-design-unmanned-ships.
18. MarEx. China to Build Autonomous Ship Test Bed. 2018 [cited 2018 June 21st]; Available from: https://www.maritime-executive.com/article/china-to-build-autonomous-ship-test-bed#gs.A2uESKE.
19. Hetherington, C., R. Flin, and K. Mearns, Safety in shipping: The human element. Journal of Safety Research, 2006. 37(4): p. 401-411.
20. Landauer, T., The trouble with computers: usefulness, usability and productivity. 1995: MIT Press.
21. Lützhöft, M., M. Lundh, and T. Porathe, Onboard ship management overview system - an information sharing system on board. Transactions of the Royal Institute of Naval Architects, 2013. 155: p. 11-14.
22. Lundh, M. and L.W. Rydstedt, A static organization in a dynamic context - A qualitative study of changes in working conditions for Swedish engine officers. Applied Ergonomics, 2016. 55: p. 1-7.
23. Lützhöft, M. and S. Dekker, On Your Watch: Automation on the Bridge. The Journal of Navigation, 2002. 55(01): p. 83-96.
24. Oliveira, M., J. Costa, and H. Torvatn. Tomorrow's On-Board Learning System (TOOLS). in Learning and Collaboration Technologies: Third International Conference. 2016. Toronto, ON, Canada: Springer.
25. Lützhöft, M., "The technology is great when it works": Maritime Technology and Human Integration on the Ship's Bridge, in Department of Management and Engineering, Industrial Ergonomics. 2004, Linköping University, The Institute of Technology: Linköping.
26. Lützhöft, M. and J. Nyce, Integration work on the ship's bridge. Journal of Maritime Research, 2008. 5(2): p. 59-74.
27. Mallam, S.C. and M. Lundh, Ship Engine Control Room Design: Analysis of Current Human Factors & Ergonomics Regulations & Future Directions. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 2013. 57(1): p. 521-525.
28. Wagner, E., M. Lundh, and P. Grundevik, Engine Control Rooms - Human Factors Field Studies. 2008, MSI Design, Chalmers University of Technology, SSPA: Gothenburg, Sweden.
29. Schröder-Hinrichs, J.U., et al., Maritime human factors and IMO policy. Maritime Policy & Management, 2013. 40(3): p. 243-260.
30. House, C. and C. Place, Report on the investigation of the engine failure of Savannah Express and her subsequent contact with a linkspan at Southampton Docks. 2006, Marine Accident Investigation Branch: Southampton, United Kingdom.
31. Rothblum, A.M. Human Error and Marine Safety. 2000 [cited 2018 March 10]; Available from: http://bowles-langley.com/wp-content/files_mf/humanerrorandmarinesafety26.pdf.
32. Sandhåland, H., H. Oltedal, and J. Eid, Situation awareness in bridge operations - A study of collisions between attendant vessels and offshore facilities in the North Sea. Safety Science, 2015. 79: p. 277-285.
33. Grech, M.R., T. Horberry, and A. Smith, Human Error in Maritime Operations: Analyses of Accident Reports Using the Leximancer Tool. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 2002. 46(19): p. 1718-1721.
34. MAIB, Report on the investigation of the grounding of Muros. 2016, Marine Accident Investigation Branch in UK: Southampton.
35. Grech, M.R. and N. Lemon, Human Centred Design for Enhanced Navigation Systems: Shifting the Focus on User Needs, in PACIFIC 2015 - 9th International Maritime Exposition & Conference. 2015: Sydney, Australia.
36. Ahvenjärvi, S., Management of the Safety of Automation Challenges the Training of Ship Officers, in Green ships, eco shipping, clean seas: the 12th Annual General Assembly of the International Association of Maritime Universities. 2011: Gdynia.
37. Card, S.K., A. Newell, and T.P. Moran, The Psychology of Human-Computer Interaction. 1983: L. Erlbaum Associates Inc. 469.
38. Lim, K.Y. and J.B. Long, The MUSE method of usability engineering. 1994, Cambridge: University Press.
39. Norman, D.A. and S.W. Draper, User Centered System Design; New Perspectives on Human-Computer Interaction. 1986: L. Erlbaum Associates Inc. 526.
40. Carroll, J.M., Human-computer interaction: psychology as a science of design. International Journal of Human-Computer Studies, 1997. 46: p. 501-522.
41. Dix, A., et al., Human-Computer Interaction (3rd Edition). 2003: Prentice-Hall, Inc.
42. Sheridan, T.B. and R. Parasuraman, Human-Automation Interaction. Reviews of Human Factors and Ergonomics, 2005. 1(1): p. 89-129.
43. Hancock, P.A., et al., Human-Automation Interaction Research: Past, Present, and Future. Ergonomics in Design: The Quarterly of Human Factors Applications, 2013. 21(2): p. 9-14.
44. Endsley, M.R., From Here to Autonomy: Lessons Learned From Human-Automation Research. Human Factors, 2016. 59(1): p. 5-27.
45. Kaber, D.B., Issues in Human-Automation Interaction Modeling: Presumptive Aspects of Frameworks of Types and Levels of Automation. Journal of Cognitive Engineering and Decision Making, 2017: p. 1555343417737203.
46. Kosuge, K. and Y. Hirata. Human-Robot Interaction. in 2004 IEEE International Conference on Robotics and Biomimetics. 2004.
47. Burke, J.L., et al., Final report for the DARPA/NSF interdisciplinary study on human-robot interaction. IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), 2004. 34(2): p. 103-112.
48. Sheridan, T.B., Human-Robot Interaction: Status and Challenges. Human Factors: The Journal of the Human Factors and Ergonomics Society, 2016.
49. Lazar, J., J. Feng, and H. Hochheiser, Research Methods in Human-Computer Interaction, 2nd edition. 2017, MA, USA: Morgan Kaufmann.
50. AAWA, Remote and autonomous ships - the next steps. 2016, Rolls-Royce plc: London.
51. Wolff, C., Psychologia Empirica Methodo Scientifica Pertractata (1732). 2010: Kessinger Publishing.
52. Wickens, C.D., Engineering psychology and human performance. 1992: HarperCollins Publishers.
53. ASANET. An Introduction to Sociology: The Field of Sociology. 2008 [cited 2018 May 2]; Available from: http://www.asanet.org/sites/default/files/savvy/introtosociology/Documents/Field%20of%20sociology033108.htm#whatissociology.
54. Kaptelinin, V. and B.A. Nardi, Acting with Technology: Activity Theory and Interaction Design. 2006, Cambridge: MIT Press.
55. Suchman, L.A., Human-Machine Reconfigurations: Plans and Situated Actions (2nd Edition). 2007, NY, USA: Cambridge University Press.
56. Lave, J., Cognition in practice: Mind, mathematics and culture in everyday life. 1988, New York, US: Cambridge University Press.
57. Hutchins, E., Cognition in the wild. 1995, Cambridge: MIT Press.
58. Gibson, J.J., The Ecological Approach to Visual Perception: Classic Edition. 1979: Psychology Press.
59. Rasmussen, J., A.M. Pejtersen, and L.P. Goodstein, Cognitive systems engineering. 1994: John Wiley & Sons, Inc. 378.
60. Bennett, K.B. and J.M. Flach, Display and Interface Design: Subtle Science, Exact Art. 2011: CRC Press, Inc. 510.
61. Johannsen, G., Human-machine interaction, in Encyclopedia of Life Support Systems (EOLSS): Control Systems, Robotics, and Automation. 2007, EOLSS Publishers: Paris, France.
62. Flach, J.M., Situation Awareness: Context Matters! A Commentary on Endsley. Journal of Cognitive Engineering and Decision Making, 2015. 9(1): p. 59-72.
63. Proctor, R.W. and K.P.L. Vu, Human information processing: an overview for human-computer interaction, in The human-computer interaction handbook, A.J. Julie and S. Andrew, Editors. 2003, L. Erlbaum Associates Inc. p. 35-51.
64. Broadbent, D.E., Perception and communication. 1958, Elmsford, NY: Pergamon Press.
65. Neisser, U., Cognition and Reality: Principles and Implications of Cognitive Psychology. 1976, San Francisco: Freeman.
66. Kahneman, D., Attention and effort. 1973, Englewood Cliffs, NJ: Prentice-Hall.
67. Posner, M.I., Orienting of attention. The Quarterly Journal of Experimental Psychology, 1980. 32(1): p. 3-25.
68. Eriksen, C.W. and J.D. St. James, Visual attention within and around the field of focal attention: A zoom lens model. Perception & Psychophysics, 1986. 40(4): p. 225-240.
69. Craik, K., The Nature of Explanation. 1943, Cambridge: Cambridge University Press.
70. Johnson-Laird, P.N., Mental models: towards a cognitive science of language, inference, and consciousness. 1983: Harvard University Press. 513.
71. Baddeley, A., Working Memory. 1986, Oxford: Clarendon Press.
72. Reid, G.B. and T.E. Nygren, The Subjective Workload Assessment Technique: A Scaling Procedure for Measuring Mental Workload, in Advances in Psychology, A.H. Peter and M. Najmedin, Editors. 1988, North-Holland. p. 185-218.
73. Parasuraman, R. and P.A. Hancock, Adaptive control of workload, in Stress, workload and fatigue, P.A. Hancock and P.E. Desmond, Editors. 2001, Mahwah, NJ: Erlbaum.
74. Endsley, M.R., Measurement of Situation Awareness in Dynamic Systems. Human Factors, 1995. 37(1): p. 65-84.
75. Endsley, M.R., Toward a Theory of Situation Awareness in Dynamic Systems. Human Factors, 1995. 37(1): p. 32-64.
76. Norman, D.A., Cognitive Engineering, in User centered systems design, D.A. Norman and S.W. Draper, Editors. 1983, Erlbaum: Hillsdale, NJ. p. 31-62.
77. Faulkner, X., Usability Engineering. 2000, London: Macmillan Press LTD.
78. Dumas, J.S. and M.C. Salzman, Usability Assessment Methods. Reviews of Human Factors and Ergonomics, 2006. 2(1): p. 109-140.
79. Hollingsed, T. and D.G. Novick, Usability Inspection Methods after 15 Years of Research and Practice. SIGDOC '07: Proceedings of the 25th ACM International Conference on Design of Communication, 2007: p. 249-255.
80. Norman, D.A., Design of Everyday Things: Revised and Expanded. 2013, London: MIT Press.
81. Eriksen, C.W. and T.D. Murphy, Movement of attentional focus across the visual field: A critical look at the evidence. Perception & Psychophysics, 1987. 42(3): p. 299-305.
82. Duncan, J., The locus of interference in the perception of simultaneous stimuli. Psychological Review, 1980. 87: p. 273-300.
83. Treisman, A.M., Preattentive processing in vision. Computer Vision, Graphics and Image Processing, 1985. 31(2): p. 156-177.
84. Treisman, A.M. and G. Gelade, A feature-integration theory of attention. Cognitive Psychology, 1980. 12(1): p. 97-136.
85. Carrasco, M., Visual attention: The past 25 years. Vision Research, 2011. 51(13): p. 1484-1525.
86. Findlay, J.M. and I.D. Gilchrist, Active Vision: The Psychology of Looking and Seeing. 2003, UK: Oxford University Press.
87. Tsotsos, J.K., L. Itti, and G. Rees, A brief and selective history of attention, in Neurobiology of Attention, L. Itti, G. Rees, and J.K. Tsotsos, Editors. 2005, Academic Press.
88. Wickens, C.D., Situation Awareness: Review of Mica Endsley's 1995 Articles on Situation Awareness Theory and Measurement. Human Factors: The Journal of the Human Factors and Ergonomics Society, 2008. 50(3): p. 397-403.
89. Salmon, P.M., et al., What really is going on? Review of situation awareness models for individuals and teams. Theoretical Issues in Ergonomics Science, 2008. 9(4): p. 297-323.
90. Riley, J.M., et al., Performance and Situation Awareness Effects in Collaborative Robot Control with Automation. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 2008. 52(4): p. 242-246.
91. Durso, F.T. and A. Sethumadhavan, Situation awareness: understanding dynamic environments. Human Factors, 2008. 50(3): p. 442-448.
92. Stanton, N.A., P.M. Salmon, and G.H. Walker, Let the Reader Decide: A Paradigm Shift for Situation Awareness in Sociotechnical Systems. Journal of Cognitive Engineering and Decision Making, 2015. 9(1): p. 44-50.
93. Chiappe, D., T.Z. Strybel, and K.P.L. Vu, A Situated Approach to the Understanding of Dynamic Situations. Journal of Cognitive Engineering and Decision Making, 2015. 9(1): p. 33-43.
94. Lundberg, J., Situation awareness systems, states and processes: a holistic framework. Theoretical Issues in Ergonomics Science, 2015. 16(5): p. 447-473.
95. Samantha, V.K., J.L. Steven, and Y. Hyo-Sang, Coincidence Between the Scientific and Folk Uses of the Term "Situation(al) Awareness" in Aviation Incident Reports. Journal of Cognitive Engineering and Decision Making, 2011. 5(4): p. 378-400.
96. Selcon, S.J., R.M. Taylor, and E. Koritsas. Workload or situational awareness? TLX vs. SART for aerospace systems design evaluation. 1991.
97. Marr, D., Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. 1982, San Francisco: Freeman.
98. Endsley, M.R., Theoretical underpinnings of situation awareness: A critical review. Situation awareness analysis and measurement, 2000: p. 3-32.
99. Adams, M.J., Y.J. Tenney, and R.W. Pew, Situation Awareness and the Cognitive Management of Complex Systems. Human Factors, 1995. 37(1): p. 85-104.
100. Smith, K. and P.A. Hancock, Situation Awareness Is Adaptive, Externally Directed Consciousness. Human Factors: The Journal of the Human Factors and Ergonomics Society, 1995. 37(1): p. 137-148.
101. Stanton, N.A., et al., Human Factors Methods: A Practical Guide for Engineering And Design. 2006: Ashgate Publishing Company.
102. Stanton, N.A., et al., Is situation awareness all in the mind? Theoretical Issues in Ergonomics Science, 2009. 11(1-2): p. 29-40.
103. Stanton, N.A., et al., Distributed situation awareness in dynamic systems: theoretical development and application of an ergonomics methodology. Ergonomics, 2006. 49(12-13): p. 1288-1311.
104. Endsley, M.R., Designing for Situation Awareness: An Approach to User-Centered Design, Second Edition. 2011: CRC Press, Inc. 396.
105. Simon, H.A., Models of Man. 1957, New York: John Wiley and Sons.
106. Supe, S.V., Factors related to different degree of rationality in decision-making among farmers. 1969, Indian Agricultural Research Institute: New Delhi.
107. Polič, M., Decision making: between rationality and reality. Interdisciplinary Description of Complex Systems, 2009. 7(2): p. 78-89.
108. Kørnøv, L. and W.A.H. Thissen, Rationality in decision- and policy-making: implications for strategic environmental assessment. Impact Assessment and Project Appraisal, 2000. 18(3): p. 191-200.
109. Kahneman, D. and A. Tversky, Choices, values, and frames. 2000, New York: Cambridge University Press.
110. Klein, G., A recognition-primed decision (RPD) model of rapid decision making. 1993: Ablex Publishing Corporation.
111. Klein, G., The recognition-primed decision (RPD) model: Looking back, looking forward. Naturalistic decision making, 1997: p. 285-292.
112. Flach, J.M., et al., Decision-making in practice: The dynamics of muddling through. Applied Ergonomics, 2017. 63: p. 133-141.
113. Piaget, J. and B. Inhelder, The Psychology of the Child. 1969, New York: Basic Books.
114. Reason, J., Human Error. 1990, UK: Cambridge University Press.
115. Reason, J., Human error: models and management. BMJ: British Medical Journal, 2000. 320(7237): p. 768-770.
116. Le Coze, J.C., New models for new times. An anti-dualist move. Safety Science, 2013. 59: p. 200-218.
117. Dekker, S., Ten Questions About Human Error. 2004, Mahwah, NJ: Lawrence Erlbaum.
118. Vygotsky, L., Mind in Society: The Development of Higher Psychological Processes. 1978, Cambridge, Massachusetts: Harvard University Press.
119. Leont'ev, A.N., Activity, Consciousness, and Personality. 1978, Englewood Cliffs, NJ: Prentice-Hall.
120. Bødker, S., Activity Theory as a Challenge to Systems Design. 1990(334).
121. Leont'ev, A.N., Problems of the development of the mind. 1981, Moscow: Progress.
122. Engeström, Y., Learning by expanding: An activity-theoretical approach to developmental research. 1987, Helsinki: Orienta-Konsultit Oy.
123. Bødker, S., Through the Interface: A Human Activity Approach to User Interface Design. 1991, Hillsdale, N.J.: Lawrence Erlbaum.
124. Nardi, B.A., Studying context: a comparison of activity theory, situated action models, and distributed cognition, in Context and consciousness, B.A. Nardi, Editor. 1995, Massachusetts Institute of Technology. p. 69-102.
125. Kuutti, K., Activity theory as a potential framework for human-computer interaction research, in Context and consciousness, B.A. Nardi, Editor. 1995, Massachusetts Institute of Technology. p. 17-44.
126. Kaptelinin, V. and B. Nardi, Activity Theory in HCI: Fundamentals and Reflections. 2012: Morgan & Claypool Publishers. 106.
127. Pirsig, R.M., Zen and the Art of Motorcycle Maintenance. 1974, NY, USA: William Morrow and Company.
128. Flach, J.M. and F. Voorhorst, What Matters: Putting Common Sense to Work. 2016, Dayton, USA: Wright State University Libraries.
129. Lave, J. and E. Wenger, Situated Learning - Legitimate peripheral participation. 1991, Cambridge, UK: Cambridge University Press.
130. Hollan, J., E. Hutchins, and D. Kirsh, Distributed cognition: toward a new foundation for human-computer interaction research. ACM Trans. Comput.-Hum. Interact., 2000. 7(2): p. 174-196.
131. Leidner, D.E. and S.L. Jarvenpaa, The use of information technology to enhance management school education: a theoretical view. MIS Q., 1995. 19(3): p. 265-291.
132. Lundin, J., Talking about work. Designing information technology for learning in interaction, in Department of Informatics. 2005, University of Gothenburg: Gothenburg, Sweden.
133. Orr, J.E., Talking About Machines: An Ethnography of A Modern Job. 1996, U.S.: Cornell University Press.
134. Wenger, E., Communities of practice: Learning, meaning, and identity. 1998: Cambridge University Press.
135. Wenger, E., R.A. McDermott, and W. Snyder, Cultivating communities of practice: A guide to managing knowledge. 2002: Harvard Business Press.
136. Flach, J.M. and R.R. Hoffman, The Limitations of Limitations. IEEE Intelligent Systems, 2003. 18(1): p. 94-96, c3.
137. Brunswik, E., The conceptual framework of psychology, in International Encyclopedia of Unified Science. 1952, The University of Chicago Press: Chicago.
138. Taylor, F., Psychology and the design of machines. American Psychologist, 1957. 12(5): p. 249-258.
139. Flach, J.M., Situation Awareness: Proceed with Caution. Human Factors: The Journal of the Human Factors and Ergonomics Society, 1995. 37(1): p. 149-157.
140. Heft, H., The Relevance of Gibson's Ecological Approach to Perception for Environment-Behavior Studies, in Toward the Integration of Theory, Methods, Research, and Utilization, G.T. Moore and R.W. Marans, Editors. 1997, Springer US: Boston, MA. p. 71-108.
141. Rasmussen, J. and K.J. Vicente, Coping with human errors through system design: implications for ecological interface design. International Journal of Man-Machine Studies, 1989. 31(5): p. 517-534.
142. Rasmussen, J., Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering. 1986: Elsevier Science Inc. 228.
143. Vicente, K.J. and J. Rasmussen, Ecological interface design: theoretical foundations. Systems, Man and Cybernetics, IEEE Transactions on, 1992. 22(4): p. 589-606.
144. Smith, G.F., Representational effects on the solving of an unstructured decision problem. IEEE Transactions on Systems, Man, and Cybernetics, 1989. 19(5): p. 1083-1090.
145. Flach, J.M., et al., Interface Design: A Control Theoretic Context for a Triadic Meaning Processing Approach, in The Cambridge Handbook of Applied Perceptual Research, H. Robert, et al., Editors. 2015, Cambridge University Press.
146. Bennett, K.B., Ecological interface design and system safety: One facet of Rasmussen's legacy. Applied Ergonomics, 2017. 59, Part B: p. 625-636.
147. Flach, J.M., The Ecology of Human-Machine Systems: A Personal History, in Global Perspectives on the Ecology of Human-Machine Systems, J.M. Flach, et al., Editors. 1995, Lawrence Erlbaum Associates: Hove, UK. p. 1-13.
148. Jenkins, D.P., et al., Cognitive Work Analysis: Coping with Complexity. 2008: Ashgate.
149. Jenkins, D.P.D., G.H.D. Walker, and N.A.P. Stanton, Cognitive Work Analysis. 2012, Abingdon, GB: Ashgate.
150. Read, G.J.M., et al., Designing a ticket to ride with the Cognitive Work Analysis Design Toolkit. Ergonomics, 2015: p. 1-21.
151. Naikar, N., Cognitive work analysis: An influential legacy extending beyond human factors and engineering. Applied Ergonomics, 2017. 59, Part B: p. 528-540.
152. Hilliard, A. and G.A. Jamieson, Representing energy efficiency diagnosis strategies in cognitive work analysis. Applied Ergonomics, 2017. 59, Part B: p. 602-611.
153. Stanton, N.A., et al., Cognitive Work Analysis: Applications, Extensions and Future Directions. 2017: Taylor & Francis Group.
154. Vicente, K.J., Cognitive work analysis: Toward safe, productive, and healthy computer-based work. 1999, Mahwah, NJ: Lawrence Erlbaum Associates Inc.
155. Hollnagel, E., The Diminishing Relevance of Human-Machine Interaction, in The Handbook of Human-Machine Interaction: A Human-Centered Approach, G.A. Boy, Editor. 2011, Ashgate Publishing Limited: England. p. 417-429.
156. Hobbs, A., et al., Three principles of human-system integration, in Proceedings of the 8th Australian Aviation Psychology Symposium. 2008: Sydney, Australia.
157. Behymer, K.J. and J.M. Flach, From Autonomous Systems to Sociotechnical Systems: Designing Effective Collaborations. She Ji: The Journal of Design, Economics, and Innovation, 2016. 2(2): p. 105-114.
158. Dekker, S., Drift into Failure: From Hunting Broken Components to Understanding Complex Systems. 2011, Farnham: Ashgate Publishing Co.
159. Sheridan, T.B., Humans and Automation: System Design and Research Issues. 2002, New York: John Wiley. 280.
160. Sheridan, T.B., Telerobotics, automation and human supervisory control. 1992, Cambridge: MIT Press.
161. Bainbridge, L., Ironies of automation. Automatica, 1983. 19(6): p. 775-779.
162. Woods, D.D., Decomposing Automation: Apparent Simplicity, Real Complexity, in Automation and Human Performance: Theory and Applications, R. Parasuraman and M. Mouloua, Editors. 1996: Erlbaum.
163. Norman, D.A., The problem of automation: Inappropriate feedback and interaction, not over-automation, in Human factors in hazardous situations, D.E. Broadbent, A. Baddeley, and J.T. Reason, Editors. 1990, Oxford University Press. p. 585-593.
164. Parasuraman, R. and D.H. Manzey, Complacency and bias in human use of automation: an attentional integration. Human Factors, 2010. 52(3): p. 381-410.
165. Parasuraman, R. and V. Riley, Humans and Automation: Use, Misuse, Disuse, Abuse. Human Factors: The Journal of the Human Factors and Ergonomics Society, 1997. 39(2): p. 230-253.
166. Sauer, J., A. Chavaillaz, and D. Wastell, Experience of automation failures in training: effects on trust, automation bias, complacency, and performance. Ergonomics, 2015: p. 1-28.
167. Chavaillaz, A., D. Wastell, and J. Sauer, System reliability, performance and trust in adaptable automation. Applied Ergonomics, 2016. 52: p. 333-342.
168. Mosier, K.L., et al., Aircrews and Automation Bias: The Advantages of Teamwork? The International Journal of Aviation Psychology, 2001. 11(1): p. 1-14.
169. Skitka, L., K.L. Mosier, and M. Burdick, Accountability and automation bias. International Journal of Human-Computer Studies, 2000. 52(4): p. 701-717.
170. Skitka, L., K.L. Mosier, and M. Burdick, Does automation bias decision-making? International Journal of Human-Computer Studies, 1999. 51(5): p. 991-1006.
171. Mosier, K.L., et al., Automation bias: decision making and performance in high-tech cockpits. The International Journal of Aviation Psychology, 1998. 8(1): p. 47-63.
172. Lee, J.D. and K.A. See, Trust in Automation: Designing for Appropriate Reliance. Human Factors, 2004. 46(1): p. 50-80.
173. Bradshaw, J.M., et al., The Seven Deadly Myths of Autonomous Systems. IEEE Intelligent Systems, 2013. 28(3): p. 54-61.
174. Wickens, C.D., Automation Stages & Levels, 20 Years After. Journal of Cognitive Engineering and Decision Making, 2017: p. 1555343417727438.
175. Riley, V., A General Model of Mixed-Initiative Human-Machine Systems. Proceedings of the Human Factors Society Annual Meeting, 1989. 33(2): p. 124-128.
176. Parasuraman, R., T.B. Sheridan, and C.D. Wickens, A model for types and levels of human interaction with automation. Systems, Man and Cybernetics, Part A: Systems and Humans, IEEE Transactions on, 2000. 30(3): p. 286-297.
177. Endsley, M.R. and D.B. Kaber, Level of automation effects on performance, situation awareness and workload in a dynamic control task. Ergonomics, 1999. 42(3): p. 462-492.
178. Woods, D.D., The Risks of Autonomy. Journal of Cognitive Engineering and Decision Making, 2016. 10(2): p. 131-133.
179. Klein, G., et al., Ten challenges for making automation a "team player" in joint human-agent activity. IEEE Intelligent Systems, 2004. 19(6): p. 91-95.
180. MAIB, Annual report 1999. 2000, Department of the Environment, Transport and the Regions: London.
181. Ventikos, N.P., G.V. Lykos, and I.I. Padouva, How to achieve an effective behavioral-based safety plan: the analysis of an attitude questionnaire for the maritime industry. WMU Journal of Maritime Affairs, 2014. 13(2): p. 207-230.
182. Dekker, S. and E. Hollnagel, Human factors and folk models. Cognition, Technology & Work, 2004. 6(2): p. 79-86.
183. Rasmussen, J., Risk management in a dynamic society: a modelling problem. Safety Science, 1997. 27(2): p. 183-213.
184. Turner, B.A., Man-Made Disasters. 1978, London: Wykeham Publications.
185. Dekker, S. and S. Pruchnicki, Drifting into failure: theorising the dynamics of disaster incubation. Theoretical Issues in Ergonomics Science, 2014. 15(6): p. 534-544.
186. Rankin, A., et al., Resilience in everyday operations: A framework for analyzing adaptations in high-risk work. Journal of Cognitive Engineering and Decision Making, 2014. 8(1): p. 78-97.
187. Man, Y., M. Lundh, and S.N. MacKinnon. Facing the New Technology Landscape in the Maritime Domain: Knowledge Mobilisation, Networks and Management in Human-Machine Collaboration. in 9th International Conference on Applied Human Factors and Ergonomics. 2019. Orlando, Florida, USA: Springer International Publishing.
188. Porathe, T., H.C. Burmeister, and Ø.J. Rødseth, Maritime Unmanned Navigation through Intelligence in Networks: The MUNIN project. 2013, 12th International Conference on Computer and IT Applications in the Maritime Industries, COMPIT'13, Cortona, 15-17 April 2013.
189. Man, Y., et al., From Desk to Field - Human Factor Issues in Remote Monitoring and Controlling of Autonomous Unmanned Vessels. Procedia Manufacturing, 2015. 3: p. 2674-2681.
190. Endsley, M.R., Situation Awareness Misconceptions and Misunderstandings. Journal of Cognitive Engineering and Decision Making, 2015. 9(1): p. 4-32.
191. Szalma, J.L., et al., Effects of Sensory Modality and Task Duration on Performance, Workload, and Stress in Sustained Attention. Human Factors, 2004. 46(2): p. 219-233.
192. Fiske, S.T. and S.L. Neuberg, A Continuum of Impression Formation, from Category-Based to Individuating Processes: Influences of Information and Motivation on Attention and Interpretation, in Advances in Experimental Social Psychology, M.P. Zanna, Editor. 1990, Academic Press. p. 1-74.
193. Speier, C., J.S. Valacich, and I. Vessey, The Influence of Task Interruption on Individual Decision Making: An Information Overload Perspective. Decision Sciences, 1999. 30(2): p. 337-360.
194. Koch, C., The Quest for Consciousness: A Neurobiological Approach. 2004, USA: Roberts and Company Publishers.
195. Parkhurst, D., K. Law, and E. Niebur, Modeling the role of salience in the allocation of overt visual attention. Vision Research, 2002. 42(1): p. 107-123.
196. Prison, J., J. Dahlman, and M. Lundh, Ship sense - striving for harmony in ship manoeuvring, in WMU Journal of Maritime Affairs. 2013. p. 115-127.
197. Engström, J., G. Markkula, and V. Trent, Attention selection and task interference in driving: an action-oriented view, in 1st International Conference on Driver Distraction and Inattention (DDI 2009). 2009: Gothenburg, Sweden.
198. Porathe, T., A Navigating Navigator Onboard or a Monitoring Operator Ashore? Towards Safe, Effective, and Sustainable Maritime Transportation: Findings from Five Recent EU Projects. Transportation Research Procedia, 2016. 14: p. 233-242.
199. Cohen, I., W.P. Brinkman, and M.A. Neerincx, Modelling environmental and cognitive factors to predict performance in a stressful training scenario on a naval ship simulator. Cognition, Technology & Work, 2015. 17(4): p. 503-519.
200. Itoh, K., et al., Risk Analysis of Ship Navigation by
Use of Cognitive Simulation. Cognition, Technology &
Work,2001.3(1):p.421.
201. Prison,J.andT.Porathe.Navigationwith2Dand3D
maps‐a
comparativestudywithmaritimepersonnel.in
the 39th Nordic Ergonomics Society Conference. 2007.
Lysekil,Sweden.
202. Tian, H.B., B. Wu, and X.P. Yan. Challenges and
developmentsofwatertransportsafetyunderintelligent
environment. in the 4th International Conference on
Maritime Technology and Engineering. 2018. London,
UK:Taylor&FrancisGroup.
203. Sauer, J., et al., Effects of display design on
performance in a simulated ship navigation
environment.Ergonomics,2002.45(5):p.329347.
204. Shahir, H.Y., et al. Maritime situation analysis framework: Vessel interaction classification and anomaly detection. in 2015 IEEE International Conference on Big Data (Big Data). 2015.
205. Okazaki, T. and C. Nishizaki, Situation Awareness of Ship Maneuvering Simulator Training. International Journal of Emerging Trends in Engineering & Technology, 2015. 3(1).
206. Porathe, T., J. Prison, and Y. Man, Situation awareness in remote control centres for unmanned ships, in RINA Human Factors in Ship Design and Operation Conference. 2014: London, UK. p. 93-101.
207. Kjeldskov, J. and M.B. Skov, Studying usability in sitro: Simulating real world phenomena in controlled environments. International Journal of Human-Computer Interaction, 2007. 22(1-2): p. 7-36.
208. Viktorelius, M. and M. Lundh, The Role of Distributed Cognition in Ship Energy Optimization, in Energy Efficient Ships. 2016: London, UK.
209. Man, Y., M. Lundh, and S.N. MacKinnon, Maritime Energy Efficiency in a Sociotechnical System: A Collaborative Learning Synergy via Mediating Technologies. TransNav, the International Journal on Marine Navigation and Safety of Sea Transportation, 2018. 12(2).
210. Viktorelius, M., Expanding practice theory in energy research - a cultural-historical activity perspective. Energy Research and Social Science, 2017. Under review.
211. Kataria, A., et al., Exploring Bridge-Engine Control Room Collaborative Team Communication. TransNav, the International Journal on Marine Navigation and Safety of Sea Transportation, 2015. 9(2): p. 169-176.
212. Winograd, T., Bringing Design to Software. 1996, US: Addison-Wesley.
213. Parent, R., M. Roy, and D. St-Jacques, A systems-based dynamic knowledge transfer capacity model. Journal of Knowledge Management, 2007. 11(6): p. 81-93.
214. Johnson, H. and K. Andersson, Barriers to energy efficiency in shipping. WMU Journal of Maritime Affairs, 2016. 15(1): p. 79-96.
215. Mayer-Schönberger, V. and K. Cukier, Big Data: A Revolution That Will Transform How We Live, Work, and Think. 2013, UK: John Murray.
216. Johnson, H., et al., Will the ship energy efficiency management plan reduce CO2 emissions? A comparison with ISO 50001 and the ISM code. Maritime Policy & Management, 2013. 40(2): p. 177-190.
217. Man, Y., M. Lundh, and S.N. MacKinnon, Managing Unruly Technologies in the Engine Control Room: from Problem Patching to an Architectural Thinking and Standardization. WMU Journal of Maritime Affairs, 2018: p. 1-23.
218. EU. EfficienSea 2 - Efficient, Safe and Sustainable Traffic at Sea. 2015 [cited 2018 Sep 13]; Available from: https://ec.europa.eu/inea/en/horizon-2020/projects/h2020-transport/waterborne/efficiensea-2.
219. DMA. Maritime Cloud conceptual model. 2016 [cited 2018 Sep 14]; Available from: http://maritimecloud.net/.
220. DMA, Mid-Term Periodic Technical Report, Part B. EfficienSea 2 - efficient, safe and sustainable traffic at sea. 2016, Danish Maritime Authority.
221. Foord, A.G. and W.G. Gulland, Can Technology Eliminate Human Error? Process Safety and Environmental Protection, 2006. 84(3): p. 171-173.
222. Hollnagel, E., D.D. Woods, and N.C. Leveson, Resilience engineering: Concepts and precepts. 2006, Aldershot, UK: Ashgate.
223. Hollnagel, E., The ETTO Principle: Efficiency-Thoroughness Trade-Off: Why Things That Go Right Sometimes Go Wrong. 2009, Farnham, UK: Ashgate Publishing Limited.
224. Woods, D.D., The price of flexibility, in Proceedings of the 1st international conference on Intelligent user interfaces. 1993, ACM: Orlando, Florida, USA. p. 19-25.
225. Perrow, C., Normal Accidents: Living with High-Risk Technologies. 1984: Princeton University Press.
226. Flach, J.M., Complexity: learning to muddle through. Cognition, Technology & Work, 2011. 14(3): p. 187-197.
227. Snowden, D.J. and M.E. Boone, A Leader's Framework for Decision Making. Harvard Business Review, 2007. 85(11): p. 68-76.
228. Marx, K. and F. Engels, Theses On Feuerbach, in Ludwig Feuerbach and the End of Classical German Philosophy. 1845, Progress Publishers: Moscow, USSR.
229. Winograd, T. and F. Flores, Understanding computers and cognition. 1986, Norwood, NJ: Ablex Publishing Corporation.
230. O'Leary, Z., The Social Science Jargon Buster. 2007, Thousand Oaks, CA: Sage.
231. Popper, K., The logic of scientific discovery. 1959, London: Hutchinson & Co.
232. Squires, E.J., The Mystery of the Quantum World. 1994: CRC Press.
233. Latour, B., Aramis, or, The Love of Technology. 1996: Harvard University Press.
234. Peirce, C.S., What Pragmatism Is. The Monist, 1905. 15(2): p. 161-181.
235. James, W., Pragmatism: A New Name for Some Old Ways of Thinking. Science, 1907. 26(667): p. 24.
236. Mead, G.H., The Social Psychology of George Herbert Mead. 1956: University of Chicago Press.
237. Dewey, J., The Quest for Certainty: A Study of the Relation of Knowledge and Action. 1929: Putnam.
238. Patton, M.Q., Qualitative research & evaluation methods. 2002, Thousand Oaks, CA: Sage. xxiv, 598 p.