1 INTRODUCTION
The maritime industry constantly performs
challenging operations with much potential for
human error. These operations need a delicate
interplay between human and technological factors,
organized in a sociotechnical system, to achieve
complex goals: e.g. to successfully transport
hazardous cargo in constrained and shallow waters
alongside heavy traffic. Sociotechnical systems are
characterized by high numbers of dynamic and
interdependent tasks that are necessary to
successfully perform a wide range of complex
operations. All components of these systems must
work separately and in mutual dependency with each
other. In the maritime domain, technical failures are
less common than human errors, which dictates the
emphasis that must be placed on training and
assessment of operators' error performance.
Human error happens all the time and is an inevitable
part of human nature. Within the maritime industry,
however, human errors generate consequences so
severe that they are worth spending time and resources
to prevent and mitigate (Kim and Nazir 2016). Such
consequences are associated with costly damage to
equipment, loss of lives, severe injuries, or
environmental pollution.
Human error is involved in 80-85% of maritime
accidents (Hanzu-Pazara et al. 2008). Consequently,
considerable resources are spent to improve human
performance and reduce human error.
Error and human reliability have been researched
from multiple perspectives, i.e. preventive or reactive,
and at multiple levels, i.e. from the individual and the
team all the way to the organizational or societal level.
This is necessary considering that human performance
is influenced at all levels of analysis, from individual
cognitive patterns to organizational structure.
Human Error in Pilotage Operations
J. Ernstsen & S. Nazir
University College of Southeast Norway, Vestfold, Norway
ABSTRACT: Pilotage operations require close interaction between humans and machines. This complex
sociotechnical system is necessary to safely and efficiently maneuver a vessel in constrained waters. A
sociotechnical system consists of interdependent human and technical variables that must continuously work
together to be successful. This complexity is prone to errors, and statistics show that most of these errors in the
maritime domain are due to human components in the system (80-85%). This explains the attention on
research to reduce human errors. The current study deployed the systematic human error reduction and
prediction approach (SHERPA) to shed light on error types and error remedies apparent in pilotage operations.
Data was collected using interviews and observation. Hierarchical task analysis was performed and 55 tasks
were analyzed using SHERPA. Findings suggest that communication and action omission errors are the most
prevalent in pilotage operations. Practical and theoretical implications of the results are discussed.
http://www.transnav.eu
the International Journal on Marine Navigation and Safety of Sea Transportation
Volume 12, Number 1, March 2018
DOI: 10.12716/1001.12.01.05
Ultimately, strenuous effort is required to
pinpoint when and where errors are likely to happen.
The types of errors and the probability of human error
occurring can be found through careful analysis of
tasks and system requirements. This yields designers
and trainers information about which specific tasks
and system characteristics need fortification. This
proactive approach to human error is valuable for the
maritime industry (with much competition and scarce
resources) considering the cost of consequences,
despite the effort needed to implement measures
against human errors.
Experts and novices are both prone to errors.
Experience is an essential part of expertise, and the
road to becoming an expert involves developing mental
schemas. The schemas help the operator by reducing
the time taken to recognize situations and to make
decisions and corrective actions accordingly (Nazir et
al. 2013). Experts have sophisticated ways of
subconsciously knowing what to do, often characterized
by experts telling that "they just know". Their
schemas allow them to understand situations
triggered by small, subtle cues within the
environment. As opposed to experts, novices have
mental schemas that are less effective, thus relying on
more attention and cognitive resources to perceive,
understand, and predict the same situation. This
difference manifests in the antecedents of the
errors conducted in complicated situations: where
experts can perceive subtle environmental cues to
understand the situation while monitoring, novices
must pay closer attention to catch the same cues.
Experts, who use less attention and rely on mental
patterns, can be misguided when perceiving or
interpreting environmental cues, consequently
making a poor decision and action. Novices are less
likely to make the same mistake, as they devote more
resources to the environment and interpret the cues
more consciously; but this makes novices more prone
to overload, which in turn leaves them blind to
important environmental cues about the situation. In
complex maritime operations, understanding these
characteristics is of paramount importance to
effectively implement measures that reduce the
probability and mitigate the consequences of human
errors.
Pilotage is a notoriously complicated operation
(Sharma and Nazir 2017). The dynamic nature of
pilotage operations, i.e. that the safest course of action
often is to keep going, puts pressure on operators to
continuously maintain situation awareness. Loss of it,
for instance through the mechanisms depicted above,
may result in an accident. There are many examples of
accidents during pilotage operations, e.g. the Godafoss,
Federal Kivalina and Crete Cement accidents
(Accident Investigation Board 2010a; Accident
Investigation Board 2010b; Accident Investigation
Board 2012). To assess human reliability in an
operation, one must understand the operation itself.
Thus, a depiction of a generic pilotage operation
follows.
Pilotage operations can be broken down into eight
main tasks: order and get the pilot aboard, develop
group relationship, install the pilot, assess
environment and weather, decide route, supervise
navigation, coordinate tugboats, and berthing
(Ernstsen et al. In Press). Developing group
relationship and assessing environment and weather
are nonsequential, continuous tasks, while the other
tasks are usually performed in the sequence shown in
Figure 1 below.
Figure 1. Timeline of tasks in a pilotage operation
Pilotage operations are dynamic with many
interdependent tasks. They also involve much subtle
and nontransparent feedback from the system,
making it more challenging and mentally intensive to
perceive, assess, understand, and decide the proper
course of action. For instance, a radar with imprecise
settings may detect noise that, to an untrained eye,
could be either waves or fishing vessels. Thus,
operators in pilotage operations are heavily dependent
on individual skills and knowledge of the operation, as
well as efficient collaboration, to successfully bring the
vessel to berth or out of the port. This complexity
gives much potential for human errors, which
emphasizes the need to understand the nature of such
errors.
Human error research increased vastly after the
complex accidents of the 70s and 80s, e.g. Three Mile
Island and Chernobyl. The focus changed from
technical malfunctions to acknowledging the role of
human factors. After this, accident investigations
began to look for errors caused by human
components, whether found at the sharp or the
blunt end. Error research became popular, and as a
consequence, many theories were developed according
to how error is conceptually applied, e.g. Rasmussen
(1983); Reason (1990); Sanders and Moray (1991);
Wickens et al. (2015); Woods et al. (1994).
Hollnagel (2000) attempted a novel view of error,
looking at errors as contextual factors influencing
(normal) performance variability, and argued that one
needs to understand how these factors influence
behavior in order to understand how situational
changes impact performance variability (as opposed to
coining it "human error"). As mentioned, pilotage
operations are complex and dynamic, with a multitude
of interdependent tasks. This dictates a need to
understand which environmental circumstances
affect human reliability, to allow pinpointed training
and design alterations.
Human reliability is the positive orientation of
human error. Human reliability assessment (HRA) is
a broad name for ways to find and predict human
errors in a system. The increase in human error
research has resulted in a high number of human
reliability assessment methods, and most can be
divided into quantitative or qualitative approaches to
understanding and predicting human error. For instance,
Bell and Holroyd (2009) found 72 tools related to
human reliability. Please see Aalipour et al. (2016) for
a short review of more HRA examples. The basic
functions of most HRA methods are: (1) to identify the
human errors associated with the operation, (2) to
predict their likelihood of occurrence, and (3), if
necessary, to reduce their likelihood (Park and Jung
1996). Quantitative approaches to human error are
mostly concerned with human error probabilities,
which, according to Bell and Holroyd (2009), are
defined as
depicted in Equation 1:

P_HE = N_E / N_EO    (1)
where N_E is the number of errors and N_EO is the
number of opportunities for errors. However, finding
data to calculate error probability is challenging and
often subject to much subjectivity. A countermeasure
is to first thoroughly understand which error types are
prone to occur in the operation under analysis before
attempting to calculate error probabilities.
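Equation 1 is simple enough to express directly in code. The following is a minimal sketch in standard Python; the function name and the example counts are illustrative, not taken from the cited sources:

```python
def human_error_probability(n_errors: int, n_opportunities: int) -> float:
    """Equation 1: P_HE = N_E / N_EO.

    n_errors: observed errors (N_E).
    n_opportunities: opportunities for error (N_EO); must be positive.
    Illustrative helper, not part of the cited HRA tools.
    """
    if n_opportunities <= 0:
        raise ValueError("N_EO must be positive")
    return n_errors / n_opportunities

# Hypothetical figures: 3 communication errors over 120 observed handovers.
p = human_error_probability(3, 120)  # 0.025
```

As the text notes, the difficulty lies not in the division but in obtaining defensible counts for N_E and N_EO in the first place.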
The complexity of HRA increases as the operation
becomes more intertwined in a sociotechnical
framework, as there are more interdependent and
dynamic variables influencing human reliability. This
makes it even more difficult to thoroughly understand
which error types exist. To find them in a complex
system, however, SHERPA is a suitable human
reliability assessment method for maritime operations.
The main contribution of the current paper is to
apply SHERPA to identify error types for the eight
tasks associated with a pilotage operation, as
mentioned above. SHERPA can shed novel light on
complex operations through a consistent analysis of
tasks, error types and the potential consequences
associated with tasks. The goal is to provide
information about human errors in pilotage
operations.
2 METHOD
Data was collected using interviews and observation.
The interviews were unstructured and open-ended.
They were designed to gather information regarding
the tasks and goals associated with a pilotage operation
and the cognitive demands on the pilots and captains
respectively. The interviewees were presented with a
scenario of a 30,000 deadweight oil tanker with the
goal of berthing at the Slagentangen oil refinery in
Norway. It is a standard scenario that most captains
and pilots have experienced or at least can relate to. A
definition of a medium-sized accident was inquired
about midway through the scenario talkthrough. The
participants were asked to rank the accident at level 4
on a 10-level scale, with level 10 being the accident
with the highest consequences, e.g. explosion and loss
of life or severe casualties.
All interviews began with a review of informed
consent to participate and to audio record the
interviews. The length was 1 hour and 15 minutes on
average; the longest lasted 1 hour and 37 minutes and
the shortest 1 hour and 5 minutes. When data
saturation was achieved, a shift towards validation of
data occurred to ensure a valid representation of the
piloting operation. The same interviewer was used to
ensure consistency. The data collection and storage
process was approved by the Norwegian Centre for
Research Data.
Observation was used to collect data and to
validate and verify findings following the task
analysis. The observation scenario was to follow a
pilot on a car cargo vessel leaving Oslo Port bound for
Hvasser pilot station. The researcher was conscious to
notice the occurrence of the tasks that had been
identified from the interview data. The observation
was open, and the researcher could ask questions
throughout the voyage to ensure a consistent and
elaborate understanding of the operation.
2.1 Sampling and response rate
The snowball approach was used to gather
interviewees (i.e. asking interviewees to suggest
colleagues/friends fitting the interviewee criteria).
Eight interviewees with piloting expertise and four
interviewees with captain expertise contributed to the
analysis. Being a captain or pilot requires much
experience; thus, all applicable interviewees were
deemed subject matter experts considering their work
positions. The interviews shifted slightly during the
research, consistent with the iterative development of
much qualitative research: the development and
validation of the task analysis and, further, the
validation of SHERPA. Most interviews were
conducted in person; however, due to geographical
separation, three interviews were done using
FaceTime®.
2.2 Structuring and analyzing results
Interview data were transcribed verbatim. More
efficient transcription methods were used as data
saturation approached, e.g. transcription of only the
relevant sections of the dataset. Tasks and functions
for the task analysis were identified with both a
grounded approach (i.e. bottom-up) and a
theoretical/practical evaluation (top-down), where the
information is evaluated by subject matter experts.
2.2.1 Content analysis and task analysis
Content analysis is a common way to analyze
textual data, where the basic principle is to code data
into categories. Categories can be grounded directly
in the text itself or relate to established theories. The
interview transcription was coded to converge the
tasks and goals revealed in the interviews. The record
was broken down and analyzed with the purpose of
identifying emerging categories within the dataset.
This was used to structure and provide input to the
task analysis. Content analysis is a powerful way to
reduce confirmation bias when understanding
interview data. The information from the content
analysis was used to structure the hierarchical task
analysis.
Figure 2. Main tasks, tasks and subtasks in a pilotage operation
Tasks and functions were structured hierarchically
following the steps of Annett et al. (1971) for
conducting hierarchical task analysis. Baber and
Stanton (1996) state that task analysis is a commonly
used tool to structure tasks prior to subsequent
investigative human factors analyses. The condensed
results from the task analysis can be seen in Figure 2.
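To make the hierarchical structure concrete, a decomposition of this kind can be sketched as nested mappings. The task labels below are invented placeholders (the real decomposition is in Figure 2), and the counting helper is an illustration, not part of the HTA method itself:

```python
# Hypothetical fragment of a hierarchical task analysis: main tasks
# contain tasks, which contain lists of subtasks. The labels are
# placeholders and do not reproduce the study's actual HTA.
hta = {
    "3 Install the pilot": {
        "3.1 Exchange pilot card": ["3.1.1 Present card", "3.1.2 Confirm draft"],
        "3.2 Verify vessel status": ["3.2.1 Check for technical malfunctions"],
    },
    "5 Decide route": {
        "5.1 Plan passage": ["5.1.1 Agree route with captain"],
    },
}

def count_levels(hierarchy: dict) -> tuple[int, int, int]:
    """Return (main tasks, tasks, subtasks) found in the hierarchy."""
    mains = len(hierarchy)
    tasks = sum(len(t) for t in hierarchy.values())
    subtasks = sum(len(s) for t in hierarchy.values() for s in t.values())
    return mains, tasks, subtasks

print(count_levels(hta))  # (2, 3, 4) for this fragment
```

Applied to the full analysis, such a count would yield the 8/28/55 distribution reported in Table 2.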
2.2.2 Systematic Human Error Reduction and Prediction
Approach (SHERPA)
SHERPA concerns the identification of three
common trends: (1) errors with a high probability of
occurring, (2) errors which are deemed critical, i.e.
that cause substantial damage to vessel or personnel
or yield environmental hazards, and (3) findings with
a high frequency of the same error type, e.g. multiple
errors categorized as action errors (please see Table 1
for an overview of error categories). These trends are
common in pilotage operations.
Critical consequences are defined binarily (yes/no)
in SHERPA. Error probabilities are defined in an
ordinal manner: low probability is assigned to errors
which have never occurred, medium probability is
assigned if the error has occurred on previous
occasions, and high probability is assigned if the error
occurs frequently. This data relies on historical trends
and/or subject matter experts.
The information obtained using SHERPA can be
used to understand which error types relate to the
various tasks in a pilotage operation and how to
prevent or mitigate them.
Table 1. Error categories in SHERPA
_______________________________________________
Error categories
  Action errors
  Checking errors
  Retrieval errors
  Communication errors
  Selection errors
_______________________________________________
Furthermore, SHERPA may provide research with
appealing hypotheses to investigate, for instance by
examining the relationships and frequencies of error
types.
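The three trends above lend themselves to straightforward tabulation once SHERPA entries exist. A minimal sketch follows; the entries, subtask IDs and values are invented for illustration and are not the study's data:

```python
from collections import Counter

# SHERPA-style entries: (subtask, error category, ordinal probability,
# critical consequence). All rows are hypothetical examples.
entries = [
    ("2.1.2", "Communication", "high", True),
    ("3.2.1", "Checking", "high", True),
    ("6.1.1", "Action", "medium", False),
    ("6.2.3", "Action", "medium", True),
    ("7.1.2", "Retrieval", "low", False),
]

# Trends (1) and (2): errors that are both high probability and critical.
high_and_critical = [s for s, _, p, c in entries if p == "high" and c]

# Trend (3): frequency of each error category.
category_freq = Counter(category for _, category, _, _ in entries)

print(high_and_critical)             # ['2.1.2', '3.2.1']
print(category_freq.most_common(1))  # [('Action', 2)]
```

The same tallies, applied to the 55 analyzed subtasks, produce the kind of summary presented in Tables 3 and 4 of the results.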
3 RESULTS AND ANALYSIS
The eight main tasks elaborated above were
investigated using SHERPA. Each main task was
independently analyzed, followed by an analysis of the
overall error relationships and the frequency
distribution of errors.
3.1 Result of the hierarchical task analysis (HTA)
The task analysis was structured hierarchically and
along a timeline to give information about the
complexity associated with pilotage operations. Two
nonsequential tasks (tasks 2 and 4) and six sequential
tasks were identified. The sequential tasks are
generally (though not strictly) conducted in the order
presented above, whereas tasks 2 and 4 are
continuously carried out across the other tasks as
well. This is depicted in Figure 1 in the introductory
section; however, tasks 2 and 4 are placed
haphazardly there to show their significance to the
overall process. As an example of deviation, it was
noticed during the observation study that the pilot
began weather assessment an hour before boarding
the vessel, to begin mentally planning and
understanding the operation. The pilot was
experienced and identified fog that had to be
accounted for while voyaging in confined waters. The
frequency and distribution of tasks can be seen in
Table 2 below, consistent with Figure 2 above.
Table 2. Frequency of main tasks, tasks and subtasks
_______________________________________________
Main tasks     8
Tasks         28
Subtasks      55
_______________________________________________
The task analysis did not go into further detail,
such as motoric, mechanical and cognitive operations,
as this would not contribute additional information
for conducting SHERPA.
3.2 Analysis of SHERPA results
The most frequent human errors in pilotage
operations are related to action omission (the decision
not to act), as revealed by SHERPA. The second most
frequent type of error is communication errors.
Unfortunately, pilotage operations involve much
communication and depend on efficient and precise
sharing of information to achieve a successful
operation; this applies to formal as well as informal
communication. Considering that the maritime
industry employs crew from all over the world,
cultural and language barriers put elevated strain on
the communication aspect of the operation. This
indicates a need to further investigate the relationship
between communication and action omission.
Table 3 below shows an overview of error types, the
frequency of the respective SHERPA probabilities,
and the tasks which were deemed critical. The table
shows results for all 8 main tasks and in total for the
overall operation. It gives information about the most
and second most frequent error types for each main
task respectively, with action errors the most frequent
overall and information errors the second most
frequent. The ordinal probability of errors is
presented as well: here we see that medium
probability is most evident, representing 27 of the 55
subtasks. The human error assessment reveals 15
subtasks with a critical consequence of occurrence.
Two subtasks are assessed to have both a high
probability of occurrence and a potential for high
consequence if the error is conducted: 2.1.2 and 3.2.1;
please see Table 4, where these tasks are extracted
from SHERPA.
Table 3. Most frequent error type, second most frequent error type, number of errors with low, medium, and high
probability, and number of errors with critical consequence, distributed among the eight main tasks discovered in HTA.
P = Probability, C = Critical. Slash "/" indicates a tie.
__________________________________________________________________________________________________
             Task1         Task2        Task3     Task4  Task5      Task6        Task7        Task8        Total
__________________________________________________________________________________________________
1st most     Information/  Action       Checking  None   Action     Information  Action       Action       Action
error type   Action
2nd most     Information/  Information  Action    None   Retrieval  Action       Information  Retrieval/   Information
error type   Action                                                              Information
Low P        1             0            3         0      0          0            7            6            20
Medium P     7             5            3         0      3          6            0            5            27
High P       0             4            2         0      1          0            0            0            8
No. of       2             3            2         0      2          6            2            0            15
critical errors
__________________________________________________________________________________________________
Table 4. Tasks considered to have a high probability of occurrence and a potentially high consequence.
__________________________________________________________________________________________________
Subtask 2.1.2
  Error mode: I1: Information not communicated.
  Error description: Uncertainty of who will have control of instruments and in various scenarios.
  Consequence: Evasive maneuvers omitted.
  Recovery: Clear statement of who will control which parts of the operation.
  P: High. C: Yes.
  Remedial strategy: Ensure routines for clarifying control.

Subtask 3.2.1
  Error mode: C1: Check omitted.
  Error description: Pilot not aware of technical malfunctions.
  Consequence: Vessel not behaving accordingly, e.g. stern crashing into port because of a lacking thruster.
  Recovery: Contact crew immediately regarding vessel technical status.
  P: High. C: Yes.
  Remedial strategy: Pilot receives vessel technical condition prior to boarding the vessel.
__________________________________________________________________________________________________
The extracted subtasks have error types regarding
information and check omission. Subtask 2.1.2 is to
initiate talk regarding intent and expectations; most
importantly, the distribution of tasks among the pilot
and crew, e.g. who will maneuver the vessel while
berthing. Subtask 3.2.1 is a check of paramount
importance that the pilot needs to perform while
installing him- or herself on the command bridge.
With further interpretation of the findings, it may be
hypothesized that subtasks 2.1.2 and 3.2.1 are
dependent or share the same underlying mechanisms.
3.3 Validity and reliability considerations
Two independent researchers were introduced to
subsections of the data to analyze findings, a common
process to ensure the reliability of qualitative analyses.
Further subsequent validations were performed by
two subject matter experts to ensure that the
researchers had structured and analyzed the data
consistently. The converged result among the
researchers and subject matter experts was consistent.
The observation study of the real-life piloting
operation functioned as part validation and part
further data gathering, consistent with the mentioned
iterative nature of qualitative research. The
observation study provided more evidence for
reliable and valid findings from the human reliability
analysis.
4 DISCUSSION
The findings in the current research shed light on
human errors in pilotage operations that have the
potential to result in accidents (e.g. the Godafoss,
Federal Kivalina and Crete Cement accidents).
Communication and action errors were found to be
the most prevalent in pilotage operations.
A detailed understanding of human error in pilotage
operations is uncovered in the current research. This
information was gathered using a qualitative approach,
which is a common way to deepen understanding of a
topic in an exploratory manner, and perhaps to
subsequently generate hypotheses that are of interest
for the scientific community and/or industry to
explore further. Several cognitive challenges that put
load on maritime operators were identified in the
human reliability assessment conducted in this study.
These cognitive challenges influence the overall
mental capacity of the operators, which affects the
safety and efficiency performance of the team. Action
errors and information errors were found most
prevalent in a pilotage operation. The frequency of
errors was further analyzed to understand the
underlying mechanics that impact human reliability
during pilotage operations.
4.1 Potential underlying mechanics in pilotage operations
There is a possible connection between omission
errors and communication errors that emerges while
studying the distribution and frequency of error
types. It is likely that this connection reflects the same
underlying mechanics that impact the performance of
subtasks 2.1.2 and 3.2.1.
This mechanic can be social climate. Social climate
is commonly understood as an antecedent of safety
compliance (Neal et al. 2000): where safety
compliance here is to perform intra-team
communication (subtask 2.1.2) and to check pilot
cards (subtask 3.2.1). Neal et al. (2000) found a factor
loading
between safety knowledge and safety compliance of
.35 using structural equation modelling. Knowledge of
safety behavior and safety compliance are tied. Figure
3 below shows the connection between social climate,
safety knowledge and safety compliance related to the
most frequent errors revealed in the current study on
pilotage operations. In this hypothesis, team
communication skill is associated with a part of safety
knowledge, considering how communication training
focuses on its importance for safety and efficiency.
Figure 3. Effects between good and bad safety climate.
Effects (1) and (2) depict command bridges that
have a poor safety climate. The reasons for a poor
climate can be manifold, for instance personality
differences, pressure from ship owners, or lack of
trust. Effect (2) includes that the pilot has relevant
skill in team communication, an understanding of
communication as a safety barrier acquired from
training and experience, as well as of how to carry out
proper communication in stressful operations. In
effect (3), the command bridge operates under a good
safety climate. This renders the skill in team
communication (regarding safety knowledge) less
important for ensuring safe team performance: e.g.
the team communicates intent and expectations, and
the pilot is incentivized (by the captain) to check pilot
cards.
4.2 Theoretical and practical implications
Human errors are and always will be conducted; the
aim is not to make us robots but to pinpoint the errors
that are most likely to occur. Human error in pilotage
operations is of concern because of the complexity
that exists and the consequences that may follow a
human error. The prevalence and consequences of
human error dictate a need to research and find
measures to prevent errors and to mitigate the
consequences if they do occur.
The current findings are consistent with theoretical
research on human errors. Communication and action
execution are essential components of safe and
efficient working conditions for teams operating in
complex sociotechnical systems such as pilotage
operations. An underlying mechanism has been
suggested that should be further investigated to
understand how to improve communication and
action execution.
Pilotage operations are expensive operations with
the aim of ensuring safe passage in constrained waters.
The contribution of this research is evidence that such
operations are prone to communication and action
omission errors. This should dictate a focus on these
skills in the training and selection of captains and
pilots.
4.3 Limitations
The study has some limitations. In retrospect, there
should have been more standardization of the
interviews. Open-ended interviews make room for
flexible and pinpointed collection of data; however, in
complex operations such as pilotage, the open-ended
interviews tended to distort which parts of the
operation received attention. At the same time, the
relatively high number of interviews compensated for
this and ensured an overall understanding of pilotage.
Another limitation regards SHERPA. The dynamic
and complex nature of pilotage operations, with
several nonsequential tasks, makes it a challenge to
develop a consistent hierarchical task analysis (a
necessary input to SHERPA); this was combated by
using a timeline representation of the tasks and by
placing the nonsequential tasks at their most common
position in the overall operational procedure.
Reflexivity and subjectivity considerations are
common limitations of qualitative studies. The
analysts, interviewers and interviewees will
systematically shed their attitudes, prior knowledge
and experience on the findings and gathered data.
Nonetheless, measures were taken to reduce the
issue of subjective influence on the research, e.g.
involving other researchers when analyzing the data
and the iterative nature of gathering and interpreting
data. To ensure that interviewees were not led in any
direction, they were told that participation was
voluntary and that the interview could be
discontinued without any explanation. These
measures are consistent with and commonly
mentioned to reduce subjective bias when performing
qualitative analyses (Willig 2008).
5 CONCLUSION
Pilotage operations have potential for human errors,
and these errors have high consequences. It is
important to understand and identify the tasks in such
operations to effectively design layouts and train
operators according to the operational demands. This
research revealed that pilotage operations are prone
to errors which are dependent on the command
bridge safety climate, and it suggests further
experiments to quantify the causal relationships
between action omission errors and safety climate.
REFERENCES
Aalipour M, Ayele YZ, Barabadi A (2016) Human reliability assessment (HRA) in maintenance of production process: a case study. International Journal of System Assurance Engineering and Management 7:229-238. doi:10.1007/s13198-016-0453-z
Accident Investigation Board N (2010a) Crete Cement - IMO NO. 9037161, Grounding at Aspond Island in the Oslo Fjord, Norway, on 19 November 2008. Report Sjø 1
Accident Investigation Board N (2010b) Report on Marine Accident Federal Kivalina IMO NO. 9205885, Grounding at Årsundøya, Norway, 6 October 2008. Report Sjø 1
Accident Investigation Board N (2012) Report on Investigation Into Marine Accident M/V Godafoss V2PM7 Grounding in Løperen, Hvaler on 17 February 2011. Report Sjø 1
Annett J, Duncan K, Stammers R, Gray M (1971) Task analysis. Department of Employment Training Information Paper 6. HMSO, London
Baber C, Stanton NA (1996) Human error identification techniques applied to public technology: predictions compared with observed use. Applied Ergonomics 27:119-131
Bell J, Holroyd J (2009) Review of human reliability assessment methods. Health and Safety Executive (HSE). Research Report RR679
Ernstsen J, Nazir S, Roed BK (In Press) Human reliability analysis of a pilotage operation. TransNav, the International Journal on Marine Navigation and Safety of Sea Transportation
Hanzu-Pazara R, Barsan E, Arsenie P, Chiotoroiu L, Raicu G (2008) Reducing of maritime accidents caused by human factors using simulators in training process. Journal of Maritime Research 5:3-18
Hollnagel E (2000) Looking for errors of omission and commission or The Hunting of the Snark revisited. Reliability Engineering & System Safety 68:135-145. doi:10.1016/S0951-8320(00)00004-1
Kim TE, Nazir S (2016) Exploring marine accident causation: A case study. Occupational Safety and Hygiene IV:369-374
Nazir S, Colombo S, Manca D (2013) Minimizing the risk in the process industry by using a plant simulator: a novel approach. Chemical Engineering Transactions 32:109-114
Neal A, Griffin MA, Hart PM (2000) The impact of organizational climate on safety climate and individual behavior. Safety Science 34:99-109
Park KS, Jung KT (1996) Considering performance shaping factors in situation-specific human error probabilities. International Journal of Industrial Ergonomics 18:325-331
Rasmussen J (1983) Skills, rules, and knowledge; signals, signs, and symbols, and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics:257-266
Reason J (1990) Human error. Cambridge University Press
Sanders J, Moray N (1991) Human Error—Cause, Prediction and Reduction. Lawrence Erlbaum, Hillsdale, NJ
Sharma A, Nazir S (In press) Distributed Situation Awareness in pilotage operations: Implications and Challenges. 12th International Conference on Marine Navigation and Safety of Sea Transportation, 21-23 June 2017, Gdynia, Poland
Wickens CD, Hollands JG, Banbury S, Parasuraman R (2015) Engineering psychology & human performance. Psychology Press
Willig C (2008) Phenomenological psychology: Theory, research and method. Existential Analysis 19:429-433
Woods DD, Johannesen LJ, Cook RI, Sarter NB (1994) Behind human error: Cognitive systems, computers and hindsight. DTIC Document