Intelligent Autonomous Ship Navigation using Multi-Sensor Modalities

R. Glenn Wright
GMATEK, Inc., Annapolis, MD, USA

TransNav, the International Journal on Marine Navigation and Safety of Sea Transportation, Volume 13, Number 3, September 2019. DOI: 10.12716/1001.13.03.03. http://www.transnav.eu

ABSTRACT: This paper explores the use of machine learning and deep learning artificial intelligence (AI) techniques as a means to integrate multiple sensor modalities into a cohesive approach to navigation for autonomous ships. Considered is the case of a fully autonomous ship capable of making decisions and determining actions by itself without active supervision on the part of onboard crew or remote human operators. These techniques, when combined with advanced sensor capabilities, have been touted as a means to overcome existing technical and human limitations as unmanned and autonomous ships become operational presently and in upcoming years. Promises of the extraordinary capabilities of these technologies, which may even exceed those of crewmembers for decision making under comparable conditions, must be tempered with realistic expectations as to their ultimate technical potential, their use in the maritime domain, vulnerabilities that may preclude their safe operation, and methods for development, integration and test. The results of research performed by the author in specific applications of machine learning and AI to shipping are presented, citing key factors that must be achieved for certification of these technologies as being suitable for their intended purpose. Recommendations are made for strategies to surmount present limitations in the development, evaluation and deployment of intelligent maritime systems that may accommodate future technological advances. Lessons learned that may be applied to improve safety of navigation for conventional shipping are also provided.
1 INTRODUCTION

The motivation for autonomous commercial ships stems from a desire to enhance safety, reduce costs and decrease environmental risk associated with shipping operations. Human error is estimated to be responsible for between 76% and 94% of marine casualties [Allianz 2012]. Seafarers and human support can account for 30% to 44% of ships' costs in terms of salaries, crew quarters, bridge space, human interfaces and controls, and environmental systems (heating and air conditioning, food, water, lighting, plumbing, etc.) [Minter 2017, CBI 2018]. Maritime shipping is a significant contributor of greenhouse gasses into the environment, accounting for between 2.8% and 3.1% of annual emissions [IMO 2015].
Much can be said about how and when unmanned and autonomous ships may be realized in the future. A classic perception of how a remotely controlled vessel may operate is characterized as follows:

The captain with a giant screen which overlays the environment around his vessel with an augmented reality view can navigate confidently using the computer-enhanced vision of the world, with artificial intelligence spotting and labeling every other water user, the shore, and navigation markers. [Stewart 2018]
Expanding this concept to autonomous ships merely requires replacing the captain with an automaton. While this may be well stated as a goal, such a bold level of self-assurance, confidence and trust in the capability and correctness of sensor and reasoning systems and their proper integration, on which the captain must rely, should be considered premature, as the safe and reliable performance of such systems has yet to be proven. Furthermore, in this scenario there is no consideration of situational awareness below the waterline.
Machine learning and deep learning artificial intelligence (AI) technologies form the core decision-making capability to navigate Maritime Autonomous Surface Ships (MASS). The results of research presented focus on the specific requirements of vessel navigation in terms of the sensors needed to survey the immediate vicinity to achieve situational awareness for tactical decision making in response to immediate threats and conditions. However, also considered is the larger context of Maritime Domain Awareness (MDA) as it relates to the successful completion of a voyage by extending shipboard capabilities using external sensors and information resources. The nature and characteristics of sensor data are also considered in terms of the information to be conveyed and limitations of the data, taking into account completeness, accuracy and latency.
Having considered the scope of the information that is available, an assessment of the processes, methods and framework used in the development, testing and deployment of decision-making products is made. Issues considered include the proper use of machine learning and deep learning AI, verification of implementation as being correct and suitable for their intended purpose, and ultimately determining whether their scope is sufficient to ensure safety of navigation. Key factors influencing the probabilities of achieving the goals of enhanced safety, reduced costs and decreased environmental risk are discussed based upon the results of experiments performed.
Conclusions are provided regarding critical gaps in sensor coverage and capabilities as well as shortfalls in MASS-enabling technologies that are presently not considered by industry, regulatory authorities and academia. Recommendations are also given to address these deficiencies to help advance MASS goals and objectives.
2 REGULATORY ISSUES

For many decades the International Regulations for Preventing Collisions at Sea (COLREGS) have required that "every vessel shall at all times maintain a proper look-out by sight and hearing as well as by all available means appropriate in the prevailing circumstances and conditions so as to make a full appraisal of the situation and the risk of collision" [COLREGS Rule 5]. Vessels are further required to make proper use of radar equipment to obtain early warning of risk of collision and to use radar plotting or equivalent systematic observation of detected objects, and are warned that assumptions shall not be made on the basis of scanty information [COLREGS Rule 7 (b), (c)]. Such regulations were written for vessels staffed by seafarers who rely on their human senses and interpretation of environmental conditions, navigation charts and instruments based upon knowledge and experience to execute a safe voyage.
The present regulatory framework is limited to human vision and hearing, echosounder, radar, Automatic Radar Plotting Aid (ARPA), Automatic Identification System (AIS), Electronic Chart Display and Information System (ECDIS) and Global Navigation Satellite System (GNSS) to fulfill these requirements. However, these technologies fall far short of ensuring safety of navigation by remotely controlled or autonomous vessels. The International Maritime Organization (IMO) is now conducting a regulatory scoping exercise to amend the regulatory framework to enable the safe, secure and environmentally friendly operation of partly or entirely unmanned MASS and their interaction and coexistence with manned ships within the existing IMO instruments [MSC 98/20/2]. In view of these present international regulations, MASS research and development is currently limited to within national waters and between adjacent countries.
3 ENVIRONMENTAL SENSING

Sensor systems dedicated to monitoring the surface ship maritime environment, illustrated in Figure 1, are available from three perspectives: the water's surface, below sea level and from space. Surface and subsea systems generally provide real-time, ship-centric, line-of-sight data and imagery, while space-based systems provide access to data, information and imagery available worldwide from a wide variety of sources external to the vessel.

Figure 1. Maritime Environment Sensor System Perspectives.
The scope of sensors needed to safely navigate MASS along long stretches of relatively low-traffic, deep ocean routes does not differ much from navigation in shallow, coastal waters amongst archipelagos crowded with both working and recreational vessels. More significant is the ability to properly integrate multiple sensor modalities with reasoning about vast amounts of data and imagery to create the information needed to make and explain observations; critically assess their significance regarding potential threats, vessel capability and performance; and react to these observations to minimize risk, ensure the voyage is completed safely, recover from dangerous situations and, in the event recovery is not possible, effectively preserve life, property and the environment. The scope of maritime sensors (beyond presently mandated equipment) available from all perspectives, and the fusion of sensor data and imagery to create information for use by automated reasoning processes on board vessels and by land-based operators, are described in this section. The methods and techniques used to analyze this information and take all appropriate action are covered in Section 4.
3.1 Shipboard Sensors

Sensor capabilities needed on board both remotely controlled and autonomous ships must not merely replicate the sight and hearing of seafarers, but must exceed their abilities by enabling constant vision through 360° around the vessel in four dimensions (x, y, z, time) at higher resolution and greater accuracy than is humanly possible. This includes the ability to see in the dark in all weather conditions, including heavy rain, snow and fog over the water's surface, and to hear sounds associated with ships, aids to navigation (ATON) and the environment, such as sound signals and waves crashing on rocks. Also needed is the ability to see underwater ahead of and around the vessel to detect and respond to threats not charted and to avoid groundings and allision. MASS must then reason with this information over extended periods of time in a manner that is consistent, correct and verifiable.
Shipboard sensors required under IMO vessel carriage requirements include human sight and hearing, often augmented with binoculars and hailer listening capabilities. This is supplemented with radar to help detect and avoid other vessels, ATON and land masses. An echosounder is also needed to maintain constant vigilance of water depth below the keel. ECDIS displays electronic navigation chart (ENC) information that should represent the most recent hydrographic surveys of the areas sailed, the locations of channels and ATON, and known hazards to navigation likely to be encountered along the route. AIS provides a wealth of information on nearby vessels related to position, speed, identity and routing. GNSS provides context for all of the above information in terms of vessel geographic position, speed and direction of transit.
These required sensors perform very well in extending the sight of seafarers at sea to accomplish traditional navigation functions. However, the IMO regulatory framework has failed to keep up with new technologies that can also enhance safety of navigation for conventional ships. With the advent of remotely controlled ships and MASS, new sensor capabilities are now being considered that may be applied to both staffed and autonomous ships. Several of these technologies extend the functionality of existing systems by providing new features, while others provide entirely new abilities that have not been possible in the past. Further, the integration of shipboard sensor data with external data and information resources available from space-based sensors and broadband communication channels provides the fundamental building blocks for cooperative decision making between vessels and shoreside operators, and locally between vessels using a wide area network (WAN) established amongst the vessels themselves.

Many such technologies are illustrated in Table 1. A discussion of their characteristics, the types of data they can produce and their application to enhance vessel situational awareness is provided in the paragraphs that follow.
3.1.1 Surface Sensors

Augmentation of present IMO-mandated vessel environmental sensor systems with further capability is essential to achieve situational awareness for MASS and to ensure proper supervision and traceability of decision making. These sensor systems can expand upon existing capabilities as well as provide new capabilities not presently available which, through the fusion of diverse data sources, can provide unprecedented levels of vessel situational awareness. Examples of shipboard surface sensing systems that can provide new and redundant precision navigation, timing, vision and acoustic capabilities include:
- Inertial Navigation Systems (INS),
- Laser Imaging (LiDAR),
- Millimeter Radar (mmRADAR),
- Video and Infrared (IR) Cameras, and
- Microphones.
Supplemental capability at and above sea level can be achieved using Unmanned Aerial Vehicles (UAVs) equipped with similar sensors to extend the vessel's vision. A basic complement of weather instruments integrated into the overall vessel sensor fusion architecture can provide real-time data on wind speed and direction, temperature, barometric pressure, humidity and sea temperature that is vital for onboard reasoning capabilities to detect and compensate for the effects of wind, currents and other phenomena on MASS throughout the voyage.
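A minimal sketch of the kind of wind and current compensation such onboard reasoning might perform, reduced to the classic set-and-drift current triangle with illustrative values only (not the authors' implementation), is:

```python
# Minimal sketch (assumed, not from the paper): estimating the heading to
# steer so that a vessel's course made good matches the planned track when
# a known current (set/drift) acts on it. All values are illustrative.
import math

def heading_to_steer(track_deg, speed_kn, set_deg, drift_kn):
    """Solve the current triangle: given the desired track over ground, speed
    through water and current set/drift, return heading to steer and SOG."""
    track = math.radians(track_deg)
    cset = math.radians(set_deg)
    # Current component perpendicular to the desired track.
    cross = drift_kn * math.sin(cset - track)
    if abs(cross) > speed_kn:
        raise ValueError("current too strong to hold this track")
    # Steer upstream by the angle whose sine offsets the cross-track current.
    correction = math.asin(cross / speed_kn)
    heading = (track_deg - math.degrees(correction)) % 360.0
    # Along-track components of vessel and current give speed over ground.
    sog = speed_kn * math.cos(correction) + drift_kn * math.cos(cset - track)
    return heading, sog

hdg, sog = heading_to_steer(track_deg=270.0, speed_kn=8.0, set_deg=180.0, drift_kn=2.0)
print(f"steer {hdg:.1f} deg true, SOG {sog:.1f} kn")
```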
Specific attention is given to ATON such as landmarks, buoys and other devices or systems external to vessels designed and operated to enhance the safe and efficient navigation of vessels and/or vessel traffic [IALA 2014]. Vision sensors on board an autonomous vessel must be capable of imaging ATON with sufficient resolution to detect their characteristics, make a positive identification and determine their position through the use of GNSS and ECDIS. Visual sensors may be supplemented with radar and forward-looking navigation sonar to confirm ATON positioning on ECDIS with real-time observations. ATON transmitted using AIS (AIS-ATON) may be co-located with physical ATON and, viewable on AIS receivers on board the vessel, provide another means for determining position. Virtual ATON (V-ATON) that require no physical infrastructure can also aid in determining position through coordinated use with GNSS, ECDIS and the vessel echosounder to provide navigation through contour tracking along the seabed [Wright and Baldauf 2016]. V-ATON may be placed at locations where physical and AIS-ATON are not possible due to harsh environmental conditions and/or remote location. Note that both AIS-ATON and V-ATON also adhere to the IALA definition in that they are external to the vessel. However, unlike physical ATON's reliance upon the visible spectrum, AIS-ATON and V-ATON rely upon radio and sonar signals external to the vessel in the electromagnetic and acoustic spectra.
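One simple form of such position correlation is to project a detected ATON to a geographic position from the vessel's GNSS fix using its observed bearing and range, then compare the result against the position charted in the ENC. A minimal Python sketch of this idea, with placeholder coordinates rather than real chart data, is:

```python
# Minimal sketch (assumed, not the authors' implementation): projecting a
# camera/radar detection of a buoy from the vessel's GNSS fix and comparing
# it with the charted (ENC) position. Coordinates below are placeholders.
import math

EARTH_R = 6371000.0  # mean Earth radius, meters

def project(lat, lon, bearing_deg, range_m):
    """Forward great-circle projection of a target at bearing/range."""
    lat1, lon1, brg = map(math.radians, (lat, lon, bearing_deg))
    d = range_m / EARTH_R
    lat2 = math.asin(math.sin(lat1) * math.cos(d) +
                     math.cos(lat1) * math.sin(d) * math.cos(brg))
    lon2 = lon1 + math.atan2(math.sin(brg) * math.sin(d) * math.cos(lat1),
                             math.cos(d) - math.sin(lat1) * math.sin(lat2))
    return math.degrees(lat2), math.degrees(lon2)

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two positions."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_R * math.asin(math.sqrt(a))

# Vessel fix from GNSS; buoy observed at 042 deg true, 350 m by radar/camera.
obs_lat, obs_lon = project(38.9500, -76.4500, bearing_deg=42.0, range_m=350.0)
charted = (38.9523, -76.4474)  # placeholder ENC position
offset = haversine_m(obs_lat, obs_lon, *charted)
print(f"observed vs charted offset: {offset:.0f} m")
```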
3.1.2 Subsea Sensors

Seafarers develop skills and techniques over years of experience to assess changes to the environment which can indicate hazardous sea states and bottom conditions that compromise safety of navigation. Visual clues include changes in sea color during an approach towards a shoal, water temperature change, and breaking waves or areas of calm amongst rough seas without obvious cause. Without anything more than an echosounder to provide direct information on the depth of water directly below the keel, seafarers today are expected to operate using second-hand information on the depths, hazards and obstructions along their routes of transit provided by navigation charts that may contain obsolete survey data that is years, decades or even centuries old.
Remote and autonomous vessel operations must compensate for lack of human knowledge and expertise as well as deficient charts by providing sensor capabilities to directly assess bottom configurations and conditions in real time. Examples of shipboard subsurface sensing systems to provide new and redundant precision navigation and vision capabilities include echosounders and sidescan sonar, either separately or integrated together in one unit, to provide terrain tracking capabilities and high-resolution imaging of seabed landmarks to aid in navigation [Wright and Baldauf 2016a]. Navigation sonar with forward-looking capabilities can provide high-resolution bathymetry that may be compared to electronic navigation charts (ENCs) displayed on ECDIS for backup navigation, detection of hazards to navigation and obstacles, and avoidance of large marine mammals [Wright and Russell 2017]. These data are also sufficient to crowdsource bathymetry for navigation chart development [FarSounder 2018]. Also, much like UAVs, the use of Unmanned Underwater Vehicles (UUVs) can extend MASS vision ahead of and in the local vicinity of the vessel below the waterline.
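A minimal sketch of the terrain-tracking idea, reduced to one dimension with synthetic profiles rather than survey data, is the following: a short run of live echosounder depths is slid along the depth profile extracted from the ENC for the planned track, and the best-matching offset estimates along-track position.

```python
# Minimal sketch (assumed, simplified to 1-D): matching live echosounder
# depths against the charted depth profile of the planned track, the essence
# of contour/terrain tracking when GNSS is unavailable. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Charted depth profile sampled every 10 m along the planned track (synthetic).
track_depths = 8.0 + 2.0 * np.sin(np.linspace(0, 12, 600)) + 0.5 * rng.standard_normal(600)

# Live echosounder observations: a 50-sample window actually taken at index 213.
true_start = 213
observed = track_depths[true_start:true_start + 50] + 0.2 * rng.standard_normal(50)

def best_match(profile, window):
    """Slide the observed window along the charted profile and return the
    offset with the smallest mean-squared depth difference."""
    n, m = len(profile), len(window)
    errs = [np.mean((profile[i:i + m] - window) ** 2) for i in range(n - m + 1)]
    return int(np.argmin(errs))

est = best_match(track_depths, observed)
print(f"true start {true_start}, estimated {est} (units of 10 m along track)")
```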
3.2 Space-based Sensors

As of 2018 there were approximately 4,600 satellites in Earth orbit, of which nearly 2,000 were operational [UNOOSA 2018]. One report shows the growth in satellite launches increasing threefold over the next decade, with 3,323 satellites with a mass over 50 kg launched and to be launched between 2018 and 2027, compared to 1,019 satellites launched between 2008 and 2017 [Satnews 2019]. Many of these satellites, when supplemented with terrestrial signals, can provide precise positioning and timing information with up to 1 cm accuracy as part of the GNSS. Many other satellites are used in maritime operations to gather meteorological and oceanographic (METOC) data and terrestrial imaging. However, much of the increase in satellite launches represents a new generation of small satellites sent to low Earth orbit to create constellations of thousands that will provide ubiquitous global broadband access. This trend has already been noted with the announcement by Inmarsat that their worldwide Fleet Xpress service, launched in March 2016, had by early 2017 passed the 10,000-ship milestone [gCaptain 2017].
Table 1. Sensor Types and Data Classes amongst Maritime Surface, Subsea and Space Systems.
Broadband satellite connectivity is essential for communications to aid in the monitoring of MASS operations; the sharing of large volumes of sensor imagery, data and results of onboard decision-making processes; and the implementation of blockchain technology and big data applications to ensure safe and secure operations. This includes space-based sensing of AIS, METOC imagery and numerical datasets, and other sensors including Synthetic Aperture Radar to aid in non-cooperative surface feature and object detection, and Long Range Identification and Tracking data for vessels.
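As an illustration of the data carried by AIS, whether received aboard or from orbit, the following sketch shows the 6-bit ASCII "armoring" used by AIVDM payloads. To remain self-contained it encodes a message type and MMSI and then decodes them back; real messages carry many more fields (position, SOG, COG, etc.), and the MMSI used is a placeholder.

```python
# Minimal sketch (assumed, simplified): the 6-bit ASCII armoring of AIS
# AIVDM payloads. Encoding here exists only so the decode step can be
# demonstrated without fabricating recorded traffic.

def to_sixbit_char(v):
    """Map a 6-bit value (0..63) to the AIS payload character set."""
    return chr(v + 48 if v < 40 else v + 56)

def from_sixbit_char(c):
    """Inverse mapping: AIS payload character back to its 6-bit value."""
    v = ord(c) - 48
    return v - 8 if v > 40 else v

def encode(msg_type, repeat, mmsi):
    bits = f"{msg_type:06b}{repeat:02b}{mmsi:030b}"
    bits += "0" * (-len(bits) % 6)                      # pad to 6-bit boundary
    return "".join(to_sixbit_char(int(bits[i:i + 6], 2))
                   for i in range(0, len(bits), 6))

def decode(payload):
    bits = "".join(f"{from_sixbit_char(c):06b}" for c in payload)
    # Message type: bits 0-6; repeat indicator: bits 6-8; MMSI: bits 8-38.
    return int(bits[0:6], 2), int(bits[6:8], 2), int(bits[8:38], 2)

payload = encode(msg_type=1, repeat=0, mmsi=366123456)   # placeholder MMSI
print(payload, decode(payload))                          # -> (1, 0, 366123456)
```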
3.3 Sensor Data Types and Characteristics

Three classes of data are available from maritime sensors, comprising the pixel, time and frequency domains, which represent different perspectives of the environment [Wright 2018]. The pixel domain reflects a translation of a spatial quantity into a pixel representation. This occurs by capturing an image of a scene or object directly onto picture elements, or pixels, each of which contains an impression of the qualities of a small portion of the overall image. The original scene or object is reconstructed by means of reproducing the pixel impressions onto a display. This is the case for digital and infrared cameras and other visual sensors. Changes in imagery that occur as a function of time are reflected in the time domain. Different mathematical and statistical functions can be applied to pixel and time domain representations to extract data and correlate information regarding image content. Direct-to-pixel domain imagery is limited based upon the size and resolution of the sensor and can be enhanced using optical magnification and greater numbers of smaller pixels, as well as through the use of image filtering and software analytics.
Radar, sonar and LiDAR images are created using an entirely different process involving one or more transducers (antennas) that transmit and/or receive signals. These signals are subsequently converted into different domain representations. An example for radar is provided in Figure 2, where received waveforms are analyzed in the frequency and time domains (b, c) and processed to create a pixel domain representation (a).

Highly complex waveforms across many frequencies are projected onto a scene and are then modified through reflection and absorption based upon the physical and electrical characteristics of the objects within the scene. A portion of the transmitted signals is reflected back to and received by the transducer and analyzed as a function of changes that occur over time as well as changes detected in the frequency of the signal. Analysis of time and frequency domain signals is performed to acquire the information necessary to subsequently create a pixel domain image for display in the manners customary to radar, sonar and LiDAR.
While this indirect approach has proven to be highly accurate and reliable, it can result in a great deal of variability in how the targets and scene are displayed to the user based upon signal resolution and manufacturer user interface design preferences. A target may be represented as a "blip" on a radar screen and navigation sonar can paint a 3D surface model of bottom terrain, while LiDAR systems can display a highly accurate model of the terrain and quayside environment.

Figure 2. Chirp Waveform Variation over Time, with Resulting Pixel Domain Representation of the Local Environment.
Unlike imaging sensors, information contained within the received signal in the time and frequency domains is used to create the resulting pixel domain image based upon the properties of the waveforms being transmitted, the gain and resolution of the transducer elements, the sensitivity of the receiver and the capabilities of the software to analyze the reflected signals. The ability to actively interrogate targets using a wide range of waveforms provides greater flexibility to analyze their reflected signal properties across all data classes. Dynamic adjustment of waveform signal characteristics in real time based upon target properties, and greater capabilities in analyzing time and frequency domain datasets, continue to result in the retrieval of much greater information content than was previously possible. Recent examples in the case of sonar data include the acquisition of swath bathymetry from navigation sonar and other scientific data from high-resolution sidescan sonar imagery [FarSounder 2018, Wright 2017]. Similar advances have also occurred in other maritime applications, including improvements in solid-state Doppler radars.
4 EXPERIMENT PARAMETERS

Experiments performed using a combination of machine learning and deep learning AI techniques resulted in the acquisition, assessment and characterization of pixel, time and frequency domain representations of several different types of sensor data to enhance situational awareness for MASS operations. Specific examples illustrate the detection, identification and correlation of other ship and small vessel traffic and ATON to support safe navigation along a well-surveyed route according to modern standards of navigation. Efforts included the fusion of shipboard sensor data with information contained within navigation charts, local notices to mariners, tides and currents, and other information applicable to the voyages. The scope of experiments performed as part of continuing research was limited to a subset of the complete vessel sensor suite needed to develop, refine and evaluate shipboard data acquisition methods, data analytics and resulting information processes for autonomous navigation in preparation for future full-scale implementation on a research vessel test bed.
4.1 Experimental Setting and Conditions

The location of these experiments is in the Mid-Atlantic region on the east coast of the United States, within the Chesapeake Bay and its tributaries near Annapolis, Maryland. This area is frequented by cargo, freighter, passenger ship, special craft and other large vessels in transit to and from the Atlantic Ocean and the Port of Baltimore. There are also many small recreational vessels present in the area, especially during the summer months.

The transit route is approximately 11 nm long, beginning between buoys 87 and 88 on the eastern side of Chesapeake Bay 1 nm off of Kent Island, proceeding westward with Tolly Point Shoal (buoy 1AH) to starboard and the Naval Anchorage to port (buoy 2), then northwest up the Severn River past the city of Annapolis and the U.S. Naval Academy to a point ½ nm to the east of St. Helena Island in Little Round Bay [NOAA 12282]. This route ranges in depth from 31 meters in the east to 5 meters in the west, with an average depth of 8 meters along the final 9 nm of the route. Along the route are 26 buoys and fixed ATON, two bridges and several prominent landmarks in terms of buildings, domes and natural features that serve as ATON.
4.2 Vessel and Sensor Configurations

Participating in these experiments, serving as a test bed for sensor integration and fusion, is a 10-meter research vessel with 1-meter draft equipped with the following sensors:
- Furuno GP37 WAAS/DGPS receiver,
- Furuno 1954C 4 ft, 48 rpm radar with ARPA,
- Furuno GD1920C color video plotter,
- ICOM MA500T class B AIS,
- EchoPilot 3D forward-looking sonar,
- Lowrance HDS5 echosounder/fishfinder,
- FLIR MD625 thermal imaging camera, and
- Hikvision 8 MP ultra-low-light imaging camera.

Data communications are accomplished under the NMEA 0183 data bus architecture with direct image capture to a video data recorder, all under the control of a Dell Inspiron quad-core laptop, 2.3 GHz, with NVidia GT650M GPU, 8 GB RAM and 1 TB hard drive.
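A minimal sketch of handling sentences on such an NMEA 0183 bus, verifying the XOR checksum and splitting the fields of a GGA position fix and a DBT depth report (illustrative sentences, not recorded data), is:

```python
# Minimal sketch (assumed): validating and splitting NMEA 0183 sentences of
# the kind carried on the test bed's data bus. Sentences are examples only.
from functools import reduce

def nmea_fields(sentence):
    """Verify the XOR checksum of an NMEA 0183 sentence and return its
    comma-separated fields, raising if the checksum fails."""
    body, _, cksum = sentence.strip().lstrip("$!").partition("*")
    calc = reduce(lambda a, b: a ^ b, (ord(c) for c in body))
    if cksum and int(cksum, 16) != calc:
        raise ValueError("NMEA checksum mismatch")
    return body.split(",")

gga = nmea_fields("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
print("talker/type:", gga[0], "fix time:", gga[1], "lat:", gga[2], gga[3])

dbt = nmea_fields("$SDDBT,11.3,f,3.4,M,1.8,F*3B")
print("depth below transducer:", dbt[3], "meters")
```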
4.3 Reasoning Systems

A combination of machine learning and AI techniques was used to acquire, assess and characterize pixel domain imagery of vessels and ATON including buoys, bridges and prominent landmarks. Directed learning focusing on feature detection and classification was used to train a neural network to recognize various types of vessels and ATON. Optical character recognition (OCR) was used to positively identify individual buoys and fixed marks from video and infrared imagery with a 60-degree field of view that may be zoomed to 17 degrees for precise identification. Radar waveforms were analyzed using deep learning AI to help discern information beyond that available in the pixel domain radar image. This included analyzing changes in frequency, amplitude, phase and/or polarization. Lacking a direct interface to the radar system, many of these waveforms were simulated in the performance of this experiment. Imagery representative of basic signals and their many possible variations, along with metadata obtained from other sensors, was used in training neural networks to distinguish between targets.
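A minimal sketch of an OCR step of this kind, applied to a cropped camera frame using OpenCV and Tesseract (the file name, preprocessing and configuration are placeholders, not those actually used in the experiments), is:

```python
# Minimal sketch (assumed): reading a buoy's painted designation from a
# cropped camera frame with OpenCV preprocessing and Tesseract OCR.
import cv2
import pytesseract

frame = cv2.imread("buoy_crop.png")                      # placeholder image
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
gray = cv2.resize(gray, None, fx=2.0, fy=2.0)            # upsample small text
_, bw = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Restrict Tesseract to buoy-style designations (digits and capital letters).
text = pytesseract.image_to_string(
    bw, config="--psm 7 -c tessedit_char_whitelist=0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ")
print("candidate designation:", text.strip())
```

In practice, as noted in Section 6, a designation would be accepted only when consistent across many consecutive video frames.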
Essential to the analysis of various images and signals is the creation of large datasets that are representative of potential objects and waveforms. These datasets consist of two parts, a comprehensive dataset and a limited dataset, with the latter being a subset of the former. The comprehensive dataset is shared for use in supervised learning in the development and refinement of statistical processes and for unsupervised learning in the development of neural network processes. It includes complete numerical data providing imagery and descriptions of imagery components encompassing scale, range, units and other factors. The limited dataset contains only objects and waveforms and is used for unsupervised learning during neural network development. This training dataset consists of thousands of representative objects and signal waveforms of various resolutions, frequencies, bandwidths, sample rates and complexity.
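A minimal PyTorch sketch of training on such a dataset by fine-tuning a pretrained ResNet-50 (the architecture reported in Section 5) with images organized one folder per class is given below; paths, class count, epochs and hyperparameters are placeholders rather than the settings used in the experiments:

```python
# Minimal sketch (assumed): transfer learning with ResNet-50 for ATON and
# vessel image classification. All paths and hyperparameters are placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

tf = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])
train_set = datasets.ImageFolder("data/aton/train", transform=tf)  # placeholder path
loader = DataLoader(train_set, batch_size=32, shuffle=True)

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))  # new head

opt = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
model.train()
for epoch in range(5):                        # a few epochs for illustration
    for images, labels in loader:
        opt.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")
```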
5 EXPERIMENT RESULTS

Our initial neural network configuration consisted of the ResNet-50 architecture, with which we achieved ATON object and signal identification accuracy rates of between 93.22% and 97.55%. Subsequent use of a Convolutional Neural Network (CNN) resulted in enhanced results ranging from 98.34% to 99.97% accuracy. Further improvements were achieved through adjustments to existing CNN layers and adding new layers tailored specifically to features and attributes associated with vessels and ATON. Adjustments of learning rates, weight factors and other CNN characteristics also improved training speed and accuracy. The primary CNN architecture for the pixel domain was an AlexNet design consisting of 27 different layers tailored to and adjusted for object recognition. The primary CNN architecture for the time and frequency domains was an AlexNet design consisting of 29 different layers tailored to and adjusted for signal recognition in those domains.
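For the signal domains, a greatly reduced stand-in for such a design is a small one-dimensional convolutional network over fixed-length waveform windows, sketched below with illustrative layer sizes (nothing like the 29-layer design reported above):

```python
# Minimal sketch (assumed, PyTorch): a small 1-D CNN for time-domain signal
# classification. Layer sizes and class count are illustrative placeholders.
import torch
import torch.nn as nn

class SignalCNN(nn.Module):
    def __init__(self, n_classes=4, n_samples=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classify = nn.Linear(64 * (n_samples // 64), n_classes)

    def forward(self, x):                    # x: (batch, 1, n_samples)
        z = self.features(x)
        return self.classify(z.flatten(1))

net = SignalCNN()
batch = torch.randn(8, 1, 1024)              # 8 random waveform windows
print(net(batch).shape)                      # -> torch.Size([8, 4])
```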
All ATON along the route were detected and identified by the CNN as being of the appropriate type (nun, can, fixed mark, dome, building, etc.) and having proper characteristics (red, green, numbers, letters, etc.). ATON position correlation was made using visual imagery, the electronic navigation chart and the radar target display for all ATON within visual range, and within sonar range using forward-looking navigation sonar as an additional sensor. Positive identification of a specific buoy or aid number occurred for 18 of the 27 occurrences, and for two bridges, one dome and three buildings. Positive identification occurred for three vessels and one AIS-ATON, with correct position correlation made using radar. Radar waveform variations corresponding to vessel sizes and configurations were observed, along with ARPA correlation of heading and speed vectors.
Route selection by best waters considering vessel draft and route directness was confirmed by forward-looking navigation sonar bottom topography, with temporary deviation from the planned course necessary for vessel avoidance on three occasions. Variation of live echosounder depth measurements over the tracked course was within expectations [Wright and Baldauf 2016a], with notable shoaling evident on the navigation sonar near two bars.
6 DISCUSSION

CNN performance in ATON, landmark and vessel detection and identification was demonstrated in combination with radar and AIS target correlation and echosounder bottom terrain tracking over a transit route with complex features including large and small vessel traffic and land masses. Forward-looking navigation sonar provided ATON position verification when they were within its effective range of 45 to 200 meters, which changes depending upon water depth. All ATON appear to be in their assigned positions considering variation within their proper watch circle due to the effect of wind and tides.
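A minimal sketch of such a watch-circle check, comparing an observed buoy position with its assigned position (placeholder coordinates and radius; a real watch radius depends on depth and mooring scope), is:

```python
# Minimal sketch (assumed): flagging a buoy as on/off station by comparing
# its observed position with its assigned position and watch circle radius.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def on_station(observed, assigned, watch_radius_m):
    """True if the observed buoy position lies within its watch circle."""
    return haversine_m(*observed, *assigned) <= watch_radius_m

print(on_station((38.95210, -76.44720), (38.95200, -76.44700), watch_radius_m=40.0))
```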
OCR was found to be an effective method for positive ATON identification during daylight hours, and during nighttime with video, low-light and IR sensing, when buoy designations were within the camera field of view and not obscured by other vessels, heavy rain and other factors. Positive identification was reinforced through consistency within multiple hundreds of video frames. The primary factor in the failure of this method was in cases where the buoy identification was oriented away from the camera and not viewable due to buoy rotation.
The combination of video, IR, radar and ARPA, and navigation sonar sensing, in decreasing levels of resolution, provided nearly 100% detection of marine targets relevant to the vessel's route of transit and in determining vessel speed adjustments and alternate routes for collision avoidance. Significant exceptions occurred in cases of small vessels and watercraft demonstrating erratic behavior along with unpredictable changes in course and speed. ENC and navigation sonar provided below-the-waterline awareness of expected and actual environmental conditions to aid in alternate route determination.
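A minimal sketch of this kind of cross-sensor correlation, using simple nearest-neighbor gating over detections expressed in local east/north meters (positions, sensors and gate distance are illustrative, not the method actually implemented), is:

```python
# Minimal sketch (assumed): greedy nearest-neighbor gating to decide whether
# a video detection, a radar/ARPA contact and a sonar return refer to the
# same target. Coordinates are local east/north meters relative to the vessel.
import math

def associate(detections, gate_m=50.0):
    """Merge per-sensor detections whose positions fall within a common gate
    distance; returns a list of fused target position estimates."""
    fused = []
    for sensor, x, y in detections:
        for target in fused:
            cx, cy = target["x"], target["y"]
            if math.hypot(x - cx, y - cy) <= gate_m:
                target["members"].append(sensor)
                n = len(target["members"])
                target["x"] += (x - cx) / n      # running mean of positions
                target["y"] += (y - cy) / n
                break
        else:
            fused.append({"x": x, "y": y, "members": [sensor]})
    return fused

obs = [("video", 120.0, 305.0), ("radar", 128.0, 298.0),
       ("sonar", 125.0, 310.0), ("radar", -400.0, 90.0)]
for t in associate(obs):
    print(f"target at ({t['x']:.0f}, {t['y']:.0f}) m seen by {t['members']}")
```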
7 CONCLUSIONS

The use of CNN for visual ATON, landmark and vessel detection and identification, when combined with radar target correlation and navigation sonar/echosounder bottom terrain tracking, appeared sufficient for safe and reliable navigation under limited experimental conditions. Consideration should be given to ATON design enhancements that may better facilitate machine recognition of their characteristics and positive identification of individual buoys. A combination of multi-sensing modalities to achieve comprehensive situational awareness both above and below the waterline appeared to be effective in real-time alternative course planning, especially in the case of shoaling conditions not evidenced on the ENC.
The use of multiple redundant sensor systems to overcome the limitations and vulnerabilities of individual sensor systems was evaluated in simulations performed using data recorded during the experiment. Use of a single-beam echosounder for bottom terrain tracking proved effective in overcoming loss of GNSS capability, but was limited by the resolution and placement of the soundings in the ENC. High-resolution bathymetry contained within ENCs acquired using multibeam echosounders and/or navigation sonar has already been shown to be an effective remedy to this problem [Wright and Baldauf 2016b].
The results of this experiment were achieved with sensors having limited fields of view. Significant improvements in safety and reliability can be achieved through 360-degree detection of ATON and potential hazards and threats, augmented with identification using high-resolution video and IR sensing that may be directed at specific objects and features of interest.
A significant limitation of this experiment was the lack of direct availability of radar and sonar sensor signal data in the time and frequency domains. Future experiments will further explore the direct acquisition, analysis and use of these data in combination with other sensor modalities. This will include integrating sensor modalities to aid in object and threat detection with immediate route planning and maneuvering to avoid such occurrences.

Another limitation is in the bandwidth of existing NMEA data bus architectures to support very large numbers of sensors in terms of both data and imagery. This may be remedied in part by the proposed National Marine Electronics Association (NMEA) OneNet open standard based on Internet Protocol version 6 (IPv6) and the IEEE 802.3 Ethernet Local Area Network. The results of a Radar Working Group within OneNet developing radar messages on the network will be of keen interest in determining its potential in this regard.
ACKNOWLEDGEMENT

This research is funded in part by the U.S. Naval Air Warfare Center, Lakehurst, New Jersey, USA, under contract N68335-18-C-0355.
REFERENCES

Allianz 2012. Safety and Shipping 1912-2012: From Titanic to Costa Concordia. Allianz Global Corporate & Specialty. March 2012. 3. www.agcs.allianz.com/PDFs/Reports.
CBI 2018. Digitizing the First Mile: Technology and Shipping. CBInsights. 2018. http://support.citrixonline.com/en_US/Webinar
COLREGS Rule 5. International Regulations for Preventing Collisions at Sea. 1972. Rule 5, Look-out.
COLREGS Rule 7. Ibid. Rule 7, Risk of collision, (b) and (c).
FarSounder 2018. FarSounder Joins NOAA as a Trusted Node. Press Release. October 17, 2018. www.farsounder.com/about/press_releases.
gCaptain 2017. Connected at Sea: Inmarsat's New High-Speed Broadband Service Hits 10,000-Ship Milestone. May 4, 2017. http://gcaptain.com/fleet-xpress-exceeds-10000-ship-milestone-first-anniversary/
IALA 2014. IALA International Dictionary of Aids to Marine Navigation, cited in IALA NAVGUIDE Aids to Navigation Manual. 2014. Seventh Edition. 30.
IMO 2015. Third International Maritime Organization Greenhouse Gas Study. Section 3: Scenarios for shipping emissions 2012-2050. 18.
Minter 2017. Adam Minter. Autonomous Ships Will Be Great. Bloomberg. May 16, 2017. https://www.bloomberg.com/opinion/articles/2017-05-16/autonomous-ships-will-be-great.
MSC 98/20/2. Work Programme. Maritime Autonomous Surface Ships Proposal for a Regulatory Scoping Exercise. Regulatory Scoping Exercise for the Use of Maritime Autonomous Surface Ships (MASS). MSC 98/20/2. 27 February 2017.
NOAA 12282. Chesapeake Bay, Severn and Magothy Rivers. NOAA Chart 12282.
Satnews 2019. Euroconsult Report Focuses on Satellites to be Built and Launched by 2027. Satnews Daily. January 26, 2019. http://satnews.com/story.php?number=2091711277
Stewart 2018. Stewart, J. (paraphrase). Rolls-Royce Wants to Fill the Seas with Self-Sailing Ships. Wired. 15 October 2018. https://www.wired.com/story/rolls-royce-autonomous-ship/.
UNOOSA 2018. Annual Report 2017. United Nations Office for Outer Space Affairs. Vienna. March 2018. 5.
Wright 2017. Scientific Data Acquisition using Navigation Sonar. IEEE/MTS Oceans Conf. Anchorage, AK. Sept. 2017.
Wright 2018. Signals Intelligence Automated Assessment of Test Capabilities. IEEE Automatic Testing Conference. Washington, DC. September 2018.
Wright and Baldauf 2016. Wright, R. Glenn and Baldauf, Michael. Virtual Electronic Aids to Navigation for Remote and Ecologically Sensitive Regions. The Journal of Navigation. The Royal Institute of Navigation. 2016. doi:10.1017/S0373463316000527.
Wright and Baldauf 2016a. Wright, R. Glenn and Baldauf, Michael. Correlation of Virtual Aids to Navigation to the Physical Environment. TransNav, the International Journal on Marine Navigation and Safety of Sea Transportation. Vol. 10, No. 2. 2016.
Wright and Baldauf 2016b. Wright, R. Glenn and Baldauf, Michael. Hydrographic Survey in Remote Regions: Using Vessels of Opportunity Equipped with 3-dimensional Forward-Looking Sonar. Journal of Marine Geodesy. Vol. 39, No. 6. DOI 10.1080/01490419.2016.1245266. 339-357.
Wright and Russell 2017. Wright, R. Glenn and Ian Russell. Navigation Sonar Use in Maritime Frontier Exploration. Soundings. Hydrographic Society of the UK. Summer 2017. 33-36.