though, and during this time, the system must handle these actions autonomously.
The work described by this paper has focused on
how to use remote operation to improve positioning
accuracy for small affordable vessels. Unmanned
ground vehicles (UGV) have, for many years, been
teleoperated to master harsh environments during,
e.g., military or search and rescue (SAR) missions [5]–
[7]. Small autonomous vessels at sea are also
essential, and Murphy believes they will play an
important role during future SAR operations [8]. The
challenges with remote control and positioning are
similar for small and large ships. However, the
communication link’s throughput sets a limitation on
smaller, more affordable vessels, as they cannot carry
a large satellite antenna due to size, weight, and cost
constraints. This limitation makes streaming video
and transmitting high-resolution images infeasible.
For the positioning problem, we have, for the same
reason, confined ourselves to affordable navigation
sensors.
Figure 1. A participant of the user study taking a bearing
by pointing towards an augmented landmark.
The positioning system is built upon our previous
implementation with terrain-aided navigation (TAN),
presented in [9]. In that work, we estimated the
position from a real-world field trial by comparing
the measured bottom depth and magnetic intensity
with available maps. To enhance the position
accuracy further, we manually measured bearings to
landmarks from the recorded 360° image, making it
possible for the positioning tool to adjust the position
estimate accordingly. This cannot be done manually
on an unmanned ship. In this new work, a user
instead measures these bearings from a teleoperation
system in virtual reality (VR), see Figure 1.
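To make the map-matching idea concrete, the sketch below illustrates one common TAN approach, a particle-filter measurement update that weights candidate positions by how well a measured depth sounding matches a charted depth map. The paper does not specify its estimator here; the map, noise levels, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical charted bottom depth (m) on a 10 m grid; a real system would
# read this from a sea chart or survey database.
x_grid = np.linspace(0, 1000, 101)                      # east (m)
y_grid = np.linspace(0, 1000, 101)                      # north (m)
depth_map = 20 + 10 * np.sin(x_grid[None, :] / 150) + 5 * np.cos(y_grid[:, None] / 200)

def map_depth(x, y):
    """Nearest-neighbour lookup of charted depth at position (x, y) in metres."""
    i = np.clip(np.round(np.asarray(y) / 10).astype(int), 0, 100)
    j = np.clip(np.round(np.asarray(x) / 10).astype(int), 0, 100)
    return depth_map[i, j]

# Candidate positions (particles) and one noisy echo-sounder sounding.
n = 5000
particles = rng.uniform(0, 1000, size=(n, 2))
true_pos = np.array([400.0, 650.0])
measured = map_depth(*true_pos) + rng.normal(0, 0.3)

# Measurement update: weight each particle by the Gaussian likelihood of the
# sounding given the charted depth at that particle's position.
sigma = 0.5                                             # sounding noise std (m)
residual = map_depth(particles[:, 0], particles[:, 1]) - measured
w = np.exp(-0.5 * (residual / sigma) ** 2)
w /= w.sum()

estimate = w @ particles                                # weighted-mean position
```

A single sounding only narrows the position to a depth contour; a real TAN system fuses many soundings (and, as here, magnetic intensity) over time to converge on a unique position.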
The teleoperation system also builds on our
previous work, presented in [10], [11]. This work
focused on developing a teleoperation tool with a
low cognitive load that could provide good
situational awareness (SA), leading to better safety for
the vessel. In the work described in the latter paper,
we developed a specific GUI to compare the
performance when using VR, 3D visualization on a
laptop, and 2D visualization on a laptop. In this
earlier study, we observed that the longer time
available for decisions at sea, measured in seconds or
minutes, makes maritime operations well suited for
teleoperation. This contrasts with the fast dynamics
of traffic situations for cars and airplanes, often
measured in milliseconds, which are reported as
challenging domains for teleoperation, mainly due to
their vulnerability to long latency [12], [13]. Several
research papers propose methods that mitigate the
latency problem by compensating for or predicting
the teleoperated vehicle’s pose [14]–[16]. We use this
knowledge to predict our current
position based on heading, speed, and the received
estimated position from the remote vessel. We
concluded in our previous study that 3D, and
especially VR, gave the best performance. VR can
strengthen the visualization, and thereby the total
communication between the machine and the human
[17]. It has also been shown that VR can enhance SA
when driving a remote car [18], [19]. Given the good
results for VR in our previous work, we use only VR
in the current work. Here we have rebuilt
the GUI to evaluate how teleoperation can support
navigation, and more specifically, the TAN
application. We base the user evaluation on
recordings from a field trial to make the user
experience as realistic as possible.
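The prediction step mentioned above can be sketched as simple dead reckoning: advance the last reported position along the reported heading at the reported speed for the duration of the link latency. The flat-earth conversion and the function name below are our own illustrative choices, not the paper's implementation.

```python
import math

def predict_position(lat_reported, lon_reported, heading_deg, speed_mps, latency_s):
    """Dead-reckon the vessel's current position from the last reported
    position, heading, and speed, compensating for communication latency.
    Flat-earth approximation, valid for short prediction horizons."""
    d = speed_mps * latency_s                     # distance travelled (m)
    theta = math.radians(heading_deg)             # compass heading, 0 = north
    dn = d * math.cos(theta)                      # northward displacement (m)
    de = d * math.sin(theta)                      # eastward displacement (m)
    R = 6_371_000.0                               # mean Earth radius (m)
    lat = lat_reported + math.degrees(dn / R)
    lon = lon_reported + math.degrees(de / (R * math.cos(math.radians(lat_reported))))
    return lat, lon

# Example: 2 s of latency at 5 m/s heading due east shifts the fix ~10 m east.
lat, lon = predict_position(59.0, 18.0, 90.0, 5.0, 2.0)
```

For the decision timescales at sea (seconds to minutes), this constant-velocity assumption is usually adequate between position updates.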
One of our main objectives has been to give the
user an immersive experience that provides good SA.
To gain trust in the system’s ability to navigate, it is
essential that the user gets a good overview and can
instantly determine whether the position is estimated
correctly. When
navigating onboard a manned vessel, the usual way
of doing this is to try to match the real-world terrain
with objects on the sea chart or radar and try to judge
if the directions and ranges coincide. The mental
rotations needed for this task are difficult for a
human to perform [20], [21], and we believe it is even
more challenging to do remotely, i.e., by comparing
what is seen on a video screen with what is seen on
the sea chart. Porathe concluded that, on manned
vessels, it is better to guide operators by visualizing
a 3D map oriented to match the user’s view of the
surrounding world [20]. As Figure 1 shows, we have
designed our GUI in line with this research: the user
can easily and instantly see whether the real world
matches the 3D world, and thereby whether the
position estimate is correct.
Moreover, if the system’s position estimate is not
entirely accurate, the user can enhance the accuracy
by providing new bearing updates to the positioning
system.
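As a concrete illustration of how such bearing updates can constrain position, the sketch below computes a least-squares fix from compass bearings to known landmarks: each bearing defines a line of position through its landmark, and the fix minimizes the squared perpendicular distance to all such lines. This is a textbook construction, not the paper's actual update equations; all names are hypothetical.

```python
import numpy as np

def fix_from_bearings(landmarks, bearings_deg):
    """Least-squares position fix from compass bearings to known landmarks.

    Coordinates are local (east, north) in metres. Requires at least two
    non-parallel bearings, otherwise the normal equations are singular.
    """
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for (e, n), beta in zip(landmarks, bearings_deg):
        t = np.radians(beta)
        u = np.array([np.sin(t), np.cos(t)])      # direction observer -> landmark
        nv = np.array([-u[1], u[0]])              # normal to the line of position
        A += np.outer(nv, nv)                     # accumulate normal equations
        b += nv * (nv @ np.array([e, n]))
    return np.linalg.solve(A, b)

# Two exact bearings intersect at the true position: a landmark 100 m north
# bears 000°, and a landmark 100 m east bears 090°, so the fix is the origin.
landmarks = [(0.0, 100.0), (100.0, 0.0)]
pos = fix_from_bearings(landmarks, [0.0, 90.0])
```

In the actual system, such bearing measurements would enter the positioning filter as additional measurement updates rather than a standalone fix.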
Our main contribution is a GUI design for ship
teleoperation that provides good situational
awareness while meeting the throughput limitations
of ships with low-bandwidth connections. We have
shown that users found the GUI simple to use while
maintaining a good overview of the situation. When
the positioning system estimated an inaccurate
position, the users could react to this instantly.
Furthermore, we have shown that our TAN
application can be supported remotely by an operator
taking bearings to landmarks.
This paper is organized as follows: Section II
describes the Implementation and Method of the
project, including the design of the applications in
Subsection II-A, the Field Trial in Subsection II-B, and
the User Study in Subsection II-C. The results are
given in Section III, followed by the Discussion and
Conclusion in Sections IV and V.
2 IMPLEMENTATION AND METHOD
This section describes how the software for the
teleoperation tool and the positioning tool, called