in the third row, the stacked cargo (red area) on the
ship is clearly and completely segmented by MSIANet,
while other methods produce blurred and irregular
edges. This is attributed to the introduction of the edge-
guided branch, which allows the model to focus more
on edge regions. Additionally, in the second row, the
background mountain (yellow area) and water surface
(green area) are clearly distinguished in MSIANet's
segmentation results without significant
misclassification. In contrast, other methods, such as
STDCNet and DDRNet, exhibit discontinuous
boundaries in the mountain area. The dynamic feature
fusion module enables the model to efficiently capture
multi-scale contextual information, maintaining
segmentation accuracy in complex backgrounds. In the
first and third rows, MSIANet produces more uniform
segmentation results across multiple categories (e.g.,
water surfaces, ships, and mountains), neither
overemphasizing any single category nor neglecting
small targets. This validates the effectiveness of the
importance-weighted loss function in handling class
imbalance and segmenting key objects. In summary,
the analysis of these specific details shows that
MSIANet significantly outperforms other models in
segmenting critical regions. This further demonstrates
the effectiveness of the proposed method for semantic
segmentation tasks in waterborne navigation
scenarios.
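To make the role of the dynamic feature fusion module more concrete, a minimal PyTorch-style sketch of dynamically weighted multi-scale fusion is given below. The module name, dilation rates, and the residual connection are illustrative assumptions and do not reproduce the exact DFFM design described in this paper.

```python
import torch
import torch.nn as nn

class DynamicMultiScaleFusion(nn.Module):
    """Illustrative sketch of dynamic multi-scale feature fusion.

    Parallel dilated convolutions extract context at several scales, and a
    lightweight gate predicts input-dependent weights that decide how the
    scales are combined. Dilation rates and channel handling are assumptions,
    not the paper's exact DFFM configuration.
    """

    def __init__(self, channels, dilations=(1, 3, 5)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            )
            for d in dilations
        ])
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                 # global context
            nn.Conv2d(channels, len(dilations), 1),  # one score per branch
            nn.Softmax(dim=1),                       # normalized branch weights
        )

    def forward(self, x):
        feats = torch.stack([b(x) for b in self.branches], dim=1)  # N x B x C x H x W
        weights = self.gate(x).unsqueeze(2)                        # N x B x 1 x 1 x 1
        return (weights * feats).sum(dim=1) + x                    # weighted fusion + residual
```

A gate of this kind lets the network lean on coarse context in cluttered backgrounds (e.g., mountains and shorelines) and on fine detail around small obstacles, which is consistent with the behaviour discussed above.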
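Similarly, the sketch below illustrates the general idea of an importance-weighted cross-entropy loss that emphasizes safety-critical classes; the class list and weight values are hypothetical and are not the weights used in this work.

```python
import torch
import torch.nn.functional as F

def importance_weighted_ce(logits, target, class_importance, ignore_index=255):
    """Cross-entropy with per-class importance weights (illustrative sketch).

    logits: N x C x H x W raw scores; target: N x H x W class indices.
    class_importance: length-C tensor in which larger values emphasize
    critical classes such as ships and small obstacles. The values used
    below are assumptions, not those of the paper.
    """
    # Normalize so the average weight is 1, keeping the loss scale stable.
    weight = class_importance * len(class_importance) / class_importance.sum()
    return F.cross_entropy(logits, target, weight=weight, ignore_index=ignore_index)

# Hypothetical 4-class example: background, water, ship, obstacle.
importance = torch.tensor([1.0, 1.0, 2.0, 3.0])
loss = importance_weighted_ce(torch.randn(2, 4, 64, 64),
                              torch.randint(0, 4, (2, 64, 64)),
                              importance)
```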
4 CONCLUSION
This paper proposes a real-time semantic segmentation
method for waterborne navigation scenarios, called
MSIANet, which enhances multi-scale information
and incorporates importance weighting to improve
segmentation accuracy in complex aquatic
environments while maintaining real-time
performance. By introducing an edge-guided branch
and a Dynamic Feature Fusion Module (DFFM), the
model's ability to perceive multi-scale information is
significantly enhanced. Additionally, an improved
loss function based on importance weighting is
designed to increase the model's focus on critical
objects in waterborne navigation scenarios. A
parameter-free attention mechanism is also integrated
into the decoder to combine regional information from
the encoder and semantic information from the
decoder, restoring spatial details in the image and
guiding the model to focus on more critical objects.
On the On_Water dataset and the SeaShips dataset
constructed in this study, the proposed method
achieves segmentation accuracies of 83.1% and 73.2%,
respectively, with an inference speed of 69.1 frames
per second and only 5.76M parameters.
Considering the balance between parameters,
accuracy, and speed, the proposed algorithm
outperforms other lightweight networks in identifying
targets such as ships and small-sized obstacles, making
it more suitable for waterborne navigation scenarios.
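For reference, one widely used parameter-free attention formulation (a SimAM-style energy weighting) is sketched below; whether MSIANet adopts exactly this formulation is an assumption here, so the snippet should be read as an illustration of the general mechanism rather than the paper's implementation.

```python
import torch

def parameter_free_attention(x, eps=1e-4):
    """SimAM-style parameter-free attention (illustrative sketch).

    Each activation in the N x C x H x W feature map x is reweighted by a
    sigmoid of its inverse energy, computed from its squared deviation from
    the per-channel spatial mean; no learnable parameters are introduced.
    """
    n = x.shape[2] * x.shape[3] - 1
    d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)   # squared deviation
    v = d.sum(dim=(2, 3), keepdim=True) / n              # spatial variance estimate
    e_inv = d / (4 * (v + eps)) + 0.5                    # inverse energy per activation
    return x * torch.sigmoid(e_inv)                      # emphasize distinctive activations
```

In a decoder, such a reweighting can be applied to the fused encoder-decoder features before upsampling, matching the role described above of restoring spatial detail while highlighting critical objects.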
LIMITATIONS AND FUTURE DIRECTIONS
This paper has demonstrated advancements in real-
time semantic segmentation for waterborne navigation
scenarios. However, the rapid evolution of intelligent
shipping and unmanned vessel technology
necessitates enhanced perception capabilities for
increasingly complex environments and demanding
applications. Consequently, further optimization of
current methodologies remains crucial. Future
research should focus on the following key areas:
1. Enhancing Model Robustness and Generalization
through Multi-Modal Integration: Future work
should focus on integrating multi-modal perception
methods to improve the model's robustness and
adaptability across diverse water environments and
in the presence of dynamic targets.
2. Optimizing Perception of Dynamic Scenes and
Small Objects: Further research is needed to
enhance the model's ability to accurately segment
dynamic objects and small-scale targets on the
water surface, potentially through temporal
information integration and improved feature
representation.