Integrated Application Research on Marine Image Recognition Models

Chih-Chen Kao

School of Intelligent Manufacturing, Shanghai Zhongqiao Vocational and Technical University, Shanghai 201514, China

Yu-Fen Peng

Physical Education Office, Yuanpei University of Medical Technology, Hsinchu 30015, Taiwan

Bo-Wen Wu

Department of Optometry, College of Medical Technology and Nursing, Yuanpei University of Medical Technology, Hsinchu 30015, Taiwan

DOI: https://doi.org/10.36956/sms.v7i2.1915

Copyright © 2025 Chih-Chen Kao, Yu-Fen Peng, Bo-Wen Wu. Published by Nan Yang Academy of Sciences Pte. Ltd.

This is an open access article under the Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0) License.


Abstract

Marine environments present significant challenges for image processing due to factors such as low light intensity, suspended particles, and varying degrees of water turbidity. These conditions severely degrade the clarity and quality of captured marine images, making accurate image recognition difficult. The problem is further compounded by the limited availability of high-quality, labeled training samples, which restricts the effectiveness of conventional recognition algorithms. Existing techniques in both academic and industrial settings—such as Principal Component Analysis (PCA), Neural Networks, and Wavelet Transforms—typically involve converting color images to grayscale prior to feature extraction. While this simplifies processing, it also results in the loss of essential color information, which is often critical for distinguishing features in marine imagery. To address these issues, this paper proposes a novel approach that preserves and utilizes the full color information of marine images during processing and recognition. The method combines color image representation with Hu's invariant moments to extract stable and rotation-invariant features. These features are then input into a Back Propagation Neural Network (BPNN), which is trained to recognize and classify various marine targets. The integration of color-based feature extraction with BPNN significantly improves recognition performance, particularly under complex environmental conditions. Experimental results show that the proposed system achieves a recognition accuracy exceeding 98%, demonstrating its effectiveness and potential for practical applications in marine exploration, environmental monitoring, and underwater robotics.

Keywords: Marine Image Color Preprocessing; Pattern Recognition; BPNN; Invariant Moments
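Below is a minimal sketch of the pipeline outlined in the abstract: the seven Hu invariant moments are computed separately for each color channel (so that color information is preserved rather than discarded by grayscale conversion), concatenated into a single feature vector, and classified by a small backpropagation-trained neural network. The library choices (OpenCV, scikit-learn), the per-channel 3 × 7 feature layout, and the network size are illustrative assumptions; the abstract does not specify the authors' exact preprocessing or architecture.

```python
# Illustrative sketch only: per-channel Hu invariant moments as
# color-preserving features, classified by a backpropagation-trained
# multilayer perceptron. Library choices and hyperparameters are
# assumptions, not the authors' reported configuration.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler


def color_hu_features(image_bgr: np.ndarray) -> np.ndarray:
    """Return a 21-D vector: 7 Hu moments for each of the B, G, R channels."""
    feats = []
    for channel in cv2.split(image_bgr):  # process each color channel separately
        hu = cv2.HuMoments(cv2.moments(channel)).flatten()
        # Log-scale the moments to compress their large dynamic range.
        feats.append(-np.sign(hu) * np.log10(np.abs(hu) + 1e-30))
    return np.concatenate(feats)


def train_bpnn(images: list[np.ndarray], labels: list[int]):
    """Train a small MLP (trained by backpropagation) on the Hu-moment features."""
    X = np.vstack([color_hu_features(img) for img in images])
    model = make_pipeline(
        StandardScaler(),
        MLPClassifier(hidden_layer_sizes=(32,), activation="logistic",
                      max_iter=2000, random_state=0),
    )
    model.fit(X, np.asarray(labels))
    return model
```

Because each Hu moment is invariant to translation, scale, and rotation, the concatenated per-channel vector supplies the classifier with features that stay stable under the pose variations typical of underwater imagery while still reflecting color differences between targets.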

