
ISSN 2095-1388

Supervised by: Department of Education of Liaoning Province

Sponsored by: Dalian Ocean University

A lightweight fish object detection method improved based on the YOLOv7 model

Citation: MEI Haibin, HUANG Zheng, YUAN Hongchun. 2023. A lightweight fish object detection method improved based on the YOLOv7 model. Journal of Dalian Ocean University, 38(6): 1032-1043. doi: 10.16535/j.cnki.dlhyxb.2023-085


  • Fund project:

    National Natural Science Foundation of China (61972240)

Details
    About the author:

    MEI Haibin (1973-), male, associate professor. E-mail: hbmei@shou.edu.cn

  • CLC number: S 977; TP 391.4

  • To address the problem that fish detection and identification in the electronic monitoring systems of commercial fishing vessels rely on manual work, a lightweight real-time fish detection model based on YOLOv7, named YOLOv7-MRN, is proposed. The YOLOv7 backbone is replaced with a MobileNetv3 backbone to reduce computation, and a receptive field block (RFB) is added to strengthen the network's feature extraction capability; the neck feature-fusion network is redesigned by introducing the normalization-based attention module (NAM) to suppress insignificant weights. The results show that on the HNY768 fishery dataset of electronic monitoring video from ocean-going fishing vessels, the YOLOv7-MRN model achieves an mAP@0.5 of 86.5% with only 9.8% of the computation of the original YOLOv7, and its inference speed on GPU and CPU is improved by 121.69% and 219.09%, respectively; compared with other models, YOLOv7-MRN delivers better detection in practice, especially in strong-sunlight scenes. The study shows that the proposed YOLOv7-MRN model detects fish accurately while consuming fewer computing resources, and can be deployed in electronic fishing-vessel monitoring systems.
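The NAM module mentioned in the abstract reweights feature channels by the scale factors of a batch-normalization layer, so that channels with small scale factors are suppressed. As a rough illustration only (not the authors' code: the BN normalization step itself is omitted for brevity, and `nam_channel_attention` is a hypothetical helper), the channel gating can be sketched as:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nam_channel_attention(x, gamma):
    """Gate a feature map by normalized BatchNorm scale factors.

    x     : feature map of shape (C, H, W)
    gamma : per-channel BatchNorm scale factors, shape (C,)
    """
    # Channels whose BN scale factor is small are treated as less
    # informative and receive a proportionally smaller weight.
    w = np.abs(gamma) / np.abs(gamma).sum()
    # Sigmoid gate scaled per channel, applied back to the input.
    gated = sigmoid(x * w[:, None, None])
    return x * gated
```

In the paper's design this gating replaces heavier attention blocks in the neck network, which is part of how the model stays lightweight.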

Publication history
Received: 2023-04-19
