A long-distance pedestrian small target detection method

Authors: SHI Xin, LU Hao, QIN Pengjie, LENG Zhengli

CLC number: TH701; TP183

Fund project: Supported by the Joint Funds of the National Natural Science Foundation of China (U1813216)

Abstract:

Small pedestrian targets at long distance occupy only a few pixels and lack texture information, so deep convolutional neural networks struggle to extract their fine-grained features and to recognize and detect them accurately. This paper proposes a long-distance pedestrian small target detection method. First, shallow features are introduced into YOLOv4 to improve the feature pyramid and extract fine-grained features of small pedestrian targets, and a gravity-model-based adaptive feature fusion method is proposed to strengthen the correlation among multi-level semantic information and prevent the loss of small target feature information. Then, an enhanced super-resolution generative adversarial network (ESRGAN) is used to enrich the features of small pedestrian targets and improve detection accuracy. Finally, small pedestrian targets occupying 0.004%~0.026% of the image pixels are selected to build an experimental data set, and the method is compared with Faster RCNN, ION, and YOLOv4. The results show that the proposed method improves mAP0.5 by 25.2%, 26.3%, and 11.9%, respectively, while running at 24 FPS. The method has important application value in long-distance security monitoring and surveillance.
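The abstract does not give the gravity-model formulation itself, so the following is only a minimal PyTorch sketch of the general idea it describes: fusing shallow and deep feature-pyramid levels with learned, normalized per-level weights so that fine-grained information from the shallow layers is not lost. The class and parameter names here (AdaptiveFusion, level_logits) are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of adaptively weighted multi-level feature fusion
# (not the paper's exact gravity-model method).
import torch
import torch.nn as nn
import torch.nn.functional as F

class AdaptiveFusion(nn.Module):
    def __init__(self, num_levels: int, channels: int):
        super().__init__()
        # One learnable logit per pyramid level; softmax keeps the fusion weights normalized.
        self.level_logits = nn.Parameter(torch.zeros(num_levels))
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, feats):
        # feats: list of [B, C, Hi, Wi] tensors, ordered shallow (high resolution) to deep.
        target_size = feats[0].shape[-2:]   # fuse at the shallow level's resolution
        weights = torch.softmax(self.level_logits, dim=0)
        fused = sum(w * F.interpolate(f, size=target_size, mode="nearest")
                    for w, f in zip(weights, feats))
        return self.proj(fused)

# Usage with dummy multi-scale features:
fusion = AdaptiveFusion(num_levels=3, channels=256)
feats = [torch.randn(1, 256, s, s) for s in (80, 40, 20)]
out = fusion(feats)                          # shape: [1, 256, 80, 80]
```

The data-set selection rule, by contrast, is stated explicitly (targets occupying 0.004%~0.026% of the image pixels) and can be checked in a few lines; the (x1, y1, x2, y2) box format is an assumption for illustration.

```python
def is_small_pedestrian(box, img_w, img_h, lo=0.004 / 100, hi=0.026 / 100):
    """Keep only boxes whose area falls in the 0.004%~0.026% pixel-proportion range."""
    x1, y1, x2, y2 = box
    ratio = (x2 - x1) * (y2 - y1) / (img_w * img_h)
    return lo <= ratio <= hi

# A 20x25 px pedestrian in a 1920x1080 frame covers about 0.024% of the pixels.
print(is_small_pedestrian((100, 100, 120, 125), 1920, 1080))   # True
```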

Cite this article:

SHI Xin, LU Hao, QIN Pengjie, LENG Zhengli. A long-distance pedestrian small target detection method [J]. Chinese Journal of Scientific Instrument, 2022, 43(5): 136-146.

History:
  • Online publication date: 2023-02-06