A floating-waste-detection method for unmanned surface vehicle based on feature fusion and enhancement

Yong Li*, Ruichen Wang, Dongxu Gao, Zhiyong Liu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Unmanned surface vehicle (USV)-based floating-waste detection presents significant challenges. Because of the water surface's high reflectivity, images captured by USVs often contain light spots and reflections. Furthermore, floating waste typically consists of numerous small objects that are difficult to detect, posing a robustness challenge for object-detection networks. To address these issues, we introduce FloatingWaste-I, a new dataset collected by a USV that accounts for the effects of light under various weather conditions, including sunny, cloudy, rainy and nighttime scenarios. The dataset comprises two types of waste: bottles and cartons. We also propose YOLO-Float, a floating-waste-detection network that incorporates a low-level representation-enhancement module and an attentional-fusion module. The former boosts the network's low-level representation capability, while the latter fuses the highest- and lowest-resolution feature maps to improve model robustness. We evaluated our method on both the public FloW-img dataset and our FloatingWaste-I dataset. The results confirm YOLO-Float's effectiveness, with an AP of 44.2% on the FloW-img dataset, surpassing the existing YOLOR, YOLOX and YOLOv7 by 3.2%, 2.7% and 3.4%, respectively.
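The abstract does not include implementation details, but the attentional-fusion idea it describes can be sketched as follows. The module name, channel widths and gating scheme below are illustrative assumptions written in PyTorch, not the authors' published code: the lowest-resolution (semantically rich) feature map is upsampled to the size of the highest-resolution (spatially detailed) map, and a learned channel gate decides how the two branches are blended.

# Hypothetical sketch of an attentional-fusion step. The paper's actual module
# may differ; names, channel sizes and the gating scheme are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionalFusion(nn.Module):
    """Fuse the highest- and lowest-resolution feature maps with a learned
    channel-attention gate (an illustrative guess at the idea)."""

    def __init__(self, high_channels: int, low_channels: int, out_channels: int):
        super().__init__()
        # Project both maps to a common channel width.
        self.proj_high = nn.Conv2d(high_channels, out_channels, kernel_size=1)
        self.proj_low = nn.Conv2d(low_channels, out_channels, kernel_size=1)
        # Squeeze-and-excitation-style gate over the concatenated branches.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * out_channels, out_channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, high_res: torch.Tensor, low_res: torch.Tensor) -> torch.Tensor:
        # Upsample the lowest-resolution (semantically strong) map to the
        # highest-resolution (spatially detailed) map's spatial size.
        low_up = F.interpolate(self.proj_low(low_res),
                               size=high_res.shape[-2:], mode="nearest")
        high = self.proj_high(high_res)
        # Per-channel weights decide how much of each branch is kept.
        w = self.gate(torch.cat([high, low_up], dim=1))
        return w * high + (1.0 - w) * low_up

if __name__ == "__main__":
    # Example: fuse a 160x160 low-level map with a 20x20 high-level map.
    fuse = AttentionalFusion(high_channels=64, low_channels=512, out_channels=64)
    out = fuse(torch.randn(1, 64, 160, 160), torch.randn(1, 512, 20, 20))
    print(out.shape)  # torch.Size([1, 64, 160, 160])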

Original language: English
Article number: 2234
Number of pages: 18
Journal: Journal of Marine Science and Engineering
Volume: 11
Issue number: 12
DOIs
Publication status: Published - 26 Nov 2023

Keywords

  • feature enhancement
  • feature fusion
  • floating-waste dataset
  • object detection
  • unmanned surface vehicle
