CoRange: Collaborative Range-Aware Adaptive Fusion for Multi-Agent Perception

Qiuhao Shu, Jinchao Chen, Yantao Lu, Ying Zhang, Yang Wang

Research output: Contribution to journal › Article › peer-review


Abstract

Collaborative perception facilitates information exchange and communication among neighbouring agents and is a competitive solution to the limited field of view of a single vehicle in autonomous driving. Although collaborative perception has promising application prospects and can greatly enhance vehicle safety in transportation systems, it often requires substantial communication and computation overhead to achieve efficient collaboration. Even worse, the sparse point clouds produced by distant objects impair the perceptual abilities of individual agents, and unavoidable errors in agents’ poses lead to misaligned global observations and low perception accuracy. In this work, we focus on the multi-agent collaborative perception problem and present a Collaborative Range-aware adaptive fusion framework, named CoRange, to achieve communication-efficient and fusion-effective perception of rapidly changing environments. First, we present a range-aware communication mechanism that reduces misjudgment and minimizes bandwidth consumption by fully exploiting the characteristics of range information and selectively transmitting the features of critical regions. Then, we design a local and agent-wise attention module to handle point clouds at different distances and capture relationships between heterogeneous agents, with the objective of enhancing detection accuracy and robustness against pose errors. Finally, we adopt a hierarchical adaptive fusion method to effectively integrate features carrying diverse semantic information and provide multi-source representations for the ego agent. Experiments on both simulated and real-world datasets validate the efficiency and effectiveness of the proposed approach, and the results demonstrate that it achieves superior performance on LiDAR-based object detection tasks under limited communication bandwidth and in noisy environments.
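
To make the range-aware, bandwidth-limited communication idea in the abstract concrete, the sketch below is a minimal, hypothetical illustration (not the paper's actual CoRange implementation): a sending agent scores its BEV feature cells, reweights the scores by distance so that sparse distant regions are still shared, and transmits only the cells that exceed a threshold. The class name, cell size, and threshold are assumptions introduced here for illustration only.

```python
# Minimal sketch (assumed design, not the paper's implementation): range-aware
# selection of BEV feature cells for transmission under a bandwidth budget.
import torch
import torch.nn as nn


class RangeAwareSelector(nn.Module):
    """Select critical BEV feature cells to transmit, up-weighting distant
    regions where the ego agent's own point cloud is sparse (illustrative)."""

    def __init__(self, channels: int, cell_size: float = 0.4, threshold: float = 0.5):
        super().__init__()
        self.score_head = nn.Conv2d(channels, 1, kernel_size=1)  # per-cell confidence
        self.cell_size = cell_size   # metres per BEV cell (assumed value)
        self.threshold = threshold   # transmission threshold (assumed value)

    def forward(self, bev_feat: torch.Tensor):
        # bev_feat: (B, C, H, W) BEV features of the sending agent
        B, C, H, W = bev_feat.shape
        conf = torch.sigmoid(self.score_head(bev_feat))           # (B, 1, H, W)

        # Distance of each cell from the sending agent, assumed at the BEV centre.
        ys = torch.arange(H, device=bev_feat.device) - (H - 1) / 2
        xs = torch.arange(W, device=bev_feat.device) - (W - 1) / 2
        yy, xx = torch.meshgrid(ys, xs, indexing="ij")
        dist = torch.sqrt(yy**2 + xx**2) * self.cell_size         # (H, W), metres

        # Up-weight far cells so sparse distant regions are still shared.
        range_weight = (dist / dist.max()).clamp(min=0.1)         # (H, W), in (0, 1]
        score = conf * range_weight                               # (B, 1, H, W)

        mask = (score > self.threshold).float()                   # binary selection map
        sparse_feat = bev_feat * mask                             # only selected cells are sent
        return sparse_feat, mask                                  # mask density ~ bandwidth cost
```

In such a scheme, only the non-zero entries of `sparse_feat` (plus the mask) would be communicated, so the threshold directly trades detection coverage against bandwidth; the attention-based and hierarchical fusion stages described in the abstract would then operate on the received sparse features.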

Original language: English
Pages (from-to): 4316-4329
Number of pages: 14
Journal: IEEE Transactions on Intelligent Vehicles
Volume: 10
Issue number: 8
DOIs
State: Published - 2025

Keywords

  • Collaborative perception
  • LiDAR-based object detection
  • feature fusion
  • multi-agent perception
