TY - JOUR
T1 - CoRange
T2 - Collaborative Range-Aware Adaptive Fusion for Multi-Agent Perception
AU - Shu, Qiuhao
AU - Chen, Jinchao
AU - Lu, Yantao
AU - Zhang, Ying
AU - Wang, Yang
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2025
Y1 - 2025
N2 - Collaborative perception facilitates information exchange and communication among neighbouring agents and is a competitive solution to the limited field of view of a single vehicle in autonomous driving. Although collaborative perception holds bright application prospects and can greatly enhance vehicle safety in transportation systems, it often requires substantial communication and computation overhead to achieve efficient collaboration. Even worse, the sparse point cloud data produced by distant objects impairs the perceptual abilities of individual agents, and unavoidable errors in agents’ poses result in misaligned global observations and low perception accuracy. In this work, we focus on the multi-agent collaborative perception problem and present a Collaborative Range-aware adaptive fusion framework, named CoRange, to achieve communication-efficient and fusion-effective perception of rapidly changing environments. First, we present a range-aware communication mechanism that reduces misjudgment and minimizes bandwidth consumption by fully exploiting the characteristics of range information and selectively transmitting the features of critical regions. Then, we design a local and agent-wise attention module to handle point clouds at different distances and capture relationships between heterogeneous agents, with the objective of enhancing detection accuracy and robustness against pose errors. Finally, we adopt a hierarchical adaptive fusion method to effectively integrate features representing diverse semantic information and provide multi-source representations for ego agents. Experiments on both simulated and real-world datasets validate the efficiency and effectiveness of the proposed approach, and the results demonstrate that it achieves superior performance on LiDAR-based object detection tasks under limited communication bandwidth and in noisy environments.
AB - Collaborative perception facilitates information exchange and communication among neighbouring agents and is a competitive solution to the limited field of view of a single vehicle in autonomous driving. Although collaborative perception holds bright application prospects and can greatly enhance vehicle safety in transportation systems, it often requires substantial communication and computation overhead to achieve efficient collaboration. Even worse, the sparse point cloud data produced by distant objects impairs the perceptual abilities of individual agents, and unavoidable errors in agents’ poses result in misaligned global observations and low perception accuracy. In this work, we focus on the multi-agent collaborative perception problem and present a Collaborative Range-aware adaptive fusion framework, named CoRange, to achieve communication-efficient and fusion-effective perception of rapidly changing environments. First, we present a range-aware communication mechanism that reduces misjudgment and minimizes bandwidth consumption by fully exploiting the characteristics of range information and selectively transmitting the features of critical regions. Then, we design a local and agent-wise attention module to handle point clouds at different distances and capture relationships between heterogeneous agents, with the objective of enhancing detection accuracy and robustness against pose errors. Finally, we adopt a hierarchical adaptive fusion method to effectively integrate features representing diverse semantic information and provide multi-source representations for ego agents. Experiments on both simulated and real-world datasets validate the efficiency and effectiveness of the proposed approach, and the results demonstrate that it achieves superior performance on LiDAR-based object detection tasks under limited communication bandwidth and in noisy environments.
KW - Collaborative perception
KW - LiDAR-based object detection
KW - feature fusion
KW - multi-agent perception
UR - https://www.scopus.com/pages/publications/105019761525
U2 - 10.1109/TIV.2024.3478756
DO - 10.1109/TIV.2024.3478756
M3 - Article
AN - SCOPUS:105019761525
SN - 2379-8858
VL - 10
SP - 4316
EP - 4329
JO - IEEE Transactions on Intelligent Vehicles
JF - IEEE Transactions on Intelligent Vehicles
IS - 8
ER -