TY - GEN
T1 - AdaFlow
T2 - 22nd ACM Conference on Embedded Networked Sensor Systems, SenSys 2024
AU - Wu, Fengmin
AU - Liu, Sicong
AU - Zhu, Kehao
AU - Li, Xiaochen
AU - Guo, Bin
AU - Yu, Zhiwen
AU - Wen, Hongkai
AU - Xu, Xiangrui
AU - Wang, Lehao
AU - Liu, Xiangyu
N1 - Publisher Copyright:
© 2024 Copyright is held by the owner/author(s).
PY - 2024/11/4
Y1 - 2024/11/4
AB - The rise of mobile devices equipped with numerous sensors, such as LiDAR and cameras, has spurred the adoption of multi-modal deep intelligence for distributed sensing tasks such as smart-cabin monitoring and driving assistance. However, the arrival times of mobile sensory data vary due to modality size and network dynamics, which can lead to delays (if waiting for slower data) or accuracy decline (if inference proceeds without waiting). Moreover, the diversity and dynamic nature of mobile systems exacerbate this challenge. In response, we present a shift to opportunistic inference for asynchronous distributed multi-modal data, enabling inference as soon as partial data arrives. While existing methods focus on optimizing modality consistency and complementarity, known as modal affinity, they lack a computational approach to control this affinity in open-world mobile environments. AdaFlow pioneers the formulation of structured cross-modality affinity in mobile contexts using a hierarchical analysis-based normalized matrix. This approach accommodates the diversity and dynamics of modalities, generalizing across different types and numbers of inputs. Employing an affinity attention-based conditional GAN (ACGAN), AdaFlow facilitates flexible data imputation, adapting to various modalities and downstream tasks without retraining. Experiments show that AdaFlow significantly reduces inference latency by up to 79.9% and enhances accuracy by up to 61.9%, outperforming status-quo approaches. The method can also preprocess asynchronous data to enhance LLM performance.
KW - affinity matrix
KW - distributed multi-modal system
KW - mobile applications
KW - non-blocking inference
UR - http://www.scopus.com/inward/record.url?scp=85211816840&partnerID=8YFLogxK
U2 - 10.1145/3666025.3699361
DO - 10.1145/3666025.3699361
M3 - Conference contribution
AN - SCOPUS:85211816840
T3 - SenSys 2024 - Proceedings of the 2024 ACM Conference on Embedded Networked Sensor Systems
SP - 606
EP - 618
BT - SenSys 2024 - Proceedings of the 2024 ACM Conference on Embedded Networked Sensor Systems
PB - Association for Computing Machinery, Inc
Y2 - 4 November 2024 through 7 November 2024
ER -