Automatic Machine Learning based Real Time Multi-Tasking Image Fusion

Shahid Karim, Geng Tong, Jinyang Li, Xiaochang Yu, Jia Hao, Yiting Yu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Imaging systems operate diversely within the image processing domain, and each system has specific characteristics. Models are being developed to fuse images from different sensors and environments to obtain promising outcomes for various computer vision applications. Multiple unified models have been developed for tasks such as multi-focus (MF), multi-exposure (ME), and multi-modal (MM) image fusion. Careful tuning of such models is required to obtain optimal results, and they are still not applicable to diverse applications. We propose an automatic machine learning (AML) based multi-tasking image fusion approach to overcome this problem. First, we evaluate the source images with AML and feed them to the task-based models. The source images are then fused with the pre-trained and fine-tuned models. Experimental results confirm the effectiveness of our proposed approach compared to generic approaches.
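The abstract describes a two-stage pipeline: an automatic evaluator assigns each pair of source images to a fusion task (multi-focus, multi-exposure, or multi-modal), and a task-specific pre-trained model then performs the fusion. A minimal sketch of that routing idea is below; all function names, the brightness-gap heuristic, and the averaging "model" are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of task-routed image fusion: classify the
# source pair into a fusion task, then dispatch to a task-specific
# fusion model. Toy heuristics stand in for the AML evaluator and
# the pre-trained models described in the paper.
from typing import Callable, Dict, List, Tuple

Image = List[List[float]]  # toy grayscale image as nested lists

def mean(img: Image) -> float:
    vals = [p for row in img for p in row]
    return sum(vals) / len(vals)

def classify_task(a: Image, b: Image) -> str:
    """Toy stand-in for the AML evaluator: route by brightness gap."""
    if abs(mean(a) - mean(b)) > 0.5:
        return "multi-exposure"
    return "multi-focus"

def average_fuse(a: Image, b: Image) -> Image:
    """Placeholder for a task-specific pre-trained fusion model."""
    return [[(pa + pb) / 2 for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

# One entry per task; a real system would load distinct
# fine-tuned models here.
FUSERS: Dict[str, Callable[[Image, Image], Image]] = {
    "multi-focus": average_fuse,
    "multi-exposure": average_fuse,
    "multi-modal": average_fuse,
}

def fuse(a: Image, b: Image) -> Tuple[str, Image]:
    task = classify_task(a, b)
    return task, FUSERS[task](a, b)

dark = [[0.1, 0.1], [0.1, 0.1]]
bright = [[0.9, 0.9], [0.9, 0.9]]
task, fused = fuse(dark, bright)
print(task)          # "multi-exposure" for this high-gap pair
print(fused[0][0])   # 0.5
```

The dispatch table keeps the evaluator decoupled from the fusion models, so a new task type only needs a new entry rather than changes to the routing logic.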

Original language: English
Title of host publication: Proceedings of the 2024 16th International Conference on Machine Learning and Computing, ICMLC 2024
Publisher: Association for Computing Machinery
Pages: 327-333
Number of pages: 7
ISBN (Electronic): 9798400709234
DOIs
State: Published - 2 Feb 2024
Event: 16th International Conference on Machine Learning and Computing, ICMLC 2024 - Shenzhen, China
Duration: 2 Feb 2024 → 5 Feb 2024

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 16th International Conference on Machine Learning and Computing, ICMLC 2024
Country/Territory: China
City: Shenzhen
Period: 2/02/24 → 5/02/24

Keywords

  • automatic ML
  • imaging systems
  • multi-tasking image fusion
