Efficient Multi-Resolution Fusion for Remote Sensing Data with Label Uncertainty

Abstract

Multi-modal sensor data fusion takes advantage of complementary or reinforcing information from each sensor and can boost overall performance in applications such as scene classification and target detection. This paper presents a new method for fusing multi-modal and multi-resolution remote sensing data without requiring pixel-level training labels, which can be difficult to obtain. Previously, we developed the Multiple Instance Multi-Resolution Fusion (MIMRF) framework, which addresses label uncertainty for fusion but can be slow to train due to the large search space for the fuzzy measures used to integrate the sensor data sources. We propose a new approach based on binary fuzzy measures (BFMs), which reduces the search space and significantly improves the efficiency of the MIMRF framework. We present experimental results on synthetic data and a real-world remote sensing detection task, showing that the proposed MIMRF-BFM algorithm can effectively and efficiently perform multi-resolution fusion given remote sensing data with uncertainty.
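For context, fuzzy-measure-based fusion of this kind typically aggregates per-source outputs with a Choquet integral, and restricting the measure to binary (0/1) values is what shrinks the search space over measure elements. The sketch below is illustrative only: it assumes a Choquet-integral fusion operator and uses made-up source outputs and measure values; it is not the authors' implementation or the exact MIMRF-BFM algorithm.

# Illustrative sketch: discrete Choquet integral with a binary fuzzy measure (BFM).
# Sources, outputs, and measure values below are hypothetical, not from the paper.
from itertools import combinations

import numpy as np


def choquet_integral(h, g):
    """Fuse the m source outputs in h using fuzzy measure g.

    g maps frozensets of source indices to measure values. For a binary
    fuzzy measure, every value is 0 or 1, g(empty set) = 0, g(all sources) = 1,
    and g is monotone (a superset never has a smaller measure).
    """
    m = len(h)
    order = np.argsort(h)[::-1]              # sort sources by descending output
    h_sorted = np.asarray(h, dtype=float)[order]
    result, g_prev = 0.0, 0.0                # g(empty set) = 0
    for i in range(m):
        subset = frozenset(int(j) for j in order[: i + 1])  # top-(i+1) sources
        g_curr = g[subset]
        result += h_sorted[i] * (g_curr - g_prev)
        g_prev = g_curr
    return result


# A binary fuzzy measure over 3 hypothetical sources {0, 1, 2}:
# only subsets containing source 0 "count" (measure 1); all others are 0.
sources = range(3)
g = {frozenset(): 0.0}
for r in range(1, 4):
    for s in combinations(sources, r):
        g[frozenset(s)] = 1.0 if 0 in s else 0.0

print(choquet_integral([0.9, 0.4, 0.7], g))  # 0.9: the fusion is driven by source 0

Because each measure element can only be 0 or 1 (subject to monotonicity), a learner only needs to search over a finite set of valid binary assignments rather than a continuous space, which is the source of the efficiency gain described in the abstract.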

Links

IEEE Xplore
PDF

Citation

Plain Text:
H. Vakharia and X. Du, “Efficient Multi-Resolution Fusion for Remote Sensing Data with Label Uncertainty,” IGARSS 2023 – 2023 IEEE International Geoscience and Remote Sensing Symposium, Pasadena, CA, USA, 2023, pp. 6326-6329, doi: 10.1109/IGARSS52108.2023.10282851.