Sun Jiabing, Liu Jilin, Li Jun. Multi-Source Remote Sensing Image Data Fusion[J]. Journal of Remote Sensing, 1998, (1). DOI: CNKI:SUN:YGXB.0.1998-01-008.
Fusing remote sensing images of the same area acquired from different data sources enriches the information available about areas of interest. Fusing images from widely separated bands of the electromagnetic spectrum (such as optical and radar data) provides information beyond what each sensor offers on its own, so more accurate classification can be achieved. Fusing high-spatial-resolution data (such as panchromatic aerial photographs) with data of lower spatial but higher spectral resolution (such as Landsat TM) can improve image sharpness, enhance feature extraction and visual interpretation, and support object change detection. At present, three basic fusion methods are discussed in the remote sensing field: (1) pixel-level fusion, (2) feature-level fusion, and (3) decision-level fusion. This paper discusses three methods of multi-source image fusion: pixel-based weighted fusion, feature-level fusion based on the wavelet transform, and per-class fusion based on Bayes' rule.
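The pixel-based weighted fusion named above can be sketched as a per-pixel weighted average of two co-registered images. This is a minimal illustration only: the function name `weighted_fusion`, the single weight parameter `w_a`, and the list-of-lists image representation are assumptions, not the authors' implementation.

```python
def weighted_fusion(img_a, img_b, w_a=0.5):
    """Pixel-based weighted fusion of two co-registered images.

    img_a, img_b: 2-D lists (rows of pixel values) of equal shape.
    w_a: weight for img_a; img_b gets 1 - w_a, so weights sum to 1.
    Returns the per-pixel weighted average of the two images.
    """
    w_b = 1.0 - w_a
    return [
        [w_a * a + w_b * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

# Example: equal weights average the two inputs pixel by pixel.
fused = weighted_fusion([[100, 200]], [[50, 100]], w_a=0.5)
```

In practice the weights would be chosen per band or per sensor (for example, to balance an optical image against a radar image), but the per-pixel combination step stays the same.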