Multi-Sensor Raster and Vector Data Fusion Based on Uncertainty Modeling
Sang-Chul Lee and Peter Bajcsy
Proceedings of the IEEE International Conference on Image Processing (ICIP '04), Singapore, vol. 5, pp. 3355-3358, 2004
We propose a new methodology for fusing temporally changing multi-sensor raster and vector data by developing a spatially and temporally varying uncertainty model of acquired and transformed multi-sensor measurements. The proposed uncertainty model includes errors due to (1) each sensor itself, e.g., sensor noise; (2) transformations of measured values to obtain comparable physical entities for data fusion and/or to calibrate sensor measurements; (3) spatial interpolation of vector data, which is needed to match the different spatial resolutions of multi-sensor data; and (4) temporal interpolation, which is required when multi-sensor acquisitions are not accurately synchronized.
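The sketch below is not the authors' implementation; it is a minimal illustration, under a first-order (linearized) error-propagation and independent-error assumption, of how the four error sources above can be tracked and then combined by inverse-variance fusion of a raster pixel with an interpolated point-sensor (vector) value. All function names, parameters, and numeric values are hypothetical placeholders.

```python
# Hypothetical sketch: propagate variance through calibration, spatial
# interpolation, and temporal interpolation, then fuse raster and vector
# estimates by inverse-variance weighting.
import numpy as np

def calibrate(value, var, gain=1.2, offset=0.5, gain_var=1e-4):
    """(2) Transform a raw reading to a comparable physical unit,
    propagating variance through y = gain * x + offset (first order)."""
    y = gain * value + offset
    y_var = (gain ** 2) * var + (value ** 2) * gain_var
    return y, y_var

def idw_interpolate(points, values, variances, target, power=2.0):
    """(3) Inverse-distance spatial interpolation of point-sensor (vector)
    data to a raster cell location, propagating variance through the weights."""
    d = np.linalg.norm(points - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-9) ** power
    w /= w.sum()
    return np.dot(w, values), np.dot(w ** 2, variances)

def temporal_interpolate(t0, v0, var0, t1, v1, var1, t):
    """(4) Linear interpolation between two acquisition times that are not
    synchronized with the raster frame, again propagating variance."""
    a = (t - t0) / (t1 - t0)
    return (1 - a) * v0 + a * v1, (1 - a) ** 2 * var0 + a ** 2 * var1

def fuse(v_raster, var_raster, v_vector, var_vector):
    """Minimum-variance (inverse-variance weighted) fusion of two estimates."""
    w_r, w_v = 1.0 / var_raster, 1.0 / var_vector
    return (w_r * v_raster + w_v * v_vector) / (w_r + w_v), 1.0 / (w_r + w_v)

if __name__ == "__main__":
    # (1) raster pixel value with its own sensor-noise variance (made-up numbers)
    raster_val, raster_var = 21.3, 0.4

    # point sensors: positions, raw readings, and per-sensor noise variances
    pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    raw = np.array([17.1, 17.8, 16.9])
    raw_var = np.array([0.2, 0.2, 0.3])

    cal, cal_var = zip(*(calibrate(v, s) for v, s in zip(raw, raw_var)))
    v_sp, var_sp = idw_interpolate(pts, np.array(cal), np.array(cal_var),
                                   target=np.array([0.4, 0.6]))
    # the same spatial estimate at two acquisition times, interpolated to t = 0.5
    v_vec, var_vec = temporal_interpolate(0.0, v_sp, var_sp,
                                          1.0, v_sp + 0.3, var_sp, t=0.5)
    print(fuse(raster_val, raster_var, v_vec, var_vec))
```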
The proposed methodology was tested on simulated data with varying (a) amounts of sensor noise, (b) spatial offsets of the point sensors generating vector data, and (c) complexity of the model of the underlying physical phenomenon. We demonstrated the multi-sensor fusion approach on a data set from the structural health monitoring application domain.