



Geospatial Image Fusion
- Nov. 13, 2012
- 3:30 p.m.
- LeConte 312
Abstract
Satellite images are used for a wide range of remote sensing applications including surveillance, mapping, geology, environmental monitoring, and agricultural surveying. Unfortunately, it is difficult to construct a satellite camera that achieves both high spatial and high spectral resolution. A common solution is to equip a single satellite with several types of cameras and then use image processing algorithms to fuse the data. I will present a wavelet-based variational approach to image fusion that preserves both spatial and spectral accuracy. Unlike existing fusion methods, this algorithm extends to high-dimensional hyperspectral data. I will also discuss how this work led to an interesting density estimation algorithm that incorporates geography. This is joint work with Prof. Andrea Bertozzi (UCLA).
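
To illustrate the general idea of fusing a high-spatial-resolution panchromatic image with a high-spectral-resolution band, here is a minimal sketch of classical wavelet detail-substitution pan-sharpening. This is not the variational algorithm presented in the talk; it is only a simplified example of how wavelets can combine spatial detail from one sensor with spectral content from another. The function name, array sizes, and synthetic data are assumptions for demonstration, and the example requires numpy and PyWavelets (pywt).

```python
# Illustrative sketch only: classical wavelet detail-substitution pan-sharpening,
# not the variational method described in the abstract.
import numpy as np
import pywt

def wavelet_pansharpen(pan, ms_band, wavelet="haar"):
    """Fuse a high-resolution panchromatic image with one multispectral band
    (already resampled to the panchromatic grid): keep the MS approximation
    coefficients (spectral content) and inject the PAN detail coefficients
    (spatial detail)."""
    cA_ms, _details_ms = pywt.dwt2(ms_band, wavelet)
    _cA_pan, details_pan = pywt.dwt2(pan, wavelet)
    return pywt.idwt2((cA_ms, details_pan), wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pan = rng.random((256, 256))      # hypothetical high-resolution panchromatic image
    ms_band = rng.random((256, 256))  # one MS band, resampled to the same grid
    fused = wavelet_pansharpen(pan, ms_band)
    print(fused.shape)                # (256, 256)
```

In this toy version the low-frequency coefficients carry the spectral information and the high-frequency coefficients carry the spatial detail; the variational approach in the talk instead enforces spatial and spectral fidelity through an energy-minimization formulation.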