A paradigm is presented that regards remotely sensed measurements as projections of a multi-dimensional, infinite-resolution information space onto a lower-dimensional, finite-resolution measurement space. The fundamental precepts of remote sensing, data fusion, and estimation theory are reviewed and then employed as a framework for recovering the higher-dimensional, higher-resolution information lost in the remote sensing process. A general formulation based on maximum likelihood estimation is developed to address multiple forms of measurement diversity. Novel image reconstruction techniques are demonstrated that leverage temporal diversity, wavelength diversity, and object diversity, and multiple real-world image reconstruction problems are solved using this estimation-theoretic data fusion approach.
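To make the estimation-theoretic fusion idea concrete, the following is a minimal sketch (not the paper's formulation; the function name and two-sensor setup are illustrative) of maximum likelihood fusion for the simplest case: independent Gaussian measurements of the same underlying quantity, where the ML estimate reduces to an inverse-variance weighted average.

```python
import numpy as np

def ml_fuse(measurements, variances):
    """Maximum-likelihood fusion of independent Gaussian measurements.

    For measurements y_i = x + n_i with n_i ~ N(0, sigma_i^2), maximizing
    the joint likelihood over x yields the inverse-variance weighted mean.
    Returns the fused estimate and its (reduced) variance.
    """
    y = np.asarray(measurements, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)  # inverse-variance weights
    x_hat = np.sum(w * y) / np.sum(w)
    var_hat = 1.0 / np.sum(w)  # fused variance is smaller than any input's
    return x_hat, var_hat

# Two hypothetical sensors (e.g. two wavelengths or acquisition times)
# observe the same scene value with different noise levels:
x_hat, var_hat = ml_fuse([10.0, 12.0], [1.0, 4.0])
# -> x_hat = 10.4, var_hat = 0.8
```

The fused variance (0.8) is below either sensor's individual variance (1.0 and 4.0), illustrating why fusing diverse measurements can recover information no single measurement contains.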