Spatial Resolution and Contrast of a Focused Diffractive Plenoptic Camera

  • Magazine Article
  • 18AERP09_09
Published September 01, 2018 by SAE International in United States
  • English

New technology captures spectral and spatial information of a scene in one snapshot while raising pixel counts and improving image quality.

Air Force Institute of Technology, Wright-Patterson Air Force Base, Ohio

The concept of an imaging system that captures both spatial and spectral information is not new. One example of a system that encodes both location and wavelength into an image is the Fourier Transform Spectrometer (FTS).

The FTS works by capturing a 2D image containing the two spatial dimensions while sweeping along a Michelson interferometer to capture the spectral dimension, producing a 3D image cube. But because the FTS must sweep along the spectral dimension, it introduces an operational time lag. When imaging a scene that is constantly changing, such as a forest fire, this lag can introduce noise that makes the resulting images difficult to process. Mechanical vibration of the instrument, referred to as pointing jitter, adds further noise. A system that could encode two spatial dimensions and one spectral dimension in a single snapshot would eliminate both the time-lag noise and the pointing jitter that the FTS introduces. The Fresnel Zone Light Field Spectral Imager (FZLFSI), from here on referred to as the Diffractive Plenoptic Camera (DPC), is such a system, capturing all three dimensions in one snapshot.
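To make the FTS image-cube idea concrete, here is a minimal toy sketch in Python/NumPy. It assumes a simplified noiseless model in which each pixel's interferogram is a single cosine fringe versus optical path difference (OPD); the array names and sizes are illustrative, not from the original article.

```python
import numpy as np

# Toy model of FTS cube assembly: one 2D spatial frame is recorded at
# each OPD step of the Michelson sweep, and a Fourier transform along
# the OPD axis recovers the spectrum at every pixel.

rng = np.random.default_rng(0)
n_opd, ny, nx = 64, 8, 8          # OPD sweep steps and spatial size (toy values)

# Assume each pixel emits at a single spectral frequency (in FFT bins).
freqs = rng.integers(1, n_opd // 2, size=(ny, nx))
opd = np.arange(n_opd)

# Interferogram at each pixel: cosine fringe versus OPD (simplified).
cube = np.cos(2 * np.pi * freqs[None, :, :] * opd[:, None, None] / n_opd)

# The sweep yields the (OPD, y, x) stack; an FFT along axis 0 converts
# it into the (wavelength, y, x) image cube the article describes.
spectra = np.abs(np.fft.rfft(cube, axis=0))
recovered = spectra.argmax(axis=0)
print(np.array_equal(recovered, freqs))  # True for this noiseless model
```

Because the sweep takes many exposures, any scene change or pointing jitter between OPD steps corrupts the per-pixel interferogram, which is the noise source the single-snapshot DPC avoids.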