AWARE2 Multiscale Gigapixel Camera


The AWARE program focuses on the design and manufacture of microcameras as a platform for scalable supercameras. As illustrated below, the AWARE microcamera includes advanced relay optics with an integrated focus mechanism, a vertically integrated focal plane and focal-plane read-out, and specialty microcamera control modules. Using this platform, the AWARE team has published designs for cameras resolving 1-50 gigapixels.

Traditional monolithic lens designs must increase f/# and lens complexity, and reduce field of view, as image scale increases. In addition, traditional electronic architectures are not designed for highly parallel streaming and analysis of large-scale images. The AWARE Wide Field of View project addresses these challenges using multiscale designs that combine a monocentric objective lens with arrays of secondary microcameras.

A basic AWARE system architecture is shown below, producing a 1.0 gigapixel image from 98 micro-optics covering a 120 by 40 degree FOV. A monocentric objective enables the use of identical secondary systems (referred to as microcameras), greatly simplifying design and manufacturing. Following the multiscale lens design methodology, the field of view (FOV) is increased by arraying microcameras along the focal surface of the objective; in practice, the FOV is limited by the physical housing. Each microcamera operates independently, offering much greater flexibility in image capture, exposure, and focus parameters.

[Image: AWARE system architecture]
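
As a rough consistency check of the architecture above (illustrative arithmetic only, not a published specification), the per-microcamera pixel and field budgets follow directly from the quoted numbers:

```python
# Back-of-envelope budget for the baseline AWARE architecture described above.
# All inputs come from the text; the per-camera figures are estimates only.

total_pixels = 1.0e9        # ~1.0 gigapixel composite image
num_microcameras = 98       # micro-optics in the array
fov_deg = (120, 40)         # horizontal x vertical field of view, degrees

pixels_per_camera = total_pixels / num_microcameras
field_per_camera_sq_deg = (fov_deg[0] * fov_deg[1]) / num_microcameras

print(f"~{pixels_per_camera / 1e6:.1f} MP contributed per microcamera")
print(f"~{field_per_camera_sq_deg:.0f} square degrees per microcamera "
      f"(~{field_per_camera_sq_deg ** 0.5:.1f} deg across, ignoring overlap)")
```

The roughly 10 MP effective contribution per microcamera is consistent with the 14 megapixel focal planes described in the Electronics section, with the difference absorbed by inter-camera overlap.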

Design Methodology

Optics

Conventional cameras struggle to achieve diffraction-limited performance as the entrance aperture increases in size. Scaling an optical system to gigapixels also scales the optical path difference errors and the resulting aberrations [9]. Because of this, larger instruments require more surfaces and elements to produce diffraction-limited performance. Multiscale designs [1] are a means of combating this escalating complexity. Rather than forming an image with a single monolithic lens system, multiscale designs divide the imaging task between an objective lens and a multitude of smaller micro-optics. The objective is a precise but simple lens that produces an imperfect image with known aberrations. Each micro-optic camera relays a portion of the objective's image onto its own sensor, correcting the objective's aberrations and forming a diffraction-limited image. Because there are typically hundreds or thousands of microcameras per objective, the microcamera optics are much smaller than the objective and therefore easier and cheaper to fabricate. The microcamera optics are at the scale of plastic molded lenses, enabling mass production of complex aspherical shapes and minimizing the number of elements. An example optical layout is modeled in the figure below.

[Image: Example multiscale optical layout]
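
The claim that aperture size sets the achievable pixel count can be illustrated with a simple diffraction estimate. The sketch below is a rough illustration only, not a design calculation from the AWARE papers; it uses the Rayleigh criterion as the angular resolution and counts how many such resolution elements fit in a circular field of view (the 16 mm aperture in the example is a hypothetical value):

```python
import math

def diffraction_limited_pixels(aperture_diameter_m, fov_full_angle_deg,
                               wavelength_m=550e-9):
    """Rough count of Rayleigh-limited resolution elements over a circular FOV.

    Ignores aberrations, sampling margins, and obscuration; it only shows how
    the achievable pixel count grows with aperture diameter.
    """
    theta = 1.22 * wavelength_m / aperture_diameter_m            # angular resolution, rad
    half_angle = math.radians(fov_full_angle_deg / 2.0)
    solid_angle = 2.0 * math.pi * (1.0 - math.cos(half_angle))   # steradians
    return solid_angle / theta**2

# Example: a hypothetical 16 mm aperture over a 120 degree circular field
print(f"{diffraction_limited_pixels(16e-3, 120) / 1e9:.1f} gigapixels (ideal)")
```

Because the count grows quadratically with aperture diameter, gigapixel-scale imaging pushes monolithic designs toward many more surfaces, while the multiscale approach keeps each relay small.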

Electronics

The electronics subsystem mirrors the multiscale optical design and has been developed to scale to an arbitrary number of microcameras. The focus and exposure parameters of each camera are independently controlled, and the communications architecture is optimized to minimize the amount of transmitted data. The architecture is designed to support multiple simultaneous users and can scale the output bandwidth depending on application requirements.

In the current implementation, each microcamera includes a 14 megapixel focal plane, a focus mechanism, and a HiSPi interface for data transmission. An FPGA-based camera control module provides local processing and data management. The control modules communicate over Ethernet with an external rendering computer. Each module connects to two microcameras and is used to synchronize image collection, scale images based on system requirements, and implement basic exposure and focus control for the microcameras.
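
To see why local scaling and data management at the control module matter, consider a rough data-rate estimate. The sensor resolution, frame rate, and camera counts below come from the text; the 10-bit raw depth is an assumption for illustration:

```python
# Illustrative raw data-rate estimate for the microcamera readout chain.
pixels_per_frame = 14e6    # 14 megapixel focal plane per microcamera
frame_rate = 10            # frames per second at full resolution
bit_depth = 10             # assumed raw bits per pixel (illustrative)
cameras_per_module = 2     # microcameras handled by each FPGA control module
num_microcameras = 226     # microcameras in the AWARE 2 dome

per_camera_bps = pixels_per_frame * frame_rate * bit_depth
per_module_bps = per_camera_bps * cameras_per_module
array_bps = per_camera_bps * num_microcameras

print(f"per microcamera:    {per_camera_bps / 1e9:.1f} Gb/s")
print(f"per control module: {per_module_bps / 1e9:.1f} Gb/s")
print(f"226-camera array:   {array_bps / 1e9:.0f} Gb/s raw")
```

Under these assumptions even a single control module exceeds a gigabit Ethernet link at full rate, which is why images are scaled or binned locally before transmission to the rendering computer.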

Image Formation

The image formation process generates a seamless image from the microcameras in the array. Since each camera operates independently, this process must account for alignment, rotation, and illumination discrepancies between the microcameras. To approach real-time compositing, a forward model based on the multiscale optical design is used to map individual image pixels into a global coordinate space. This allows display-scale images to be stitched at multiple frames per second independent of model corrections, which can be updated at a significantly slower rate.
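
The sketch below illustrates the basic idea of such a forward model: each pixel's angular offset from its microcamera's optical axis is rotated by that camera's known pointing direction into a global angular coordinate. It is a minimal pinhole-model sketch with placeholder focal length, pixel pitch, and sensor-center values; the actual AWARE pipeline also applies distortion, rotation, and illumination corrections.

```python
import numpy as np

def pixel_to_global_angles(px, py, cam_yaw_deg, cam_pitch_deg,
                           focal_length_mm=30.0, pixel_pitch_um=1.4,
                           center=(2304, 1536)):
    """Map a microcamera pixel to global (azimuth, elevation) angles in degrees.

    Minimal forward-model sketch: the pixel's direction in the microcamera's
    own frame (pinhole approximation) is rotated by the camera's pointing
    angles into the array's global frame. All optical parameters here are
    placeholders, not AWARE design values.
    """
    # Ray direction in the microcamera's local frame
    dx = (px - center[0]) * pixel_pitch_um * 1e-6
    dy = (py - center[1]) * pixel_pitch_um * 1e-6
    local = np.array([dx, dy, focal_length_mm * 1e-3])
    local /= np.linalg.norm(local)

    # Rotate the local ray by the camera's pointing (pitch about x, then yaw about y)
    yaw, pitch = np.radians([cam_yaw_deg, cam_pitch_deg])
    r_pitch = np.array([[1.0, 0.0, 0.0],
                        [0.0, np.cos(pitch), np.sin(pitch)],
                        [0.0, -np.sin(pitch), np.cos(pitch)]])
    r_yaw = np.array([[np.cos(yaw), 0.0, np.sin(yaw)],
                      [0.0, 1.0, 0.0],
                      [-np.sin(yaw), 0.0, np.cos(yaw)]])
    ray = r_yaw @ r_pitch @ local

    azimuth = np.degrees(np.arctan2(ray[0], ray[2]))
    elevation = np.degrees(np.arcsin(ray[1]))
    return azimuth, elevation

# The center pixel of a microcamera pointed 12 deg in azimuth, 5 deg in elevation
print(pixel_to_global_angles(2304, 1536, cam_yaw_deg=12, cam_pitch_deg=5))
```

Because the per-camera geometry changes slowly, a mapping like this can be precomputed and reused frame after frame, which is what allows compositing to run faster than the model-correction loop.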

The current image formation process supports two functional modes of operation. In "Live-View" mode, the camera generates a single display-scale image stream by binning information at the sensor level to minimize transmission bandwidth and then performing GPU-based compositing on a display computer. This mode allows users to interactively explore events in the scene in real time. Snapshot mode captures a full dataset in 14 seconds and stores the information for later rendering and analysis; it is used for capturing still images such as those presented in the AWARE image archive.

Flexibility

A major advantage of this design is that it scales. Except for slightly different surface curvatures, the same microcamera design suffices for 2, 10, and 40 gigapixel systems. Increasing the FOV is also strictly a matter of adding more microcameras, with no change to the objective lens or micro-optic design.


Current System

The 1.5 gigapixel AWARE 10 camera, Triton, was completed in October 2013. A photograph of Triton is shown below.

[Image: The 1.5 gigapixel AWARE 10 camera, Triton]

Our transition partner, Aqueti, builds the qG quarter-gigapixel camera using the AWARE 2 design. The qG camera is shown below.

[Image: The Aqueti qG camera]

Second-generation color AWARE 2 cameras came online in April 2013 using the original AWARE 2 mounting dome. AWARE 2 captures a 120 degree circular FOV with 226 microcameras, a 38 microradian ifov, and an effective f-number of 2.17. Each microcamera operates at 10 fps at full resolution. The optical volume is about 8 liters and the total enclosure is about 300 liters. The optical track length from the first surface of the objective to the focal plane is 188 mm. Example images taken by these systems can be found on the AWARE image server, at aqueti.com, and on gigapan.com.
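
These specifications can be cross-checked with a little arithmetic (illustrative only): the angular sample count implied by the quoted FOV and ifov sits somewhat below the raw sensor pixel count, with the margin absorbed by inter-camera overlap.

```python
import math

# Cross-check of the AWARE 2 specifications quoted above.
ifov = 38e-6                               # instantaneous FOV per pixel, radians
half_angle = math.radians(120 / 2)         # 120 degree circular FOV
solid_angle = 2 * math.pi * (1 - math.cos(half_angle))   # steradians

angular_samples = solid_angle / ifov**2    # non-overlapping samples in the FOV
raw_pixels = 226 * 14e6                    # total sensor pixels across the dome

print(f"angular samples in the FOV: ~{angular_samples / 1e9:.1f} gigapixels")
print(f"raw sensor pixels:          ~{raw_pixels / 1e9:.1f} gigapixels")
print(f"implied overlap margin:     ~{raw_pixels / angular_samples:.1f}x")
```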


Construction

AWARE cameras are constructed by an academic/industrial consortium with significant contributions from more than 50 graduate students, researchers, and engineers. Duke University is the lead institution and led the design and manufacturing team. The construction process for the first cameras is illustrated in the timelapse video below.


Future Systems

Significant improvements have been made to the optics, electronics, and integration of AWARE cameras over the life of this program; some are described here: Camera Evolution. In the spring of 2013, DISP will build AWARE 40 systems with a 15 microradian ifov and will build next-generation AWARE 10 and AWARE 2 cameras. The goal of this DARPA project is to design a long-term production camera that is highly scalable from sub-gigapixel to tens of gigapixels. Deployment of the system is envisioned for military, commercial, and civilian applications.

Ultimately, the goal of AWARE is to demonstrate that it is possible to capture all of the information in the optical field entering a camera aperture. The monocentric multiscale approach allows detection of modes at the diffraction limit. As discussed in "Petapixel Photography," the number of voxels resolved in the space-time-spectral data cube is ultimately limited by photon flux. We argue in "Gigapixel Television," a paper presented at the 14th Takayanagi Kenjiro Memorial Symposium, that real-time streaming of gigapixel images is within reach and advisable.


People and Collaborators

The AWARE project is led at DARPA by Dr. Nibir Dhar. The seeds of AWARE were planted at DARPA through the integrated sensing and processing model developed by Dr. Dennis Healy [12].


Publications

  1. D. J. Brady and N. Hagen, "Multiscale lens design," Opt. Express 17(13), 10659-10674 (2009).
  2. D. L. Marks and D. J. Brady, "Gigagon: a monocentric lens design imaging 40 gigapixels," paper ITuC2, Optical Society of America (2010).
  3. D. L. Marks, E. J. Tremblay, J. E. Ford, and D. J. Brady, "Microcamera aperture scale in monocentric gigapixel cameras," Appl. Opt. 50(30), 5824-5833 (2011).
  4. H. S. Son, D. L. Marks, J. Hahn, J. Kim, and D. J. Brady, "Design of a spherical focal surface using close-packed relay optics," Opt. Express 19(17), 16132-16138 (2011).
  5. D. J. Brady, Optical Imaging and Spectroscopy (Wiley/Optical Society of America, Hoboken, N.J., 2009).
  6. D. J. Brady and D. Healy, "ISP, Montage and MOSAIC," Computational Optical Sensing and Imaging (COSI), Toronto (2011).
  7. D. L. Marks and D. J. Brady, "Close-up imaging using microcamera arrays for focal plane synthesis," Opt. Eng. 50, 033205 (2011).
  8. D. J. Brady, M. E. Gehm, R. A. Stack, D. L. Marks, D. S. Kittle, D. R. Golish, E. M. Vera, and S. D. Feller, "Multiscale gigapixel photography," Nature 486(7403), 386-389 (2012).
  9. M. E. Gehm, D. R. Golish, E. M. Vera, J. Hahn, D. L. Marks, and D. J. Brady, "Image formation challenges in the mosaic platform," in Visual Information Processing XX, Z. U. Rahman, S. E. Reichenbach, and M. A. Neifeld, eds. (2011).
  10. D. R. Golish, E. M. Vera, K. J. Kelly, Q. Gong, P. A. Jansen, J. M. Hughes, D. S. Kittle, D. J. Brady, and M. E. Gehm, "Development of a scalable image formation pipeline for multiscale gigapixel photography," Opt. Express 20(20), 22048-22062 (2012).
  11. D. L. Marks, H. S. Son, J. Kim, and D. J. Brady, "Engineering a gigapixel monocentric multiscale camera," Opt. Eng. 51(8) (2012).
  12. H. S. Son, A. Johnson, R. A. Stack, J. M. Shaw, P. McLaughlin, D. L. Marks, D. J. Brady, and J. Kim, "Optomechanical design of multiscale gigapixel digital camera," Appl. Opt. 52(8), 1541-1549 (2013).
  13. H. S. Son, D. L. Marks, J. Hahn, J. Kim, and D. J. Brady, "Design of a spherical focal surface using close-packed relay optics," Opt. Express 19(17), 16132-16138 (2011).
  14. E. J. Tremblay, D. L. Marks, D. J. Brady, and J. E. Ford, "Design and scaling of monocentric multiscale imagers," Appl. Opt. 51(20), 4691-4702 (2012).
  15. "Petapixel photography and the limits of camera information capacity."
  16. "Gigapixel Television."