Testbed experiment for SPIDER: A photonic integrated circuit-based interferometric imaging system


Katherine Badham, Alan Duncan, Richard L. Kendrick, Danielle Wuchenich, Chad Ogden, Guy Chriqui
Lockheed Martin Advanced Technology Center

Samuel T. Thurman
Lockheed Martin Coherent Technologies

Tiehui Su, Weicheng Lai, Jaeyi Chun, Siwei Li, Guangyao Liu, S. J. B. Yoo
Dept. of Electrical and Computer Engineering, UC Davis

AMOS CONFERENCE PAPER

1. ABSTRACT

The Lockheed Martin Advanced Technology Center (LM ATC) and the University of California at Davis (UC Davis) are developing an electro-optical (EO) imaging sensor called SPIDER (Segmented Planar Imaging Detector for Electro-optical Reconnaissance) that seeks to provide a 10x to 100x reduction in size, weight, and power (SWaP) relative to a traditional bulky optical telescope and focal-plane detector array. The substantial reduction in SWaP would reduce cost and/or provide higher resolution by enabling a larger-aperture imager in a constrained volume. Our SPIDER imager replaces the traditional optical telescope and digital focal-plane detector array with a densely packed interferometer array based on emerging photonic integrated circuit (PIC) technologies, which samples the object being imaged in the Fourier domain (i.e., spatial frequency domain); an image is then reconstructed from these samples. Our approach replaces the large optics and structures required by a conventional telescope with PICs produced by standard lithographic fabrication techniques (e.g., complementary metal-oxide-semiconductor (CMOS) fabrication). The standard EO payload integration and test process, which involves precision alignment and testing of optical components to form a diffraction-limited telescope, is therefore replaced by in-process integration and test as part of PIC fabrication, substantially reducing the associated schedule and cost.
In this paper we describe the photonic integrated circuit design and the testbed used to create the first images of extended scenes. We summarize the image reconstruction steps and present the final images. We also describe our next-generation PIC design for a larger image (16x area, 4x field of view).

2. SPIDER CONCEPT DESIGN

The SPIDER imager consists of a multitude of direct-detection white-light interferometers on a PIC, which generates interference fringes. The measured complex visibility (phase and amplitude) of the fringes corresponds to a Fourier component of the object being imaged through the Van Cittert-Zernike theorem [1]. 2D Fourier fill of the scene (object) being imaged is attained by collecting data from multiple 1D interferometer arrays at several orientations. To achieve azimuthal sampling, the arrays are arranged in a radial blade pattern, pictured in Fig. 1(a). The imager design is composed of 37 1D arrays with lenslets coupling light from an extended scene into waveguides in the PIC, where interferometric beam combination occurs. Dense radial sampling is achieved by making measurements at various optical wavelengths or spectral bands for each of the baselines in the individual 1D interferometer arrays. Data collected for each baseline and spectral channel correspond to angular spatial frequencies B/λ, where B is the baseline (the distance between the lenslets of a pair) and λ is the wavelength of light. Therefore, higher spatial frequency information about the object is captured by longer baselines and shorter wavelengths. The spatial resolution of SPIDER is determined by its effective aperture size, which is equal to the length of the maximum baseline, Bmax. Measuring 12 baseline pairs on each PIC of the imager samples the 2D Fourier transform of the object, effectively mapping the 2D Fourier plane of the scene.
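The baseline-to-spatial-frequency relationship above can be checked numerically. The baseline and wavelength values below are taken from later sections of this paper; the resolution estimate λ/Bmax is the standard interferometric rule of thumb, not a quoted result:

```python
def angular_spatial_frequency(baseline_m, wavelength_m):
    """Angular spatial frequency (cycles per radian) sampled by a lenslet
    pair separated by baseline_m at wavelength wavelength_m."""
    return baseline_m / wavelength_m

B_max = 20.88e-3     # longest on-PIC baseline quoted later in the paper (m)
lam_short = 1223e-9  # shortest AWG spectral channel (m)

# The highest spatial frequency comes from the longest baseline at the
# shortest wavelength; the finest resolvable angle is roughly lam/B_max.
f_max = angular_spatial_frequency(B_max, lam_short)
theta_min = lam_short / B_max
print(f"max spatial frequency ~ {f_max:.3e} cycles/rad")
print(f"finest angular resolution ~ {theta_min:.3e} rad")
```

This reproduces the stated trend that longer baselines and shorter wavelengths carry the finest scene detail.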

Fig. 1. (a) SPIDER sensor radial blade design with 37 arrays, and (b) a schematic of light coupled through lenslets into a single interferometer baseline, combined in an MMI, split into spectral channels by an AWG, and imaged onto a 2D detector array.

The main component of the electro-optical sensor is the planar photonic integrated circuit design for the 1D interferometer arrays. Two past generations of PICs were developed and characterized [2,3]. The newest PIC design, discussed here, was introduced in earlier work [4]. We use the third-generation SPIDER PIC, which is 22 x 22 mm in size, providing an average of 6 dB insertion loss and -15 dB throughput attenuation. The PIC's lithographic layout is pictured in Fig. 2(a), along with a photograph in Fig. 2(b). It contains 24 input waveguides that are paired up to form 12 baselines using 2 x 2 multi-mode interferometer (MMI) beam combiners. Light from each of the MMI output ports is then spectrally split using arrayed waveguide gratings (AWGs), which allow broadband operation and provide increased spatial frequency coverage. The AWGs for each baseline are identical and provide a maximum of 18 spectral channels ranging from 1223 nm to 1586 nm, uniformly spaced in wavenumber. All 18 spectral channels are used for the longest baseline, while fewer are needed for the short baselines. The total number of output ports for each PIC is 206.

Fig. 2. (a) Lithographic layout of the SPIDER PIC and (b) a photograph of the PIC.

3. SPIDER EXPERIMENT: OPTICAL TESTBED

An in-lab optical testbed was implemented to test the PIC's ability to create images of extended scenes. The testbed is pictured in Fig. 3. It includes a back-illuminated scene and a projector, which effectively places the scene in the far field of the PIC. Light from the scene is then coupled through the lenslets into the PIC interferometers and on to the detector, which records the interference fringes. The PIC output ports are imaged by a lens onto a focal-plane array for detection. While the PIC includes thermo-electric phase modulators that would enable estimation of the complex (amplitude and phase) fringe visibility through temporal fringe scanning, the PIC was not wire bonded in this experiment. Instead, we used a fast-steering mirror (FSM) driven by a sinusoidal voltage signal to scan through the fringes. The bottom of Fig. 3(b) displays a three-row mask placed in front of the lenslets, which is used to block various lenslets for characterization of the optical throughput of individual beam paths (without interference). This is necessary for estimating the normalized fringe visibility, which is a measure of the complex degree of coherence and is independent of the relative beam intensities for each baseline.

Fig. 3. Optical testbed layout to simulate the SPIDER imager: (a) scene generator (light source and projector) followed by a long-pass (LP) filter and fast-steering mirror (FSM), (b) mask, lenslets, PIC, and detector, and (c) a USAF resolution test chart (group 2, element 1) and a scene of a train yard used as the extended scenes.

The images chosen for the scene include (1) a U.S. Air Force (USAF) resolution test chart (group 2, element 1) and (2) a scene of a train yard, shown in Fig. 3(c). As described previously, we used a 1.5 mm-diameter aperture mask over both scenes to match the scene width to the system field of view [4].
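The AWG spectral grid described in Section 2 (up to 18 channels from 1223 nm to 1586 nm, uniformly spaced in wavenumber) can be generated directly. This is a sketch of the stated channel grid, not the AWG design itself:

```python
import numpy as np

# Channel count and band edges are from the text; the grid is uniform in
# wavenumber (1/lambda) between those edges.
lam_min_nm, lam_max_nm, n_channels = 1223.0, 1586.0, 18

k = np.linspace(1.0 / lam_min_nm, 1.0 / lam_max_nm, n_channels)  # nm^-1
channels_nm = 1.0 / k

# Equal steps in wavenumber mean the wavelength spacing grows with lambda
# (delta_lam ~ lam**2 * delta_k), so channels sit closer together at the
# short-wavelength end of the band.
print(np.round(channels_nm, 1))
```

Because the scanned fringe frequency scales with B/λ, a grid uniform in wavenumber gives uniformly spaced spatial-frequency samples for a fixed baseline.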
Intensity measurements from a single baseline pair are modeled as a function of time via the two-beam interference equation

I_tot(t) = I_1(t) + I_2(t) + 2√[I_1(t) I_2(t)] |mμ| cos[φ(t) + arg(mμ)],   (1)

where I_1(t) and I_2(t) are the intensities from the right- and left-input lenslets for each baseline, φ(t) is the time-varying fringe phase set by the FSM scan amplitude and frequency, μ is the complex degree of coherence, related to the 2D spatial Fourier transform of the scene intensity distribution via the Van Cittert-Zernike theorem, and m is the PIC instrumental visibility.

During data acquisition, the FSM was driven sinusoidally at a 10 mHz frequency, shifting the beam 2.6 mrad in angle space, while 3000 frames of data from the focal-plane array were recorded at 30 Hz for each of the three mask positions. The terms I_1(t) and I_2(t) of Eq. (1) are characterized by collecting data with light illuminating each input lenslet individually, using the mask shown in Fig. 3(b) to block either the left or right lenslet of each baseline pair. The mask is also set to allow transmission through all baselines in order to provide the fringe data for all spectral channels. The PIC is directly imaged onto the detector plane so that a single row of pixels contains the waveguide outputs with the fringe information needed for computational image reconstruction. Each waveguide pixel's intensity varies at a temporal frequency according to the spatial frequency and spectral band of its baseline (i.e., longer baselines and shorter wavelengths give higher spatial frequency information about the object). Fig. 4 displays fringe measurements I_tot(t) for two baselines of different lengths: baseline 3, which is 2.16 mm long, and baseline 12, which is 20.88 mm long. Notice that the longer baseline 12 exhibits higher-frequency fringes than the shorter baseline 3. The width of each fringe envelope (~500 frames) is determined primarily by the scene width. The center of each envelope corresponds to the condition where the PIC lenslets look nominally at the center of the scene through the FSM. As the FSM moves away from this position, the fringe phase term φ(t) changes and the amount of light coupled into the PIC decreases. The fringe data is processed to extract the fringe visibility (amplitude and phase) at the center of each fringe packet, which is equivalent to the product of the object and system visibility terms (mμ) in Eq. (1). The system visibility m is characterized separately through measurements on a point-source object (for which |μ| = 1).

Fig. 4. Fringes of a single output waveguide spectral channel, with low frequency for baseline 3 (black) and high frequency for baseline 12 (blue).

The experiment used only one PIC, which provides only one dimension of Fourier sampling. To get Fourier samples in two dimensions, the scene was rotated and data was collected in 10-degree increments. All of the fringe data was reduced to estimate the scene complex degree of coherence for each scene orientation, baseline, and spectral channel. This information is equivalent to a set of 2D samples of the scene Fourier transform in polar format. Images were then reconstructed from this data both by simply computing the inverse fast Fourier transform (FFT) and by using more complicated iterative reconstruction algorithms. The iterative approach offers the ability to incorporate nonnegativity constraints and/or a priori scene information into the reconstruction.

4. SPIDER PROCESSED IMAGES

Imaging demonstrations were performed with two scenes: a set of tri-bars from a USAF resolution target and a grayscale aerial photograph of a train yard. Both scenes were chrome-on-glass transparencies that were masked down to the PIC field of view with a 1.5 mm-diameter circular aperture.
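The two-beam fringe model of Eq. (1) under a sinusoidal FSM scan can be sketched numerically. The frame rate, frame count, and FSM drive frequency are from the text; the per-arm intensities, visibilities, and phase excursion are illustrative assumptions:

```python
import numpy as np

fps, n_frames = 30.0, 3000       # 30 Hz frame rate, 3000 frames (from the text)
t = np.arange(n_frames) / fps    # 100 s record
f_fsm = 0.010                    # 10 mHz sinusoidal FSM drive (from the text)

I1, I2 = 1.0, 0.8                # per-arm intensities (assumed constant)
m = 0.9                          # PIC instrumental visibility (assumed)
mu = 0.6 * np.exp(1j * 0.3)      # complex degree of coherence (assumed)

# The FSM scan sweeps the fringe phase; the peak excursion scales with
# baseline length, which is why longer baselines show faster fringes (Fig. 4).
phase_excursion = 40.0           # peak fringe phase in radians (assumed)
phi = phase_excursion * np.sin(2 * np.pi * f_fsm * t)

# Eq. (1): two-beam interference with instrumental and object visibility.
I_tot = I1 + I2 + 2 * np.sqrt(I1 * I2) * abs(m * mu) * np.cos(phi + np.angle(m * mu))
```

Dividing out the separately measured I1 and I2 terms from I_tot leaves the normalized fringe, whose amplitude and phase give the product mμ at the packet center.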
For each scene, data was collected for scene orientation angles of 0-180 degrees in 5- or 10-degree increments. Data does not need to be collected over a full 360-degree range because the Fourier transform of a real-valued intensity object has Hermitian symmetry. Fig. 5(a) shows several images for the USAF bar target. The first image is a synthetic digital model of the scene, while the second is a simulated image, created by computing noise-free Fourier samples of the scene with the same sampling as the experimental data and using an inverse FFT reconstruction algorithm. This simulated image gives an indication of the image quality expected from the experiment. The third image is an initial FFT-based reconstructed image of the bar chart. The features of the bar target are noticeably blurred compared with the simulation. We believe this is caused by wobble in the rotation stage used to rotate the scene between data sets. This was investigated by comparing the experimental data for each scene orientation with the synthetic data used to generate the simulated image. Translation of the scene with rotation angle (the effect of stage wobble) appears in this comparison as a linear phase error (vs. radial spatial frequency) that varies systematically with rotation angle. These linear phase errors can be removed from the experimental data to reconstruct the final FFT-based image, which is shown on the right of Fig. 5(a). Note that stage wobble would not be an issue in a SPIDER system based on a 2D array of PICs in a rigid mount, as shown in Fig. 1(a).
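The FFT-based reconstruction path described above, in which polar visibility samples are completed by Hermitian symmetry, placed on a Cartesian grid, and inverse transformed, can be sketched with nearest-neighbor gridding. The grid size and sample coordinates here are illustrative assumptions, not the authors' processing code:

```python
import numpy as np

def grid_and_invert(u_samples, v_samples, visibilities, n=64):
    """Nearest-neighbor gridding of complex visibility samples onto an
    n x n FFT-ordered frequency grid, followed by an inverse FFT.
    Each sample also fixes its Hermitian partner at (-u, -v), which is
    why 0-180 degrees of scene rotation suffices for a real-valued scene."""
    grid = np.zeros((n, n), dtype=complex)
    count = np.zeros((n, n))
    for u, v, vis in zip(u_samples, v_samples, visibilities):
        for su, sv, sval in ((u, v, vis), (-u, -v, np.conj(vis))):
            iu, iv = int(round(su)) % n, int(round(sv)) % n
            grid[iu, iv] += sval
            count[iu, iv] += 1
    grid[count > 0] /= count[count > 0]  # average duplicate hits in a cell
    return np.fft.ifft2(grid).real       # image estimate
```

With samples laid out in polar format (one angle per scene orientation, one radius per baseline/wavelength pair), this yields the "simple inverse FFT" image; the iterative reconstructions add constraints on top of the same gridded data.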

Fig. 5(b) shows similar image results for the train yard scene, with the exception that the last image is the result of an iterative image reconstruction algorithm that incorporates penalty metrics for nonnegativity and finite scene support, as well as a total-variation metric for regularization. These terms incorporate a priori information about the scene and help mitigate the impact of sparse Fourier sampling in the experiment. While there are some artifacts, the final reconstruction matches the truth object shown on the left of Fig. 5(b) fairly well. Note that the truth object was obtained by viewing the actual slide transparency with a microscope equipped with a camera. The quality of the final images for both targets makes the SPIDER imaging demonstration a success.

Fig. 5. Image reconstruction results for (a) the USAF test target and (b) the train yard scene, showing the numerically simulated FFT image, the initial inverse-FFT image reconstruction, and the final image with alignment corrections.

5. SUMMARY

The extended-scene imaging demonstration for SPIDER gave results that build confidence in the system. SPIDER's full design consists of a 2D array of multiple PICs, but in this demonstration we used an individual PIC to successfully demonstrate the system's imaging capability via rotation of the scene. Wobble in the scene rotation stage introduced phase errors in the data collected for the imaging demonstrations, but this would not be a problem in a final system that does not rely on scene rotation for data collection. Roughly 30 percent of the output waveguides had no measurable interference fringe, so these needed to be identified during characterization. Over time, the PIC design has been modified to reduce the number of waveguide crossings, which lowers on-chip losses and cross-talk between baselines.
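The iterative reconstruction described above can be sketched as a projection-style loop: enforce the measured Fourier samples, project onto nonnegativity, and apply a small total-variation-like smoothing step. This is a schematic stand-in with assumed step sizes, not the authors' regularized solver (a finite-support projection could be added as a further step):

```python
import numpy as np

def reconstruct(meas, mask, n_iter=200, tv_weight=0.02):
    """meas: complex Fourier samples on an n x n FFT-ordered grid (zeros
    where unmeasured); mask: boolean array marking measured frequencies."""
    img = np.zeros(meas.shape)
    for _ in range(n_iter):
        # Data consistency: keep the current estimate's spectrum except at
        # measured frequencies, which are replaced by the measurements.
        F = np.fft.fft2(img)
        F[mask] = meas[mask]
        img = np.fft.ifft2(F).real
        # A priori constraint: scene intensity is nonnegative.
        img = np.maximum(img, 0.0)
        # Crude TV-like step: damp local gradients via diffusion smoothing.
        img = img + tv_weight * (np.roll(img, 1, 0) + np.roll(img, -1, 0)
                                 + np.roll(img, 1, 1) + np.roll(img, -1, 1)
                                 - 4.0 * img)
    return img
```

With sparse angular sampling, the unmeasured frequencies are filled in by the constraints rather than set to zero, which is what suppresses the streak artifacts of the plain inverse-FFT image.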
We continue to work to reduce the throughput losses in the PIC and expect the fourth-generation PIC's throughput attenuation to drop by 5 dB, giving an improved throughput loss of 10 dB including coupling efficiency. The optical testbed and image reconstruction experiment demonstrated that the SPIDER system works, but we continue to research advanced state-of-the-art PIC capabilities. The future of SPIDER includes further experimentation, improvement of the PIC technology, and optimization of the system architecture, which would both reduce SWaP and improve image quality. A reduction in SWaP is to be achieved by increasing system complexity and integration. We plan designs with longer baselines for higher resolution and more baselines for denser Fourier sampling. With a maximum system baseline of 100 mm we can obtain images 4x in size, or 200 x 200 image pixels. We also plan to incorporate higher-density integration using multiple PIC combinations, along with detectors on the PIC. We are also exploring real-time image processing approaches.
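The loss-budget arithmetic above can be made explicit. The dB figures are from the text; converting attenuation to a linear throughput fraction is a standard conversion, not a quoted result:

```python
# Third-generation throughput attenuation magnitude, in dB (from the text).
current_loss_db = 15.0
# Expected fourth-generation improvement, in dB (from the text).
improvement_db = 5.0

future_loss_db = current_loss_db - improvement_db     # expected 10 dB total
future_throughput = 10.0 ** (-future_loss_db / 10.0)  # fraction transmitted
print(future_loss_db, future_throughput)              # 10.0 0.1
```

That is, a 10 dB loss still passes only about a tenth of the collected light, which is why throughput remains an active area of PIC development.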

6. REFERENCES

1. J. W. Goodman, Statistical Optics, John Wiley and Sons, Inc., New York, 2000.
2. A. Duncan et al., "SPIDER: Next Generation Chip Scale Imaging Sensor," Proceedings of the Advanced Maui Optical and Space Surveillance Technologies Conference, Wailea, Hawaii, September 15-18, 2014.
3. A. Duncan et al., "SPIDER: Next Generation Chip Scale Imaging Sensor Update," Proceedings of the Advanced Maui Optical and Space Surveillance Technologies Conference, Wailea, Hawaii, September 20-23, 2016.
4. K. Badham et al., "Photonic integrated circuit-based imaging system for SPIDER," The Pacific Rim Conference on Lasers and Electro-Optics, Singapore, July 31-August 4, 2017.

7. ACKNOWLEDGEMENTS

This research was developed with funding from the Defense Advanced Research Projects Agency (DARPA). The views, opinions, and/or findings contained in this material are those of the authors and should not be interpreted as representing the official views or policies of the Department of Defense or the U.S. Government.