Characterization of a Time-of-Flight Camera for Use in Diffuse Optical Tomography


Characterization of a Time-of-Flight Camera for Use in Diffuse Optical Tomography

A thesis submitted in partial fulfillment of the requirements for the degree of Bachelor of Science in Physics from the College of William and Mary

by Neal Parker

Advisor: Dr. William Cooke
Senior Research Coordinator: Henry Krakauer
Date: April 22, 2016

Introduction

Diffuse optical tomography is the use of diffusely scattered light to create an image of a desired target. In our case, the CamBoard pico S is a scanner-less LIDAR imager that uses an RF-modulated light source, sinusoidally modulated at f_m = 30 MHz, s(t) = sin(2π f_m t). This time-of-flight camera sends out a pulse of 850 nm light, illuminating the entire scene, and uses the time difference Δt between the outgoing and incoming light to determine the distance traveled, r = cΔt/2. The saturation of each of the 21120 (176x120) available pixels is used to determine the spatial origin of the arriving photons, so that, when coupled with the distance information, an image can be constructed. The camera captures four images within every frame, where the first and third and the second and fourth images collected are 90° out of phase.

There has been considerable focus on the prospects of using diffuse optical tomography as an alternative imaging technology in the health and veterinary fields: a low-cost, much less invasive, more accessible, and portable procedure compared to MRI and CAT scans. The technology can be used to reconstruct images of biological tissues by solving the diffusion equation:

∂φ(r,t)/∂t = ∇·[D(r)∇φ(r,t)] − v μ_a(r) φ(r,t) + v S(r,t)

where φ(r,t) is the photon fluence at location r and time t, D(r) is the diffusion constant at location r, μ_a(r) is the absorption coefficient, v is the speed of light in the material, and S(r,t) is an isotropic source term.

When light in the IR range interacts with tissues of interest, some of the light enters the tissue and some is reflected. The light that enters has some transit time within the tissue before being re-emitted. Analysis of the re-emitted light, while excluding the reflected incident light, allows one to render an image of the internal structure of the target region. This is typically done with one source and multiple detectors that are equidistant from that source.
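The phase-to-distance relationship can be illustrated with a short sketch. This is a minimal model, not the pmd SDK: the four-bucket demodulation formula and the simulated samples are standard textbook assumptions, not anything taken from the camera's firmware.

```python
import numpy as np

F_MOD = 30e6  # modulation frequency (Hz)
C = 3.0e8     # speed of light (m/s)

def phase_from_samples(a0, a1, a2, a3):
    """Four-bucket demodulation of samples taken at 0, 90, 180, and 270 degrees."""
    return np.arctan2(a1 - a3, a0 - a2) % (2 * np.pi)

def distance_from_phase(phi):
    """d = c * phi / (4 * pi * f_m); the extra factor of 2 covers the round trip."""
    return C * phi / (4 * np.pi * F_MOD)

# A target 1.0 m away delays the modulation envelope by phi = 4*pi*f_m*d/c.
phi_true = 4 * np.pi * F_MOD * 1.0 / C
samples = [np.cos(phi_true - k * np.pi / 2) for k in range(4)]
print(distance_from_phase(phase_from_samples(*samples)))  # ~1.0 m
```

Because the recovered phase wraps at 2π, the unambiguous range at 30 MHz is c/(2 f_m) ≈ 5 m, comfortably larger than the 30 cm working distances used here.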
This technique has been used for basic analysis of brain activity and is the basic idea behind pulse oximetry. Because vessels containing larger amounts of oxygenated hemoglobin absorb light differently than those containing a larger concentration of deoxygenated hemoglobin, analysis of the intensity of the re-emitted light can be used to determine regions of the brain that are active during certain tasks [1]. This absorbance shift of hemoglobin between its oxygenated and deoxygenated forms is known as the BOLD (Blood-Oxygen-Level Dependent) effect.

Figure 1: The BOLD Effect [2]

There are considerable issues associated with diffuse optical tomography, chiefly that signal strength attenuates very rapidly. The light does not penetrate far into tissue, meaning that the information collected is restricted to the periphery of the tissue. Complex tissue geometry, distributions of tissues with different absorption coefficients, potentially non-linear absorption, and thermal interference combine to further complicate the effort.

Compared to the traditional method of diffuse optical tomography, the method we are exploring with the CamBoard pico S would use a much greater density of detectors, with each pixel acting as an individual detector: 120x176 pixels for just one source. This should allow significantly greater resolution in images acquired from tissues. However, because of the greater density of detectors relative to the number of sources, the signal reaching each detector is reduced. I will be exploring this trade-off: if the efficiency of this dense array of detectors is too low, then this camera will not be usable for diffuse optical tomography.

Results

The CamBoard pico S can exclude information from pixels where the amplitude of received light is too low, where the detectors are being saturated, and/or where the variation between pixels of a given region in the phase plot exceeds some threshold. I gathered images, developed from the camera's phase and amplitude data, of a flat plane 30 cm from the face of the camera. There are three main regions of interest: the sides (left and right of the plane), which appear black, and the plane itself.

Figures 2 and 3: Phase (left) and Amplitude (right) Images of the Flat Plane

The phase image of the flat plane is generated solely from the phase information produced by the camera. There is little variation in the plane region compared to the non-plane regions surrounding it; this relative uniformity is a factor in determining the quality of the data. Additionally, there are rings of phase within the plane region corresponding to different straight-line distances from the camera. (These rings are more easily viewed in later images.) The amplitude image shows a location of much higher intensity striking the board. This is because the light from the camera striking this section of the board returns with considerably less photon loss than elsewhere; this region also shows up on the phase map as the region with the smallest variation in phase.

Boxes of increasing size within the plane and non-plane regions were selected, and the standard deviation of the phase values within each box was plotted against the box's area to estimate the overall standard deviation of each region. As the boxes got larger, the corresponding standard deviations converged on the population standard deviation of each region.

Figure 4: Estimation of Std. Deviation in the Plane Region

Figure 4 shows the approximation of the true standard deviation of the phase within the plane region, which was found to be 0.008 rad.

Figure 5: Comparison of Std. Deviations in the Plane and Non-plane Regions

Figure 5 shows a side-by-side comparison of the approximate standard deviations of the plane and non-plane regions. The standard deviation in the non-plane region was estimated to be 0.35 rad. This value likely undershoots the true value, as the two largest box sizes (500 and 400 pixels) appear to be outliers.
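The box-growing estimate of a region's standard deviation can be sketched as follows. The synthetic flat-plane phase image (a 0.008 rad spread around an arbitrary mean) and the box placement are assumptions chosen only to mimic the measurement; the real analysis drew boxes from the camera's phase images.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 120x176 "flat plane" phase image with a 0.008 rad spread.
phase_plane = rng.normal(loc=1.2, scale=0.008, size=(120, 176))

def box_std(image, center, half):
    """Pixel count and phase standard deviation inside a square box of side 2*half+1."""
    r, c = center
    box = image[r - half:r + half + 1, c - half:c + half + 1]
    return box.size, box.std()

# As the box grows, the sample standard deviation converges on the
# population value for the region.
for half in (2, 5, 10, 20, 40):
    area, sd = box_std(phase_plane, (60, 88), half)
    print(f"box of {area:5d} px: std = {sd:.4f} rad")
```

Plotting these sample standard deviations against box area and reading off the plateau gives the regional estimate quoted in the text.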

In general, the standard deviations of the phase in the dark regions are much greater than those in the plane region. Combining the approximation of the true standard deviation of each region with the magnitude of the phase difference shows that one of the determining factors the CamBoard software uses to separate good from bad image regions is the standard deviation of the phase over regions of pixels, not the magnitude of the phase. From the small samples I used, the average of the standard deviations in the plane region was 0.008 rad, while in the dark region it was 0.3484 rad.

Figure 6: Magnitude of Phase Across One Row

These rows were selected to show the variation in the phase values across one row of the phase image, which included points in both the plane and non-plane regions. As you can see, the regions with the highest amplitude correspond to those with the least variation in phase, and the highest amplitude corresponds to the shortest distance traveled by the light striking the plane. There is slight warping through the middle of the phase plots because the straight-line distance to the camera changes as you move across one row of pixels.

Another aspect of the project was to redirect the camera's light source through an optical fiber to obtain images of backlit synthetic fat. Roughly 16% of the original output of the camera's LED was diverted into the fiber. From there, two lenses, one with a focal length of 10 cm and the other with a focal length of 4.5 cm (Figure 8), were used to magnify the source and the object being imaged, the magnification being roughly 2x. The solid angle of the incident light also increased, so more light would strike the camera's diode. Because of this, several neutral density filters had to be added between the camera and the second lens to keep the camera's detectors from saturating (Figure 8).

Figure 7: Magnified Image of Wires

Figure 7 shows small wires magnified and imaged by the CamBoard pico S, viewed from a distance of 30 cm. Each wire is ~0.25 mm in diameter.

Figure 8: Magnification Setup Used

To determine whether the CamBoard pico S would be a reliable detector for diffuse optical tomography, we must determine whether the number of photons incident on each detector (pixel) is sufficient for our uses. Because the density of detectors is so much higher than in the usual methods of diffuse optical tomography, the amount of light incident on each is considerably less. The camera shoots 45 fps.

Figure 9: Image showing the shots taken per frame by the CamBoard pico S

Each frame is made up of four individual shots corresponding to four different phases; these four images are then used to create one final image. During each of the four shots, the diode is rapidly turned on and off at 30 MHz, so the diode is off for much more time than it is on. This means the peak power emitted is higher than the measured average power. The average power was measured at a distance of 30 cm from the camera:

P_peak = (avg power)/(duty cycle) = 77 μW / 0.41 = 1.83×10^-4 W

With the camera set to an integration time of 1000 μs, over one frame of four shots the camera has emitted

1.83×10^-4 W × 1000 μs = 1.83×10^-7 J.

With E_pho = 2.34×10^-19 J per photon at 850 nm, this corresponds to

1.83×10^-7 J / 2.34×10^-19 J/photon = 7.82×10^11 photons

emitted from the camera.
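The photon-budget arithmetic can be checked in a few lines. All input values (77 μW average power, 0.41 duty cycle, 1000 μs integration time) are the thesis's measurements; the small difference from the quoted 7.82×10^11 photons comes from rounding in the peak-power step.

```python
H = 6.626e-34        # Planck constant (J s)
C = 3.0e8            # speed of light (m/s)
WAVELENGTH = 850e-9  # LED wavelength (m)

avg_power = 77e-6             # W, measured 30 cm from the camera
duty_cycle = 0.41
peak_power = avg_power / duty_cycle    # W; the thesis quotes 1.83e-4 W

t_int = 1000e-6                        # integration time (s)
energy_per_frame = peak_power * t_int  # J emitted over one frame of four shots

e_photon = H * C / WAVELENGTH          # ~2.34e-19 J per 850 nm photon
n_photons = energy_per_frame / e_photon
print(f"{n_photons:.2e} photons per frame")  # ~8e11 (thesis: 7.82e11)
```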

At a distance of.371 m between the plane and the camera, 330 counts were measured in the highest region of amplitude. This was measured using LightVis. The area contained in the image was determined by measuring height and width of the area of the board that could be seen on the phase image in the LightVis application. Once the light strikes the plane, it is reflected back into an angle of roughly 1sr (this was tough to measure). Simple sketch showing the camera setup Energy count area of one pixel = area in image =.256m! area in image # of pixels (120 176) = 9 10!! m! = P!"#$ t!"#$%&'#!(" area of pixel angle reflected into distance! #counts 8

Energy count = 8.59 10!!" J photons pixel = E count 1 E!"# = 3671 photons The typical path length of a photon through fatty tissues is on the order of 10cm. At a distance of.13 m to the flat plane, the number of photons incident to each pixel skyrockets to a little over 3671 photons/pixel. However, it is very important to note that if this system were being used with a diffusive material such as fat, a substantial portion of these photons would be lost. We must also know the error of the device in determining distances from recorded phase differences. We must be able to see sufficiently small variations in the phase due to the different path lengths taken by the photons to arrive at the detector array of the camera. It is critical that we have statistical confidence that these variations are due to path difference, not instrumental variation or thermal noise. Figure 10: Phase Image Showing location of Vertical Sample The figure is showing the full phase plot. The vertical line at column 85, designated by the yellow line, was chosen to model the phase as the distance from the center region (corresponding to the smallest phase shift and smallest distance traveled) changes. 9
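The per-pixel photon count follows from dividing the per-count energy by the photon energy. The 8.59×10^-16 J figure is the thesis's value (its exponent reconstructed from the 3671-photon result), so treat this as a consistency check rather than an independent derivation.

```python
E_PHOTON = 2.34e-19          # J per 850 nm photon
energy_per_count = 8.59e-16  # J; the thesis's geometry-based estimate

photons_per_pixel = energy_per_count / E_PHOTON
print(round(photons_per_pixel))  # 3671
```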

Figure 11

Figure 11 shows the 3-dimensional distribution of the amplitude at a distance of 0.3 m. The phase was modeled by determining the angles and distances corresponding to each 1-pixel increment from the location of highest intensity, row 65. The relationship followed a linear trend with the following equations:

rows 1-65: y = 0.0004195x − 0.005
rows 65-120: y = 0.0005095x + 0.055

The second part of the model, from row 65 to 120, had considerably more noise toward the lower end (nearer row 120). This is likely because some of the light was hitting the table first and not striking the plane directly. The residuals were taken in these two sections of the fit (Figure 13). The average absolute residual in the first section was found to be 0.0027 rad, while in the second it was 0.0044 rad. According to the equation relating phase difference and distance,

d = cφ / (4π f_m),

where c is the speed of light, f_m is the frequency of modulation, and φ is the phase difference, an uncertainty of 0.0027 rad would account for 2.1 mm of uncertainty in the distance measurement.
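The conversion from phase residual to distance uncertainty is a direct application of d = cφ/(4πf_m), using only constants from the text:

```python
import math

C = 3.0e8     # speed of light (m/s)
F_MOD = 30e6  # modulation frequency (Hz)

def distance_uncertainty(delta_phi):
    """d = c * phi / (4 * pi * f_m) applied to a small phase residual (rad)."""
    return C * delta_phi / (4 * math.pi * F_MOD)

print(distance_uncertainty(0.0027))  # ~0.0021 m, i.e. about 2.1 mm
```

At the 0.0044 rad residual seen in the noisier section, the same formula gives about 3.5 mm.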

Figure 12

Figure 13

Conclusion

The relatively robust signal strength of the camera without the aid of lenses (3671 photons incident on each pixel from 0.3 m away) and the small minimum discernible step size in the phase data (0.0027 rad) lead me to conclude that this camera could serve as a powerful detector array for diffuse optical tomography. In addition, its small size and modest power requirements would make for a compact, highly portable diffuse optical tomography system.

References

[1] D.A. Boas, D.H. Brooks, E.L. Miller, C.A. DiMarzio, M. Kilmer, R.J. Gaudette, and Q. Zhang. "Imaging the Body with Diffuse Optical Tomography." IEEE Signal Processing Magazine, 57-75, Nov. 2001.

[2] Dehghani, H., Eames, M. E., Yalavarthy, P. K., Davis, S. C., Srinivasan, S., Carpenter, C. M., ... Paulsen, K. D. (2008). "Near infrared optical tomography using NIRFAST: Algorithm for numerical model and image reconstruction." Communications in Numerical Methods in Engineering, 25(6), 711-732. doi:10.1002/cnm.1162

[3] Eggebrecht, A. T., and Culver, J. P. (2014). "High-Density Diffuse Optical Tomography: Imaging Distributed Function and Networks in the Human Brain." Bio-Optics World, 7(4). Retrieved from http://www.bioopticsworld.com

[4] S.A. Prahl, "Optical absorption of hemoglobin," Oregon Medical Laser Center, OR, Tech. Rep., Dec. 15, 1999.

[5] Reference Design Brief CamBoard Pico S 71.19K. Siegen, Germany: pmdtechnologies, 2014. Web. http://pmdtec.com/html/pdf/pmd_rd_brief_cb_pico_71.19k_v0103.pdf