SUPPLEMENTARY INFORMATION

DOI: 10.1038/NPHOTON.2015.234

Detection and tracking of moving objects hidden from view

Genevieve Gariepy,1 Francesco Tonolini,1 Robert Henderson,2 Jonathan Leach1 and Daniele Faccio1

1 Institute of Photonics and Quantum Sciences, Heriot-Watt University, David Brewster Building, Edinburgh EH14 4AS, UK
2 Institute for Micro and Nano Systems, University of Edinburgh, Alexander Crum Brown Road, Edinburgh EH9 3FF, UK

To whom correspondence should be addressed: gg132@hw.ac.uk; d.faccio@hw.ac.uk

S1 Background subtraction and data processing

As explained in the main text, when an object-free background is impractical or impossible to acquire, the median of histograms recorded at different times can be used to estimate the background. Figure S1 shows an example of a background calculated from the median of eight recorded positions. Figure S1a shows the histograms of a given pixel for the eight positions of the target mentioned in the main text. The peaks coming from the target shift in time, between 2.5 ns and 4.5 ns, as the target changes position; the rest of the signal comes from fixed objects: the walls and the ceiling. At each point in time, we take the median value over the 8 positions. Figure S1b compares the resulting median-calculated background (object-present background) with the object-free background. It gives a fairly good approximation of the background, especially for the peaks coming from fixed objects.

In a real-life implementation, the object-present background can be calculated from the first few acquisitions. If a target is present but not moving, this algorithm will fail to detect it; but as soon as the target starts moving, it takes around 15 seconds (5 acquisitions) to record a background and begin to accurately locate the target. The best approximation of the background comes from a target that moves in both the x and y directions.
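The median background estimate described above can be sketched as follows. This is a minimal NumPy illustration; the array shapes and function names are our own assumptions, not the authors' code:

```python
import numpy as np

def estimate_background(histograms):
    """Per-pixel background from acquisitions taken while the target moves.

    histograms: array of shape (n_acquisitions, n_pixels, n_bins).
    The moving target's peak sits in a different time bin in each
    acquisition, so the median over acquisitions at each bin is dominated
    by the static environment (walls, ceiling).
    """
    return np.median(histograms, axis=0)

def subtract_background(histogram, background):
    """Remove the static contribution, clipping negative counts to zero."""
    return np.clip(histogram - background, 0.0, None)
```

With eight acquisitions, a target echo that occupies a given bin in only one of them is zeroed out by the median, while an echo present in every acquisition survives as background.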
However, we underline that during this initialisation period of 15 seconds, the camera is already acquiring data that provides information on the movement of the target: this data is simply less accurate than data acquired at later times, but it nonetheless indicates target movement.

To retrieve the position of the hidden target, we also need to know the position r_i of each pixel i on the floor. The camera looks down at the floor (it is located 46 cm above the floor and images a region on the floor located 67 cm ahead), but still records a 32x32 square image. This image actually corresponds to a field of view that is trapezoidal and stretched, both spatially and temporally, with respect to a square field of view perpendicular to the camera's line of sight.

Figure S1: Background subtraction. To avoid acquiring a background in controlled conditions, where we know the target is not present, we can rely on a few acquisitions with the target at different positions. a) In the eight histograms shown here, the target was positioned at different places, leading to a variation in its detected signal. b) The background calculated from the median of the histograms in a (object-present) is very close to the background acquired without the target present (object-free).

To correctly retrieve the positions of a hidden target, we first need to determine the actual position r_i = (x_i, y_i) of each pixel of the camera. To do so, we measure the dimensions of the field of view and its distance to the camera to reconstruct the actual shape of the field of view on the floor. In a real-life application, knowing the height and angle of the camera with respect to the floor is sufficient to determine the geometry of its field of view and therefore the position (x_i, y_i) of each pixel. We also correct for the temporal distortion of the recorded data: because the field of view is not perpendicular to the camera's line of sight, photons recorded at the top of the field of view take longer to reach the camera than those coming from the bottom. Again, knowing the geometry of our imaging system, we correct the measured t_i accordingly. We then use the values of r_i, t_i and σ_ti to retrieve the position of a hidden target, as explained in the next section.

S2 Position retrieval

The mathematical model we develop for locating objects around corners relies on the time histograms recorded by each pixel of the SPAD camera. These histograms give a probability distribution of arrival times for photons scattered by a hidden target. This time distribution can be mapped into a probability density in space for the target position. A spatial probability density is calculated for each pixel, and the product of all the pixel probabilities constitutes the joint probability density of finding the target at a specific position in space.

We first treat each pixel independently. The scattered light from an object at position r_o is detected at pixel i (located at position r_i) at time t_i. The arrival time t_i corresponds to the time the light takes to propagate (at speed c) from the laser position on the floor (r_l) to the object and then from the object to the pixel:

    c t_i = |r_o - r_l| + |r_i - r_o|    (S1)

Figure S2 illustrates the variables and relevant distances used in the calculation. Solving equation S1 for the object position r_o gives infinitely many solutions lying on the surface of an ellipsoid with foci r_l and r_i: the possible positions in space that can generate a signal at pixel i at time t_i lie on an ellipsoidal surface with evenly distributed probability. In the absence of any uncertainty in the measured signals, the probability density ρ_i(r_o) of the object's location calculated from the data collected by pixel i is therefore

    ρ_i(r_o) ∝ 1  if |r_o - r_l| + |r_i - r_o| = c t_i,  0 otherwise.    (S2)

We write ρ_i here with a proportionality sign and will treat the normalisation issue (∫_S ρ_i(r_o) d r_o = 1) at the end of the development.

Figure S2: Retrieving the position of a hidden target. For each pixel i, a signal is detected at time t_i, the time it takes for a pulse to propagate from the laser spot to the object (|r_o - r_l|) and from the object to the pixel (|r_i - r_o|). We determine a probability density ρ_i(r_o) for where the object can be located to send a signal to pixel i at time t_i.
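Equation S1 can be evaluated directly. The sketch below (our own illustration, with hypothetical variable names) computes the total path length for candidate object positions; positions for which it equals c t_i lie on the ellipsoid of equation S2:

```python
import numpy as np

def path_length(r_o, r_l, r_i):
    """Total path laser spot -> object -> pixel of eq. (S1).

    r_o: (n, 3) candidate object positions; r_l, r_i: (3,) the two foci
    (laser spot on the floor and pixel's floor position).
    Candidates where path_length(...) == c * t_i lie on the ellipsoid
    of equation (S2).
    """
    r_o = np.atleast_2d(r_o)
    return (np.linalg.norm(r_o - r_l, axis=1)
            + np.linalg.norm(r_i - r_o, axis=1))
```

Because the sum of distances to the two foci is constant on an ellipsoid, every point returned with the same path length is an equally probable object position for that pixel and arrival time.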
The function above can be written in a simpler form by using ellipsoidal coordinates with foci r_l and r_i:

    ρ_i(r_o) ∝ δ(ε - c t_i),  where  ε = |r_o - r_l| + |r_i - r_o|.    (S3)

In any real implementation, the histogram h_i(t) recorded by a given pixel i will contain an uncertainty on the arrival time of the signal. As a result, the probability density ρ_i(r_o) will no longer be a uniform ellipsoid; the uncertainty in the time histogram can be mapped onto the spatial probability density as

    ρ_i(r_o) ∝ ∫ δ(ε - c t) h_i(t) dt = h_i(ε/c).    (S4)

In our experiments, the signal recorded in h_i(t) has a Gaussian form with a standard deviation σ_ti. The uncertainty originates from different sources, for example the jitter of the system and the finite size of the target. As a result of the Gaussian form of the recorded signal, the general expression of the probability density ρ_i(r_o) in S4 becomes

    ρ_i(r_o) ∝ exp[ -(ε/c - t_i)² / (2 σ_ti²) ].    (S5)

Here t_i is the mean arrival time registered by pixel i and σ_ti is its standard deviation. Figure S2 shows an example of the spatial distribution of ρ_i(r_o).

In our setup we aim to locate the object in x and y, on the plane of the floor, under the assumption that the object does not move considerably in the vertical direction. We therefore set a two-dimensional search space at a given height close to ground level, and r_o takes the form r_o = (x_o, y_o). We take the height of the search space to be the height of the chest of our foam mannequin, z_o = 17 cm, as this is the region of highest reflectivity. In a real-life implementation, the height can be estimated based on the type of target we wish to track or locate. An error ∆z in estimating this height results, in the worst case, in an error ∆r_o of the same order in the determination of the target's r_o coordinates; it is typically much smaller and decreases rapidly for objects that are further away: ∆r_o ∼ (∆z/|r_o|) ∆z.

As mentioned above, the calculated probability densities ρ_i of each pixel are multiplied to obtain the joint probability density P(r_o).
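Equation S5 evaluated over a two-dimensional search grid can be sketched as follows. This is our own illustration; the grid layout, units and names are assumptions, not the authors' implementation:

```python
import numpy as np

C = 0.2998  # speed of light in m/ns

def pixel_density(grid_xy, r_l, r_i, t_i, sigma_t, z0=0.17):
    """Unnormalised spatial density of eq. (S5) for one pixel.

    grid_xy: (n, 2) candidate (x, y) floor positions in metres
    r_l, r_i: (3,) laser-spot and pixel positions in metres
    t_i, sigma_t: mean arrival time and its standard deviation in ns
    z0: assumed target height (chest of the mannequin) in metres
    """
    r_o = np.column_stack([grid_xy, np.full(len(grid_xy), z0)])
    eps = (np.linalg.norm(r_o - r_l, axis=1)
           + np.linalg.norm(r_i - r_o, axis=1))
    return np.exp(-(eps / C - t_i) ** 2 / (2.0 * sigma_t ** 2))
```

The density is maximal on the ellipse where the round-trip time ε/c matches the measured t_i and falls off with the Gaussian width σ_ti.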
However, there is a risk that for a given pixel i the fit of the signal is not significant and the values of t_i and σ_ti are unreliable. To prevent these unreliable ρ_i(r_o) from affecting the joint probability density by multiplying relevant densities by zero, we take the probability P_i(r_o) associated with pixel i to be a linear combination of the probability density ρ_i(r_o) and a uniform probability density P_uniform, which prevents any point in space from being multiplied by zero:

    P_i(r_o) = α_i ρ_i(r_o) + (1 - α_i) P_uniform.    (S6)

Here, α_i is a coefficient between 0 and 1 related to the reliability of the probability density P_i(r_o). Our choice of α_i for each pixel depends on how well the probability density P_i(r_o) overlaps with the space in which we are searching for the object. More precisely, we set α_i = A_i/A, where A_i is the area of the search space in which the probability density P_i(r_o) is over a certain threshold (half its maximum) and A is the total area of the search space. Given a number of pixels n, each with associated probability P_i(r_o), the joint probability density P(r_o) is defined as

    P(r_o) = N ∏_{i=1}^{n} P_i(r_o),    (S7)

where N is a normalisation constant.
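Equations S6 and S7 can be sketched as follows. This is our own illustration; taking the product in log space for numerical stability is our choice, a detail the text does not specify:

```python
import numpy as np

def joint_density(densities):
    """Combine per-pixel densities on a search grid via eqs (S6)-(S7).

    densities: (n_pixels, n_cells) unnormalised densities (e.g. eq. S5).
    Each density is mixed with a uniform floor weighted by its reliability
    alpha_i = A_i / A, the fraction of the search space above half the
    density's maximum, so an unreliable pixel cannot zero out the product.
    Returns the normalised joint density P(r_o).
    """
    n_pixels, n_cells = densities.shape
    uniform = 1.0 / n_cells
    log_joint = np.zeros(n_cells)
    for p in densities:
        if p.max() <= 0.0:                   # pixel with no usable signal
            continue                         # contributes a flat density
        p = p / p.sum()                      # normalise over the grid
        alpha = np.mean(p > 0.5 * p.max())   # alpha_i = A_i / A
        log_joint += np.log(alpha * p + (1.0 - alpha) * uniform)
    joint = np.exp(log_joint - log_joint.max())
    return joint / joint.sum()
```

Because every mixed density is strictly positive wherever α_i < 1, a single bad pixel lowers but never annihilates the probability at the true target position.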

S3 Signal-to-noise ratio

We measured the signal-to-noise ratio (SNR) for all eight positions of the target shown in figure 3 of the main text. We find that, for a fixed distance between the camera and the field of view, the SNR is proportional to 1/r⁴ = 1/(|r_o - r_l| |r_o - r_i|)², as expected from the attenuation of the spherical waves propagating from the laser spot to the object and then from the object to the field of view of the camera. This dependence is clearly seen in the data shown in figure S3.

Figure S3: Signal-to-noise ratio of experimental data. The signal-to-noise ratio decreases with the attenuation of the signal in 1/r⁴.

In the experiments reported here, we obtain a SNR of 2 for a target at a distance r = √(|r_o - r_l| |r_o - r_i|) = 60 cm. A few improvements, both in the setup and in the detection hardware, could extend this distance, possibly up to 10 metres:

1. The SNR is proportional to the area of the target. For a human, this implies an increase in signal by a factor 30 compared to the target used in our experiments. The improvement for cars is potentially even higher.

2. Some characteristics of the SPAD array are likely to improve in the future (and are indeed already under development). For example, the fill factor of the current detector is only 2%. With arrays where the electronics are vertically integrated, or using microlens arrays, it will be possible to increase the fill factor to 8%, allowing a further 4x improvement.

3. We currently integrate the signal over a limited number of frames, as our read-out speed is limited by the USB 2.0 connection of the SPAD array. Work is under way to incorporate USB 3.0 technology into the camera, so that the read-out will be greatly accelerated and a higher number of integration frames can be used. Using the increased data rates of USB 3.0, we can expect the SNR to increase by a factor 100.

4. We could readily gain a factor 5x by optimising the near-infrared sensitivity of the SPAD detectors. Sensitivities of the order of 25-30% at our operating wavelength have recently been demonstrated, compared with the 5% sensitivity of our current system [1, 2].

We also note that many of these numbers (8% fill factor with microlens arrays, 25% photon efficiency) are already available in commercial SPAD arrays operating at 1550 nm (InGaAs technology).
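Since the SNR scales as 1/r⁴, an overall signal-level gain G only translates into a range gain of G^(1/4). A one-line sketch (our own hypothetical helper, not from the paper):

```python
def range_gain(snr_gain):
    """Range improvement from an SNR improvement, given SNR proportional
    to 1/r^4: the achievable range grows as the fourth root of the gain."""
    return snr_gain ** 0.25
```

For example, a 16x improvement in signal only doubles the achievable range, which is why the large combined factors above are needed to reach multi-metre distances.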

Taking into account all of these contributions, it is likely that we can improve the signal by a factor 6 × 10⁴, which means we could track human-sized targets at around 10 metres. Some factors could, in turn, decrease the signal-to-noise ratio. Mainly, we use a white surface and target in our experiments: a change in material could decrease the SNR by a factor 10. This would translate into a detection range of around 6 m rather than 10 m.

S4 Uncertainty of the position retrieval

We measure the uncertainty, or precision, in x and y of our technique by calculating the mean square weighted deviation, namely σ²_x, σ²_y, of the retrieved probability densities P(r_o). As can be seen in figure 3 of the main text, the precision varies with the position of the hidden target. Figure S4 shows how σ_x and σ_y vary as a function of the target's distance, more precisely as a function of |r_o - r_i|, the distance between the object and the field of view.

Figure S4: Uncertainty of experimental data. Uncertainty of the retrieved position for the eight positions of the target shown in figure 3 of the main text.

Figure S5: Simulated uncertainty of the position retrieval algorithm. The uncertainties σ_x and σ_y of our retrieval algorithm depend on the target's position. a) The precision in x is maximal on the line crossing the two foci of the probability density ellipses used in the retrieval algorithm (y = 83), and the uncertainty grows away from this line in the y direction. For a given y position, we also observe the uncertainty σ_x to decrease with increasing distance. b) The uncertainty σ_y behaves differently, increasing with distance as the ellipses ρ_i get larger.
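The mean square weighted deviation used above can be computed from the retrieved density as follows (our own sketch; the grid layout and names are hypothetical):

```python
import numpy as np

def weighted_spread(prob, xs, ys):
    """sigma_x, sigma_y: square roots of the mean square weighted
    deviations of a 2-D probability density sampled on a regular grid.

    prob: (ny, nx) density values; xs: (nx,) and ys: (ny,) grid axes in cm.
    """
    prob = prob / prob.sum()
    px = prob.sum(axis=0)                  # marginal distribution in x
    py = prob.sum(axis=1)                  # marginal distribution in y
    mx, my = np.sum(px * xs), np.sum(py * ys)
    sigma_x = np.sqrt(np.sum(px * (xs - mx) ** 2))
    sigma_y = np.sqrt(np.sum(py * (ys - my) ** 2))
    return sigma_x, sigma_y
```

Applied to the joint density P(r_o) on the search grid, this returns the σ_x and σ_y values plotted in figures S4 and S5.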

We also observe that the precision of the method depends on the specific x and y position of the target. To study this dependence, we perform simulations of the signal emitted by a single-point target scanning a range of positions in x and y. The signal emitted by this target is taken to have a temporal deviation of σ_t = 218 ps, the average deviation found in our experimental data. From the simulated signal, we apply the same position retrieval algorithm to find a probability density P(r_o) and calculate its weighted standard deviations σ_x and σ_y. Figure S5 shows the obtained uncertainties, encoded in colour, as a function of the target's position (x, y). We observe in figure S5 a complex dependence of the precision on the target's position, which can be explained geometrically by considering the probability density ellipses ρ_i used in our retrieval. The behaviour of the uncertainty is explained schematically in figure S6: the ellipses resulting from a target at a large distance are both large and similar to each other (as they converge towards the limiting mathematical function, which is a circle). This leads to a large overlap and hence a large uncertainty in the y direction, but a reduced uncertainty in the x direction.

Figure S6: Geometry of the position retrieval algorithm and its expected precision. The dependence of σ_x and σ_y can be understood from the geometry of our retrieval process. We rely on ellipses to describe the probability density of an object's position corresponding to our measurements. If the object is close (green), the ellipses are smaller and have a more pronounced curvature where they overlap. When the object is farther away (red), the ellipses become flatter in the region of overlap, leading to a higher uncertainty in y but a lower uncertainty in x. A similar argument explains the variation of σ_x as a function of the y position of the object, where the uncertainty in x varies according to the curvature of the ellipses at the position of overlap.

We also study the effect of another important parameter on the uncertainties σ_x and σ_y of our retrieval: the number of pixels used in the retrieval algorithm. Although 1024 pixels (32x32) are technically available for the retrieval, the decreasing signal-to-noise ratio leads to a loss of signal on some of these pixels. In figure S7a, we show the precision obtained for the eight positions of figure 3 of the main text as a function of the number of pixels used in the retrieval algorithm. We observe that the uncertainty increases quite significantly when the number of pixels decreases. To confirm this effect, we perform simulations in which only a randomly chosen portion of the 1024 pixels is used for the retrieval algorithm. The results, shown in figure S7b, clearly show a dependence between the number of pixels and the obtained precision. This dependency shows how a loss of signal on a portion of our pixels affects the precision of our method; it is also relevant to the design of similar systems that would track hidden objects with a number of single-pixel detectors rather than a SPAD array. In such a case, the relative position and distribution of the pixels will also play an important role in the expected precision of the system.

Figure S7: Effect of the number of pixels on the precision of the retrieval algorithm. Using fewer than the 1024 pixels of the camera for the retrieval algorithm leads to a lower precision. This correlation is observed in the experimental data (a) shown in figure 3 of the main text, and is confirmed by simulations (b). This dependency can explain one effect of a lower signal-to-noise ratio on the precision: if the signal cannot be detected on all pixels of the field of view, fewer pixels are used for the position retrieval and the uncertainty is higher.
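The random-subset simulation can be sketched as follows. This is our own toy version on a 1-D grid with Gaussian per-pixel densities; it reproduces only the qualitative trend that fewer pixels give a larger spread:

```python
import numpy as np

def subset_spread(densities, xs, n_pixels, rng):
    """Spread sigma of the joint density built from a random subset of
    pixels, mimicking the loss of usable pixels at low SNR.

    densities: (n_total, n_cells) per-pixel densities on the grid xs.
    """
    idx = rng.choice(len(densities), size=n_pixels, replace=False)
    # product of the selected densities, taken in log space
    log_joint = np.sum(np.log(densities[idx] + 1e-300), axis=0)
    joint = np.exp(log_joint - log_joint.max())
    joint /= joint.sum()
    mean = np.sum(joint * xs)
    return np.sqrt(np.sum(joint * (xs - mean) ** 2))
```

For identical Gaussian densities of width σ, the product of n of them has width σ/√n, so the spread shrinks as more pixels contribute, matching the trend of figure S7.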
S5 Tracking moving objects

We show in the main text the results of an experiment where the target moves at 2.8 cm/s on a track positioned at x = 4 cm. Two other sets of data were acquired for tracks positioned at x = 2 cm and x = 3 cm, with the object moving along the track at 3.9 cm/s and 5.3 cm/s, respectively. Figure S8 shows consecutive positions acquired for the target moving along these two other tracks. These results are similar to those shown in Figure 4 of the main text, although they are more precise as the target is closer to the wall. We can also see the difference in the distance covered by the target moving at different speeds. The Supplementary Video provides all retrieved positions for all tracks, where we clearly see a difference in speed as the target changes track.

S6 Tracking multiple targets

Our method can be extended to the tracking of multiple hidden objects, provided that the signals originating from the objects do not significantly overlap. As a proof of principle, we recorded the signal from two targets separated by 45 cm. Once the background is subtracted from the recorded signal, we can distinguish the signals coming from the two distinct targets, as they produce backscattered spherical waves that are separated both in time and in space, as illustrated in Fig. S9. For this proof-of-principle experiment, we first found the multiple peaks in each histogram and applied our retrieval algorithm individually to each of the two separate signals. In the inset of figure S9, we can see that the two signals are spatially separated in each frame. The retrieved probability densities are in good agreement with the positions retrieved from single-target measurements.

Figure S8: Tracking a moving object. The target is moving in a straight line along the y direction, from bottom to top, at a speed of 3.9 cm/s in a and 5.3 cm/s in b. The retrieved probability densities of the target's location are represented in colour, where the colour indicates the time at which the position was recorded. The acquisition was done in real time as the object moved along the tracks.

References

[1] Webster, E.A.G., Grant, L.A. & Henderson, R.K. A high-performance single-photon avalanche diode in 130-nm CMOS imaging technology. IEEE Electron Device Letters 33, (2012).

[2] Mandai, S., Fishburn, M.W., Maruyama, Y. & Charbon, E. A wide spectral range single-photon avalanche diode fabricated in an advanced 180 nm CMOS technology. Opt. Express 20, (2012).
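The per-histogram peak separation used in this proof of principle can be sketched as follows (our own simple local-maximum finder; the authors' actual peak-finding method is not specified):

```python
import numpy as np

def find_target_peaks(hist, threshold, min_separation):
    """Bin indices of distinct peaks in a background-subtracted histogram.

    Keeps local maxima above `threshold`, greedily from strongest to
    weakest, discarding any maximum within `min_separation` bins of an
    already accepted peak, so each target's echo is returned once.
    """
    candidates = [i for i in range(1, len(hist) - 1)
                  if hist[i] >= hist[i - 1] and hist[i] > hist[i + 1]
                  and hist[i] > threshold]
    candidates.sort(key=lambda i: hist[i], reverse=True)
    peaks = []
    for i in candidates:
        if all(abs(i - j) >= min_separation for j in peaks):
            peaks.append(i)
    return sorted(peaks)
```

Each returned peak can then be fitted and fed to the position-retrieval algorithm of section S2 independently, as done for the two targets of figure S9.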

Figure S9: Detecting multiple targets. When multiple targets are present, they produce different signals that can be distinguished both in time (distinct peaks in the histograms) and in space. The inset shows one frame of the recorded movie, where we clearly see two backscattered spherical waves passing through the field of view, as indicated by the blue (target 1) and purple (target 2) lines. The retrieved positions are in good agreement with those found by measuring the signal from one single target at a time. The blue and purple dots represent the point of maximum retrieved probability in the single-target measurements for targets 1 and 2, respectively.


More information

Modeling and Estimation of FPN Components in CMOS Image Sensors

Modeling and Estimation of FPN Components in CMOS Image Sensors Modeling and Estimation of FPN Components in CMOS Image Sensors Abbas El Gamal a, Boyd Fowler a,haomin b,xinqiaoliu a a Information Systems Laboratory, Stanford University Stanford, CA 945 USA b Fudan

More information

34.2: Two Types of Image

34.2: Two Types of Image Chapter 34 Images 34.2: Two Types of Image For you to see an object, your eye intercepts some of the light rays spreading from the object and then redirect them onto the retina at the rear of the eye.

More information

Tutorial: Instantaneous Measurement of M 2 Beam Propagation Ratio in Real-Time

Tutorial: Instantaneous Measurement of M 2 Beam Propagation Ratio in Real-Time Tutorial: Instantaneous Measurement of M 2 Beam Propagation Ratio in Real-Time By Allen M. Cary, Jeffrey L. Guttman, Razvan Chirita, Derrick W. Peterman, Photon Inc A new instrument design allows the M

More information

Updated Sections 3.5 and 3.6

Updated Sections 3.5 and 3.6 Addendum The authors recommend the replacement of Sections 3.5 3.6 and Table 3.15 with the content of this addendum. Consequently, the recommendation is to replace the 13 models and their weights with

More information

f. (5.3.1) So, the higher frequency means the lower wavelength. Visible part of light spectrum covers the range of wavelengths from

f. (5.3.1) So, the higher frequency means the lower wavelength. Visible part of light spectrum covers the range of wavelengths from Lecture 5-3 Interference and Diffraction of EM Waves During our previous lectures we have been talking about electromagnetic (EM) waves. As we know, harmonic waves of any type represent periodic process

More information

SUPPLEMENTARY INFORMATION

SUPPLEMENTARY INFORMATION SUPPLEMENTARY INFORMATION doi:10.1038/nature10934 Supplementary Methods Mathematical implementation of the EST method. The EST method begins with padding each projection with zeros (that is, embedding

More information

A 100Hz Real-time Sensing System of Textured Range Images

A 100Hz Real-time Sensing System of Textured Range Images A 100Hz Real-time Sensing System of Textured Range Images Hidetoshi Ishiyama Course of Precision Engineering School of Science and Engineering Chuo University 1-13-27 Kasuga, Bunkyo-ku, Tokyo 112-8551,

More information

Uncertainty simulator to evaluate the electrical and mechanical deviations in cylindrical near field antenna measurement systems

Uncertainty simulator to evaluate the electrical and mechanical deviations in cylindrical near field antenna measurement systems Uncertainty simulator to evaluate the electrical and mechanical deviations in cylindrical near field antenna measurement systems S. Burgos*, F. Martín, M. Sierra-Castañer, J.L. Besada Grupo de Radiación,

More information

A Shape from Silhouette Approach to Imaging Ocean Mines

A Shape from Silhouette Approach to Imaging Ocean Mines A Shape from Silhouette Approach to Imaging Ocean Mines Robert Drost, David Munson, Jr., and Andrew Singer Coordinated Science Laboratory and Department of Electrical and Computer Engineering University

More information

Optimized Design of 3D Laser Triangulation Systems

Optimized Design of 3D Laser Triangulation Systems The Scan Principle of 3D Laser Triangulation Triangulation Geometry Example of Setup Z Y X Target as seen from the Camera Sensor Image of Laser Line The Scan Principle of 3D Laser Triangulation Detektion

More information

Ground Plane Motion Parameter Estimation For Non Circular Paths

Ground Plane Motion Parameter Estimation For Non Circular Paths Ground Plane Motion Parameter Estimation For Non Circular Paths G.J.Ellwood Y.Zheng S.A.Billings Department of Automatic Control and Systems Engineering University of Sheffield, Sheffield, UK J.E.W.Mayhew

More information

Local Image Registration: An Adaptive Filtering Framework

Local Image Registration: An Adaptive Filtering Framework Local Image Registration: An Adaptive Filtering Framework Gulcin Caner a,a.murattekalp a,b, Gaurav Sharma a and Wendi Heinzelman a a Electrical and Computer Engineering Dept.,University of Rochester, Rochester,

More information

A Model-Independent, Multi-Image Approach to MR Inhomogeneity Correction

A Model-Independent, Multi-Image Approach to MR Inhomogeneity Correction Tina Memo No. 2007-003 Published in Proc. MIUA 2007 A Model-Independent, Multi-Image Approach to MR Inhomogeneity Correction P. A. Bromiley and N.A. Thacker Last updated 13 / 4 / 2007 Imaging Science and

More information

Development of an automated bond verification system for advanced electronic packages

Development of an automated bond verification system for advanced electronic packages Development of an automated bond verification system for advanced electronic packages Michel Darboux, Jean-Marc Dinten LETI (CEA - Technologies AvancSes) - DSYS/SCSI CENG, 17 rue des Martyrs F38054 Grenoble

More information

Dynamic Reconstruction for Coded Aperture Imaging Draft Unpublished work please do not cite or distribute.

Dynamic Reconstruction for Coded Aperture Imaging Draft Unpublished work please do not cite or distribute. Dynamic Reconstruction for Coded Aperture Imaging Draft 1.0.1 Berthold K.P. Horn 2007 September 30. Unpublished work please do not cite or distribute. The dynamic reconstruction technique makes it possible

More information

Implemented by Valsamis Douskos Laboratoty of Photogrammetry, Dept. of Surveying, National Tehnical University of Athens

Implemented by Valsamis Douskos Laboratoty of Photogrammetry, Dept. of Surveying, National Tehnical University of Athens An open-source toolbox in Matlab for fully automatic calibration of close-range digital cameras based on images of chess-boards FAUCCAL (Fully Automatic Camera Calibration) Implemented by Valsamis Douskos

More information

Chapter 7: Geometrical Optics. The branch of physics which studies the properties of light using the ray model of light.

Chapter 7: Geometrical Optics. The branch of physics which studies the properties of light using the ray model of light. Chapter 7: Geometrical Optics The branch of physics which studies the properties of light using the ray model of light. Overview Geometrical Optics Spherical Mirror Refraction Thin Lens f u v r and f 2

More information

Validation of Heat Conduction 2D Analytical Model in Spherical Geometries using infrared Thermography.*

Validation of Heat Conduction 2D Analytical Model in Spherical Geometries using infrared Thermography.* 11 th International Conference on Quantitative InfraRed Thermography Validation of Heat Conduction 2D Analytical Model in Spherical Geometries using infrared Thermography.* by C. San Martín 1,2, C. Torres

More information

Distortion Correction for Conical Multiplex Holography Using Direct Object-Image Relationship

Distortion Correction for Conical Multiplex Holography Using Direct Object-Image Relationship Proc. Natl. Sci. Counc. ROC(A) Vol. 25, No. 5, 2001. pp. 300-308 Distortion Correction for Conical Multiplex Holography Using Direct Object-Image Relationship YIH-SHYANG CHENG, RAY-CHENG CHANG, AND SHIH-YU

More information

Assignment 3: Edge Detection

Assignment 3: Edge Detection Assignment 3: Edge Detection - EE Affiliate I. INTRODUCTION This assignment looks at different techniques of detecting edges in an image. Edge detection is a fundamental tool in computer vision to analyse

More information

COHERENCE AND INTERFERENCE

COHERENCE AND INTERFERENCE COHERENCE AND INTERFERENCE - An interference experiment makes use of coherent waves. The phase shift (Δφ tot ) between the two coherent waves that interfere at any point of screen (where one observes the

More information

arxiv: v1 [cs.cv] 2 Mar 2017

arxiv: v1 [cs.cv] 2 Mar 2017 Non-line-of-sight tracking of people at long range SUSAN CHAN,,** RYAN E. WARBURTON,,** GENEVIEVE GARIEPY, JONATHAN LEACH, AND DANIELE FACCIO,* arxiv:7.4v [cs.cv] Mar 7 Institute of Photonics and Quantum

More information

Simplified model of ray propagation and outcoupling in TFs.

Simplified model of ray propagation and outcoupling in TFs. Supplementary Figure 1 Simplified model of ray propagation and outcoupling in TFs. After total reflection at the core boundary with an angle α, a ray entering the taper (blue line) hits the taper sidewalls

More information

Geometry Vocabulary Math Fundamentals Reference Sheet Page 1

Geometry Vocabulary Math Fundamentals Reference Sheet Page 1 Math Fundamentals Reference Sheet Page 1 Acute Angle An angle whose measure is between 0 and 90 Acute Triangle A that has all acute Adjacent Alternate Interior Angle Two coplanar with a common vertex and

More information

mywbut.com Diffraction

mywbut.com Diffraction Diffraction If an opaque obstacle (or aperture) is placed between a source of light and screen, a sufficiently distinct shadow of opaque (or an illuminated aperture) is obtained on the screen.this shows

More information

PHY 222 Lab 11 Interference and Diffraction Patterns Investigating interference and diffraction of light waves

PHY 222 Lab 11 Interference and Diffraction Patterns Investigating interference and diffraction of light waves PHY 222 Lab 11 Interference and Diffraction Patterns Investigating interference and diffraction of light waves Print Your Name Print Your Partners' Names Instructions April 17, 2015 Before lab, read the

More information

To Measure a Constant Velocity. Enter.

To Measure a Constant Velocity. Enter. To Measure a Constant Velocity Apparatus calculator, black lead, calculator based ranger (cbr, shown), Physics application this text, the use of the program becomes second nature. At the Vernier Software

More information

Impact of 3D Laser Data Resolution and Accuracy on Pipeline Dents Strain Analysis

Impact of 3D Laser Data Resolution and Accuracy on Pipeline Dents Strain Analysis More Info at Open Access Database www.ndt.net/?id=15137 Impact of 3D Laser Data Resolution and Accuracy on Pipeline Dents Strain Analysis Jean-Simon Fraser, Pierre-Hugues Allard Creaform, 5825 rue St-Georges,

More information

Image Processing Fundamentals. Nicolas Vazquez Principal Software Engineer National Instruments

Image Processing Fundamentals. Nicolas Vazquez Principal Software Engineer National Instruments Image Processing Fundamentals Nicolas Vazquez Principal Software Engineer National Instruments Agenda Objectives and Motivations Enhancing Images Checking for Presence Locating Parts Measuring Features

More information

Optical Sensors: Key Technology for the Autonomous Car

Optical Sensors: Key Technology for the Autonomous Car Optical Sensors: Key Technology for the Autonomous Car Rajeev Thakur, P.E., Product Marketing Manager, Infrared Business Unit, Osram Opto Semiconductors Autonomously driven cars will combine a variety

More information

Selective Optical Assembly of Highly Uniform. Nanoparticles by Doughnut-Shaped Beams

Selective Optical Assembly of Highly Uniform. Nanoparticles by Doughnut-Shaped Beams SUPPLEMENTARY INFORMATION Selective Optical Assembly of Highly Uniform Nanoparticles by Doughnut-Shaped Beams Syoji Ito 1,2,3*, Hiroaki Yamauchi 1,2, Mamoru Tamura 4,5, Shimpei Hidaka 4,5, Hironori Hattori

More information

Tracking Trajectories of Migrating Birds Around a Skyscraper

Tracking Trajectories of Migrating Birds Around a Skyscraper Tracking Trajectories of Migrating Birds Around a Skyscraper Brian Crombie Matt Zivney Project Advisors Dr. Huggins Dr. Stewart Abstract In this project, the trajectories of birds are tracked around tall

More information

And. Modal Analysis. Using. VIC-3D-HS, High Speed 3D Digital Image Correlation System. Indian Institute of Technology New Delhi

And. Modal Analysis. Using. VIC-3D-HS, High Speed 3D Digital Image Correlation System. Indian Institute of Technology New Delhi Full Field Displacement And Strain Measurement And Modal Analysis Using VIC-3D-HS, High Speed 3D Digital Image Correlation System At Indian Institute of Technology New Delhi VIC-3D, 3D Digital Image Correlation

More information

LC-1: Interference and Diffraction

LC-1: Interference and Diffraction Your TA will use this sheet to score your lab. It is to be turned in at the end of lab. You must use complete sentences and clearly explain your reasoning to receive full credit. The lab setup has been

More information

Middle School Math Course 2

Middle School Math Course 2 Middle School Math Course 2 Correlation of the ALEKS course Middle School Math Course 2 to the Indiana Academic Standards for Mathematics Grade 7 (2014) 1: NUMBER SENSE = ALEKS course topic that addresses

More information

Beam Profilier - Beamage 3.0

Beam Profilier - Beamage 3.0 Profilier - age 3.0 KEY FEATURES High resolution (160x120 points) 2.2 MPixels resolution gives accurate profile measurements on very small beams Large Area The 11.3 x 6.0 mm sensor allows to measure very

More information

Grade 7 Math Curriculum Map Erin Murphy

Grade 7 Math Curriculum Map Erin Murphy Topic 1 Algebraic Expressions and Integers 2 Weeks Summative Topic Test: SWBAT use rules to add and subtract integers, Evaluate algebraic expressions, use the order of operations, identify numerical and

More information

SYSTEM LINEARITY LAB MANUAL: 2 Modifications for P551 Fall 2013 Medical Physics Laboratory

SYSTEM LINEARITY LAB MANUAL: 2 Modifications for P551 Fall 2013 Medical Physics Laboratory SYSTEM LINEARITY LAB MANUAL: 2 Modifications for P551 Fall 2013 Medical Physics Laboratory Introduction In this lab exercise, you will investigate the linearity of the DeskCAT scanner by making measurements

More information

MetroPro Surface Texture Parameters

MetroPro Surface Texture Parameters MetroPro Surface Texture Parameters Contents ROUGHNESS PARAMETERS...1 R a, R q, R y, R t, R p, R v, R tm, R z, H, R ku, R 3z, SR z, SR z X, SR z Y, ISO Flatness WAVINESS PARAMETERS...4 W a, W q, W y HYBRID

More information

Practical Robotics (PRAC)

Practical Robotics (PRAC) Practical Robotics (PRAC) A Mobile Robot Navigation System (1) - Sensor and Kinematic Modelling Nick Pears University of York, Department of Computer Science December 17, 2014 nep (UoY CS) PRAC Practical

More information

1. Polarization effects in optical spectra of photonic crystals

1. Polarization effects in optical spectra of photonic crystals Speech for JASS 05. April 2005. Samusev A. 1. Polarization effects in optical spectra of photonic crystals Good afternoon. I would like to introduce myself. My name is Anton Samusev. I m a student of Saint

More information

Chapter 18. Geometric Operations

Chapter 18. Geometric Operations Chapter 18 Geometric Operations To this point, the image processing operations have computed the gray value (digital count) of the output image pixel based on the gray values of one or more input pixels;

More information

Advanced Image Reconstruction Methods for Photoacoustic Tomography

Advanced Image Reconstruction Methods for Photoacoustic Tomography Advanced Image Reconstruction Methods for Photoacoustic Tomography Mark A. Anastasio, Kun Wang, and Robert Schoonover Department of Biomedical Engineering Washington University in St. Louis 1 Outline Photoacoustic/thermoacoustic

More information

GEOG 4110/5100 Advanced Remote Sensing Lecture 4

GEOG 4110/5100 Advanced Remote Sensing Lecture 4 GEOG 4110/5100 Advanced Remote Sensing Lecture 4 Geometric Distortion Relevant Reading: Richards, Sections 2.11-2.17 Review What factors influence radiometric distortion? What is striping in an image?

More information

Single-photon sensitive light-in-flight imaging

Single-photon sensitive light-in-flight imaging Single-photon sensitive light-in-flight imaging Genevieve Gariepy, 1 Nikola Krstajić, 2,3 Robert Henderson, 2 Chunyong Li, 1 Robert R. Thomson, 1 Gerald S. Buller, 1 Barmak Heshmat, 4 Ramesh Raskar, 4

More information

Measurements using three-dimensional product imaging

Measurements using three-dimensional product imaging ARCHIVES of FOUNDRY ENGINEERING Published quarterly as the organ of the Foundry Commission of the Polish Academy of Sciences ISSN (1897-3310) Volume 10 Special Issue 3/2010 41 46 7/3 Measurements using

More information

SIMULATION AND VISUALIZATION IN THE EDUCATION OF COHERENT OPTICS

SIMULATION AND VISUALIZATION IN THE EDUCATION OF COHERENT OPTICS SIMULATION AND VISUALIZATION IN THE EDUCATION OF COHERENT OPTICS J. KORNIS, P. PACHER Department of Physics Technical University of Budapest H-1111 Budafoki út 8., Hungary e-mail: kornis@phy.bme.hu, pacher@phy.bme.hu

More information

CHAPTER 3 WAVEFRONT RECONSTRUCTION METHODS. 3.1 Spatial Correlation Method

CHAPTER 3 WAVEFRONT RECONSTRUCTION METHODS. 3.1 Spatial Correlation Method CHAPTER 3 WAVEFRONT RECONSTRUCTION METHODS Pierce [9] defines a wavefront as any moving surface along which a waveform feature is being simultaneously received. The first step in performing ray tracing

More information

ECE 172A: Introduction to Intelligent Systems: Machine Vision, Fall Midterm Examination

ECE 172A: Introduction to Intelligent Systems: Machine Vision, Fall Midterm Examination ECE 172A: Introduction to Intelligent Systems: Machine Vision, Fall 2008 October 29, 2008 Notes: Midterm Examination This is a closed book and closed notes examination. Please be precise and to the point.

More information

axis, and wavelength tuning is achieved by translating the grating along a scan direction parallel to the x

axis, and wavelength tuning is achieved by translating the grating along a scan direction parallel to the x Exponential-Grating Monochromator Kenneth C. Johnson, October 0, 08 Abstract A monochromator optical design is described, which comprises a grazing-incidence reflection and two grazing-incidence mirrors,

More information

dq dt I = Irradiance or Light Intensity is Flux Φ per area A (W/m 2 ) Φ =

dq dt I = Irradiance or Light Intensity is Flux Φ per area A (W/m 2 ) Φ = Radiometry (From Intro to Optics, Pedrotti -4) Radiometry is measurement of Emag radiation (light) Consider a small spherical source Total energy radiating from the body over some time is Q total Radiant

More information

Physics for Scientists & Engineers 2

Physics for Scientists & Engineers 2 Geometric Optics Physics for Scientists & Engineers 2 Spring Semester 2005 Lecture 36! The study of light divides itself into three fields geometric optics wave optics quantum optics! In the previous chapter,

More information

1. ABOUT INSTALLATION COMPATIBILITY SURESIM WORKFLOWS a. Workflow b. Workflow SURESIM TUTORIAL...

1. ABOUT INSTALLATION COMPATIBILITY SURESIM WORKFLOWS a. Workflow b. Workflow SURESIM TUTORIAL... SuReSim manual 1. ABOUT... 2 2. INSTALLATION... 2 3. COMPATIBILITY... 2 4. SURESIM WORKFLOWS... 2 a. Workflow 1... 3 b. Workflow 2... 4 5. SURESIM TUTORIAL... 5 a. Import Data... 5 b. Parameter Selection...

More information

Radiometric Calibration of S-1 Level-1 Products Generated by the S-1 IPF

Radiometric Calibration of S-1 Level-1 Products Generated by the S-1 IPF Radiometric Calibration of S-1 Level-1 Products Generated by the S-1 IPF Prepared by Nuno Miranda, P.J. Meadows Reference ESA-EOPG-CSCOP-TN-0002 Issue 1 Revision 0 Date of Issue 21/05/2015 Status Final

More information

Reconstruction of Images Distorted by Water Waves

Reconstruction of Images Distorted by Water Waves Reconstruction of Images Distorted by Water Waves Arturo Donate and Eraldo Ribeiro Computer Vision Group Outline of the talk Introduction Analysis Background Method Experiments Conclusions Future Work

More information

STRAIGHT LINE REFERENCE SYSTEM STATUS REPORT ON POISSON SYSTEM CALIBRATION

STRAIGHT LINE REFERENCE SYSTEM STATUS REPORT ON POISSON SYSTEM CALIBRATION STRAIGHT LINE REFERENCE SYSTEM STATUS REPORT ON POISSON SYSTEM CALIBRATION C. Schwalm, DESY, Hamburg, Germany Abstract For the Alignment of the European XFEL, a Straight Line Reference System will be used

More information

EKF Localization and EKF SLAM incorporating prior information

EKF Localization and EKF SLAM incorporating prior information EKF Localization and EKF SLAM incorporating prior information Final Report ME- Samuel Castaneda ID: 113155 1. Abstract In the context of mobile robotics, before any motion planning or navigation algorithm

More information

Adaptive Image Sampling Based on the Human Visual System

Adaptive Image Sampling Based on the Human Visual System Adaptive Image Sampling Based on the Human Visual System Frédérique Robert *, Eric Dinet, Bernard Laget * CPE Lyon - Laboratoire LISA, Villeurbanne Cedex, France Institut d Ingénierie de la Vision, Saint-Etienne

More information

Increased Underwater Optical Imaging Performance Via Multiple Autonomous Underwater Vehicles

Increased Underwater Optical Imaging Performance Via Multiple Autonomous Underwater Vehicles DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Increased Underwater Optical Imaging Performance Via Multiple Autonomous Underwater Vehicles Jules S. Jaffe Scripps Institution

More information

New Opportunities for 3D SPI

New Opportunities for 3D SPI New Opportunities for 3D SPI Jean-Marc PEALLAT Vi Technology St Egrève, France jmpeallat@vitechnology.com Abstract For some years many process engineers and quality managers have been questioning the benefits

More information

Lawrence Public Schools Mathematics Curriculum Map, Geometry

Lawrence Public Schools Mathematics Curriculum Map, Geometry Chapter 1: An Informal Introduction to Suggested Time frame: 21 days G-CO.12 Make formal geometric constructions with a variety of tools and methods (compass and straightedge, string, reflective devices,

More information

Formulas of possible interest

Formulas of possible interest Name: PHYS 3410/6750: Modern Optics Final Exam Thursday 15 December 2011 Prof. Bolton No books, calculators, notes, etc. Formulas of possible interest I = ɛ 0 c E 2 T = 1 2 ɛ 0cE 2 0 E γ = hν γ n = c/v

More information

Using Geometric Blur for Point Correspondence

Using Geometric Blur for Point Correspondence 1 Using Geometric Blur for Point Correspondence Nisarg Vyas Electrical and Computer Engineering Department, Carnegie Mellon University, Pittsburgh, PA Abstract In computer vision applications, point correspondence

More information

Scattering/Wave Terminology A few terms show up throughout the discussion of electron microscopy:

Scattering/Wave Terminology A few terms show up throughout the discussion of electron microscopy: 1. Scattering and Diffraction Scattering/Wave Terology A few terms show up throughout the discussion of electron microscopy: First, what do we mean by the terms elastic and inelastic? These are both related

More information

August 3 - August 31

August 3 - August 31 Mathematics Georgia Standards of Excellence Geometry Parent Guide Unit 1 A All About Our Unit of Study Transformations in the Coordinate Plane August 3 - August 31 In this unit students will perform transformations

More information

SUPPLEMENTARY INFORMATION DOI: /NPHOTON

SUPPLEMENTARY INFORMATION DOI: /NPHOTON DOI:.38/NPHOTON.2.85 This supplement has two parts. In part A, we provide the rigorous details of the wavefront correction algorithm and show numerical simulations and experimental data for the cases of

More information

Edge and local feature detection - 2. Importance of edge detection in computer vision

Edge and local feature detection - 2. Importance of edge detection in computer vision Edge and local feature detection Gradient based edge detection Edge detection by function fitting Second derivative edge detectors Edge linking and the construction of the chain graph Edge and local feature

More information