Information page for written examinations at Linköping University

Examination date: 2016-08-19
Room: TER2
Time: 8-12
Course code: TSBB09
Exam code: TEN2
Course name: Image Sensors (Bildsensorer)
Exam name: Written examination (Skriftlig tentamen)
Department: ISY
Number of questions in the examination: 24
Teacher responsible/contact person during the exam time: Klas Nordberg
Contact number during the exam time: 013-281634, 0739-037628
Visit to the examination room: around 10 am
Name and contact details of the course administrator (name + phone nr + mail):
Equipment permitted: Calculator, pen and paper
Other important information:
Number of exams in the bag:

Guide

The written examination consists of 3 parts, one part for each of the three course aims in the curriculum:

Part I: standard image sensors, including IR
Part II: geometry and multiple views
Part III: non-standard image sensors

Each part consists of 6 exercises where the student should demonstrate the ability to explain concepts, phenomena, etc. (type A exercises), and 2 additional exercises that test a deeper understanding of various topics in the course, for example in terms of simpler calculations (type B exercises). Type A exercises give at most 1 point each. Type B exercises give at most 2 points each.

To pass with grade 3: at least one type B exercise passed (i.e., with 2 points) for the whole examination AND a total of at least 4 points in each of the three parts.

To pass with grade 4: at least three type B exercises passed for the whole examination AND a total of at least 6 points in each of the three parts.

To pass with grade 5: at least five type B exercises passed for the whole examination AND a total of at least 8 points in each of the three parts.

The answers to the A-exercises should be given in the blank spaces of this examination paper, below the questions. If an A-exercise requires two pieces of information, indicated by an AND, both must be given to get 1 p; otherwise 0 p is given. The answers to the B-exercises should be given on blank paper sheets, with no more than one exercise per sheet, which will be appended to the examination paper by the student.

Write your AID code at the top of the pages in this examination paper and on any sheet appended to it. Appended sheets must also have the course code and date written on them and be numbered.

Good luck!
Klas Nordberg and Maria Magnusson

PART I: STANDARD & IR IMAGE SENSORS

Exercise 1 (A, 1p)
Light from two separate monochromatic sources with different wavelengths, λ1 < λ2, is measured by a light sensor. It senses the same energy per time and area unit for both sources. Does this mean that the number of detected photons per time and area unit from the first source, n1, is the same as from the second source, n2, or is it the case that n1 < n2 or n1 > n2? Motivate your answer.

Exercise 2 (A, 1p)
Describe the distinct difference between an image produced by a camera using a global shutter versus a rolling shutter.

Exercise 3 (A, 1p)
Explain the concept depth of field for a lens-based camera.

Exercise 4 (A, 1p)
Describe the concept of the virtual image plane of a pin-hole camera, and how it differs from the normal image plane.

TSBB09 2 TEN2, 2016-08-19

Exercise 5 (A, 1p)
A digital camera is subject to shot noise (pixel noise). Why does this noise occur?

Exercise 6 (A, 1p)
The infrared wavelength range is usually divided into a number of intervals, one of which is referred to as near IR (or NIR). Which wavelength band do you find immediately below the NIR interval, i.e., one that has shorter wavelengths than NIR?

Exercise 7 (B, 2p)
Explain the so-called transport problem in an image sensor, and describe in general how it is solved in the case of a CCD sensor and a CMOS sensor. Explain the qualitative differences in the image measurement process that the two solutions produce.

Exercise 8 (B, 2p)
Describe the basic measurement cycle of the photo diode as it is used in a digital camera. Describe which electrical quantity, proportional to the incident light, it produces and how.

PART II: GEOMETRY AND MULTIPLE VIEWS

Exercise 9 (A, 1p)
In order to stitch a set of smaller images into a larger panorama image, it is important that the initial set of images is taken by rotating the camera around its center. Explain why.

Exercise 10 (A, 1p)
How many constraints on the fundamental matrix, F, are provided by one pair of corresponding image points?

Exercise 11 (A, 1p)
The intrinsic camera parameters are represented as

        ( α  γ  u0 )
    A = ( 0  β  v0 ).
        ( 0  0  1  )

Why can we often assume that γ ≈ 0?
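For concreteness, the action of the intrinsic parameter matrix A can be sketched in a few lines of code. The numeric values of α, β, γ, u0 and v0 below are hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical intrinsic parameters: alpha, beta are focal lengths in pixel
# units, (u0, v0) is the principal point, and gamma is the skew, which is
# commonly assumed to be (approximately) 0.
alpha, beta, gamma, u0, v0 = 800.0, 820.0, 0.0, 320.0, 240.0

A = np.array([[alpha, gamma, u0],
              [0.0,   beta,  v0],
              [0.0,   0.0,   1.0]])

# A maps normalized homogeneous camera coordinates to pixel coordinates.
y_norm = np.array([0.1, -0.2, 1.0])   # a normalized image point
y_pix = A @ y_norm
y_pix /= y_pix[2]                     # dehomogenize
print(y_pix[:2])                      # pixel coordinates (u, v)
```

With γ = 0 the mapping is simply u = α·x + u0 and v = β·y + v0, which is why the skew term can usually be dropped.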

Exercise 12 (A, 1p)
Why are the image panoramas that are created in the computer exercise combined in a spherical coordinate system, rather than an ordinary Cartesian coordinate system?

Exercise 13 (A, 1p)
Assume that x = (X, Y, Z, 1)^T are the homogeneous coordinates of a point in the world coordinate system and that y = (U, V, W)^T are the homogeneous coordinates of a point in a normalized camera coordinate system. The relation between the two homogeneous coordinates is

    y ∝ [R t] x.

Suppose that the camera is translated a distance (10, 20, 30) in relation to the world origin and rotated an angle 0 around the Y-axis. Give the [R t] matrix!

Exercise 14 (A, 1p)
Reconstruction of a 3D point based on the positions of its projections in a pair of stereo images can be done using a linear method. How can you derive linear equations in the unknown 3D point x based on the usual camera projection equations

    y1 ∝ C1 x   and   y2 ∝ C2 x?
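The linear method referred to in Exercise 14 can be sketched as follows: since y_i is proportional to C_i x, their cross product vanishes, which gives linear equations in x that can be stacked and solved by SVD. The camera matrices and the 3D point below are hypothetical, chosen only to illustrate the computation:

```python
import numpy as np

def triangulate(y1, y2, C1, C2):
    """Linear (DLT) triangulation. y1, y2: homogeneous image points (3,),
    C1, C2: 3x4 camera matrices. Returns the 3D point in homogeneous form."""
    rows = []
    for y, C in ((y1, C1), (y2, C2)):
        # y x (C x) = 0 yields two independent linear equations per view.
        rows.append(y[0] * C[2] - y[2] * C[0])
        rows.append(y[1] * C[2] - y[2] * C[1])
    M = np.vstack(rows)
    # x is the null vector of M: the right singular vector belonging to
    # the smallest singular value.
    _, _, Vt = np.linalg.svd(M)
    x = Vt[-1]
    return x / x[3]

# Illustration with two hypothetical cameras: C1 = [I | 0], C2 = [I | t].
C1 = np.hstack([np.eye(3), np.zeros((3, 1))])
C2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X = np.array([1.0, 2.0, 5.0, 1.0])    # ground-truth 3D point
y1, y2 = C1 @ X, C2 @ X               # noise-free projections
print(triangulate(y1, y2, C1, C2))    # ≈ (1, 2, 5, 1)
```

With noise-free projections the recovered point matches the ground truth; with noisy data the SVD gives the least-squares solution of the stacked system.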

Exercise 15 (B, 2p)
In a pair of stereo images, a point in one image corresponds to a line in the other image. Motivate why this is so.

Exercise 16 (B, 2p)
Zhang's method for camera calibration is based on the expression

    λ A [r1 r2 t] = [h1 h2 h3],

where λ ≠ 0 is a scaling parameter which implies that the left-hand side is proportional to the right-hand side. Describe the remaining parameters in the above equation and explain how they allow you to formulate two constraints that are useful for camera calibration.
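The two constraints in Zhang's method follow from the orthonormality of r1 and r2, and they can be verified numerically. A minimal sketch, with a hypothetical intrinsic matrix A, rotation R, translation t and scale λ (none of them taken from the exam):

```python
import numpy as np

# From lambda * A [r1 r2 t] = [h1 h2 h3] and r1, r2 orthonormal it follows,
# with B = A^-T A^-1, that:
#   h1^T B h2 = 0   and   h1^T B h1 = h2^T B h2.
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 820.0, 240.0],
              [  0.0,   0.0,   1.0]])
theta = 0.3                                      # rotation about the Y-axis
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.1, 0.2, 2.0])

# Build the homography columns with an arbitrary scale lambda = 2.5.
H = 2.5 * A @ np.column_stack([R[:, 0], R[:, 1], t])
h1, h2 = H[:, 0], H[:, 1]

B = np.linalg.inv(A).T @ np.linalg.inv(A)
print(h1 @ B @ h2)                  # ≈ 0 (orthogonality of r1, r2)
print(h1 @ B @ h1 - h2 @ B @ h2)    # ≈ 0 (equal norms of r1, r2)
```

Each observed homography of the planar calibration target contributes these two linear constraints on the entries of B, from which A is recovered.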

PART III: NON-STANDARD IMAGE SENSORS

Exercise 17 (A, 1p)
A satellite produces images of the Earth using a line camera, thereby forming a push-broom camera. What is the advantage of using a line camera in this case?

Exercise 18 (A, 1p)
A flat surface has a printed pattern, shown below. It is scanned with a range camera based on sheet-of-light and triangulation. Although the surface is flat, the resulting range image contains minor deviations from a flat surface. Explain why and describe where the range deviations occur.

Exercise 19 (A, 1p)
Modern medical CT scanners produce high-quality 3D volumes. It is common to use a helix-shaped source path. Describe how such a source path is obtained in terms of how the X-ray source, the detector and the patient move.

Exercise 20 (A, 1p)
The intensity I at a point on an object can be written

    I = Ia kd + Il kd cos ϕ + Il ks (cos θ)^n.

State mathematical expressions for the angles ϕ and θ in terms of the vectors L, N, R and V, defined in the figure below.

[Figure: the vectors L, N, R and V at a point on the object surface.]

Exercise 21 (A, 1p)
An active range camera, e.g. SICK-IVP's laser sheet-of-light or MICROSOFT's KINECT, can measure the shape of many types of objects. An object with a specular surface might cause problems, however. Why?

Exercise 22 (A, 1p)
How can occlusion in general be detected in the sensor image for a sheet-of-light range camera?
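Evaluating the intensity formula of Exercise 20 numerically can be sketched as below, with cos ϕ = L·N and cos θ = R·V for unit vectors. All coefficients and vectors are hypothetical, chosen only for illustration:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# Hypothetical coefficients: Ia, Il are ambient and light-source intensities,
# kd, ks are diffuse and specular reflection coefficients, n the shininess.
Ia, Il = 0.2, 1.0
kd, ks, n = 0.6, 0.4, 10

N = normalize(np.array([0.0, 1.0, 0.0]))    # surface normal
L = normalize(np.array([1.0, 1.0, 0.0]))    # direction to the light source
V = normalize(np.array([-1.0, 1.0, 0.0]))   # direction to the viewer
R = normalize(2 * (L @ N) * N - L)          # mirror reflection of L about N

cos_phi = max(L @ N, 0.0)                   # cos(phi) = L . N
cos_theta = max(R @ V, 0.0)                 # cos(theta) = R . V
I = Ia * kd + Il * kd * cos_phi + Il * ks * cos_theta ** n
print(I)
```

Clamping the cosines to non-negative values is the usual convention so that surfaces facing away from the light or viewer receive no diffuse or specular contribution.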

Exercise 23 (B, 2p)
See the figure below, which illustrates backprojection of the filtered projection q(r) over the image f(x, y). Which value is backprojected at the position (x, y) = (2, 1)? Use linear interpolation. Let the distance between projection values as well as between pixels in the image be the same. The projection angle is φ = 26.565°.

[Figure: the image grid f(x, y) with axes x and y, and the filtered projection q(r) with sample values along the r-axis, drawn at the projection angle φ.]

Exercise 24 (B, 2p)
A 3D volume of data is generated by a 3D tomography device. Describe the computational steps that are necessary in order to compute the required information to produce a 2D image based on surface shading using the Blinn-Phong model.
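The kind of computation asked for in Exercise 23 can be sketched as follows: the image point (x, y) is projected onto the detector axis as r = x cos φ + y sin φ, and q(r) is read off with linear interpolation between samples. The projection samples and their origin r0 below are hypothetical and do not reproduce the exam figure:

```python
import numpy as np

def backproject(q, r0, x, y, phi_deg):
    """Backproject one filtered projection at image point (x, y).
    q: samples of q(r) at unit spacing, starting at r = r0.
    Returns the linearly interpolated value q(x*cos(phi) + y*sin(phi))."""
    phi = np.radians(phi_deg)
    r = x * np.cos(phi) + y * np.sin(phi)   # signed position on the detector
    s = r - r0                              # position in sample units
    i = int(np.floor(s))                    # index of the left neighbor
    w = s - i                               # interpolation weight
    return (1 - w) * q[i] + w * q[i + 1]

q = np.array([4.0, 3.0, 2.0, 1.0])          # hypothetical projection samples
val = backproject(q, r0=0.0, x=2.0, y=1.0, phi_deg=26.565)
print(val)                                  # ≈ 1.764, since r = sqrt(5)
```

Note that tan(26.565°) ≈ 0.5, so the point (2, 1) lies exactly along the projection direction and r = √5 ≈ 2.236, landing between the third and fourth samples.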