Augmented Reality, Advanced SLAM, Applications


Augmented Reality, Advanced SLAM, Applications
Prof. Didier Stricker & Dr. Alain Pagani (alain.pagani@dfki.de)
Lecture 3D Computer Vision, 04.02.2015


Introduction
Previous lectures: basics (camera, projective geometry), Structure from Motion, structured light, dense 3D reconstruction, depth cameras.
Today: insights into advanced SLAM techniques, Augmented Reality, and applications of 3D Computer Vision.

Recall: structure and motion (SaM). Given images from unknown camera viewpoints, reconstruct the sparse scene geometry and the camera motion.

Offline vs. online structure and motion
Offline: e.g. as the basis for dense 3D model reconstruction. No real-time requirements; all images are available at once.
Online: e.g. for mobile Augmented Reality in unknown environments. Real-time requirements; images become available one by one, and output is required at each time step.

Online structure and motion (calibrated case), reminder from Lecture 7. Iterative SfM: alternating estimation of camera poses and 3D feature locations (triangulation) from a continuous image sequence. The input per frame is a set of 2D feature locations from image processing.

Step 1 (t = 1, 2): compute the pose of the first two cameras from 2D matches (relative pose problem, 8-point algorithm).

Step 2: triangulate 3D points from the two camera poses and the 2D feature locations.

Step 3 (t = 3): match 2D features of the new frame against the existing 3D points, then estimate the next camera pose from these 2D/3D correspondences (pose problem, PnP).

Step 4: triangulate additional 3D points.

Step 5: refine the known 3D points using the new camera poses.

Step 6: refine the known cameras using the new 3D points.
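The triangulation step above can be sketched as a linear (DLT) triangulation of one point from two views; `triangulate_dlt` and the toy camera setup below are illustrative, not code from the lecture:

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.
    P1, P2: 3x4 projection matrices; x1, x2: 2D image points (u, v)."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the right null vector of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                 # de-homogenize

# Two calibrated cameras: identity intrinsics, second camera shifted in x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0); x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0); x2 = x2[:2] / x2[2]

X_est = triangulate_dlt(P1, P2, x1, x2)
```

With noise-free correspondences the estimate recovers the original point; in practice the linear solution is refined nonlinearly.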

Global Bundle Adjustment
Global bundle adjustment jointly optimizes over all camera poses and 3D points (previous lecture):

x̂ = argmin_x Σ_{i=1}^{k} Σ_{j=1}^{l} || r_ij(x) ||²

where x is a parameter vector containing all camera poses and 3D points, and r_ij is the residual (reprojection error) of point j in camera i. With 6 parameters for each camera and 3 for each 3D point, 6k + 3l parameters must be estimated, but the involved matrices are sparse. This is a nonlinear estimation problem: use e.g. Levenberg-Marquardt, starting at the linear solution. Open source libraries are available, e.g. Sparse Bundle Adjustment (SBA).
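A minimal sketch of how the 6k + 3l parameter vector is laid out and how residuals are stacked. The rotation block is carried but deliberately not applied, so this only illustrates the parameter bookkeeping, not a full bundle adjuster; all names are illustrative:

```python
import numpy as np

def reprojection_residuals(params, observations, k_cams, l_pts):
    """Unpack the 6k + 3l parameter vector and stack one 2-vector
    residual per observation (cam_index, pt_index, u, v)."""
    cams = params[:6 * k_cams].reshape(k_cams, 6)   # 6 DoF per camera
    pts = params[6 * k_cams:].reshape(l_pts, 3)     # 3 DoF per point
    res = []
    for ci, pi, u, v in observations:
        t = cams[ci, 3:]            # translation part of pose ci
        Xc = pts[pi] + t            # point in camera frame (rotation omitted)
        res.append([Xc[0] / Xc[2] - u, Xc[1] / Xc[2] - v])
    return np.asarray(res).ravel()

k, l = 3, 10                        # 3 cameras, 10 points -> 48 parameters
params = np.zeros(6 * k + 3 * l)
params[6 * k + 2::3] = 5.0          # give every point depth 5
obs = [(0, 0, 0.1, 0.2), (1, 3, 0.0, 0.0)]
r = reprojection_residuals(params, obs, k, l)
```

A residual function of this shape is what Levenberg-Marquardt (e.g. `scipy.optimize.least_squares`) minimizes; the sparsity comes from each residual touching only one camera and one point.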

Drift reduction using uncertainties
Incorporate uncertainties, e.g. a simple stochastic model with weighted least squares (WLS) estimation, where all entities are modelled as Gaussian random variables.

3D point refinement
Incorporate each new camera view, i.e. every time the feature is observed in an image. Methods: repeated triangulation, or recursive filtering (e.g. extended Kalman filter), which leads to filter-based SLAM. (Treated in the lecture Computer Vision: Object and People Tracking.)

SLAM with a Bayesian filter
With a continuous image stream (image sequence), triangulation is difficult (short baseline) and keypoint matching can drift over time, so SfM-based SLAM is not well suited. This motivates filtering techniques: SLAM was first used in robotics, with simpler sensors (e.g. LIDAR).

Bayesian tracking: the components
State x_t: camera position. Measurement z_t: image-based measurements. Control input u_t: none in visual tracking. The state is hidden; only the measurement is observed. Markov assumptions:
MA1: state x_t depends only on the previous state x_{t-1}.
MA2: measurement z_t depends only on the state x_t.
(Treated in the lecture Computer Vision: Object and People Tracking.)

Bayesian tracking: derivation
How to express p(x_t | z_{1:t}) when knowing p(x_{t-1} | z_{1:t-1})?

p(x_t | z_{1:t}) = p(x_t | z_t, z_{1:t-1})
= p(z_t | x_t, z_{1:t-1}) p(x_t | z_{1:t-1}) / p(z_t | z_{1:t-1})   (Bayes' theorem)
= p(z_t | x_t) p(x_t | z_{1:t-1}) / p(z_t | z_{1:t-1})   (Markov assumption on the measurement)
= η p(z_t | x_t) ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1}) dx_{t-1}   (marginalisation, Chapman-Kolmogorov)

with normalizer η = 1 / p(z_t | z_{1:t-1}). Here p(z_t | x_t) is the measurement model and p(x_t | x_{t-1}) is the motion model.

Bayesian tracking: generic equation and components
p(x_t | z_{1:t}) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}) p(x_{t-1} | z_{1:t-1}) dx_{t-1}
The motion model predicts; the measurement model corrects. Solutions in the general case: the Kalman filter if the model is linear Gaussian; the extended Kalman filter if the model is nonlinear (linearization by Taylor expansion); the particle filter in the general case. In vision-based tracking, the models are not linear!
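In the linear-Gaussian 1-D case, the predict/correct cycle above collapses to a few lines. `kalman_step` and its noise variances are illustrative values, not from the lecture:

```python
def kalman_step(mu, var, z, q=0.1, r=0.5):
    """One predict/correct cycle of a 1-D Kalman filter with a
    random-walk motion model (x_t = x_{t-1} + noise) and a direct
    measurement model (z_t = x_t + noise).
    q: motion noise variance, r: measurement noise variance."""
    # predict: the motion model widens the belief
    mu_pred, var_pred = mu, var + q
    # correct: Bayes update with the measurement likelihood
    K = var_pred / (var_pred + r)          # Kalman gain
    mu_new = mu_pred + K * (z - mu_pred)
    var_new = (1.0 - K) * var_pred
    return mu_new, var_new

mu, var = 0.0, 1.0                         # prior belief
mu, var = kalman_step(mu, var, z=1.0)      # one measurement pulls the mean up
```

The measurement always shrinks the variance, which is exactly the "correct" half of the generic equation.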

Filter-based SLAM
The map (environment) has to be added to the equations: the probability of interest, the motion model, and the measurement model are now defined over the camera state and the map jointly.


MonoSLAM (EKF-SLAM)
MonoSLAM [1] is EKF-based. Loop: filter initialization, map management (generate and delete features), prediction, measurement acquisition, data association, update.
[1] A. J. Davison, I. D. Reid, N. D. Molton, O. Stasse, "MonoSLAM: Real-Time Single Camera SLAM", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 6, June 2007.

MonoSLAM (EKF-SLAM): prediction
State x(t-1) = (x_v, y_1, y_2, ...), where the camera state x_v consists of the 3D position vector r^W(t), the orientation quaternion q^{WR}(t), the linear velocity vector v^W(t), and the angular velocity vector ω^R(t); the y_i are landmark position vectors.
Dynamic system model (constant velocity model):
r^W(t) = r^W(t-1) + v^W(t-1) Δt
q^{WR}(t) = q^{WR}(t-1) × q(ω^R(t-1) Δt)
v^W(t) = v^W(t-1)
ω^R(t) = ω^R(t-1)
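The constant-velocity prediction can be sketched as follows (only the position/velocity part; the quaternion update is omitted, and the function name is illustrative):

```python
import numpy as np

def predict_constant_velocity(r, v, dt):
    """Predict the camera position under a constant-velocity model:
    r(t) = r(t-1) + v(t-1)*dt, v(t) = v(t-1).
    A real EKF would also inflate the covariance with process noise."""
    return r + v * dt, v

r = np.array([0.0, 0.0, 1.0])      # camera position in the world frame
v = np.array([0.2, 0.0, 0.0])      # linear velocity
r, v = predict_constant_velocity(r, v, dt=0.5)
```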

MonoSLAM (EKF-SLAM): measurement acquisition by active search [1][2]
Prediction of measurements: h_i(t) = h_i(x̂_v(t)), the projection of landmark y_i using the predicted camera state (position r̂^W(t), orientation q̂^{WR}(t)) and the camera intrinsics.
Find measurements: for candidate pixels u = (u, v)^T with (u - h_i(t))^T S_i^{-1} (u - h_i(t)) below a threshold, match the landmark patch by NCC around h_i(t); if the maximum NCC value exceeds a threshold, accept the measurement z_i(t) at that location. Here S_i is the covariance matrix of the predicted 2D position of the i-th landmark.
[1] A. J. Davison, "Active Search for Real-Time Vision", International Conference on Computer Vision, 2005.
[2] M. Chli, A. J. Davison, "Active Matching for Visual Tracking", Robotics and Autonomous Systems, 57(12):1173-1187, 2009.
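A toy version of active search, assuming the predicted position and search radius are given. `ncc` and `active_search` are illustrative names, and a real implementation would search inside the S_i uncertainty ellipse rather than a square window:

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-size patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def active_search(image, template, center, radius):
    """Scan a (2*radius+1)^2 window around the predicted position and
    return the best-matching top-left corner and its NCC score."""
    h, w = template.shape
    best, best_pos = -2.0, None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = center[0] + dy, center[1] + dx
            if y < 0 or x < 0 or y + h > image.shape[0] or x + w > image.shape[1]:
                continue
            score = ncc(image[y:y + h, x:x + w], template)
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best

rng = np.random.default_rng(1)
img = rng.random((40, 40))
tpl = img[12:20, 15:23].copy()          # true location: (12, 15)
pos, score = active_search(img, tpl, center=(11, 14), radius=4)
```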

MonoSLAM (EKF-SLAM): update
x(t) = x̂(t) + K(t) [ z_1(t) - h_1(t); ... ; z_n(t) - h_n(t) ]
where K(t) is the Kalman gain at time t.
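The update equation above is the standard (E)KF correction step, sketched here on a toy 2-D state; all names and values are illustrative:

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """EKF correction: x <- x + K (z - h(x)),
    with K = P H^T (H P H^T + R)^-1 and P <- (I - K H) P."""
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ (z - h(x))
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy example: 2-D state, we directly observe the first component.
x = np.zeros(2)
P = np.eye(2)
H = np.array([[1.0, 0.0]])
R = np.array([[0.5]])
x, P = ekf_update(x, P, np.array([1.0]), lambda s: H @ s, H, R)
```

Only the observed component moves and its uncertainty shrinks; the unobserved one is untouched.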

MonoSLAM (EKF-SLAM): initialization of features
Delayed: SfM. Undelayed: inverse depth parameterization [1]. Experiments ran on a 1.6 GHz Pentium M processor.
[1] J. Civera, A. J. Davison, J. M. M. Montiel, "Inverse Depth Parametrization for Monocular SLAM", IEEE Transactions on Robotics 24(5):932-945, 2008.

Comparison of SfM-based vs. filter-based SLAM:
- Initialization: SfM-based: 8-point algorithm. Filter-based: delayed (SfM) or undelayed (inverse depth parameterization).
- Measurement: SfM-based: NCC matching from extracted feature points, or KLT tracker. Filter-based: active search (prediction & template matching), or KLT tracker.
- Estimation technique: SfM-based: SBA (after a P3P algorithm). Filter-based: Kalman filtering (prediction & update).
- Capacity: SfM-based: tracking 300-400 points per frame. Filter-based: real-time operation with up to about 100 landmarks.

Demonstration

PTAM: Klein and Murray, ISMAR 2007
Title: "Parallel Tracking and Mapping for Small AR Workspaces", known as the PTAM system. Many features with (simple) correlation-based tracking; parallel pose tracking and 3D reconstruction threads; local bundle adjustment (based on keyframes). Code, videos, papers, and slides are available online.

Why is SLAM fundamentally harder?
Frame-by-frame SLAM must, within a single frame time: find features, update the camera pose and the entire map (many degrees of freedom), and draw graphics.

Frame-by-frame SLAM
Standard SLAM: updating the entire map every frame is expensive and needs a sparse map of high-quality features (A. Davison). Proposed approach: use a dense map (of lower-quality features), don't update the map every frame (keyframes), and split tracking and mapping into two threads.

Parallel Tracking and Mapping
Proposed method: split tracking and mapping into two threads. Thread #1 (tracking, every frame): find features, update the camera pose only, draw graphics. Thread #2 (mapping, asynchronous): update the map.

Parallel Tracking and Mapping
Tracking thread: responsible for estimating the camera pose and rendering the augmented graphics; must run at 30 Hz; make it as robust and accurate as possible. Mapping thread: responsible for providing the map; can take a long time per keyframe; make it as rich and accurate as possible.

Tracking thread: overall flow
Pre-process frame → project points (from the map), measure points, update camera pose (coarse stage) → project points, measure points, update camera pose (fine stage) → draw graphics.

Pre-process frame
Build mono and RGB versions of the image and 4 pyramid levels (640x480, 320x240, 160x120, 80x60); detect FAST corners (E. Rosten et al., ECCV 2006).

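The four-level pyramid can be sketched with simple 2x2 block averaging (a minimal stand-in; PTAM's actual downsampling may differ):

```python
import numpy as np

def half_sample(img):
    """Downsample by averaging 2x2 blocks (assumes even dimensions)."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def build_pyramid(img, levels=4):
    """Four-level pyramid: 640x480 -> 320x240 -> 160x120 -> 80x60."""
    pyr = [img]
    for _ in range(levels - 1):
        pyr.append(half_sample(pyr[-1]))
    return pyr

pyr = build_pyramid(np.zeros((480, 640)))
shapes = [p.shape for p in pyr]
```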

Project points
Use a motion model to update the camera pose: a constant velocity model. With the previous positions P_{t-1} and P_t, the velocity is V_t = (P_t - P_{t-1}) / Δt, and the predicted current pose is P_{t+1} = P_t + Δt V_t.

Project points
Choose a subset to measure: ~50 features for the coarse stage (from the coarser pyramid levels), 1000 randomly selected features for the fine stage.

Measure points
Generate an 8x8 matching template (warped from the source keyframe in the map). Search a fixed radius around the projected position using zero-mean SSD, and only search at FAST corner points.
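Zero-mean SSD can be written in a few lines; subtracting each patch's mean is what makes the score invariant to a global brightness offset:

```python
import numpy as np

def zmssd(a, b):
    """Zero-mean sum of squared differences between two patches."""
    da = a - a.mean()
    db = b - b.mean()
    return ((da - db) ** 2).sum()

patch = np.arange(64, dtype=float).reshape(8, 8)
same_but_brighter = patch + 10.0       # global brightness change only
score = zmssd(patch, same_but_brighter)
```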

Update camera pose
A 6-DOF problem; obtain the pose by SfM (three-point algorithm).

Mapping thread: overall flow
Stereo initialization → wait for a new keyframe (from the tracker) → add new map points → optimize the map → map maintenance.

Stereo initialization
Use the five-point pose algorithm (D. Nistér et al., 2006). Requires a pair of frames and feature correspondences, and provides the initial map. User input required: two clicks for the two keyframes, and smooth motion for feature correspondence.

Wait for new keyframe
Keyframes are only added if there is a sufficient baseline to the other keyframes and the tracking quality is good. When a keyframe is added: the mapping thread stops whatever it is doing, all points in the map are measured in the keyframe, and new map points are found and added to the map.

Add new map points
Aim: as many map points as possible. Check all maximal FAST corners in the keyframe (check the score; check whether the point is already in the map), run an epipolar search in a neighboring keyframe, triangulate the matches and add them to the map. Repeat for all four image pyramid levels.
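The epipolar search can be sketched as a point-to-epipolar-line distance test. The setup below is illustrative: a canonical stereo pair with identity intrinsics and pure x-translation, for which F reduces to [t]_x:

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0, -t[2], t[1]],
                     [t[2], 0, -t[0]],
                     [-t[1], t[0], 0]])

def epipolar_distance(F, x, xp):
    """Distance of x' (homogeneous pixel) to the epipolar line l' = F x.
    Candidate corners in the neighboring keyframe are only considered
    as matches if this distance is small."""
    l = F @ x
    return abs(xp @ l) / np.hypot(l[0], l[1])

# Canonical pair: F = [t]_x; corresponding points differ only by
# horizontal disparity, so they lie exactly on each other's epipolar lines.
F = skew(np.array([1.0, 0.0, 0.0]))
x = np.array([0.3, 0.1, 1.0])
xp = np.array([0.35, 0.1, 1.0])        # same row, shifted by disparity
d = epipolar_distance(F, x, xp)
```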

Optimize map
Use a batch SfM method: bundle adjustment. It adjusts map point positions and keyframe poses by minimizing the reprojection error of all points in all keyframes (or only in the last N keyframes for local bundle adjustment).

System and results
Environment: desktop PC (Intel Core 2 Duo, 2.66 GHz), Linux, C++.
Tracking speed (per frame):
  Total                  19.2 ms
  Keyframe preparation    2.2 ms
  Feature projection      3.5 ms
  Patch search            9.8 ms
  Iterative pose update   3.7 ms

System and results: mapping scalability and speed
Practical limit: ~150 keyframes, ~6000 points.
Bundle adjustment timing:
  Keyframes                  2-49     50-99    100-149
  Local bundle adjustment    170 ms   270 ms   440 ms
  Global bundle adjustment   380 ms   1.7 s    6.9 s

Draw graphics: distorted rendering and plane estimation.

Draw graphics
What can we draw in an unknown scene? Assume a single plane is visible at start, and run the VR simulation on that plane.


Demonstration

Loop closing in SLAM
Recognize a previously visited location and update the beliefs accordingly. Different solutions exist: bags of SIFT features, keyframe recognition, or supplementary sensors.
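Keyframe recognition can be sketched as a nearest-neighbor search over bag-of-words histograms; the tiny vocabulary, threshold, and names below are illustrative, not the lecture's method:

```python
import numpy as np

def bow_similarity(h1, h2):
    """Cosine similarity of two bag-of-visual-words histograms."""
    n1, n2 = np.linalg.norm(h1), np.linalg.norm(h2)
    return float(h1 @ h2 / (n1 * n2)) if n1 > 0 and n2 > 0 else 0.0

def detect_loop(query, keyframes, threshold=0.8):
    """Return the index of the most similar stored keyframe, or None
    if nothing is similar enough to declare a loop closure."""
    scores = [bow_similarity(query, kf) for kf in keyframes]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Toy vocabulary of 5 visual words; keyframe 1 matches the query.
kfs = [np.array([5, 0, 1, 0, 0.]),
       np.array([0, 3, 0, 4, 1.]),
       np.array([1, 1, 1, 1, 1.])]
query = np.array([0, 3, 0, 5, 1.])
idx = detect_loop(query, kfs)
```

A detected loop would then trigger the belief update (e.g. a pose-graph or filter correction).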


Augmented Reality
AR is mostly based on 3D Computer Vision, and camera calibration is required. First attempts used visual markers (based on homographies); later approaches use visual features/keypoints (PnP problem) and SLAM.

Calibration matrix K

Augmented Reality: interface with rendering
The K matrix maps to the projection matrix (e.g. in OpenGL); the pose R, t maps to the modelview matrix (e.g. in OpenGL). The distortion parameters have to be estimated: either undistort the image, or use distorted rendering. This is the basis for visual coherence.
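A sketch of the K-matrix-to-OpenGL-projection conversion. Conventions (image origin, handedness, row vs. column major) vary, so the signs below are one common choice and must be adapted to the actual rendering setup; the function name is illustrative:

```python
import numpy as np

def gl_projection_from_K(fx, fy, cx, cy, W, H, near=0.1, far=100.0):
    """Build an OpenGL-style projection matrix from pinhole intrinsics.
    Assumes the camera looks down -z; the near plane maps to NDC depth -1
    and the far plane to +1 after perspective division."""
    return np.array([
        [2 * fx / W, 0.0,        1 - 2 * cx / W,               0.0],
        [0.0,        2 * fy / H, 2 * cy / H - 1,               0.0],
        [0.0,        0.0,        -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0,        0.0,        -1.0,                         0.0],
    ])

P = gl_projection_from_K(fx=500, fy=500, cx=320, cy=240, W=640, H=480)
# A point on the near plane should map to NDC depth -1 after the divide.
p = P @ np.array([0.0, 0.0, -0.1, 1.0])
ndc_z = p[2] / p[3]
```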

Visual coherence: realistic integration between virtual and real content.

Visual coherence
Requires estimating the lighting conditions (e.g. with a light probe) and direct estimation of camera artifacts (blur, colors); this is advanced 3D computer vision.


3DCV application: 3D reconstruction and printing.

Appearance modeling: diffuse texture compared to a reference picture.


Measurements and planning. (Copyright 2013 Augmented Vision - DFKI)

Person reconstruction for the clothing industry: http://youtu.be/gqoggkiaktw

Gestures and HCI.

We are hiring!
Projects/seminars, Bachelor and Master theses, and HiWi positions in 3D computer vision, reconstruction, and 2D computer vision.

Next appointment, 11.02.2015: questions + exercises. Thanks!