Slide 1: Mixed-Reality for Intuitive Photo-Realistic 3D-Model Generation
Wolfgang Sepp, Tim Bodenmueller, Michael Suppa, and Gerd Hirzinger
DLR, Institut für Robotik und Mechatronik
GI-Workshop VR/AR 2009
Slide 2: 3D-Modeling with hand-held devices - Overview
[Diagram: the scanning process cycles between the real and the virtual world: observe object (real world) -> move scanner -> scan -> acquire data -> reconstruct -> inspect object (virtual world) -> repeat. Highlighted bottlenecks: deferred problem recognition and a non-efficient user interface.]
Slide 5: 3D-Modeling with hand-held devices - Aspects
- Level of concurrency: which results are immediately available?
- View-planning support: which views are necessary to scan the object?
  - accumulated viewing positions or scan results
  - next-best viewing positions
- Navigation support: how to navigate the scanner to a desired view?
  - link the virtual object to the real object
  - relative poses between object, scanner, and operator
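The next-best-view idea above can be sketched as a greedy selection over candidate scanner poses. The function and scoring below are illustrative assumptions, not the system's actual planner; `visible_unseen` stands in for any coverage estimate, e.g. ray casting against the current model:

```python
def next_best_view(candidates, visible_unseen):
    """Pick the candidate viewing position that would reveal the
    largest number of not-yet-scanned surface patches (a greedy
    next-best-view heuristic; visible_unseen(v) is assumed to
    estimate how much unseen surface pose v would cover)."""
    return max(candidates, key=visible_unseen)

# Toy example: three candidate poses with precomputed coverage counts.
coverage = {"front": 12, "left": 30, "top": 7}
print(next_best_view(coverage.keys(), coverage.get))  # -> left
```

Greedy selection is the simplest policy; a real planner would also weigh travel cost and scanner constraints.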
Slide 6: 3D-Modeling with hand-held devices - State of the Art
[Table: systems compared by modalities, registration method, and highest level of concurrency]
- Metris ModelMaker: shape; mechanical arm; shaded points
- Perceptron/Romer/PolyWorks: shape; mechanical arm; shaded points
- Handyscan: shape & texture; active ego-motion; N points
- NDI Laser Scanner: shape; optical tracker; N points
- David: shape & texture; software registration; 1 point
Slide 7: 3D-Modeling with hand-held devices - Requirements
- Spatial domain
  - depth measurement unit (scanner)
  - pose measurement unit (tracker)
  - calibration
- Radiometric domain
  - surface-texture measurement unit (camera)
- Temporal domain
  - synchronisation (time coherence)
  - latencies (online generation of the 3D model)
- Requirements on spatial synchronisation depend on the velocities of movement (expected: 0.5 m/s, 45 deg/s)
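The stated motion limits make it easy to bound how a synchronisation offset degrades spatial accuracy. A back-of-the-envelope sketch, where the velocities are the slide's expected values and the 1 ms offset and 0.1 m lever arm are illustrative assumptions:

```python
import math

def sync_error(dt_s, v_mps=0.5, omega_dps=45.0, lever_arm_m=0.1):
    """Worst-case spatial error caused by a timestamp offset dt_s
    between scanner and tracker, for a sensor translating at v_mps
    and rotating at omega_dps about a point lever_arm_m away."""
    translational = v_mps * dt_s                               # v * dt
    rotational = math.radians(omega_dps) * dt_s * lever_arm_m  # omega * dt * r
    return translational + rotational

# A 1 ms offset already costs ~0.5 mm of translation plus the
# rotation-induced error at the measurement point.
print(f"{sync_error(1e-3) * 1000:.3f} mm")  # -> 0.579 mm
```

So sub-millisecond synchronisation is needed before sub-millimetre scan accuracy can survive hand-held motion.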
Slide 8: The Multi-sensory 3D-Modeler - Hardware
- Depth-sensing principles
  - laser triangulation (rotating laser scanner, laser-stripe profiler)
  - stereo triangulation
- Pose measurement
  - internal optical tracking (image-based ego-motion)
  - external optical tracking (active/passive markers)
  - coordinate measurement (mechanical arm)
Slide 10: The Multi-sensory 3D-Modeler - Synchronisation
- hardware synchronisation: avoids interference between sensors
- software synchronisation: for devices that cannot be triggered
- labeling of sensor data: enables consistent data fusion
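The labeling idea can be sketched as stamping each sample with a common clock and then fusing streams by nearest timestamp. The data layout and function names below are illustrative assumptions, not the 3D-Modeler's API:

```python
import bisect

def label(samples, clock):
    """Attach a timestamp to each raw sample as it arrives."""
    return [(clock(), s) for s in samples]

def nearest_pose(pose_stream, t):
    """Find the tracker pose whose timestamp is closest to a depth
    measurement taken at time t; pose_stream is sorted by timestamp."""
    times = [ts for ts, _ in pose_stream]
    i = bisect.bisect_left(times, t)
    candidates = pose_stream[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda p: abs(p[0] - t))

poses = [(0.00, "P0"), (0.02, "P1"), (0.04, "P2")]
print(nearest_pose(poses, 0.031))  # -> (0.04, 'P2')
```

In practice one would interpolate between the two neighbouring poses rather than pick one, but the timestamp matching is the part that the software synchronisation has to get right.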
Slide 11: The Multi-sensory 3D-Modeler - Data Flow
[Data-flow diagram of the sensor and processing components]
Slide 12: Streaming Generation of 3D Models - Data Flow
[Pipeline diagram: 3D measurement -> limitation of point density -> point model; estimation/update of surface normals -> selection of mesh vertices -> local triangulation -> wire-frame model -> visualisation]
- Level of concurrency
  - immediate triangulation
  - immediate view integration
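The density-limitation stage of such a streaming pipeline can be sketched as a point filter backed by a hash grid: a new measurement is accepted only if no previously accepted point lies within a minimum spacing. This is an illustrative reconstruction under that assumption, not the actual implementation:

```python
import math

class DensityFilter:
    """Streaming point-density limitation: accept a new 3D point only
    if no accepted point lies within radius r. A hash grid with cell
    size r keeps the neighbour search local, so points can be
    processed one by one as they arrive from the scanner."""

    def __init__(self, r):
        self.r = r
        self.grid = {}  # cell index -> list of accepted points

    def _cell(self, p):
        return tuple(int(math.floor(c / self.r)) for c in p)

    def insert(self, p):
        cx, cy, cz = self._cell(p)
        # Only the 3x3x3 block of neighbouring cells can contain
        # a point closer than r.
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                for dz in (-1, 0, 1):
                    for q in self.grid.get((cx + dx, cy + dy, cz + dz), ()):
                        if math.dist(p, q) < self.r:
                            return False  # too close: reject
        self.grid.setdefault((cx, cy, cz), []).append(p)
        return True

f = DensityFilter(r=0.005)  # 5 mm minimum point spacing
print(f.insert((0.0, 0.0, 0.0)))    # True: first point accepted
print(f.insert((0.001, 0.0, 0.0)))  # False: within 5 mm of an accepted point
print(f.insert((0.01, 0.0, 0.0)))   # True: far enough away
```

Constant-time insertion is what makes "immediate triangulation" feasible: each incoming point touches only its local neighbourhood, never the whole model.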
Slide 13: Photo-Realistic Texture Mapping
- Texture-mapping modes
  - single-view mapping
  - single-view mapping with brightness correction
  - multi-view mapping
- Dependencies
  - accuracy of the 3D model
  - accuracy of the pose measurement
  - synchronisation
- Level of concurrency
  - sequential (manual) acquisition of texture images
  - immediate mapping onto the surface model
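Single-view texture mapping hinges on projecting each model vertex into the texture image using the tracked camera pose and the calibrated intrinsics, which is why it depends on all three items listed above. A minimal pinhole-projection sketch; all names and numbers are illustrative:

```python
def project_vertex(X_world, R, t, K):
    """Project a 3D vertex into a texture image with a pinhole camera
    (illustrative; R, t would come from the tracked camera pose,
    K = (fx, fy, cx, cy) from intrinsic calibration)."""
    # Camera coordinates: X_cam = R * X_world + t
    X_cam = [sum(R[i][j] * X_world[j] for j in range(3)) + t[i]
             for i in range(3)]
    if X_cam[2] <= 0:
        return None  # vertex behind the camera: not textured by this view
    x, y = X_cam[0] / X_cam[2], X_cam[1] / X_cam[2]
    fx, fy, cx, cy = K
    return (fx * x + cx, fy * y + cy)

R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]  # camera aligned with world axes
t = [0.0, 0.0, 0.0]
K = (800.0, 800.0, 320.0, 240.0)       # fx, fy, cx, cy in pixels
print(project_vertex([0.1, 0.0, 1.0], R, t, K))  # ~ (400.0, 240.0)
```

Any pose or synchronisation error shifts these pixel coordinates, which is exactly how misregistered texture seams arise; multi-view mapping and brightness correction then blend or normalise the per-view contributions.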
Slide 14: Navigation Support - Linking of Virtual- and Real-World Objects
[Table: display configurations ordered from high to low seamlessness and usability]
- view alignment by operator pose; see-through HMD (AR); navigation relates operator and scanner to both virtual and real object; high usability
- view alignment by scanner pose; desktop (AR); navigation relates scanner to virtual and real object
- view alignment by scanner pose; desktop (VR); navigation relates scanner to the virtual object; requires high accuracy in pose measurement
- view alignment by measurement; desktop (VR); navigation via triangulation against the virtual object; separate manual inspection
- view alignment by mouse; desktop (VR); operator performs manual inspection; low usability
Slide 15: Navigation Support via Sensor Pose
Slide 16: View-Planning Support - Deciding the Next Best View(s)
[Table: visualisation modalities ordered from high to low usability]
- textured surface model: texture-oriented view planning; virtual and real object not distinguishable in AR
- shaded surface model: shape-oriented view planning; good 3D impression
- wire-frame model: link-oriented view planning; weak 3D impression
- raw point model: measurement-oriented view planning; missing 3D impression
Slide 17: View-Planning Support via a Shaded, Textured Surface
Slide 18: Concurrent Scanning & Reconstruction with AR Support
Slide 19: Thank You!
Slide 20: Further Examples (1/2)
Slide 21: Further Examples (2/2)