Subframe Video Post-Synchronization
1. Testpattern for Measuring Offset

Helmut Dersch

November 24, 2016

Abstract

This is a 3-part series of articles dealing with the post-synchronization of video streams for stereoscopic applications and/or for stitching panorama movies. The first two parts present methods for determining the offsets between video streams with sub-millisecond accuracy. The third part presents software tools for time-shifting videos by frame interpolation, establishing synchronized streams with the same accuracy.

Contents

1 Introduction
2 Testpattern
3 Testvideo
4 Results
5 Comparison with other methods

1 Introduction

Video capture using several cameras, as in stereoscopic video or stitched video panoramas, requires synchronized streams: frames from each camera have to be taken at exactly the same time, otherwise stereoscopic vision suffers or stitching errors occur. Aligning to integer frame numbers may result in delays of up to half a frame duration (8.3 ms at a framerate of 60 fps) when using independent cameras. Often, this results in visible errors. As an example, typical velocities of human movements (walking, gesturing, etc.) are 1-2 m/s, i.e. 1-2 cm per half-frame; a car moving at 50 km/h is 12 cm ahead in the delayed image. These distances may be of the same order of magnitude as the left-right parallax responsible for stereo vision.
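The magnitudes quoted above follow directly from velocity times timing error. A minimal sketch in Python (illustrative only, not part of the original toolchain) reproduces the numbers:

    # Displacement of a moving subject caused by a synchronization offset.
    # Numbers as in the text: 60 fps, worst-case offset of half a frame.

    def displacement_cm(speed_m_per_s, offset_s):
        """Distance (in cm) a subject moves during the offset."""
        return speed_m_per_s * offset_s * 100.0

    half_frame = 0.5 / 60.0                        # 8.3 ms at 60 fps

    print(displacement_cm(1.0, half_frame))        # walking, 1 m/s   -> ~0.8 cm
    print(displacement_cm(2.0, half_frame))        # gesturing, 2 m/s -> ~1.7 cm
    print(displacement_cm(50 / 3.6, half_frame))   # car at 50 km/h   -> ~11.6 cm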

Hardware synchronization solves this problem but is only available in a few (quite expensive) cameras [1]. There are a number of ways to retrofit some cameras with either hardware sync or at least the ability to start recording synchronously; these will not be reviewed here. We present a software solution which allows the user to synchronize unaligned video streams with drifting clocks. This first article deals with precisely measuring the time delay between several streams using a testpattern.

2 Testpattern

A simple technique for measuring time delays between video streams makes use of the special way cathode-ray tubes (CRTs) in monitors render images [4]. The position of the electron beam constitutes a very accurate time stamp, and taking images of the moving beam with several cameras can be used to determine the time delay between the cameras for that particular take. CRTs are extinct now, and they cannot easily be used outdoors in the field. A similar but less accurate method can be realized using the fast OLED displays of smartphones capable of displaying 60 frames per second (a Samsung Note 4 in this work). These displays render images from top to bottom (in portrait mode), one row at a time. When the image is finished, the next frame is rendered immediately afterwards. The currently rendered line moves downwards at a constant speed, and its position can be used as a timestamp. An ideal test should consist of a series of alternating black/white (or other high-contrast) images, making the currently rendered line easily identifiable as a black/white border.
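Such a pattern is straightforward to re-create. The following sketch (Python with NumPy and OpenCV; grid layout and labeling are assumptions, and the downloadable original [2] should be preferred for actual measurements) writes one minute of alternating black/white frames with a grid and frame numbers:

    # Illustrative re-creation of the alternating testpattern video.
    # Assumes OpenCV (cv2) and NumPy; details of the original pattern [2]
    # such as grid spacing and label placement may differ.
    import cv2
    import numpy as np

    W, H, FPS, SECONDS = 1080, 1920, 60, 60        # portrait, 60 fps, 60 s
    out = cv2.VideoWriter("testpattern.avi",
                          cv2.VideoWriter_fourcc(*"FFV1"),  # lossless codec:
                          FPS, (W, H))                      # compression kills the pattern

    for n in range(FPS * SECONDS):
        value = 255 if n % 2 == 0 else 0           # alternate white and black
        frame = np.full((H, W, 3), value, np.uint8)
        for x in range(0, W, W // 10):             # vertical grid lines
            cv2.line(frame, (x, 0), (x, H - 1), (0, 128, 0), 2)
        for y in range(0, H, H // 10):             # horizontal grid lines
            cv2.line(frame, (0, y), (W - 1, y), (0, 128, 0), 2)
        cv2.putText(frame, str(n), (40, 120), cv2.FONT_HERSHEY_SIMPLEX,
                    3, (0, 0, 255), 6)             # frame number
        out.write(frame)
    out.release()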

Figure 1: Two consecutive frames of the testvideo. The stream alternates between these two patterns.

Figure 1 shows two consecutive frames of this pattern. In addition to the background color, a measuring grid and frame numbers are provided. 60 seconds of these frames alternating at a rate of 60 fps can be downloaded here [2]. The pattern is easily destroyed by even modest compression levels, so we ask that you not upload and distribute this video through YouTube or similar websites.

3 Testvideo

Video cameras usually have CMOS imaging chips which exhibit rolling shutters [3], i.e. the image is captured one row at a time (in landscape mode for the cameras used in this work), similar to how the screen image for the test pattern is rendered. The apparent movement of the shutter overlaps the movement of the testpattern on the smartphone screen, and the final testvideo depends on the relative orientation of the two.

Figure 2: Frame of the testvideo. The rolling shutter moves perpendicular to the screen image.

Figure 2 shows the result for the case when the two movements are perpendicular (the rolling shutter moves from top to bottom, the smartphone screen image from left to right). This leads to a slanted black/white border. The horizontal grid spacing represents 1/20 of a frame duration (0.83 ms at a framerate of 60 fps). The transition region extends over about 3 ms horizontally, which is due to the finite exposure time of this particular frame. The timing value can be read with better accuracy than this exposure duration, because there is one distinct point which is exposed equally by the white and the black pattern, and which is therefore homogeneously grey (zero contrast).
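Why the border is slanted can be seen from a small timing model: each camera row is exposed at a slightly different time, and the screen's black/white boundary moves on in between. A sketch with an assumed (hypothetical) readout time:

    # Model of the slanted border in the perpendicular configuration.
    # The camera reads rows top to bottom while the boundary sweeps
    # sideways in the camera's view. The readout time is an assumption.
    T_FRAME = 1 / 60       # screen frame duration (s)
    T_READ  = 1 / 120      # hypothetical rolling-shutter readout time (s)
    H_CAM   = 1080         # number of camera rows (landscape)

    def boundary_position(y, t0=0.0):
        """Border position at camera row y, in fractions of a frame duration."""
        t = t0 + (y / H_CAM) * T_READ    # time at which row y is exposed
        return (t % T_FRAME) / T_FRAME   # boundary sweeps one screen per frame

    for y in (0, 540, 1079):             # linear in y, hence a straight slant
        print(y, round(boundary_position(y), 3))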

Figure 3: Testvideo for parallel and antiparallel rolling-shutter movement.

Figure 3 shows the results for parallel and antiparallel movement of the shutter and the rendered screen. In these cases the border is straight and becomes widened or compressed. The perpendicular mode of figure 2 is simpler to analyze and has been used for the tests below. It should be noted that the pattern observed in figures 2 and 3 slowly moves across the screen, indicating that playback of 60 fps videos on the Note 4 is actually somewhat slower and that occasionally a frame is dropped. This introduces a tiny and negligible error in the timing values, as long as measurements do not coincide with a dropped frame.

4 Results

The following results were obtained in a field test using two action cameras (Sony HDR-AS100V/AS200V, mounted as a stereo pair) and a Samsung Note 4 delivering the test pattern. The cameras are controlled with the Live-View Remote RM-LVR, and a recording session was started using the multi-camera feature of this setup. At the beginning and at the end of the recording interval the testpattern was briefly held in view. Supposedly, the cameras should start recording simultaneously, but frames with identical testpattern frame numbers (2217 in the example below) are 12 frames (200 ms) apart in the two recorded videos.

Figure 4: Left and right frames of the stereovideo for the same testpattern frame.

Figure 5: Enlarged measurement regions of the frames in figure 4 above.

Figure 4 shows an extracted frame (number 539) of the left camera and the corresponding frame (number 551) of the right camera. Analyzing the position of the slanted border between the black and white regions enables us to determine the delay more precisely: figure 5 shows the border regions of both images enlarged. We have to measure time at the same rolling-shutter position in both images, which means that the vertical pixel coordinate y has to be the same in both images. The frame size is 1920 × 1080 pixels; I arbitrarily choose y = 780, which is indicated as a horizontal yellow line in both images. We then have to find the intersection with the slanted border line and count the horizontal units on the smartphone screen. I obtain about 2.6 units in the right and 2.3 units in the left image, i.e. a difference of 0.3 units. One unit corresponds to 1/10 of a frame duration (1.67 ms), so the two images are 0.5 ms apart. Since the frame numbers differ by 12, the total delay is 12.03 × 16.7 ms = 200.5 ms. 30 minutes, 108000 frames and 10 km later this measurement is repeated, and we obtain a delay of 11.25 frames ≙ 187.5 ms. The internal clocks of the two cameras were drifting apart by 13 ms / 30 min = 7.2 × 10⁻⁶. Given the rough environmental conditions (freezing temperatures, vibrations, bicycle mount) this is not unusual, and it shows that a synchronized recording start alone is not sufficient.
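The arithmetic of this measurement is compact enough to script. A sketch with the numbers from above (function names are illustrative):

    # Delay between the two streams from the testpattern reading.
    # One grid unit corresponds to 1/10 of a frame duration.
    FRAME_MS = 1000 / 60                       # 16.7 ms frame duration

    def delay_ms(frame_diff, units_left, units_right):
        """Integer frame offset plus the subframe part from the border."""
        subframe = (units_right - units_left) / 10
        return (frame_diff + subframe) * FRAME_MS

    start = delay_ms(12, 2.3, 2.6)             # -> ~200.5 ms
    end   = 11.25 * FRAME_MS                   # second measurement: 187.5 ms
    drift = (start - end) / (30 * 60 * 1000)   # 13 ms per 30 min -> ~7.2e-6
    print(start, end, drift)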

5 Comparison with other methods

The second article of this series presents a method for determining the same delay using the output of a software videostabilizer. This stabilizer calculates motion-induced rotation angles (yaw, pitch, roll) for both cameras. Since the two cameras are rigidly mounted to each other on a bicycle helmet, these movements are (almost) identical for both cameras, but time-shifted by the delay. The delay can then be determined to subframe precision by a correlation analysis. The results are summarized in figure 6.

Figure 6: Delay of the left videostream with respect to the right stream as measured by the testpattern method (circles) and by the videostabilizer (yaw: green, roll: cyan, pitch: blue). The video consists of 110000 frames at 60 fps (30 minutes). The frame duration is 16.7 ms.

The values determined by the testpattern method are shown as two circles at the beginning and the end. The lines in between are determined by the correlation analysis of the roll, yaw and pitch angles. The agreement is excellent.
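Such a correlation analysis can be sketched in a few lines. The following is a minimal illustration (Python/NumPy, with parabolic interpolation of the correlation peak for subframe precision), not the videostabilizer's actual code:

    # Subframe delay between two sampled angle tracks (e.g. the roll
    # angles of the left and right cameras) via cross-correlation.
    import numpy as np

    def subframe_delay(a, b):
        """Delay of track b relative to track a, in fractional frames."""
        a = a - a.mean()
        b = b - b.mean()
        corr = np.correlate(b, a, mode="full")   # peak index encodes the lag
        k = int(np.argmax(corr))
        # Parabola through the peak and its neighbours refines the lag.
        y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
        frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
        return (k - (len(a) - 1)) + frac

    # Synthetic check: a smooth track shifted by 12.03 frames.
    t = np.arange(2000.0)
    track = np.sin(0.05 * t) + 0.3 * np.sin(0.013 * t)
    shifted = np.interp(t - 12.03, t, track)
    print(subframe_delay(track, shifted))        # approx. 12.03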

Using a least-squares linear fit we calculate the delay ∆n (in fractional frames) for each left-right frame pair as a function of the frame number n:

∆n = −7.2475 × 10⁻⁶ · n + 12.038

This function is shown as a red line in figure 6. Multiplying ∆n by the frame duration 1/60 s = 16.7 ms yields the time delay with an accuracy of approximately ±0.5 ms.

References

[1] Wikipedia: Genlock
[2] Homepage H. Dersch
[3] Wikipedia: Rolling Shutter
[4] Camera Sync Tester software