Margarita Grinvald. Gesture recognition for Smartphones/Wearables

Gestures: hands, face and body movements; non-verbal communication in human interaction.

Gesture recognition: an interface to computers that increases usability through intuitive interaction.

Gesture sensing. Contact type: touch based. Non-contact type: device gestures, vision based, Electric Field Sensing (EFS).

Issues on mobile devices: miniaturisation; lack of tactile cues; no link between physical and digital interactions; limited computational power.

Approaches: augment the environment with digital information. SixthSense [Mistry et al. SIGGRAPH 2009]; Skinput [Harrison et al. CHI 2010]; OmniTouch [Harrison et al. UIST 2011].

Approaches: augment the hardware. In-air typing interface for mobile devices with vibration feedback [Niikura et al. SIGGRAPH 2010]; A low-cost transparent electric field sensor for 3D interaction [Le Goc et al. CHI 2014]; MagGetz [Hwang et al. UIST 2013].

Approaches: efficient algorithms, combining devices. In-air gestures around unmodified mobile devices [Song et al. UIST 2014]; Duet: exploring joint interactions on a smart phone and a smart watch [Chen et al. CHI 2014].

SixthSense [Mistry et al. SIGGRAPH 2009]: augments the environment with visual information; interaction through natural hand gestures; wearable, so truly mobile.

Colour markers, camera, projector, smartphone, mirror.

Support for arbitrary surfaces.

Support for multitouch.

Limitations: inability to track surfaces or to differentiate hover from click; limited accuracy.

Skinput [Harrison et al. CHI 2010]: the skin as an input canvas; wearable bio-acoustic sensor; localisation of finger taps.

Projector, armband.

Mechanical phenomena: a finger tap on the skin generates acoustic energy; some energy becomes sound waves, some is transmitted through the arm.

Transverse waves.

Longitudinal waves.

Sensing: an array of tuned vibration sensors, sensitive only to motion perpendicular to the skin; two sensing arrays to disambiguate different armband positions.

Sensor packages, weights.

Tap localisation: sensor data segmented into taps; ML classification of the tap location; initial training stage.
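
The tap-localisation step can be sketched as a standard supervised pipeline: summarise each segmented tap window into per-channel features and train a classifier on taps collected during the training stage. The features, synthetic data and linear SVM below are illustrative assumptions, not the paper's exact design.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def tap_features(window):
    # window: (channels, samples) vibration readings for one segmented tap
    amp = np.abs(window)
    return np.concatenate([amp.mean(axis=1), amp.max(axis=1)])

def synth_tap(location):
    # hypothetical stand-in data: each tap location excites one channel most
    w = rng.normal(0.0, 0.1, size=(10, 64))
    w[location * 3] += np.sin(np.linspace(0, 8 * np.pi, 64))
    return w

X = np.array([tap_features(synth_tap(loc)) for loc in range(3) for _ in range(30)])
y = np.repeat(np.arange(3), 30)

clf = SVC(kernel="linear").fit(X, y)
print(clf.score(X, y))  # separable synthetic data, near-perfect fit
```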

Limitations: no support for surfaces other than skin; no multitouch support; no touch-drag movement.

OmniTouch [Harrison et al. UIST 2011]: appropriates ad hoc surfaces on demand; depth sensing and projection; wearable; depth-driven template matching.

Depth camera, projector.

Finger tracking: multitouch finger tracking on arbitrary surfaces; no calibration or training; resolves finger position and distinguishes hover from click.

Finger segmentation: depth map; depth-map gradient.

Finger segmentation: candidates; tip estimation.
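
One way to realise this segmentation is to scan rows of the depth map for finger-like cross sections: a sharp drop in depth (left edge of the finger) followed, within a plausible finger width, by a sharp rise (right edge). A minimal sketch with assumed thresholds; the paper's exact template values are not reproduced here.

```python
import numpy as np

def finger_candidate_columns(depth_row, edge_mm=15, min_w=3, max_w=12):
    """Scan one depth-map row for finger-like slices: a sharp depth drop
    (left edge) followed within a plausible width by a sharp rise."""
    grad = np.diff(depth_row.astype(float))
    lefts = np.where(grad < -edge_mm)[0]
    rights = np.where(grad > edge_mm)[0]
    candidates = []
    for l in lefts:
        for r in rights:
            if min_w <= r - l <= max_w:
                candidates.append((l + 1, r))  # column span of the finger slice
    return candidates

# Synthetic row: flat surface at 500 mm with a finger 20 mm closer at cols 40..46
row = np.full(100, 500.0)
row[40:47] = 480.0
print(finger_candidate_columns(row))  # -> [(40, 46)]
```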

Click detection: finger hovering vs. finger clicking.
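
Distinguishing hover from click then reduces to thresholding the fingertip's height above the underlying surface; the ~10 mm snap distance below is an assumption for illustration.

```python
def finger_state(finger_depth_mm, surface_depth_mm, click_thresh_mm=10.0):
    """Classify a tracked fingertip as clicking or hovering by its
    distance above the underlying surface (threshold is an assumption)."""
    gap = surface_depth_mm - finger_depth_mm  # finger is nearer the camera
    return "click" if gap <= click_thresh_mm else "hover"

print(finger_state(470.0, 500.0))  # 30 mm above the surface -> hover
print(finger_state(495.0, 500.0))  # 5 mm above the surface -> click
```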

On-demand interfaces: expand the application space with graphical feedback; track the surface on which the interface is rendered; update the interface as the surface moves.

Interface glued to the surface.

In-air typing interface for mobile devices with vibration feedback [Niikura et al. SIGGRAPH 2010]: vision-based 3D input interface; detects keystroke actions in the air; provides vibration feedback.

Vibration motor, white LEDs, camera.

Tracking: high-frame-rate camera; wide-angle lens, requiring distortion correction; skin-colour extraction to detect the fingertip; estimation of fingertip translation, rotation and scale.
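
The skin-colour extraction step can be approximated with a classic RGB skin heuristic; the thresholds and the "topmost skin pixel is the fingertip" rule below are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def skin_mask(img):
    """Classic RGB skin heuristic (thresholds are assumptions):
    skin pixels are reddish and bright enough."""
    r, g, b = (img[..., c].astype(int) for c in range(3))
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def fingertip(img):
    """Take the topmost skin pixel as the fingertip estimate."""
    ys, xs = np.nonzero(skin_mask(img))
    if ys.size == 0:
        return None
    i = ys.argmin()
    return int(ys[i]), int(xs[i])

# Synthetic frame: dark background with a skin-coloured finger blob
frame = np.zeros((60, 80, 3), dtype=np.uint8)
frame[20:50, 35:42] = (200, 120, 90)  # r > g > b, passes the skin rule
print(fingertip(frame))  # -> (20, 35), top-left pixel of the blob
```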

Keystroke detection: a keystroke is detected from the change in the dominant frequency of the fingertip's scale; tactile feedback matters, so vibration feedback is conveyed after each keystroke.
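
The keystroke cue can be sketched as follows: track the fingertip's scale over time, take the dominant FFT frequency of that signal, and report a keystroke when the motion is fast enough. The frame rate and threshold are assumed values.

```python
import numpy as np

def dominant_frequency(scale_series, fps=120.0):
    """Dominant frequency (Hz) of a fingertip-scale time series,
    ignoring the DC component."""
    spectrum = np.abs(np.fft.rfft(scale_series - np.mean(scale_series)))
    freqs = np.fft.rfftfreq(len(scale_series), d=1.0 / fps)
    return freqs[spectrum.argmax()]

def is_keystroke(scale_series, fps=120.0, thresh_hz=8.0):
    """A quick down-and-up tap shifts the dominant frequency of the
    scale signal upwards; the threshold here is an assumption."""
    return dominant_frequency(scale_series, fps) > thresh_hz

t = np.arange(64) / 120.0
hover = 1.0 + 0.05 * np.sin(2 * np.pi * 2.0 * t)   # slow drift, ~2 Hz
tap = 1.0 + 0.05 * np.sin(2 * np.pi * 15.0 * t)    # quick tap, ~15 Hz
print(is_keystroke(hover), is_keystroke(tap))  # -> False True
```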

Vision limitations: the camera is a rich and flexible sensor, but it has limitations: a minimal distance between sensor and scene, sensitivity to lighting changes, computational overhead and high power requirements.

A low-cost transparent electric field sensor for 3D interaction [Le Goc et al. CHI 2014]: a smartphone augmented with EFS; resilient to illumination changes; maps measurements to 3D finger positions.

Drive electronics, electrode array.

Recognition: the microchip's built-in 3D positioning has low accuracy; Random Decision Forests for regression on the raw signal data improve both speed and accuracy.
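
The regression idea can be sketched with scikit-learn: learn a mapping from raw electrode signals to 3D finger positions from labelled sweeps. The electrode layout and signal fall-off model below are synthetic stand-ins, not the sensor's real physics.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)

# Hypothetical layout: 5 electrodes whose signal falls off with finger distance
electrodes = rng.uniform(0, 1, size=(5, 3))

def electrode_signals(pos):
    d = np.linalg.norm(electrodes - pos, axis=1)
    return 1.0 / (1.0 + d ** 2)

positions = rng.uniform(0, 1, size=(500, 3))
signals = np.array([electrode_signals(p) for p in positions])

# Regress raw signals -> 3D finger position (multi-output forest)
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(signals, positions)
pred = model.predict(electrode_signals(np.array([0.5, 0.5, 0.5]))[None, :])
print(pred.shape)  # -> (1, 3)
```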

MagGetz [Hwang et al. UIST 2013]: tangible control widgets for richer tactile cues; wider interaction area; low cost and user configurable; unpowered magnets.

Magnetic fields, tangibles.

Tangibles: traditional physical input controls augmented with magnets; the magnetic trace changes when the widget's state changes; the phone tracks the physical movement of the control widgets.

Tangibles and magnetism: toggle switch, slider.
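
State tracking for such widgets can be sketched as nearest-template matching of the phone's magnetometer reading against readings calibrated per widget state; the template values below are hypothetical.

```python
import numpy as np

# Hypothetical calibration: mean magnetometer reading (uT) recorded
# for each state of a magnetic toggle switch.
templates = {
    "toggle_up": np.array([12.0, -3.0, 40.0]),
    "toggle_down": np.array([-8.0, 5.0, 38.0]),
}

def widget_state(reading):
    """Classify the current magnetometer reading by nearest template."""
    return min(templates, key=lambda s: np.linalg.norm(templates[s] - reading))

print(widget_state(np.array([11.0, -2.5, 41.0])))  # -> toggle_up
```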

Limitations: possible object damage by the magnets; magnetometer limitations.

In-air gestures around unmodified mobile devices [Song et al. UIST 2014]: extends the interaction space by gesturing around mobile devices; uses the built-in RGB camera; robust ML-based algorithm.

Gesture recognition: detection of salient hand parts (fingertips); works without highly discriminative depth data or rich computational resources; no strong assumptions about the user's environment; reasonably robust to rotation and depth variation.

Recognition algorithm: real time; pixel labelling with random forests; techniques to reduce the memory footprint of the classifier.

Recognition steps: RGB input, segmentation, labelling.
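
The labelling stage can be sketched as per-pixel classification with a random forest over simple colour features; real systems use richer features and memory-compressed forests, so treat the RGB-only features here as an assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(2)

# Synthetic training pixels: skin-toned "hand" vs darker background
hand = rng.normal([200, 130, 100], 10, size=(300, 3))
background = rng.normal([60, 60, 70], 10, size=(300, 3))
X = np.vstack([hand, background])
y = np.array([1] * 300 + [0] * 300)

forest = RandomForestClassifier(n_estimators=20, random_state=0).fit(X, y)

# Label every pixel of a frame; the hand pixels would then feed fingertip search
frame = np.zeros((8, 8, 3))
frame[:, :4] = [200, 130, 100]   # left half "hand"
frame[:, 4:] = [60, 60, 70]      # right half background
labels = forest.predict(frame.reshape(-1, 3)).reshape(8, 8)
print(labels[0, 0], labels[0, 7])  # -> 1 0
```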

Applications: division of labour; works on many devices; new applications enabled just by collecting new data.

Duet: exploring joint interactions on a smart phone and a smart watch [Chen et al. CHI 2014]: goes beyond the usage of a single device; allows individual input and output as well as joint interactions between a smart phone and a smart watch.

Design space theory: a conversational duet; foreground interaction; background interaction.

Design space.

Gesture recognition: ML techniques on accelerometer data; handedness recognition with promising accuracy.
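
Handedness recognition from accelerometer data can be sketched as feature extraction over short windows plus a linear classifier; the gravity-sign cue and the feature set below are illustrative assumptions, not the paper's exact method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

def features(window):
    """Simple per-axis statistics of one accelerometer window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def synth_window(handedness):
    # Hypothetical cue: wearing the watch on the other wrist flips the
    # sign of the dominant gravity component on the x axis.
    g = np.array([9.8 if handedness else -9.8, 0.0, 0.0])
    return g + rng.normal(0, 0.5, size=(50, 3))

X = np.array([features(synth_window(h)) for h in (0, 1) for _ in range(40)])
y = np.repeat([0, 1], 40)

clf = LogisticRegression().fit(X, y)
print(clf.score(X, y))  # cleanly separable synthetic data
```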

Summary: wearables extend the interaction space to everyday surfaces; augmented hardware in general provides an intuitive interface; no additional hardware is preferable, but computational limitations remain; a combination of devices may be redundant.

References
SixthSense: A Wearable Gestural Interface [Mistry et al. SIGGRAPH 2009]
Skinput: Appropriating the Body as an Input Surface [Harrison et al. CHI 2010]
OmniTouch: Wearable Multitouch Interaction Everywhere [Harrison et al. UIST 2011]
In-air Typing Interface for Mobile Devices with Vibration Feedback [Niikura et al. SIGGRAPH 2010]
A Low-cost Transparent Electric Field Sensor for 3D Interaction on Mobile Devices [Le Goc et al. CHI 2014]
MagGetz: Customizable Passive Tangible Controllers On and Around Conventional Mobile Devices [Hwang et al. UIST 2013]
In-air Gestures Around Unmodified Mobile Devices [Song et al. UIST 2014]
Duet: Exploring Joint Interactions on a Smart Phone and a Smart Watch [Chen et al. CHI 2014]