A Mouse-Like Hands-Free Gesture Technique for Two-Dimensional Pointing

Yusaku Yokouchi and Hiroshi Hosobe
Faculty of Computer and Information Sciences, Hosei University
3-7-2 Kajino-cho, Koganei-shi, Tokyo 184-8584, Japan

Abstract. The use of motion sensing for input devices is becoming increasingly popular. In particular, hands-free gesture input is promising for such devices. We propose a mouse-like hands-free gesture technique for two-dimensional pointing. It is characterized as follows: (1) a user horizontally moves his/her hand to position a cursor shown on a vertical screen; (2) the user activates cursor movement by opening his/her hand and deactivates it by clenching; (3) the user performs target selection by clicking in the air with his/her index finger; (4) the user is assisted in quick but precise cursor movement by automatic acceleration. We present the results of a user study that experimentally compared the mouse-like technique with a tablet-like one.

1 Introduction

The use of motion sensing for input devices such as the Nintendo Wii Remote and Microsoft Kinect is becoming increasingly popular. In particular, hands-free gesture input is promising for such devices. Research efforts have often been devoted to ray casting and its variants [5], which are pointing techniques based on the sensed directions of users' hands or fingers. However, a different approach can also be taken: mapping the sensed positions of users' hands or fingers to two-dimensional (2D) or three-dimensional (3D) coordinates in virtual spaces [4]. Along this line, previous research on 3D pointing [3] compared a tablet-like technique based on absolute positioning with a mouse-like one based on relative positioning, and experimentally showed that the tablet-like one was better.

We propose a mouse-like hands-free gesture technique for 2D pointing. It is characterized as follows:

1. A user horizontally moves his/her hand to position a cursor shown on a vertical screen (Fig. 1(a));
2. The user activates cursor movement by opening his/her hand and deactivates it by clenching;
3. The user performs target selection by clicking in the air with his/her index finger (Fig. 1(b));
4. The user is assisted in quick but precise cursor movement by automatic acceleration.

Fig. 1. Mouse-like pointing technique: (a) relative movement; (b) clicking in the air.

We performed a user study to experimentally compare the mouse-like technique with a tablet-like one. We studied eight subjects who performed target selection tasks. A two-way ANOVA on the experimental results indicated a significant difference between the two techniques. Six out of the eight subjects answered that the mouse-like technique was easier to use.

2 Proposed Technique

This section proposes a mouse-like hands-free gesture technique for 2D pointing.

2.1 Cursor Movement

Most hands-free gesture techniques for 2D pointing can be grouped into two categories: those based on the sensed directions of users' hands or fingers, and those based on the sensed positions of hands or fingers. The latter category can be further divided into two approaches: the tablet-like approach, which uses absolute positioning of hands or fingers, and the mouse-like approach, which adopts relative positioning.

We propose a mouse-like pointing technique. Our technique is unique in that it uses the horizontal movement of a user's hand to obtain a cursor position on a vertical screen, as shown in Fig. 1(a). More specifically, if the user moves his/her hand toward the screen, the cursor moves upward on the screen; if the user moves his/her hand toward himself/herself, the cursor moves downward; and if the user moves his/her hand to the left or to the right, the cursor moves in the same direction. We can expect that, compared to vertical movement, the horizontal movement of the user's hand reduces fatigue. Although the movement of the hand and the corresponding movement of the cursor are not parallel, this situation is common to existing 2D pointing devices such as mice and touchpads.

Since our pointing technique is based on the mouse-like approach, the cursor moves on the screen only while it is activated. For this purpose, we let the user make the following hand gestures: to activate the movement of the cursor, the user opens his/her hand; to deactivate it, the user clenches.
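As an illustrative sketch (in Python rather than the C# of the actual implementation, with a hypothetical per-frame interface), the mapping from horizontal hand motion to cursor motion, gated by the open/clench state, might look like:

```python
def update_cursor(cursor, hand_open, dx_mm, dz_mm, gain_px_per_mm=20.0):
    """Advance the cursor by one frame of hand motion.

    cursor         -- (x, y) cursor position in pixels
    hand_open      -- True while the hand is open (movement activated)
    dx_mm          -- hand movement to the right, in millimeters
    dz_mm          -- hand movement toward the user, in millimeters
    gain_px_per_mm -- illustrative constant gain (the actual technique
                      uses a piecewise accelerated gain instead)
    """
    if not hand_open:       # clenched hand: movement deactivated
        return cursor
    x, y = cursor
    x += dx_mm * gain_px_per_mm   # left/right maps directly
    y += dz_mm * gain_px_per_mm   # toward the user maps to downward
    return (x, y)
```

With screen coordinates growing downward, moving the hand toward the screen yields a negative dz_mm, so the cursor moves up.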

In addition, we incorporate a cursor acceleration method, common to existing 2D pointing devices, to assist users in quick but precise cursor movement. In the case of mice, cursor acceleration uses a nonlinear mapping from the velocity of the mouse to the velocity of the cursor. More specifically, if the user moves the mouse slowly, the cursor moves slowly on the screen; if the user moves the mouse more quickly, the cursor moves much more quickly. This enables almost stress-free cursor movement on a large screen while still allowing careful cursor movement when necessary. Since our pointing technique is based on the mouse-like approach, we can naturally incorporate cursor acceleration into it.

2.2 Target Selection

We again adopt a hand gesture to let the user select a target on the screen. Specifically, the user selects the target by clicking in the air with his/her index finger, as shown in Fig. 1(b). Since the user moves only his/her index finger without needing to move his/her hand, we can expect that this operation has almost no negative effect on the cursor movement operation. Also, we can incorporate other operations in the same way by using fingers other than the index finger.

3 Implementation

We implemented the proposed technique by using the Leap Motion controller [4]. The controller uses infrared LEDs and cameras to recognize hand gestures that a user performs above it. We implemented the necessary program as a Windows Presentation Foundation application in C# with Microsoft Visual Studio 2013; the program consists of 850 lines of code. We used the position of the user's middle finger as the position of the user's hand. To identify whether the user's hand was open or clenched, we used Leap Motion's confidence data. To identify whether the user clicked with his/her index finger, we used the index finger's height relative to the average height of his/her ring and little fingers.
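A minimal sketch of these two details in Python (the original is C#; the click threshold is a hypothetical value, while the gain constants are those quoted in the next paragraph):

```python
def is_click(index_y, ring_y, little_y, drop_mm=10.0):
    """Air-click test: the index fingertip dips below the average height
    of the ring and little fingertips. drop_mm is a hypothetical
    threshold; the paper does not state the exact value used."""
    baseline = (ring_y + little_y) / 2.0
    return baseline - index_y > drop_mm

def cursor_gain(hand_move_mm):
    """Piecewise gain in pixels per millimeter of per-frame hand
    movement, simulating Windows XP-style mouse acceleration."""
    if hand_move_mm < 0.44:
        return 12.0
    if hand_move_mm <= 1.25:
        return 18.1
    if hand_move_mm <= 3.86:
        return 27.5
    return 56.8
```

Slow hand motion thus gets a small gain for precise positioning, while fast motion gets up to roughly 4.7 times that gain for crossing a large screen.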
We implemented mouse acceleration by simulating that of Windows XP [1]. Specifically, if the distance of the movement of the user's hand per frame is less than 0.44 millimeters, the cursor moves 12.0 pixels per millimeter; if the hand movement is between 0.44 and 1.25 millimeters, the cursor moves 18.1 pixels per millimeter; if the hand movement is between 1.26 and 3.86 millimeters, the cursor moves 27.5 pixels per millimeter; and if the hand movement is more than 3.86 millimeters, the cursor moves 56.8 pixels per millimeter.

4 Experiments

This section describes the experiments that we conducted to evaluate the performance of the proposed mouse-like technique.

4.1 Method

We experimentally compared the proposed mouse-like technique with a tablet-like one. The tablet-like technique directly maps the absolute position of a user's hand to coordinates on the screen (Fig. 2(a)), and enables the user to perform target selection by tapping in the air (Fig. 2(b)).

Fig. 2. Tablet-like pointing technique: (a) absolute movement; (b) tapping in the air.

We designed a task to evaluate the performance of users' pointing with the mouse-like and the tablet-like techniques. In this task, a user selects targets that repeatedly appear at random positions on the screen. The position of a new target is always at most 1000 pixels distant from the previous one. The size of each target is 74 × 100 pixels, which is almost equal to the size of a standard icon on Windows with a 1920 × 1080-resolution screen. We designed this task in a similar way to that used for a predictive cursor movement technique [2]; however, the behavior of targets in our task is more random.

We studied eight subjects who performed the target selection task. All the subjects were male and right-handed, and their average age was 21.8 years. Half of them used the mouse-like and then the tablet-like technique, and the others used the techniques in the reverse order. They had a five-minute practice before the task and a five-minute rest after it. Each subject repeated the task 100 times for each technique, that is, 200 times in total. An experiment took approximately 30 minutes per subject. We used a 1920 × 1080-resolution LCD display. We recorded the time for each task, the number of misclicks, and the position of each target. After each subject's experiment, we administered a questionnaire with five-grade ratings of the usability and fatigability of the two techniques.

4.2 Results

We compared the two techniques by analyzing the data that we obtained from the experiments.
To simplify the analysis, we classified target distances into ten levels. A two-way ANOVA on the two techniques and the target distances indicated significant differences between the two techniques (F(1, 1580) = 3.900, p < 0.001) and between the target distances (F(9, 1580) = 7.006, p < 0.01). However, there was no interaction between the two techniques and the target distances.

Fig. 3 plots the relation between target distances and pointing times. The bars indicate the average times over all the subjects, and the error bars indicate the standard deviations. The mouse-like technique resulted in smaller standard deviations than the tablet-like one for most of the target distances.

Fig. 3. Results of the experiments comparing the mouse-like and the tablet-like pointing techniques.

Both techniques resulted in average error rates of approximately 40%. By contrast, the average error rate for the same task using a mouse was approximately 4%. This indicates that the hand gesture techniques were less stable than mouse-based operation. The total numbers of misclicks were 774 for the mouse-like technique and 862 for the tablet-like one.

In the questionnaires after the experiments, six out of the eight subjects answered that the mouse-like technique was easier to use. Four out of the eight subjects answered that the tablet-like technique had caused more fatigue than the mouse-like one. We also received comments that subjects had experienced difficulty in selecting targets located near the sides of the screen. Both techniques received only a few high ratings, which indicates that the subjects felt stress during hands-free gesture pointing.
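For concreteness, the task's target placement (Sect. 4.1) and a ten-level distance classification can be sketched as follows; uniform sampling, rejection sampling, and equal-width bins are assumptions, since the paper does not specify these details (Python, for illustration only):

```python
import math
import random

SCREEN_W, SCREEN_H = 1920, 1080   # display resolution used in the study
TARGET_W, TARGET_H = 74, 100      # target size in pixels
MAX_DIST = 1000.0                 # maximum distance from the previous target

def next_target(prev_x, prev_y):
    """Pick a random on-screen target position at most MAX_DIST pixels
    from the previous target's position (rejection sampling)."""
    while True:
        x = random.uniform(0, SCREEN_W - TARGET_W)
        y = random.uniform(0, SCREEN_H - TARGET_H)
        if math.hypot(x - prev_x, y - prev_y) <= MAX_DIST:
            return x, y

def distance_level(dist_px, n_levels=10, max_dist=MAX_DIST):
    """Classify a target distance into one of ten equal-width levels
    for the kind of two-way analysis reported above."""
    level = int(dist_px * n_levels / max_dist)
    return min(level, n_levels - 1)
```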

5 Conclusions and Future Work

We proposed a mouse-like hands-free gesture technique for 2D pointing. It uses the horizontal movement of a user's hand, and adopts hand gestures to enable relative cursor movement and target selection. We experimentally evaluated the performance of the proposed mouse-like technique by comparing it with a tablet-like one. We studied eight subjects, and found that the mouse-like technique reduced pointing times and provided more stable operation. The questionnaires after the experiments also indicated that the mouse-like technique was easier to use and caused less fatigue.

Our future work includes improving the recognition of hand gestures: there are still many hand gestures that cannot be precisely recognized, and more precise recognition is needed to enhance the usefulness of hands-free gesture pointing techniques. Other future work is to devise a better method for evaluating gesture pointing techniques, since the experiments that we performed were originally designed for mouse-based pointing techniques.

Acknowledgement. This work was supported by JSPS KAKENHI Grant Number 25540029.

References

1. Adjusting the acceleration of a mouse pointer on Windows XP, http://07.net/mouse/
2. Asano, T., Sharlin, E., Kitamura, Y., Takashima, K., Kishino, F.: Predictive interaction using the Delphian Desktop. In: Proc. ACM UIST, pp. 133-141 (2005)
3. Averkiou, M., Dodgson, N.A.: Comparison of relative (mouse-like) and absolute (tablet-like) interaction with a large stereoscopic workspace. In: SD&A, Proc. SPIE, vol. 7863, p. 41 (2011)
4. Plemmons, D., Mandel, P.: Introduction to motion control (2014), https://developer.leapmotion.com/articles/intro-to-motion-control
5. Vogel, D., Balakrishnan, R.: Distant freehand pointing and clicking on very large, high resolution displays. In: Proc. ACM UIST, pp. 33-42 (2005)