Contactless Hand Gesture Recognition System


www.ijecs.in International Journal Of Engineering And Computer Science ISSN: 2319-7242, Volume 3, Issue 11, November 2014, Page No. 9238-9242

Puja D. Kumbhare
Email-id: Vasu11oct@gmail.com
Branch: Wireless Communication
College Name: Abha Gaikwad-Patil College of Engineering, Mohgaon

Abstract

A gesture recognition system is a system that recognizes and differentiates between gestures. These gestures can be any type of facial or body gesture: various facial expressions constitute facial gestures, while the gestures that can be made using the hand, or the palm more specifically, are called hand gestures. In this project, we aim to develop a system that is able to recognize a set of such hand gestures and differentiate between them, triggering a certain event corresponding to each gesture. We propose to develop a system that can successfully recognize two-dimensional hand gestures, using an IR sensor matrix for recognition. We also plan to connect the recognition system to a computer keyboard emulation system, so that the appropriate gestures can cause corresponding keyboard events such as pressing and releasing a key. We reduce the number of required IR sensors to two and thus reduce the power consumption, which prior work identifies as a critical issue. Even using the limited information from only two IR sensors, our system can achieve accurate gesture recognition using the proposed IR feature set and classifier.

Keywords: Hand gesture, IR sensor, two-dimensional.

I. INTRODUCTION

As mentioned in the abstract, we plan to develop a contactless gesture recognition system that successfully recognizes and differentiates between various hand gestures over a two-dimensional plane.
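The paper gives no code, but the first step of the pipeline it describes, turning raw IR proximity readings into an ordered list of sensor activations that recognition can operate on, can be sketched as follows. This is a purely illustrative sketch: the threshold value, sensor labels, and function name are assumptions, not values from the paper.

```python
# Illustrative sketch only: convert raw IR proximity samples into the
# ordered list of sensor activations used for gesture recognition.
# THRESHOLD and the sensor labels are assumptions, not from the paper.
THRESHOLD = 500  # raw reading above which a hand is considered present

def activation_order(samples):
    """samples: iterable of (timestamp, {sensor_label: raw_value}).
    Returns sensor labels in the order each one first becomes active."""
    order = []
    active = set()
    for _t, readings in samples:
        for label, value in readings.items():
            if value >= THRESHOLD:
                if label not in active:
                    active.add(label)      # sensor just activated
                    order.append(label)    # record activation order
            else:
                active.discard(label)      # sensor released
    return order
```

The activation order produced here is exactly the kind of sequence (e.g. C3 C2 C1) that the later sections map to gestures.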
Gesture recognition systems are broadly classified into two types: gesture recognition over a two-dimensional plane and gesture recognition over a three-dimensional plane.

1) Gesture recognition over a two-dimensional plane: Here, any gesture, whether a hand gesture, a facial gesture, or any other gesture, is detected using sensors that return information over only two dimensions, e.g. X and Y. Consider the trackpad on a laptop. It senses your finger and tracks its movement. However, sensing is done only to gather the movement of your finger over the X-Y plane, so that it can be reflected on the X-Y plane of the screen. In simple words, as your finger moves over the trackpad, the X and Y co-ordinates are gathered, and the cursor on the screen is moved correspondingly over the X-Y plane of the screen. As another example, any touchscreen device is also a gesture recognition system over a 2D plane. A mobile phone operated through a touch interface is actually a sensing device that gathers data about the movements and touch gestures you make on the touchscreen surface and translates them into appropriate actions and events on the device. Operations such as scroll, zoom in, zoom out, and draw can be performed on the device using their corresponding gestures.

2) Gesture recognition over a three-dimensional plane: On the other hand, there are systems that allow sensing of gestures in three dimensions. That is, information is gathered about the gesture on all three axes: X, Y, and Z. A doctor performing a procedure on a patient from a remote location using a robot and a gesture recognition glove is a good example of such a system. It must be noted that such systems are relatively more complex to create, more expensive, and require highly sophisticated and complex algorithms for recognizing gestures over three dimensions.

II. EASE OF USE

Here, multiple IR sensors [1] are combined into a specific arrangement, and the data from them is combined and processed by an algorithm specifically designed to translate the data into meaningful gestures. Gesture-based interfaces provide an intuitive way for users to specify commands and interact with computers. As mobile phones and tablets become ubiquitous, there is an increasing need for intuitive user interfaces for small-sized, resource-limited mobile devices. Most existing gesture recognition systems can be classified into three types: motion-based, touch-based, and vision-based systems. In motion-based systems, the user cannot make gestures unless holding a mobile device or an external controller.

Touch-based systems can accurately map the finger/pen positions and moving directions on the touch-screen to different commands. However, 3D gestures are not supported, because all possible gestures are confined within the 2D screen surface. While the first two types of system require users to make contact with devices, vision-based systems using cameras and computer vision techniques allow users to make intuitive gestures without touching the device. However, most vision-based systems are computationally expensive and power-consuming, which is undesirable for resource-limited mobile devices like tablets and mobile phones.

Fig: 2D gesture using an IR proximity sensor

III. REVIEW OF LITERATURE

1) Computer keyboard:

Fig 1: Computer keyboard as an input device

The keyboard is an input device which uses an arrangement of buttons or keys that act as mechanical or electronic switches. Since the decline of punch cards and paper tape, interaction via teleprinter-style keyboards has been the main input method for computers. A keyboard typically has characters printed on the keys, and each press of a key typically corresponds to a single written symbol; producing some symbols requires pressing and holding several keys simultaneously or in sequence. Most keyboard keys produce letters, numbers, or signs. Despite the development of alternative input devices, such as the mouse, touch-screen, pen devices, and voice recognition, the keyboard remains the most commonly used and most versatile device for direct (human) input into computers. In normal usage, the keyboard is used to type text and numbers into a word processor, text editor, or other program. In a modern computer, the interpretation of key presses is generally left to the software. A computer keyboard distinguishes each physical key from every other and reports all key presses to the controlling software. Keyboards are also used for computer gaming, either with regular keyboards or with keyboards that have special gaming features, which can expedite frequently used keystroke combinations. A keyboard is also used to give commands to the operating system of a computer, such as the Windows Ctrl + Alt + Del combination, which brings up a task window or shuts down the machine. Keyboards are the only way to enter commands on a command-line interface.

2) Mouse:

Fig 2: Mouse as an input device

A computer mouse has the most common standard features: two buttons and a scroll wheel, which can also act as a third button. In computing, a mouse is a pointing device that functions by detecting two-dimensional motion relative to its supporting surface. Physically, a mouse consists of an object held under one of the user's hands, with one or more buttons. The mouse sometimes features other elements, such as wheels, which allow the user to perform various system-dependent operations, or extra buttons or features that can add more control or dimensional input. The mouse's motion typically translates into the motion of a pointer on a display, which allows for fine control of a graphical user interface.

3) Touch-screen:

Fig 3: A child solves a computerized puzzle using a touchscreen

A touch screen is an electronic visual display that can detect the presence and location of a touch within the display area. The term generally refers to touching the display of the device with a finger or hand. Touch-screens are common in devices such as smart phones, tablet computers, and game consoles.

The touch-screen has two main attributes. First, it enables one to interact directly with what is displayed, rather than indirectly with a pointer controlled by a mouse or touchpad. Second, with a display that is a simple smooth surface, it allows direct interaction without any hardware (keyboard or mouse) between user and content.

4) Touch-less:

Fig 4: Touch-less operation using an IR proximity sensor

Multiple IR sensors are combined into a specific arrangement, and the data from them is combined and processed by an algorithm specifically designed to translate the data into meaningful gestures. Hand-gesture-based interfaces [4] provide an intuitive way for users to specify commands and interact with computers. To solve the existing challenges, we present a contactless gesture recognition system using only two infrared proximity sensors. We propose a set of infrared feature extraction and gesture classification algorithms. Using the system as a gesture interface, a user can flip e-book pages, scroll web pages, zoom in/out, and play games on mobile devices using intuitive hand gestures, without touching, wearing, or holding any additional devices. The design also reduces the frequency of the user's contact with the device, alleviating wear and tear on screen surfaces.
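The two-sensor idea described above can be made concrete with a short sketch: infer a swipe gesture from the order in which the two proximity sensors activate, then map the gesture to the keyboard event to be emulated. The sensor names, classification rule, and key mapping below are illustrative assumptions, not the paper's actual algorithm.

```python
# Illustrative sketch only: classify a swipe from the activation order of
# two horizontally arranged IR proximity sensors (LEFT, RIGHT), then map
# the gesture to an emulated key. Names/mapping are assumptions.

def classify_swipe(first_sensor, second_sensor):
    """Return the swipe gesture implied by the sensor activation order."""
    if (first_sensor, second_sensor) == ("RIGHT", "LEFT"):
        return "LEFT SWIPE"    # hand travelled right-to-left
    if (first_sensor, second_sensor) == ("LEFT", "RIGHT"):
        return "RIGHT SWIPE"   # hand travelled left-to-right
    return "UNKNOWN"           # unrecognized: user repeats the gesture

# Gesture -> emulated key, e.g. for flipping e-book pages as in the text.
EMULATED_KEY = {
    "LEFT SWIPE": "PAGE DOWN",   # next page
    "RIGHT SWIPE": "PAGE UP",    # previous page
}
```

The same lookup structure extends naturally to the larger gesture set defined in the implementation section.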
5) Gesture recognition system:

Fig 5: Gesture recognition system

In the figure above, the idle mode contains the sensor grid and the circuit with the stored gestures, which are connected directly. Using the sensor grid, the user can interact with the input system: if the gesture is handled by the input system, the corresponding action is performed accurately; otherwise it is reported as an unknown gesture and the user must perform the hand gesture again.

Emulation of the following keys will be done based on the gesture performed:

1) Page UP key

2) Page DOWN key
3) UP arrow key
4) DOWN arrow key
5) ENTER key
6) BACKSPACE key
7) SPACEBAR key
8) RIGHT click

Fig: Sensors arranged in a 3x3 matrix

For example, an activation order of C3 C2 C1 indicates a right-to-left hand movement, i.e. a LEFT SWIPE.

Fig: Example of a LEFT swipe

Similarly, an activation order of R1 R2 R3 indicates a top-to-bottom hand movement, i.e. SCROLL UP.

Fig: Example of the SCROLL-UP gesture

Implementation of the gestures:

1) Left Swipe or Next Page. PC equivalent key: Page Down. This is a gesture where the user swipes his palm from right TOWARDS LEFT, hence the name LEFT SWIPE. Since this gesture is similar to turning the page of a book, it is used to go to the NEXT PAGE. Hence the PC equivalent key is PAGE DOWN. Sensor activation sequence: C3 C2 C1.

2) Right Swipe or Previous Page. PC equivalent key: Page Up. This is a gesture where the user swipes his palm from left TOWARDS RIGHT, hence the name RIGHT SWIPE. Since this gesture is similar to turning back the page of a book, it is used to go to the PREVIOUS PAGE. Hence the PC equivalent key is PAGE UP. Sensor activation sequence: C1 C2 C3.

3) Down Swipe or Scroll Up. PC equivalent key: UP ARROW. This is a gesture where the user swipes his palm from the top DOWNWARDS, hence the name DOWN SWIPE. Since this gesture is similar to scrolling text upwards, it is used to SCROLL UP. Hence the PC equivalent key is the UP ARROW key. Sensor activation sequence: R1 R2 R3.

4) Upward Swipe or Scroll Down. PC equivalent key: DOWN ARROW. This is a gesture where the user swipes his palm from the bottom UPWARDS, hence the name UPWARD SWIPE. Since this gesture is similar to scrolling text downwards, it is used to SCROLL DOWN. Hence the PC equivalent key is the DOWN ARROW key. Sensor activation sequence: R3 R2 R1.

5) Hold Gesture. PC equivalent key: ENTER, i.e. the SELECT key. This is a gesture where the user HOLDS HIS HAND over the entire sensor matrix for a couple of seconds. Since this gesture is intended to cause a SELECT or ENTER operation, it is also called so. Hence the PC equivalent key is the ENTER key. Sensor activation sequence: R1 + R2 + R3, equivalently C1 + C2 + C3.

6) Forward-Reverse Gesture. PC equivalent key: Backspace. This is a gesture where the user moves horizontally forward and back over the entire sensor matrix. Since this gesture is intended to move to the previous page, the PC equivalent key is the BACKSPACE key. Sensor activation sequence: R3 R3.

7) L Gesture. PC equivalent key: Spacebar. This is a gesture where the user moves in an L shape over the entire sensor matrix. Since this gesture is intended to select a file from a folder, the PC equivalent key is the SPACEBAR key. Sensor activation sequence: C1 R3.

8) Up-Down Gesture. PC equivalent key: Right click. This is a gesture where the user moves up and down over the entire sensor matrix. Since this gesture is intended to open the right-click options, the PC equivalent key is the RIGHT CLICK. Sensor activation sequence: R1 R2 R3 R2 R1.

IV. CONCLUSION AND FUTURE WORK

CONCLUSION:

1. We have presented a contactless gesture recognition system that allows users to make gesture inputs without touching, holding, or wearing any device.
2. Using the proposed IR feature set and classifier, the system can recognize gestures with 98% precision and an 88% recall rate.
3. The low power consumption and high accuracy make the system particularly desirable for deployment on resource-limited mobile consumer devices.

FUTURE SCOPE:

Our future work is to extend the configuration to multiple sensor arrays to get more information from the sensor data. Using the basic gesture set as building blocks, we can further recognize more compound 3D gestures as permutations of the simple ones. A hidden Markov model can also be incorporated to learn the gesture sequences performed by users.

V. REFERENCES

[1] J. Clerk Maxwell, A Treatise on Electricity and Magnetism, 3rd ed., vol. 2. Oxford: Clarendon, 1892, pp. 68-73.
[2] K. Elissa, "Title of paper if known," unpublished.
[3] R. Nicole, "Title of paper with only first word capitalized," J. Name Stand. Abbrev., in press.
[4] V. I. Pavlovic, R. Sharma, and T. S. Huang, "Visual interpretation of hand gestures for human-computer interaction: a review," IEEE Trans. PAMI, 19(7):677-695, 1997.
[5] J. Kim, S. Mastnik, and E. André, "EMG-based hand gesture recognition for realtime biosignal interfacing," in Proc. IUI, 2008, pp. 30-39.
[6] A. Butler, S. Izadi, and S. Hodges, "SideSight: multi-touch interaction around small devices," in Proc. UIST, 2008, pp. 201-204.
[7] S. Kratz and M. Rohs, "HoverFlow: exploring around-device interaction with IR distance sensors," in Proc. MobileHCI, 2009, pp. 42:1-42:4.