Virtual Mouse with Color Detection Using Computer Vision

Pravin Pabale
Dr. Maithili Arjunwadkar

Abstract-- One of the important challenges in Human-Computer Interaction is to develop more perceptive and more natural interfaces. Conventionally, a user cannot interact with a computer system without a keyboard and mouse, yet various disabilities prevent many people from using these devices. In this paper we present a virtual mouse based on a color detection technique and a web camera, which can be used to interact with the computer in a more user-friendly manner. It is an alternative to the traditional input devices.

Keywords- Hand gesture, Human Computer Interaction, Color Detection, Web camera

I. INTRODUCTION

In the computer era, every individual performs most of their day-to-day tasks using computers, and the keyboard and mouse are the input devices most commonly used to operate them. However, various disabilities make continuous work with these devices difficult; in particular, impairments of the fingers are a major obstacle, since an affected person cannot operate a keyboard or mouse. Instead of those traditional input devices, the hands can be used directly as an input device, which is an innovative method of Human-Computer Interaction using a Virtual Environment (VE). In this paper we design a mechanism for a virtual mouse using computer vision, implemented with Java and MATLAB, which can be used to interact with the computer in a more user-friendly manner. It is an alternative to traditional static input devices such as the keyboard and mouse.

II. RELATED WORK

Many researchers are working on human-computer interaction in virtual environments. Abhik Banerjee et al. [1] developed an object-tracking based virtual mouse application using a webcam. The system was implemented in the MATLAB environment using the MATLAB Image Processing Toolbox. Kinjal N. Shah et al. [2] established a mechanism to interact with a computer system using computer vision; this interaction is better than the normal static keyboard and mouse. They surveyed the most innovative finger-tracking mechanisms used to interact with a computer system through computer vision. Erdem et al. [3] described a computer-vision based mouse, which can control and command the cursor (e.g., click a mouse button) using a camera. A click of the mouse button was implemented by defining a screen region such that a click occurred when the user's hand passed over it. Chu-Feng Lien [4] used only the fingertips to control both the movement of the mouse cursor and mouse clicks. In his approach the user needs to hold the mouse cursor on the desired spot for a specific period of time to perform a click, which is based on image density. Hojoon Park [5] showed how to build such a mouse control system. In his paper he used a camera and computer vision techniques, such as image segmentation and gesture recognition, to control mouse tasks (left and right clicking, double-clicking, and scrolling), and he showed how the system can perform everything current mouse devices can. Kamran Niyazi et al. [6] developed a similar method using a camera and computer vision techniques such as image segmentation and gesture recognition to control mouse tasks, and likewise showed that it can perform everything current mouse devices can. Y. Sato et al. [7] proposed a tracking method that is shown to be effective even in challenging situations, demonstrated with an augmented desk interface. Paul Robertson et al. [8] described a vision-based virtual mouse interface that uses a robotic head, visual tracking of the user's head and hand positions, and recognition of hand signs to control an intelligent kiosk.

III. PROPOSED SYSTEM

In our work, we control mouse cursor movement and click events using a web camera mounted on the laptop. We use color detection techniques, implemented in Java, for the mouse cursor movements and click events. The system takes its input from colored tapes worn by the user.

A. Mouse Control Algorithm

Human-Computer Interaction through real-time hand tracking and gesture recognition is an appealing option for replacing primitive HCI based on a mouse or touchpad. We propose a system that uses a webcam to track the user's hand and recognize gestures to initiate specific interactions [1]. The contribution of our work is to implement a system for hand tracking and simple gesture recognition in real time. The basic algorithm is shown below.

Step 1. Capture real-time video from the web camera.
Step 2. Take a snapshot from the real-time video.
Step 3. Filter the color and process the image.
Step 4. Perform the corresponding mouse operation according to the detected color.

While the webcam is taking images from the real-time video, several operations are performed on each image. First the color is extracted and the image is filtered; after that it is converted into a binary image. We simulate the left-click and right-click events of the mouse by assigning different color pointers. After these processing steps, we check for a particular color and perform the corresponding operation. If no color is found, we take a new image from the real-time video and repeat the same operations in a loop. These three colors let the user achieve complete mouse functionality with hand gestures, as shown below.

Colour   Purpose
Red      Moving the mouse cursor
Blue     Right click operation
Green    Left click operation

The centroid is calculated from the position of the color tape in the image and used to perform the various activities. We use the following algorithm to implement the virtual mouse in Java:

Step 1. Capture the real-time video using the web camera.
Step 2. Obtain the images frame by frame from the real-time video; processing is done on each individual frame.
Step 3. Filter each frame and convert it into a gray-scale image.
Step 4. Detect the colors (red, green and blue) in the image and extract them.
Step 5. After detecting a color, convert the image into a binary image.
Step 6. Calculate the centroid of the image region in order to track the mouse pointer. If the color is red, compute the x and y coordinates of the region's centroid and move the mouse to that position. If the color is blue, perform a right-click at the current location of the mouse pointer. If the color is green, perform a left-click.

The following screenshots show the application we developed. As they illustrate, a user can obtain complete mouse functionality using three color bands. The red color band moves the mouse pointer with the help of a java.awt.Robot object: Figure (3.1.1) shows that when the webcam detects the red color tape in front of it, the mouse-move functionality is triggered. Figure (3.1.2) shows the blue color band triggering the right-click operation, and Figure (3.1.3) shows the green color band triggering the left-click operation. With these color bands the user can handle the cursor without touching the mouse.
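The per-frame processing (color filtering, conversion to a binary image, and centroid computation) is described above only in prose. The following is a minimal sketch of how steps 3-6 could look in Java; the helper names (ColorTracker, binaryMask, centroid), the channel-dominance test used as the color filter, and the margin parameter are illustrative assumptions rather than the exact implementation, and capturing frames from the webcam is left to a separate capture library.

```java
import java.awt.Point;
import java.awt.image.BufferedImage;

public class ColorTracker {

    /**
     * Builds a binary mask of the pixels whose dominant channel matches the
     * requested colour (0 = red, 1 = green, 2 = blue). The margin controls how
     * strongly the channel must exceed the other two before a pixel is accepted.
     */
    static boolean[][] binaryMask(BufferedImage frame, int channel, int margin) {
        int w = frame.getWidth(), h = frame.getHeight();
        boolean[][] mask = new boolean[h][w];
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                int rgb = frame.getRGB(x, y);
                int[] c = { (rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF };
                int other1 = c[(channel + 1) % 3];
                int other2 = c[(channel + 2) % 3];
                mask[y][x] = c[channel] > other1 + margin && c[channel] > other2 + margin;
            }
        }
        return mask;
    }

    /** Centroid of the "on" pixels of a binary mask, or null if the colour is absent. */
    static Point centroid(boolean[][] mask) {
        long sumX = 0, sumY = 0, count = 0;
        for (int y = 0; y < mask.length; y++) {
            for (int x = 0; x < mask[y].length; x++) {
                if (mask[y][x]) { sumX += x; sumY += y; count++; }
            }
        }
        return count == 0 ? null : new Point((int) (sumX / count), (int) (sumY / count));
    }
}
```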
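Cursor movement and clicks are performed with a java.awt.Robot object, as noted above. The sketch below shows one way the detected centroids could be mapped to Robot actions; the class name VirtualMouse, the margin value of 40, and the camera-to-screen scaling are assumptions made for illustration.

```java
import java.awt.AWTException;
import java.awt.Dimension;
import java.awt.Point;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.event.InputEvent;
import java.awt.image.BufferedImage;

public class VirtualMouse {
    private final Robot robot;

    public VirtualMouse() throws AWTException {
        robot = new Robot();
    }

    /** Applies the mouse action associated with the colour band found in one frame. */
    void handleFrame(BufferedImage frame) {
        Dimension screen = Toolkit.getDefaultToolkit().getScreenSize();
        Point red   = ColorTracker.centroid(ColorTracker.binaryMask(frame, 0, 40));
        Point green = ColorTracker.centroid(ColorTracker.binaryMask(frame, 1, 40));
        Point blue  = ColorTracker.centroid(ColorTracker.binaryMask(frame, 2, 40));

        if (red != null) {
            // Red band: rescale the camera-space centroid to screen space and move the cursor.
            int sx = red.x * screen.width  / frame.getWidth();
            int sy = red.y * screen.height / frame.getHeight();
            robot.mouseMove(sx, sy);
        } else if (blue != null) {
            // Blue band: right click at the current pointer position.
            robot.mousePress(InputEvent.BUTTON3_DOWN_MASK);
            robot.mouseRelease(InputEvent.BUTTON3_DOWN_MASK);
        } else if (green != null) {
            // Green band: left click at the current pointer position.
            robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
            robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
        }
        // If no colour band is found, the caller simply grabs the next frame and tries again.
    }
}
```

Robot.mouseMove expects screen coordinates, so the centroid found in camera coordinates is rescaled to the screen resolution before the cursor is moved.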
B. Snapshot Algorithm

Using the snapshot gesture, the user can take a snapshot from the real-time video. It requires two color bands, one red and one green. We set a specific threshold on the distance (gap) between them; when the distance between the two color bands is less than the specified value, the snapshot process is performed. The algorithm is as follows:

Step 1. Capture the real-time video using the web camera.
Step 2. Process each individual image frame.
Step 3. Convert each frame to a gray-scale image.
Step 4. Detect and extract the different colors (RGB) from the gray-scale image.
Step 5. Convert the detected image into a binary image.
Step 6. If both red and green are present, calculate the distance between them: x = xa - xb, y = ya - yb.
Step 7. If x + y <= 75, go to step 8; otherwise go to step 1.
Step 8. Perform the snapshot operation and store the image on the hard disk.
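A minimal sketch of steps 6-8 in Java, reusing the ColorTracker helpers from the earlier sketch, is given below; the absolute differences for x and y (so the check does not depend on which band is on the left), the class name SnapshotGesture, and the output file-name pattern are assumptions for illustration, not taken from the paper.

```java
import java.awt.Point;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class SnapshotGesture {
    private static final int THRESHOLD = 75;   // pixel distance used in step 7
    private int counter = 0;

    /** Saves the frame when the red and green bands are close enough together. */
    boolean maybeTakeSnapshot(BufferedImage frame) throws IOException {
        Point red   = ColorTracker.centroid(ColorTracker.binaryMask(frame, 0, 40));
        Point green = ColorTracker.centroid(ColorTracker.binaryMask(frame, 1, 40));
        if (red == null || green == null) {
            return false;                       // a band is missing: grab the next frame
        }
        int dx = Math.abs(red.x - green.x);     // x = xa - xb
        int dy = Math.abs(red.y - green.y);     // y = ya - yb
        if (dx + dy > THRESHOLD) {
            return false;                       // bands are too far apart
        }
        File out = new File("snapshot_" + (counter++) + ".png");  // hypothetical file name
        ImageIO.write(frame, "png", out);       // store the snapshot on the hard disk
        return true;
    }
}
```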

Steps of taking snapshot: To take a snapshot from the real-time video, we first capture the image using the web camera. These images are processed and the result is stored on the hard disk of the user's computer. The processing steps are:

a) Filter each image taken from the real-time video and convert it into a gray-scale image.
b) Detect the colors (red and green) in the image and extract them.
c) After detecting the colors, convert the image into a binary image.
d) Calculate the centroid of each image region for the purpose of tracking.

After these steps we calculate the distance between the two color tapes, i.e., the red and the green one, as the difference between their centroids. If the calculated value is within the threshold, we execute the snapshot code and store the image on the hard disk. If the two colors (red and green) are not found in the image, we take a new frame from the real-time video and repeat the same operations in a loop. Figure (3.2.1) shows how snapshots are taken. The user takes a snapshot with the gesture shown in that figure: the system first detects the red and green colors in front of the webcam and, if both are present, calculates their centroids. We set a threshold value, and when both color bands come within that range the snapshot operation is performed and the image is stored on the hard disk under an image name. Taking a snapshot is thus based entirely on the detection of the two colors (red and green) in front of the webcam and the actual distance between them: if the distance between the two color markers is less than the chosen threshold value, a snapshot is taken and stored on the user's computer.

IV. CONCLUSION

In the computer era, Human-Computer Interaction (HCI) through computer vision is a more convenient way to interact with the computer. Every individual performs most of their day-to-day tasks on computers using traditional input devices such as the keyboard and mouse, and users with finger disabilities have major problems working with them. In this paper, a virtual mouse application using a color detection technique has been developed in Java using a webcam. It is very useful for disabled users. In the future we will add voice output for every operation, so that the virtual mouse together with voice output will form a more robust application for disabled users.

V. ACKNOWLEDGMENT

I would like to pay special thankfulness, warmth and appreciation to the persons below, who made my software development successful and assisted me at every point to cherish my goal: my guide, Dr. (Ms.) Maithili Arjunwadkar, for her vital support and assistance; her encouragement made it possible to achieve the goal. My Mom and Dad, family members and friends, without whom I was nothing; they not only assisted me financially but also extended their support morally and emotionally.

VI. REFERENCES

1. Abhik Banerjee, Abhirup Ghosh, Koustuvmoni Bharadwaj, Hemanta Saikia, "Mouse Control using a Web Camera based on Colour Detection", International Journal of Computer Trends and Technology (IJCTT), 9(1), 2014, 15-20.
2. K. N. Shah, K. R. Rathod and S. J. Agravat, "A Survey on Human Computer Interaction Mechanism Using Finger Tracking", International Journal of Computer Trends and Technology, 7(3), 2014, 174-177.
3. A. Erdem, E. Yardimci, V. Atalay, A. E. Cetin, "Computer vision based mouse", Acoustics, Speech, and Signal Processing, Proceedings (ICASSP), IEEE International Conference, 2002.
4. Chu-Feng Lien, "Portable Vision-Based HCI - A Real-time Hand Mouse System on Handheld Devices", National Taiwan University, Computer Science and Information Engineering Department.
5. Hojoon Park, "A Method for Controlling the Mouse Movement using a Real Time Camera", Brown University, Providence, RI, USA, Department of Computer Science, 2008.
6. Kamran Niyazi, Vikram Kumar, Swapnil Mahe, Swapnil Vyawahare, "Mouse Simulation Using Two Coloured Tapes", Department of Computer Science, University of Pune, India, International Journal of Information Sciences and Techniques (IJIST), 2(2), March 2012.
7. Y. Sato, Y. Kobayashi, H. Koike, "Fast tracking of hands and fingertips in infrared images for augmented desk interface", Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition (FG), 2000, pp. 462-467.
8. Robertson P., Laddaga R., Van Kleek M., "Virtual mouse vision based interface", January 2004.

AUTHOR'S PROFILE

Mr. Pravin Pabale: He is pursuing his Master's in Computer Application under the Engineering Faculty at P.E.S. Modern College of Engineering, Shivajinagar, Pune-5. He is currently in the second year.

Prof. Dr. (Ms.) Maithili Arjunwadkar: She is working as a Professor in the MCA (Engg.) department at P.E.S. Modern College of Engineering, Shivajinagar, Pune-5. She did her Ph.D. at Symbiosis International University under the Faculty of Computer Studies. She has published more than 8 papers in various international journals.

Fig. (3.1.1): Moving the mouse cursor using the red color tape.
Fig. (3.1.2): Performing the right-click operation using the blue color tape.

Fig. (3.1.3): Performing the left-click operation using the green color tape.
Fig. (3.2.1): Performing the snapshot operation.