U-Remo: Projection-assisted Gesture Control for Home Electronics


Kaori UJIMA, Department of Computer Science, Ochanomizu University, ujima.kaori@is.ocha.ac.jp
Azusa KADOMURA, Department of Computer Science, Ochanomizu University, azusa@is.ocha.ac.jp
Itiro SIIO, Department of Computer Science, Ochanomizu University, siio@acm.org

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the owner/author(s). CHI 2014, Apr 26 - May 01, 2014, Toronto, Ontario, Canada. ACM 978-1-4503-2474-8/14/04. http://dx.doi.org/10.1145/2559206.2581215

Abstract
Various home appliances and electronic devices in the home, such as air conditioners and televisions, require remote control, and the number of household remote control devices is increasing. However, the users of remote control devices sometimes experience stress because they might forget where they left a device, or they might have problems selecting the correct control from an array of confusing devices. Thus, we propose an innovative control method for home electronics that detects user gestures, where the prompts for the gestures are projected onto the user's body. We also designed gestures and a GUI that are highly intuitive and easy to understand. We refer to this system as U-Remo (ubiquitous + "you are the remote control"). As an example of this method, we applied the U-Remo system to an air conditioner. We developed a prototype system comprising a device embedded in an air conditioner, a depth sensor camera, and a projector, which allows the detection of user actions and the provision of graphical feedback based on the user's actions.
Author Keywords: Air conditioner; Gesture; Home appliance; Projection; Remote control

ACM Classification Keywords: H.5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous.

Introduction
Most current home appliances and electronic products, such as air conditioners and televisions, are operated by a remote control device. However, the users of remote control devices sometimes experience stress. For example, it is easy to forget where a remote control was left, and it might be difficult to identify the correct remote control among an array of confusing remote controls. Understanding the correct buttons to press on a remote control can also be non-intuitive. A possible solution to these problems is the device-free operation of home appliances without the need for remote controls. It is also important to design intuitive and readily understandable systems. In this study, we propose an innovative remote control system, which we call U-Remo, i.e., ubiquitous + "you are the remote control" (Figure 1). This system detects user actions and projects usage instructions onto the user's body, which prompt the correct gestures required to operate home appliances. As an example of the use of this system, we applied U-Remo to an air conditioner.

Figure 1: Basic usage of U-Remo

Related Work
Many studies have proposed various prototypes and products for operating home electronic devices in a device-free and intuitive manner. We categorize previous research into four types: (1) gaze operation; (2) voice and sound operation, such as claps; (3) gesture operation; and (4) operation via a projection system. For example, Jurek [1] proposed a system that recognizes the user's point of gaze using an eyeglass-type device with an embedded infrared camera, which operates a cursor on the menu screen of a television. The increasing availability and accuracy of eye gaze detection equipment has encouraged its use for investigation and control applications. Novel methods have also been presented for navigating and inspecting extremely large images solely or primarily using eye gaze control [2].
The advantage of gaze operation compared with gesture recognition is that major body movements are not required. However, a disadvantage is that it is necessary to wear a device, such as glasses, to achieve sufficient accuracy in practical implementations for the remote control of appliances. Some commercial products also employ voice operation, including the Daiseikai VOiCE series air conditioner (TOSHIBA)^1. Various products are based on gesture operation, including a television (DMP-HV200^2) produced by Panasonic, where remote control of TV channel selection and volume adjustment is facilitated by the detection of the user's hand gestures by two infrared sensors at the top of the television screen. FreeDigiter [3] is another interface that allows the rapid entry of digits using finger gestures. An advantage of operation based on gestures and sounds is that appliances can be controlled without a device. However, the user has to remember the correct words or gestures to trigger the desired operations. Some device-free methods use hand recognition and projection onto the user. For example, PALMbit [4] projects an operation menu onto the user's palm, and responses are selected from the menu via finger recognition. Skinput [5] is a wearable bio-acoustic sensor, which appropriates the human body as an input device. Vatavu [6] developed a system that projects additional controls and widgets onto the wall near the television, although this system is not device-free. An advantage of operating via projected images is that it is easy to understand how to operate the system because prompts are projected. A disadvantage is that the projector tends to be bulky.

1 http://www.toshiba.co.jp/living/air_conditioners/pickup/edr/
2 http://ctlg.panasonic.com/jp/bd-dvd/bd-dvd/portable-tv/dmp-HV200.html

Figure 2: System architecture

U-Remo
In the present study, we developed a prototype remote control device called U-Remo using a depth camera and a projector, which were embedded in an air conditioner unit. The U-Remo system projects graphical prompts onto the user's body, detects the user's hand gestures via the depth camera, and provides visual feedback on the user's body. The recognized gesture is used to control the air conditioner. The user does not need to hold or wear a device. Among the possible device-free methods available, we selected gesture operation because it is facilitated by the inexpensive depth camera components found in Microsoft's Kinect.
As mentioned above, one of the difficulties of gesture operation is that users have to learn and remember the gestures. Thus, the projection of graphical prompts helps users to learn and remember gestures. Projection-based interaction tends to be bulky because of the size of the projector and the requirement for projection surfaces such as a wall, table, or floor. However, the bulk of the projector can be reduced by using LED or laser light sources. We also designed our system to project onto the user's body instead of walls or floors, which are not always suitable in actual rooms. Projecting onto the user's body also has the advantage that the projector can be combined with a depth sensor with an identical optical alignment. This makes the overall system compact and allows it to be embedded in home appliances and electronic devices, which reduces production and maintenance costs.

Figure 3: Main menu with four function components

Figure 4: Gestures in the main menu

Implementation
The U-Remo system uses an Xtion Pro depth camera (ASUS), an iremocon (Glamo Inc.), a projector (SONY VPL-CX86), an air conditioner (TOSHIBA RAS-221E), and a PC (Apple MacBook Air). The Xtion Pro is a depth camera that measures depth in the same way as the Kinect described above. The iremocon is a PC-controllable remote controller that sends infrared signals to the air conditioner; remote-control commands can be sent to it from the computer via the network. Figure 2 shows the system overview. The system projects prompt graphics onto the user's body through the projector. Beside the projector, we placed the depth camera (Xtion) to detect the user's gestures. Depth images are processed by the OpenNI middleware, and a program written in the Processing language receives the user's position and skeleton and recognizes the user's gestures. The program then sends a command corresponding to the specific gesture to the iremocon unit, which sends infrared signals to the air conditioner. The user's position and recognized gestures are also used to generate the prompt graphics, so that they appear on the user's body and interact with the user's gestures. The content of the prompt graphics is described in the following section.

Application
In this study, we applied the U-Remo system to an air conditioner, which is typically operated using a remote control in the home. We designed an intuitive and readily understandable gesture system and GUI to operate an air conditioner, as a possible application of our novel method. The gestures can be performed using both hands to improve the recognition performance. First, when a user stands in front of U-Remo and raises their left hand, the main menu image is projected onto the user's body, as shown in Figures 1 and 3.
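The implementation pipeline described above (depth image, then skeleton, then gesture, then infrared command) can be sketched as follows. This is an illustrative Python sketch, not the authors' Processing code; the joint-coordinate format, thresholds, host address, and wire protocol are all assumptions.

```python
# Illustrative sketch of the U-Remo control flow (not the authors' code):
# classify a coarse hand-raise gesture from tracked joint heights, then
# forward the result as a text command to the networked IR transmitter.
import socket

def recognize_menu_gesture(left_hand_y, right_hand_y, shoulder_y, margin=0.1):
    """Return which hand is raised above shoulder level, if any.
    Coordinates are assumed normalized, with larger y meaning higher."""
    if left_hand_y > shoulder_y + margin:
        return "left_hand_raised"
    if right_hand_y > shoulder_y + margin:
        return "right_hand_raised"
    return None

def send_command(gesture, host="192.168.0.10", port=51013):
    """Forward a recognized gesture to the IR unit over TCP.
    The address, port, and newline-delimited format are placeholders;
    the real system uses the iremocon's own network API."""
    if gesture is None:
        return
    with socket.create_connection((host, port), timeout=2) as sock:
        sock.sendall((gesture + "\n").encode("ascii"))
```

In the actual system the skeleton joints would come from the OpenNI tracker rather than plain floats; the sketch only illustrates the shape of the recognition-then-dispatch loop.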
Thus, this hand-raising gesture indicates "Hello." The main menu has four components: (1) ON/OFF for the power supply, (2) temperature adjustment, (3) fan speed, and (4) fan direction. The user can select each function by moving their left or right hand over the main menu image from the left or right side of their body and then moving it up or down. For example, as shown in Figure 4 (upper side), the user can operate the power or fan direction functions by moving their left hand from the left side to the center of their body and then moving up (power) or down (fan direction). As shown in Figure 4 (lower side), the user can operate the temperature or fan speed by moving their right hand to the center of their body and moving up (temperature) or down (fan speed). Next, an instruction image is projected onto the user's body after a function is selected, except for the power function. Instructive animations are available for beginners or users who cannot remember how to make the gestures. Experienced users can ignore these animations and perform the requisite gestures. The main menu can also be skipped before adjusting the temperature and the fan, so experienced users can perform operations rapidly. Our system projects step-by-step prompts and provides feedback on gestures on the user's body. Thus, even a beginner can visually recognize the necessary actions.
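The main-menu selection rules above (which hand moved to the body's center, and whether it then moved up or down) amount to a small lookup table. A minimal sketch, with hypothetical labels not taken from the actual system:

```python
# Minimal sketch of the main-menu mapping described above. Each (hand,
# direction) pair selects one of the four functions; labels are illustrative.
MAIN_MENU = {
    ("left", "up"): "power",            # ON/OFF for the power supply
    ("left", "down"): "fan_direction",
    ("right", "up"): "temperature",
    ("right", "down"): "fan_speed",
}

def select_function(hand, direction):
    """Map a (hand, direction) pair to a menu function, or None."""
    return MAIN_MENU.get((hand, direction))
```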

Power Supply: As shown in Figure 3, the power switch icon is located on the upper left of the user. Thus, the power toggles ON and OFF when the user moves their left hand to the center of the body and moves it upward. This icon constantly displays the current ON or OFF status.

Fan Direction: As shown in Figure 3, the fan direction icon is located on the lower left of the user. After it has been selected by moving the left hand down, the user can set the fan direction with the gestures shown in Figure 5. The user can choose from three fan directions by changing the angle of their arm, as shown in Figure 6. The pink arrows in Figure 6 show the directions of the gestures.

Temperature: As shown in Figure 3, the thermometer icon is located on the upper right of the user. After it is selected by moving the right hand up, the user can set the temperature with the gestures shown in Figure 7. To increase the temperature, the user moves both hands to the sides of their body and raises them, as shown in Figure 7 (right). The user performs the gesture in a downward direction to reduce the temperature. The pink arrows in Figure 7 show the directions of the gestures.

Fan Speed: As shown in Figure 3, the fan icon is located on the lower right of the user. When it is selected from the main menu, the user can set the fan speed with the gestures shown in Figure 8. As shown in Figure 9, the fan speed becomes stronger when the user pushes both hands forward, away from the body, whereas it becomes weaker when the user holds both hands in front of the chest, as shown in Figure 9 (left).

Figure 5: Image prompt for the fan direction

Figure 6: Fan direction gestures

Figure 7: GUI and temperature gestures
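The two-handed adjustment gestures described above can be sketched as simple classifiers over tracked hand positions. This is an illustrative sketch only, not the authors' implementation; the coordinate conventions and thresholds are assumptions.

```python
# Illustrative classifiers for the adjustment gestures described above.
# Coordinates are assumed normalized; larger y means higher, and larger z
# means farther from the body. Thresholds are arbitrary placeholders.

def classify_temperature(prev, cur, delta=0.1):
    """Both hands rising together -> raise temperature; both falling -> lower.
    prev and cur are (left_y, right_y) pairs from consecutive frames."""
    (lp, rp), (lc, rc) = prev, cur
    if lc - lp > delta and rc - rp > delta:
        return "temperature_up"
    if lp - lc > delta and rp - rc > delta:
        return "temperature_down"
    return None

def classify_fan_speed(left_z, right_z, chest_z, reach=0.3):
    """Hands pushed forward beyond the chest -> stronger fan;
    hands held near the chest -> weaker fan."""
    if left_z > chest_z + reach and right_z > chest_z + reach:
        return "fan_stronger"
    if abs(left_z - chest_z) < reach / 2 and abs(right_z - chest_z) < reach / 2:
        return "fan_weaker"
    return None
```

Requiring both hands to move together, as the paper notes, makes the gestures more robust: a single swinging arm during ordinary movement will not trigger an adjustment.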

Figure 8: Fan speed GUI

Figure 9: Fan speed gestures

Conclusion and Future Works
In this study, we developed an innovative home electronics remote control method called U-Remo. This method employs user gestures for control, with gesture prompts projected onto the user's body. We also produced a prototype air conditioner control system based on the U-Remo method. In addition, we conducted a performance evaluation of the U-Remo system with four people (age: 20-33 years; height: 155-175 cm) and found that our system could be operated after only a brief description was provided. As future work, we plan to conduct usability tests with actual users, from children to the elderly, in a real house. We will also apply our method to various other types of home appliances and electronic devices besides air conditioners to explore this innovative control interface.

Acknowledgements
This project was supported by Samsung Electronics Japan Co. Ltd. We express our gratitude to Mitsuhiro Shigeri and Munetaka Nishihira of Samsung Electronics Japan Co. Ltd. We would also like to express our appreciation to Rumi Takahashi, Otto Lindqvist, and Eri Masuda for their technical and design support.

References
[1] Jurek B., Christian L., and Klaus B. Implementing Gaze Control for Peripheral Devices. In Proceedings of PETMEI '11, pp. 3-8, ACM, 2011.
[2] Nicholas A., Mark W., and Robert S. The Inspection of Very Large Images by Eye-gaze Control. In Proceedings of AVI '08, pp. 111-118, ACM, 2008.
[3] Christian M., Matt A., and Thad S. FreeDigiter: A Contact-Free Device for Gesture Control. In Proceedings of ISWC '04, pp. 18-21, IEEE Computer Society, 2004.
[4] Goshiro Y., Huichuan X., Kazuto I., and Kosuke S. PALMbit-Silhouette: A User Interface by Superimposing Palm-Silhouette to Access Wall Displays. In Proceedings of HCI International, pp. 281-290, Springer-Verlag, 2009.
[5] Harrison C., Tan D., and Morris D. Skinput: Appropriating the Body as an Input Surface. In Proceedings of CHI '10, pp. 453-462, ACM, 2010.
[6] Radu-Daniel V. There's a World Outside Your TV: Exploring Interactions Beyond the Physical TV Screen. In Proceedings of EuroITV '13, pp. 143-152, ACM, 2013.