Information Select and Transfer Between Touch Panel and Wearable Devices Using Human Body Communication

Yuto Kondo, Shin Takahashi, and Jiro Tanaka
Department of Computer Science, University of Tsukuba, 1-1-1 Tennodai, Tsukuba, Ibaraki, Japan
kondoy@iplab.cs.tsukuba.ac.jp, {shin,jiro}@cs.tsukuba.ac.jp

Abstract. This paper proposes a technique to enable the simple transfer of information between a computer with a large touch-panel display, such as a tabletop PC, and another computer, typically one worn by the user. With our technique, the user touches an intended item displayed on the panel to select it and transfer it to his or her device. We describe some illustrative usage scenarios and outline a prototype system that can communicate image data between a tabletop PC and a wearable device. We conducted preliminary experiments to evaluate the system's user interface and interviewed the test subjects about the prototype.

1 Introduction

With the drop in the cost of computers, people now often have the opportunity to use multiple devices simultaneously. Users typically own and operate various types of PCs, tablets, and smartphones. In addition, they occasionally encounter public access terminals such as information kiosks, ATMs, and digital signage. It is desirable to establish communication between such devices, but enabling them to function cooperatively is often difficult and requires tedious configuration. More user-friendly interfaces are needed for connecting multiple devices and using them in cooperation.

This paper proposes a method to enable the straightforward transfer of information between a computer with a large touch-panel display, such as a tabletop PC, and another device, typically one worn by the user. The technique uses the human body communication protocols proposed by Zimmerman [1], which utilize the human body as part of an electronic circuit. With our technique, the user touches an item on the display to select it and transfer it to his or her computer. Upon touching the panel, the two devices are automatically connected, and the selected information is transmitted directly through the user's body. The user does not need to configure the connections between devices in advance, and can easily transfer data between them.

© Springer International Publishing Switzerland 2015
M. Kurosu (Ed.): Human-Computer Interaction, Part II, HCII 2015, LNCS 9170, pp. 208-216, 2015.
DOI: 10.1007/978-3-319-20916-6_20

2 Selection and Transfer by Touch

2.1 Basic Interaction Technique

We propose a method for information transfer between multiple devices with a touch panel, utilizing human body communication. Figure 1 illustrates the proposed technique. Icons that represent information such as a file or an image are displayed on the large touch screen of a tabletop PC. The user has a wearable device on his or her wrist, and the device has a small touch screen, much like a smartphone. To transfer information from the tabletop PC to the wearable, the user selects an icon by touching it with a finger. The system automatically connects the two devices and transmits the data represented by the selected icon. Alternatively, to transfer information from the wearable device to the tabletop PC, the user first selects an icon on the wearable and then touches the screen of the tabletop PC.

Fig. 1. Overview of our proposed method: transfer by touching the interactive screen.

2.2 Example Use of the Interaction Technique

Combining with Multitouch and Gesture Operations. Various types of touch interaction can be used with our technique. For example, multiple pieces of information can be selected and transferred simultaneously by touching multiple icons. In addition, multi-touch operations can be used: the user designates a range with his or her fingers, and items within the range are selected and transferred simultaneously (a sketch of this range selection appears below). Images transferred to a device can also be scaled with multi-touch gestures.

Receiving Streaming Data. In our method, devices can communicate only while the user is touching the screen. We can take advantage of this characteristic. For example, video can be streamed only while the user is touching the screen. This may be useful when presenting promotional material to specific customers, or when offering coupons corresponding to the time a visitor spent watching the video.
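As a concrete illustration of the range selection mentioned above, the following Python sketch selects every icon whose centre lies inside the rectangle spanned by two simultaneous touch points. It is not from the paper; the Icon record, the coordinates, and the selection rule are assumptions made purely for illustration.

    # Illustrative sketch: range selection with two touch points.
    from dataclasses import dataclass

    @dataclass
    class Icon:                  # hypothetical icon record on the tabletop screen
        name: str
        x: float                 # centre coordinates in screen pixels
        y: float

    def icons_in_range(touch_a, touch_b, icons):
        """Return the icons inside the axis-aligned box defined by two touch points."""
        (ax, ay), (bx, by) = touch_a, touch_b
        left, right = min(ax, bx), max(ax, bx)
        top, bottom = min(ay, by), max(ay, by)
        return [i for i in icons if left <= i.x <= right and top <= i.y <= bottom]

    # Example: fingers at (100, 100) and (400, 300) select the first two icons.
    icons = [Icon("photo1", 150, 200), Icon("photo2", 380, 120), Icon("photo3", 600, 500)]
    assert [i.name for i in icons_in_range((100, 100), (400, 300), icons)] == ["photo1", "photo2"]

The icons selected this way could then all be transferred in a single touch session, as described above.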

Transferring a User's Profile to Customize Interactions. With our technique, a touch operation can be used to transfer user profiles from a consumer device to a public terminal, and vice versa, enabling customized interaction. Information can be recommended based on the user's personal preferences. For example, if a user is searching for a restaurant at an information kiosk, he or she can provide his or her preferences by touching the screen. When a restaurant has been decided upon, a map can be transferred to the user's device by touching the restaurant's icon.

3 Prototype Implementation

3.1 Overview

Figure 2 shows an overview of the prototype implementation. The system consists of two parts: a device worn by the user (Fig. 2, right) and a tabletop PC with a touch panel screen (Fig. 2, left). When the user interacts with the tabletop PC, these two parts are connected via the user's body, and a communication link is established. For example, when an image selected by the user is transferred from the tabletop PC to the wearable device, it is sent via the transmit/receive (TX/RX) module of the tabletop PC, through the transparent conductive sheet, and into the user's body. Copper foil attached to the wearable couples the signal into the device via a second TX/RX module. This path can be used for bidirectional communication between the wearable device and the tabletop PC.

Fig. 2. Overview of the data communication path established when a user touches the tabletop PC display.

3.2 Wearable Devices

Figure 3 shows a prototype wearable device designed to be worn on the wrist (Fig. 3(a)).

A smartphone (Samsung Galaxy S II LTE) was used for interaction with the user. An external TX/RX circuit was connected to the smartphone via copper foil fitted to the back of the wearable device, which allowed signals to be coupled into the human body (Fig. 3(b)).

Fig. 3. The prototype wearable device.

3.3 TX/RX Circuit Board

Figure 4 shows a schematic of the TX/RX circuit board that was used for both the tabletop PC and the wearable device. The transmission (TX) circuit comprised a crystal oscillator and an AND logic gate. Data were sent using amplitude shift keying (ASK) modulation, with a 1 MHz carrier wave generated by the crystal oscillator. The receive (RX) circuit comprised a band-pass filter, two operational amplifiers, an envelope-detection circuit, and a comparator. Received signals were amplified, demodulated, and read by an Arduino microcontroller. The signal path was half-duplex, so only one circuit could operate in TX mode at any one time, a condition enforced by using relays. For example, when data were transmitted from the tabletop PC to the wearable device, the TX module of the PC and the RX module of the wearable were activated. The prototype implementation of the TX/RX circuits used a breadboard; printed circuit boards were being fabricated at the time of writing.

Fig. 4. A schematic circuit diagram of the TX/RX modules.
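To make the modulation scheme concrete, the following Python sketch simulates what the TX and RX chains in Sect. 3.3 do: the data line gates a 1 MHz carrier (on-off keying, a form of ASK), and the receiver recovers the bits with a crude envelope detector and comparator. This is an illustration only; the simulation sample rate, bit rate, and comparator threshold are assumptions, not values taken from the circuit.

    # Sketch of ASK (on-off keying) as produced by gating a 1 MHz carrier with the data line.
    import numpy as np

    CARRIER_HZ = 1_000_000    # 1 MHz carrier from the crystal oscillator
    BIT_RATE = 17_000         # roughly the data rate reported later in Sect. 3.7
    FS = 8_000_000            # simulation sample rate (assumption)

    def ask_modulate(bits):
        """Return the gated-carrier waveform for a sequence of 0/1 bits."""
        spb = FS // BIT_RATE                                                # samples per bit
        t = np.arange(len(bits) * spb) / FS
        carrier = (np.sign(np.sin(2 * np.pi * CARRIER_HZ * t)) + 1) / 2     # square carrier
        gate = np.repeat(np.asarray(bits, dtype=float), spb)                # the AND gate
        return carrier * gate

    def ask_demodulate(signal):
        """Envelope-detect per bit period and threshold, mirroring the RX comparator."""
        spb = FS // BIT_RATE
        frames = signal[: len(signal) // spb * spb].reshape(-1, spb)
        envelope = frames.mean(axis=1)            # crude envelope detection
        return (envelope > 0.25).astype(int)      # comparator threshold (assumption)

    data = [1, 0, 1, 1, 0, 0, 1, 0]
    assert list(ask_demodulate(ask_modulate(data))) == data

In the real circuit, the same roles are played by the crystal oscillator and AND gate on the TX side, and by the band-pass filter, amplifiers, envelope detector, and comparator feeding the Arduino on the RX side.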

3.4 Tabletop PC with a Touch Panel

The tabletop PC, fitted with a touch panel screen (Fig. 5), communicated with the wearable device via the screen's surface while the user was touching it. The touch panel screen was covered with a transparent conductive sheet connected to the PC via the TX/RX circuit board.

Fig. 5. A tabletop PC with a touch panel screen, and the TX/RX circuit board.

A PQ Labs multitouch sensor frame (http://multitouch.com) was used to detect the user's touch gestures on the 60-inch display. Since infrared sensors were used to detect the point of contact, interference between the display unit and the touch operation was avoided.

3.5 Data Transmission Protocol

Upon user contact with the screen, an electrical connection between the tabletop PC and the wearable device was established. At this moment, the system attempted to establish a logical data connection using the following handshake method (Fig. 6(a)). First, a synchronization packet (SYN) was sent by the TX module. Upon reception of the SYN packet, the RX module returned a synchronization acknowledgement packet (SYN/ACK), to which the TX module replied with an ACK packet. Upon receiving the ACK at the RX module, a logical connection was established and the TX module began transmission.

If the TX module did not receive a SYN/ACK after transmitting a SYN packet, the SYN was resent several times. This situation was typically caused by insufficient contact between the user and the screen; transmission was successful in most cases where firm contact was achieved.

The protocol for terminating the communication link was similar to the initiation protocol (Fig. 6(b)). First, the TX module sent a final packet (FIN). When the RX module received the FIN packet, it returned a FIN/ACK packet, to which the TX module replied with an ACK. Upon receiving the ACK packet at the RX module, the data transmission was complete and the connection was closed.

The above sequence assumes that the TX module knows when the user has touched the screen, making transmission possible. The tabletop PC can sense contact via the touch panel; however, the wearable device did not have this information. Therefore, when user input was detected at the tabletop PC without data to be transmitted, a connection packet (CNT) was sent to the wearable device. When a CNT packet was detected by the wearable, the transmission protocol described above was initiated.

Fig. 6. The data transmission protocol.
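The TX-side sequence just described can be summarised in a short sketch. The following Python code is illustrative only: the link object with send()/recv() calls is a hypothetical abstraction over the half-duplex body channel, and the retry count and timeout are assumptions rather than measured parameters.

    # Sketch of the TX-side handshake, data transfer, and teardown of Sect. 3.5.
    SYN, SYN_ACK, ACK, FIN, FIN_ACK = "SYN", "SYN/ACK", "ACK", "FIN", "FIN/ACK"

    def tx_transfer(link, payload, max_retries=5, timeout=0.2):
        # Connection establishment: the SYN is resent a few times, since the usual
        # failure mode is insufficient contact between the user and the screen.
        for _ in range(max_retries):
            link.send(SYN)
            if link.recv(timeout) == SYN_ACK:
                break
        else:
            return False                  # no SYN/ACK received: give up
        link.send(ACK)                    # logical connection established

        link.send(payload)                # actual data transmission

        # Teardown mirrors the handshake with FIN / FIN-ACK / ACK.
        link.send(FIN)
        if link.recv(timeout) != FIN_ACK:
            return False
        link.send(ACK)                    # connection closed
        return True

    class ScriptedLink:
        """Stand-in for the body channel that replays canned replies."""
        def __init__(self, replies):
            self.replies = list(replies)
            self.sent = []
        def send(self, pkt):
            self.sent.append(pkt)
        def recv(self, timeout):
            return self.replies.pop(0) if self.replies else None

    link = ScriptedLink([SYN_ACK, FIN_ACK])
    assert tx_transfer(link, b"image bytes") is True

When the tabletop PC detects a touch but has nothing to send, it transmits a CNT packet instead, and the wearable then runs this same sequence as the sender.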

3.6 An Application Example: Exchanging Image Data

Software was written and installed on the tabletop PC and the wearable device to enable the exchange of image data. To transfer image data from the tabletop PC to the wearable device, the user touched the icon of an image file on the screen. This operation was recognized by the touch panel sensor, triggering transmission of the selected file. The user had to maintain contact until the transmission was complete. During transmission, a progress indicator was displayed on the screen of the wearable device. After the transmission had been completed, the received image was displayed on the screen of the wearable device.

To send image data from the wearable device to the tabletop PC, the user first selected an image by touching the relevant icon on the screen of the wearable. The device then waited until a connection with the tabletop PC had been established before transmitting the data. Next, the user touched the screen of the tabletop PC: if he or she touched the background area rather than an image icon, the wearable device sent the selected image data to the PC; if he or she touched an image icon, the transmission from the wearable device was cancelled. If the user maintained contact and the transmission was successful, the transmitted image was displayed on the screen of the tabletop PC. In this case, a progress indicator was also displayed on the screen of the wearable device.

In Fig. 7, the user has downloaded image data from the tabletop PC. Since the transmission has completed successfully, the image selected by the user is displayed on the screen of the device, and the user can then browse the image on the wearable.

Fig. 7. Transfer of an image file with our prototype system.
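The touch dispatch on the tabletop side of this example application reduces to a small decision: an icon hit downloads that image to the wearable (cancelling any pending upload), while a background hit accepts the image selected on the wearable. The Python sketch below is hypothetical; the icon layout, hit-test radius, and the two transfer callbacks are assumptions, not the paper's actual API.

    # Illustrative sketch of the tabletop-side touch dispatch in Sect. 3.6.
    def find_icon_at(point, icons, radius=40):
        px, py = point
        for icon in icons:                                   # icons: dicts with x, y, path
            if (px - icon["x"]) ** 2 + (py - icon["y"]) ** 2 <= radius ** 2:
                return icon
        return None

    def on_touch(point, icons, wearable_has_selection, send_to_wearable, receive_from_wearable):
        """Icon hit: download that image to the wearable. Background hit: accept the
        image selected on the wearable, if there is one."""
        icon = find_icon_at(point, icons)
        if icon is not None:
            send_to_wearable(icon["path"])
            return "sent"
        if wearable_has_selection:
            receive_from_wearable()
            return "received"
        return "ignored"

    # Example with stand-in transfer callbacks.
    log = []
    icons = [{"path": "map.png", "x": 200, "y": 200}]
    assert on_touch((205, 198), icons, False, log.append, lambda: None) == "sent"
    assert on_touch((600, 600), icons, True, log.append, lambda: log.append("recv")) == "received"
    assert log == ["map.png", "recv"]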

3.7 Preliminary Evaluation

We conducted a preliminary experiment to evaluate the system. We recruited four students, two undergraduates and two graduates, as subjects, and asked them to use our system. The system operated at a data rate of approximately 17 kbps, and it took 4 to 6 s to transfer an image in this experiment. The transmission was generally successful, but failed in some cases because of insufficient contact between the user and the screen, or because the user's finger was released before the transmission was complete.

We interviewed each subject about the usability of the system after he or she completed the experiment. Most subjects responded that they could easily transfer image data between the wearable device and the tabletop PC. Some of their positive comments were: "The process of information transfer is very simple," "It is intuitive because we can visually recognize a target image and select it with a direct physical operation," and "It seems secure because information is not sent without touch." On the other hand, they also made negative comments: "Maintaining touch during the transmission is troublesome," "I felt the transmission time was too long," "Sometimes the transmission was not stable," and "The wearable device is a little large." Improvements to the speed and stability of the transmission, and a reduction in the size of the device, are therefore necessary. We plan to address these factors in future implementations.

4 Related Work

In Pick-and-Drop [2], users can transfer files between computers. The user picks a file from one computer by touching its icon with a pen, and drops it onto the other by touching its display with the same pen. In Memory Stones [3], a user can pick up a file displayed on a device screen, carry it to another device, and place the data using his or her fingers. During this operation, the user is invited to pantomime the act of carrying a tangible object (the "stone") and to keep his or her fingertip positions unchanged. The system identifies the source and target devices by matching the shape of the polygon formed by the fingertips when touching the respective screens. Using these techniques, a user can transfer information in an intuitive way, but since the actual data transmission is carried out over a wireless network, the user needs to configure the connection in advance. With our method, no configuration is necessary to communicate, even with public access terminals.

Hinckley [4] proposed interaction techniques for connecting multiple computers by physically bumping them together. A user can bump a tablet into another resting on a desk. The software recognizes the gesture by synchronizing the two accelerometers across a wireless network. The tablet moved by the user annexes the display of the stationary tablet, allowing a panoramic image to span both displays. Seifert et al. [5] presented a system that allowed users to extend the display of their mobile phones by using external screens. Users connected their mobile phone to a screen by holding it against the border. When the connection was established, the GUI of the active mobile application was distributed across the phone and the external display. Using our technique, the user only has to touch an item on the touch-panel display to select and transfer data between two devices.

DiamondTouch [6] is a multi-user touch technology that uses human body communication for tabletop front-projected displays. It works by transmitting a different electrical signal to each part of the table surface that is to be uniquely identified. When a user touches the table, signals are capacitively coupled from directly beneath the touch point, through the user, and into a receiver unit associated with that user. The receiver can then determine which parts of the table surface are being touched by the user.

Our method also communicates between wearable devices and tabletop PCs through the human body, but whereas DiamondTouch uses this to identify which person is touching where, our study aims to transfer information easily to or from the user's device by automatically connecting it to a tabletop PC.

5 Conclusion

We proposed a technique for information transfer between multiple devices, utilizing human body communication through a touch panel. We also developed a prototype system that could send and receive image data between a tabletop PC and a wearable device, and performed a preliminary experiment to evaluate it. Users responded mostly positively, but some comments indicated that the prototype should be refined: the transmission speed and stability should be improved, and the size of the device reduced. In future work, we aim to improve upon our design by developing a system that provides more stable information transfer between devices.

References

1. Zimmerman, T.G.: Personal area networks: near-field intrabody communication. IBM Syst. J. 35(3-4), 609-617 (1996)
2. Rekimoto, J.: Pick-and-drop: a direct manipulation technique for multiple computer environments. In: Proceedings of the 10th Annual ACM Symposium on User Interface Software and Technology (UIST 1997), pp. 31-39 (1997)
3. Ikematsu, K., Siio, I.: Memory stones: an intuitive copy-and-paste method between multi-touch computers. In: CHI 2013 Extended Abstracts on Human Factors in Computing Systems (CHI EA 2013), pp. 1287-1292 (2013)
4. Hinckley, K.: Synchronous gestures for multiple persons and computers. In: Proceedings of the 16th Annual ACM Symposium on User Interface Software and Technology (UIST 2003), pp. 149-158 (2003)
5. Seifert, J., Schneider, D., Rukzio, E.: Extending mobile interfaces with external screens. In: Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M. (eds.) INTERACT 2013, Part II. LNCS, vol. 8118, pp. 722-729. Springer, Heidelberg (2013)
6. Dietz, P., Leigh, D.: DiamondTouch: a multi-user touch technology. In: Proceedings of the 14th Annual ACM Symposium on User Interface Software and Technology (UIST 2001), pp. 219-226 (2001)