Proposal of a Touch Panel Like Operation Method For Presentation with a Projector Using Laser Pointer

Yuya Kawahara a,* and Lifeng Zhang a
a Kyushu Institute of Technology, 1-1 Sensui-cho, Tobata-ku, Kitakyushu City, Fukuoka-ken, 804-8550, Japan
*Corresponding Author: kawahara@boss.ecs.kyutech.ac.jp

DOI: 10.12792/iciae2014.012  © 2014 The Institute of Industrial Applications Engineers, Japan

Abstract

A presentation using a projector and a laser pointer is a very common scene at business meetings and academic conferences. Systems that operate the mouse cursor with a laser pointer have been proposed to make such presentations more comfortable. However, for the systems that use an image processing approach, the usable environment is limited by the camera frame rate and by ambient light. To remove this disadvantage, additional operation tools equipped with an acceleration sensor, which steer the cursor movement based on visual feedback, have been suggested, but these can spoil usability if the presenter is not well trained. In this system, a modulated laser source is adopted to reduce the surrounding light noise, and a PSD sensor gives a much faster response than a CCD camera system. In addition, the system can be operated in real time because no image processing is used.

Keywords: PSD sensor, modulated laser, presentation.

1. Introduction

Presentations using a computer and projector system are an efficient means of one-way communication. They take place in various settings, including business meetings, lectures, and classes at universities and companies. In recent years, presenters have used a laser pointer to indicate the point of attention during a presentation, and various tools have been developed to make presentations more effective and efficient. Among these is a stick mouse with a built-in acceleration sensor that operates the system by a swing of the hand. However, the user may feel stress because such operation requires the hand movement to be adjusted by visual feedback throughout the presentation. Other systems operate the view screen by extracting the position of the laser irradiation spot with image processing. (1)(2)(3) But because the projector light and external illumination disturb them easily, their usable environment is limited. (4)(5)

Therefore, in this study, the proposed system performs an intuitive cursor operation with the laser pointer using a PSD. The system utilizes a modulated laser source, so the usage environment is not limited, because the modulated laser light can be separated well from ambient light. In addition, the system operates in real time because image processing is not used.

2. Proposed System

2.1 Construction of the proposed system

In this study, we assume that a PC screen is projected onto a view screen by a projector, and a light spot is irradiated onto the view screen using a laser pointer equipped with a modulation function. The system first calculates the coordinates of the laser spot irradiated on the view screen, and then converts them to PC screen coordinates. Finally, based on the calculated information, the light spot can be treated as a mouse cursor on the PC screen, so various operations can be realized with a laser pointer.
The system implements the following two functions in this study:
(a) Draw a picture on the PC screen by tracing a picture projected on the view screen with the laser light spot irradiated on the screen.
(b) Perform touch-panel-like operations corresponding to the laser light spot irradiated on the view screen.

Using these two functions, it is possible to write a memo on the view screen at the right moment during a presentation, or to operate the PC in a touch-panel-like manner with the laser pointer. In addition, the presenter can operate it intuitively by following the laser light spot with the eyes.

Figure 1 shows the configuration of the proposed system and Fig. 2 shows the flowchart of the processing. The device consists of a laser pointer, a PSD sensor, and a PC, as shown in Fig. 1. First, a frequency modulation circuit produces a modulated power supply and feeds it to the laser. When the user irradiates the laser point onto the screen, the PSD receives the reflected light through a lens. Because the PSD is a light-responsive device, the excited signal, which includes the surrounding noise, is input into the light-receiving circuit. In this research, the extra noise is removed with a band-pass filter. After A/D conversion of the extracted signal, the digitized signal is input into the PC and the coordinates of the laser point are calculated. In this way, the system converts the calculated view screen coordinates into PC screen coordinates and moves the mouse cursor to that position on the PC screen.

Fig. 1. System configuration (laser pointer driven by a frequency modulation circuit; two-dimensional PSD with I/V conversion, filter and smoothing circuits; PC).
Fig. 2. Flowchart of the processing.

2.2 Two-dimensional coordinate detection using the PSD

In order to measure the two-dimensional coordinates of the laser point, a two-dimensional position sensitive detector (PSD), shown in Fig. 3, is used. The PSD is an optical sensor based on a Si photodiode that detects the position of the center of gravity of the total irradiated light. Its features are high sensitivity, high-resolution position detection, and fast response. The PSD generates electric charge according to the amount of light reaching its light-receiving surface. The PSD has four terminals corresponding to the X-Y axes, and the resistance from the light-receiving point to each terminal varies with the lighted position. That is, the output currents from the four terminals also change with the lighted position, and by using the ratios of these four currents, the position of the light spot can be detected as its center of gravity. In this study, we use the 2D PSD sensor S2044 manufactured by Hamamatsu Photonics; its active area is 4.7 x 4.7 mm.

In Fig. 3 the coordinates of the center-of-gravity point are denoted (x, y). According to the center of gravity of the light spot, the current values I_X1, I_X2, I_Y1 and I_Y2 are obtained from the four electrodes X1, X2, Y1 and Y2, respectively. In this case, the x coordinate of the center of gravity is given by the following equation:

[(I_X2 + I_Y1) - (I_X1 + I_Y2)] / (I_X1 + I_X2 + I_Y1 + I_Y2) = 2x / L_X    (1)

The following equation gives the y coordinate of the center of gravity:

[(I_X2 + I_Y2) - (I_X1 + I_Y1)] / (I_X1 + I_X2 + I_Y1 + I_Y2) = 2y / L_Y    (2)

where L_X and L_Y are the electrode spacings of the PSD in the X and Y directions.
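As an illustration of equations (1) and (2), the following is a minimal Python sketch of the center-of-gravity calculation from the four electrode currents. The function name, the example current values, and the assumption that the currents have already been I/V converted, band-pass filtered, and averaged are ours, not part of the original system.

```python
# Minimal sketch of the PSD center-of-gravity calculation in Eqs. (1) and (2).
# Assumes the four electrode currents have already been I/V converted,
# band-pass filtered and smoothed, as described in Section 2.1.

L_X = 5.7e-3  # electrode spacing in the X direction [m] (S2044)
L_Y = 5.7e-3  # electrode spacing in the Y direction [m] (S2044)

def psd_spot_position(i_x1, i_x2, i_y1, i_y2):
    """Return the light-spot center of gravity (x, y) on the PSD surface [m]."""
    total = i_x1 + i_x2 + i_y1 + i_y2
    if total <= 0.0:
        raise ValueError("no light detected on the PSD")
    x = 0.5 * L_X * ((i_x2 + i_y1) - (i_x1 + i_y2)) / total  # Eq. (1)
    y = 0.5 * L_Y * ((i_x2 + i_y2) - (i_x1 + i_y1)) / total  # Eq. (2)
    return x, y

# Example with hypothetical current values [A]
print(psd_spot_position(1.0e-6, 1.2e-6, 1.1e-6, 0.9e-6))
```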

Moreover, since L_X = 5.7 mm and L_Y = 5.7 mm for the S2044, the center of gravity of the laser light spot can be calculated by the conversion equations (1) and (2).

Fig. 3. Two-dimensional PSD with electrodes X1, X2, Y1, Y2 and the center of gravity G(x, y) of the light spot.

The light of the laser spot on the view screen is focused onto the surface of the PSD through the lens. Figure 4 shows this relationship, and Fig. 5 shows the OGA-O'G'A plane of Fig. 4, where A is the lens, G is the laser spot on the view screen, and G' is its image on the light-receiving surface. Since the distance z from the lens to the view screen and the distance d from the lens to the light-receiving surface of the PSD are constant in Fig. 5, the view screen coordinate Y is calculated from the similar triangles OGA and O'G'A by the following equation:

Y = (z / d) y    (3)

In the same way, for the X-axis direction:

X = (z / d) x    (4)

Fig. 4. State of light reception on the PSD.
Fig. 5. OGA-O'G'A plane.

2.3 Coordinate transformation

The system converts the coordinates (X, Y) on the view screen into the coordinates (X_PC, Y_PC) on the PC. Figure 6 shows the relationship between these two coordinate systems; the coordinate system of the PC screen runs from (0, 0) to (1440, 900). Equations (5) and (6) give the coordinate conversion of Fig. 6. In this way, the system moves the mouse cursor on the PC screen to the position corresponding to the laser spot on the view screen.

X_PC = 1440 X + 720    (5)
Y_PC = 900 Y + 450     (6)

Fig. 6. Relation of the coordinate systems (view of the projector and coordinate system of the PC screen, (0, 0) to (1440, 900)).
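To make the chain of conversions in equations (3) to (6) concrete, here is a small Python sketch that maps a PSD reading to a PC screen pixel. The function names, the lens-to-PSD distance value, and the assumption that (X, Y) in equations (5) and (6) are view screen coordinates normalized to the range [-0.5, 0.5] are illustrative assumptions, not values given in the paper.

```python
# Sketch of the coordinate chain: PSD surface -> view screen -> PC screen.
# z, d and the normalization of (X, Y) are assumed values for illustration only.

PC_WIDTH, PC_HEIGHT = 1440, 900   # PC screen size used in Eqs. (5) and (6)

def psd_to_screen(x_psd, y_psd, z=2.0, d=0.05):
    """Eqs. (3), (4): scale PSD coordinates [m] to view screen coordinates [m].
    z is the lens-to-screen distance, d an assumed lens-to-PSD distance."""
    return (z / d) * x_psd, (z / d) * y_psd

def screen_to_pc(x_screen, y_screen, screen_width, screen_height):
    """Eqs. (5), (6), assuming (X, Y) are first normalized so the detected
    screen area maps to [-0.5, 0.5] in each direction."""
    X = x_screen / screen_width
    Y = y_screen / screen_height
    x_pc = PC_WIDTH * X + PC_WIDTH / 2    # Eq. (5)
    y_pc = PC_HEIGHT * Y + PC_HEIGHT / 2  # Eq. (6)
    return int(round(x_pc)), int(round(y_pc))

# Example: spot 1 mm right and 0.5 mm up of the PSD center, detected area 1.59 x 1.56 m
X_scr, Y_scr = psd_to_screen(1.0e-3, 0.5e-3)
print(screen_to_pc(X_scr, Y_scr, 1.59, 1.56))
```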

2.4 Operation of the mouse cursor

Because this system should not burden the user with an extra operating device that would reduce operability, the proposed laser pointer has two types of light source installed. One is a normal laser light source that gives the audience a gazing point for better understanding. The other is a modulated light source whose reflected light can be received by the PSD sensor to provide the coordinate information for PC operation. These two light sources are switched by controlling the power supply with two buttons installed on the laser pointer, so the additional button can be operated quickly by the user.

2.5 Drawing system

We implement a drawing function based on the coordinate information calculated by the system. The coordinate on the PC screen when the modulated laser source starts emitting is taken as the start point. As the laser light spot is moved over the view screen, whenever the moving distance exceeds a threshold value, the current point is treated as the end of a line segment, a line is drawn between these two points, and the end point becomes the new start point. This procedure is repeated until the modulated light source stops, and in this way the system draws a picture.
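The drawing procedure of Section 2.5 can be summarized as the following short Python sketch. The threshold value, the class name, and the callback used for actually drawing a line are illustrative placeholders rather than parts of the described system.

```python
import math

# Sketch of the drawing procedure in Section 2.5: while the modulated laser
# is emitting, a new line segment is drawn each time the spot has moved
# farther than a threshold from the last drawn point.

class LaserDrawer:
    def __init__(self, draw_line, threshold_px=15.0):
        self.draw_line = draw_line     # callback: draw_line(p_start, p_end)
        self.threshold = threshold_px  # assumed threshold in PC screen pixels
        self.start = None              # start point of the current segment

    def on_modulated_laser_on(self, point):
        self.start = point             # first point becomes the start point

    def on_spot_moved(self, point):
        if self.start is None:
            return
        if math.dist(self.start, point) > self.threshold:
            self.draw_line(self.start, point)  # draw one line segment
            self.start = point                 # chain the next segment

    def on_modulated_laser_off(self):
        self.start = None              # drawing stops with the modulation

# Example usage with a stub drawing callback
drawer = LaserDrawer(lambda a, b: print("line", a, "->", b))
drawer.on_modulated_laser_on((720, 450))
drawer.on_spot_moved((760, 470))
drawer.on_modulated_laser_off()
```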

3. Experiments

3.1 Detection range measurement

In this experiment, the detection ranges L_X and L_Y of the system are examined first. A picture of the PC screen is projected onto the view screen generated by the projector, and the distance from the view screen to the light-receiving module is measured. The measurement range is from 1 meter to 2 meters, with a measuring interval of 0.2 meter, and the view area that the PSD can measure within its lens-imaging angle is detected. Table 1 shows the result of this experiment. From this experiment, we were able to confirm that the angle of view of this system was approximately 43.6 degrees.

Table 1. Measuring range.
Distance [m]   1.0    1.2    1.4    1.6    1.8    2.0
L_X [m]        0.79   1.02   1.12   1.29   1.45   1.59
L_Y [m]        0.79   0.94   1.09   1.26   1.43   1.56

3.2 Detection accuracy measurement

The experimental environment is set close to an actual usage environment: the PC screen is projected onto the view screen with a projector in a darkroom and tested there. The power supply of the laser pointer is the pulse-modulated one only; we did not yet have a two-mode laser pointer at this time. The aim of this experiment is to evaluate the coordinate detection precision of the system. The light-receiving module is set 2 meters away from the view screen generated by the projector. By acquiring the coordinates of the laser irradiation point on the view screen and calculating the relative error against the theoretical coordinate values, the measurement accuracy is obtained. Figure 7 shows the measurement results. Although the measurement result is not yet satisfactory, the system shows the coordinates of the laser spot on the view screen in real time.

Fig. 7. Relative error of the measurement result (relative error [%] plotted over the PC screen coordinates, 0 to 1440 by 0 to 900).

4. Conclusions

In this study, we studied the detection of the laser spot position in real time. According to the results of the experiments, the proposed approach could measure the coordinates of the laser spot using a two-dimensional PSD sensor and a lens-imaging structure. In addition, we confirmed that the accuracy was reasonable for drawing pictures with a power-supply-modulated laser pointer. The mouse cursor could follow the coordinates of the laser spot irradiated on the view screen, and we could draw pictures along the laser point. This system can be operated in real time because it does not use image processing like the existing methods. Thereby, we consider that the proposed system lets the presenter give a presentation smoothly. In the future, it is necessary to reduce the noise coming from the circuit, because the PSD output current is very tiny and easily disturbed. We will also add other operations to the system, such as turning pages, so that the presenter will not feel stress when presenting with this system.

References

(1) E. Popovich: Presenter Mouse LASER-Pointer Tracking System, http://www.mpi-inf.mpg.de/~karni/presentermouse/index.htm, access date: 2014.1.31
(2) B. D. Lucas and T. Kanade: An Iterative Image Registration Technique with an Application to Stereo Vision, Proc. of the International Joint Conference on Artificial Intelligence, 1981
(3) C. Kirstein and H. Muller: Interaction with a Projection Screen Using a Camera-Tracked Laser Pointer, MMM '98 Proceedings of the 1998 Conference on Multi Media Modeling, pp. 191, 1998
(4) Toosaki, K. Kai, and M. Iwatsuki: A Laser Pointer Tracking Mouse with an Internal CCD Camera for Presentations, IEICE Technical Report, ET2005-73, pp. 93-96, 2005
(5) R. Sukthankar: Self-Calibrating Camera-Assisted Presentation Interface, Proceedings of the International Conference on Control, Automation, Robotics and Vision (ICARCV), 2000