WWW.IJITECH.ORG ISSN 2321-8665 Vol.04, Issue.02, February-2016, Pages: 0345-0349

Implementation of Voice Controlled Color Sensing Robotic Arm on NI myRIO using NI LabVIEW

RAJ KUMAR L BIRADAR 1, PHANINDRA REDDY K 2
1 Associate Professor, Dept. of E&TM, GNITS, JNTU, Hyderabad, TS, E-mail: rajkumar_lb@yahoo.com.
2 Assistant Professor, Dept. of E&CE, RYMEC, Ballari, India, E-mail: phanindrareddyk@gmail.com.

Abstract: This paper presents the implementation of a voice controlled color sensing robot intended to sort colored objects with a three-axis robotic arm. The color sensing robotic arm is capable of gripping differently colored cubes, sorting them by color and placing them in different containers. The object's color is detected using an inexpensive yet powerful sensor, a camera, by applying the image processing functions available in NI LabVIEW to the acquired image. The designed system is flexible, as the robot can pick and place objects using 360° movement of the arm, which is controlled by the NI myRIO device by appropriately directing the DC motors.

Keywords: Color Sensing Robotic Arm, VoiceBot, NI myRIO, NI LabVIEW, Three Axes Robotic Arm.

I. INTRODUCTION
The twenty-first century has witnessed the design, development and revolution of robotic technology. Especially in the last few decades, unmanned robots and vehicles have become very popular in areas such as die-casting, different kinds of welding, the concurrent prismatic arms used to handle cockpit flight simulators, articulated arms in spray painting, and today's unparalleled robotic arms that remotely facilitate the most complicated surgical procedures, such as prostatectomies, cardiac valve repair and gynaecologic procedures, thereby increasing patients' chances of survival.
Other potential areas for robots are in R&D; welding and parts rotation and placement during car assembly; packaging; and, in some cases [2], close emulation of the human hand, which is very much desired for defusing and disposing of bombs, and for dangerously polluted environments such as radioactive waste clean-up and chemical spills. There is also a huge demand for color sensing robot arms in automotive assembly lines and for tasks such as sorting medicines in pharmaceutical industries and sorting raw coffee beans in coffee processing plants. Robotic arms are in general used to increase productivity by continuously producing products of consistent quality without getting exhausted, reducing the production of defective products, minimising material waste, and effectively conducting rescue missions in difficult and hazardous environments, as they are immune to toxins, diseases and biological viruses. Also, considering their dynamic design morphologies, robots have incredible potential and can reach places that human beings once considered impossible. With the rapid development of robotic technology around the world, many robotic applications have been developed to improve our quality of life [1].

The proposed system implements a voice controlled three-axis color sensing robotic arm that rotates through 360°, designed using the NI myRIO device and the LabVIEW application software, with the flexibility of accepting commands dynamically from the user. Initially, the system accepts a voice command (viz., red, green or yellow) from the user and processes it using the LabVIEW in-built functions. The camera then acquires an image, the objects' colors are extracted from the image using the image processing functions, and the three-axis robotic arm picks the object matching the input voice command and places it in the respective color container.

II. THE SYSTEM MODEL
Fig.1 shows the block diagram of the voice controlled three-axis color sensing pick and place robot arm, consisting of the voice recognition system that processes the input voice commands, the camera acting as the visual sensor to acquire the image, and the DC motors and motor drivers to manoeuvre the arm through 360°. All of these components are interfaced with the NI myRIO device and the LabVIEW software, which precisely supervise and coordinate the arm movement for picking the object and placing it in its respective color container. The flowchart of the system is shown in Fig.2.

Fig.1. Block Diagram of the voice controlled three axes color sensing pick and place robotic arm.

Copyright @ 2016 IJIT. All rights reserved.
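The command-to-action flow described above can be sketched as a simple loop. The following is an illustrative Python simulation only; the function names, the string-labelled image and the command set are stand-ins for the LabVIEW VIs and camera pipeline, not the actual implementation:

```python
# Illustrative simulation of the voice-command pick-and-place flow.
# All functions are hypothetical stand-ins for the LabVIEW VIs.

COMMANDS = {"red", "green", "yellow"}

def recognize_command(utterance):
    """Stand-in for the Windows speech recognition step."""
    word = utterance.strip().lower()
    return word if word in COMMANDS else None

def locate_object(image, color):
    """Stand-in for color pattern matching: return the (row, col)
    of the first cell whose label matches `color`."""
    for r, row in enumerate(image):
        for c, label in enumerate(row):
            if label == color:
                return (r, c)
    return None

def pick_and_place(position, color):
    """Stand-in for the three-motor arm sequence."""
    return f"picked object at {position}, placed in {color} container"

def handle(utterance, image):
    color = recognize_command(utterance)
    if color is None:
        return "unrecognized command"
    position = locate_object(image, color)
    if position is None:
        return f"no {color} object in view"
    return pick_and_place(position, color)

if __name__ == "__main__":
    scene = [["none", "red"], ["green", "none"]]
    print(handle("Red", scene))  # picks the red cube at (0, 1)
```

The simulation mirrors the flowchart's decision points: an unknown command and a missing object each short-circuit before the arm moves.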
III. THE SYSTEM IMPLEMENTATION
The overall implementation of the system is described in the two sections below.

A. Software Section
The software plays a vital role in controlling the hardware according to the user's commands; the hardware alone cannot accomplish the intended task. This imperative task is carried out with the aid of the application program developed using the NI LabVIEW software. The input to the voice controlled pick and place robot arm is the user's voice, which is converted to commands using the speech recognition system present in the Windows OS. These commands are taken by the host program, which writes them to a variable shared between the host program and the code running on the NI myRIO. Using the Wi-Fi hosting capability of the NI myRIO, the host device communicates with the myRIO wirelessly over a WLAN. The code on the myRIO continuously monitors the shared variable for updates and executes the appropriate case in a case structure.

The entire application program is developed using NI LabVIEW, as it has a wide range of predefined library functions [3] and other graphical programming tools that make application development easier. Application developers can combine the power of LabVIEW software with modular, reconfigurable hardware to overcome the ever-increasing complexity involved in delivering measurement and control systems on time and under budget. The following sections briefly explain the working of the graphical code.

The voice command processing code accepts the user's command; in response to the recognized command, the IMAQ function [5] creates temporary memory to store the template image, and the respective color template is loaded into this memory using the IMAQ Read Image and Vision function, as shown in fig.3.

Fig.2. Flowchart of the voice controlled three axes color sensing pick and place robotic arm.
Fig.3. Voice Command Processing code.
Fig.4. Image Acquisition from the camera and object position identification.
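The host-to-myRIO hand-off through a shared variable can be mimicked with a thread-safe value that the target polls. This is a minimal Python sketch under the assumption of a simple polling loop; the real system uses LabVIEW network-published shared variables over Wi-Fi:

```python
import threading
import queue

class SharedVariable:
    """Toy stand-in for a LabVIEW network-published shared variable."""
    def __init__(self):
        self._lock = threading.Lock()
        self._value = None

    def write(self, value):
        with self._lock:
            self._value = value

    def read(self):
        with self._lock:
            return self._value

def target_loop(shared, handled, stop):
    """myRIO-side loop: poll the variable and dispatch each new
    command to the matching case, like the RT case structure."""
    last = None
    while not stop.is_set():
        cmd = shared.read()
        if cmd is not None and cmd != last:
            handled.put(f"executing case: {cmd}")
            last = cmd

# usage: host writes a recognized command, target reacts to it
shared = SharedVariable()
handled = queue.Queue()
stop = threading.Event()
t = threading.Thread(target=target_loop, args=(shared, handled, stop))
t.start()
shared.write("red")
print(handled.get(timeout=2))  # -> executing case: red
stop.set()
t.join()
```

Tracking the last-seen value means the target acts once per new command rather than repeatedly on the same value, which is the behaviour the paper's "monitors the value updations" wording implies.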
After the command is recognized, an image is acquired from the camera using the Vision Assistant function, and the image is then searched for the requested color object/color template using the IMAQ Match Color Pattern function. This function accepts two inputs: the image from the camera and the template image, a reference color template that is searched for during the matching phase. Once the desired color object/template is recognized in the image, the position of the requested color object is detected, as shown in fig.4; fig.5 shows its front panel.

The movement of the arm through 360°, the raising/lowering of the arm's elbow and the opening/closing of the gripper are all under the supervision of the NI myRIO device, which runs Real-Time (RT) code. Based on the received input command, and after determining the object's position, the RT code executing on the myRIO directs the arm accordingly. The RT code is developed using the functions available in the NI LabVIEW software. The part of the code that creates and registers the event fired when the position of the object is detected is shown in fig.6; subsequently, the event handler provides the required control and coordination among the three DC motors mounted on the three axes. A part of the event handler code, containing the PWM Express VI enclosed inside a case structure, is shown in fig.7. After obtaining the position information, the arm manoeuvres to the exact position; the gripper opens so that the object can be picked and then closes; the arm rotates until it reaches the correct color container; and the elbow lowers itself into the container so that the gripper can release the object.

Fig.5. Front Panel of the colored object position determination.
Fig.6. Event Registration-RT Code.
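IMAQ Match Color Pattern searches the acquired image for the region that best matches a color template. A much-simplified version of the idea, locating the window whose mean RGB is closest to a template color, could look like this (pure-Python sketch; the window size and the squared-distance score are illustrative assumptions, not the IMAQ algorithm):

```python
def mean_color(image, r0, c0, size):
    """Average RGB over a size x size window of an image given
    as a nested list of (R, G, B) tuples."""
    px = [image[r][c] for r in range(r0, r0 + size)
                      for c in range(c0, c0 + size)]
    n = len(px)
    return tuple(sum(p[i] for p in px) / n for i in range(3))

def match_color_pattern(image, template_rgb, size=2):
    """Return the top-left (row, col) of the window whose mean
    color has the smallest squared distance to template_rgb."""
    rows, cols = len(image), len(image[0])
    best, best_pos = float("inf"), None
    for r in range(rows - size + 1):
        for c in range(cols - size + 1):
            m = mean_color(image, r, c, size)
            d = sum((a - b) ** 2 for a, b in zip(m, template_rgb))
            if d < best:
                best, best_pos = d, (r, c)
    return best_pos

# usage: a 4x4 scene with a red 2x2 patch at (2, 2)
RED, GREY = (255, 0, 0), (128, 128, 128)
scene = [[GREY] * 4 for _ in range(4)]
for r in (2, 3):
    for c in (2, 3):
        scene[r][c] = RED
print(match_color_pattern(scene, RED))  # -> (2, 2)
```

The returned position plays the role of the detected object position that the RT code consumes in the next step.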
Then the arm goes back to its initial position and waits for the next user command.

Fig.7. Event Handler code controlling the arm movements-RT Code.
Fig.8. Hardware Overview of NI myRIO.
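The event registration and handler pattern of figs.6 and 7 amounts to firing an event when the object position is known and having the handler drive the three motors in sequence. A hypothetical Python sketch of that dispatch follows; the callback API and the step strings are assumptions, though the motor roles follow the paper's description:

```python
class PositionEventDispatcher:
    """Toy event registration: callbacks run when an object
    position is published, mirroring the RT code's event loop."""
    def __init__(self):
        self._handlers = []

    def register(self, handler):
        self._handlers.append(handler)

    def fire(self, position):
        for h in self._handlers:
            h(position)

steps = []

def arm_handler(position):
    """Coordinates the three DC motors, one per axis."""
    steps.append(f"motor1: rotate base toward {position}")
    steps.append("motor2: lower elbow")
    steps.append("motor3: open gripper, grab, close gripper")

# usage
dispatcher = PositionEventDispatcher()
dispatcher.register(arm_handler)
dispatcher.fire((2, 2))  # object detected at row 2, col 2
print(steps[0])          # -> motor1: rotate base toward (2, 2)
```

Registering the handler once and firing per detection keeps the motor sequencing in one place, as a LabVIEW event structure does.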
B. Hardware Section
To implement the system prototype, the following hardware devices/components have been used:
NI myRIO
USB Camera
12V Motor Drivers & 12V DC Motors - 3 Nos.
Gears and wooden pieces to design the arm

The NI myRIO-1900 provides several analog inputs (AI), analog outputs (AO), digital inputs and outputs (DIO), audio, and power output in a compact embedded device. The NI myRIO-1900 connects to a host computer over USB and wireless 802.11b,g,n. Fig.8 shows the arrangement of the NI myRIO-1900 components. The NI myRIO-1900 is a portable reconfigurable I/O (RIO) device that can be used to design control, robotics, and mechatronics systems [3]. The myRIO also contains various in-built devices such as a UART, an accelerometer, an FPGA and a dual-core ARM Cortex-A9 real-time processor (a Xilinx Zynq system on a chip (SoC) running a real-time Linux OS) [4].

A DC motor is any of a class of electrical machines that converts direct current electrical power into mechanical rotation, as shown in fig.9. At times the DC motors must be driven by an external driver to boost the current level. The driver board used allows individual control of up to two 12V DC motors. Each motor can be driven at a maximum of 750 mA, offering a decent driving current for the motors, as shown in fig.9. It supports both clockwise and anti-clockwise rotation as well as speed control. It can easily be interfaced with a microcontroller such as an Arduino or 8051, or with any data acquisition system. The wide-angle USB camera is used to acquire the input image of the objects. Gears and wooden pieces are used in order to prepare the prototype of the three-axis robotic arm.

Fig.9. The 12V Motor driver and the 12V DC Motor.

The design uses DC motors for the movement of the robotic arm in all three axes, and the desired arm function is obtained by controlling the motors individually using the PWM Express VI. This VI generates the pulse width modulated signal fed to the base DC Motor1, whose duty cycle is adjusted dynamically so that the arm positions itself with respect to the detected object position according to Table 1. Once the arm is positioned on top of the object, the next task is to lower the elbow (PWM control of DC Motor2) so that the object can be reached; finally, by opening the arm's gripper (PWM control of DC Motor3), the object is grabbed. The next phase is to raise the elbow and rotate the arm (PWM control of DC Motor1) until it reaches the appropriate color container; the elbow then lowers itself into the container and the gripper opens to place the object.

TABLE I: Motor's Control

IV. RESULTS AND OBSERVATIONS
The prototype, as shown in figs.10 and 11, accepts Red, Green and Yellow as voice commands, based on which the different directions of motion of the DC motors are produced. The arm then picks the requested color object and places it in the respective color container.

Fig.10. Prototype showing the elbow and gripper functions.
Fig.11. Prototype showing the camera position and arm movement.
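The per-motor PWM control comes down to choosing a duty cycle from the detected object position. The actual duty-cycle values of Table 1 are not reproducible from the text, so the linear mapping and limits below are purely illustrative assumptions:

```python
def duty_cycle_for_base(object_col, image_width,
                        min_duty=0.05, max_duty=0.95):
    """Map the object's horizontal pixel position to a PWM duty
    cycle for the base motor (DC Motor1). The linear mapping and
    the duty-cycle limits are illustrative, not Table 1's values."""
    frac = object_col / max(image_width - 1, 1)
    return min_duty + frac * (max_duty - min_duty)

def pwm_wave(duty, period=20, high="1", low="0"):
    """Render one PWM period as a string of high/low samples,
    just to visualise what the PWM Express VI outputs."""
    on = round(duty * period)
    return high * on + low * (period - on)

# usage: object near the middle of a 640-pixel-wide image
d = duty_cycle_for_base(320, 640)
print(round(d, 3))        # roughly mid-range duty cycle
print(pwm_wave(0.25, 8))  # -> 11000000
```

Scaling position to duty cycle this way lets one VI drive the base motor for any detected position, which matches the paper's note that the duty cycle is adjusted dynamically.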
V. CONCLUSION
The intelligent voice controlled robotic arm has been designed and implemented using NI myRIO and NI LabVIEW. The robotic arm design offers enhanced color sensing capability. The design supports a user-friendly voice interface, which can serve a large user base, and offers superior precision in determining the object's position and in picking and placing the object.

VI. REFERENCES
[1] Kader Ibrahim, "Internet Controlled Robotic Arm," Procedia Engineering, 2012, pp. 1072-1078.
[2] Dhanoj M., et al., "Colour Sensor Based Object Sorting Robot Using Embedded System," International Journal of Advanced Research in Computer and Communication Engineering, Vol. 4, Issue 4, April 2015.
[3] Jeffrey Travis, Jim Kring, LabVIEW for Everyone: Graphical Programming Made Easy and Fun, Third Edition, Prentice Hall Professional, 2007.
[4] NI myRIO-1900 User Guide and Specifications, National Instruments, 2013.
[5] Thomas Klinger, Image Processing with LabVIEW and IMAQ Vision, Prentice Hall, 2003.