WWW.IJITECH.ORG
ISSN 2321-8665, Vol.04, Issue.02, February-2016, Pages: 0345-0349

Implementation of Voice Controlled Color Sensing Robotic Arm on NI myRIO Using NI LabVIEW

RAJ KUMAR L BIRADAR 1, PHANINDRA REDDY K 2
1 Associate Professor, Dept. of E&TM, GNITS, JNTU, Hyderabad, TS, E-mail: rajkumar_lb@yahoo.com.
2 Assistant Professor, Dept. of E&CE, RYMEC, Ballari, India, E-mail: phanindrareddyk@gmail.com.

Abstract: This paper presents the implementation of a voice controlled, color sensing robotic system intended to sort colored objects with a three-axis robotic arm. The arm grips colored cubes, sorts them by color, and places them in different containers. The object's color is detected with an inexpensive yet powerful sensor, a camera, by applying the image processing functions available in NI LabVIEW to the acquired image. The system is flexible: the robot picks and places objects using 360° rotation of the arm, which is controlled by the NI myRIO device through appropriately driven DC motors.

Keywords: Color Sensing Robotic Arm, VoiceBot, NI myRIO, NI LabVIEW, Three-Axis Robotic Arm.

I. INTRODUCTION
The twenty-first century has witnessed the design, development, and revolution of robotic technology. Especially in the last few decades, unmanned robots and vehicles have become popular in areas such as die-casting, various kinds of welding, prismatic arms that drive cockpit flight simulators, articulated arms for spray painting, and robotic arms that perform complicated surgical procedures such as prostatectomies, cardiac valve repair, and gynaecologic procedures remotely, thereby increasing patients' chances of survival. Other application areas include R&D, welding, part rotation and placement during car assembly, and packaging; in some cases [2], close emulation of the human hand is desired for defusing and disposing of bombs, and robots can also be deployed in dangerously polluted environments such as radioactive waste clean-up and chemical spills. There is also a large demand for color sensing robot arms on automated lines that perform sorting tasks, for example sorting medicines in pharmaceutical industries and sorting raw coffee beans in coffee processing plants. Robotic arms are generally used to increase productivity: they continuously produce consistent-quality products without fatigue, reduce defective output, minimise material waste, and can conduct rescue missions in difficult and hazardous environments because they are immune to toxins, diseases, and biological viruses. Considering dynamic design morphologies, robots have the potential to reach places once considered inaccessible to human beings. With the rapid development of robotic technology around the world, many robotic applications are being developed to improve our quality of life [1].

The proposed system implements a voice controlled, three-axis, color sensing robotic arm that rotates through 360°, designed using the NI myRIO device and the LabVIEW application software, and accepts commands dynamically from the user.
Initially, the system accepts a voice command (red, green, or yellow) from the user and processes it using LabVIEW's built-in functions. The camera then acquires an image, the objects' colors are extracted from the image using image processing functions, and the three-axis robotic arm picks the object corresponding to the spoken color and places it in the matching color container.

II. THE SYSTEM MODEL
Fig.1 shows the block diagram of the voice controlled, three-axis, color sensing pick and place robot arm. It consists of the voice recognition system that processes the input voice commands, the camera acting as the visual sensor that acquires the image, and the DC motors with motor drivers that manoeuvre the arm through 360°. All of these components are interfaced with the NI myRIO device, and the LabVIEW software precisely supervises and coordinates the arm movements for picking the object and placing it in its respective color container. The flowchart of the system is shown in Fig.2.

Fig.1. Block diagram of the voice controlled three-axis color sensing pick and place robotic arm.
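For readers who prefer pseudocode to the graphical flowchart, the cycle of Fig.2 can be summarised in the Python sketch below. Every function name (get_voice_command, acquire_image, locate_color, move_arm) is a hypothetical placeholder for the LabVIEW VIs described in Section III; the sketch mirrors only the sequence of steps, not the actual implementation.

```python
# Minimal sketch of the Fig.2 control flow (all helper names are hypothetical;
# the real system implements these steps as LabVIEW code on the host and myRIO).

def run_sorting_cycle(get_voice_command, acquire_image, locate_color, move_arm):
    """One pick-and-place cycle: listen, look, pick, place, return home."""
    color = get_voice_command()              # e.g. "red", "green", "yellow"
    if color is None:
        return False                         # no recognized command; try again

    frame = acquire_image()                  # grab a frame from the USB camera
    position = locate_color(frame, color)    # color/template match -> (x, y) or None
    if position is None:
        return False                         # requested color not in the scene

    move_arm("rotate_to", position)          # base motor (DC Motor 1)
    move_arm("lower_elbow")                  # elbow motor (DC Motor 2)
    move_arm("close_gripper")                # gripper motor (DC Motor 3)
    move_arm("raise_elbow")
    move_arm("rotate_to_container", color)   # container matching the spoken color
    move_arm("lower_elbow")
    move_arm("open_gripper")                 # release the object
    move_arm("home")                         # wait for the next command
    return True
```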

III. THE SYSTEM IMPLEMENTATION
The overall implementation of the system is described in the two sections below.

A. Software Section
The software plays a vital role in controlling the hardware according to the user's commands; the hardware alone cannot accomplish the intended task. This task is carried out by the application program developed in NI LabVIEW. The input to the voice controlled pick and place robot arm is the user's voice, which is converted to commands by the speech recognition system built into the Windows OS. These commands are read by the host program, which writes them to a shared variable linking the host program and the code running on the NI myRIO. Using the Wi-Fi hosting capability of the NI myRIO, the host device communicates with the myRIO wirelessly over a WLAN. The code on the myRIO continuously monitors updates to the shared variable and executes the appropriate case in its case structure.

The entire application program is developed in NI LabVIEW because it provides a wide range of predefined library functions [3] and other tools that make graphical programming, and therefore application development, easier. Application developers can combine LabVIEW software with modular, reconfigurable hardware to overcome the ever-increasing complexity of delivering measurement and control systems on time and under budget. The following sections briefly explain the working of the graphical code.

The voice command processing code accepts the user's command; in response to the recognized command, the IMAQ function [5] creates temporary memory to store the template image, and the respective color template is loaded into this memory using the IMAQ Read Image and Vision function, as shown in Fig.3.

Fig.2. Flowchart of the voice controlled three-axis color sensing pick and place robotic arm.
Fig.3. Voice command processing code.
Fig.4. Image acquisition from the camera and object position identification.
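The host-to-target link described above can be illustrated with a plain TCP analogue. The sketch below is not the paper's LabVIEW implementation: it substitutes a socket connection for the network-published shared variable and a Python dictionary dispatch for the LabVIEW case structure, and the target address and port are assumptions.

```python
import socket

MYRIO_ADDR = ("172.16.0.1", 5005)   # assumed myRIO WLAN address and port

def host_send_command(command: str) -> None:
    """Host side: push a recognized voice command ("red"/"green"/"yellow")
    to the target, analogous to writing the shared variable."""
    with socket.create_connection(MYRIO_ADDR, timeout=2.0) as conn:
        conn.sendall(command.encode("utf-8"))

def target_command_loop(handlers: dict) -> None:
    """Target side: wait for command updates and dispatch to the matching
    handler, analogous to the myRIO loop that polls the shared variable and
    executes the corresponding case in a case structure."""
    with socket.create_server(("", MYRIO_ADDR[1])) as server:
        while True:
            conn, _ = server.accept()
            with conn:
                command = conn.recv(64).decode("utf-8").strip().lower()
            handler = handlers.get(command)
            if handler is not None:
                handler()   # e.g. start the pick-and-place cycle for that color
```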

After the command is recognized, an image is acquired from the camera using the Vision Assistant function, and the image is searched for the requested color object using the IMAQ Match Color Pattern function. This function takes two inputs: the image from the camera and the template image, which is the reference color template searched for during the matching phase. Once the desired color object is recognized in the image, the position of the requested color object is determined, as shown in Fig.4; Fig.5 shows the corresponding front panel.

Fig.5. Front panel of the colored object position determination.

The 360° movement of the arm, the raising and lowering of the arm's elbow, and the opening and closing of the gripper are all supervised by the NI myRIO device, which runs Real-Time (RT) code. Based on the received input command and the determined object position, the RT code executing on the myRIO directs the arm accordingly. The RT code is developed using the functions available in NI LabVIEW. The code that creates and registers the event raised when the object's position is detected is shown in Fig.6; the event handler then provides the required control and coordination among the three DC motors mounted on the three axes. A part of the event handler code, containing the PWM Express VI enclosed inside a case structure, is shown in Fig.7. After obtaining the position information, the arm manoeuvres to the exact position and the gripper opens so that the object can be picked; the gripper then closes, the arm rotates until it reaches the correct color container, the elbow lowers itself into the container, and the gripper releases the object. The arm then returns to its initial position and waits for the next user command.

Fig.6. Event registration (RT code).
Fig.7. Event handler code controlling the arm movements (RT code).
Fig.8. Hardware overview of NI myRIO.
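As a rough analogue of the IMAQ Match Color Pattern step, the sketch below expresses the same search with OpenCV's template matching: a stored color template is slid over the acquired frame and the best-scoring location is returned. The file name, camera index, and match threshold are assumptions, and OpenCV's correlation-based matcher is only an approximation of the IMAQ color pattern matching algorithm used in the paper.

```python
import cv2

MATCH_THRESHOLD = 0.8   # assumed minimum normalized correlation score

def locate_color_object(template_path: str, camera_index: int = 0):
    """Return the (x, y) centre of the best match for the color template
    in one camera frame, or None if no sufficiently good match is found."""
    template = cv2.imread(template_path, cv2.IMREAD_COLOR)

    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok or template is None:
        return None

    # Slide the template over the frame and score each position.
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_score, _, max_loc = cv2.minMaxLoc(scores)
    if max_score < MATCH_THRESHOLD:
        return None

    h, w = template.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)   # centre of the match

# Example: find the red cube after the command "red" is recognized.
# position = locate_color_object("red_template.png")
```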

B. Hardware Section
To implement the system prototype, the following hardware devices/components have been used:
NI myRIO
USB camera
12 V motor drivers and 12 V DC motors (3 Nos.)
Gears and wooden pieces to build the arm

The NI myRIO-1900 provides several analog inputs (AI), analog outputs (AO), digital inputs and outputs (DIO), audio, and power outputs in a compact embedded device. It connects to a host computer over USB and wireless 802.11 b/g/n. Fig.8 shows the arrangement of the NI myRIO-1900 components. The NI myRIO-1900 is a portable reconfigurable I/O (RIO) device that can be used to design control, robotics, and mechatronics systems [3]. The myRIO also integrates in-built devices such as a UART, an accelerometer, an FPGA, and a dual-core ARM Cortex-A9 real-time processor (a Xilinx Zynq system on a chip running a Linux real-time OS) [4].

A DC motor is an electrical machine that converts direct-current electrical power into mechanical rotation, as shown in Fig.9. The DC motors often need to be driven by an external driver to boost the current level. The driver board used here controls up to two 12 V DC motors individually; each motor can be driven at a maximum of 750 mA, a decent driving current for these motors, as shown in Fig.9. The driver supports both clockwise and anti-clockwise rotation as well as speed control, and it can easily be interfaced with a microcontroller such as an Arduino or an 8051, or with any data acquisition system.

Fig.9. The 12 V motor driver and the 12 V DC motor.

The design uses DC motors for the movement of the robotic arm about all three axes, and the desired arm motion is obtained by controlling the motors individually using the PWM Express VI. This VI generates the pulse-width-modulated signal fed to the base motor (DC Motor 1), whose duty cycle is adjusted dynamically so that the arm positions itself with respect to the detected object position according to Table I. Once the arm is positioned on top of the object, the next task is to lower the elbow (PWM control of DC Motor 2) so that the object can be reached; finally, by opening the arm's gripper (PWM control of DC Motor 3), the object is grabbed. The next phase is to raise the elbow and rotate the arm (PWM control of DC Motor 1) until it reaches the appropriate color container; the elbow then lowers itself into the container and the gripper opens to place the object.

TABLE I: Motor Control.

The wide-angle USB camera is used to acquire the input image of the objects. Gears and wooden pieces are used to build the prototype of the three-axis robotic arm.

IV. RESULTS AND OBSERVATIONS
The prototype, shown in Figs.10 and 11, accepts Red, Green, and Yellow as voice commands, which determine the directions of motion of the DC motors. The arm then picks the requested color object and places it in the respective color container.

Fig.10. Prototype showing the elbow and gripper functions.
Fig.11. Prototype showing the camera position and arm movement.
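The duty-cycle adjustment described in the hardware section above can be sketched as a simple mapping from the detected horizontal position of the object to a PWM duty cycle for the base motor (DC Motor 1). The thresholds and duty-cycle values below are illustrative placeholders, not the values of Table I (which is not reproduced here), and writing the duty cycle to real hardware would go through the myRIO PWM Express VI rather than this stub.

```python
# Illustrative mapping of object position -> base-motor PWM duty cycle.
# The frame width, gains, and clamp limits are assumed values, not those of Table I.

FRAME_WIDTH = 640          # assumed camera resolution (pixels)

def base_motor_duty_cycle(object_x: int) -> float:
    """Return a PWM duty cycle (0.0-1.0) that swings the base motor harder
    the farther the object lies from the arm's current heading (assumed to
    correspond to the frame centre)."""
    error = (object_x - FRAME_WIDTH / 2) / (FRAME_WIDTH / 2)    # -1.0 .. +1.0
    duty = 0.5 + 0.4 * error         # 0.5 = hold position, +/-0.4 = full-speed swing
    return min(max(duty, 0.1), 0.9)  # clamp to the driver's usable range (assumed)

def set_base_motor(write_pwm, object_x: int) -> None:
    """write_pwm is a placeholder for the PWM output call on the target."""
    write_pwm(channel=0, duty_cycle=base_motor_duty_cycle(object_x))
```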

V. CONCLUSION
An intelligent voice controlled robotic arm has been designed and implemented using NI myRIO and NI LabVIEW. The design gives the arm enhanced color sensing capability and supports a user-friendly voice interface that can serve a large user base, while offering good precision in determining the object's position and in picking and placing the object.

VI. REFERENCES
[1] Kader Ibrahim, "Internet Controlled Robotic Arm," Engineering Procedia, 2012, pp. 1072-1078.
[2] Dhanoj M., et al., "Colour Sensor Based Object Sorting Robot Using Embedded System," International Journal of Advanced Research in Computer and Communication Engineering, Vol. 4, Issue 4, April 2015.
[3] Jeffrey Travis and Jim Kring, LabVIEW for Everyone: Graphical Programming Made Easy and Fun, Third Edition, Prentice Hall Professional, 2007.
[4] NI myRIO-1900 User Guide and Specifications, National Instruments, 2013.
[5] Thomas Klinger, Image Processing with LabVIEW and IMAQ Vision, Prentice Hall, June 2003.