ROBOTIC SURVEILLANCE

PROJECT REFERENCE NO.: 37S1070
COLLEGE: PES INSTITUTE OF TECHNOLOGY, BANGALORE
BRANCH: TELECOMMUNICATION ENGINEERING
GUIDE: SUSHMA RAWAL
STUDENTS: SHASHANK C, VAISHAKH SHIRUR, DHANUSH D M

Keywords: Robot vehicle system, military surveillance, hostage situations, surprise enemy attack, camera, sensor, image, distance, ZigBee.

Introduction: Human life is precious. This microcontroller-based robotic vehicle system is a sincere effort to save lives in dangerous fields such as military operations and hostage situations. During war, or when soldiers enter uncharted territory, they are sometimes victims of surprise enemy attack. This robotic vehicle offers a new way to trace enemies and use that information to make a tactical move. It carries the accessories needed to locate enemies, such as a camera and a sensor; these help to find the enemies' positions and save lives that might otherwise be lost.

The Robotic Surveillance project implements a system for real-time military applications. The robot's wheels, driven by high-power motors and a battery, are designed so that the vehicle can move over all terrain. The movement of the robot can be controlled remotely by the operator. The whole scene can be visualized through the camera, and a ZigBee transceiver links the robot to the base station. The distance to the enemy is determined by a proximity (ultrasonic) sensor and transmitted to a laptop via the ZigBee module.

Objectives of the project: The main objective of the project is to design a system useful in real-time military applications. The robot carries the necessary components, namely a camera and a sensor. Its movement is controlled from the base station, where the real-time video appears on the screen. Images can be captured and stored in a database. Captured
images can be compared with the images stored in the database, and based on the result a decision is taken on the direction in which the robot should move. The distance to the enemy is found using the ultrasonic sensor, and the value is transmitted to the base station via the ZigBee module. The whole scene can be visualized through the camera mounted on the robot. The GUIs for the camera, the robot's movement, and the sensor run at the base station.

Methodology:

Materials: Camera, DC motors, stepper motor, ZigBee module, 8051 microcontroller, motor driver, ultrasonic sensor, analog-to-digital converter.

The system consists of an 8051 microcontroller, a ZigBee module, motor drivers for the DC motors that move the robot, a camera to capture images, and an ultrasonic sensor to find the distance to an object. Communication between the base station and the robot happens over ZigBee; communication between the controller, the drivers, and the analog-to-digital converter happens through the ports of the 8051.

The camera mounted on the robot is used to view the surroundings. Its rotational angle is controlled by a stepper motor, and communication between the camera and its graphical interface is serial. The camera takes real-time video, and captured images are compared with the images stored in the database. After verifying a captured image against the database, appropriate commands are issued to the robot through the GUI.

The drivers control the wheel motors, and the ultrasonic sensor measures the distance to the objects of interest. The value obtained from the sensor is analog, so an analog-to-digital converter converts it to digital form, and the controller transmits it to the user interface over ZigBee. A graphical user interface is provided to control the movement of the robot wirelessly.
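To make the sensor pipeline concrete, the sketch below converts a raw ADC count into a distance reading before it is sent to the base station. This is an illustration only, not the project's firmware: the 8-bit ADC width, the 5 V reference, and the 10 mV/cm sensor scale are all assumptions introduced here for the example.

```python
ADC_BITS = 8            # assumed: an 8-bit ADC paired with the 8051
V_REF = 5.0             # assumed ADC reference voltage, in volts
CM_PER_VOLT = 100.0     # assumed sensor scale: 10 mV per cm of range

def adc_to_distance_cm(adc_count: int) -> float:
    """Convert a raw ADC count from the ultrasonic sensor to distance in cm."""
    if not 0 <= adc_count < 2 ** ADC_BITS:
        raise ValueError("count out of range for an 8-bit ADC")
    # Scale the count back to the sampled voltage, then apply the sensor scale.
    voltage = adc_count * V_REF / (2 ** ADC_BITS - 1)
    return voltage * CM_PER_VOLT

# Under these assumptions, a count of 51 corresponds to 1.0 V, i.e. 100 cm.
print(adc_to_distance_cm(51))
```

The digital value produced this way is what the controller would push over the ZigBee link to the base-station GUI.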
Results and Conclusions:

Fig. 1: Main GUI of the robot. Fig. 1.1: GUI of the camera.

Fig. 2: A book (the object) as seen by the camera mounted on the robot, after carefully positioning the camera angle using the camera controls.
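The main GUI drives the robot by sending movement commands over the ZigBee serial link. The project's actual front end is written in Visual Basic; as a hedged sketch of how such a link might frame its commands, here is one plausible single-byte protocol with start/end delimiters and a checksum (the command bytes and framing are hypothetical, not taken from the project):

```python
# Hypothetical command set for the base-station -> robot link.
COMMANDS = {"FORWARD": b"F", "REVERSE": b"B", "LEFT": b"L",
            "RIGHT": b"R", "STOP": b"S"}

def frame_command(name: str) -> bytes:
    """Wrap a command byte between STX/ETX delimiters with a checksum byte."""
    payload = COMMANDS[name]
    checksum = sum(payload) % 256
    return b"\x02" + payload + bytes([checksum]) + b"\x03"

def parse_frame(frame: bytes) -> str:
    """Validate delimiters and checksum, then return the command name."""
    if frame[0] != 0x02 or frame[-1] != 0x03:
        raise ValueError("bad frame delimiters")
    payload, checksum = frame[1:-2], frame[-2]
    if sum(payload) % 256 != checksum:
        raise ValueError("bad checksum")
    for name, byte in COMMANDS.items():
        if byte == payload:
            return name
    raise ValueError("unknown command")
```

A simple checksum like this lets the robot-side controller discard frames corrupted on the wireless link instead of acting on them.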
Fig. 3 (perfect match): Feature matching as used in the project. The features of the object captured by the camera are matched against the image in the database; here all the dominant features of the object match the stored image, giving a perfect match.

Fig. 4 (image mismatch): A mismatch between the object under consideration and the object in the database. In this case none of the features match, resulting in a complete mismatch.

Conclusions: This project is our sincere effort to simplify the work of military personnel in tackling problems such as hostage situations. The project integrates a rotation-controlled camera with a remotely controlled robot; together with a sensor, this can be used to handle a hostage situation effectively. We used a ZigBee transceiver to achieve wide-range wireless communication.
The front end (GUI) is developed in Visual Basic to provide easy access to both the camera and the robot. Another important technique used is feature matching (done in MATLAB), which lets us match captured objects against the images stored in the database.

Scope for Future Work: The robot can be fitted with a metal detector to survey minefields or to detect bombs. Night vision can be added so that the robot is useful at night. The movement of the robot could be made completely voice-controlled instead of being driven from the laptop. The camera, which is currently wired, can be made wireless. The robot can also be fitted with a GPS tracking system to report its exact location while covering large areas. A ZigBee module with a range of up to 2 km can be used for wide-area coverage.
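The feature matching described above (implemented in MATLAB in the project) follows a common pattern: each feature descriptor from the captured image is matched to its nearest neighbour in the database, and ambiguous matches are rejected with a ratio test. As a language-agnostic sketch of that idea, here is a minimal Python version operating on plain numeric descriptors (the descriptors, ratio threshold, and helper names are illustrative, not the project's code; it assumes the database holds at least two descriptors):

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def match_features(query, database, ratio=0.8):
    """Nearest-neighbour descriptor matching with a ratio test.

    query/database: lists of feature descriptors (tuples of floats).
    Returns (query_index, database_index) pairs for unambiguous matches.
    """
    matches = []
    for qi, q in enumerate(query):
        # Rank database descriptors by distance to this query descriptor.
        ranked = sorted(range(len(database)), key=lambda di: dist(q, database[di]))
        best, second = ranked[0], ranked[1]
        # Keep the match only if the best is clearly closer than the runner-up.
        if dist(q, database[best]) < ratio * dist(q, database[second]):
            matches.append((qi, best))
    return matches
```

With identical scenes, most descriptors pass the test (the "perfect match" of Fig. 3); with unrelated scenes, few or none do (the mismatch of Fig. 4), and that count is what drives the accept/reject decision.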