Title: INVESTIGATION OF COMPUTER VISION TECHNIQUES FOR ROBOT NAVIGATION
Authors: Chauhan, Shobhraj Singh
Keywords: ELECTRICAL ENGINEERING;COMPUTER VISION TECHNIQUES;ROBOT NAVIGATION;WEB-BASED STEREO CAMERA
Issue Date: 2009
Abstract: Accurate localization and mapping is a problem of great significance in the mobile robotics community, especially in the context of nuclear research applications, where a teleoperated (or sometimes autonomous) agent is dispatched into an unknown, hazardous radioactive environment to collect sensory data or material on a time-critical task. The environments surveyed are almost always unstructured and previously unknown, so localization is essential. This report presents an implementation of a visual odometry system as described by David Nister, using a stereo head built from iBall C2.0 web-cameras (5G lenses, 2.0M interpolated pixels), running on a Pentium Dual Core computer with 2 GB RAM. Visual odometry is the process of determining the position and orientation of a camera by analyzing a sequence of images obtained from it. In mobile robotics, localization plays a fundamental role in the navigation task, since it is necessary for every kind of path planning. Stereo visual odometry has also been used on Mars since early 2004. The designed system tracks the 3D position of a calibrated stereo head as it moves through the environment. Harris corner features are identified in the video stream, matched between pairs of frames, and linked into image trajectories. Robust estimates of the camera motion are then produced from the feature tracks using a geometric hypothesize-and-test architecture. The 3D positions of points are used directly to estimate the camera pose, since triangulation of the matched points continuously gives the 3D location of each world point. The system is able to estimate movement in the X and Y directions of the camera reference frame with a maximum error of 20.51 mm over a 1340 mm run. A random drift in the Z dimension (along the camera optical ray) prevents the extraction of accurate movement data along that axis when the camera is moved in the Z direction. Results from offline tests indicate the promising potential of the system.
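
The abstract describes a stereo visual odometry pipeline: Harris corners are detected, tracked between frames, triangulated to 3D with the calibrated stereo geometry, and the camera motion is estimated robustly from the resulting point sets. The following is a minimal sketch of one such step using OpenCV; the projection matrices, the feature-tracking parameters, and the use of estimateAffine3D as a stand-in for the thesis's geometric hypothesize-and-test stage are illustrative assumptions, not the exact method or values used in the thesis.

    # Minimal sketch of one stereo visual odometry step (assumed OpenCV-based).
    # P_left / P_right are the calibrated 3x4 projection matrices of the stereo head.
    import cv2
    import numpy as np

    def harris_corners(gray, max_corners=500):
        # Detect Harris corners via goodFeaturesToTrack with the Harris detector enabled.
        pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners, qualityLevel=0.01,
                                      minDistance=7, useHarrisDetector=True, k=0.04)
        return pts.reshape(-1, 2) if pts is not None else np.empty((0, 2), np.float32)

    def track(prev_gray, next_gray, prev_pts):
        # Link corners between consecutive frames with pyramidal Lucas-Kanade tracking.
        p0 = prev_pts.astype(np.float32).reshape(-1, 1, 2)
        p1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, p0, None)
        good = status.ravel() == 1
        return p0[good].reshape(-1, 2), p1[good].reshape(-1, 2)

    def triangulate(P_left, P_right, pts_left, pts_right):
        # Triangulate matched left/right corners to 3D world points (homogeneous -> Euclidean).
        hom = cv2.triangulatePoints(P_left, P_right,
                                    pts_left.T.astype(np.float64),
                                    pts_right.T.astype(np.float64))
        return (hom[:3] / hom[3]).T  # N x 3 points

    def estimate_motion(pts3d_prev, pts3d_cur):
        # Robust (RANSAC, hypothesize-and-test style) fit of a 3D transform between
        # the point sets from two time steps; illustrative substitute for the
        # thesis's motion estimator.
        ok, T, inliers = cv2.estimateAffine3D(pts3d_prev, pts3d_cur, ransacThreshold=3.0)
        return T if ok else None

Chaining these steps frame by frame and accumulating the estimated transforms gives the camera trajectory in the stereo head's reference frame, which is the quantity the thesis evaluates against the X, Y, and Z ground-truth motion.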
URI: http://hdl.handle.net/123456789/2772
Other Identifiers: M.Tech
Research Supervisor/ Guide: Kumar, Vinod
Agarwal, Pramod
metadata.dc.type: M.Tech Dissertation
Appears in Collections:MASTERS' THESES (Electrical Engg)

Files in This Item:
File: EEDG14602.pdf | Size: 9.76 MB | Format: Adobe PDF

