Please use this identifier to cite or link to this item: http://localhost:8081/xmlui/handle/123456789/2772
Full metadata record
DC Field | Value | Language
dc.contributor.author | Chauhan, Shobhraj Singh | -
dc.date.accessioned | 2014-09-29T06:05:25Z | -
dc.date.available | 2014-09-29T06:05:25Z | -
dc.date.issued | 2009 | -
dc.identifier | M.Tech | en_US
dc.identifier.uri | http://hdl.handle.net/123456789/2772 | -
dc.guide | Kumar, Vinod | -
dc.guide | Agarwal, Pramod | -
dc.description.abstract | Accurate localization and mapping is a problem of great significance in the mobile robotics community, especially in the context of nuclear research applications, where a teleoperated (or sometimes autonomous) agent is dispatched into an unknown, hazardous radioactive environment to collect sensory data or material on a time-critical task. The environments surveyed are almost always unstructured and previously unknown, so localization is essential. This report presents an implementation of a visual odometry system as described by David Nistér, using a stereo head built from iBall C2.0 webcams with 5G lenses and 2.0M interpolated pixels, running on a Pentium dual-core computer with 2 GB of RAM. Visual odometry is the process of determining the position and orientation of a camera by analyzing a sequence of images obtained from it. In mobile robotics, localization plays a fundamental role in the navigation task, since it is necessary for every kind of path planning. Stereo visual odometry has also been used on Mars since early 2004. The designed system tracks the 3D position of a calibrated stereo head as it moves through the environment. Harris corner features are identified in the video stream, matched between pairs of frames, and linked into image trajectories. Robust estimates of the camera motion are then produced from the feature tracks using a geometric hypothesize-and-test architecture. The camera position is estimated directly from the 3D points, since triangulation of matched points continuously gives the 3D locations of world points. The system is able to estimate movement in the X and Y directions of the camera reference frame with a maximum error of 20.51 mm over a 1340 mm run. A random drift in the Z dimension (along the camera's optical ray) prevents the extraction of accurate movement data along that axis when the camera is moved in the Z direction. Results from offline tests indicate the system's promising potential. (A sketch of this pipeline follows the record below.) | en_US
dc.language.iso | en | en_US
dc.subject | ELECTRICAL ENGINEERING | en_US
dc.subject | COMPUTER VISION TECHNIQUES | en_US
dc.subject | ROBOT NAVIGATION | en_US
dc.subject | WEB-BASED STEREO CAMERA | en_US
dc.title | INVESTIGATION OF COMPUTER VISION TECHNIQUES FOR ROBOT NAVIGATION | en_US
dc.type | M.Tech Dissertation | en_US
dc.accession.number | G14602 | en_US
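
The pipeline described in the abstract (Harris corner detection, frame-to-frame feature tracking, stereo triangulation, and hypothesize-and-test motion estimation) can be sketched with standard OpenCV calls. This is a minimal illustration under stated assumptions, not the thesis's implementation: all parameter values are placeholders, and cv2.solvePnPRansac stands in here for the preemptive RANSAC pose estimation of Nistér's system.

    import cv2
    import numpy as np

    def detect_harris_corners(gray, max_corners=500):
        # Harris corners, returned as Nx1x2 float32 points suitable for LK tracking.
        return cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                       qualityLevel=0.01, minDistance=8,
                                       useHarrisDetector=True, k=0.04)

    def track_corners(prev_gray, next_gray, pts):
        # Link corners between consecutive frames with pyramidal Lucas-Kanade flow.
        nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)
        ok = status.ravel() == 1
        return pts[ok], nxt[ok]

    def triangulate(P_left, P_right, pts_left, pts_right):
        # Stereo triangulation: matched left/right corners to Nx3 world points.
        # P_left, P_right are the 3x4 projection matrices of the calibrated head.
        X_h = cv2.triangulatePoints(P_left, P_right,
                                    pts_left.reshape(-1, 2).T,
                                    pts_right.reshape(-1, 2).T)
        return (X_h[:3] / X_h[3]).T

    def estimate_motion(pts3d, pts2d, K):
        # Hypothesize-and-test pose estimation: PnP inside a RANSAC loop
        # (an assumed stand-in for the thesis's geometric architecture).
        ok, rvec, tvec, inliers = cv2.solvePnPRansac(
            pts3d.astype(np.float64),
            pts2d.reshape(-1, 1, 2).astype(np.float64),
            K, None, reprojectionError=2.0)
        R, _ = cv2.Rodrigues(rvec)
        return R, tvec, inliers

Chaining the per-frame pose estimates gives the camera trajectory; as the abstract notes, drift along the optical (Z) axis is the dominant error source in practice.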
Appears in Collections: MASTERS' THESES (Electrical Engg)

Files in This Item:
File | Description | Size | Format
EEDG14602.pdf | - | 9.76 MB | Adobe PDF

