Vision Based Robotic Control
Norwegian University of Science and Technology, Faculty of Engineering Science and Technology, Department of Production and Quality Engineering.
2014 (English). Master's thesis (Masteroppgave).
Abstract [en]

The main objective of this project was to create a visual system for object tracking and to implement it in the new robotics lab at the Department of Production and Quality Engineering. Two identical KUKA Agilus manipulators with six rotational axes are used. The robot kinematics with the corresponding Denavit-Hartenberg representation are presented. The position-based visual servoing method used for robotic control is shown, with the control loop and the relations between the different coordinate bases and frames in the system. Computer vision is a large part of this thesis. The different camera parameters, and how to obtain them, are explained. The results show the importance of accurate camera calibration, and that one should avoid the temptation to leave parameters out of the equations, as this can cause large measurement errors. The SIFT object detection method is explained, and its performance is compared with another method named SURF. The tests show that while SURF is faster than SIFT, it is outperformed in terms of robustness. SIFT was therefore chosen for implementation in the lab. With the object detected, the manipulator's movement must be calculated in order to place the camera in the desired pose, 300 mm perpendicular to the object. The algorithm calculates the rotational and translational offset between the current camera pose and the desired camera pose. A proportional regulator is then applied to calculate the next small step on the desired trajectory for the Agilus manipulator. The practical setup of the robot cell is explained, with each step needed to obtain a working vision system. Because the information flow in the system can be chaotic, a graphic representation was developed that shows all steps from image capture, through robotic movement, to plotting of the trajectories. Results are presented as plots for both the distance calculations and the actual movement of the manipulator.
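The proportional regulation step described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the thesis code: the gain value `KP` and the function name `next_step` are hypothetical, and only the translational part of the offset is shown; the 300 mm stand-off along the camera's optical axis follows the description above.

```python
import numpy as np

# Desired camera pose: 300 mm perpendicular to the object, i.e. the
# object should sit 300 mm along the camera's optical (z) axis.
DESIRED_OFFSET = np.array([0.0, 0.0, 300.0])  # mm, in the camera frame
KP = 0.1  # proportional gain (hypothetical tuning value)

def next_step(object_in_camera):
    """Proportional regulator: return the next small translational
    step toward the desired camera pose.

    object_in_camera -- measured object position in the camera frame (mm)
    """
    error = object_in_camera - DESIRED_OFFSET  # translational offset
    return KP * error  # small step along the desired trajectory

# Example: object measured 400 mm straight ahead -> 10 mm step forward.
step = next_step(np.array([0.0, 0.0, 400.0]))
```

Because each iteration commands only a fraction of the remaining offset, the camera converges smoothly toward the stand-off pose instead of jumping there in one motion, which matches the "next small step on the desired trajectory" behaviour described above.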
The visual system tracks and follows the object successfully. There are, however, some variations in the output from the object detection algorithm, which cause variations in the signal used as the reference for the robot. A filter was able to reduce these variations, but not eliminate them. Possible solutions are presented and are believed to improve the speed and accuracy of the system if investigated further.
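The record does not specify which filter was used to damp the detector noise; a minimal sketch, assuming a simple exponential moving-average (low-pass) filter over the measured object position:

```python
def smooth(signal, alpha=0.3):
    """Exponential moving average: y[k] = alpha*x[k] + (1-alpha)*y[k-1].

    Reduces frame-to-frame variation in the detected object position
    at the cost of added lag; alpha is a hypothetical tuning value
    (smaller alpha = smoother output, more lag).
    """
    out = []
    y = signal[0]  # initialise on the first measurement
    for x in signal:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out

# Noisy per-frame distance estimates (mm) from the detector.
noisy = [300.0, 310.0, 295.0, 305.0, 300.0]
filtered = smooth(noisy)
```

As the abstract notes, such a filter reduces but does not eliminate the variation: the filtered signal still follows the noise, only with smaller amplitude and some delay.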

Place, publisher, year, edition, pages
Institutt for produksjons- og kvalitetsteknikk, 2014. 100 p.
URN: urn:nbn:no:ntnu:diva-26206
Local ID: ntnudaim:11410
OAI: diva2:745627
Available from: 2014-09-10. Created: 2014-09-10. Last updated: 2014-09-10. Bibliographically approved.

Open Access in DiVA

fulltext (20482 kB), 1534 downloads
File name: FULLTEXT01.pdf. File size: 20482 kB. Checksum: SHA-512. Type: fulltext. Mimetype: application/pdf.

cover (595 kB), 7 downloads
File name: COVER01.pdf. File size: 595 kB. Checksum: SHA-512. Type: cover. Mimetype: application/pdf.

attachment (104614 kB), 21 downloads
File name: ATTACHMENT01.zip. File size: 104614 kB. Checksum: SHA-512. Type: attachment. Mimetype: application/zip.


