Learning to Assess Grasp Stability from Vision, Touch and Proprioception
KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
2012 (English). Doctoral thesis, monograph (Other academic).
Abstract [en]

Grasping and manipulation of objects is an integral part of a robot's physical interaction with the environment. To cope with real-world situations, sensor-based grasping and grasp stability estimation are important skills. This thesis addresses the problem of predicting the stability of a grasp from the percepts available to a robot once its fingers have closed around the object, before it attempts to lift it. If an unstable grasp is identified, a regrasping step can be triggered. The percepts considered are object features (visual), gripper configurations (proprioceptive) and tactile imprints (haptic) when the fingers contact the object. The thesis studies tactile-based stability estimation by applying machine learning methods such as Hidden Markov Models. An approach that integrates visual and tactile feedback using Kernel Logistic Regression models is also introduced to further improve the stability predictions.
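The HMM-based classification scheme mentioned above can be illustrated with a small sketch. Everything below is invented for illustration and is not the thesis's actual model: one HMM is assumed per outcome class (stable/unstable), tactile readings are reduced to a discrete symbol stream (0 = firm contact, 1 = slip), and a new sequence is labeled by comparing forward-algorithm likelihoods under the two models.

```python
import math

def forward_loglik(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm (no scaling; fine for short sequences)."""
    n = len(pi)
    # Initialisation: alpha[s] = P(first observation, state s)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        # Recursion: propagate through transitions, then weight by emission
        alpha = [B[s][o] * sum(alpha[t] * A[t][s] for t in range(n))
                 for s in range(n)]
    return math.log(sum(alpha))

# Hypothetical hand-set 2-state models (initial probs, transitions, emissions)
# over tactile symbols 0 = firm contact, 1 = slip.
STABLE   = ([0.8, 0.2], [[0.9, 0.1], [0.2, 0.8]], [[0.9, 0.1], [0.6, 0.4]])
UNSTABLE = ([0.5, 0.5], [[0.7, 0.3], [0.3, 0.7]], [[0.3, 0.7], [0.1, 0.9]])

def classify(obs):
    """Pick the class whose HMM assigns the sequence the higher likelihood."""
    if forward_loglik(obs, *STABLE) > forward_loglik(obs, *UNSTABLE):
        return "stable"
    return "unstable"
```

In practice the per-class models would be trained on recorded tactile imprints (e.g. via Baum-Welch) rather than hand-set as here.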

Like humans, robots are expected to grasp and manipulate objects in a goal-oriented manner: objects should be grasped so as to afford subsequent actions. If I am to hammer a nail, the hammer should be grasped so as to afford hammering. Most work on grasping addresses only the problem of finding a stable grasp, without considering the task or action the robot is supposed to carry out with the object. This thesis therefore also studies grasp stability assessment in a task-oriented way, based on a generative approach using probabilistic graphical models (Bayesian networks). High-level task information, introduced by a teacher in a supervised setting, is integrated with low-level stability requirements acquired through the robot's own exploration. The graphical model encodes probabilistic relationships between tasks and sensory data (visual, tactile and proprioceptive). This generative modeling approach enables both inference of appropriate grasping configurations and prediction of grasp stability. Overall, the results indicate that learning-based grasp stability assessment is applicable in realistic scenarios.
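The task-oriented inference idea can be sketched with a toy discrete network, Task → Config → Stable. All tasks, grasp configurations, and probability values below are invented for illustration (the thesis learns such distributions from data); the sketch only shows the kind of query a generative model supports: given a task, infer the grasp configuration most likely to be stable.

```python
# Hypothetical conditional probability tables for a tiny Bayesian network.
# None of these numbers come from the thesis; they are placeholders.
P_CONFIG_GIVEN_TASK = {
    "hand-over": {"top": 0.3, "side": 0.7},
    "pouring":   {"top": 0.8, "side": 0.2},
}
P_STABLE_GIVEN_CONFIG = {"top": 0.6, "side": 0.9}

def best_config(task):
    """argmax_c P(c | task, stable) via enumeration:
    P(c | task, stable) is proportional to P(c | task) * P(stable | c)."""
    scores = {c: p * P_STABLE_GIVEN_CONFIG[c]
              for c, p in P_CONFIG_GIVEN_TASK[task].items()}
    return max(scores, key=scores.get)
```

For "pouring" the prior strongly favours a top grasp, which outweighs its lower stability; for "hand-over" the more stable side grasp wins. The same enumeration generalises to richer networks that also condition on visual, tactile and proprioceptive variables.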

Place, publisher, year, edition, pages
Stockholm: KTH Royal Institute of Technology, 2012. vi, 99 p.
Series: Trita-CSC-A, ISSN 1653-5723; 2012:12
Keywords [en]
Robotic grasping, Machine Learning, Tactile Sensing
National Category
Computer Science
URN: urn:nbn:se:kth:diva-104035
ISBN: 978-91-7501-522-4
OAI: diva2:562726
Public defence
2012-11-14, F3, Lindstedtsvägen 26, Kungliga Tekniska Högskolan, Stockholm, 10:00 (English)
ICT - The Next Generation


Available from: 2012-10-26. Created: 2012-10-25. Last updated: 2013-04-15. Bibliographically approved.

Open Access in DiVA

Yasemin_Bekiroglu_thesis (18946 kB)
File information
File name: FULLTEXT01.pdf
File size: 18946 kB
Checksum: SHA-512
Type: fulltext
Mimetype: application/pdf

By author/editor: Bekiroglu, Yasemin
By organisation: Computer Vision and Active Perception, CVAP
