Digitala Vetenskapliga Arkivet

  • 1.
    Almeida, Diogo
    et al.
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL.
    Ambrus, Rares
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Caccamo, Sergio
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Chen, Xi
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Cruciani, Silvia
    Pinto Basto De Carvalho, Joao F
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Haustein, Joshua
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Marzinotto, Alejandro
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Vina, Francisco
    KTH.
    Karayiannidis, Yiannis
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Ögren, Petter
    KTH, School of Engineering Sciences (SCI), Mathematics (Dept.), Optimization and Systems Theory.
    Jensfelt, Patric
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL.
    Team KTH’s Picking Solution for the Amazon Picking Challenge 2016 (2017). In: Warehouse Picking Automation Workshop 2017: Solutions, Experience, Learnings and Outlook of the Amazon Robotics Challenge, 2017. Conference paper (Other (popular science, discussion, etc.))
    Abstract [en]

    In this work we summarize the solution developed by Team KTH for the Amazon Picking Challenge 2016 in Leipzig, Germany. The competition simulated a warehouse automation scenario and was divided into two tasks: a picking task, where a robot picks items from a shelf and places them in a tote, and a stowing task, the inverse, where the robot picks items from a tote and places them in a shelf. We describe our approach to the problem, starting from a high-level overview of our system and later delving into the details of our perception pipeline and our strategy for manipulation and grasping. The solution was implemented using a Baxter robot equipped with additional sensors.

  • 2.
    Almeida, Diogo
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Ambrus, Rares
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Caccamo, Sergio
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Chen, Xi
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Cruciani, Silvia
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Pinto Basto de Carvalho, Joao Frederico
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Haustein, Joshua
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Marzinotto, Alejandro
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Viña, Francisco
    Karayiannidis, Yiannis
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Ögren, Petter
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Jensfelt, Patric
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Team KTH’s Picking Solution for the Amazon Picking Challenge 2016 (2020). In: Advances on Robotic Item Picking: Applications in Warehousing and E-Commerce Fulfillment, Springer Nature, 2020, p. 53-62. Chapter in book (Other academic)
    Abstract [en]

    In this chapter we summarize the solution developed by team KTH for the Amazon Picking Challenge 2016 in Leipzig, Germany. The competition, which simulated a warehouse automation scenario, was divided into two parts: a picking task, where the robot picks items from a shelf and places them into a tote, and a stowing task, where the robot picks items from a tote and places them in a shelf. We describe our approach to the problem starting with a high-level overview of the system, delving later into the details of our perception pipeline and strategy for manipulation and grasping. The hardware platform used in our solution consists of a Baxter robot equipped with multiple vision sensors.

  • 3.
    Almeida, Diogo
    et al.
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Viña, Francisco E.
    KTH, School of Computer Science and Communication (CSC), Robotics, perception and learning, RPL. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Karayiannidis, Yiannis
    Bimanual Folding Assembly: Switched Control and Contact Point Estimation (2016). In: IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun: IEEE, 2016. Conference paper (Refereed)
    Abstract [en]

    Robotic assembly in unstructured environments is a challenging task due to the added uncertainties. These can be mitigated through the employment of assembly systems, which offer a modular approach to the assembly problem via the conjunction of primitives. In this paper, we use a dual-arm manipulator to execute a folding assembly primitive, in which two parts are brought into rigid contact and subsequently translated and rotated. A switched controller is employed to ensure that the relative motion of the parts follows the desired model while regulating the contact forces. The control is complemented with an estimator based on a Kalman filter, which tracks the contact point between the parts from force and torque measurements. Experimental results are provided, demonstrating the effectiveness of both the control and the contact point estimation.
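
    The abstract describes the estimator compactly; the following is a minimal sketch of how such a Kalman filter can be set up, assuming the standard rigid-contact torque model tau = r x f (the paper does not spell out its measurement model here, and the noise levels and random-walk state model are illustrative assumptions, not the published implementation):

        # Track the contact point r between two rigidly coupled parts from
        # wrist force/torque data. The torque measurement is linear in r:
        # tau = r x f = -S(f) r, with S(.) the cross-product matrix, so a
        # linear Kalman filter with a random-walk state model applies.
        import numpy as np

        def skew(v):
            """S(v) such that S(v) @ u == np.cross(v, u)."""
            return np.array([[0.0, -v[2], v[1]],
                             [v[2], 0.0, -v[0]],
                             [-v[1], v[0], 0.0]])

        class ContactPointKF:
            def __init__(self, q=1e-6, r_noise=1e-3):
                self.x = np.zeros(3)          # contact point estimate [m]
                self.P = np.eye(3)            # estimate covariance
                self.Q = q * np.eye(3)        # random-walk process noise
                self.R = r_noise * np.eye(3)  # torque measurement noise

            def update(self, f, tau):
                self.P = self.P + self.Q             # predict: state constant
                H = -skew(f)                         # tau = -S(f) r
                S = H @ self.P @ H.T + self.R
                K = self.P @ H.T @ np.linalg.inv(S)
                self.x = self.x + K @ (tau - H @ self.x)
                self.P = (np.eye(3) - K @ H) @ self.P
                return self.x

    Note that at any instant the component of r along f is unobservable; the estimate converges as the force direction varies over the folding motion.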

  • 4.
    Hang, Kaiyu
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Centre for Autonomous Systems, CAS. KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Vina, Francisco
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Colledanchise, Michele
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Pauwels, Karl
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Pieropan, Alessandro
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Kragic, Danica
    KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Robotics, Perception and Learning, RPL.
    Team CVAP’s Mobile Picking System at the Amazon Picking Challenge 2015 (2020). In: Advances on Robotic Item Picking: Applications in Warehousing and E-Commerce Fulfillment, Springer Nature, 2020, p. 1-12. Chapter in book (Other academic)
    Abstract [en]

    In this paper we present the system we developed for the Amazon Picking Challenge 2015 and discuss some of the lessons learned that may prove useful to researchers and future teams developing autonomous robot picking systems. For the competition we used a PR2 robot, a dual-arm research platform equipped with a mobile base and a variety of 2D and 3D sensors. We adopted a behavior tree to model the overall task execution, coordinating the different perception, localization, navigation, and manipulation activities of the system in a modular fashion. Our perception system, which detects and localizes the target objects in the shelf, consists of two components: one for detecting textured rigid objects using the SimTrack vision system, and one for detecting non-textured or non-rigid objects using RGB-D features. In addition, we designed a set of grasping strategies to enable the robot to reach and grasp objects inside the confined volume of the shelf bins. The competition was a unique opportunity to integrate the work of various researchers at the Robotics, Perception and Learning laboratory (formerly the Computer Vision and Active Perception Laboratory, CVAP) of KTH; it tested the performance of our robotic system and helped define the future direction of our research.
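
    As a rough illustration of the coordination mechanism mentioned above, here is a minimal behavior-tree sketch; the node and action names are hypothetical, not Team CVAP's actual tree:

        # A Sequence ticks children in order and fails fast; a Fallback
        # tries alternatives until one succeeds, mirroring how detection
        # can fall back from the textured-object detector to the RGB-D one.
        SUCCESS, FAILURE, RUNNING = "SUCCESS", "FAILURE", "RUNNING"

        class Sequence:
            def __init__(self, children): self.children = children
            def tick(self):
                for child in self.children:
                    status = child.tick()
                    if status != SUCCESS:
                        return status      # propagate FAILURE or RUNNING
                return SUCCESS

        class Fallback:
            def __init__(self, children): self.children = children
            def tick(self):
                for child in self.children:
                    status = child.tick()
                    if status != FAILURE:
                        return status      # first non-failing child wins
                return FAILURE

        class Action:
            def __init__(self, name, fn): self.name, self.fn = name, fn
            def tick(self): return self.fn()

        # One picking attempt: detect the item (with a fallback detector),
        # then plan and execute the grasp. Lambdas stand in for real skills.
        pick_item = Sequence([
            Fallback([Action("detect_textured", lambda: FAILURE),
                      Action("detect_rgbd", lambda: SUCCESS)]),
            Action("plan_grasp", lambda: SUCCESS),
            Action("execute_pick", lambda: SUCCESS),
        ])
        assert pick_item.tick() == SUCCESS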

  • 5.
    Karayiannidis, Yiannis
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. Chalmers, Sweden.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Barrientos, Francisco Eli Vina
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Ögren, Petter
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    An Adaptive Control Approach for Opening Doors and Drawers Under Uncertainties (2016). In: IEEE Transactions on Robotics, ISSN 1552-3098, E-ISSN 1941-0468, Vol. 32, no. 1, p. 161-175. Article in journal (Refereed)
    Abstract [en]

    We study the problem of robot interaction with mechanisms that afford one degree of freedom motion, e.g., doors and drawers. We propose a methodology for simultaneous compliant interaction and estimation of constraints imposed by the joint. Our method requires no prior knowledge of the mechanisms' kinematics, including the type of joint, prismatic or revolute. The method consists of a velocity controller that relies on force/torque measurements and estimation of the motion direction, the distance, and the orientation of the rotational axis. It is suitable for velocity controlled manipulators with force/torque sensor capabilities at the end-effector. Forces and torques are regulated within given constraints, while the velocity controller ensures that the end-effector of the robot moves with a task-related desired velocity. We give proof that the estimates converge to the true values under valid assumptions on the grasp, and error bounds for setups with inaccuracies in control, measurements, or modeling. The method is evaluated in different scenarios involving opening a representative set of door and drawer mechanisms found in household environments.
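
    The same control structure recurs in several of the door-opening papers below. As a heavily simplified planar sketch of the idea (a crude stand-in for the published adaptive estimator, with illustrative gains and the added assumption that the constraint force at the handle points along the radial, handle-to-hinge direction):

        # One control tick: re-estimate the radial direction from the
        # measured constraint force, then command tangential motion plus
        # a small force-regulation term along the radial direction.
        import numpy as np

        def door_opening_tick(f_meas, r_hat, v_t=0.05, f_des=0.0,
                              k_f=0.002, alpha=0.1):
            f_norm = np.linalg.norm(f_meas)
            if f_norm > 1e-6:
                # Low-pass filter the force direction into the radial estimate.
                r_hat = (1 - alpha) * r_hat + alpha * (f_meas / f_norm)
                r_hat = r_hat / np.linalg.norm(r_hat)
            t_hat = np.array([-r_hat[1], r_hat[0]])  # planar tangent (opening
                                                     # direction illustrative)
            f_r = f_meas @ r_hat                     # radial force component
            v_cmd = v_t * t_hat + k_f * (f_des - f_r) * r_hat
            return v_cmd, r_hat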

  • 6.
    Karayiannidis, Yiannis
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Online Contact Point Estimation for Uncalibrated Tool Use (2014). In: Robotics and Automation (ICRA), 2014 IEEE International Conference on, IEEE Robotics and Automation Society, 2014, p. 2488-2493. Conference paper (Refereed)
    Abstract [en]

    One of the big challenges for robots working outside of traditional industrial settings is the ability to robustly and flexibly grasp and manipulate tools for various tasks. When a tool is interacting with another object during task execution, several problems arise: the tool can be partially or completely occluded from the robot's view, and it can slip or shift in the robot's hand, so the robot may lose information about the exact position of the tool in the hand. There is thus a need for online calibration and/or recalibration of the tool. In this paper, we present a model-free online tool-tip calibration method that uses force/torque measurements and an adaptive estimation scheme to estimate the point of contact between a tool and the environment. An adaptive force control component guarantees that interaction forces are limited even before the contact point estimate has converged. We also show how to simultaneously estimate the location and normal direction of the surface being touched by the tool-tip as the contact point is estimated. The stability of the overall scheme and the convergence of the estimated parameters are proven theoretically, and the performance is evaluated in experiments on a real robot.
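
    The measurement model is the same rigid-contact relation used in the Kalman-filter sketch under item 3 (tau = r x f). A gradient-descent stand-in for the paper's adaptive estimation scheme, omitting the force-limiting component and with an illustrative gain, looks as follows:

        # One adaptation step for the tool-tip contact point estimate r_hat.
        # The torque prediction error e = tau - r_hat x f drives the update;
        # the gradient of 0.5*||e||^2 w.r.t. r_hat points along -(f x e), so
        # descent adds gamma * (f x e). Components of r along f are
        # instantaneously unobservable and are corrected only as the force
        # direction varies over time.
        import numpy as np

        def contact_point_step(r_hat, f, tau, gamma=1e-3):
            e = tau - np.cross(r_hat, f)
            return r_hat + gamma * np.cross(f, e)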

  • 7.
    Karayiannidis, Yiannis
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Online Kinematics Estimation for Active Human-Robot Manipulation of Jointly Held Objects (2013). In: 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2013, p. 4872-4878. Conference paper (Refereed)
    Abstract [en]

    This paper introduces a method for estimating the constraints imposed by a human agent on a jointly manipulated object. These estimates can be used to infer knowledge of where the human is grasping an object, enabling the robot to plan trajectories for manipulating the object while subject to the constraints. We describe the method in detail, motivate its validity theoretically, and demonstrate its use in co-manipulation tasks with a real robot.

  • 8.
    Karayiannidis, Yiannis
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Ögren, Petter
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Design of force-driven online motion plans for door opening under uncertainties (2012). In: Workshop on Real-time Motion Planning: Online, Reactive, and in Real-time, 2012. Conference paper (Refereed)
    Abstract [en]

    The problem of door opening is fundamental for household robotic applications. Domestic environments are generally less structured than industrial environments and thus several types of uncertainties associated with the dynamics and kinematics of a door must be dealt with to achieve successful opening. This paper proposes a method that can open doors without prior knowledge of the door kinematics. The proposed method can be implemented on a velocity-controlled manipulator with force sensing capabilities at the end-effector. The velocity reference is designed by using feedback of force measurements while constraint and motion directions are updated online based on adaptive estimates of the position of the door hinge. The online estimator is appropriately designed in order to identify the unknown directions. The proposed scheme has theoretically guaranteed performance which is further demonstrated in experiments on a real robot. Experimental results additionally show the robustness of the proposed method under disturbances introduced by the motion of the mobile platform.

  • 9.
    Karayiannidis, Yiannis
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Ögren, Petter
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Interactive perception and manipulation of unknown constrained mechanisms using adaptive control (2013). In: ICRA 2013 Mobile Manipulation Workshop on Interactive Perception, 2013. Conference paper (Refereed)
  • 10.
    Karayiannidis, Yiannis
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Ögren, Petter
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Model-free robot manipulation of doors and drawers by means of fixed-grasps (2013). In: 2013 IEEE International Conference on Robotics and Automation (ICRA), New York: IEEE, 2013, p. 4485-4492. Conference paper (Refereed)
    Abstract [en]

    This paper addresses the problem of robot interaction with objects attached to the environment through joints such as doors or drawers. We propose a methodology that requires no prior knowledge of the objects’ kinematics, including the type of joint - either prismatic or revolute. The method consists of a velocity controller which relies on force/torque measurements and estimation of the motion direction, rotational axis and the distance from the center of rotation. The method is suitable for any velocity controlled manipulator with a force/torque sensor at the end-effector. The force/torque control regulates the applied forces and torques within given constraints, while the velocity controller ensures that the end-effector moves with a task-related desired tangential velocity. The paper also provides a proof that the estimates converge to the actual values. The method is evaluated in different scenarios typically met in a household environment.

  • 11.
    Karayiannidis, Yiannis
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Ögren, Petter
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    "Open Sesame!" Adaptive Force/Velocity Control for Opening Unknown Doors2012In: Intelligent Robots and Systems (IROS), 2012 IEEE/RSJ International Conference on, IEEE , 2012, p. 4040-4047Conference paper (Refereed)
    Abstract [en]

    The problem of door opening is fundamental for robots operating in domestic environments. Since these environments are generally less structured than industrial environments, several types of uncertainties associated with the dynamics and kinematics of a door must be dealt with to achieve successful opening. This paper proposes a method that can open doors without prior knowledge of the door kinematics. The proposed method can be implemented on a velocity-controlled manipulator with force sensing capabilities at the end-effector. The method consists of a velocity controller which uses force measurements and estimates of the radial direction based on adaptive estimates of the position of the door hinge. The control action is decomposed into an estimated radial and tangential direction following the concept of hybrid force/motion control. A force controller acting within the velocity controller regulates the radial force to a desired small value while the velocity controller ensures that the end effector of the robot moves with a desired tangential velocity leading to task completion. This paper also provides a proof that the adaptive estimates of the radial direction converge to the actual radial vector. The performance of the control scheme is demonstrated in both simulation and on a real robot.

  • 12.
    Vina, Francisco
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Karayiannidis, Yiannis
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Pauwels, Karl
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    In-hand manipulation using gravity and controlled slip (2015). In: Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on, IEEE conference proceedings, 2015, p. 5636-5641. Conference paper (Refereed)
    Abstract [en]

    In this work we propose a sliding mode controller for in-hand manipulation that repositions a tool in the robot’s hand by using gravity and controlling the slippage of the tool. In our approach, the robot holds the tool with a pinch grasp and we model the system as a link attached to the gripper via a passive revolute joint with friction, i.e., the grasp only affords rotational motions of the tool around a given axis of rotation. The robot controls the slippage by varying the opening between the fingers in order to allow the tool to move to the desired angular position following a reference trajectory. We show experimentally how the proposed controller achieves convergence to the desired tool orientation under variations of the tool’s inertial parameters.
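
    A toy version of the idea, under assumed pendulum-like dynamics and with all parameters illustrative (this is not the published sliding mode law), can be sketched as a grip-force command that loosens when the tool lags its gravity-driven reference and tightens when it runs ahead:

        # theta is the tool angle, with gravity assumed to drive it in the
        # positive direction; s is a sliding-mode-style combined error, and
        # tanh smooths the switching term. Larger grip -> more friction ->
        # more braking of the gravity-driven rotation.
        import numpy as np

        def grip_force(theta, theta_dot, theta_ref, theta_ref_dot,
                       lam=5.0, f_mid=8.0, f_swing=6.0):
            s = (theta_dot - theta_ref_dot) + lam * (theta - theta_ref)
            return float(np.clip(f_mid + f_swing * np.tanh(10.0 * s),
                                 0.5, 20.0))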

  • 13.
    Vina, Francisco
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Karayiannidis, Yiannis
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Adaptive Contact Point Estimation for Autonomous Tool Manipulation (2014). Conference paper (Refereed)
    Abstract [en]

    Autonomous grasping and manipulation of tools enables robots to perform a large variety of tasks in unstructured environments such as households. Many common household tasks involve controlling the motion of the tip of a tool while it is in contact with another object. Thus, for these types of tasks the robot requires knowledge of the location of the contact point while it is executing the task in order to accomplish the manipulation objective. In this work we propose an integral adaptive control law that uses force/torque measurements to estimate online the location of the contact point between the tool manipulated by the robot and the surface which the tool touches.

  • 14.
    Viña Barrientos, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Robotic Manipulation under Uncertainty and Limited Dexterity (2016). Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Robotic manipulators today are mostly constrained to perform fixed, repetitive tasks. Engineers design the robot’s workcell specifically tailored to the task, minimizing all possible uncertainties such as the location of tools and parts that the robot manipulates. However, autonomous robots must be capable of manipulating novel objects with unknown physical properties such as their inertial parameters, friction and shape. In this thesis we address the problem of uncertainty connected to kinematic constraints and friction forces in several robotic manipulation tasks. We design adaptive controllers for opening one degree of freedom mechanisms, such as doors and drawers, under the presence of uncertainty in the kinematic parameters of the system. Furthermore, we formulate adaptive estimators for determining the location of the contact point between a tool grasped by the robot and the environment in manipulation tasks where the robot needs to exert forces with the tool on another object, as in the case of screwing or drilling. We also propose a learning framework based on Gaussian Process regression and dual arm manipulation to estimate the static friction properties of objects. The second problem we address in this thesis is related to the mechanical simplicity of most robotic grippers available in the market. Their lower cost and higher robustness compared to more mechanically advanced hands make them attractive for industrial and research robots. However, the simple mechanical design restricts them from performing in-hand manipulation, i.e. repositioning of objects in the robot’s hand by using the fingers to push, slide and roll the object. Researchers have thus proposed to use extrinsic dexterity instead, i.e. to exploit resources and features of the environment, such as gravity or inertial forces, that can help the robot to perform regrasps. Given that the robot must then interact with the environment, the problem of uncertainty becomes highly relevant. We propose controllers for performing pivoting, i.e. reorienting the grasped object in the robot’s hand, using gravity and controlling the friction exerted by the fingertips by varying the grasping force.

  • 15.
    Viña Barrientos, Francisco
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Karayiannidis, Yiannis
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Adaptive Control for Pivoting with Visual and Tactile Feedback (2016). Conference paper (Refereed)
    Abstract [en]

    In this work we present an adaptive control approach for pivoting, an in-hand manipulation maneuver that consists of rotating a grasped object to a desired orientation relative to the robot’s hand. We perform pivoting by means of gravity, allowing the object to rotate between the fingers of a one degree of freedom gripper and controlling the gripping force to ensure that the object follows a reference trajectory and arrives at the desired angular position. We use a visual pose estimation system to track the pose of the object and force measurements from tactile sensors to control the gripping force. The adaptive controller employs an update law that compensates for errors in the friction coefficient, which is one of the most common sources of uncertainty in manipulation. Our experiments confirm that the proposed adaptive controller successfully pivots a grasped object in the presence of uncertainty in the object’s friction parameters.
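
    The abstract does not reproduce the update law; a hedged sketch of the general idea (certainty-equivalence grip force plus a projection-clipped, gradient-style friction update, where the friction model tau_f = mu * r * f_grip and all gains are illustrative assumptions) might look like:

        import numpy as np

        def pivot_tick(theta, theta_ref, mu_hat, tau_des, r=0.01,
                       gamma=0.05, mu_min=0.05, mu_max=2.0):
            e = theta - theta_ref                  # tracking error [rad]
            f_grip = abs(tau_des) / (mu_hat * r)   # grip force realizing the
                                                   # desired friction torque
            # If the tool lags the reference (e < 0, gravity driving theta
            # positive), the grip over-braked, i.e. mu_hat was too low:
            # raise the estimate; projection keeps it physical.
            mu_hat = float(np.clip(mu_hat - gamma * e, mu_min, mu_max))
            return f_grip, mu_hat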

  • 16.
    Viña, Francisco
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Bekiroglu, Yasemin
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Karayiannidis, Yiannis
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Kragic, Danica
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP. KTH, School of Computer Science and Communication (CSC), Centres, Centre for Autonomous Systems, CAS.
    Predicting Slippage and Learning Manipulation Affordances through Gaussian Process Regression (2013). In: Proceedings of the 2013 IEEE-RAS International Conference on Humanoid Robots, IEEE Computer Society, 2013. Conference paper (Refereed)
    Abstract [en]

    Object grasping is commonly followed by some form of object manipulation – either when using the grasped object as a tool or actively changing its position in the hand through in-hand manipulation to afford further interaction. In this process, slippage may occur due to inappropriate contact forces, various types of noise and/or the unexpected interaction or collision with the environment. In this paper, we study the problem of identifying continuous bounds on the forces and torques that can be applied to a grasped object before slippage occurs. We model the problem as kinesthetic rather than cutaneous learning, given that the measurements originate from a wrist-mounted force-torque sensor. Given the continuous output, this regression problem is solved using a Gaussian Process approach. We demonstrate a dual-armed humanoid robot that can autonomously learn force and torque bounds and use these to execute actions on objects such as sliding and pushing. We show that the model can be used not only for the detection of maximum allowable forces and torques but also for potentially identifying what types of tasks, denoted as manipulation affordances, a specific grasp configuration allows. The latter can then be used either to avoid specific motions or as a simple step toward achieving in-hand manipulation of objects through interaction with the environment.
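
    A minimal sketch of the regression step (the data layout and features are assumptions, not the authors' code; toy data stands in for the robot's measurements):

        # Learn a map from grasp features to the force measured at slip
        # onset; the GP gives both a mean bound and its uncertainty, so a
        # conservative bound can be taken a few standard deviations low.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(0)
        X = rng.uniform(5.0, 25.0, size=(40, 1))            # grip force [N]
        y = 0.4 * X[:, 0] + rng.normal(0.0, 0.2, size=40)   # slip-onset force [N]

        gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(),
                                      normalize_y=True).fit(X, y)
        mean, std = gp.predict([[15.0]], return_std=True)
        safe_bound = mean[0] - 2.0 * std[0]  # conservative bound for planning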

  • 17.
    Wang, Yuquan
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Ögren, Petter
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Karayiannidis, Yiannis
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Dual Arm Manipulation using Constraint Based Programming (2014). In: Proceedings of the 19th World Congress of The International Federation of Automatic Control / [ed] Boje, Edward; Xia, Xiaohua, Elsevier, 2014, Vol. 19, p. 311-319. Conference paper (Refereed)
    Abstract [en]

    In this paper, we present a technique for online generation of dual arm trajectories using constraint-based programming with bound margins. Using this formulation, we take both equality and inequality constraints into account in a way that incorporates both feedback and feedforward terms, enabling e.g. tracking of timed trajectories. The technique is applied to a dual arm manipulator performing a bi-manual task. We present experimental validation of the approach, including comparisons between simulations and real experiments of a complex bimanual tracking task. We also show how to add force feedback to the framework to account for modeling errors in the system. We compare the results with and without feedback, and show how the resulting trajectory is modified to achieve the prescribed interaction forces.
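
    As a minimal illustration of constraint-based programming with bound margins (this is not the paper's formulation; the margin rule and the toy Jacobian are assumptions), each control cycle can be posed as a small bounded least-squares problem:

        # Solve min ||J dq - v_des||^2 subject to lb <= dq <= ub, where the
        # bounds shrink linearly as a joint approaches its limit (the
        # "bound margin"), so inequality constraints are respected online.
        import numpy as np
        from scipy.optimize import lsq_linear

        def resolve_velocities(J, v_des, q, q_min, q_max, dq_max=1.0, k=4.0):
            lb = np.maximum(-dq_max, k * (q_min - q))
            ub = np.minimum(dq_max, k * (q_max - q))
            return lsq_linear(J, v_des, bounds=(lb, ub)).x

        # Toy 3-joint example tracking a 2D task-space velocity.
        J = np.array([[1.0, 0.5, 0.2],
                      [0.0, 0.8, 0.4]])
        dq = resolve_velocities(J, np.array([0.1, -0.05]),
                                q=np.array([0.0, 1.4, -0.5]),
                                q_min=-1.5 * np.ones(3),
                                q_max=1.5 * np.ones(3))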

  • 18.
    Wang, Yuquan
    et al.
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Vina, Francisco
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Karayiannidis, Yiannis
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Smith, Christian
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Ögren, Petter
    KTH, School of Computer Science and Communication (CSC), Computer Vision and Active Perception, CVAP.
    Whole body control of a dual-arm mobile robot using a virtual kinematic chain (2014). Manuscript (preprint) (Other academic)