Toward a Sustainable Human-Robot Collaborative Production Environment
KTH, School of Industrial Engineering and Management (ITM), Production Engineering.
2017 (English). Doctoral thesis, comprehensive summary (Other academic).
Abstract [en]

This PhD study addresses the sustainability of robotic systems from the environmental and social perspectives. Three approaches were developed during the research. The first is an online, programming-free, model-driven system that uses a web-based distributed human-robot collaboration architecture to perform distant assembly operations. A robot-mounted camera captures silhouettes of the components from different angles; the system then analyses these silhouettes and constructs the corresponding 3D models. Using the 3D models together with a model of the robotic assembly cell, the system guides a distant human operator in assembling the real components in the actual robotic cell.

To address the safety aspect of human-robot collaboration, a second approach was developed for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system was developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. The result of collision detection drives four safety strategies: the system can alert an operator, stop a robot, move the robot away, or modify the robot's trajectory away from an approaching operator. These strategies are activated based on the operator's location with respect to the robot. A case study further discusses the possibility of implementing the developed method in realistic applications, for example, collaboration between robots and humans in an assembly line.

To tackle the energy aspect of sustainability in the human-robot production environment, a third approach was developed that aims to minimise the robot's energy consumption during assembly. Given a trajectory, and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations is determined, followed by calculation of the corresponding forces and torques on the robot's joints and links. The energy consumption is then calculated for each configuration along the assigned trajectory, and the configurations with the lowest energy consumption are selected.
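
As a rough illustration of how the second approach's safety strategies could be activated by the operator's location, the minimal Python sketch below maps the operator-to-robot distance onto one of the four strategies. The zone thresholds, the ordering of the strategies across zones, and the function names are illustrative assumptions, not values from the thesis.

    import numpy as np

    # Illustrative zone thresholds in metres; real values would come from a risk assessment.
    ALERT_DIST = 2.0     # outer zone: warn the operator
    REPLAN_DIST = 1.5    # middle zone: modify the robot's trajectory away from the operator
    RETREAT_DIST = 1.0   # inner zone: move the robot away
    STOP_DIST = 0.5      # closest zone: stop the robot

    def choose_safety_strategy(operator_pos, robot_pos):
        """Map the operator's distance to the robot onto one of the four safety strategies."""
        d = np.linalg.norm(np.asarray(operator_pos) - np.asarray(robot_pos))
        if d > ALERT_DIST:
            return "continue"            # operator far away: normal operation
        if d > REPLAN_DIST:
            return "alert_operator"      # audible/visual warning only
        if d > RETREAT_DIST:
            return "modify_trajectory"   # re-plan the path away from the operator
        if d > STOP_DIST:
            return "move_robot_away"
        return "stop_robot"              # operator very close: halt motion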

Place, publisher, year, edition, pages
KTH Royal Institute of Technology, 2017, p. 98.
Series
TRITA-IIP, ISSN 1650-1888 ; 17-01
Keywords [en]
vision sensor, 3D image processing, collision detection, safety, robot, kinematics, dynamics, collaborative assembly, energy consumption, optimisation, manufacturing
National Category
Engineering and Technology
Research subject
Production Engineering
Identifiers
URN: urn:nbn:se:kth:diva-202388
ISBN: 978-91-7729-301-9 (print)
OAI: oai:DiVA.org:kth-202388
DiVA, id: diva2:1076454
Public defence
2017-03-24, M311, Brinellvägen 68, Stockholm, 10:00 (English)
Opponent
Supervisors
Note

QC 20170223

Available from: 2017-02-23. Created: 2017-02-22. Last updated: 2017-02-23. Bibliographically approved.
List of papers
1. Remote robotic assembly guided by 3D models linking to a real robot
2014 (English). In: CIRP annals, ISSN 0007-8506, E-ISSN 1726-0604, Vol. 63, no. 1, p. 1-4. Article in journal (Refereed). Published.
Abstract [en]

This paper presents a 3D model-driven remote robotic assembly system. It constructs 3D models at runtime to represent unknown geometries on the robot side, using a sequence of images from a calibrated camera in different poses. Guided by the 3D models over the Internet, a remote operator can manipulate a real robot instantly to perform remote assembly operations. Experimental results show that the system can meet industrial assembly requirements, with an acceptable level of modelling quality and a relatively short processing time. The system also enables programming-free robotic assembly, where the real robot follows the human operator's assembly operations instantly.
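
The silhouette-based model construction described above is commonly realised as shape-from-silhouette (visual hull) voxel carving. The following is a generic sketch under that assumption, not the paper's actual implementation: `silhouettes` are binary masks (NumPy arrays) taken from different camera poses, and `projections` are the corresponding 3x4 camera matrices of the calibrated camera.

    import numpy as np

    def carve_visual_hull(silhouettes, projections, grid_min, grid_max, resolution=64):
        """Shape-from-silhouette: keep only the voxels that project inside every
        silhouette mask. `projections` are 3x4 camera matrices (intrinsics times
        extrinsics) for the calibrated camera poses."""
        xs = np.linspace(grid_min[0], grid_max[0], resolution)
        ys = np.linspace(grid_min[1], grid_max[1], resolution)
        zs = np.linspace(grid_min[2], grid_max[2], resolution)
        X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
        # Homogeneous coordinates of all voxel centres, shape (4, N).
        pts = np.stack([X.ravel(), Y.ravel(), Z.ravel(), np.ones(X.size)])

        occupied = np.ones(pts.shape[1], dtype=bool)
        for mask, P in zip(silhouettes, projections):
            uvw = P @ pts                                 # project voxels into this view
            u = np.round(uvw[0] / uvw[2]).astype(int)     # pixel column
            v = np.round(uvw[1] / uvw[2]).astype(int)     # pixel row
            h, w = mask.shape
            inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
            hit = np.zeros_like(occupied)
            hit[inside] = mask[v[inside], u[inside]] > 0  # voxel lands on the silhouette
            occupied &= hit                               # carve away everything else
        return occupied.reshape(resolution, resolution, resolution)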

Place, publisher, year, edition, pages
Elsevier, 2014
Keywords
Robot, 3D-image processing, Assembly
National Category
Mechanical Engineering
Identifiers
urn:nbn:se:kth:diva-148644 (URN), 10.1016/j.cirp.2014.03.013 (DOI), 000338811000001 (ISI), 2-s2.0-84902543990 (Scopus ID)
Note

QC 20140811

Available from: 2014-08-11. Created: 2014-08-11. Last updated: 2017-12-05. Bibliographically approved.
2. Active collision avoidance for human–robot collaboration driven by vision sensors
2016 (English). In: International journal of computer integrated manufacturing (Print), ISSN 0951-192X, E-ISSN 1362-3052, p. 1-11. Article in journal (Refereed). Published.
Abstract [en]

Establishing safe human–robot collaboration is an essential factor for improving efficiency and flexibility in today's manufacturing environment. Targeting safety in human–robot collaboration, this paper reports a novel approach for effective online collision avoidance in an augmented environment, where virtual three-dimensional (3D) models of robots and real images of human operators from depth cameras are used for monitoring and collision detection. A prototype system is developed and linked to industrial robot controllers for adaptive robot control, without the need for programming by the operators. The result of collision detection drives four safety strategies: the system can alert an operator, stop a robot, move the robot away, or modify the robot's trajectory away from an approaching operator. These strategies are activated based on the operator's presence and location with respect to the robot. The case study further discusses the possibility of implementing the developed method in realistic applications, for example, collaboration between robots and humans in an assembly line.
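
For the monitoring step, one standard way to quantify how close an approaching operator is to the robot is the minimum distance between the operator's depth-camera point cloud and sample points on the virtual 3D robot model. The sketch below assumes both point sets are already expressed in the same world frame and uses SciPy's k-d tree; it stands in for, rather than reproduces, the paper's collision-detection geometry.

    import numpy as np
    from scipy.spatial import cKDTree

    def min_operator_robot_distance(operator_points, robot_points):
        """Smallest Euclidean distance between the operator's depth-camera points
        and sample points on the virtual 3D robot model (same world frame)."""
        tree = cKDTree(robot_points)            # spatial index over the robot samples
        dists, _ = tree.query(operator_points)  # nearest robot point per operator point
        return float(dists.min())

The resulting distance could then feed a threshold-based strategy selector such as the one sketched under the thesis abstract above.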

Place, publisher, year, edition, pages
Taylor & Francis, 2016
Keywords
collision detection, collaborative assembly, safety, vision sensor
National Category
Engineering and Technology
Research subject
Production Engineering
Identifiers
urn:nbn:se:kth:diva-202380 (URN), 10.1080/0951192X.2016.1268269 (DOI), 000402991300006 (ISI), 2-s2.0-85006100717 (Scopus ID)
Note

QC 20170308

Available from: 2017-02-22. Created: 2017-02-22. Last updated: 2017-07-03. Bibliographically approved.
3. Energy-Efficient Robot Configuration for Assembly
2017 (English). In: Journal of manufacturing science and engineering, ISSN 1087-1357, E-ISSN 1528-8935, Vol. 139, no. 5, article id 051007. Article in journal (Refereed). Published.
Abstract [en]

Optimizing the energy consumption of robot movements has been one of the main focuses of today's robotic simulation software. This optimization is typically based on minimizing a robot's joint movements and, in many cases, does not take the robot's dynamics into consideration. Reducing energy consumption therefore remains a challenging task that involves studying the robot's kinematic and dynamic models together with the application requirements. This research aims to minimize the robot energy consumption during assembly. Given a trajectory, and based on the inverse kinematics and dynamics of the robot, a set of attainable configurations for the robot is determined, followed by calculation of the corresponding forces and torques on the joints and links of the robot. The energy consumption is then calculated for each configuration along the assigned trajectory, and the ones with the lowest energy consumption are selected. Given that energy-efficient robot configurations lead to reduced overall energy consumption, this approach is instrumental and can be embedded in energy-efficient robotic assembly.
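
A minimal sketch of the selection step described in the abstract: each attainable inverse-kinematics branch of the assigned trajectory is scored by integrating the absolute mechanical power at the joints, and the lowest-energy branch is kept. The `inverse_dynamics(q, qd, qdd)` callable and the energy model itself are assumptions standing in for the paper's robot-specific formulation.

    import numpy as np

    def trajectory_energy(q_traj, dt, inverse_dynamics):
        """Estimate E ~ sum_t sum_i |tau_i(t) * qdot_i(t)| * dt along a joint trajectory.
        `inverse_dynamics(q, qd, qdd)` is an assumed callable returning joint torques."""
        qd = np.gradient(q_traj, dt, axis=0)    # joint velocities (finite differences)
        qdd = np.gradient(qd, dt, axis=0)       # joint accelerations
        energy = 0.0
        for q, v, a in zip(q_traj, qd, qdd):
            tau = inverse_dynamics(q, v, a)     # torques on the joints for this state
            energy += np.sum(np.abs(np.asarray(tau) * v)) * dt
        return energy

    def select_lowest_energy_branch(ik_branches, dt, inverse_dynamics):
        """Among the attainable inverse-kinematics branches for the same assigned
        trajectory, return the one with the lowest estimated energy consumption."""
        return min(ik_branches,
                   key=lambda q: trajectory_energy(np.asarray(q), dt, inverse_dynamics))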

Place, publisher, year, edition, pages
ASME Press, 2017
Keywords
Computer software, Energy utilization, Inverse kinematics, Kinematics, Robotic assembly, Robotics, Robots, Application requirements, Dynamic features, Energy efficient, Forces and torques, Kinematics and dynamics, Reducing energy consumption, Robot configurations, Robotic simulation
National Category
Robotics
Identifiers
urn:nbn:se:kth:diva-200846 (URN), 10.1115/1.4034935 (DOI), 000399395000009 (ISI), 2-s2.0-84995900489 (Scopus ID)
Note

QC 20170206

Available from: 2017-02-06. Created: 2017-02-03. Last updated: 2017-06-02. Bibliographically approved.
4. Ubiquitous manufacturing system based on Cloud: A robotics application
2017 (English). In: Robotics and Computer-Integrated Manufacturing, ISSN 0736-5845, E-ISSN 1879-2537, Vol. 45, p. 116-125. Article in journal (Refereed). Published.
Abstract [en]

The modern manufacturing industry calls for a new generation of production systems with better interoperability and new business models. As a novel information technology, the Cloud provides new service models and business opportunities for the manufacturing industry. In this research, recent Cloud manufacturing and Cloud robotics approaches are reviewed, and function block-based integration mechanisms are developed to integrate various types of manufacturing facilities. A Cloud-based manufacturing system is developed to support ubiquitous manufacturing; it provides a service pool that maintains physical facilities in terms of manufacturing services. The proposed framework and mechanisms are evaluated in both machining and robotics applications. In practice, it is possible to establish an integrated manufacturing environment across multiple levels with the support of the manufacturing Cloud and function blocks. This provides a flexible architecture as well as ubiquitous and integrated methodologies for the Cloud manufacturing system.
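
The service pool that maintains physical facilities as manufacturing services can be pictured as a simple registry. The sketch below is illustrative only: the class and field names are hypothetical, and the paper's function block-based adapters are reduced to a plain `endpoint` string.

    from dataclasses import dataclass, field

    @dataclass
    class ManufacturingService:
        """A physical facility (robot cell, machine tool) exposed as a Cloud service."""
        name: str
        capability: str   # e.g. "assembly" or "milling"
        endpoint: str     # address of the facility's function-block adapter (hypothetical)

    @dataclass
    class ServicePool:
        """Minimal registry that maintains facilities as manufacturing services."""
        services: list = field(default_factory=list)

        def register(self, service: ManufacturingService) -> None:
            self.services.append(service)

        def find(self, capability: str) -> list:
            """All registered services that offer the requested capability."""
            return [s for s in self.services if s.capability == capability]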

Place, publisher, year, edition, pages
Elsevier, 2017
Keywords
Cloud manufacturing, Cloud robotics, Interoperability, Ubiquitous manufacturing
National Category
Production Engineering, Human Work Science and Ergonomics
Identifiers
urn:nbn:se:kth:diva-185008 (URN), 10.1016/j.rcim.2016.01.007 (DOI), 2-s2.0-84956865298 (Scopus ID)
Note

QC 20160413

Available from: 2016-04-08. Created: 2016-04-08. Last updated: 2017-11-30. Bibliographically approved.

Open Access in DiVA

Full text: FULLTEXT01.pdf (5504 kB), application/pdf

By author/editor: Alhusin Alkhdur, Abdullah
By organisation: Production Engineering
