1 - 18 of 18
  • 1.
    Abut, Hüseyin
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Ercil, Aytul
    Mälardalen University, School of Innovation, Design and Engineering.
    Erdogan, Hakan
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Koman, Hakki Can
    Mälardalen University, School of Innovation, Design and Engineering.
    Tas, Fatih
    Mälardalen University, School of Innovation, Design and Engineering.
    Argunsah, Ali Özgur
    Mälardalen University, School of Innovation, Design and Engineering.
    Cosar, Serhan
    Mälardalen University, School of Innovation, Design and Engineering.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Karabalkan, Harun
    Mälardalen University, School of Innovation, Design and Engineering.
    Cökelek, Emre
    Mälardalen University, School of Innovation, Design and Engineering.
    Ficici, Rahmi
    Mälardalen University, School of Innovation, Design and Engineering.
    Sezer, Volkan
    Mälardalen University, School of Innovation, Design and Engineering.
    Danis, Serhan
    Mälardalen University, School of Innovation, Design and Engineering.
    Karaca, Mehmet
    Mälardalen University, School of Innovation, Design and Engineering.
    Abbak, Mehmet
    Mälardalen University, School of Innovation, Design and Engineering.
    Uzunbaş, Mustafa Gökhan
    Mälardalen University, School of Innovation, Design and Engineering.
    Eritmen, Kayhan
    Mälardalen University, School of Innovation, Design and Engineering.
    Kalaycıoglu, Caglar
    Mälardalen University, School of Innovation, Design and Engineering.
    Imamoğlu, Mümin
    Mälardalen University, School of Innovation, Design and Engineering.
    Karabat, Cagatay
    Mälardalen University, School of Innovation, Design and Engineering.
    Peyic, Merve
    Mälardalen University, School of Innovation, Design and Engineering.
    Arslan, Burak
    Mälardalen University, School of Innovation, Design and Engineering.
    Data Collection with UYANIK: Too Much Pain; But Gains are Coming. 2007. In: Proc. of the Biennial on DSP for In-Vehicle and Mobile Systems, Istanbul, Turkey, 2007. Conference paper (Refereed)
  • 2.
    Abut, Hüseyin
    et al.
    San Diego State University.
    Erdogan, Hakan
    Ercil, Aytul
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Koman, Hakki Can
    Tas, Fatih
    Argunsah, Ali Özgur
    Cosar, Serhan
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Karabalkan, Harun
    Cökelek, Emre
    Ficici, Rahmi
    Sezer, Volkan
    Danis, Serhan
    Karaca, Mehmet
    Abbak, Mehmet
    Uzunbaş, Mustafa Gökhan
    Eritmen, Kayhan
    Imamoğlu, Mümin
    Kalaycıoglu, Caglar
    Real-World Data Collection with UYANIK. 2009. In: In-Vehicle Corpus and Signal Processing for Driver Behavior, Springer, 2009, p. 23-44. Chapter in book (Other academic)
  • 3.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Human Robot Interaction Solutions for Intuitive Industrial Robot Programming. 2012. Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Over the past few decades the use of industrial robots has increased the efficiency as well as the competitiveness of many companies. Despite this fact, in many cases robot automation investments are considered to be technically challenging. In addition, for most small and medium-sized enterprises (SMEs) this process is associated with high costs. Due to their continuously changing product lines, reprogramming costs are likely to exceed installation costs by a large margin. Furthermore, traditional programming methods for industrial robots are too complex for an inexperienced robot programmer, so assistance from a robot programming expert is often needed. We hypothesize that in order to make industrial robots more common within the SME sector, the robots should be reprogrammable by technicians or manufacturing engineers rather than robot programming experts. In this thesis we propose a high-level natural language framework for interacting with industrial robots through an instructional programming environment for the user. The ultimate goal of this thesis is to bring robot programming to a stage where it is as easy as working together with a colleague.

    This thesis mainly addresses two issues. The first is to make interaction with a robot easier and more natural through a multimodal framework. The proposed language architecture makes it possible to manipulate, pick or place objects in a scene through high-level commands. Interaction with simple voice commands and gestures enables the manufacturing engineer to focus on the task itself, rather than the programming issues of the robot. This approach shifts the focus of industrial robot programming from the coordinate-based programming paradigm, which currently dominates the field, to an object-based programming scheme. The second issue addressed is a general framework for implementing multimodal interfaces. There have been numerous efforts to implement multimodal interfaces for computers and robots, but there is no general standard framework for developing them. The general framework proposed in this thesis is designed to perform natural language understanding, multimodal integration and semantic analysis with an incremental pipeline, and includes a novel multimodal grammar language, which is used for multimodal presentation and semantic meaning generation.

  • 4.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Planning and Sequencing Through Multimodal Interaction for Robot Programming. 2014. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    Over the past few decades the use of industrial robots has increased the efficiency as well as the competitiveness of several sectors. Despite this fact, in many cases robot automation investments are considered to be technically challenging. In addition, for most small and medium-sized enterprises (SMEs) this process is associated with high costs. Due to their continuously changing product lines, reprogramming costs are likely to exceed installation costs by a large margin. Furthermore, traditional programming methods for industrial robots are too complex for most technicians or manufacturing engineers, and thus assistance from a robot programming expert is often needed. The hypothesis is that in order to make the use of industrial robots more common within the SME sector, the robots should be reprogrammable by technicians or manufacturing engineers rather than robot programming experts. In this thesis, a novel system for task-level programming is proposed. The user interacts with an industrial robot by giving instructions in a structured natural language and by selecting objects through an augmented reality interface. The proposed system consists of two parts: (i) a multimodal framework that provides a natural language interface for the user to interact with, in which the framework performs modality fusion and semantic analysis, and (ii) a symbolic planner, POPStar, which creates a time-efficient plan based on the user's instructions. The ultimate goal of the work in this thesis is to bring robot programming to a stage where it is as easy as working together with a colleague.

    This thesis mainly addresses two issues. The first is a general framework for designing and developing multimodal interfaces. The general framework proposed in this thesis is designed to perform natural language understanding, multimodal integration and semantic analysis with an incremental pipeline. The framework also includes a novel multimodal grammar language, which is used for multimodal presentation and semantic meaning generation. Such a framework helps us to make interaction with a robot easier and more natural. The proposed language architecture makes it possible to manipulate, pick or place objects in a scene through high-level commands. Interaction with simple voice commands and gestures enables the manufacturing engineer to focus on the task itself, rather than the programming issues of the robot. The second issue arises from the inherent characteristics of communication in natural language: instructions given by a user are often vague and may require other actions to be taken before the conditions for applying the user's instructions are met. In order to solve this problem a symbolic planner, POPStar, based on a partial-order planner (POP), is proposed. The system takes landmarks extracted from user instructions as input and creates a sequence of actions to operate the robotic cell with minimal makespan. The proposed planner takes advantage of the partial-order capabilities of POP to execute actions in parallel and employs a best-first search algorithm to seek the series of actions that lead to a minimal makespan. The proposed planner can also handle robots with multiple grippers and parallel machines, as well as scheduling for multiple product types.
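    The makespan-minimizing best-first search described above can be illustrated with a toy sketch. This is not the actual POPStar implementation: it ignores partial-order actions and landmarks entirely, and simply assigns task durations to machines, always expanding the partial schedule with the smallest makespan bound first; all numbers and the machine model are illustrative.

```python
import heapq

def best_first_schedule(durations, n_machines=2):
    """Best-first search over partial schedules: place each task on a
    machine, expanding the partial schedule with the smallest makespan
    bound first. A toy analogue of POPStar's search strategy."""
    # State: (lower bound on makespan, index of next task, machine loads)
    heap = [(0, 0, (0,) * n_machines)]
    while heap:
        bound, i, loads = heapq.heappop(heap)
        if i == len(durations):
            return max(loads)  # all tasks placed: bound equals makespan
        for m in range(n_machines):
            new_loads = list(loads)
            new_loads[m] += durations[i]
            new_loads.sort()  # canonical form prunes symmetric states
            heapq.heappush(heap, (max(new_loads), i + 1, tuple(new_loads)))
```

    Because the bound (the current maximum machine load) never decreases as a partial schedule is extended, the first complete schedule popped from the queue has the minimal makespan, which mirrors why a best-first search with an admissible heuristic finds optimal plans.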

  • 5.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ameri E., Afsh
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Towards Creation of Robot Programs Through User Interaction. Article in journal (Other academic)
    Abstract [en]

    This paper proposes a novel system for task-level programming of industrial robots. The user interacts with an industrial robot by giving instructions in a structured natural language and by selecting objects through an augmented reality interface. The proposed system consists of two parts. The first is a multimodal framework that provides a natural language interface to the user; it performs modality fusion and semantic analysis, and helps the user interact with the system more easily and naturally. The proposed language architecture makes it possible to manipulate, pick or place objects in a scene through high-level commands. The second is the POPStar planner, based on a partial-order planner (POP), which takes landmarks extracted from user instructions as input and creates a sequence of actions to operate the robotic cell with minimal makespan. The proposed planner takes advantage of the partial-order capabilities of POP to plan execution of actions in parallel and employs a best-first search algorithm to seek a series of actions that lead to a minimal makespan. The planner can also handle robots with multiple grippers and parallel machines. Using different topologies for the landmark graphs, we show that it is possible to create schedules for changing object types, which are processed in different stages in the robot cell. Results show that the proposed system can create and adapt schedules for robot cells with changing product types in low-volume production based on the user's instructions.

  • 6.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ameri E., Afshin
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Curuklu, Baran
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Scheduling for Multiple Type Objects Using POPStar Planner. 2014. In: Proceedings of the 19th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA'14), Barcelona, Spain, September 2014, article no. 7005148. Conference paper (Refereed)
    Abstract [en]

    In this paper, scheduling of robot cells that produce multiple object types in low volumes is considered. The challenge is to maximize the number of objects produced in a given time window as well as to adapt the schedule to changing object types. The proposed algorithm, POPStar, is based on a partial-order planner guided by a best-first search algorithm and landmarks. The best-first search uses heuristics to help the planner create complete plans while minimizing the makespan. The algorithm takes as input landmarks extracted from the user's instructions given in structured English. Using different topologies for the landmark graphs, we show that it is possible to create schedules for changing object types, which will be processed in different stages in the robot cell. Results show that the POPStar algorithm can create and adapt schedules for robot cells with changing product types in low-volume production.

  • 7.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Ameri E., Afshin
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Augmented Reality-based Industrial Robot Control. 2011. In: Proceedings of SIGRAD 2011 / [ed] Larsson, Thomas; Kjelldahl, Lars; Jää-Aro, Kai-Mikael, Linköping University Electronic Press, 2011, p. 113-114. Conference paper (Refereed)
    Abstract [en]

    Most of the interfaces designed to control or program industrial robots are complex and require special training for the user. This complexity, alongside the changing environment of small and medium-sized enterprises (SMEs), has led to the absence of robots from SMEs. The costs of (re)programming the robots and (re)training the robot users exceed the initial costs of installation. In order to address this shortcoming, we propose a new interface which uses augmented reality (AR) and multimodal human-robot interaction. We show that such an approach allows easier manipulation of robots in industrial environments.

  • 8.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Ameri E., Afshin
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Intuitive Industrial Robot Programming Through Incremental Multimodal Language and Augmented Reality. 2011. In: 2011 IEEE International Conference on Robotics and Automation (ICRA 2011), IEEE, 2011, p. 3934-3939. Conference paper (Refereed)
    Abstract [en]

    Developing easy-to-use, intuitive interfaces is crucial to introducing robotic automation to many small and medium-sized enterprises (SMEs). Due to their continuously changing product lines, reprogramming costs exceed installation costs by a large margin. In addition, traditional programming methods for industrial robots are too complex for an inexperienced robot programmer, so external assistance is often needed. In this paper a new incremental multimodal language, which uses an augmented reality (AR) environment, is presented. The proposed language architecture makes it possible to manipulate, pick or place the objects in the scene. This approach shifts the focus of industrial robot programming from the coordinate-based programming paradigm to an object-based programming scheme. This makes it possible for non-experts to program the robot in an intuitive way, without going through rigorous training in robot programming.

  • 9.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering.
    Interacting with industrial robots through a multi-modal language and sensory systems. 2008. In: 39th International Symposium on Robotics, Seoul, Korea, 2008, p. 66-69. Conference paper (Refereed)
    Abstract [en]

    Over the past few decades the use of industrial robots has increased companies' efficiency as well as strengthened their competitiveness in the market.

    Despite this fact, in many cases robot automation investments are considered to be technically challenging as well as costly by small and medium-sized enterprises (SMEs). We hypothesize that in order to make industrial robots more common within the SME sector, the robots should be reprogrammable by task experts rather than robot programming experts. Within this project we propose to develop a high-level language for intelligent human-robot interaction that relies on multi-sensor inputs, providing an abstract instructional programming environment for the user. The ultimate goal is to bring robot programming to a stage where it is as easy as working together with a colleague.

  • 10.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems.
    Scheduling POP-Star for Automatic Creation of Robot Cell Programs. 2013. Conference paper (Refereed)
    Abstract [en]

    Typical pick-and-place and machine-tending applications often require an industrial robot to be embedded in a cell and to communicate with other devices in the cell. Programming the cell logic is a tedious job requiring expert programming knowledge, and it can take more time than programming the specific robot movements themselves. We propose a new system which takes a description of the whole manufacturing process in natural language as input, fills in the implicit actions, and plans the sequence of actions to accomplish the described task with minimal makespan using a modified partial-order planning algorithm. Finally we demonstrate that the proposed system can come up with a sensible plan for the given instructions.

  • 11.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Spampinato, Giacomo
    Mälardalen University, School of Innovation, Design and Engineering.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering.
    Object selection using a spatial language for flexible assembly. 2009. Conference paper (Refereed)
  • 12.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Spampinato, Giacomo
    Mälardalen University, School of Innovation, Design and Engineering.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering.
    Object Selection using a Spatial Language for Flexible Assembly. 2009. In: 14th IEEE International Conference on Emerging Technologies and Factory Automation (ETFA 2009), Mallorca, Spain, 2009. Conference paper (Refereed)
    Abstract [en]

    In this paper we present a new simplified natural language that makes use of spatial relations between the objects in the scene to navigate an industrial robot for simple pick and place applications. Developing easy-to-use, intuitive interfaces is crucial to introducing robotic automation to many small and medium-sized enterprises (SMEs). Due to their continuously changing product lines, reprogramming costs are far higher than installation costs. In order to hide the complexities of robot programming we propose a natural language in which the user can control and jog the robot based on reference objects in the scene. We used Gaussian kernels to represent spatial regions, such as "left" or "above". Finally we present some dialogues between the user and the robot to demonstrate the usefulness of the proposed system.
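    The Gaussian-kernel idea can be sketched as follows. This is an assumed formulation, not the paper's exact model: each relation scores how well the target's displacement from the reference object aligns with an assumed direction vector, with a Gaussian kernel over the angular deviation; the direction table, coordinate convention (y up) and kernel width `sigma` are all hypothetical.

```python
import math

def spatial_score(target, reference, relation):
    """Score how well `target` satisfies a spatial relation with respect
    to `reference`, via a Gaussian kernel over angular deviation."""
    dx = target[0] - reference[0]
    dy = target[1] - reference[1]
    # Assumed unit direction vectors for each relation (y grows upward).
    directions = {"left": (-1, 0), "right": (1, 0),
                  "above": (0, 1), "below": (0, -1)}
    ux, uy = directions[relation]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return 0.0  # coincident objects satisfy no directional relation
    cos_angle = (dx * ux + dy * uy) / dist  # alignment with the direction
    sigma = 0.5  # kernel width (assumed)
    return math.exp(-((1.0 - cos_angle) ** 2) / (2 * sigma ** 2))

def pick(objects, reference, relation):
    """Resolve e.g. 'the box left of the pump': pick the (name, position)
    pair that best matches the relation."""
    return max(objects,
               key=lambda obj: spatial_score(obj[1], reference, relation))
```

    An object directly in the named direction scores 1.0, and the score falls off smoothly as the displacement rotates away, which is what lets vague words like "left" tolerate imprecise object placement.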

  • 13.
    Akan, Batu
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Spampinato, Giacomo
    Mälardalen University, School of Innovation, Design and Engineering.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering.
    Towards Robust Human Robot Collaboration in Industrial Environments. 2010. In: 5th ACM/IEEE International Conference on Human-Robot Interaction, HRI 2010, 2010, p. 71-72. Conference paper (Refereed)
    Abstract [en]

    In this paper a system driven through natural language is proposed that allows operators to select and manipulate objects in the environment using an industrial robot. In order to hide the complexities of robot programming we propose a natural language in which the user can control and jog the robot based on reference objects in the scene. We used semantic networks to relate different types of objects in the scene.
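    As a rough illustration of the semantic-network idea, object types can be related through a small graph of is-a links, so that a command naming a category ("pick up a tool") can match concrete objects. The types and links below are invented for the example, not taken from the paper.

```python
def build_network():
    """Tiny semantic network of is-a links, stored as child -> parent.
    All type names here are illustrative."""
    return {"wrench": "tool", "screwdriver": "tool",
            "tool": "object", "box": "container", "container": "object"}

def is_a(network, name, category):
    """Follow is-a links upward to test category membership."""
    while name is not None:
        if name == category:
            return True
        name = network.get(name)
    return False
```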

  • 14.
    Ameri E., Afshin
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Incremental Multimodal Interface for Human-Robot Interaction. 2010. In: Proceedings of the 15th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2010, 2010, article no. 5641234. Conference paper (Refereed)
    Abstract [en]

    Face-to-face human communication is a multimodal and incremental process. An intelligent robot that operates in close relation with humans should have the ability to communicate with its human colleagues in such a manner. The process of understanding and responding to multimodal inputs has been an interesting field of research and has resulted in advancements in areas such as syntactic and semantic analysis, modality fusion and dialogue management. Some approaches in syntactic and semantic analysis take the incremental nature of human interaction into account. Our goal is to unify syntactic/semantic analysis, modality fusion and dialogue management into an incremental multimodal interaction manager. We believe that this approach will lead to a more robust system which can perform faster than today's systems.

  • 15.
    Ameri E., Afshin
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering.
    A General Framework for Incremental Processing of Multimodal Inputs. 2011. In: Proceedings of the 13th International Conference on Multimodal Interfaces, New York: ACM Press, 2011, p. 225-228. Conference paper (Refereed)
    Abstract [en]

    Humans employ different information channels (modalities) such as speech, pictures and gestures in their communication. It is believed that some of these modalities are more error-prone for specific types of data, and multimodality can therefore help to reduce ambiguities in the interaction. There have been numerous efforts to implement multimodal interfaces for computers and robots, yet there is no general standard framework for developing them. In this paper we propose a general framework for implementing multimodal interfaces. It is designed to perform natural language understanding, multimodal integration and semantic analysis with an incremental pipeline, and includes a multimodal grammar language which is used for multimodal presentation and semantic meaning generation.
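    A minimal sketch of such incremental multimodal integration, under assumed event formats (timestamped speech tokens and pointing gestures) and invented slot names, might fold each arriving event into a growing command frame, binding deictic words like "this" to the most recent gesture:

```python
def fuse(events):
    """Incrementally fold timestamped (time, modality, value) events into
    a command frame. Event format and slot names are illustrative, not
    the framework's actual grammar language."""
    frame = {"action": None, "object": None}
    pending_gesture = None
    for t, modality, value in sorted(events):  # process in arrival order
        if modality == "gesture":
            pending_gesture = value            # id of the pointed-at object
        elif modality == "speech":
            if value in ("pick", "place", "move"):
                frame["action"] = value
            elif value in ("this", "that") and pending_gesture is not None:
                frame["object"] = pending_gesture  # resolve the deixis
                pending_gesture = None
    return frame
```

    For instance, the event stream "pick" (speech), point at an object, "this" (speech) yields a frame with both the action and the object filled in, without waiting for the utterance to end.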

  • 16.
    Hägg, Johan
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering.
    Automatic Generation of Neural Networks for Gesture Recognition. 2010. Manuscript (preprint) (Other academic)
  • 17.
    Hägg, Johan
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Çürüklü, Baran
    Mälardalen University, School of Innovation, Design and Engineering.
    Asplund, Lars
    Mälardalen University, School of Innovation, Design and Engineering.
    Gesture Recognition Using Evolution Strategy Neural Network. 2008. In: 2008 IEEE International Conference on Emerging Technologies and Factory Automation, Proceedings, 2008, p. 245-248. Conference paper (Refereed)
    Abstract [en]

    A new approach to interacting with an industrial robot using hand gestures is presented. The system proposed here can rapidly learn a first-time user's hand gestures. This improves product usability and acceptability. Artificial neural networks trained with the evolution strategy technique are found to be well suited to this problem. The gesture recognition system is an integrated part of a larger project addressing intelligent human-robot interaction using a novel multi-modal paradigm. The goal of the overall project is to address complexity issues related to robot programming by providing a multi-modal, user-friendly interaction system that can be used by SMEs.
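    The evolution-strategy training scheme can be sketched with a (1+1)-ES on a toy linear classifier: perturb the weights with Gaussian noise and keep the mutant only if it misclassifies no more examples. This is a stand-in under assumed settings (mutation width, iteration count, two-feature toy "gestures"), not the paper's network.

```python
import random

def es_train(data, n_weights, iters=2000, sigma=0.3, seed=0):
    """(1+1) evolution strategy: mutate all weights with Gaussian noise
    and accept the mutant if it is at least as good (fewer or equal
    misclassifications). Hyperparameters are illustrative."""
    rng = random.Random(seed)

    def errors(w):
        miss = 0
        for x, label in data:
            act = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]  # bias last
            if (act > 0) != label:
                miss += 1
        return miss

    w = [rng.gauss(0, 1) for _ in range(n_weights + 1)]  # weights + bias
    best = errors(w)
    for _ in range(iters):
        cand = [wi + rng.gauss(0, sigma) for wi in w]
        e = errors(cand)
        if e <= best:          # accept equal fitness to keep drifting
            w, best = cand, e
    return w, best

# Linearly separable toy "gesture features": label is x0 > x1.
data = [((0.9, 0.1), True), ((0.2, 0.8), False),
        ((0.7, 0.3), True), ((0.1, 0.6), False)]
```

    The same loop applies unchanged to a multi-layer network: only the `errors` fitness function and the weight vector's length change, which is one reason evolution strategies suit networks whose structure is generated automatically.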

  • 18.
    Çürüklü, Baran
    et al.
    Mälardalen University, School of Innovation, Design and Engineering.
    Dodig-Crnkovic, Gordana
    Mälardalen University, School of Innovation, Design and Engineering.
    Akan, Batu
    Mälardalen University, School of Innovation, Design and Engineering.
    Towards Industrial Robots with Human Like Moral Responsibilities. 2010. In: 5th ACM/IEEE International Conference on Human-Robot Interaction, HRI 2010, Osaka, Japan, 2010, p. 85-86. Conference paper (Refereed)
    Abstract [en]

    Robots do not have any capability of taking moral responsibility. At the same time, industrial robotics is entering a new era with "intelligent" robots sharing a workbench with humans. Teams consisting of humans and industrial robots are no longer science fiction. The biggest worry in this scenario is the fear of humans losing control and robots running amok. We believe that the current way of implementing safety measures has shortcomings and cannot address challenges related to close collaboration between humans and robots. We propose that "intelligent" industrial robots of the future should have moral responsibilities towards their human colleagues. We also propose that the implementation of moral responsibility is radically different from standard safety measures.
