Responsive Joint Attention in Human-Robot Interaction
KTH, School of Electrical Engineering and Computer Science (EECS), Intelligent systems, Speech, Music and Hearing, TMH, Speech Communication and Technology.
Computer-Human Interaction Lab for Learning & Instruction, École Polytechnique Fédérale de Lausanne, Switzerland.
2019 (English) Conference paper, Published paper (Refereed)
Abstract [en]

Joint attention has been shown to be crucial not only for human-human interaction but also for human-robot interaction. Joint attention can help to make cooperation more efficient, support disambiguation in instances of uncertainty, and make interactions appear more natural and familiar. In this paper, we present an autonomous gaze system that uses multimodal perception capabilities to model responsive joint attention mechanisms. We investigate the effects of our system on people's perception of a robot within a problem-solving task. Results from a user study suggest that responsive joint attention mechanisms evoke higher perceived feelings of social presence on scales that concern the direction of the robot's perception.
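
The paper itself does not include code; as a rough illustration of what a responsive joint-attention policy can look like, the Python sketch below follows the human's estimated gaze target once the perception estimate is confident and stable. All class names, thresholds, and the frame-based perception interface are assumptions made for illustration, not details of the authors' system.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class GazeEstimate:
        """One per-frame estimate of what the human is looking at."""
        target: str          # label of the object the human appears to attend to
        confidence: float    # perception confidence in [0, 1]

    class ResponsiveJointAttention:
        """Follow the human's attention target once it is confident and stable."""

        def __init__(self, confidence_threshold: float = 0.7, dwell_frames: int = 3):
            self.confidence_threshold = confidence_threshold
            self.dwell_frames = dwell_frames      # frames the estimate must persist
            self._candidate: Optional[str] = None
            self._count = 0
            self.robot_target: Optional[str] = None

        def update(self, estimate: GazeEstimate) -> Optional[str]:
            """Return the object the robot should shift its gaze to, or None to hold."""
            if estimate.confidence < self.confidence_threshold:
                self._candidate, self._count = None, 0
                return None
            if estimate.target == self._candidate:
                self._count += 1
            else:
                self._candidate, self._count = estimate.target, 1
            if self._count >= self.dwell_frames and estimate.target != self.robot_target:
                self.robot_target = estimate.target
                return self.robot_target          # shift gaze to share attention
            return None

    # Usage: feed per-frame gaze estimates from a (hypothetical) perception pipeline.
    policy = ResponsiveJointAttention()
    for frame in [GazeEstimate("red_block", 0.9)] * 4:
        shift = policy.update(frame)
        if shift:
            print(f"Robot shifts gaze to {shift}")

The dwell-frame check is one simple way to keep the robot from flicking its gaze on every noisy frame; the actual system described in the paper may use a different mechanism.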

Place, publisher, year, edition, pages
2019. p. 1080-1087
National Category
Interaction Technologies
Identifiers
URN: urn:nbn:se:kth:diva-267228
DOI: 10.1109/IROS40897.2019.8968130
OAI: oai:DiVA.org:kth-267228
DiVA, id: diva2:1391479
Conference
2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Note

QC 20200217

Available from: 2020-02-04. Created: 2020-02-04. Last updated: 2020-02-17. Bibliographically approved.

Open Access in DiVA

fulltext (3019 kB)
File name: FULLTEXT01.pdf
File size: 3019 kB
Checksum (SHA-512): d7bb35b9902a0d15faa221bc78275ec6cd70ad999b186ae3381d5aa8da9bfbeb4c4df148c7cdd138a722b20959f58e43c040c842f4d5cfe121b4c3f604eef545
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

Search in DiVA

By author/editor
Abelho Pereira, André Tiago; Oertel, Catharine; Gustafson, Joakim
By organisation
Speech Communication and Technology; Speech, Music and Hearing, TMH
Interaction Technologies

