Should Beat Gestures Be Learned Or Designed?: A Benchmarking User Study
KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL. ORCID iD: 0000-0001-9838-8848
KTH, School of Electrical Engineering and Computer Science (EECS), Robotics, Perception and Learning, RPL. ORCID iD: 0000-0002-5750-9655
2019 (English). In: ICDL-EPIROB 2019: Workshop on Naturalistic Non-Verbal and Affective Human-Robot Interactions, IEEE conference proceedings, 2019. Conference paper, Published paper (Refereed)
Abstract [en]

In this paper, we present a user study on generated beat gestures for humanoid agents. It has been shown that Human-Robot Interaction can be improved by including communicative non-verbal behavior, such as arm gestures. Beat gestures are one of the four types of arm gestures and are known to be used for emphasizing parts of speech. In our user study, we compare beat gestures learned from training data with hand-crafted beat gestures. The first kind of gesture is generated by a machine learning model trained on speech audio and human upper-body poses. We compared this approach with three hand-coded beat gesture methods: designed beat gestures, timed beat gestures, and noisy gestures. Forty-one subjects participated in our user study, and a ranking was derived from paired comparisons using the Bradley-Terry-Luce model. We found that for beat gestures, the gestures from the machine learning model are preferred, followed by algorithmically generated gestures. This emphasizes the promise of machine learning for generating communicative actions.
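The ranking step described in the abstract — deriving a preference ordering over gesture conditions from paired comparisons via the Bradley-Terry-Luce model — can be sketched as below. This is a minimal illustration using Hunter's MM updates; the win-count matrix and the four condition labels are hypothetical placeholders, not the paper's actual data or code.

```python
import numpy as np

def bradley_terry(wins, n_iter=200):
    """Fit Bradley-Terry strengths from a pairwise win-count matrix.

    wins[i, j] = number of times condition i was preferred over condition j.
    Uses Hunter's MM update; returns strengths normalized to sum to 1.
    """
    n = wins.shape[0]
    p = np.ones(n)
    for _ in range(n_iter):
        total = wins + wins.T                       # comparisons per pair
        denom = total / (p[:, None] + p[None, :])   # n_ij / (p_i + p_j)
        p = wins.sum(axis=1) / denom.sum(axis=1)    # MM update for each p_i
        p /= p.sum()                                # fix the arbitrary scale
    return p

# Hypothetical preference counts for four conditions
# (learned, designed, timed, noisy); each pair judged by 41 subjects.
wins = np.array([
    [ 0, 25, 28, 35],
    [16,  0, 24, 30],
    [13, 17,  0, 27],
    [ 6, 11, 14,  0],
])
strengths = bradley_terry(wins)
ranking = np.argsort(-strengths)  # condition indices, most to least preferred
```

With a balanced design like this (every pair compared equally often), the fitted strength ordering coincides with the total-wins ordering, so `ranking` here places the "learned" condition first.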

Place, publisher, year, edition, pages
IEEE conference proceedings, 2019.
Keywords [en]
gesture generation, machine learning, beat gestures, user study, virtual agents
National Category
Human Computer Interaction
Research subject
Computer Science
Identifiers
URN: urn:nbn:se:kth:diva-255998
OAI: oai:DiVA.org:kth-255998
DiVA, id: diva2:1342898
Conference
ICDL-EPIROB 2019 Workshop on Naturalistic Non-Verbal and Affective Human-Robot Interactions
Note

QC 20190815

Available from: 2019-08-14. Created: 2019-08-14. Last updated: 2019-08-15. Bibliographically approved.

Open Access in DiVA

fulltext (1434 kB), 6 downloads
File information
File name: FULLTEXT01.pdf. File size: 1434 kB. Checksum: SHA-512
1b443fbc48c8f4fb0ffae92a8a7708ea093264787bf886e740eeb7ddf31abe54559a9a8234170d254fdbdf46e378f3756a4f9f6d7d283472a0b4b5a9d4567544
Type: fulltext. Mimetype: application/pdf

Search in DiVA

By author/editor
Kucherenko, Taras; Kjellström, Hedvig
By organisation
Robotics, Perception and Learning, RPL
Human Computer Interaction

