Tactile Interaction and Social Touch: Classifying Human Touch using a Soft Tactile Sensor
University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. (Interaction Lab)
University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. (Interaction Lab)
University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. (Interaction Lab). ORCID iD: 0000-0002-6568-9342
University of Skövde, School of Engineering Science. University of Skövde, The Virtual Systems Research Centre. (Användarcentrerad produktdesign, User Centred Product Design). ORCID iD: 0000-0003-4596-3815
2017 (English). In: HAI '17: Proceedings of the 5th International Conference on Human Agent Interaction, New York: Association for Computing Machinery (ACM), 2017, p. 523-526. Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

This paper presents an ongoing study on affective human-robot interaction. In our previous research, touch type was shown to be informative for communicated emotion. Here, a soft matrix array sensor is used to capture the tactile interaction between human and robot, and six machine learning methods, including CNN, RNN, and C3D, are implemented to classify different touch types, constituting a pre-stage to recognizing emotional tactile interaction. Results show an average recognition rate of 95% by C3D for the classified touch types, which provides stable classification results for developing social touch technology.
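To make the data setup concrete: a matrix pressure sensor yields a touch gesture as a sequence of frames of shape (T, H, W), which is the input the paper's CNN/RNN/C3D models consume. The sketch below is not the authors' method — it is a minimal nearest-centroid baseline on synthetic data, with all array sizes, touch classes ("pat", "stroke"), and generators invented for illustration, showing only the shape of such a classification pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_touch(kind, T=10, H=8, W=8):
    """Generate a synthetic (T, H, W) touch sequence -- not real sensor data."""
    seq = np.zeros((T, H, W))
    if kind == "pat":              # brief, repeated localized presses
        for t in range(0, T, 3):
            seq[t, 3:5, 3:5] = 1.0
    else:                          # "stroke": activation sweeping across columns
        for t in range(T):
            seq[t, :, int(t * W / T)] = 1.0
    return seq + 0.05 * rng.standard_normal((T, H, W))

def features(seq):
    # time-averaged pressure map, flattened to a feature vector
    return seq.mean(axis=0).ravel()

# Build class centroids from a small synthetic training set
classes = ["pat", "stroke"]
centroids = {k: np.mean([features(make_touch(k)) for _ in range(20)], axis=0)
             for k in classes}

def classify(seq):
    """Assign the class whose centroid is nearest in feature space."""
    f = features(seq)
    return min(classes, key=lambda k: np.linalg.norm(f - centroids[k]))

print(classify(make_touch("pat")))     # -> pat
print(classify(make_touch("stroke")))  # -> stroke
```

A C3D model would instead learn spatio-temporal filters directly over the (T, H, W) volume; the point here is only the frame-sequence representation and the train/classify split.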

Place, publisher, year, edition, pages
New York: Association for Computing Machinery (ACM), 2017. p. 523-526
Keywords [en]
Tactile interaction, social touch, affective HRI, machine learning
National Category
Computer Vision and Robotics (Autonomous Systems)
Research subject
Interaction Lab (ILAB); User Centred Product Design
Identifiers
URN: urn:nbn:se:his:diva-14282
DOI: 10.1145/3125739.3132614
Scopus ID: 2-s2.0-85034856772
ISBN: 978-1-4503-5113-3 (print)
OAI: oai:DiVA.org:his-14282
DiVA, id: diva2:1153767
Conference
5th International Conference on Human-Agent Interaction, Bielefeld, Germany, 17-20 October 2017
Projects
Design, textil och hållbar utveckling (VGR)
Funder
Region Västra Götaland
Available from: 2017-10-31 Created: 2017-10-31 Last updated: 2018-09-04
Bibliographically approved

Open Access in DiVA

fulltext (383 kB), 60 downloads
File information
File name: FULLTEXT02.pdf
File size: 383 kB
Checksum (SHA-512): 765224e9000bf43d0b24abe39fd40d7c76e267be906a494d2d1a0dd618942896d80b76de7f6a50560587d784d8c8b037f93ba875726fc8863d760d506d927353
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus

Search in DiVA

By author/editor
Jiong, Sun
Redyuk, Sergey
Billing, Erik
Högberg, Dan
Hemeren, Paul
By organisation
School of Informatics
The Informatics Research Centre
School of Engineering Science
The Virtual Systems Research Centre

