Pose AR: Assessing Pose Based Input in an AR Context
Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
2019 (English). Independent thesis, advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Despite the rapidly growing adoption of augmented reality (AR) applications, existing methods for interacting with AR content are rated poorly; surveys of the field call for better means of interaction, while researchers strive to create more natural input methods, focusing mainly on gesture input.

This thesis aims to contribute to these efforts by recognizing that consumer-grade, smartphone-based pose estimation has improved rapidly in recent years and, thanks to its increased accuracy, may hold untapped potential for user input. To this end, a rudimentary system for pose-based input is integrated into prototype applications designed with both pose-based input and touch input in mind.

In this work, pose, pose estimation, and pose-based input refer to using the distance and orientation of the user (or, more precisely, of their device) in relation to the AR content.
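
The thesis does not reproduce its implementation here, but the quantities named above can be illustrated with a minimal, hypothetical sketch: given the device (camera) transform reported by an AR framework and the transform of a piece of AR content, the distance and viewing angle that a pose-based input scheme could react to might be derived as follows. The function name, the use of 4x4 world-space transforms, and the camera-looks-along-negative-Z convention are assumptions for illustration, not details taken from the thesis.

import numpy as np

def relative_pose(device_pose, anchor_pose):
    # device_pose: 4x4 matrix, e.g. the camera transform reported by an AR framework.
    # anchor_pose: 4x4 matrix of the AR content the user is interacting with.
    # Returns the Euclidean distance to the content and the angle (radians)
    # between the device's forward axis and the direction towards the content.
    device_pos = device_pose[:3, 3]
    anchor_pos = anchor_pose[:3, 3]

    offset = anchor_pos - device_pos
    distance = np.linalg.norm(offset)

    # Assumed convention: the camera looks along its local -Z axis, so the
    # forward vector is the negated third column of the rotation block.
    forward = -device_pose[:3, 2]
    angle = np.arccos(np.clip(np.dot(forward, offset / distance), -1.0, 1.0))

    return distance, angle

# Example: device one metre in front of the content, looking straight at it.
device = np.eye(4); device[:3, 3] = [0.0, 0.0, 1.0]   # camera at z = +1, facing -Z
anchor = np.eye(4)                                     # content at the origin
print(relative_pose(device, anchor))                   # roughly (1.0, 0.0)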

Using these prototypes in a user interaction study made it possible to identify user preferences that indicate which approaches future efforts to use pose for input in an AR context ought to adopt. Comparing questionnaire answers and logged positional data across four prototype scenarios shows clearly that, for pose input to be perceived as intuitive, an AR experience should not employ a scale so large that it requires substantial shifts in the position of the user, as opposed to merely shifts in the position of the user's device.
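
The abstract does not describe how the logged positional data were analysed; as a purely illustrative sketch under that caveat, one simple proxy for how much a scenario forced the user (rather than merely their device) to move is the spatial extent of the logged device trajectory. The function name and the choice of metric below are hypothetical.

import numpy as np

def movement_extent(positions):
    # positions: N x 3 array of logged device positions (metres) for one scenario.
    # The bounding-box diagonal is a crude proxy: a large value suggests the
    # scenario required whole-body relocation, a small one suggests shifting
    # the device alone was enough.
    positions = np.asarray(positions, dtype=float)
    return float(np.linalg.norm(positions.max(axis=0) - positions.min(axis=0)))

# Hypothetical usage, where logs maps scenario names to position arrays:
# extents = {name: movement_extent(track) for name, track in logs.items()}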

Place, publisher, year, edition, pages
2019, p. 60
Keywords [en]
Augmented Reality, Natural User Interfaces, Mobile AR, Pose Estimation, relative distance and orientation input
National Category
Human Computer Interaction
Identifiers
URN: urn:nbn:se:lnu:diva-89228
OAI: oai:DiVA.org:lnu-89228
DiVA, id: diva2:1353995
Subject / course
Media Technology
Educational program
Social Media and Web Technologies, Master Programme, 120 credits
Presentation
2019-08-30, Online examination, Växjö, 10:00 (English)
Supervisors
Examiners
Available from: 2019-09-26. Created: 2019-09-24. Last updated: 2019-09-26. Bibliographically approved.

Open Access in DiVA

fulltext (24687 kB), 23 downloads
File information
File name: FULLTEXT01.pdf
File size: 24687 kB
Checksum (SHA-512): 4228b0840ef708bf796cc4f88afebe3a89cbef7b32c6c076c15b84b10777d0fc44b13a4731414553da6fad733f9e13a767e4527f0e5c20cc8f4c6a9107bc92f5
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Jakub, Nilsson
By organisation
Department of computer science and media technology (CM)
Human Computer Interaction
