Towards Embodied Perspective: Exploring first-person, stereoscopic, 4K, wall-sized rendering of embodied sculpting
KTH, School of Computer Science and Communication (CSC).
2014 (English) Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

The central goal of this thesis is creating and testing technology to produce embodied interaction experiences. Embodied interaction is the sense that we inhabit a digital space with our minds operating on it as if it were our physical bodies, without conscious thought, but as naturally as reaching out with your fingers and touching the object in front of you. Traditional interaction techniques such as keyboard and mouse get in the way of achieving embodiment.

In this thesis, we have created an embodied perspective of virtual three-dimensional objects floating in front of a user. Users can see the object from a first-person perspective without a heads-up display, and can change the perspective of the object by shifting their point of view. The technology and affordances to make this possible in an unobtrusive, practical and efficient way are the subject of this thesis.

Using a depth sensor, Microsoft's Kinect [7], we track the user's position in front of a screen in real time, making it possible to adjust the perspective seen by each of the user's eyes to fit their real point of view, in order to achieve a 3D embodied interaction outside the screen. We combined the first-person perspective with an embodied sculpting project that includes a wireless haptic glove, which lets the user feel when touching the model, and a small one-hand remote controller used to rotate the object as the user desires when pressing its single button.

We have achieved what we call Embodied Perspective: an outside-screen stereoscopic visualization that reacts to body interaction as if the visualization were really where the user perceives it, thanks to the data from the depth sensor. This method does not block the user's view of their own body, but fits and matches their brain's perception.

When applied to virtual sculpting (embodied sculpting), it gives the user the ability to feel and understand their actions much better: where they are touching or sculpting, and how they should move to reach where they want, since the movements are the same ones they would perform in a real-world sculpting situation.

A further study of the viability of this method, not only for single-person interaction but also for group visualization of a single user's perspective, is discussed and proposed.
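The head-coupled stereo rendering described above can be sketched as follows. This is a minimal illustration, not code from the thesis: it assumes a screen centered at the origin in the z = 0 plane, a tracked head position in meters in screen-aligned coordinates (as a Kinect-style sensor could provide), and a default interpupillary distance; the function names, parameters, and the off-axis frustum formulation are all illustrative choices.

```python
def offaxis_frustum(eye, screen_w, screen_h, near):
    """Off-axis view-frustum bounds (left, right, bottom, top at the near
    plane) for a screen of size screen_w x screen_h centered at the origin
    in the z = 0 plane, seen from an eye at (ex, ey, ez) with ez > 0.
    Projecting the screen edges onto the near plane yields a frustum that
    keeps the image glued to the physical screen as the viewer moves."""
    ex, ey, ez = eye
    scale = near / ez  # similar triangles: screen edge -> near plane
    left = (-screen_w / 2 - ex) * scale
    right = (screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top = (screen_h / 2 - ey) * scale
    return left, right, bottom, top


def stereo_eyes(head, ipd=0.064):
    """Left and right eye positions from one tracked head position,
    separated horizontally by the interpupillary distance (default 64 mm)."""
    hx, hy, hz = head
    return (hx - ipd / 2, hy, hz), (hx + ipd / 2, hy, hz)


# Each frame: read the tracked head position, derive the two eye
# positions, and build one off-axis frustum per eye for stereo rendering.
head = (0.10, 0.05, 1.5)  # user standing 1.5 m from the screen
left_eye, right_eye = stereo_eyes(head)
left_view = offaxis_frustum(left_eye, 4.0, 2.25, 0.1)
right_view = offaxis_frustum(right_eye, 4.0, 2.25, 0.1)
```

With a centered eye this reduces to the usual symmetric frustum; as the viewer moves off-center the frustum skews, which is what makes the stereoscopic object appear to stay fixed in space in front of the screen.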

National Category
Computer Science
URN: urn:nbn:se:kth:diva-155950
OAI: diva2:763672
Available from: 2014-11-19. Created: 2014-11-17. Last updated: 2014-11-19. Bibliographically approved.

Open Access in DiVA

fulltext (2999 kB), 60 downloads
File information
File name: FULLTEXT01.pdf
File size: 2999 kB
Checksum: SHA-512
Type: fulltext
Mimetype: application/pdf


