Human Motion Analysis for Creating Immersive Experiences
Umeå University, Faculty of Science and Technology, Department of Applied Physics and Electronics. (Digital Media Lab)
2012 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

From an early age, people display the ability to quickly and effortlessly interpret the orientation and movement of human body parts, thereby allowing one to infer the intentions of others who are nearby and to comprehend an important nonverbal form of communication. The ease with which one accomplishes this task belies the difficulty of a problem that has challenged computational systems for decades, human motion analysis.

Technological developments over the years have resulted in many systems for measuring body segment positions and the angles between segments. In these systems, the human body is typically modelled as a system of rigid links connected by joints, and the motion is estimated from measurements provided by mechanical, optical, magnetic, or inertial trackers. Among these sensing modalities, optical sensing encompasses a large and varied collection of technologies.
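
As a generic illustration of the rigid-link body model (not taken from the thesis), the sketch below computes the joint positions of a planar two-link chain from its joint angles; the link lengths and angles are made-up example values.

```python
# Generic illustration (not from the thesis): a planar two-link rigid-body
# chain. In practice the joint angles would come from mechanical, optical,
# magnetic, or inertial measurements.
import numpy as np

def forward_kinematics(joint_angles, link_lengths):
    """Return the 2D positions of each joint of a planar kinematic chain."""
    positions = [np.zeros(2)]          # chain root at the origin
    total_angle = 0.0
    for angle, length in zip(joint_angles, link_lengths):
        total_angle += angle           # each joint rotates the rest of the chain
        step = length * np.array([np.cos(total_angle), np.sin(total_angle)])
        positions.append(positions[-1] + step)
    return positions

# Example: shoulder at 30 degrees, elbow flexed a further 45 degrees.
print(forward_kinematics([np.radians(30), np.radians(45)], [0.30, 0.25]))
```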

In a computer vision context, human motion analysis studies methods and applications in which two or more consecutive images from an image sequence, e.g. captured by a video camera, are processed to produce information based on the apparent motion of the human body in the images.

Many different disciplines employ motion analysis systems to capture the movement and posture of the human body for applications such as medical diagnostics, virtual reality, and human-computer interaction.

This thesis gives an insight into state-of-the-art human motion analysis systems and provides new methods for capturing human motion.

Place, publisher, year, edition, pages
Umeå: Umeå University, 2012. 71 p.
Series
Digital Media Lab, ISSN 1652-6295 ; 15
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
URN: urn:nbn:se:umu:diva-55832
ISBN: 978-91-7459-416-4 (print)
OAI: oai:DiVA.org:umu-55832
DiVA: diva2:530604
Presentation
2012-04-13, room A305, Department of Applied Physics and Electronics, Umeå University, Umeå, 10:00 (English)
Available from: 2012-06-04 Created: 2012-06-04 Last updated: 2012-06-04 Bibliographically approved
List of papers
1. 3D Active Human Motion Estimation for Biomedical Applications
2012 (English) In: World Congress on Medical Physics and Biomedical Engineering, May 26-31, 2012, Beijing, China / [ed] Mian Long, Springer Berlin/Heidelberg, 2012, 4 p., p. 1014-1017. Conference paper, Published paper (Refereed)
Abstract [en]

Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost effective. Second, they cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to the above-mentioned issues. By mounting the camera on the human body, we build a convenient, low-cost motion capture system that patients can use while practicing daily-life activities. We refer to this system as active motion capture, since it is not confined to the lab environment. Real-time experiments in our lab demonstrated the robustness and accuracy of the system.
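
The record gives no implementation details, but a body-mounted ("active") camera suggests estimating the wearer's motion from frame-to-frame feature correspondences. The sketch below is only a hypothetical illustration using SIFT (one of the paper's keywords) and a standard essential-matrix pipeline; the calibration matrix K and the recoverPose step are assumptions, not the authors' actual method.

```python
# Hypothetical sketch: camera ego-motion from SIFT matches between consecutive
# frames of a body-mounted camera. Not the authors' implementation.
import cv2
import numpy as np

K = np.array([[525.0,   0.0, 320.0],    # assumed camera intrinsics
              [  0.0, 525.0, 240.0],
              [  0.0,   0.0,   1.0]])

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)

def relative_motion(frame_prev, frame_curr):
    """Estimate rotation R and translation direction t between two frames."""
    kp1, des1 = sift.detectAndCompute(frame_prev, None)
    kp2, des2 = sift.detectAndCompute(frame_curr, None)
    matches = matcher.match(des1, des2)
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    # RANSAC on the essential matrix rejects mismatched features.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t  # t is recovered only up to scale from a single camera
```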

Place, publisher, year, edition, pages
Springer Berlin/Heidelberg, 2012. 4 p.
Series
IFMBE Proceedings, ISSN 1680-0737 ; 39
Keyword
Active motion tracking, Human motion analysis, Movement disorder, SIFT
National Category
Biomedical Laboratory Science/Technology
Identifiers
urn:nbn:se:umu:diva-55831 (URN)
10.1007/978-3-642-29305-4_266 (DOI)
978-3-642-29304-7 (print) (ISBN)
978-3-642-29305-4 (ISBN)
Conference
World Congress on Medical Physics and Biomedical Engineering (WC 2012), Beijing, China, 26-31 May 2012
Available from: 2012-06-04 Created: 2012-06-04 Last updated: 2014-05-21 Bibliographically approved
2. 3D Head Pose Estimation Using the Kinect
2011 (English) Conference paper, Published paper (Refereed)
Abstract [en]

Head pose estimation plays an essential role in bridging the information gap between humans and computers. Conventional head pose estimation is mostly performed on images captured by cameras; however, accurate and robust pose estimation is often problematic. In this paper we present an algorithm for recovering the six degrees of freedom (DOF) of motion of a head from a sequence of range images taken by the Microsoft Kinect for Xbox 360. The proposed algorithm utilizes a least-squares minimization of the difference between the measured rate of change of depth at a point and the rate predicted by the depth rate constraint equation. We segment the human head from its surroundings and background, and then estimate the head motion. Our system is capable of recovering the six DOF of head motion for multiple people in one image. The proposed system has been evaluated in our lab and shows superior results.
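
The abstract describes a least-squares fit of the six rigid-motion parameters to a depth rate constraint. As a rough illustration only, the sketch below assumes an orthographic range-flow formulation; the paper's exact constraint equation, projection model, and variable names are not given in this record, so everything here is an assumption.

```python
# Hypothetical least-squares sketch of a depth rate (range flow) constraint.
# The paper's exact parameterization and projection model may differ.
import numpy as np

def estimate_rigid_motion(X, Y, Z, Zt, mask):
    """Recover a 6-DOF motion (tx, ty, tz, wx, wy, wz) of the segmented head.

    X, Y, Z : per-pixel 3D coordinates from the range image (2D arrays)
    Zt      : temporal derivative of depth between consecutive range images
    mask    : boolean array selecting the segmented head pixels
    """
    # Surface gradients p = dZ/dX, q = dZ/dY (crude finite-difference estimate).
    q, p = np.gradient(Z)
    x, y, z = X[mask], Y[mask], Z[mask]
    pp, qq, b = p[mask], q[mask], Zt[mask]
    # Each pixel gives one linear equation in the six unknown motion parameters:
    # Zt = tz + wx*Y - wy*X - p*(tx + wy*Z - wz*Y) - q*(ty + wz*X - wx*Z)
    A = np.column_stack([-pp, -qq, np.ones_like(b),
                         y + qq * z, -x - pp * z, pp * y - qq * x])
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params  # (tx, ty, tz, wx, wy, wz) per frame interval
```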

Place, publisher, year, edition, pages
IEEE conference proceedings, 2011
National Category
Medical Engineering
Identifiers
urn:nbn:se:umu:diva-52815 (URN)
10.1109/WCSP.2011.6096866 (DOI)
978-1-4577-1008-7 (ISBN)
Conference
IEEE International Conference on Wireless Communications and Signal Processing (WCSP 2011), Nanjing, China, 9-11 November 2011
Available from: 2012-03-02 Created: 2012-03-02 Last updated: 2012-06-04 Bibliographically approved
3. Tracking fingers in 3D space for mobile interaction
2010 (English) Conference paper, Published paper (Refereed)
Abstract [en]

The number of mobile devices such as mobile phones and PDAs has increased dramatically over recent years. New mobile devices are equipped with integrated cameras and large displays, which make interaction with the device easier and more efficient. Although most previous work on interaction between humans and mobile devices is based on 2D touch-screen displays, camera-based interaction opens a new way to manipulate in the 3D space behind the device, within the camera's field of view. In this paper, our gestural interaction relies heavily on particular patterns of local image orientation called Rotational Symmetries. The approach is based on finding the most suitable pattern from a large set of rotational symmetries of different orders, which ensures a reliable detector for fingertips and human gestures. Consequently, gesture detection and tracking can be used as an efficient tool for 3D manipulation in various computer vision and augmented reality applications.
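
The record does not spell out the detector, but rotational symmetries are commonly computed by correlating a double-angle local-orientation image with complex filters exp(i·n·φ) of different orders n. The sketch below illustrates that general idea only; the filter window, the orders used, and the fingertip decision rule in the paper are not given here and are therefore assumptions.

```python
# Hypothetical sketch: n:th order rotational symmetry response from local
# orientation in double-angle form. Filter size and order are illustrative.
import numpy as np
from scipy.ndimage import sobel, convolve

def rotational_symmetry_response(image, order=1, radius=9):
    """Complex response map; large magnitude = strong n:th order symmetry."""
    # Local orientation in double-angle form: z = (fx + i*fy)^2
    fx = sobel(image.astype(float), axis=1)
    fy = sobel(image.astype(float), axis=0)
    z = (fx + 1j * fy) ** 2

    # Complex basis filter exp(i * order * phi) on a disc-shaped window.
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    phi = np.arctan2(ys, xs)
    window = (xs ** 2 + ys ** 2 <= radius ** 2).astype(float)
    basis = window * np.exp(1j * order * phi)

    # Convolve real and imaginary parts separately and recombine.
    real = convolve(z.real, basis.real) - convolve(z.imag, basis.imag)
    imag = convolve(z.real, basis.imag) + convolve(z.imag, basis.real)
    return real + 1j * imag
```

A fingertip candidate would then be a local maximum of the response magnitude for whichever symmetry order best matches the fingertip pattern.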

Keyword
Mobile interaction, Rotational Symmetries, Gesture detection, Tracking
National Category
Signal Processing
Identifiers
urn:nbn:se:umu:diva-53536 (URN)
Conference
20th International Conference on Pattern Recognition (ICPR)
Available from: 2012-03-30 Created: 2012-03-30 Last updated: 2013-08-08 Bibliographically approved
4. Real 3D Interaction Behind Mobile Phones for Augmented Environments
2011 (English) In: 2011 IEEE International Conference on Multimedia and Expo (ICME), IEEE conference proceedings, 2011, p. 1-6. Conference paper, Published paper (Refereed)
Abstract [en]

The number of mobile devices such as mobile phones and PDAs has increased dramatically over recent years. New mobile devices are equipped with integrated cameras and large displays, which make interaction with the device easier and more efficient. Although most previous work on interaction between humans and mobile devices is based on 2D touch-screen displays, camera-based interaction opens a new way to manipulate in the 3D space behind the device, within the camera's field of view. This paper suggests the use of particular patterns of local image orientation, called Rotational Symmetries, to detect and localize human gestures. The relative rotation and translation of the gesture between consecutive frames are estimated by extracting stable features. Consequently, this information can be used to facilitate 3D manipulation of virtual objects in various mobile-device applications.
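
How the relative rotation and translation are parameterized (in the image plane or in full 3D) is not stated in this record. Purely as an illustration, the sketch below recovers an in-plane rotation and translation of the gesture region from matched SIFT keypoints (SIFT being one of the paper's keywords); the robust similarity fit is an assumption, not the authors' pipeline.

```python
# Hypothetical sketch: in-plane rotation and translation of a tracked gesture
# region between consecutive frames, from matched SIFT keypoints.
import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True)

def gesture_motion(prev_roi, curr_roi):
    """Return (rotation in degrees, translation in pixels) of the gesture region."""
    kp1, des1 = sift.detectAndCompute(prev_roi, None)
    kp2, des2 = sift.detectAndCompute(curr_roi, None)
    matches = matcher.match(des1, des2)
    src = np.float32([kp1[m.queryIdx].pt for m in matches])
    dst = np.float32([kp2[m.trainIdx].pt for m in matches])
    # Robustly fit a similarity transform (rotation + translation + scale).
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    angle = np.degrees(np.arctan2(M[1, 0], M[0, 0]))
    return angle, M[:, 2]
```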

Place, publisher, year, edition, pages
IEEE conference proceedings, 2011
Series
IEEE International Conference on Multimedia and Expo, ISSN 1945-7871
Keyword
Mobile interaction, rotational symmetries, SIFT, 3D manipulation
National Category
Interaction Technologies
Identifiers
urn:nbn:se:umu:diva-52816 (URN)
10.1109/ICME.2011.6012155 (DOI)
000304354700154 (ISI)
978-1-61284-349-0 (ISBN)
Conference
2011 IEEE International Conference on Multimedia and Expo (ICME), Barcelona, Spain, July 11-15, 2011
Available from: 2012-03-02 Created: 2012-03-02 Last updated: 2017-01-16 Bibliographically approved

Open Access in DiVA

fulltext (2467 kB)