Visual Inertial Navigation and Calibration
Linköping University, Department of Electrical Engineering, Automatic Control. Linköping University, The Institute of Technology.
2011 (English) Licentiate thesis, comprehensive summary (Other academic)
Abstract [en]

Processing and interpretation of visual content are essential to many systems and applications. This requires knowledge of how the content is sensed and also of what is sensed. Such knowledge is captured in models which, depending on the application, can be very advanced or simple. An application example is scene reconstruction using a camera: if a suitable model of the camera is known, then a model of the scene can be estimated from images acquired at different, unknown locations; the quality of the scene model, however, depends on the quality of the camera model. The opposite problem is to estimate the camera model and the unknown locations using a known scene model. In this work, two such problems are treated in two rather different applications.

There is an increasing need for navigation solutions that are less dependent on external navigation systems such as the Global Positioning System (GPS). Simultaneous Localisation and Mapping (SLAM) provides such a solution by estimating both the navigation states and some properties of the environment without relying on any external navigation system.

The first problem concerns visual inertial navigation and mapping using a monocular camera and inertial measurements, which is a SLAM problem. Our aim is to provide improved estimates of the navigation states and a landmark map, given a SLAM solution. To do this, the measurements are fused in an Extended Kalman Filter (EKF), and the filtered estimates are then used as the starting solution of a nonlinear least-squares problem, which is solved using the Gauss-Newton method. This approach is evaluated on experimental data with accurate ground truth for reference.
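As a rough illustration of the refinement step described above (the EKF output used to initialise a Gauss-Newton iteration on a nonlinear least-squares cost), the following Python sketch shows a generic Gauss-Newton loop. It is only a sketch under the stated assumptions; the callables residual and jacobian and the variable x0 are hypothetical placeholders and do not correspond to the thesis implementation.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, max_iter=20, tol=1e-8):
    """Minimise 0.5 * ||r(x)||^2 with Gauss-Newton, starting from x0.

    residual : callable returning the stacked residual vector r(x)
               (process- and measurement-model errors in the SLAM case).
    jacobian : callable returning the Jacobian J(x) of r at x.
    x0       : initial estimate, e.g. the EKF-filtered states and landmarks.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = residual(x)
        J = jacobian(x)
        # Gauss-Newton step: minimise ||J dx + r||^2, i.e. solve the
        # normal equations J^T J dx = -J^T r via a least-squares solve.
        dx = np.linalg.lstsq(J, -r, rcond=None)[0]
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x
```

In the SLAM setting, x0 would stack the filtered navigation states and landmark positions, and the residual would collect the inertial process-model and camera measurement-model prediction errors.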

In Augmented Reality (AR), additional information is superimposed onto the surrounding environment in real time to reinforce our impressions. For this to be a pleasant experience, it is necessary to have good models of the AR system and the environment.

The second problem concerns calibration of an Optical See-Through Head Mounted Display (OSTHMD) system, which is a wearable AR system. We show and motivate how the pinhole camera model can be used to represent the OSTHMD and the user's eye position. The pinhole camera model is estimated using the Direct Linear Transformation algorithm. The results are evaluated in experiments which also compare different data acquisition methods.
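For context, the Direct Linear Transformation (DLT) estimates the 3x4 pinhole projection matrix from 3D-2D point correspondences by solving a homogeneous linear system with an SVD. The sketch below is a minimal, unnormalised textbook version in Python; the function name and interface are illustrative assumptions, not the procedure actually implemented in the thesis.

```python
import numpy as np

def dlt_projection_matrix(X, x):
    """Estimate a 3x4 projection matrix P such that x ~ P X (homogeneous).

    X : (N, 3) array of 3D points; x : (N, 2) array of image points; N >= 6.
    Builds the standard 2N x 12 DLT system A p = 0 and returns the right
    singular vector of A with the smallest singular value, reshaped to 3x4.
    """
    A = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = [Xw, Yw, Zw, 1.0]
        A.append([0, 0, 0, 0] + [-c for c in Xh] + [v * c for c in Xh])
        A.append(Xh + [0, 0, 0, 0] + [-u * c for c in Xh])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)
```

In the OSTHMD calibration, the 2D points would be the display coordinates of the alignment marks and the 3D points their world coordinates; the camera centre of the estimated matrix then corresponds to the user's eye position.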

Place, publisher, year, edition, pages
Linköping: Linköping University Electronic Press, 2011. 39 p.
Series
Linköping Studies in Science and Technology. Thesis, ISSN 0280-7971 ; 1500
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-68858; Local ID: LiU-TEK-LIC-2011:39; OAI: oai:DiVA.org:liu-68858; DiVA: diva2:421464
Presentation
2011-06-16, Visionen, Hus B, Campus Valla, Linköpings universitet, Linköping, 14:30 (English)
Available from: 2011-06-08. Created: 2011-06-08. Last updated: 2011-06-13. Bibliographically approved.
List of papers
1. A Nonlinear Least-Squares Approach to the SLAM Problem
2011 (English) In: Proceedings of the 18th IFAC World Congress, 2011: World Congress, Volume 18, Part 1 / [ed] Sergio Bittanti, Angelo Cenedese and Sandro Zampieri, IFAC Papers Online, 2011, 4759-4764 p. Conference paper (Refereed)
Abstract [en]

In this paper we present a solution to the simultaneous localisation and mapping (SLAM) problem using a camera and inertial sensors. Our approach is based on the maximum a posteriori (MAP) estimate of the complete SLAM problem. The resulting problem is posed in a nonlinear least-squares framework which we solve with the Gauss-Newton method. The proposed algorithm is evaluated on experimental data using a sensor platform mounted on an industrial robot. In this way, accurate ground truth is available, and the results are encouraging.
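For readers unfamiliar with the connection between the MAP estimate and nonlinear least squares: under Gaussian process and measurement noise, the MAP problem over the complete trajectory and map reduces to a sum of weighted squared prediction errors. The formulation below uses generic notation (x for navigation states, m for landmarks, f and h for process and measurement models, Q, R and P_0 for noise and prior covariances) and is not copied from the paper:

```latex
(\hat{x}_{0:N}, \hat{m})
  = \arg\max_{x_{0:N},\, m} \; p(x_{0:N}, m \mid y_{1:N})
  = \arg\min_{x_{0:N},\, m} \Big\{ \|x_0 - \bar{x}_0\|_{P_0^{-1}}^{2}
    + \sum_{k=1}^{N} \|x_k - f(x_{k-1})\|_{Q^{-1}}^{2}
    + \sum_{k=1}^{N} \|y_k - h(x_k, m)\|_{R^{-1}}^{2} \Big\}
```

It is this nonlinear least-squares problem that the Gauss-Newton iterations solve, with the filtered estimates as the starting point.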

Place, publisher, year, edition, pages
IFAC Papers Online, 2011
Keywords
Inertial measurement units, Cameras, Smoothing, Dynamic systems, State estimation
National Category
Control Engineering
Identifiers
URN: urn:nbn:se:liu:diva-68857; DOI: 10.3182/20110828-6-IT-1002.02042; ISBN: 978-3-902661-93-7
Conference
The 18th IFAC World Congress, August 28 - September 2, 2011, Milano, Italy
Available from: 2011-06-08. Created: 2011-06-08. Last updated: 2016-05-03. Bibliographically approved.
2. Parameter Estimation Variance of the Single Point Active Alignment Method in Optical See-Through Head Mounted Display Calibration
2011 (English) In: Proceedings of the IEEE Virtual Reality Conference / [ed] Michitaka Hirose, Benjamin Lok, Aditi Majumder and Dieter Schmalstieg, Piscataway, NJ, USA: IEEE, 2011, 27-34 p. Conference paper (Refereed)
Abstract [en]

The parameter estimation variance of the Single Point Active Alignment Method (SPAAM) is studied through an experiment in which 11 subjects are instructed to create alignments using an Optical See-Through Head Mounted Display (OSTHMD) such that three separate correspondence point distributions are acquired. Modeling the OSTHMD and the subject's dominant eye as a pinhole camera, the findings show that a correspondence point distribution well distributed along the user's line of sight yields parameter estimates with lower variance. The estimated eye point location is studied in particular detail. The findings of the experiment are complemented with simulated data, which show that the image plane orientation is sensitive to the number of correspondence points. The simulated data also illustrate some interesting properties of the numerical stability of the calibration problem as a function of alignment noise, number of correspondence points, and correspondence point distribution.
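Since the estimated eye point is studied in particular detail, it may help to recall that under the pinhole model the eye position is the camera centre of the estimated 3x4 projection matrix, i.e. the point the matrix maps to zero. A minimal Python sketch of this generic textbook computation (not the paper's code) is:

```python
import numpy as np

def camera_centre(P):
    """Return the camera (eye) centre of a 3x4 projection matrix P.

    The centre C satisfies P @ [C, 1] = 0, so it spans the right null
    space of P; here it is taken from the last right singular vector.
    """
    _, _, Vt = np.linalg.svd(P)
    C = Vt[-1]
    return C[:3] / C[3]  # dehomogenise to Euclidean coordinates
```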

Place, publisher, year, edition, pages
Piscataway, NJ, USA: IEEE, 2011
Series
IEEE Virtual Reality Conference, ISSN 1087-8270
Keywords
single point active alignment method, camera resectioning, calibration, optical see-through head mounted display, augmented reality
National Category
Engineering and Technology
Identifiers
URN: urn:nbn:se:liu:diva-67233; DOI: 10.1109/VR.2011.5759432; ISI: 000297260400004; ISBN: 978-1-4577-0037-8 (online), 978-1-4577-0039-2 (print)
Conference
IEEE Virtual Reality Conference, pages 27–34, Singapore, Republic of Singapore
Available from: 2011-04-04. Created: 2011-04-04. Last updated: 2015-09-22. Bibliographically approved.

Open Access in DiVA

Visual Inertial Navigation and Calibration (fulltext, FULLTEXT01.pdf, application/pdf, 480 kB)
omslag (cover, COVER01.pdf, application/pdf, 42 kB)

Author/editor
Skoglund, Martin A.