1 - 27 of 27
  • 1.
    Chandaria, Jigna
    et al.
    BBC Research, United Kingdom.
    Thomas, Graham
    BBC Research, United Kingdom.
    Bartczak, Bogumil
    University of Kiel, Germany.
    Koch, Reinhard
    University of Kiel, Germany.
    Becker, Mario
    Fraunhofer IGD, Germany.
    Bleser, Gabriele
    Fraunhofer IGD, Germany.
    Stricker, Didier
    Fraunhofer IGD, Germany.
    Wohlleber, Cedric
    Fraunhofer IGD, Germany.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Felsberg, Michael
    Linköpings universitet, Institutionen för systemteknik, Bildbehandling. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Skoglund, Johan
    Linköpings universitet, Institutionen för systemteknik, Bildbehandling. Linköpings universitet, Tekniska högskolan.
    Slycke, Per
    Xsens, The Netherlands.
    Smeitz, Sebastiaan
    Xsens, The Netherlands.
    Real-Time Camera Tracking in the MATRIS Project (2007). In: SMPTE Journal, ISSN 0036-1682, Vol. 116, no. 7-8, pp. 266-271. Journal article (peer reviewed)
    Abstract [en]

    In order to insert a virtual object into a TV image, the graphics system needs to know precisely how the camera is moving, so that the virtual object can be rendered in the correct place in every frame. Nowadays this can be achieved relatively easily in post-production, or in a studio equipped with a special tracking system. However, for live shooting on location, or in a studio that is not specially equipped, installing such a system can be difficult or uneconomic. To overcome these limitations, the MATRIS project is developing a real-time system for measuring the movement of a camera. The system uses image analysis to track naturally occurring features in the scene, and data from an inertial sensor. No additional sensors, special markers, or camera mounts are required. This paper gives an overview of the system and presents some results.

  • 2.
    Chandaria, Jigna
    et al.
    BBC Research, UK.
    Thomas, Graham
    BBC Research, UK.
    Bartczak, Bogumil
    University of Kiel, Germany.
    Koeser, Kevin
    University of Kiel, Germany.
    Koch, Reinhard
    University of Kiel, Germany.
    Becker, Mario
    Fraunhofer IGD, Germany.
    Bleser, Gabriele
    Fraunhofer IGD, Germany.
    Stricker, Didier
    Fraunhofer IGD, Germany.
    Wohlleber, Cedric
    Fraunhofer IGD, Germany.
    Felsberg, Michael
    Linköpings universitet, Institutionen för systemteknik, Datorseende. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Skoglund, Johan
    Linköpings universitet, Tekniska högskolan.
    Slycke, Per
    Xsens, Netherlands.
    Smeitz, Sebastiaan
    Xsens, Netherlands.
    Real-Time Camera Tracking in the MATRIS Project (2006). In: Proceedings of the 2006 International Broadcasting Convention, 2006. Conference paper (peer reviewed)
    Abstract [en]

    In order to insert a virtual object into a TV image, the graphics system needs to know precisely how the camera is moving, so that the virtual object can be rendered in the correct place in every frame. Nowadays this can be achieved relatively easily in postproduction, or in a studio equipped with a special tracking system. However, for live shooting on location, or in a studio that is not specially equipped, installing such a system can be difficult or uneconomic. To overcome these limitations, the MATRIS project is developing a real-time system for measuring the movement of a camera. The system uses image analysis to track naturally occurring features in the scene, and data from an inertial sensor. No additional sensors, special markers, or camera mounts are required. This paper gives an overview of the system and presents some results.  

  • 3.
    Gustafsson, Fredrik
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sensor Fusion for Augmented Reality (2009). Report (Other academic)
    Abstract [en]

    The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a nonlinear filter is described, using a dynamic model for the 22 states, where 100 Hz inertial measurements and 12.5 Hz vision measurements are processed. An example where an industrial robot is used to move the sensor unit, possessing almost perfect precision and repeatability, is presented. The results show that position and orientation accuracy is sufficient for a number of augmented reality applications.

  • 4.
    Gustafsson, Fredrik
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sensor Fusion for Augmented Reality (2008). In: Proceedings of the 17th IFAC World Congress, 2008, p. 14100. Conference paper (peer reviewed)
    Abstract [en]

    The problem of estimating the position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and a camera. The sensor fusion approach described in this contribution is based on nonlinear filtering using the measurements from these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a nonlinear filter is described, using a dynamic model for the 22 states, where 100 Hz inertial measurements and 12.5 Hz vision measurements are processed. An example where an industrial robot is used to move the sensor unit, possessing almost perfect precision and repeatability, is presented. The results show that position and orientation accuracy is sufficient for a number of augmented reality applications.

  • 5.
    Hendeby, Gustaf
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Karlsson, Rickard
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    A Graphics Processing Unit Implementation of the Particle Filter (2007). In: Proceedings of the 15th European Statistical Signal Processing Conference, European Association for Signal, Speech, and Image Processing, 2007, pp. 1639-1643. Conference paper (peer reviewed)
    Abstract [en]

    Modern graphics cards for computers, and especially their graphics processing units (GPUs), are designed for fast rendering of graphics. In order to achieve this GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper GPGPU techniques are used to make a parallel GPU implementation of state-of-the-art recursive Bayesian estimation using particle filters (PF). The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to one achieved with a traditional CPU implementation. The resulting GPU filter is faster with the same accuracy as the CPU filter for many particles, and it shows how the particle filter can be parallelized.

  • 6.
    Hendeby, Gustaf
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Karlsson, Rickard
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    A Graphics Processing Unit Implementation of the Particle Filter (2007). Report (Other academic)
    Abstract [en]

    Modern graphics cards for computers, and especially their graphics processing units (GPUs), are designed for fast rendering of graphics. In order to achieve this GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper GPGPU techniques are used to make a parallel GPU implementation of state-of-the-art recursive Bayesian estimation using particle filters (PF). The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to one achieved with a traditional CPU implementation. The resulting GPU filter is faster with the same accuracy as the CPU filter for many particles, and it shows how the particle filter can be parallelized.

  • 7.
    Hendeby, Gustaf
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Karlsson, Rickard
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Graphics Processing Unit Implementation of the Particle Filter (2006). Report (Other academic)
    Abstract [en]

    Modern graphics cards for computers, and especially their graphics processing units (GPUs), are designed for fast rendering of graphics. In order to achieve this GPUs are equipped with a parallel architecture which can be exploited for general-purpose computing on GPU (GPGPU) as a complement to the central processing unit (CPU). In this paper GPGPU techniques are used to make a parallel GPU implementation of state-of-the-art recursive Bayesian estimation using particle filters (PF). The modifications made to obtain a parallel particle filter, especially for the resampling step, are discussed and the performance of the resulting GPU implementation is compared to one achieved with a traditional CPU implementation. The resulting GPU filter is faster with the same accuracy as the CPU filter for many particles, and it shows how the particle filter can be parallelized.

  • 8.
    Hol, Jeroen D.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sensor Fusion and Calibration of Inertial Sensors, Vision, Ultra-Wideband and GPS (2011). Doctoral thesis, monograph (Other academic)
    Abstract [en]

    The usage of inertial sensors has traditionally been confined primarily to the aviation and marine industry due to their associated cost and bulkiness. During the last decade, however, inertial sensors have undergone a rather dramatic reduction in both size and cost with the introduction of MEMS technology. As a result of this trend, inertial sensors have become commonplace for many applications and can even be found in many consumer products, for instance smart phones, cameras and game consoles. Due to the drift inherent in inertial technology, inertial sensors are typically used in combination with aiding sensors to stabilize and improve the estimates. The need for aiding sensors becomes even more apparent due to the reduced accuracy of MEMS inertial sensors.

    This thesis discusses two problems related to using inertial sensors in combination with aiding sensors. The first is the problem of sensor fusion: how to combine the information obtained from the different sensors and obtain a good estimate of position and orientation. The second problem, a prerequisite for sensor fusion, is that of calibration: the sensors themselves have to be calibrated and provide measurement in known units. Furthermore, whenever multiple sensors are combined additional calibration issues arise, since the measurements are seldom acquired in the same physical location and expressed in a common coordinate frame. Sensor fusion and calibration are discussed for the combination of inertial sensors with cameras, UWB or GPS.

    Two setups for estimating position and orientation in real-time are presented in this thesis. The first uses inertial sensors in combination with a camera; the second combines inertial sensors with UWB. Tightly coupled sensor fusion algorithms and experiments with performance evaluation are provided. Furthermore, this thesis contains ideas on using an optimization based sensor fusion method for a multi-segment inertial tracking system used for human motion capture as well as a sensor fusion method for combining inertial sensors with a dual GPS receiver.

    The above sensor fusion applications give rise to a number of calibration problems. Novel and easy-to-use calibration algorithms have been developed and tested to determine the following parameters: the magnetic field distortion when an IMU containing magnetometers is mounted close to a ferro-magnetic object, the relative position and orientation of a rigidly connected camera and IMU, as well as the clock parameters and receiver positions of an indoor UWB positioning system.

  • 9.
    Hol, Jeroen D
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan. Xsens Technologies, The Netherlands.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Ultra-Wideband Calibration for Indoor Positioning (2010). In: Proceedings of the 2010 IEEE International Conference on Ultra-Wideband, 2010, Vol. 2. Conference paper (peer reviewed)
    Abstract [en]

    The main contribution of this work is a novel calibration method to determine the clock parameters of the UWB receivers as well as their 3D positions. It exclusively uses time-of-arrival measurements, thereby removing the need for the typically labor-intensive and time-consuming process of surveying the receiver positions. Experiments show that the method is capable of accurately calibrating a UWB setup within minutes.
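    As a toy illustration of the time-of-arrival least-squares problem the abstract describes, the following sketch estimates a single node's unknown 3D position and clock offset from arrival times of pulses sent from known positions. This is a simplified, hypothetical analogue of the paper's method (which calibrates many receivers jointly); all function names, the Gauss-Newton scheme, and the normalized propagation speed are my own assumptions.

    ```python
    import numpy as np

    C = 1.0  # propagation speed, normalized units for this illustration

    def toa_residuals(theta, anchors, times):
        """Residuals of the TOA model t_k = ||p - s_k|| / c + tau."""
        p, tau = theta[:3], theta[3]
        d = np.linalg.norm(anchors - p, axis=1)
        return d / C + tau - times

    def calibrate(anchors, times, theta0, iters=20):
        """Gauss-Newton over theta = (position, clock offset)."""
        theta = np.asarray(theta0, dtype=float)
        for _ in range(iters):
            p = theta[:3]
            d = np.linalg.norm(anchors - p, axis=1)
            # Jacobian rows: d r_k / d p = (p - s_k) / (c ||p - s_k||), d r_k / d tau = 1
            J = np.hstack([(p - anchors) / (C * d[:, None]),
                           np.ones((len(times), 1))])
            r = toa_residuals(theta, anchors, times)
            theta = theta - np.linalg.lstsq(J, r, rcond=None)[0]
        return theta
    ```

    With five well-spread anchors and noise-free times this converges to the true position and offset in a handful of iterations; it only illustrates why time-of-arrival data alone can fix both geometry and clock parameters, as the abstract claims.
    
    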

  • 10.
    Hol, Jeroen Diederik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Pose Estimation and Calibration Algorithms for Vision and Inertial Sensors (2008). Licentiate thesis, monograph (Other academic)
    Abstract [en]

    This thesis deals with estimating position and orientation in real-time, using measurements from vision and inertial sensors. A system has been developed to solve this problem in unprepared environments, assuming that a map or scene model is available. Compared to ‘camera-only’ systems, the combination of the complementary sensors yields an accurate and robust system which can handle periods with uninformative or no vision data and reduces the need for high frequency vision updates.

    The system achieves real-time pose estimation by fusing vision and inertial sensors using the framework of nonlinear state estimation for which state space models have been developed. The performance of the system has been evaluated using an augmented reality application where the output from the system is used to superimpose virtual graphics on the live video stream. Furthermore, experiments have been performed where an industrial robot providing ground truth data is used to move the sensor unit. In both cases the system performed well.

    Calibration of the relative position and orientation of the camera and the inertial sensor turn out to be essential for proper operation of the system. A new and easy-to-use algorithm for estimating these has been developed using a gray-box system identification approach. Experimental results show that the algorithm works well in practice.

  • 11.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    A New Algorithm for Calibrating a Combined Camera and IMU Sensor Unit (2008). Report (Other academic)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation between an inertial measurement unit and a camera which are rigidly connected. The key is to realise that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. Furthermore, covariance expressions are provided for all involved estimates. The experimental results show that the method works well in practice.

  • 12.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    A New Algorithm for Calibrating a Combined Camera and IMU Sensor Unit (2008). In: Proceedings of the 10th International Conference on Control, Automation, Robotics and Vision, 2008, pp. 1857-1862. Conference paper (peer reviewed)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation between an inertial measurement unit and a camera which are rigidly connected. The key is to realise that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. Furthermore, covariance expressions are provided for all involved estimates. The experimental results show that the method works well in practice.

  • 13.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Modeling and Calibration of Inertial and Vision Sensors (2009). Report (Other academic)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The method is based on a physical model which can also be used in solving, for example, sensor fusion problems. The experimental results show that the method works well in practice, both for perspective and spherical cameras.

  • 14.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Modeling and Calibration of Inertial and Vision Sensors (2010). In: The International Journal of Robotics Research, ISSN 0278-3649, E-ISSN 1741-3176, Vol. 29, no. 2, pp. 231-244. Journal article (peer reviewed)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The method is based on a physical model which can also be used in solving, for example, sensor fusion problems. The experimental results show that the method works well in practice, both for perspective and spherical cameras.

  • 15.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    On Resampling Algorithms for Particle Filters (2006). In: Proceedings of the 2006 IEEE Nonlinear Statistical Signal Processing Workshop, 2006, pp. 79-82. Conference paper (peer reviewed)
    Abstract [en]

    In this paper a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms with respect to their resampling quality and computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in terms of resampling quality and computational complexity.
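    To make the abstract's conclusion concrete, here is a minimal NumPy sketch of systematic resampling, the scheme the comparison favours. This is an illustrative implementation under my own naming, not code from the paper: a single uniform draw generates n evenly spaced thresholds, which are matched against the cumulative weights.

    ```python
    import numpy as np

    def systematic_resample(weights, rng=None):
        """Systematic resampling: one uniform draw, evenly spaced thresholds.

        Returns indices into the particle array, drawn in proportion
        to the normalized weights.
        """
        rng = np.random.default_rng() if rng is None else rng
        w = np.asarray(weights, dtype=float)
        n = len(w)
        w = w / w.sum()
        # Evenly spaced positions with a single random offset in [0, 1/n).
        positions = (rng.random() + np.arange(n)) / n
        cumsum = np.cumsum(w)
        cumsum[-1] = 1.0  # guard against floating-point round-off
        return np.searchsorted(cumsum, positions)
    ```

    Because only one random number is drawn per resampling step, the cost is O(n) with a very small constant, which is one reason the scheme scores well on computational complexity in the comparison.
    
    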

  • 16.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    On Resampling Algorithms for Particle Filters (2007). Report (Other academic)
    Abstract [en]

    In this paper a comparison is made between four frequently encountered resampling algorithms for particle filters. A theoretical framework is introduced to be able to understand and explain the differences between the resampling algorithms. This facilitates a comparison of the algorithms with respect to their resampling quality and computational complexity. Using extensive Monte Carlo simulations the theoretical results are verified. It is found that systematic resampling is favourable, both in terms of resampling quality and computational complexity.

  • 17.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Relative Pose Calibration of a Spherical Camera and an IMU (2008). Report (Other academic)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a spherical camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The experimental results show that the method works well in practice.

  • 18.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Relative Pose Calibration of a Spherical Camera and an IMU (2008). In: Proceedings of the 7th IEEE and ACM International Symposium on Mixed and Augmented Reality, 2008, pp. 21-24. Conference paper (peer reviewed)
    Abstract [en]

    This paper is concerned with the problem of estimating the relative translation and orientation of an inertial measurement unit and a spherical camera, which are rigidly connected. The key is to realize that this problem is in fact an instance of a standard problem within the area of system identification, referred to as a gray-box problem. We propose a new algorithm for estimating the relative translation and orientation, which does not require any additional hardware, except a piece of paper with a checkerboard pattern on it. The experimental results show that the method works well in practice.

  • 19.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik.
    Schön, Thomas
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Institutionen för systemteknik, Reglerteknik.
    Gustafsson, Fredrik
    Linköpings universitet, Tekniska högskolan. Linköpings universitet, Institutionen för systemteknik, Reglerteknik.
    Resampling in Particle Filters (2006). In: Nonlinear Statistical Signal Processing Workshop, 2006, Cambridge, United Kingdom: IEEE, 2006. Conference paper (peer reviewed)
  • 20.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sensor Fusion for Augmented Reality (2006). In: Proceedings of the 9th International Conference on Information Fusion, 2006. Conference paper (peer reviewed)
    Abstract [en]

    In Augmented Reality (AR), the position and orientation of the camera have to be estimated with high accuracy and low latency. This nonlinear estimation problem is studied in the present paper. The proposed solution makes use of measurements from inertial sensors and computer vision. These measurements are fused using a Kalman filtering framework, incorporating a rather detailed model for the dynamics of the camera. Experiments show that the resulting filter provides good estimates of the camera motion, even during fast movements.

  • 21.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sensor Fusion for Augmented Reality (2006). In: Proceedings of Reglermöte 2006, 2006. Conference paper (Other academic)
    Abstract [en]

    In Augmented Reality (AR), the position and orientation of the camera have to be estimated with high accuracy and low latency. This nonlinear estimation problem is studied in the present paper. The proposed solution makes use of measurements from inertial sensors and computer vision. These measurements are fused using a Kalman filtering framework, incorporating a rather detailed model for the dynamics of the camera. Experiments show that the resulting filter provides good estimates of the camera motion, even during fast movements.

  • 22.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Slycke, Per
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sensor Fusion for Augmented Reality (2007). Report (Other academic)
    Abstract [en]

    In Augmented Reality (AR), the position and orientation of the camera have to be estimated with high accuracy and low latency. This nonlinear estimation problem is studied in the present paper. The proposed solution makes use of measurements from inertial sensors and computer vision. These measurements are fused using a Kalman filtering framework, incorporating a rather detailed model for the dynamics of the camera. Experiments show that the resulting filter provides good estimates of the camera motion, even during fast movements.

  • 23.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Luinge, Henk
    Xsens Technologies B.V, The Netherlands.
    Slycke, Per
    Xsens Technologies B.V, The Netherlands.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Robust Real-Time Tracking by Fusing Measurements from Inertial and Vision Sensors (2007). Report (Other academic)
    Abstract [en]

    The problem of estimating and predicting position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and vision. The sensor fusion approach described in this contribution is based on non-linear filtering of these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a multi-rate extended Kalman filter is described, using a dynamic model with 22 states, where 12.5 Hz correspondences from vision and 100 Hz inertial measurements are processed. An example where an industrial robot is used to move the sensor unit is presented. The advantage of this configuration is that it provides ground truth for the pose, allowing for objective performance evaluation. The results show that we obtain an absolute accuracy of 2 cm in position and 1° in orientation.

  • 24.
    Hol, Jeroen
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Luinge, Henk
    Xsens Technologies B.V, The Netherlands.
    Slycke, Per
    Xsens Technologies B.V, The Netherlands.
    Gustafsson, Fredrik
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Robust Real-Time Tracking by Fusing Measurements from Inertial and Vision Sensors (2007). In: Journal of Real-Time Image Processing, ISSN 1861-8200, E-ISSN 1861-8219, Vol. 2, No. 2-3, pp. 149-160. Journal article (Refereed)
    Abstract [en]

    The problem of estimating and predicting position and orientation (pose) of a camera is approached by fusing measurements from inertial sensors (accelerometers and rate gyroscopes) and vision. The sensor fusion approach described in this contribution is based on non-linear filtering of these complementary sensors. This way, accurate and robust pose estimates are available for the primary purpose of augmented reality applications, but with the secondary effect of reducing computation time and improving the performance in vision processing. A real-time implementation of a multi-rate extended Kalman filter is described, using a dynamic model with 22 states, where 12.5 Hz correspondences from vision and 100 Hz inertial measurements are processed. An example where an industrial robot is used to move the sensor unit is presented. The advantage of this configuration is that it provides ground truth for the pose, allowing for objective performance evaluation. The results show that we obtain an absolute accuracy of 2 cm in position and 1° in orientation.
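    The multi-rate structure described in the abstract above (fast inertial prediction, slower vision corrections) can be sketched in a few lines. This is a deliberately simplified one-dimensional, two-state illustration of the idea, not the paper's 22-state filter; the function name, noise values, and rates in the code are illustrative only.

    ```python
    import numpy as np

    def multirate_kf(accels, vision_pos, dt=0.01, vision_every=8):
        """Toy multi-rate Kalman filter in one dimension.

        Acceleration samples arrive every step (100 Hz for dt=0.01);
        a position fix from "vision" arrives every 8th step (12.5 Hz).
        State x = [position, velocity].
        """
        F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
        B = np.array([0.5 * dt**2, dt])         # acceleration enters as an input
        Q = np.diag([1e-8, 1e-6])               # process noise (illustrative)
        H = np.array([[1.0, 0.0]])              # vision observes position only
        R = np.array([[1e-4]])                  # vision measurement noise
        x, P = np.zeros(2), np.eye(2)
        out = []
        for k, a in enumerate(accels):
            # time update at the fast inertial rate
            x = F @ x + B * a
            P = F @ P @ F.T + Q
            # measurement update only when a (slower) vision fix is available
            if (k + 1) % vision_every == 0:
                z = vision_pos[(k + 1) // vision_every - 1]
                S = H @ P @ H.T + R
                K = P @ H.T @ np.linalg.inv(S)
                x = x + (K @ (np.atleast_1d(z) - H @ x)).ravel()
                P = (np.eye(2) - K @ H) @ P
            out.append(x.copy())
        return np.array(out)
    ```

    Between vision fixes the state is propagated open-loop from the inertial data, so pose estimates stay available at the full inertial rate with low latency; each vision fix then corrects the accumulated drift.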

  • 25.
    Hol, Jeroen D.
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Dijkstra, Fred
    Xsens Technologies B.V., Netherlands.
    Luinge, Henk
    Xsens Technologies B.V., Netherlands.
    Schön, Thomas
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Tightly Coupled UWB/IMU Pose Estimation (2009). Report (Other academic)
    Abstract [en]

    In this paper we propose a 6DOF tracking system combining Ultra-Wideband (UWB) measurements with low-cost MEMS inertial measurements. A tightly coupled system is developed which estimates the position as well as the orientation of the sensor unit while remaining reliable under multipath effects and non-line-of-sight (NLOS) conditions. The experimental results show robust and continuous tracking in a realistic indoor positioning scenario.
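    "Tightly coupled" here means that the raw range measurements to each UWB anchor are fed directly into the filter, rather than first being triangulated into a position fix. A minimal two-dimensional sketch of such a range update, assuming known anchor positions, is shown below; it is not the paper's filter (which also integrates inertial measurements and estimates orientation), and the function name and noise value are illustrative.

    ```python
    import numpy as np

    def uwb_range_update(x, P, ranges, anchors, R_range=1e-2):
        """One tightly coupled EKF measurement update: each raw UWB range
        to a known anchor is used directly, with no intermediate position
        fix computed from the ranges.

        x: 2-D position estimate, P: 2x2 covariance,
        ranges: measured distances, anchors: known anchor positions.
        """
        for z, a in zip(ranges, anchors):
            d = x - a
            r_pred = np.linalg.norm(d)            # predicted range ||x - a||
            H = (d / r_pred).reshape(1, 2)        # Jacobian of the range w.r.t. x
            S = (H @ P @ H.T).item() + R_range    # innovation variance
            K = (P @ H.T) / S                     # Kalman gain, shape (2, 1)
            x = x + (K * (z - r_pred)).ravel()
            P = (np.eye(2) - K @ H) @ P
        return x, P
    ```

    Because each range enters the filter individually, a single multipath-corrupted or NLOS-delayed range can be gated or down-weighted on its own, which is one reason tightly coupled designs degrade more gracefully than loosely coupled ones.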

  • 26.
    Karlsson, Rickard
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Törnqvist, David
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sjöberg, Johan
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hansson, Anders
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Positioning and Control of an Unmanned Aerial Vehicle (2006). In: Proceedings of the 2nd International CDIO Conference and Collaborators' Meeting, 2006. Conference paper (Refereed)
    Abstract [en]

    In the CDIO project course in Automatic Control, an autonomous Unmanned Aerial Vehicle (UAV) is constructed from an existing radio-controlled model aircraft. By adding an inertial sensor measuring acceleration and rotation, together with a Global Positioning System (GPS) sensor, the aim is to construct an accurate positioning system. This is used by an on-board computer to calculate rudder control signals to a set of DC servos in order to follow a predefined way-point trajectory. The project involves 17 students, roughly three times as many as in previous projects, and it comprises positioning, control, and hardware design. Since the project is still ongoing, some preliminary results and conclusions are presented.

  • 27.
    Karlsson, Rickard
    et al.
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Törnqvist, David
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Sjöberg, Johan
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hol, Jeroen
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Hansson, Anders
    Linköpings universitet, Institutionen för systemteknik, Reglerteknik. Linköpings universitet, Tekniska högskolan.
    Positioning and Control of an Unmanned Aerial Vehicle (2006). Report (Other academic)
    Abstract [en]

    In the CDIO project course in Automatic Control, an autonomous Unmanned Aerial Vehicle (UAV) is constructed from an existing radio-controlled model aircraft. By adding an inertial sensor measuring acceleration and rotation, together with a Global Positioning System (GPS) sensor, the aim is to construct an accurate positioning system. This is used by an on-board computer to calculate rudder control signals to a set of DC servos in order to follow a predefined way-point trajectory. The project involves 17 students, roughly three times as many as in previous projects, and it comprises positioning, control, and hardware design. Since the project is still ongoing, some preliminary results and conclusions are presented.
