A multi-sensor traffic scene dataset with omnidirectional video
2013 (English). In: 2013 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), IEEE conference proceedings, 2013, pp. 727-734. Conference paper (Refereed)
The development of vehicles that perceive their environment, in particular those using computer vision, indispensably requires large databases of sensor recordings obtained from real cars driven in realistic traffic situations. These datasets should be time stamped to enable synchronization of sensor data from different sources. Furthermore, full surround environment perception requires high frame rates of synchronized omnidirectional video data to prevent information loss at any speed.
This paper describes an experimental setup and software environment for recording such synchronized multi-sensor data streams and storing them in a new open-source format. The dataset consists of sequences recorded in various environments from a car equipped with an omnidirectional multi-camera system, height sensors, an IMU, a velocity sensor, and a GPS. The software environment for reading these datasets will be provided to the public, together with a collection of long multi-sensor and multi-camera data streams stored in the developed format.
Place, publisher, year, edition, pages
IEEE conference proceedings, 2013. pp. 727-734
Series: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, ISSN 2160-7508
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:liu:diva-93277
DOI: 10.1109/CVPRW.2013.110
ISI: 000331116100112
ISBN: 978-0-7695-4990-3
OAI: oai:DiVA.org:liu-93277
DiVA: diva2:623885
IEEE Conference on Computer Vision and Pattern Recognition: Workshop on Ground Truth, June 28, 2013, Portland, U.S.A.