A Combined Approach for Object Recognition and Localisation for an Autonomous Racecar
KTH, School of Industrial Engineering and Management (ITM), Machine Design (Dept.).
2018 (English) Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits, Student thesis
Alternative title
En kombination av objektigenkänning och lokalisering i en autonom racingbil (Swedish)
Abstract [en]

With autonomous vehicles being a hot research topic, they have also become of interest in the world of motor sport. To run autonomously, a vehicle needs to know its current pose and what the environment looks like. This thesis aims to solve this problem using SLAM and object detection with a 2D LiDAR and a camera as sensor input, evaluating the performance in terms of accuracy and latency.

The object detection problem was repurposed as an object recognition problem by using the 2D LiDAR for cone candidate extraction; each candidate was projected onto the camera image and verified by a Convolutional Neural Network (CNN). Two different CNN architectures were used: MobileNet and a minimalistic architecture with fewer than 7 layers. The best-performing CNN, with four convolutional layers and two fully connected layers, reached 87.3% accuracy with a classification time of 4.6 ms on the demonstrator constructed.

Three different SLAM algorithms were implemented: Pose Graph Optimization, Rao-Blackwellized Particle Filter and Extended Kalman Filter (EKF). When tested on the demonstrator, the EKF solution showed the best results, with a mere 20 mm average error in vehicle position and 39 mm average error in cone position. The end-to-end timing of the EKF algorithm was also the fastest, at an average of 32 ms.

The two best-performing algorithms were combined for an evaluation, with the output of the CNN as input to the EKF. The performance was measured to an average error of 19 mm for the vehicle position and 51 mm for the cones. The latency increased only by the 4.6 ms that the CNN required for classification, to a total of 36.54 ms.
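
The abstract specifies the best-performing network only as four convolutional layers followed by two fully connected layers. A minimal Keras sketch consistent with that description is given below; the input crop size, filter widths, class count and training configuration are illustrative assumptions, not values taken from the thesis.

```python
# Minimal sketch of a small cone-verification CNN: four convolutional layers
# followed by two fully connected layers, as described in the abstract.
# Input size, filter counts and class count are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3  # assumption: e.g. not-a-cone, yellow cone, blue cone

def build_cone_cnn(input_shape=(64, 64, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(16, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),             # fully connected layer 1
        layers.Dense(NUM_CLASSES, activation="softmax"),  # fully connected layer 2
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

A network of this size is plausible for the reported 4.6 ms classification time, but the actual layer widths and input resolution used in the thesis may differ.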

Abstract [sv]

Autonomous vehicles are a current research topic and have recently also made their way into motor sport. To achieve autonomous driving, a vehicle needs to know its current position and what the surroundings look like. This thesis aims to solve that problem with SLAM and object detection using a 2D LiDAR and a camera as sensors, and to evaluate the result in terms of accuracy and latency.

The object detection problem was reformulated as an object recognition problem by using the 2D LiDAR sensor to extract cone candidates, which were then projected onto the camera image and verified by a Convolutional Neural Network (CNN). Two different CNN architectures were used: MobileNet and a minimalistic architecture with fewer than 7 layers. The best-performing CNN, with four convolutional layers and two fully connected layers, reached a total of 87.3% accuracy with a classification time of 4.6 ms on the demonstrator.

Three different SLAM algorithms were implemented: Pose Graph Optimisation, Rao-Blackwellized Particle Filter and Extended Kalman Filter (EKF). When tested on the demonstrator, the EKF gave the best result, with only 20 mm average error in vehicle position and 39 mm in cone position. The total time for the same algorithm was also the shortest, at an average of 32 ms.

The two best-performing algorithms were combined for evaluation, with the output of the CNN as input to the EKF. The performance was measured to an average error of 19 mm for the vehicle position and 51 mm for the cones. The latency increased only by the time the CNN needed to classify the cones, i.e. 4.6 ms, to a total of 36.54 ms.
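
Of the three SLAM back ends, the EKF is reported as both the most accurate and the fastest. The sketch below shows one measurement update of a textbook landmark-based EKF SLAM with range-bearing observations of cones; the state layout [x, y, heading, cone positions ...], the noise values and the known data association are illustrative assumptions, not the thesis implementation.

```python
# One range-bearing measurement update of landmark-based EKF SLAM.
# State x = [px, py, heading, m1x, m1y, m2x, m2y, ...]; noise values are assumptions.
import numpy as np

def ekf_slam_update(x, P, z, landmark_idx,
                    R=np.diag([0.05**2, np.deg2rad(2.0)**2])):
    """Correct state x and covariance P with one cone observation z = [range, bearing]."""
    j = 3 + 2 * landmark_idx                  # index of this cone's (x, y) in the state
    dx, dy = x[j] - x[0], x[j + 1] - x[1]
    q = dx**2 + dy**2
    r = np.sqrt(q)

    # Predicted measurement and innovation (bearing wrapped to [-pi, pi]).
    z_hat = np.array([r, np.arctan2(dy, dx) - x[2]])
    y = z - z_hat
    y[1] = (y[1] + np.pi) % (2 * np.pi) - np.pi

    # Sparse Jacobian of the range-bearing model w.r.t. the full state.
    H = np.zeros((2, len(x)))
    H[:, 0:3] = [[-dx / r, -dy / r,  0.0],
                 [ dy / q, -dx / q, -1.0]]
    H[:, j:j + 2] = [[ dx / r,  dy / r],
                     [-dy / q,  dx / q]]

    # Standard EKF correction step.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

The combination evaluated last in both abstracts feeds CNN-verified cone candidates into this filter. A hedged sketch of that data flow, reusing the ekf_slam_update function above, could look as follows; the calibration matrices, crop size, acceptance rule and index-based data association are assumptions for illustration.

```python
# Combined pipeline sketch: LiDAR cone candidates are projected into the image,
# verified by the CNN, and accepted candidates become range-bearing measurements
# for the EKF. Calibration, crop size and data association are assumptions.
import numpy as np

def process_frame(x, P, candidates_lidar, image, cnn, K_cam, T_cam_from_lidar, crop=32):
    for idx, (cx, cy) in enumerate(candidates_lidar):   # candidate (x, y) in the LiDAR frame
        # Project the candidate centre into the image (assumed pinhole model;
        # the scan-plane height is folded into the 4x4 extrinsic transform).
        p_cam = T_cam_from_lidar @ np.array([cx, cy, 0.0, 1.0])
        uw, vw, w = K_cam @ p_cam[:3]                   # 3x3 intrinsic matrix
        u, v = int(uw / w), int(vw / w)

        # Crop a patch around the projection; crop=32 gives 64x64 patches,
        # matching the CNN sketch above.
        if not (crop <= u < image.shape[1] - crop and crop <= v < image.shape[0] - crop):
            continue                                    # projection fell outside the image
        patch = image[v - crop:v + crop, u - crop:u + crop]

        # Let the CNN verify the candidate; class 0 is assumed to mean "not a cone".
        probs = cnn.predict(patch[np.newaxis] / 255.0, verbose=0)[0]
        if probs.argmax() == 0:
            continue

        # Accepted cone: convert to a range-bearing measurement and update the EKF.
        z = np.array([np.hypot(cx, cy), np.arctan2(cy, cx)])
        x, P = ekf_slam_update(x, P, z, landmark_idx=idx)
    return x, P
```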

Place, publisher, year, edition, pages
2018, p. 88
Series
TRITA-ITM-EX ; 2018:193
National Category
Mechanical Engineering
Identifiers
URN: urn:nbn:se:kth:diva-232368
OAI: oai:DiVA.org:kth-232368
DiVA id: diva2:1234088
External cooperation
ÅF AB
Supervisors
Examiners
Available from: 2018-07-23 Created: 2018-07-23 Last updated: 2018-07-23 Bibliographically approved

Open Access in DiVA

fulltext (8003 kB)
File information
File name: FULLTEXT01.pdf
File size: 8003 kB
Checksum SHA-512: 7f6dd4f5fa2e53ea4622d31afcba15fa937663119dbf2dadb40603e16282d84f3cca90135c8c8d423f687a9471686c2e9438b6ae51f3d777e1f3caac29f0d175
Type: fulltext
Mimetype: application/pdf

