A comparison of training algorithms when training a Convolutional Neural Network for classifying road signs
KTH, School of Electrical Engineering and Computer Science (EECS).
2019 (English)
Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits
Student thesis
Alternative title
En jämförelse av träningsalgoritmer vid träning av ett Convolutional Neural Network för klassificering av vägskyltar (Swedish)
Abstract [en]

This thesis is a comparison between three different training algorithms when training a Convolutional Neural Network for classifying road signs. The algorithms that were compared were Gradient Descent, Adadelta, and Adam. For this study the German Traffic Sign Recognition Benchmark (GTSRB) was used, which is a scientifically relevant dataset containing around 50,000 annotated images. A combination of supervised and offline learning was used, and the top accuracy of each algorithm was recorded. Adam achieved the highest accuracy, followed by Adadelta and then Gradient Descent. Improvements to the neural network were implemented in the form of more convolutional layers and more feature-recognizing filters. This improved the accuracy of the CNN trained with Adam by 0.76 percentage points.
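The comparison above hinges on how the three optimizers update the network weights. As an illustrative sketch only (this is not the thesis's code; the hyperparameters are common textbook defaults, not values taken from the thesis), the three update rules can be applied to a toy objective f(w) = w², whose gradient is 2w:

```python
# Sketch of the three update rules compared in the thesis, applied to
# the toy objective f(w) = w^2 with gradient 2w. Hyperparameters are
# the usual defaults, not the thesis's settings.
import numpy as np

def grad(w):
    return 2.0 * w  # gradient of f(w) = w^2

def gradient_descent(w, steps=100, lr=0.1):
    # Plain gradient descent: fixed step in the negative gradient direction.
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adadelta(w, steps=100, rho=0.95, eps=1e-6):
    # Adadelta: per-parameter step derived from running averages of
    # squared gradients (eg) and squared updates (ex); no learning rate.
    eg = ex = 0.0
    for _ in range(steps):
        g = grad(w)
        eg = rho * eg + (1 - rho) * g * g
        dw = -np.sqrt(ex + eps) / np.sqrt(eg + eps) * g
        ex = rho * ex + (1 - rho) * dw * dw
        w += dw
    return w

def adam(w, steps=100, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first (m) and second (v) moment estimates.
    m = v = 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        mhat = m / (1 - b1 ** t)
        vhat = v / (1 - b2 ** t)
        w -= lr * mhat / (np.sqrt(vhat) + eps)
    return w

for name, fn in [("gradient descent", gradient_descent),
                 ("adadelta", adadelta),
                 ("adam", adam)]:
    print(f"{name}: w = {fn(1.0):.6f}")
```

On a real CNN the difference is that Adadelta and Adam adapt the effective step size per weight from gradient history, while plain gradient descent uses one fixed learning rate throughout, which is one plausible reason for the accuracy ordering the thesis reports.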

Abstract [sv]

Detta examensarbete är en jämförelse av tre olika träningsalgoritmer vid träning av ett Convolutional Neural Network för klassificering av vägskyltar. De algoritmer som jämfördes var Gradient Descent, Adadelta och Adam. I denna studie användes datamängden German Traffic Sign Recognition Benchmark (GTSRB), som är en vetenskapligt använd datamängd innehållande runt 50 000 kommenterade bilder. En kombination av övervakad (supervised) och offline-inlärning användes och varje algoritms toppresultat sparades. Adam uppnådde högst resultat, följt av Adadelta och sist Gradient Descent. Det neurala nätverket förbättrades med hjälp av fler convolutional-lager och fler igenkännande filter. Detta förbättrade träffsäkerheten hos nätverket som tränats med Adam med 0.76 procentenheter.

Place, publisher, year, edition, pages
2019.
Series
TRITA-EECS-EX ; 2019:316
National Category
Computer and Information Sciences
Identifiers
URN: urn:nbn:se:kth:diva-254932
OAI: oai:DiVA.org:kth-254932
DiVA, id: diva2:1336299
Subject / course
Computer Science
Available from: 2019-07-29
Created: 2019-07-09
Last updated: 2019-07-29
Bibliographically approved

Open Access in DiVA

fulltext (704 kB), 29 downloads
File information
File name: FULLTEXT01.pdf
File size: 704 kB
Checksum: SHA-512
1c629541dee32912252787a6774aff143912356980e8dc9e47c4ec6990068a234d7b69af76238a753200bb015ec9517dadad89af11965eabf7fa11eb76b9e21c
Type: fulltext
Mimetype: application/pdf

By organisation
School of Electrical Engineering and Computer Science (EECS)
Computer and Information Sciences

Search outside of DiVA

Google
Google Scholar
Total: 29 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 84 hits