A new scheme for training ReLU-based multi-layer feedforward neural networks
KTH, School of Computer Science and Communication (CSC).
2017 (English). Independent thesis, Advanced level (degree of Master, Two Years), 20 credits / 30 HE credits. Student thesis.
Alternative title
Ett nytt system för att träna ReLU-baserade och framkopplade neurala nätverk med flera lager (Swedish)
Abstract [en]

A new scheme for training Rectified Linear Unit (ReLU) based feedforward neural networks is examined in this thesis. The project starts with a row-by-row updating strategy designed for Single-hidden Layer Feedforward neural Networks (SLFNs). This strategy exploits the properties of ReLUs and optimizes each row of the input weight matrix individually, within a common optimization scheme. Then the Direct Updating Strategy (DUS), which has two versions, the Vector-Based Method (VBM) and the Matrix-Based Method (MBM), is proposed to optimize the input weight matrix as a whole. Finally, DUS is extended to Multi-hidden Layer Feedforward neural Networks (MLFNs). Because this extension faces an initialization dilemma for general ReLU-based MLFNs, an MLFN with a special structure is presented. Verification experiments are conducted on six benchmark multi-class classification datasets. The results confirm that the MBM algorithm for SLFNs improves the performance of neural networks compared to its competitor, the regularized extreme learning machine. For most of the datasets involved, MLFNs with the proposed special structure perform better when extra hidden layers are added.
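To make the setting concrete, the following is a minimal sketch of the regularized extreme learning machine (ELM) baseline the abstract compares against: a ReLU SLFN whose input weights are random and whose output weights are solved in closed form by ridge regression. All function and parameter names here are illustrative assumptions; the thesis's DUS/VBM/MBM schemes additionally optimize the input weight matrix, which this sketch does not do.

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit activation
    return np.maximum(x, 0.0)

def train_regularized_elm(X, Y, n_hidden=64, reg=1e-2, seed=0):
    """Illustrative regularized-ELM training for a ReLU SLFN.

    Input weights W and biases b are drawn at random and left fixed;
    only the output weights beta are fitted, via ridge regression:
        beta = (H^T H + reg * I)^-1 H^T Y
    where H is the hidden-layer activation matrix.
    """
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))  # random input weight matrix
    b = rng.standard_normal(n_hidden)                # random hidden biases
    H = relu(X @ W + b)                              # hidden activations
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def predict(X, W, b, beta):
    # Forward pass of the trained SLFN
    return relu(X @ W + b) @ beta
```

Under this baseline, only `beta` is learned; the thesis's contribution is precisely the part this sketch omits, namely updating the input weight matrix `W` (row by row, or as a whole via DUS).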


Place, publisher, year, edition, pages
2017, p. 52
Keywords [en]
ReLU, feedforward neural network, ELM
National Category
Computer Sciences
Identifiers
URN: urn:nbn:se:kth:diva-217384
OAI: oai:DiVA.org:kth-217384
DiVA, id: diva2:1156251
Available from: 2017-12-18. Created: 2017-11-10. Last updated: 2018-01-13. Bibliographically approved.

Open Access in DiVA

fulltext (1597 kB), 60 downloads
File information
File name: FULLTEXT01.pdf
File size: 1597 kB
Checksum: SHA-512
0e7d64b649ef9de31c9e9bde83fe4d8182452be020eb5fe4df50be5177c96d3f235009d8e457550608a9e1936548e25b0e2183d394ee1a33c7fc9e48a406c31e
Type: fulltext. Mimetype: application/pdf

By organisation
School of Computer Science and Communication (CSC)
Computer Sciences

Total: 60 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are now no longer available.

Total: 325 hits