Digitala Vetenskapliga Arkivet

Gaussian Process Multiclass Classification: Evaluation of Binarization Techniques and Likelihood Functions
Linnaeus University, Faculty of Technology (FTK), Department of Mathematics (MA).
2019 (English). Independent thesis, basic level (Bachelor's degree), 10 credits / 15 HE credits. Student thesis (degree project).
Abstract [en]

In binary Gaussian process classification, the prior class membership probabilities are obtained by transforming a Gaussian process to the unit interval, typically with either the logistic likelihood function or the cumulative Gaussian likelihood function. Multiclass classification problems can be handled by any binary classifier by means of so-called binarization techniques, which reduce the multiclass problem to a number of binary problems. In addition to introducing the mathematical theory and methods behind Gaussian process classification, we compare the binarization techniques one-against-all and one-against-one in the context of Gaussian process classification, and we also compare the performance of the logistic likelihood and the cumulative Gaussian likelihood. This is done by means of two experiments: one general experiment where the methods are tested on several publicly available datasets, and one more specific experiment where the methods are compared with respect to class imbalance and class overlap on several artificially generated datasets. The results indicate that there is no significant difference between the choices of binarization technique and likelihood function for typical datasets, although the one-against-one technique showed slightly more consistent performance. However, the second experiment revealed some differences in how the methods react to varying degrees of class imbalance and class overlap. Most notably, the logistic likelihood was a dominant factor and the one-against-one technique performed better than one-against-all.
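The two binarization techniques compared in the abstract are label reductions that any binary classifier can consume. The following minimal Python sketch (not from the thesis itself) builds the one-against-all and one-against-one binary subproblems for a toy label set, alongside the two likelihood (link) functions the thesis compares; all function names here are illustrative.

```python
import math
from itertools import combinations

def logistic(z):
    # Logistic likelihood: squashes a latent function value to (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def probit(z):
    # Cumulative Gaussian likelihood: the standard normal CDF.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def one_against_all(labels):
    # One binary problem per class: class k (+1) vs. all other classes (-1).
    classes = sorted(set(labels))
    return {k: [1 if y == k else -1 for y in labels] for k in classes}

def one_against_one(labels):
    # One binary problem per unordered class pair (i, j),
    # trained only on the examples belonging to class i or class j.
    classes = sorted(set(labels))
    problems = {}
    for i, j in combinations(classes, 2):
        idx = [n for n, y in enumerate(labels) if y in (i, j)]
        problems[(i, j)] = (idx, [1 if labels[n] == i else -1 for n in idx])
    return problems

y = [0, 1, 2, 1, 0, 2]       # toy multiclass labels with K = 3 classes
ova = one_against_all(y)     # K = 3 binary problems
ovo = one_against_one(y)     # K(K-1)/2 = 3 binary problems
print(len(ova), len(ovo))                               # 3 3
print(round(logistic(0.0), 2), round(probit(0.0), 2))   # 0.5 0.5
```

In a full Gaussian process classifier, each binary subproblem gets its own latent GP whose posterior (e.g. via Laplace's approximation) is passed through one of the two link functions above; both links map a latent value of 0 to probability 0.5, differing mainly in tail heaviness.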

Place, publisher, year, edition, pages
2019, p. 69
Keywords [en]
Machine learning, Gaussian process classification, Laplace's approximation, Likelihood functions, Binarization techniques, Class imbalance
National subject category
Mathematics
Identifiers
URN: urn:nbn:se:lnu:diva-87952
OAI: oai:DiVA.org:lnu-87952
DiVA, id: diva2:1342982
Degree programme
Matematikerprogrammet (Mathematics Programme), 180 credits
Available from: 2019-08-15 Created: 2019-08-15 Last updated: 2019-08-15 Bibliographically approved

Open Access in DiVA

fulltext (3259 kB), 3642 downloads
File information
File name: FULLTEXT01.pdf
File size: 3259 kB
Checksum (SHA-512):
5d169d9b689dd82a82c4852f660f5d6f72dc65946a1ba1f3042b74461da2a043b9c921d5e3df8dc465c4cb100b04bb7f24c6093bd9eb0d15ab0481a9b44fa016
Type: fulltext
MIME type: application/pdf

By organisation
Department of Mathematics (MA)
Mathematics

Total: 3643 downloads
The number of downloads is the sum of downloads for all full texts. It may include, for example, earlier versions that are no longer available.
