Digitala Vetenskapliga Arkivet

Gaussian Process Multiclass Classification: Evaluation of Binarization Techniques and Likelihood Functions
Linnaeus University, Faculty of Technology, Department of Mathematics.
2019 (English). Independent thesis, Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis.
Abstract [en]

In binary Gaussian process classification, the prior class membership probabilities are obtained by transforming a Gaussian process to the unit interval, typically with either the logistic likelihood function or the cumulative Gaussian likelihood function. Multiclass classification problems can be handled by any binary classifier by means of so-called binarization techniques, which reduce the multiclass problem to a number of binary problems. In addition to introducing the mathematical theory and methods behind Gaussian process classification, we compare the binarization techniques one-against-all and one-against-one in the context of Gaussian process classification, and we also compare the performance of the logistic likelihood and the cumulative Gaussian likelihood. This is done by means of two experiments: one general experiment where the methods are tested on several publicly available datasets, and one more specific experiment where the methods are compared with respect to class imbalance and class overlap on several artificially generated datasets. The results indicate that there is no significant difference between the choices of binarization technique and likelihood function for typical datasets, although the one-against-one technique showed slightly more consistent performance. However, the second experiment revealed some differences in how the methods react to varying degrees of class imbalance and class overlap. Most notably, the logistic likelihood was a dominant factor, and the one-against-one technique performed better than one-against-all.
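The abstract contrasts two likelihood (link) functions and two binarization schemes. A minimal Python sketch of both ideas, for illustration only (the function names here are my own, not from the thesis):

```python
import math
from itertools import combinations

def logistic(f):
    """Logistic likelihood: squashes a latent GP value f onto (0, 1)."""
    return 1.0 / (1.0 + math.exp(-f))

def cumulative_gaussian(f):
    """Cumulative Gaussian (probit) likelihood: the standard normal CDF of f.

    Compared with the logistic, it has lighter tails, so it saturates faster.
    """
    return 0.5 * (1.0 + math.erf(f / math.sqrt(2.0)))

def one_against_all(classes):
    """One-against-all binarization: one binary problem per class
    (class k vs. all remaining classes), i.e. K problems for K classes."""
    return [(k, "rest") for k in classes]

def one_against_one(classes):
    """One-against-one binarization: one binary problem per unordered
    pair of classes, i.e. K*(K-1)/2 problems for K classes."""
    return list(combinations(classes, 2))

# Both likelihoods map the latent value 0 to probability 0.5:
print(logistic(0.0), cumulative_gaussian(0.0))

# For a 4-class problem, one-against-all needs 4 binary classifiers,
# one-against-one needs 6:
print(len(one_against_all([0, 1, 2, 3])), len(one_against_one([0, 1, 2, 3])))
```

The quadratic growth of one-against-one explains why its per-problem datasets are smaller (only two classes' points each), which is one reason the two schemes can react differently to class imbalance.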

Place, publisher, year, edition, pages
2019, p. 69
Keywords [en]
Machine learning, Gaussian process classification, Laplace's approximation, Likelihood functions, Binarization techniques, Class imbalance
National Category
Mathematics
Identifiers
URN: urn:nbn:se:lnu:diva-87952
OAI: oai:DiVA.org:lnu-87952
DiVA id: diva2:1342982
Educational program
Applied Mathematics Programme, 180 credits
Available from: 2019-08-15. Created: 2019-08-15. Last updated: 2019-08-15. Bibliographically approved.

Open Access in DiVA

fulltext (3259 kB), 3635 downloads
File information
File name: FULLTEXT01.pdf
File size: 3259 kB
Checksum: SHA-512
5d169d9b689dd82a82c4852f660f5d6f72dc65946a1ba1f3042b74461da2a043b9c921d5e3df8dc465c4cb100b04bb7f24c6093bd9eb0d15ab0481a9b44fa016
Type: fulltext. Mimetype: application/pdf.


Total: 3636 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 1553 hits