Incremental Learning of Deep Convolutional Neural Networks for Tumour Classification in Pathology Images
Linköping University, Department of Biomedical Engineering.
2019 (English). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Understaffing of medical doctors is becoming a pressing problem in many healthcare systems. It can be alleviated by using Computer-Aided Diagnosis (CAD) systems to substitute for doctors in certain tasks, for instance histopathological image classification. The recent surge of deep learning has allowed CAD systems to perform this task with very competitive performance. A major challenge, however, is the need to periodically update the models with new data and/or new classes or diseases. Such periodic updates lead to catastrophic forgetting, as Convolutional Neural Networks typically require the entire data set beforehand and tend to lose knowledge about old data when trained on new data. Incremental learning methods have been proposed to alleviate this problem in deep learning. In this thesis, two incremental learning methods, Learning without Forgetting (LwF) and a generative rehearsal-based method, are investigated. They are evaluated on two criteria: first, the capability of incrementally adding new classes to a pre-trained model, and second, the ability to update the current model with a new, unbalanced data set. Experiments show that LwF does not retain knowledge properly in either case. Further experiments are needed to draw any definite conclusions, for instance using another training approach for the classes and trying different combinations of losses. The generative rehearsal-based method, on the other hand, tends to work for one class, showing good potential if higher-quality images were generated. Additional experiments are also required to investigate new architectures and approaches for more stable training.
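
For readers unfamiliar with Learning without Forgetting, the sketch below illustrates one common way to implement its objective: a cross-entropy term on the new classes plus a temperature-scaled distillation term that keeps the old-class outputs close to those of a frozen copy of the pre-update network. It is a minimal illustration assuming PyTorch; the function name lwf_loss, the tensors x and y, and the parameters T and alpha are hypothetical and are not taken from the thesis.

import torch
import torch.nn.functional as F

def lwf_loss(model, old_model, x, y, T=2.0, alpha=1.0):
    """Sketch of a Learning-without-Forgetting style loss (hypothetical names).

    model     : network being updated, with new classes appended to its output layer
    old_model : frozen copy of the network from before the update
    x, y      : batch of new images and their labels (hypothetical tensors)
    T         : distillation temperature
    alpha     : weight of the distillation term
    """
    logits = model(x)                      # outputs over old + new classes
    with torch.no_grad():
        old_logits = old_model(x)          # soft targets from the old network
    n_old = old_logits.shape[1]

    # Classification loss on the new data.
    ce = F.cross_entropy(logits, y)

    # Distillation loss: keep the old-class predictions close to what the
    # pre-update network produces, which is what counters catastrophic forgetting.
    kd = F.kl_div(
        F.log_softmax(logits[:, :n_old] / T, dim=1),
        F.softmax(old_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

    return ce + alpha * kd

# A generative rehearsal method would instead mix generator-produced samples of
# the old classes into each batch and train with an ordinary classification loss.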

Place, publisher, year, edition, pages
2019, p. 90.
Keywords [en]
Deep Learning, Convolutional Neural Networks, Pathology, Incremental Learning, Catastrophic Forgetting, Generative Adversarial Networks, Auxiliary Classification Generative Adversarial Networks
National Category
Computer Sciences; Medical Image Processing
Identifiers
URN: urn:nbn:se:liu:diva-158225
ISRN: LIU-IMT-TFK-A--19/573--SE
OAI: oai:DiVA.org:liu-158225
DiVA, id: diva2:1331382
External cooperation
SAAB
Subject / course
Medical Informatics
Available from: 2019-06-28. Created: 2019-06-26. Last updated: 2019-06-28. Bibliographically approved.

Open Access in DiVA

fulltext (99186 kB), 39 downloads
File information
File name: FULLTEXT01.pdf. File size: 99186 kB. Checksum (SHA-512):
1d0337154ce36b703a15c5a2d15baf74f76843e52c4bf34b0c00a550dd476995fc7ee57c32f84ababdb2730e54a0a01d9264c5ef62a439a0e53b59018f05e5f0
Type: fulltext. Mimetype: application/pdf

Search in DiVA

By author/editor
Johansson, Philip
By organisation
Department of Biomedical Engineering
Computer Sciences; Medical Image Processing

Total: 39 downloads
The number of downloads is the sum of all downloads of full texts. It may include, e.g., previous versions that are no longer available.

Total: 269 hits