Modeling time-series with deep networks
Örebro University, School of Science and Technology. ORCID iD: 0000-0002-0579-7181
2014 (English). Doctoral thesis, comprehensive summary (Other academic)
Place, publisher, year, edition, pages
Örebro: Örebro University, 2014. 56 p.
Series
Örebro Studies in Technology, ISSN 1650-8580 ; 63
Keyword [en]
multivariate time-series, deep learning, representation learning, unsupervised
National Category
Computer and Information Science
Research subject
Information technology
Identifiers
URN: urn:nbn:se:oru:diva-39415
ISBN: 978-91-7529-054-6 (print)
OAI: oai:DiVA.org:oru-39415
DiVA: diva2:769465
Public defence
2015-02-02, Hörsalen, Musikhögskolan, Örebro universitet, Fakultetsgatan 1, Örebro, 13:15 (English)
Available from: 2014-12-08. Created: 2014-12-08. Last updated: 2017-10-17. Bibliographically approved.
List of papers
1. A review of unsupervised feature learning and deep learning for time-series modeling
2014 (English). In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 42, no. 1, pp. 11-24. Article, review/survey (Refereed). Published.
Abstract [en]

This paper gives a review of recent developments in deep learning and unsupervised feature learning for time-series problems. While these techniques have shown promise for static data, such as images in computer vision, their application to time-series data is attracting increasing attention. The paper gives an overview of the particular challenges present in time-series data and reviews works that have either applied unsupervised feature learning algorithms to time-series data or modified such algorithms to account for those challenges.

Place, publisher, year, edition, pages
Elsevier, 2014
Keyword
Time-series, Unsupervised feature learning, Deep learning
National Category
Computer Science
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-34597 (URN)
10.1016/j.patrec.2014.01.008 (DOI)
000333451300002 ()
2-s2.0-84894359867 (Scopus ID)
Available from: 2014-04-07. Created: 2014-04-07. Last updated: 2017-12-05. Bibliographically approved.
2. Sleep stage classification using unsupervised feature learning
2012 (English). In: Advances in Artificial Neural Systems, ISSN 1687-7594, E-ISSN 1687-7608, article id 107046. Article in journal (Refereed). Published.
Abstract [en]

Most attempts at training computers for the difficult and time-consuming task of sleep stage classification involve a feature extraction step. Due to the complexity of multimodal sleep data, the feature space can grow so large that a feature selection step also becomes necessary. In this paper, we propose the use of an unsupervised feature learning architecture called deep belief nets (DBNs) and show how to apply it to sleep data in order to eliminate the use of handmade features. Using a hidden Markov model (HMM) as a postprocessing step to accurately capture sleep-stage switching, we compare our results to a feature-based approach. A study of anomaly detection with application to home-environment data collection is also presented. The results using raw data with a deep architecture, such as the DBN, were comparable to the feature-based approach when validated on clinical datasets.
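The HMM postprocessing step described in the abstract amounts to decoding a most likely stage sequence from the per-epoch class posteriors produced by the feature model. A minimal sketch, assuming log-space Viterbi decoding; the stage count, posteriors, and transition matrix below are invented toy values, not the paper's:

```python
import numpy as np

def viterbi_smooth(posteriors, trans, prior):
    """Most likely stage sequence given per-epoch class posteriors
    and an HMM over stage transitions (log-space Viterbi)."""
    T, K = posteriors.shape
    log_post = np.log(posteriors + 1e-12)
    log_trans = np.log(trans + 1e-12)
    score = np.log(prior + 1e-12) + log_post[0]
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + log_trans      # (prev stage, current stage)
        back[t] = np.argmax(cand, axis=0)
        score = cand[back[t], np.arange(K)] + log_post[t]
    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(score))
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

# Toy run: sticky transitions smooth out an isolated mislabeled epoch.
post = np.array([[0.9, 0.1], [0.8, 0.2], [0.4, 0.6], [0.9, 0.1]])
trans = np.array([[0.95, 0.05], [0.05, 0.95]])
path = viterbi_smooth(post, trans, np.array([0.5, 0.5]))
assert list(path) == [0, 0, 0, 0]              # the t=2 flip is suppressed
```

The sticky diagonal of the transition matrix is what captures the slow stage-switching dynamics: a single epoch whose posterior weakly favors the other stage is overruled by the sequence-level evidence.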

Place, publisher, year, edition, pages
Hindawi Publishing Corporation, 2012
National Category
Engineering and Technology; Computer Science
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-24199 (URN)
10.1155/2012/107046 (DOI)
Available from: 2012-08-02. Created: 2012-08-02. Last updated: 2017-12-07. Bibliographically approved.
3. Fast Classification of Meat Spoilage Markers Using Nanostructured ZnO Thin Films and Unsupervised Feature Learning
2013 (English). In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220, Vol. 13, no. 2, pp. 1578-1592. Article in journal (Refereed). Published.
Abstract [en]

This paper investigates a rapid and accurate detection system for spoilage in meat. We use unsupervised feature learning techniques (stacked restricted Boltzmann machines and auto-encoders) that consider only the transient response from undoped zinc oxide, manganese-doped zinc oxide, and fluorine-doped zinc oxide in order to classify three categories: the type of thin film used, the type of gas, and the approximate ppm level of the gas. These models offer the key advantage that features are learned from data instead of being hand-designed. We compare our results to a feature-based approach using samples with various ppm levels of ethanol and trimethylamine (TMA), which are good markers for meat spoilage. Deep networks give better and faster classification than the feature-based approach, and we conclude that fine-tuning of our deep models is more efficient for this kind of multi-label classification task.
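As a rough illustration of learning features from sensor curves rather than hand-designing them, a single-layer tied-weight auto-encoder can be fitted to toy transient responses by gradient descent. This is only a sketch of the general idea, not the paper's stacked RBM/auto-encoder setup; the data shapes and hyperparameters are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "transient responses": 60 noisy decay curves of 20 samples each.
t = np.linspace(0.0, 1.0, 20)
X = np.vstack([a * np.exp(-3.0 * t) + b * t
               for a, b in rng.uniform(0.5, 1.5, size=(60, 2))])
X += 0.01 * rng.normal(size=X.shape)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(X, W):
    H = sigmoid(X @ W)                     # learned features (codes)
    return 0.5 * np.mean(np.sum((H @ W.T - X) ** 2, axis=1))

W = 0.1 * rng.normal(size=(20, 5))         # tied encoder/decoder weights
loss_init = loss(X, W)
for _ in range(300):                       # plain batch gradient descent
    H = sigmoid(X @ W)
    err = H @ W.T - X                      # reconstruction error
    grad = (X.T @ (err @ W * H * (1.0 - H)) + err.T @ H) / len(X)
    W -= 0.2 * grad
assert loss(X, W) < loss_init              # reconstruction improves
```

The five hidden codes per curve can then be fed to an ordinary classifier; no feature engineering on the raw transients is needed.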

Keyword
electronic nose, sensor material, representational learning, fast multi-label classification
National Category
Computer Science
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-34598 (URN)
10.3390/s130201578 (DOI)
000315403300012 ()
2-s2.0-84873853951 (Scopus ID)
Funder
VINNOVA, INT/SWD/VINN/P-04/2011
Note

Funding agency: Department of Science & Technology, India

Available from: 2014-04-07. Created: 2014-04-07. Last updated: 2017-12-05. Bibliographically approved.
4. Learning feature representations with a cost-relevant sparse autoencoder
2015 (English). In: International Journal of Neural Systems, ISSN 0129-0657, E-ISSN 1793-6462, Vol. 25, no. 1, article id 1450034. Article in journal (Refereed). Published.
Abstract [en]

There is increasing interest in the machine learning community in automatically learning feature representations directly from (unlabeled) data instead of using hand-designed features. The autoencoder is one method that can be used for this purpose. However, for data sets with a high degree of noise, a large part of the autoencoder's representational capacity is spent minimizing the reconstruction error for these noisy inputs. This paper proposes a method that improves feature learning by focusing on the task-relevant information in the data. This selective attention is achieved by weighting the reconstruction error, reducing the influence of noisy inputs during learning. The proposed model is trained on a number of publicly available image data sets, and the test error rate is compared to that of a standard sparse autoencoder and other methods, such as the denoising autoencoder and contractive autoencoder.
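The weighting idea in the abstract can be sketched as a per-input weight on the squared reconstruction error, so noisy inputs stop dominating the objective. The weights and data below are invented, and the paper's sparsity term is omitted:

```python
import numpy as np

def weighted_recon_loss(x, x_hat, w):
    """Squared reconstruction error with a per-input weight w, so that
    noisy or task-irrelevant input dimensions contribute less."""
    return 0.5 * np.mean(np.sum(w * (x - x_hat) ** 2, axis=1))

x = np.zeros((4, 3))
x_hat = x + np.array([0.0, 0.0, 1.0])        # reconstruction misses only dim 2
uniform = weighted_recon_loss(x, x_hat, np.ones(3))
down = weighted_recon_loss(x, x_hat, np.array([1.0, 1.0, 0.1]))
assert down < uniform                        # noisy dim no longer dominates
```

With uniform weights, the model would spend capacity reconstructing the noisy third dimension; down-weighting it frees that capacity for the task-relevant inputs.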

Keyword
Sparse autoencoder, unsupervised feature learning, weighted cost function
National Category
Other Engineering and Technologies; Computer Engineering
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-40063 (URN)
10.1142/S0129065714500348 (DOI)
000347965500005 ()
25515941 (PubMedID)
Available from: 2014-12-29. Created: 2014-12-29. Last updated: 2017-12-05. Bibliographically approved.
5. Selective attention auto-encoder for automatic sleep staging
2014 (English). In: Biomedical Signal Processing and Control, ISSN 1746-8094. Article in journal (Refereed). Submitted.
Place, publisher, year, edition, pages
Elsevier, 2014
National Category
Computer Science
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-42935 (URN)
Available from: 2015-02-25. Created: 2015-02-25. Last updated: 2017-10-17. Bibliographically approved.
6. Unsupervised feature learning for electronic nose data applied to Bacteria Identification in Blood.
2011 (English). Conference paper, Poster (with or without abstract) (Refereed)
Abstract [en]

Electronic nose (e-nose) data represents multivariate time-series from an array of chemical gas sensors exposed to a gas. This is a new data set for use with deep learning methods, and it is highly suitable since e-nose data is complex and difficult for human experts to interpret. Furthermore, this data set presents a number of interesting challenges for deep learning architectures per se. In this work we present a first study of e-nose data classification using deep learning when testing for the presence of bacteria in blood and agar solutions. We show in this study that deep learning outperforms hand-selected, strategy-based methods that have been previously tried with the same data set.

National Category
Engineering and Technology
Research subject
Computer and Systems Science
Identifiers
urn:nbn:se:oru:diva-24197 (URN)
Conference
NIPS 2011 Workshop on Deep Learning and Unsupervised Feature Learning
Available from: 2012-08-06. Created: 2012-08-02. Last updated: 2017-10-17. Bibliographically approved.
7. Not all signals are created equal: Dynamic objective auto-encoder for multivariate data
2012 (English). Conference paper, Published paper (Other academic)
National Category
Computer Science
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-42941 (URN)
Conference
NIPS Workshop on Deep Learning and Unsupervised Feature Learning, 2012
Available from: 2015-02-25. Created: 2015-02-25. Last updated: 2017-10-17. Bibliographically approved.
8. Detection of spoiled meat using an electronic nose
2014 (English). In: Sensors, ISSN 1424-8220, E-ISSN 1424-8220. Article in journal (Refereed). Submitted.
National Category
Computer Science
Research subject
Computer Science
Identifiers
urn:nbn:se:oru:diva-42942 (URN)
Available from: 2015-02-25. Created: 2015-02-25. Last updated: 2017-12-04. Bibliographically approved.

Open Access in DiVA

Introductory chapter: FULLTEXT01.pdf (2508 kB)
Cover: FULLTEXT02.pdf (1268 kB)
Spikblad: FULLTEXT03.pdf (123 kB)

Author
Längkvist, Martin
Organisation
School of Science and Technology