Dynamic Texture Recognition Using Time-Causal and Time-Recursive Spatio-Temporal Receptive Fields
Jansson, Ylva. KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST), Computational Brain Science Lab. ORCID iD: 0000-0003-0011-6444
Lindeberg, Tony. KTH, School of Electrical Engineering and Computer Science (EECS), Computational Science and Technology (CST), Computational Brain Science Lab. ORCID iD: 0000-0002-9081-2170
2018 (English). In: Journal of Mathematical Imaging and Vision, ISSN 0924-9907, E-ISSN 1573-7683, p. 1-30. Article in journal (Refereed). Published.
Abstract [en]

This work presents a first evaluation of using spatio-temporal receptive fields from a recently proposed time-causal spatio-temporal scale-space framework as primitives for video analysis. We propose a new family of video descriptors based on regional statistics of spatio-temporal receptive field responses and evaluate this approach on the problem of dynamic texture recognition. Our approach generalises a previously used method, based on joint histograms of receptive field responses, from the spatial to the spatio-temporal domain and from object recognition to dynamic texture recognition. The time-recursive formulation enables computationally efficient time-causal recognition. The experimental evaluation demonstrates competitive performance compared to the state of the art. In particular, it is shown that binary versions of our dynamic texture descriptors achieve improved performance compared to a large range of similar methods that use different primitives, either hand-crafted or learned from data. Further, our qualitative and quantitative investigation into parameter choices and the use of different sets of receptive fields highlights the robustness and flexibility of our approach. Together, these results support the descriptive power of this family of time-causal spatio-temporal receptive fields, validate our approach for dynamic texture recognition and point towards the possibility of designing a range of video analysis methods based on these new time-causal spatio-temporal primitives.
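
To make the approach concrete, the following is a minimal sketch (not the authors' implementation) of a joint-histogram video descriptor built from spatio-temporal receptive field responses. Temporal smoothing is done with a cascade of first-order recursive filters to illustrate the time-causal, time-recursive aspect, and spatial smoothing with an ordinary Gaussian; the scales, the number of histogram bins and the choice of first-order derivative responses are illustrative assumptions rather than the parameter settings evaluated in the paper.

```python
# Sketch of a joint-histogram video descriptor from spatio-temporal
# receptive field responses (illustrative assumptions throughout).
import numpy as np
from scipy.ndimage import gaussian_filter

def causal_temporal_smoothing(video, mus=(1.0, 1.0, 1.0)):
    """Cascade of first-order recursive filters along the time axis.

    video: array of shape (T, H, W). Each stage only uses the current
    frame and the previous output, so the computation is time-causal
    and time-recursive.
    """
    out = video.astype(float).copy()
    for mu in mus:
        prev = out[0].copy()
        for t in range(1, out.shape[0]):
            prev = prev + (out[t] - prev) / (1.0 + mu)   # recursive update
            out[t] = prev
    return out

def receptive_field_responses(video, spatial_sigma=2.0):
    """Small set of spatio-temporal derivative responses (t, y, x)."""
    smoothed = causal_temporal_smoothing(video)
    smoothed = gaussian_filter(smoothed, sigma=(0.0, spatial_sigma, spatial_sigma))
    dt, dy, dx = np.gradient(smoothed)
    return np.stack([dt, dy, dx], axis=-1)               # (T, H, W, 3)

def joint_histogram_descriptor(responses, bins=8):
    """Quantise each response channel and form a normalised joint histogram."""
    flat = responses.reshape(-1, responses.shape[-1])
    edges = [np.linspace(flat[:, c].min(), flat[:, c].max(), bins + 1)
             for c in range(flat.shape[1])]
    hist, _ = np.histogramdd(flat, bins=edges)
    return hist.ravel() / hist.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clip = rng.random((32, 64, 64))                      # stand-in video clip
    descriptor = joint_histogram_descriptor(receptive_field_responses(clip))
    print(descriptor.shape)                              # (8**3,) == (512,)
```

In this sketch, classification of dynamic textures would then amount to comparing such normalised histograms, for example with a nearest-neighbour classifier over a suitable histogram distance.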

Place, publisher, year, edition, pages
Springer, 2018. p. 1-30
Keywords [en]
Dynamic texture, Receptive field, Spatio-temporal, Time-causal, Time-recursive, Video descriptor, Receptive field histogram, Scale space
National Category
Computer Vision and Robotics (Autonomous Systems)
Identifiers
URN: urn:nbn:se:kth:diva-231094
DOI: 10.1007/s10851-018-0826-9
ISI: 000447385200002
Scopus ID: 2-s2.0-85048764772
OAI: oai:DiVA.org:kth-231094
DiVA, id: diva2:1222478
Projects
Scale-space theory for invariant and covariant visual receptive fields
Time-causal receptive fields for computer vision and modelling of biological vision
Funder
Swedish Research Council, 2014-4083
Stiftelsen Olle Engkvist Byggmästare, 2015/465
Note

QC 20180625

Available from: 2018-06-21. Created: 2018-06-21. Last updated: 2018-11-08. Bibliographically approved.

Open Access in DiVA

fulltext (8254 kB)
File information
File name: FULLTEXT02.pdf
File size: 8254 kB
Checksum (SHA-512): cab36303b3b619712371f78477090f77cd0e683996dbef6007a0f14e99e19df82f1eb3ee74c6823380b4fcc0108b0ae8f013792580401a1e300881f72261d752
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text
Scopus
