Convolutive Features for Storage and Transmission
2015 (English). Report (Other academic)
A central concern for many learning algorithms and sensing systems is how to efficiently store what the algorithm or system has learned. Convolutive Non-negative Matrix Factorization (CNMF) finds parts-based convolutive representations of non-negative data, but existing convolutive extensions of NMF have not treated storage efficiency as a constraint during learning. We contribute an algorithm, Storable NMF (SNMF), that fuses ideas from (1) the parts-based learning literature and (2) the integer sequence compression literature. SNMF enjoys the merits of both techniques: it retains the good approximation properties of CNMF while also accounting for the size of the symbol set used to express the learned convolutive factors and activations. We demonstrate that SNMF yields compression ratios ranging from 10:1 up to 20:1, which translates into a corresponding bandwidth and storage saving for networked sensors.
Trick: SNMF achieves these improved compression ratios, without incurring a significant loss of accuracy, by embedding an off-the-shelf compression algorithm in the CNMF updates, so that quantization-function updates are interleaved with CNMF's update rules.
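The abstract does not give SNMF's exact update rules, so the following is only a minimal sketch of the interleaving idea it describes: a uniform quantizer (a stand-in for the unspecified off-the-shelf compressor) applied between standard Lee-Seung multiplicative NMF updates. The convolutive extension is omitted for brevity, and all function and parameter names here are hypothetical.

```python
import numpy as np

def quantize(M, n_levels=16):
    """Snap entries to a small uniform symbol grid (hypothetical stand-in
    for the off-the-shelf compressor), shrinking the alphabet that must
    be stored or transmitted."""
    hi = M.max()
    if hi == 0.0:
        return M
    step = hi / (n_levels - 1)
    return np.round(M / step) * step

def storable_nmf(V, rank=4, n_iter=50, n_levels=16, eps=1e-9):
    """Plain (non-convolutive) NMF with a quantization step interleaved
    between the multiplicative updates, illustrating the SNMF idea."""
    rng = np.random.default_rng(0)
    W = rng.random((V.shape[0], rank))
    H = rng.random((rank, V.shape[1]))
    for _ in range(n_iter):
        # standard multiplicative updates for the Frobenius objective
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
        # interleaved quantization keeps the factors on a small symbol grid
        W, H = quantize(W, n_levels), quantize(H, n_levels)
    return W, H

V = np.abs(np.random.default_rng(1).random((20, 30)))
W, H = storable_nmf(V)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative error: {err:.3f}, distinct symbols in W: {len(np.unique(W))}")
```

After the loop, every entry of W and H lies on a grid of at most `n_levels` values, so the factors can be entropy-coded with a small symbol set; the multiplicative updates keep both factors non-negative throughout.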
Place, publisher, year, edition, pages
Tübingen: Max Planck Institute for Intelligent Systems, 2015. 1 p.
Keywords: Machine Learning, Compression
National Category: Engineering and Technology
Research subject: Applied and Computational Mathematics
Identifiers
URN: urn:nbn:se:kth:diva-173394
OAI: oai:DiVA.org:kth-173394
DiVA: diva2:853025
QC 20150911. 2015-09-11. Bibliographically approved