Neurogenesis-inspired dictionary learning: Online model adaption in a changing world

Sahil Garg, Irina Rish, Guillermo Cecchi, Aurelie Lozano

Research output: Contribution to conference › Paper › peer-review

3 Scopus citations

Abstract

We address the problem of online model adaptation when learning representations from non-stationary data streams. For now, we focus on single-hidden-layer sparse linear autoencoders (i.e., sparse dictionary learning), although the proposed approach extends naturally to general multi-layer autoencoders and supervised models. We propose a simple but effective online model-selection approach based on an alternating-minimization scheme, which involves the "birth" (addition of new elements) and "death" (removal, via l1/l2 group sparsity) of hidden units representing dictionary elements, in response to changing inputs; we draw inspiration from the adult neurogenesis phenomenon in the dentate gyrus of the hippocampus, which is known to be associated with better adaptation to new environments. Empirical evaluation on both real-life and synthetic data demonstrates that the proposed approach can considerably outperform the state-of-the-art non-adaptive online sparse coding of Mairal et al. (2009) in the presence of non-stationary data, especially when the dictionaries are sparse.
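The abstract describes the birth/death mechanism only at a high level. As a rough illustration, below is a minimal NumPy sketch of one way such an online scheme could look; it is not the authors' algorithm. All names and thresholds (ista_sparse_code, birth_err, death_tol, the decayed usage statistic) are illustrative assumptions, and the "death" step here approximates the paper's l1/l2 group-sparsity regularization by pruning atoms whose accumulated code energy decays to zero.

```python
import numpy as np

def ista_sparse_code(D, x, lam=0.1, n_iter=50):
    """Sparse-code x over dictionary D via ISTA (proximal gradient on the lasso)."""
    a = np.zeros(D.shape[1])
    L = np.linalg.norm(D, 2) ** 2 + 1e-8          # Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = D.T @ (D @ a - x)
        z = a - grad / L
        a = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-threshold
    return a

def neurogenic_dictionary_learning(stream, n_atoms=20, lam=0.1, eta=0.05,
                                   birth_err=0.5, death_tol=1e-3, max_atoms=100,
                                   rng=None):
    """Online dictionary learning with 'birth' and 'death' of atoms (a sketch).

    Atoms are added when the current sample is poorly reconstructed, and atoms
    whose decayed code energy (a proxy for the l1/l2 group-sparsity penalty)
    stays near zero are pruned.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    dim = stream[0].shape[0]
    D = rng.standard_normal((dim, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    usage = np.zeros(n_atoms)                     # running code energy per atom
    for t, x in enumerate(stream, start=1):
        a = ista_sparse_code(D, x, lam)
        r = x - D @ a                             # reconstruction residual
        # Dictionary update: one projected gradient step (cf. Mairal et al. 2009).
        D += eta * np.outer(r, a)
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-8)
        usage = 0.95 * usage + 0.05 * a ** 2      # exponentially decayed group energy
        # Birth: a large residual on new data seeds a new atom from the residual.
        if np.linalg.norm(r) > birth_err * np.linalg.norm(x) and D.shape[1] < max_atoms:
            D = np.column_stack([D, r / (np.linalg.norm(r) + 1e-8)])
            usage = np.append(usage, usage.mean())
        # Death: after a warm-up, prune atoms whose group energy has decayed away.
        keep = usage > death_tol
        if t > 20 and keep.sum() >= 1 and not keep.all():
            D, usage = D[:, keep], usage[keep]
    return D

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # A non-stationary stream: the data distribution switches halfway through.
    old = [rng.standard_normal(16) for _ in range(300)]
    new = [np.concatenate([np.zeros(8), rng.standard_normal(8)]) for _ in range(300)]
    D = neurogenic_dictionary_learning(old + new)
    print(D.shape)  # the atom count adapts to the regime change
```

Pruning by decayed usage is only a post-hoc stand-in for the group-sparse penalty the abstract mentions; in the paper, the removal of dictionary elements falls out of the alternating-minimization objective itself rather than a separate thresholding pass.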

Original language: English
State: Published - 2017
Externally published: Yes
Event: 5th International Conference on Learning Representations, ICLR 2017 - Toulon, France
Duration: 24 Apr 2017 – 26 Apr 2017

Conference

Conference: 5th International Conference on Learning Representations, ICLR 2017
Country/Territory: France
City: Toulon
Period: 24/04/17 – 26/04/17

