TY - GEN
T1 - Using sinusoidally-modulated noise as a surrogate for slow-wave sleep to accomplish stable unsupervised dictionary learning in a spike-based sparse coding model
AU - Watkins, Yijing
AU - Kim, Edward
AU - Sornborger, Andrew
AU - Kenyon, Garrett T.
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/6
Y1 - 2020/6
N2 - Sparse coding algorithms have been used to model the acquisition of V1 simple cell receptive fields as well as to accomplish the unsupervised acquisition of features for a variety of machine learning applications. The Locally Competitive Algorithm (LCA) provides a biologically plausible implementation of sparse coding based on lateral inhibition. LCA can be reformulated to support dictionary learning via an online local Hebbian rule that reduces predictive coding error. Although originally formulated in terms of leaky integrator rate-coded neurons, LCA based on lateral inhibition between leaky integrate-and-fire (LIF) neurons has been implemented on spiking neuromorphic processors, but such implementations preclude local online learning. We previously reported that spiking LCA can be expressed in terms of predictive coding error in a manner that allows for unsupervised dictionary learning via a local Hebbian rule, but the issue of stability has not previously been addressed. Here, we use the Nengo simulator to show that unsupervised dictionary learning in a spiking LCA model can be made stable by incorporating epochs of sinusoidally-modulated noise that we hypothesize are analogous to slow-wave sleep. In the absence of slow-wave sleep epochs, the L2 norm of individual features tends to increase over time during unsupervised dictionary learning until the corresponding neurons can be activated by random Gaussian noise. By inserting epochs of sinusoidally-modulated Gaussian noise, however, the L2 norms of any activated neurons are down-regulated such that individual neurons are no longer activated by noise. Our results suggest that slow-wave sleep may act, in part, to ensure that cortical neurons do not "hallucinate" their target features in pure noise, thus helping to maintain dynamical stability.
AB - Sparse coding algorithms have been used to model the acquisition of V1 simple cell receptive fields as well as to accomplish the unsupervised acquisition of features for a variety of machine learning applications. The Locally Competitive Algorithm (LCA) provides a biologically plausible implementation of sparse coding based on lateral inhibition. LCA can be reformulated to support dictionary learning via an online local Hebbian rule that reduces predictive coding error. Although originally formulated in terms of leaky integrator rate-coded neurons, LCA based on lateral inhibition between leaky integrate-and-fire (LIF) neurons has been implemented on spiking neuromorphic processors, but such implementations preclude local online learning. We previously reported that spiking LCA can be expressed in terms of predictive coding error in a manner that allows for unsupervised dictionary learning via a local Hebbian rule, but the issue of stability has not previously been addressed. Here, we use the Nengo simulator to show that unsupervised dictionary learning in a spiking LCA model can be made stable by incorporating epochs of sinusoidally-modulated noise that we hypothesize are analogous to slow-wave sleep. In the absence of slow-wave sleep epochs, the L2 norm of individual features tends to increase over time during unsupervised dictionary learning until the corresponding neurons can be activated by random Gaussian noise. By inserting epochs of sinusoidally-modulated Gaussian noise, however, the L2 norms of any activated neurons are down-regulated such that individual neurons are no longer activated by noise. Our results suggest that slow-wave sleep may act, in part, to ensure that cortical neurons do not "hallucinate" their target features in pure noise, thus helping to maintain dynamical stability.
UR - https://www.scopus.com/pages/publications/85090109424
U2 - 10.1109/CVPRW50498.2020.00188
DO - 10.1109/CVPRW50498.2020.00188
M3 - Conference contribution
AN - SCOPUS:85090109424
T3 - IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
SP - 1482
EP - 1487
BT - Proceedings - 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020
PB - IEEE Computer Society
T2 - 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2020
Y2 - 14 June 2020 through 19 June 2020
ER -