Sequence Learning in a Single Trial

While recurrent neural networks can store pattern sequences through incremental learning, there may be a trade-off between network capacity and the speed of learning. The brain may solve this problem with a two-stage system: a compact, low-capacity subsystem for rapid temporary storage of a few sequences, and a larger, high-capacity, slow-learning subsystem for long-term storage of all sequences. In this study, we evaluate the ability of sparsely connected networks to learn pattern sequences in a single exposure using very high learning rates. The key factor is the amount of recurrent inhibition in the system. Our results indicate that post-synaptic gating in the learning rule enhances the rapid-learning ability of such networks. We also suggest how rapid-learning networks of this kind could transfer their memories to long-term storage.
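To make the idea concrete, the following is a minimal sketch of one-shot sequence storage in a recurrent network, not the authors' actual model: a single high-learning-rate pass applies a Hebbian update gated by postsynaptic activity (the outer product of the next pattern with the current one), and a hard k-winners-take-all threshold stands in for the recurrent inhibition the abstract refers to. All sizes and parameters (`n`, `k`, `eta`, sequence length) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200    # neurons (illustrative size)
k = 20     # active units per sparse binary pattern
eta = 1.0  # very high learning rate: storage in a single exposure

# A short sequence of random sparse binary patterns x_0 .. x_{T-1}
T = 5
patterns = np.zeros((T, n))
for t in range(T):
    patterns[t, rng.choice(n, k, replace=False)] = 1.0

# One-shot Hebbian storage with postsynaptic gating: the update to
# W[i, j] is nonzero only where postsynaptic unit i is active in the
# *next* pattern, so inactive rows are left untouched.
W = np.zeros((n, n))
for t in range(T - 1):
    pre, post = patterns[t], patterns[t + 1]
    W += eta * np.outer(post, pre)

def step(x, W, k):
    """One recall step: recurrent drive followed by top-k thresholding,
    a crude stand-in for global recurrent inhibition."""
    h = W @ x
    y = np.zeros_like(x)
    y[np.argsort(h)[-k:]] = 1.0
    return y

# Recall: cue with the first pattern and let the dynamics replay the rest.
x = patterns[0].copy()
recalled = [x]
for _ in range(T - 1):
    x = step(x, W, k)
    recalled.append(x)

overlaps = [float(r @ p) / k for r, p in zip(recalled, patterns)]
print(overlaps)  # near 1.0 at every step when recall succeeds
```

Because the patterns are sparse (`k << n`), the signal driving the correct next pattern (~`k` per unit) dominates the crosstalk from chance overlaps between patterns, which is why a single exposure can suffice here.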
