An effective criterion for pruning reservoir's connections in Echo State Networks

2014 International Joint Conference on Neural Networks (IJCNN), 2014

Abstract

Echo State Networks (ESNs) were introduced to simplify the design and training of Recurrent Neural Networks (RNNs) by explicitly separating the recurrent part of the network, the reservoir, from the non-recurrent part. A standard practice in this context is the random initialization of the reservoir, subject to a few loose constraints. Although this results in a simple-to-solve optimization problem, it is in general suboptimal, and several additional criteria have been devised to improve its design. In this paper we provide an effective algorithm for removing redundant connections inside the reservoir during training. The algorithm is based on the correlation of the states of the nodes; hence it depends only on the input signal, is efficient to implement, and is local. By applying it, we can obtain an optimally sparse reservoir in a robust way. We evaluate the algorithm on two synthetic datasets, demonstrating its effectiveness in terms of better generalization and lower computational complexity of the resulting ESN. This behavior is also investigated for increasing levels of memory and non-linearity required by the task.
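The core idea described above (pruning reservoir connections between nodes whose states are strongly correlated under the driving input) can be sketched as follows. This is a minimal illustrative simplification, not the paper's exact criterion: the function names, the single-input reservoir update, and the fixed correlation threshold are all assumptions made for the example.

```python
import numpy as np

def run_reservoir(W, W_in, u):
    """Drive a tanh reservoir with input sequence u and collect its states.

    W    : (n, n) recurrent weight matrix
    W_in : (n,)   input weights for a scalar input
    u    : (T,)   input signal
    """
    n = W.shape[0]
    x = np.zeros(n)
    states = np.empty((len(u), n))
    for t, ut in enumerate(u):
        x = np.tanh(W @ x + W_in * ut)
        states[t] = x
    return states

def prune_by_correlation(W, states, threshold=0.95):
    """Zero out connections between node pairs whose states are highly
    correlated (hypothetical threshold rule, for illustration only)."""
    C = np.corrcoef(states.T)  # (n, n) correlation of node state trajectories
    W_pruned = W.copy()
    n = W.shape[0]
    for i in range(n):
        for j in range(n):
            if i != j and abs(C[i, j]) > threshold:
                W_pruned[i, j] = 0.0  # drop the redundant connection
    return W_pruned

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 20
    W = rng.normal(0.0, 0.1, (n, n))     # small spectral radius in expectation
    W_in = rng.normal(0.0, 0.5, n)
    u = np.sin(np.arange(200) * 0.1)     # simple synthetic driving signal
    states = run_reservoir(W, W_in, u)
    W_pruned = prune_by_correlation(W, states, threshold=0.9)
    print(np.count_nonzero(W), "->", np.count_nonzero(W_pruned), "connections")
```

Note that, as in the abstract, the criterion needs only the reservoir states produced by the input signal, so it can run alongside standard ESN training without touching the trained readout.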
