A stochastically motivated random initialization of pattern classifying MLPs

Neural Processing Letters, 1996

Abstract

In this contribution, a new stochastically motivated random weight initialization scheme for pattern classifying Multi-Layer Perceptrons (MLPs) is presented. Its first aim is to ensure that all training examples and all nodes have an equal opportunity to contribute to the improvement of the network during Error Back-Propagation (EBP) training. In addition, it pursues input scale invariance: if the network inputs were replaced by rescaled versions, the initialization procedure should still yield an equally well performing network. Finally, the new algorithm can initialize MLPs comprising both concentric (e.g., Gaussian) and squashing (e.g., sigmoidal) nodes. Experiments demonstrate that networks initialized with the proposed method train better than networks initialized with a standard random initialization scheme.
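The input scale invariance mentioned in the abstract can be illustrated with a minimal sketch. The code below is not the paper's actual algorithm (which is not reproduced here); it is a hypothetical example of one way to achieve the stated property, assuming that input-to-hidden weights are scaled inversely to each feature's empirical spread and that biases center the pre-activations on the data. Under these assumptions, rescaling any input feature leaves the initial pre-activations unchanged.

```python
import numpy as np

def scale_invariant_init(X, n_hidden, seed=None):
    """Hypothetical sketch of a scale-invariant weight initialization.

    Weights feeding each hidden node are divided by the per-feature
    standard deviation of the training inputs X, and biases cancel the
    per-feature mean, so the initial pre-activations depend only on the
    standardized inputs, not on their original scale.
    """
    rng = np.random.default_rng(seed)
    mu = X.mean(axis=0)              # per-feature center
    sigma = X.std(axis=0) + 1e-12    # per-feature spread (guard against /0)
    W = rng.uniform(-1.0, 1.0, size=(n_hidden, X.shape[1])) / sigma
    b = -W @ mu                      # center pre-activations on the data
    return W, b

# Rescaling the inputs yields identical initial pre-activations:
X = np.random.default_rng(1).normal(size=(50, 3))
W1, b1 = scale_invariant_init(X, 4, seed=0)
X_scaled = X * np.array([10.0, 0.5, 3.0])   # arbitrary per-feature rescaling
W2, b2 = scale_invariant_init(X_scaled, 4, seed=0)
z1 = X @ W1.T + b1
z2 = X_scaled @ W2.T + b2
print(np.allclose(z1, z2))
```

Because the standard deviation of a rescaled feature grows by the same factor as the feature itself, the division cancels the rescaling exactly, which is the invariance the abstract describes.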
