Clustering with multilayer perceptrons and self-organized (Hebbian) learning

Performance Analysis of Self-Organizing Neural Network-Based Clustering

Data mining is, broadly, the process of examining data from different perspectives and summarizing it into valuable information. A number of data mining software packages exist for analysing data; they allow users to examine the data from various angles, categorize it, and summarize the relationships identified.

Decorrelated Hebbian Learning for Clustering and Function Approximation

Neural Computation, 1995

This paper presents a new learning paradigm that combines Hebbian and anti-Hebbian learning. A layer of radial basis functions is adapted in an unsupervised fashion by minimizing a two-element cost function. The first element maximizes the output of each Gaussian neuron and can be seen as an implementation of the traditional Hebbian learning law. The second element reinforces competitive learning by penalizing the correlation between the nodes. Consequently, the second term has an "anti-Hebbian" effect that is learned by the Gaussian neurons without implementing lateral inhibition synapses. Decorrelated Hebbian learning (DHL) therefore performs clustering in the input space while avoiding the "nonbiological" winner-take-all rule. In addition to the standard clustering problem, this paper also presents an application of DHL to function approximation, in which a scaled piecewise-linear approximation of a function is obtained in the supervised fashion…
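As a rough illustration of the cost-function idea described above, the sketch below combines a Hebbian pull of Gaussian centers toward the inputs that activate them with an anti-Hebbian repulsion between correlated units. The update rule, the step sizes `eta` and `lam`, and the particular correlation penalty are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

def dhl_step(X, centers, sigma=1.0, eta=0.05, lam=0.5):
    """One decorrelated-Hebbian update: pull each Gaussian unit toward
    inputs that activate it (Hebbian term) while pushing correlated
    units apart (anti-Hebbian term). Illustrative, not the paper's rule."""
    # Activations of each Gaussian unit for each input (n_samples x n_units)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    y = np.exp(-d2 / (2 * sigma ** 2))

    # Hebbian term: move centers toward the activation-weighted input mean
    heb = (y.T @ X) / (y.sum(0)[:, None] + 1e-9) - centers

    # Anti-Hebbian term: repel each center from the centers of units
    # whose outputs correlate with it (no lateral inhibition synapses)
    C = np.corrcoef(y.T)          # unit-output correlation matrix
    np.fill_diagonal(C, 0.0)
    rep = centers - (C @ centers) / (np.abs(C).sum(1)[:, None] + 1e-9)

    return centers + eta * (heb + lam * rep)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.3, (100, 2)) for m in ([0, 0], [3, 3], [0, 3])])
centers = rng.normal(1.5, 1.0, (3, 2))
for _ in range(200):
    centers = dhl_step(X, centers)
print(np.round(centers, 2))   # centers should settle near the three blobs
```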

Adaptation of Multilayer Perceptron Neural Network to unsupervised Clustering using a developed version of k-means algorithm

2015

Cluster analysis plays a very important role in many fields and is mandatory in others, owing to the huge amount of web services, products and information created and provided on the internet, and to the need to represent, visualize and reduce large vectors. To facilitate the processing of information and reduce the search space, data must be classified; in other words, the need for good clustering techniques is continually growing. Many clustering algorithms (supervised and unsupervised) exist in the literature: hierarchical and non-hierarchical clustering methods, k-means, artificial neural networks (ANNs), and so on. All of these methods suffer from drawbacks related to initialization, supervision or running time. For instance, the number of classes, the initial code vectors and the choice of the learning set seriously affect the clustering results of k-means and the Multi-Layer Perceptron (MLP). To deal...
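The initialization sensitivity mentioned above is easy to reproduce. The sketch below runs scikit-learn's k-means on the same well-separated data with a single random initialization per seed; the data and parameters are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

# Three well-separated blobs; only the initial code vectors change.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.4, (150, 2)) for m in ([0, 0], [5, 0], [2.5, 4])])

# With a single random initialization, the final inertia (within-cluster
# sum of squares) varies from seed to seed -- the drawback noted above.
for seed in range(5):
    km = KMeans(n_clusters=3, init="random", n_init=1, random_state=seed).fit(X)
    print(f"seed={seed}  inertia={km.inertia_:.1f}")
```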

Clustering-based algorithms for single-hidden-layer sigmoid perceptron

IEEE Transactions on Neural Networks, 2003

Gradient-descent-type supervised learning is the most commonly used algorithm for designing the standard sigmoid perceptron (SP). However, it is computationally expensive (slow) and suffers from the local-minima problem. Moody and Darken proposed an input-clustering-based hierarchical algorithm for fast learning in networks of locally tuned neurons in the context of radial basis function networks. In this paper, we propose and analyze input-clustering (IC) and input-output-clustering (IOC) based algorithms for fast learning in networks of globally tuned neurons in the context of the SP. It is shown that "localizing" the input-layer weights of the SP via IC and IOC minimizes an upper bound on the SP output error. The proposed algorithms could also be used to initialize the SP weights for conventional gradient-descent learning. Simulation results show that SPs designed by IC and IOC yield performance comparable to their radial basis function network counterparts.
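A hedged sketch of the input-clustering (IC) idea: derive the hidden-layer weights of a single-hidden-layer sigmoid perceptron from k-means centers, so that each globally tuned unit is anchored to one input cluster before gradient descent begins. The center-to-weight mapping and the steepness `beta` are plausible choices for illustration, not necessarily the paper's.

```python
import numpy as np
from sklearn.cluster import KMeans

def ic_init(X, n_hidden, beta=2.0):
    """Initialize the hidden layer of a sigmoid perceptron from k-means
    input clusters ("input clustering"), so each sigmoid unit's active
    region is centered on one cluster. `beta` is an illustrative steepness."""
    centers = KMeans(n_clusters=n_hidden, n_init=10).fit(X).cluster_centers_
    W = beta * centers                         # one weight vector per cluster center
    b = -beta * (centers * centers).sum(1)     # sigmoid(W @ c + b) = 0.5 at its own center
    return W, b

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 4))
W, b = ic_init(X, n_hidden=8)
hidden = 1.0 / (1.0 + np.exp(-(X @ W.T + b)))  # initialized hidden activations
print(hidden.shape)                             # (500, 8); ready for gradient descent
```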

Self-organizing neurons: toward brain-inspired unsupervised learning

2019 International Joint Conference on Neural Networks (IJCNN), 2019

In recent years, Deep Neural Networks have reached the highest performance in image classification. Nevertheless, this success is mostly based on supervised, off-line learning: such networks require huge labeled datasets for training, and once trained, they cannot adapt to changes in the data from the environment. In the context of brain-inspired computing, we apply Kohonen-based Self-Organizing Maps for unsupervised learning without labels, and we explore original extensions such as the Dynamic SOM, which enables continuous learning, and the Pruning Cellular SOM, which includes synaptic pruning in neuromorphic circuits. After presenting the three models and the experimental setup for MNIST classification, we compare different methods for automatic labeling based on very little labeled data (1% of the training set), and then compare the performance of the three Kohonen-based Self-Organizing Maps with STDP-based Spiking Neural Networks in terms of accuracy, dynamicity and scalability. Index Terms: brain-inspired computing, self-organizing maps, unsupervised learning, embedded image classification.
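The automatic-labeling step compared in the paper can be sketched simply: map the few labeled samples onto the trained map and give each neuron the majority label among the samples it wins. A minimal version, assuming a trained SOM weight matrix `W` with one row per neuron (the names and details here are illustrative):

```python
import numpy as np

def label_neurons(W, X_lab, y_lab, n_classes):
    """Assign each SOM neuron the majority class among the labeled
    samples for which it is the best-matching unit (BMU). Neurons
    that win no labeled sample get -1 (unlabeled). Illustrative sketch."""
    votes = np.zeros((W.shape[0], n_classes))
    bmu = np.argmin(((X_lab[:, None, :] - W[None, :, :]) ** 2).sum(-1), axis=1)
    for unit, cls in zip(bmu, y_lab):
        votes[unit, cls] += 1
    labels = votes.argmax(1)
    labels[votes.sum(1) == 0] = -1
    return labels

def classify(W, labels, X):
    """Classify new samples by the label of their best-matching unit."""
    bmu = np.argmin(((X[:, None, :] - W[None, :, :]) ** 2).sum(-1), axis=1)
    return labels[bmu]
```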

An incremental self-organizing neural network based on enhanced competitive Hebbian learning

The 2013 International Joint Conference on Neural Networks (IJCNN), 2013

Self-organizing neural networks are important tools for unsupervised learning. A task that remains difficult is incremental, efficient and robust learning in noisy environments, for which most existing techniques perform poorly. In this paper, we first propose a new topology-generating method called enhanced competitive Hebbian learning (enhanced CHL), and then propose a novel incremental self-organizing neural network based on it, called enhanced incremental growing neural gas (Hi-GNG). The experiments presented in this paper show that the Hi-GNG algorithm can automatically and efficiently generate a topological structure with a suitable number of neurons, and that the proposed algorithm is robust to noisy data.
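For context, basic competitive Hebbian learning, which the enhanced CHL extends, builds a topology by connecting the two units closest to each input and aging away stale edges. A minimal sketch with fixed units follows; the `max_age` threshold is an illustrative parameter, and the paper's enhanced rules go beyond this.

```python
import numpy as np

def chl_topology(X, W, max_age=50):
    """Competitive Hebbian learning in its basic form: for each input,
    connect the best and second-best matching units; age the winner's
    other edges and drop edges older than `max_age`.
    (The paper's enhanced CHL adds further rules on top of this.)"""
    age = {}                                   # edge (i, j) -> age
    for x in X:
        d = ((W - x) ** 2).sum(1)
        s1, s2 = np.argsort(d)[:2]             # best and second-best units
        for e in [k for k in age if s1 in k]:
            age[e] += 1                        # age the winner's other edges
        age[tuple(sorted((s1, s2)))] = 0       # create/refresh the winning edge
        age = {e: a for e, a in age.items() if a <= max_age}
    return set(age)

rng = np.random.default_rng(0)
W = rng.uniform(0, 1, (10, 2))                 # fixed units for the sketch
edges = chl_topology(rng.uniform(0, 1, (500, 2)), W)
print(sorted(edges))
```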

Data Clustering Using a Reorganizing Neural Network

Cybernetics and Systems, 2006

A new approach, designed for clustering arbitrarily distributed patterns, is presented. This study is concerned with the use of a self-organizing neural network as a frame for data clustering. The nearest network nodes in feature space are treated as prototypes assigned to the corresponding cluster. Rules for dead-node shifting and for simple adjustment of the coordinates of the active nodes are introduced. The performance of the proposed self-organizing neural network is examined on benchmark synthetic and real-world problems.
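One plausible reading of the dead-node rule: after a competitive-learning epoch, relocate any node that won no patterns next to the busiest node, so it can take over part of that cluster. The sketch below is a hedged interpretation; the paper's precise shifting rule may differ.

```python
import numpy as np

def shift_dead_nodes(W, wins, rng, jitter=0.05):
    """Relocate nodes that won no patterns ("dead" nodes) near the most
    active node, plus small noise so they can specialize. Illustrative;
    the paper's exact shifting rule may differ."""
    dead = np.flatnonzero(wins == 0)
    busiest = wins.argmax()
    W[dead] = W[busiest] + rng.normal(0, jitter, (dead.size, W.shape[1]))
    return W

rng = np.random.default_rng(0)
X = rng.normal(0, 1, (200, 2))
W = rng.uniform(5, 6, (6, 2))            # nodes start far from the data
for _ in range(20):
    wins = np.zeros(len(W), dtype=int)
    for x in X:                           # one competitive-learning epoch
        i = ((W - x) ** 2).sum(1).argmin()
        W[i] += 0.1 * (x - W[i])
        wins[i] += 1
    W = shift_dead_nodes(W, wins, rng)    # revive the losers each epoch
```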

The Self Organizing Map as a Tool for Cluster Analysis

Menemui Matematik (Discovering Mathematics), 2016

The self-organizing map is among the most widely accepted algorithms in unsupervised learning for cluster analysis. It is an important tool for mapping high-dimensional data sets onto a low-dimensional discrete lattice of neurons, a feature used for clustering and classifying data. Clustering is the process of grouping data elements into classes or clusters so that items in each class or cluster are as similar to each other as possible. In this paper, we present an overview of the self-organizing map, its architecture, its applications and its training algorithm. Computer simulations are analyzed on sample data for clustering problems.
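The training algorithm surveyed here is compact enough to state directly. Below is a minimal on-line Kohonen update on a 2-D lattice; the exponential learning-rate and radius schedules are common illustrative choices rather than the paper's specific settings.

```python
import numpy as np

def train_som(X, rows=10, cols=10, epochs=20, lr0=0.5, r0=5.0, seed=0):
    """Classic on-line SOM: find the best-matching unit (BMU) for each
    input and pull it and its lattice neighbors toward the input, with
    learning rate and neighborhood radius decaying over time."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(X.min(0), X.max(0), (rows * cols, X.shape[1]))
    grid = np.array([(i, j) for i in range(rows) for j in range(cols)], float)
    T = epochs * len(X)
    t = 0
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            lr = lr0 * np.exp(-t / T)               # decaying learning rate
            r = r0 * np.exp(-t / T)                 # shrinking neighborhood radius
            bmu = ((W - x) ** 2).sum(1).argmin()
            dist2 = ((grid - grid[bmu]) ** 2).sum(1)
            h = np.exp(-dist2 / (2 * r * r))        # Gaussian neighborhood on the lattice
            W += lr * h[:, None] * (x - W)
            t += 1
    return W

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, (100, 2)) for m in ([0, 0], [2, 2])])
W = train_som(X, rows=8, cols=8, epochs=10)   # trained codebook, one row per neuron
```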

Application of SOM neural network in clustering

Journal of Biomedical Science and Engineering, 2009

The Self-Organizing Map (SOM) is an unsupervised neural network algorithm that projects high-dimensional data onto a two-dimensional map. The projection preserves the topology of the data, so similar data items are mapped to nearby locations on the map. One application of the SOM neural network is clustering animals according to their features. In this paper we present an experiment analyzing the SOM's ability to cluster different species of animals.

A Novel Hybrid Neural Network for Data Clustering

2007

Clustering plays an indispensable role in data analysis, and many clustering algorithms have been developed. However, most of them either perform poorly at unsupervised learning or lack mechanisms to exploit prior knowledge about the data (semi-supervised learning) to improve clustering results. To achieve semi-supervised clustering and better unsupervised clustering performance, we develop a hybrid neural network model (HNN): the sequential combination of a Multi-Layer Perceptron (MLP) and Adaptive Resonance Theory-2 (ART2). It inherits the two distinct advantages of stability and plasticity from ART2. Meanwhile, by combining the merits of the MLP, it not only improves unsupervised clustering performance, but also supports semi-supervised clustering when partial knowledge about the data is available. Experimental results show that our model can be used for both unsupervised and semi-supervised clustering with promising performance.
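The sequential MLP-ART2 pipeline can be pictured as follows: the MLP maps inputs into a feature space (trainable from any available partial labels), and an ART-style module clusters those features with a vigilance test that commits new categories on the fly (plasticity) while updating existing prototypes only gradually (stability). The sketch below uses a simplified ART-like vigilance rule on real-valued vectors in place of full ART2 dynamics; the `vigilance` threshold and the stand-in features are assumptions made for illustration.

```python
import numpy as np

def art_like_cluster(F, vigilance=0.9, lr=0.5):
    """Simplified ART-style clustering: assign a feature vector to the
    best-matching prototype if their cosine similarity passes the
    vigilance test, else create a new category (plasticity) while
    existing prototypes change only slowly (stability).
    A toy stand-in for ART2, not its full dynamics."""
    protos, labels = [], []
    for f in F:
        f = f / (np.linalg.norm(f) + 1e-9)
        if protos:
            sims = np.array([p @ f for p in protos])
            j = sims.argmax()
            if sims[j] >= vigilance:               # resonance: update winner slowly
                protos[j] = (1 - lr) * protos[j] + lr * f
                protos[j] /= np.linalg.norm(protos[j]) + 1e-9
                labels.append(j)
                continue
        protos.append(f)                           # mismatch: commit a new category
        labels.append(len(protos) - 1)
    return np.array(labels), np.array(protos)

# In the hybrid model the features F would come from the MLP's hidden
# layer; here random blobs stand in for them.
rng = np.random.default_rng(0)
F = np.vstack([rng.normal(m, 0.2, (50, 3)) for m in ([1, 0, 0], [0, 1, 0])])
labels, protos = art_like_cluster(F, vigilance=0.8)
print(len(protos), "categories found")
```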