Realization of Generalized RBF Network
Related papers
Model selection via Genetic Algorithms for RBF networks
2003
This work addresses the problem of finding the adjustable parameters of a learning algorithm using Genetic Algorithms, also known as the model selection problem. In this paper, some model selection techniques (e.g., cross-validation and bootstrap) are used as objective functions of a Genetic Algorithm. The Genetic Algorithm is modified to allow the efficient use of these objective functions by means of Occam's razor, growing, and other heuristics. Some modifications exploit intrinsic features of Genetic Algorithms, such as their ability to handle multiple and noisy objective functions. The proposed techniques are very general and may be applied to a wide range of learning algorithms.
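The core loop the abstract describes — a genetic algorithm whose fitness function is itself a model-selection estimate such as cross-validation — can be sketched as follows. The toy data, the choice of kernel width as the tuned hyperparameter, and all GA settings are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumption: stands in for a real training set).
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)

centers = np.linspace(-3, 3, 10).reshape(-1, 1)  # fixed RBF centers

def rbf_design(X, width):
    d2 = (X - centers.T) ** 2            # (n, k) squared distances
    return np.exp(-d2 / (2 * width ** 2))

def cv_error(width, folds=4):
    """k-fold cross-validation MSE, used as the GA's objective function."""
    fold_id = np.arange(len(X)) % folds
    errs = []
    for f in range(folds):
        tr, te = fold_id != f, fold_id == f
        Phi = rbf_design(X[tr], width)
        w, *_ = np.linalg.lstsq(Phi, y[tr], rcond=None)
        pred = rbf_design(X[te], width) @ w
        errs.append(np.mean((pred - y[te]) ** 2))
    return float(np.mean(errs))

# Minimal GA over the kernel width, with CV error as a noisy fitness.
pop = rng.uniform(0.05, 3.0, size=12)
for gen in range(15):
    fitness = np.array([cv_error(w) for w in pop])
    parents = pop[np.argsort(fitness)[:4]]         # truncation selection
    children = rng.choice(parents, 8) * np.exp(    # log-normal mutation
        0.2 * rng.standard_normal(8))
    pop = np.concatenate([parents, children])

best = pop[np.argmin([cv_error(w) for w in pop])]
```

Because each fitness evaluation is an estimate over random folds, the objective is noisy — which is exactly the property the paper argues GAs tolerate well.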
Initializing of an RBF network by a genetic algorithm
Neurocomputing, 1997
In this paper we use a genetic algorithm (GA) for selecting the initial seed points (prototypes, kernels) for a Radial Basis Function (RBF) classifier. The chromosome is mapped directly onto the training set and represents a subset: it contains 1 at the ith position if the ith element of the set is included, and 0 otherwise. The GA thus serves as a condensing technique that can lead to a small subset which still retains the relevant classification information. We propose to use the set corresponding to the best chromosome from the final population as the seed points of the RBF network. Simulated annealing is used to tune the parameters of the radial function without changing the kernel locations. Experimental results with the IRIS and two-spirals data sets are presented.
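The chromosome encoding above — a binary mask over the training set, evolved so that the selected points make good RBF kernels — can be sketched as below. The toy two-class data stands in for IRIS or two-spirals, the fitness function and GA operators are assumptions, and the paper's simulated-annealing width-tuning step is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy two-class data (assumption: proxy for the IRIS / two-spirals sets).
n = 60
X = np.vstack([rng.normal(-1, 0.5, (n // 2, 2)),
               rng.normal(+1, 0.5, (n // 2, 2))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

def rbf_accuracy(mask, width=0.7):
    """Accuracy of an RBF classifier whose kernels sit on the
    training points selected by the binary chromosome `mask`."""
    if mask.sum() == 0:
        return 0.0
    C, yc = X[mask], y[mask]
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * width ** 2))     # (n, k) kernel activations
    score0 = K[:, yc == 0].sum(1)          # summed evidence per class
    score1 = K[:, yc == 1].sum(1)
    pred = (score1 > score0).astype(int)
    return float((pred == y).mean())

def fitness(mask):
    # Reward accuracy, lightly penalize subset size (the condensing goal).
    return rbf_accuracy(mask) - 0.001 * mask.sum()

pop = rng.random((20, n)) < 0.2            # sparse initial chromosomes
for gen in range(30):
    fit = np.array([fitness(m) for m in pop])
    elite = pop[np.argsort(-fit)[:5]]
    # Uniform crossover between random elite parents + bit-flip mutation.
    pa = elite[rng.integers(0, 5, 15)]
    pb = elite[rng.integers(0, 5, 15)]
    cross = np.where(rng.random((15, n)) < 0.5, pa, pb)
    child = cross ^ (rng.random((15, n)) < 0.01)
    pop = np.vstack([elite, child])

best = pop[np.argmax([fitness(m) for m in pop])]
```

The size penalty in the fitness is what pushes the GA toward a small condensed subset rather than simply keeping every training point.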
New RBF neural network classifier with optimized hidden neurons number
This article presents a noticeable performance improvement of a neural classifier based on an RBF network. Using the Mahalanobis distance, this new classifier increases the recognition rate while remarkably decreasing the number of hidden-layer neurons. The result is a very general and simple RBF classifier that requires no adjustment parameter and offers an excellent ratio of performance to neuron count. A comparative study of its performance is presented and illustrated by examples on artificial and real databases.
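The key idea — replacing the Euclidean distance inside the hidden units with the Mahalanobis distance, so a single unit can cover a whole elongated class cluster — can be illustrated with a minimal one-unit-per-class sketch. The data, the per-class covariance estimation, and the argmax decision rule are illustrative assumptions, not the article's exact construction.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy two-class data with differently shaped clusters (assumption).
A = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], 100)
B = rng.multivariate_normal([3, 0], [[1.0, -0.5], [-0.5, 1.0]], 100)
X = np.vstack([A, B])
y = np.array([0] * 100 + [1] * 100)

# One hidden unit per class: a Gaussian kernel on the Mahalanobis
# distance, so each unit adapts to the shape of its class cluster.
params = []
for c in (0, 1):
    Xc = X[y == c]
    mu = Xc.mean(0)
    Sinv = np.linalg.inv(np.cov(Xc.T))   # inverse class covariance
    params.append((mu, Sinv))

def activations(x):
    """Hidden-unit outputs: exp(-0.5 * Mahalanobis distance squared)."""
    return np.array([np.exp(-0.5 * (x - mu) @ Sinv @ (x - mu))
                     for mu, Sinv in params])

pred = np.array([activations(x).argmax() for x in X])
acc = float((pred == y).mean())
```

With isotropic (Euclidean) kernels, covering each correlated cluster would take several units; the Mahalanobis metric lets one unit do it, which is the source of the neuron-count reduction the abstract claims.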
Protein Sequences Classification Using Modular RBF Neural Networks
Lecture Notes in Computer Science, 2002
A protein super-family consists of proteins which share amino acid sequence homology and which may therefore be functionally and structurally related. One of the benefits of this category grouping is that some hint of function may be deduced for individual members ...
Efficient Training of RBF Networks for Classification
World Scientific Publishing Company, 2003
Radial Basis Function networks with linear outputs are often used in regression problems because they can be substantially faster to train than Multi-layer Perceptrons. For classification problems, the use of linear outputs is less appropriate as the outputs are not guaranteed to represent probabilities. We show how RBFs with logistic and softmax outputs can be trained efficiently using the Fisher scoring algorithm. This approach can be used with any model which consists of a generalised linear output function applied to a model which is linear in its parameters. We compare this approach with standard non-linear optimisation algorithms on a number of datasets.
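For a logistic output applied to a model that is linear in its parameters, Fisher scoring reduces to iteratively reweighted least squares (IRLS) on the fixed RBF design matrix, which is what makes the training efficient. A minimal sketch follows; the toy data, the fixed centers and width, and the small ridge term for numerical stability are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D binary problem; RBF features make the model linear in its weights.
X = rng.uniform(-2, 2, (100, 1))
y = (np.abs(X[:, 0]) > 1).astype(float)   # non-linearly separable target

centers = np.linspace(-2, 2, 7).reshape(-1, 1)
Phi = np.exp(-(X - centers.T) ** 2 / (2 * 0.5 ** 2))
Phi = np.hstack([Phi, np.ones((len(X), 1))])   # bias column

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

# Fisher scoring for the logistic output = IRLS on the fixed design Phi.
w = np.zeros(Phi.shape[1])
for _ in range(15):
    p = sigmoid(Phi @ w)
    W = p * (1 - p)                             # Fisher information weights
    H = Phi.T @ (Phi * W[:, None]) + 1e-3 * np.eye(Phi.shape[1])
    g = Phi.T @ (y - p)                         # gradient of log-likelihood
    w += np.linalg.solve(H, g)                  # scoring update

acc = float(((sigmoid(Phi @ w) > 0.5) == y).mean())
```

Each iteration solves one weighted least-squares problem, so the cost per step is comparable to fitting a linear-output RBF — while the outputs are now valid probabilities.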
A Study of Applications of RBF Network
International Journal of Computer Applications, 2014
Forecasting is a method of making statements about events whose actual outcomes have not yet been observed. It seems to be an easy process but is actually not: it requires a lot of analysis of current and past outcomes in order to give timely and accurate forecasts. The Radial Basis Function (RBF) network is a machine learning method used for prediction and forecasting. It has been used in various real-time applications such as weather forecasting, load forecasting, and tourist-arrival forecasting. The paper includes a detailed survey of the RBF network in terms of its evolution and applications. It also covers combinations of RBF networks with other techniques such as fuzzy logic, neural networks, and genetic algorithms.
Chemometrics and Intelligent Laboratory Systems, 2005
In many modeling problems that are based on input-output data, information about a plethora of variables is available. In these cases, the proper selection of explanatory variables is very critical for the success of the produced model, since it eliminates noisy variables and possible correlations, reduces the size of the model and achieves more accurate predictions. Many variable selection procedures have been proposed in the literature, but most of them consider only linear models. In this work, we present a novel methodology for variable selection in nonlinear modeling, which combines the advantages of several artificial intelligence technologies. More specifically, the Radial Basis Function (RBF) neural network architecture serves as the nonlinear modeling tool, by exploiting the simplicity of its topology and the fast fuzzy means training algorithm. The proper variables are selected in two stages using a multi-objective optimization approach: in the first stage, a specially designed genetic algorithm minimizes the prediction error over a monitoring data set, while in the second stage a simulated annealing technique aims at the reduction of the number of explanatory variables. The efficiency of the proposed method is illustrated through its application to a number of benchmark problems.
Applied Soft Computing, 2012
Radial Basis Function Neural Networks (RBFNNs) have been successfully employed in several function approximation and pattern recognition problems. The use of different RBFs in RBFNNs has been reported in the literature, and here the study centres on the use of Generalized Radial Basis Function Neural Networks (GRBFNNs). An interesting property of the GRBF is that it can continuously and smoothly reproduce different RBFs by changing a real parameter. In addition, the mixed use of different RBF shapes in a single RBFNN is allowed. The Generalized Radial Basis Function (GRBF) is based on the Generalized Gaussian Distribution (GGD), which adds a shape parameter to the standard Gaussian Distribution. Moreover, this paper describes a hybrid approach, the Hybrid Algorithm (HA), which combines evolutionary and gradient-based learning methods to estimate the architecture, weights and node topology of GRBFNN classifiers. The feasibility and benefits of the approach are demonstrated by means of six gene microarray classification problems taken from bioinformatic and biomedical domains. Three filters were applied: Fast Correlation-Based Filter (FCBF), Best Incremental Ranked Subset (BIRS), and Best Agglomerative Ranked Subset (BARS); this was done in order to identify salient expression genes from among the thousands of genes in microarray data that can directly contribute to determining the class membership of each pattern. After different gene subsets were obtained, the proposed methodology was performed using the selected gene subsets as new input variables. The results confirm that the GRBFNN classifier leads to a promising improvement in accuracy.
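The GGD-based kernel the abstract refers to can be written, up to normalization, as exp(-(|r|/sigma)^tau); the shape parameter is called `tau` here for concreteness (the original symbol did not survive extraction). A minimal sketch of how one real parameter smoothly interpolates between RBF shapes:

```python
import numpy as np

def grbf(r, sigma=1.0, tau=2.0):
    """Generalized RBF based on the Generalized Gaussian Distribution.
    tau is the shape parameter: tau = 2 recovers the Gaussian kernel,
    tau = 1 a Laplacian-like kernel, and large tau a near-uniform bump.
    (Sketch: GGD normalization constants are omitted.)"""
    return np.exp(-(np.abs(r) / sigma) ** tau)

r = np.linspace(0, 3, 7)
gauss = grbf(r, tau=2.0)      # light tails
laplace = grbf(r, tau=1.0)    # heavier tails
```

Because `tau` is continuous, a learning algorithm can adapt it per node, which is what allows mixed RBF shapes inside a single network.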
Hybrid Learning of RBF Networks
Lecture Notes in Computer Science, 2002
Three different learning methods for RBF networks, and their combinations, are presented: standard gradient learning, a three-step algorithm with an unsupervised part, and an evolutionary algorithm. Their performance is compared on two benchmark problems: Two Spirals and Iris Plants. The results show that three-step learning is usually the fastest, while gradient learning achieves better precision. The combination of these two approaches gives the best results.
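The three-step scheme mentioned above — unsupervised placement of the centers, heuristic setting of the widths, then a closed-form solve for the linear output weights — can be sketched as follows. The toy data, the plain k-means variant, and the mean-intercenter-distance width heuristic are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy regression data (assumption: stands in for a benchmark set).
X = rng.uniform(-3, 3, (120, 1))
y = np.sin(X[:, 0])

# Step 1 (unsupervised): simple k-means to place the centers.
k = 8
centers = X[rng.choice(len(X), k, replace=False)].copy()
for _ in range(20):
    d2 = ((X[:, None, :] - centers[None]) ** 2).sum(-1)
    assign = d2.argmin(1)
    for j in range(k):
        if (assign == j).any():
            centers[j] = X[assign == j].mean(0)

# Step 2: a common heuristic — widths from the mean intercenter distance.
cd = np.sqrt(((centers[:, None] - centers[None]) ** 2).sum(-1))
width = cd[cd > 0].mean() / 2

# Step 3: linear output weights by least squares (the fast, closed-form part).
Phi = np.exp(-((X[:, None, :] - centers[None]) ** 2).sum(-1)
             / (2 * width ** 2))
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
mse = float(np.mean((Phi @ w - y) ** 2))
```

Only step 3 touches the targets, and it is a single linear solve — which is why the three-step method is usually the fastest of the three, while end-to-end gradient learning can squeeze out more precision by also moving the centers and widths.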