Catalina Cocianu - Academia.edu

Papers by Catalina Cocianu

Evolutionary Image Registration: A Review

Sensors, Jan 14, 2023

This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.

Neural Architectures for Correlated Noise Removal in Image Processing

The paper proposes a new method that combines decorrelation and shrinkage techniques with neural network-based approaches for noise removal. The images are represented as sequences of equal-sized blocks, each block being distorted by stationary, statistically correlated noise. A significant amount of the induced noise in the blocks is removed in a preprocessing step, using a decorrelation method combined with a standard shrinkage-based technique. The preprocessing step provides, for each initial image, a sequence of blocks that are further compressed at a certain rate, each component of the resulting sequence being supplied as input to a feed-forward neural architecture F_X → F_H → F_Y. The local memories of the neurons of the layers F_H and F_Y are generated through a supervised learning process based on the compressed versions of blocks of the same index value supplied as inputs and the compressed versions resulting as the mean of their preprocessed versions. Finally, using the standard decompression technique, the sequence of decompressed blocks is the cleaned representation of the initial image. The performance of the proposed method is evaluated by a long series of tests, the results being very encouraging compared to similar developments for noise removal.
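
Below is a minimal, self-contained sketch of the preprocessing stage described above (block extraction, decorrelation, soft shrinkage), not the paper's full F_X → F_H → F_Y architecture; the block size, threshold value and soft-shrinkage rule are illustrative assumptions.

```python
# A sketch of block-wise decorrelation followed by soft shrinkage, assuming an
# 8x8 block size and a fixed threshold; these settings are illustrative only.
import numpy as np

def to_blocks(img, block_size=8):
    h, w = img.shape
    h, w = h - h % block_size, w - w % block_size
    return (img[:h, :w]
            .reshape(h // block_size, block_size, w // block_size, block_size)
            .transpose(0, 2, 1, 3)
            .reshape(-1, block_size * block_size))

def decorrelate_and_shrink(blocks, threshold):
    mean = blocks.mean(axis=0)
    centered = blocks - mean
    # Decorrelation: project the blocks onto the eigenvectors of their covariance.
    cov = np.cov(centered, rowvar=False)
    _, eigvecs = np.linalg.eigh(cov)
    coeffs = centered @ eigvecs
    # Soft shrinkage of the decorrelated coefficients attenuates the noise.
    shrunk = np.sign(coeffs) * np.maximum(np.abs(coeffs) - threshold, 0.0)
    return shrunk @ eigvecs.T + mean

rng = np.random.default_rng(0)
clean = rng.random((64, 64))
noisy = clean + rng.normal(scale=0.1, size=clean.shape)
denoised_blocks = decorrelate_and_shrink(to_blocks(noisy), threshold=0.1)
print(denoised_blocks.shape)  # (64, 64): 64 blocks of 64 pixels each
```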

An unsupervised skeleton based method to discover the structure of the class system

Research Challenges in Information Science, 2008

The aim of the research reported in the paper was twofold: to propose a new approach in cluster analysis and to investigate its performance when it is combined with dimensionality reduction schemes. The search process for the optimal clusters approximating the unknown classes towards getting homogeneous groups, where the homogeneity is defined in terms of the 'typicality' of components with

Fuzzy modeling and bayesian inference network

Data mining is an evolving and growing area of research and involves interdisciplinary research and development encompassing diverse domains. In this age of multimedia data exploration, data mining should no longer be restricted to the mining of knowledge from large volumes of high-dimensional data sets in traditional databases only. The aim of the paper is to present guidelines in fuzzy

Decorrelation Techniques in Image Restoration

Toward a Semi-supervised Approach in Classification based on Principal Directions

Denoising Techniques Based on the Multiresolution Representation

Informatica Economica, 2008

So far, considerable research efforts have been invested in the area of using statistical methods for image processing purposes, yielding a significant number of models that aim to improve as much as possible the existing and currently used processing techniques, some of them being based on wavelet representations of images. Among them, the simplest and most attractive ones use the Gaussian assumption about the distribution of the wavelet coefficients. This model has been successfully used in image denoising and restoration. Its limitation comes from the fact that only the first-order statistics of the wavelet coefficients are taken into account and the higher-order ones are ignored. The dependencies between wavelet coefficients can be formulated explicitly or implicitly. The multiresolution representation is used to develop a class of algorithms for noise removal in the case of normal models. The multiresolution algorithms perform the restoration tasks by combining, at each resolution level, according to a certain rule, the pixels of a binary support image. The values of the support image pixels are either 1 or 0, depending on their significance degree. At each resolution level, the contiguous areas of the support image corresponding to 1-valued pixels are taken as possible objects of the image. Our work reports two attempts at using the multiresolution-based algorithms for restoration purposes in the case of normally distributed noise. Several results obtained using our new restoration algorithm are presented in the final sections of the paper.
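
The following sketch illustrates, under simplifying assumptions, the kind of multiresolution support described above: a hand-rolled two-level Haar transform whose detail coefficients are kept only where they are statistically significant (here, above k·sigma). The wavelet, the number of levels and the significance rule are illustrative choices, not the paper's exact ones.

```python
import numpy as np

def haar_level(img):
    # One level of a 2D Haar transform: low-pass approximation plus three detail sub-bands.
    a = img[0::2, 0::2]; b = img[0::2, 1::2]; c = img[1::2, 0::2]; d = img[1::2, 1::2]
    approx = (a + b + c + d) / 4.0
    return approx, ((a + b - c - d) / 4.0, (a - b + c - d) / 4.0, (a - b - c + d) / 4.0)

def inverse_haar_level(approx, details):
    dh, dv, dd = details
    out = np.empty((approx.shape[0] * 2, approx.shape[1] * 2))
    out[0::2, 0::2] = approx + dh + dv + dd
    out[0::2, 1::2] = approx + dh - dv - dd
    out[1::2, 0::2] = approx - dh + dv - dd
    out[1::2, 1::2] = approx - dh - dv + dd
    return out

def denoise(img, sigma, k=3.0, levels=2):
    approx, stack = img.copy(), []
    for _ in range(levels):
        approx, details = haar_level(approx)
        # Binary support: 1 where the detail coefficient is statistically significant.
        support = [np.abs(d) > k * sigma for d in details]
        stack.append([d * s for d, s in zip(details, support)])
    for details in reversed(stack):
        approx = inverse_haar_level(approx, details)
    return approx

rng = np.random.default_rng(1)
clean = np.kron(rng.integers(0, 2, (8, 8)).astype(float), np.ones((8, 8)))
noisy = clean + rng.normal(scale=0.2, size=clean.shape)
print(np.abs(denoise(noisy, sigma=0.2) - clean).mean() < np.abs(noisy - clean).mean())
```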

PCA Supervised and Unsupervised Classifiers in Signal Processing

Learning schemes in using PCA neural networks for image restoration purposes

WSEAS Transactions on Information Science and Applications, 2008

Image restoration methods are used to improve the appearance of an image by application of a restoration process that uses a mathematical model of image degradation. The restoration can be viewed as a process that attempts to reconstruct or recover an image that has been degraded, using some a priori knowledge about the degradation phenomenon. Principal component analysis allows the identification of a linear transformation such that the axes of the resulting coordinate system correspond to the largest variability of the investigated signal. The advantage of using principal components resides in the fact that the bands are uncorrelated and no information contained in one band can be predicted from knowledge of the other bands; therefore the information contained by each band is maximum for the whole set of bits. The multiresolution support set is a data structure suitable for developing noise removal algorithms. The multiresolution algorithms perform the restoration tasks by combining, at each resolution level, according to a certain rule, the pixels of a binary support image. The multiresolution support can be computed using the statistically significant wavelet coefficients. We investigate the comparative performance of different PCA algorithms derived from Hebbian learning, lateral interaction algorithms and gradient-based learning for digital signal compression and image processing purposes. The final sections of the paper focus on PCA-based approaches to image restoration tasks based on the multiresolution support set, as well as on a PCA-based shrinkage technique for noise removal. The proposed algorithms were tested and some of the results are presented and commented in the final part of each section.
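
The decorrelation claim above (the bands obtained after the PCA transformation are uncorrelated) can be verified numerically; the short sketch below is illustrative, using synthetic correlated data rather than image bands.

```python
import numpy as np

rng = np.random.default_rng(2)
# Correlated "bands": 500 samples of an 8-dimensional signal with strong correlations.
mixing = rng.normal(size=(8, 8))
data = rng.normal(size=(500, 8)) @ mixing

# PCA: eigen-decomposition of the sample covariance matrix.
centered = data - data.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
components = centered @ eigvecs

# The covariance of the projected bands is (numerically) diagonal: the bands are
# uncorrelated, so no band is linearly predictable from the others.
proj_cov = np.cov(components, rowvar=False)
off_diagonal = proj_cov - np.diag(np.diag(proj_cov))
print(np.abs(off_diagonal).max() < 1e-8)
```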

Restaurarea frontierelor unei imagini: abordare genetica

The paper presents an algorithm for restoring the boundaries of an image based on the spectral power of a virtual boundary pixel, computed over a block of the input image. The algorithm comprises two components: a statistical one, used to decide whether the currently processed pixel is a boundary pixel, and a second, genetic one, in which the gray level of the current pixel is modified through a repeated selection process based on the spectral power corresponding to the processed point. Keywords: spectral power, gray level, genetic algorithm, selection, mutation, crossover, image block.
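
A minimal sketch of such a selection-mutation-crossover loop over candidate gray levels is given below. The fitness used here (the non-DC Fourier power of the block containing the candidate pixel), the population size and the mutation scale are illustrative assumptions, since the abstract does not specify the exact fitness criterion.

```python
import numpy as np

def spectral_power(block):
    # Illustrative fitness: total power of the non-DC Fourier coefficients of the block.
    spectrum = np.fft.fft2(block)
    return float(np.sum(np.abs(spectrum) ** 2) - np.abs(spectrum[0, 0]) ** 2)

def evolve_gray_level(block, i, j, pop_size=20, generations=30, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)

    def fitness(gray_levels):
        scores = []
        for g in gray_levels:
            candidate = block.copy()
            candidate[i, j] = g
            scores.append(spectral_power(candidate))
        return np.asarray(scores)

    population = rng.uniform(0.0, 1.0, pop_size)       # candidate gray levels
    for _ in range(generations):
        # Selection: keep the better half of the population.
        parents = population[np.argsort(fitness(population))[pop_size // 2:]]
        # Crossover (averaging two parents) and Gaussian mutation refill the population.
        n_children = pop_size - parents.size
        children = (rng.choice(parents, n_children) + rng.choice(parents, n_children)) / 2.0
        children += rng.normal(scale=0.05, size=n_children)
        population = np.clip(np.concatenate([parents, children]), 0.0, 1.0)
    return population[int(np.argmax(fitness(population)))]

block = np.zeros((8, 8))
block[:, 4:] = 1.0                                      # a vertical boundary
block[3, 4] = 0.5                                       # degraded boundary pixel
print(evolve_gray_level(block, 3, 4, rng=np.random.default_rng(3)))
```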

A Neural Network Framework for Implementing the Bayesian Learning

Pca Based Shrinkage Attempt to Noise Removal

Modele HMM pentru prelucrarea semnalului sonor

The research reported in the paper focused on HMM modeling in pattern recognition, with application to speech processing. A speech recognizer is a device that automatically transcribes speech into text and can be thought of as a voice-actuated "typewriter". The recognizer is usually based on some finite vocabulary that restricts the words that can be printed out. The mathematical formulation of the speech recognizer design problem is based on a statistical approach involving the hidden Markov model (HMM).
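
As an illustration of the statistical machinery behind such a recognizer, the sketch below performs Viterbi decoding of the most likely hidden state sequence in a small discrete HMM; the toy transition and emission tables are illustrative, not taken from the paper.

```python
import numpy as np

def viterbi(obs, start_p, trans_p, emit_p):
    # Dynamic programming over log-probabilities of the best path ending in each state.
    log_delta = np.log(start_p) + np.log(emit_p[:, obs[0]])
    back = []
    for o in obs[1:]:
        scores = log_delta[:, None] + np.log(trans_p)   # (from_state, to_state)
        back.append(np.argmax(scores, axis=0))
        log_delta = np.max(scores, axis=0) + np.log(emit_p[:, o])
    # Backtrack from the best final state.
    path = [int(np.argmax(log_delta))]
    for pointers in reversed(back):
        path.append(int(pointers[path[-1]]))
    return list(reversed(path))

start = np.array([0.6, 0.4])
trans = np.array([[0.7, 0.3], [0.4, 0.6]])
emit = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2, 2], start, trans, emit))        # most likely state sequence
```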

Supervised and Unsupervised Classification for Pattern Recognition Purposes

A cluster analysis task has to identify the grouping trends of data, to decide on the sound clusters, and to somehow validate the resulting structure. The identification of the grouping tendency existing in a data collection assumes the selection of a framework stated in terms of a mathematical model allowing one to express the similarity degree between pairs of particular objects, together with quasi-metrics expressing the similarity between an object and a cluster and between clusters, respectively. In supervised classification, we are provided with a collection of preclassified patterns, and the problem is to label a newly encountered pattern. Typically, the given training patterns are used to learn descriptions of the classes, which in turn are used to label a new pattern. The final section of the paper presents a new methodology for supervised learning based on PCA. The classes are represented in the measurement/feature space by continuous repartitions
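
A minimal sketch of PCA-based supervised classification in this spirit is given below: each class is represented by a continuous (here Gaussian) distribution in the space of the leading principal directions, and a new pattern is assigned to the class with the higher density. The synthetic data, the number of retained directions and the Gaussian class model are illustrative assumptions, not the paper's methodology.

```python
import numpy as np

rng = np.random.default_rng(4)
# Two synthetic classes in a 10-dimensional feature space.
class_a = rng.normal(loc=0.0, size=(100, 10))
class_b = rng.normal(loc=2.0, size=(100, 10))
train = np.vstack([class_a, class_b])

# Projection onto the two leading principal directions of the training data.
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)

def project(x):
    return (x - mean) @ vt[:2].T

# Each class is modelled by a Gaussian density in the reduced space.
def fit_gaussian(points):
    return points.mean(axis=0), np.cov(points, rowvar=False)

def log_density(x, mu, cov):
    diff = x - mu
    return -0.5 * (diff @ np.linalg.solve(cov, diff) + np.log(np.linalg.det(cov)))

models = [fit_gaussian(project(class_a)), fit_gaussian(project(class_b))]
new_pattern = rng.normal(loc=2.0, size=10)              # drawn near class_b
scores = [log_density(project(new_pattern), mu, cov) for mu, cov in models]
print(int(np.argmax(scores)))                           # expected class index: 1
```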

The Use of Features Extracted from Noisy Samples for Image Restoration Purposes

An important feature of neural networks is their ability to learn from their environment and, through learning, to improve their performance in some sense. In the following we restrict the development to the problem of feature-extracting unsupervised neural networks derived on the basis of the biologically motivated Hebbian self-organizing principle, which is conjectured to govern the natural neural assemblies, and the classical principal component analysis (PCA) method used by statisticians for almost a century for multivariate data analysis and feature extraction. The research work reported in the paper aims to propose a new image reconstruction method based on the features extracted from the noise, given by the principal components of the noise covariance matrix.
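
The sketch below illustrates one way this idea could look in practice, under the assumption that the noise covariance is estimated from repeated noisy observations of the same signal and that restoration simply removes the components lying along its leading principal directions; these choices are illustrative, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(5)
clean = np.sin(np.linspace(0, 20 * np.pi, 64))          # a 1-D test signal

def correlated_noise():
    # Correlated (low-frequency) noise: white noise smoothed by a short moving average.
    return np.convolve(rng.normal(size=64), np.ones(5) / 5.0, mode="same")

# Repeated noisy observations sharing the same noise statistics.
observations = np.array([clean + correlated_noise() for _ in range(200)])

# Estimate the noise covariance from the spread of the repeated observations.
noise_cov = np.cov(observations - observations.mean(axis=0), rowvar=False)
eigvals, eigvecs = np.linalg.eigh(noise_cov)

# Illustrative reconstruction: remove the components along the leading principal
# directions of the estimated noise covariance matrix.
leading = eigvecs[:, eigvals > 0.5 * eigvals.max()]
one_noisy = observations[0]
denoised = one_noisy - leading @ (leading.T @ one_noisy)
print(np.linalg.norm(denoised - clean) < np.linalg.norm(one_noisy - clean))
```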

Investigations on the potential of PCA based neural implementation attempts in solving specific tasks in image processing

Self-organization is one of the most important learning paradigms of neural systems. The purpose of an algorithm for self-organizing learning is to discover significant patterns or features in the input data without the help provided by an external teacher. The ability to adapt to the environment without the provision of an external teacher is encountered in nature in most intelligent organisms. In this paradigm, the lack of teaching signals is compensated for by an inner purpose, i.e., some built-in criterion or objective function that the system seeks to optimize. We investigate the comparative performance of different PCA algorithms derived from Hebbian learning, lateral interaction algorithms and gradient-based learning for digital signal compression and image processing purposes. The final sections of the paper focus on PCA-based approaches to image restoration tasks and on a PCA-based shrinkage technique for noise removal. The proposed algorithms were tested and some of the resu...
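
As a concrete instance of Hebbian PCA learning, the sketch below implements Oja's single-neuron rule, whose weight vector converges to the first principal direction of zero-mean input data; the learning rate and the synthetic data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)
# Synthetic 2-D data whose first principal direction is roughly (1, 1)/sqrt(2).
data = rng.normal(size=(5000, 2)) @ np.array([[2.0, 1.5], [1.5, 2.0]])

# Oja's rule: a Hebbian update with a normalizing decay term.
w = rng.normal(size=2)
eta = 0.001
for x in data - data.mean(axis=0):
    y = w @ x
    w += eta * y * (x - y * w)

# Compare against the leading eigenvector of the sample covariance.
eigvals, eigvecs = np.linalg.eigh(np.cov(data, rowvar=False))
principal = eigvecs[:, -1]
print(abs(w @ principal) / np.linalg.norm(w))           # close to 1 when aligned
```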

A new adaptive PCA scheme for noise removal in image processing

The research reported in the paper focused on the development of a new adaptive scheme based on the use of principal directions (CSPCA). The proposed method is based exclusively on the information extracted from a series of noisy images that share the same statistical properties. Basically, the idea is that, given a signal corrupted by additive Gaussian noise, a soft shrinkage of the sparse components can be used to reduce the noise. In our CSPCA algorithm a shrinkage step is applied in the transformed space. A new variant of the CSPCA noise removal algorithm is considered, yielding an adaptive learning technique. A series of comments concerning the experimental results are presented in the final section of the paper.
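
A hedged sketch of shrinkage in the PCA-transformed space is given below: the principal directions are estimated from a series of noisy signals sharing the same statistics, the noise level is read off the trailing eigenvalues, and the component scores are soft-thresholded. The universal-style threshold and the synthetic data are assumptions for illustration, not the CSPCA algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(7)
# A series of noisy signals sharing the same second-order statistics: random
# mixtures of two fixed smooth patterns plus additive Gaussian noise.
t = np.linspace(0, 1, 64)
patterns = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
clean = rng.normal(size=(300, 2)) @ patterns
noisy = clean + rng.normal(scale=0.3, size=clean.shape)

# Principal directions estimated from the noisy series itself.
mean = noisy.mean(axis=0)
centered = noisy - mean
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
scores = centered @ eigvecs

# Noise level estimated from the trailing eigenvalues (noise-only directions),
# followed by soft shrinkage of the scores in the transformed space.
sigma2 = eigvals[:-2].mean()                            # assumes 2 signal directions
threshold = np.sqrt(2.0 * sigma2 * np.log(noisy.shape[1]))
shrunk = np.sign(scores) * np.maximum(np.abs(scores) - threshold, 0.0)
denoised = shrunk @ eigvecs.T + mean

print(np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))
```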

PCA-based data mining probabilistic and fuzzy approaches with applications in pattern recognition

A New Unsupervised Learning Scheme to Classify Data of Relative Small Volume

Economic Computation and Economic Cybernetics Studies and Research / Academy of Economic Studies
