Carmen Sánchez Ávila - Profile on Academia.edu

Papers by Carmen Sánchez Ávila

Graphemic-phonetic diachronic linguistic invariance of the frequency and of the Index of Coincidence as cryptanalytic tools

PLOS ONE, Mar 19, 2019

Languages have inherent characteristics that make them distinct, differentiated entities within their phyla and families. Even messages written in a language and later encrypted by cryptographic systems do not lose all of these characteristics; aspects remain that help the cryptanalyst recover them without knowing the decryption keys. For the characterization of the languages we consider the frequencies of their graphemic and phonetic units and the Index of Coincidence, tools of fundamental utility in the field of cryptography. Their diachronic invariance, or survival over time within one language, and their ability to discriminate between languages are analyzed. To do so, we examine a total of 101 languages through 261 texts, all very diverse in style and period, spanning a wide linguistic and temporal spectrum from the 6th century BC to the present day.
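
The Index of Coincidence mentioned here has a standard closed form, IC = sum over letters of f(f-1) / (N(N-1)). A minimal sketch of that formula follows; it is not the paper's full per-language pipeline, which also covers phonetic units:

```python
from collections import Counter

def index_of_coincidence(text: str) -> float:
    """Probability that two letters drawn at random (without
    replacement) from the text are identical."""
    letters = [c for c in text.lower() if c.isalpha()]
    n = len(letters)
    if n < 2:
        return 0.0
    counts = Counter(letters)
    return sum(f * (f - 1) for f in counts.values()) / (n * (n - 1))

# English plaintext typically scores near 0.066; a flat (random)
# distribution over 26 letters gives about 1/26 = 0.0385.
print(index_of_coincidence("the quick brown fox jumps over the lazy dog"))
```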

Elliptic curve cryptography on constraint environments

The main target of this work is to expose the capacities that make elliptic curve cryptography the most suitable one to be implemented in environments with several constraints related to processor speed, bandwidth, security and memory. We have compared several elliptic curve cryptosystems with other public key ones. We have made a comparison among different public key cryptosystems (such as ElGamal for encryption and Diffie-Hellman for key exchange) and the corresponding ones based on elliptic curve theory, highlighting algorithm speed characteristics.
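
To illustrate the elliptic-curve operations being compared, here is a toy double-and-add scalar multiplication and an ECDH-style shared secret over a tiny prime field. The curve parameters, base point and private keys are invented for the example; real systems use standardized curves and constant-time libraries.

```python
# Toy short Weierstrass curve y^2 = x^3 + a*x + b over F_p (illustration only)
p, a, b = 97, 2, 3
O = None  # point at infinity

def point_add(P, Q):
    if P is O: return Q
    if Q is O: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return O                                         # P + (-P) = O
    if P == Q:
        m = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p   # tangent slope
    else:
        m = (y2 - y1) * pow(x2 - x1, -1, p) % p          # chord slope
    x3 = (m * m - x1 - x2) % p
    return (x3, (m * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    """Double-and-add: O(log k) group operations."""
    R = O
    while k:
        if k & 1:
            R = point_add(R, P)
        P = point_add(P, P)
        k >>= 1
    return R

G = (3, 6)                 # on the curve: 6^2 = 36 = 3^3 + 2*3 + 3 (mod 97)
dA, dB = 13, 29            # toy private keys
QA, QB = scalar_mult(dA, G), scalar_mult(dB, G)
assert scalar_mult(dA, QB) == scalar_mult(dB, QA)   # ECDH shared point
```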

Red de mentoría en entornos universitarios españoles: resultados de un análisis comparativo

Revista Mentoring & Coaching, Número 3 (2010), 13-29. Universidad Politécnica de Madrid / Universidad Complutense de Madrid.

The Rijndael block cipher (AES proposal): a comparison with DES

In October 2000, after three years of competition between 15 candidate algorithms, the National Institute of Standards and Technology (NIST) chose the Rijndael algorithm to be adopted as the Advanced Encryption Standard (AES) by the U.S. Department of Commerce, replacing the Data Encryption Standard (DES), which had been the standard since 1977. The authors analyze the structure and design of the new AES, following three criteria: a) resistance against all known attacks; b) speed and code compactness on a wide range of platforms; and c) design simplicity; as well as its similarities to and dissimilarities from other symmetric ciphers. On the other hand, the principal advantages of the new AES with respect to DES and T-DES, as well as its limitations, are investigated. Two of the principal advantages of the new cipher are that the cipher and its inverse use different components, which practically eliminates the possibility of weak and semi-weak keys, as exist for DES, and the non-linearity of the key expansion, which practically eliminates the possibility of equivalent keys. Finally, the implementation aspects of the Rijndael cipher and its inverse are treated. Although Rijndael is well suited to efficient implementation on a wide range of processors and in dedicated hardware, we have concentrated our study on 8-bit processors, typical of current smart cards, and on 32-bit processors, typical of PCs.
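
For context on the block and key sizes under comparison, a short sketch using the pyca/cryptography package (an assumed third-party dependency, not something from the paper) encrypting one 128-bit block with AES in CBC mode:

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)   # AES-128 key; DES offered only 56 effective key bits
iv = os.urandom(16)    # AES block size is 128 bits, double DES's 64
msg = b"sixteen byte msg"          # exactly one AES block, so no padding

enc = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
ct = enc.update(msg) + enc.finalize()
dec = Cipher(algorithms.AES(key), modes.CBC(iv)).decryptor()
assert dec.update(ct) + dec.finalize() == msg
```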

An adaptive regularization approach for solving discrete ill-posed problems: application to the harmonic signals restoration

Análisis y síntesis de señales musicales mediante transformada Wavelet

Redes neuronales: un camino abierto a la inteligencia artificial... ¿Hasta dónde? Objetivo: potenciar al máximo el cerebro humano, sin suplantar su originalidad, su libertad

A New Edit Distance for Fuzzy Hashing Applications

Similarity preserving hashing applications, also known as fuzzy hashing functions, help to analyse the content of digital devices by performing a resemblance comparison between different files. In practice, the similarity matching procedure is a two-step process: first a signature associated with each file under comparison is generated, and then the signatures themselves are compared. Even though ssdeep is the best-known application in this field, the edit distance algorithm that ssdeep uses for signature comparison is not well suited to certain scenarios. In this contribution we present a new edit distance algorithm that better reflects the similarity of two strings and that can be used by fuzzy hashing applications to improve their results.
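
The baseline being improved upon can be made concrete with the classical Levenshtein edit distance. This is the textbook algorithm, not the new distance the authors propose, and the signature-like strings are invented for the example:

```python
def levenshtein(s: str, t: str) -> int:
    """Classical edit distance (insert/delete/substitute, unit cost)."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        curr = [i]
        for j, ct in enumerate(t, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (cs != ct)))   # substitution
        prev = curr
    return prev[-1]

print(levenshtein("3:AXr6", "3:AYr6"))  # 1: signatures differ in one symbol
```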

Aplicaciones de la Criptografía de Curva Elíptica

Novática: Revista de la Asociación de Técnicos de Informática, 2004

The main objective of this work is to set out the qualities that make elliptic curve cryptography the kind of cryptography best suited for use on platforms with various constraints related to processor speed, bandwidth, security and memory. Different characteristics of public-key algorithms have been compared with those of elliptic curve algorithms to show the advantages of using elliptic curve cryptography. Specifically, a comparison has been carried out between different public-key cryptosystems (such as ElGamal for encryption and Diffie-Hellman for key exchange) and the corresponding ones based on elliptic curve theory, mainly highlighting the speed characteristics of the algorithms. In addition, the different functions called by the elliptic curve algorithms have been analyzed in order to find the slowest, most critical ones.

El paradero actual de la talla original de Nuestra Señora de Candelaria

Estudios Canarios: Anuario del Instituto de Estudios Canarios, 2017

The original wood carving of the Virgin of Candelaria, from the Tenerife temple of the same name, disappeared in the flood of 1826. Despite attempts to locate it after the disaster that devastated the church and convent of the Dominican friars where it stood, nothing could be achieved, and it was replaced by the present carving, made the following year. After recalling the events of the devastation, we set out all the possible options regarding the fate of the Marian image, both on land and at sea. We analyze the possible types of wood it could have been made of and its ability to stay afloat in the water, as well as the destinations and courses it could have taken had it fallen there. Finally, among the plausible hypotheses, we propose what we consider the most likely present location and condition of the carving of Candelaria.

Principal component analysis for ear-based biometric verification

Biometrics is an active research field that is increasingly being integrated into current technology. As a result, more and more people are becoming familiar with biometric techniques such as fingerprint or facial recognition. Nevertheless, there are innovative techniques, such as ear-based biometrics, which are not yet well known because they are at the initial stages of research. In this work, an ear geometry-based biometric verification system oriented to recognition on mobile phones is presented and evaluated. Feature extraction is carried out by means of Principal Component Analysis and feature matching is performed by a distance-based classifier using Euclidean and Eigen distances. The evaluation has been made according to the specifications of the ISO/IEC 19795 standard, following a zero-effort falsification scenario. To this end, a database of ear images of different users has been captured using the camera of a smartphone. The Equal Error Rate obtained is 1.11% when the Euclidean distance is applied for feature comparison and 5.56% in the case of the Eigen distance.
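
A minimal sketch of the verification pipeline described (PCA features plus Euclidean matching), with random arrays standing in for the ear images and an arbitrary acceptance threshold; in the real system the threshold would be tuned from the error-rate analysis:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
gallery = rng.random((40, 64 * 48))       # stand-ins for flattened ear images
probe = gallery[7] + 0.01 * rng.standard_normal(64 * 48)  # noisy same-user image

pca = PCA(n_components=20).fit(gallery)   # eigen-ear subspace
gal_feats = pca.transform(gallery)
probe_feat = pca.transform(probe.reshape(1, -1))

dists = np.linalg.norm(gal_feats - probe_feat, axis=1)    # Euclidean matching
claimed_id = 7
accepted = dists[claimed_id] < 1.0        # illustrative threshold only
print(f"distance={dists[claimed_id]:.3f}, accepted={accepted}")
```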

Improving access control security using iris identification

When an access control system is going to be developed or installed, the security needed in the application where that system will be applied should be studied in depth. This security environment can rule out the use of passwords or PINs, due to their low level of user ...

Two different approaches for iris recognition using Gabor filters and multiscale zero-crossing representation

Pattern Recognition, Feb 1, 2005

The importance of biometric user identification is increasing every day. One of the most promising techniques is the one based on the human iris. In this work, the authors describe different approaches to developing this biometric technique. Building on the work carried out by Daugman, the authors have worked with Gabor filters and the Hamming distance. In addition, they have also worked on the zero-crossing representation of the dyadic wavelet transform applied to two different iris signatures: one based on a single virtual circle of the iris, the other based on an annular region. Other metrics have also been applied and compared with the results obtained with the Hamming distance; here the Euclidean distance and d_Z are shown. The last proposed approach is translation, rotation and scale invariant. Results show a classification success rate of up to 99.6%, an equal error rate down to 0.12%, and the possibility of achieving null false acceptance rates with very low false rejection rates.
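
The Hamming-distance comparison borrowed from Daugman's work reduces to an XOR and a popcount over binary iris codes. A sketch with synthetic codes (the code length and noise level are invented for the example):

```python
import numpy as np

rng = np.random.default_rng(1)
code_a = rng.integers(0, 2, 2048, dtype=np.uint8)   # binary iris code
code_b = code_a.copy()
flip = rng.choice(2048, 150, replace=False)         # intra-class noise
code_b[flip] ^= 1

# Fractional Hamming distance: share of disagreeing bits.
hd = np.count_nonzero(code_a ^ code_b) / code_a.size
print(hd)   # about 0.073: same-eye codes fall well below the ~0.32
            # decision region commonly cited in the iris literature
```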

Deep Learning for Facial Recognition on Single Sample per Person Scenarios with Varied Capturing Conditions

Applied Sciences, Dec 13, 2019

Single sample per person verification has received considerable attention because of its relevance in security, surveillance and border crossing applications. Nowadays, e-voting and bank-of-the-future solutions also join this scenario, opening this field of research to mobile and low-resource devices. These scenarios are characterised by the availability of a single image during the enrolment of the users into the system, so they require a solution able to extract knowledge from previous experiences and similar environments. In this study, two deep learning models for face recognition, specially designed for applications on mobile devices and resource-saving environments, are described and evaluated together with two publicly available models. This evaluation aims not only to provide a fair comparison between the models but also to measure to what extent a progressive reduction of the model size influences the obtained results. The models were assessed in terms of accuracy and size with the aim of providing a detailed evaluation covering as many environmental conditions and application requirements as possible. To this end, a well-defined evaluation protocol and a great number of varied databases, public and private, were used.
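
In a single-sample-per-person setting, verification typically reduces to comparing the single enrolment embedding with a probe embedding under a threshold. A generic sketch with random vectors standing in for the CNN outputs; the dimensionality and threshold are illustrative, not taken from the paper:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# enrol_emb: the one embedding stored at enrolment time;
# probe_emb: embedding of the image presented at verification time.
enrol_emb = np.random.default_rng(2).standard_normal(128)
probe_emb = enrol_emb + 0.1 * np.random.default_rng(3).standard_normal(128)

THRESHOLD = 0.5   # operating point chosen from an EER/DET analysis
print(cosine_similarity(enrol_emb, probe_emb) >= THRESHOLD)
```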

A comparative study of palmprint feature extraction methods for contact-less biometrics under different environmental conditions

Interest in palmprint biometrics has grown strongly in recent decades due to its useful characteristics, such as uniqueness, permanence, reliability, user-friendliness, acceptability, non-intrusiveness and the low cost of the acquisition devices, which make it attractive for civil and commercial applications. Accordingly, extensive research has been carried out in this field. Nevertheless, there is a lack of evaluation of the results under a common framework that permits a fair comparison between different methods. In this work an evaluation methodology based on the definitions suggested by the ISO/IEC 19795 standard is provided and applied to evaluate different palmprint feature extraction methods and matching approaches in terms of accuracy and computation time, offering a basis against which to compare future research. Given the growing trend towards touch-less biometrics promoted by the increasing use of mobile devices, the evaluation has been carried out using two different contact-less hand databases. The first database was recorded under totally controlled environmental conditions, with the user given some feedback about the recorded images during the acquisition process, while the second introduces lighting variations as well as no pose restrictions during the capture.
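
The accuracy side of such an evaluation usually reports the Equal Error Rate. A simplified sketch of its computation from genuine and impostor similarity scores (synthetic here), far more compact than the full ISO/IEC 19795 methodology:

```python
import numpy as np

def equal_error_rate(genuine: np.ndarray, impostor: np.ndarray) -> float:
    """EER: operating point where false-reject and false-accept rates cross."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    frr = np.array([(genuine < t).mean() for t in thresholds])   # rejects of genuines
    far = np.array([(impostor >= t).mean() for t in thresholds]) # accepts of impostors
    idx = np.argmin(np.abs(frr - far))
    return (frr[idx] + far[idx]) / 2

rng = np.random.default_rng(4)
genuine = rng.normal(0.8, 0.1, 500)    # similarity scores, same hand
impostor = rng.normal(0.4, 0.1, 500)   # similarity scores, different hands
print(equal_error_rate(genuine, impostor))
```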

A comparative survey on supervised classifiers for face recognition

During the last decades, several different techniques have been proposed for computer recognition of human faces. A further step in the development of these biometrics is to implement them in portable devices, such as mobile phones. Due to these devices' features and limitations, it is necessary to select, among the currently available algorithms, the one with the best performance in terms of overall elapsed time and correct identification rates. The aim of this paper is to offer a study complementary to previous works, focusing on the performance of different supervised classifiers, such as the Normal Bayesian Classifier, neural architectures and distance-based algorithms. In addition, we analyse the efficiency of all the proposed algorithms over public face databases (ORL, FERET, NIST and the Face Recognition Data from the University of Essex). Each of these databases contains a different number of individuals and samples, and they present variations among images from the same user (scale, pose, expression, illumination, ...). We expect to simulate many of the situations which arise when dealing with face recognition on mobile phones. In order to get a complete comparison, all the proposed algorithms have been implemented and run over all the databases, using the same computer. Different parametrizations for each algorithm have also been tested. Bayesian classifiers and distance-based algorithms turn out to be the most suitable, as their parametrization is simple, their training stage is not as time-consuming as that of other methods, and their classification results are satisfying.
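
A sketch of this kind of comparison using scikit-learn's Olivetti faces (the ORL database named in the abstract, as redistributed by scikit-learn) with PCA features; the classifier settings and feature dimensionality are illustrative, not the paper's exact parametrizations:

```python
import time
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

faces = fetch_olivetti_faces()            # 400 images, 40 subjects (ORL)
Xtr, Xte, ytr, yte = train_test_split(
    faces.data, faces.target, test_size=0.25,
    stratify=faces.target, random_state=0)

pca = PCA(n_components=50).fit(Xtr)       # eigenface-style features
Xtr_p, Xte_p = pca.transform(Xtr), pca.transform(Xte)

# Compare a Bayesian classifier against a distance-based (1-NN) one.
for clf in (GaussianNB(), KNeighborsClassifier(n_neighbors=1)):
    t0 = time.perf_counter()
    acc = clf.fit(Xtr_p, ytr).score(Xte_p, yte)
    dt = time.perf_counter() - t0
    print(f"{type(clf).__name__}: accuracy={acc:.3f}, time={dt:.3f}s")
```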

Wavelet domain signal deconvolution with singularity-preserving regularization

Mathematics and Computers in Simulation, 2003

In this paper, we consider a wavelet-based singularity-preserving regularization scheme for use in signal deconvolution problems. The inverse problem of finding solutions with singularities to discrete Fredholm integral equations of the first kind arises in many applied fields, e.g. in geophysics. This equation is usually an ill-posed problem when considered in a Hilbert space framework, requiring regularization techniques to control arbitrary error amplification and to obtain adequate solutions. Considering the joint detection-estimation character of this kind of signal deconvolution problem, we introduce two novel algorithms which involve two principal steps at each iteration: (a) detecting the positions of the singularities by a nonlinear projection selection operator based on the estimation of Lipschitz regularity using the discrete dyadic wavelet transform; and (b) estimating the amplitudes of these singularities by obtaining a regularized solution of the original equation using the a priori knowledge and the above approximation. Some simulation examples serve to show the high performance of the proposed techniques on this kind of problem.
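
To make the regularization step concrete, here is a plain zeroth-order Tikhonov deconvolution of a spiky signal through a Gaussian blur. The paper's contribution is precisely to go beyond this fixed-lambda baseline by adapting the regularization around wavelet-detected singularities; all sizes and constants below are illustrative:

```python
import numpy as np

# Discrete model g = K f + noise, with K a (circulant) Gaussian blur matrix.
n = 128
t = np.arange(n)
kernel = np.exp(-0.5 * ((t - n // 2) / 3.0) ** 2)
kernel /= kernel.sum()
K = np.array([np.roll(kernel, i - n // 2) for i in range(n)])

f_true = np.zeros(n)
f_true[40], f_true[80] = 1.0, -0.7        # signal with two singularities
g = K @ f_true + 0.01 * np.random.default_rng(5).standard_normal(n)

lam = 1e-2                                # fixed regularization weight
# Tikhonov normal equations: (K^T K + lam I) f_hat = K^T g
f_hat = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)
print(int(np.abs(f_hat).argmax()))        # dominant spike recovered near 40
```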

Quality Measurements for Iris Images in Biometrics

A method for analyzing the quality of iris images is presented in this paper. Iris biometrics for personal identification is based on capturing an eye photograph and obtaining a feature vector that allows a human being to be identified. ...

XVII Reunión española sobre criptología y seguridad de la información. RECSI 2022

Editorial Universidad de Cantabria eBooks, Sep 29, 2022

In recent years, the field of side-channel analysis has seen a revolution in the design of the attack methodology. Conventional approaches, which require the use of highly specialized equipment like oscilloscopes and spectrum analyzers, despite being highly precise, might be regarded as impractical in some scenarios. On the other hand, the use of less accurate internal sensors which can monitor the power footprint of a circuit has risen in popularity. In particular, delay sensors such as those based on Time-to-Digital converters and Ring Oscillators have shown promising results. These structures are interesting since they can be implemented from regular hardware resources available in most circuits. This means that components already available in the target might be leveraged to implement a side-channel attack. Moreover, it is not really necessary to have direct access to the platform to carry out such an attack, which implies that if there is a remote link such as Ethernet, an adversary might be able to perform Remote Power Analysis (RPA) of the system. So far, the main challenge for the success of this kind of attack has been cutting and aligning the power traces, which is usually achieved through secondary digital channels carrying trigger information. In this paper, we propose to use a single channel to encode both the power trace and the alignment information. This is achieved by exploiting architectural vulnerabilities of the platform. Our results demonstrate, for the very first time, that RPA traces can be auto-aligned. As a case study we attempt to perform RPA on a serialized implementation of Photon-Beetle, a finalist in the NIST lightweight cryptography standardization process.
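
The trace-alignment problem discussed here is classically solved by cross-correlating each captured trace against a reference. A sketch of that baseline step with synthetic traces; the paper's contribution, by contrast, is recovering the alignment information from the same channel as the power trace itself:

```python
import numpy as np

def align_trace(trace: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Circularly shift `trace` to its best overlap with `reference`,
    using circular cross-correlation computed via the FFT."""
    corr = np.fft.ifft(np.fft.fft(reference) * np.conj(np.fft.fft(trace))).real
    return np.roll(trace, int(np.argmax(corr)))

rng = np.random.default_rng(6)
reference = rng.standard_normal(1000)                 # synthetic power trace
misaligned = np.roll(reference, 137) + 0.05 * rng.standard_normal(1000)
aligned = align_trace(misaligned, reference)
print(np.corrcoef(aligned, reference)[0, 1])          # ~1.0 once aligned
```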

Deconvolución de señales con discontinuidades mediante regularización adaptativa con estimación de la regularidad Lipschitz
