Error Correction Code Research Papers
Lexicographic construction is a greedy algorithm that produces error-correcting codes known as lexicodes. The most surprising fact about lexicodes is that, contrary to the first impression that they have no interesting structure, they possess distinctive characteristics. In this project we examine these distinctive characteristics of the lexicodes in order to produce the optimal algorithm for the
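The greedy construction described above can be sketched in a few lines of Python (a straightforward reference implementation for binary codes, not the optimized algorithm this project develops): iterate over F_2^n in lexicographic order and keep every vector whose Hamming distance to all previously kept vectors is at least d.

```python
from itertools import product

def hamming_distance(u, v):
    """Number of coordinates in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def lexicode(n, d):
    """Greedily build the binary lexicode of length n and minimum distance d:
    scan F_2^n in lexicographic order, keeping each vector that is at
    distance >= d from every vector kept so far."""
    code = []
    for v in product((0, 1), repeat=n):  # lexicographic order over F_2^n
        if all(hamming_distance(v, c) >= d for c in code):
            code.append(v)
    return code
```

Despite the greedy construction, the result is always a linear code; for example, `lexicode(7, 3)` reproduces the 16 codewords of the [7,4] Hamming code, a classical illustration of the hidden structure mentioned above.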
Poset and block metrics were introduced in recent years as alternative metrics for studying error-correcting codes. Poset-block codes, introduced in 2008, encompass both poset and block metrics. In this paper, we study a family of such metrics, the Niederreiter-Rosenbloom-Tsfasman block metrics. In this context, we classify the classes of equivalent codes, describe canonical representatives of each class, and develop much of the classical theory of error-correcting codes for Niederreiter-Rosenbloom-Tsfasman block codes, including determination of the packing radius and classification of MDS and perfect codes, determination of the covering radius and characterization of quasi-perfect codes, and the description of an algorithm for syndrome decoding.
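As one concrete piece of the setup above, the Niederreiter-Rosenbloom-Tsfasman block weight over a chain poset can be sketched as follows (a sketch under the standard poset-block weight definition, where the weight is the sum of the block dimensions in the order ideal generated by the block support; the function name and the flat-list block-splitting convention are ours):

```python
def nrt_block_weight(x, block_sizes):
    """NRT block weight of x over a chain of blocks with dimensions
    block_sizes = (k_1, ..., k_s): if the last block containing a nonzero
    entry is block M, the weight is k_1 + ... + k_M (and 0 if x = 0)."""
    blocks, i = [], 0
    for k in block_sizes:          # split the flat vector into its blocks
        blocks.append(x[i:i + k])
        i += k
    support = [j for j, b in enumerate(blocks) if any(b)]
    if not support:
        return 0
    M = max(support)               # largest block index in the support
    return sum(block_sizes[:M + 1])
```

With all block sizes equal to 1 this reduces to the classical Rosenbloom-Tsfasman weight; the NRT block distance between two vectors is then the weight of their difference.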
- by Marcelo Firer
- Algorithm, Decoding, Metric, Block Codes
The use of biometrics (e.g., fingerprints, irises, faces) for recognizing individuals is becoming increasingly popular, and many applications are already available. Biometrics are intrinsically associated with individuals and cannot be forgotten or shared with others. However, one of the most relevant vulnerabilities of biometrics is that once a biometric template is compromised, it cannot be reissued, updated or destroyed: an attacker could then gain access to every account, service or application using that same biometric trait. This paper proposes a biometric verification system based on distributed source coding principles, offering enhanced security compared with traditional biometric verification systems. The generation of different templates from the same biometric data is supported, as well as cancelable templates. Furthermore, the original biometric data cannot be recovered from the stored data, thus guaranteeing its privacy.
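A minimal sketch of the distributed-source-coding idea behind such systems: store only an ECC syndrome of the enrollment template, then treat a fresh noisy reading as a correlated source and recover the template by decoding the syndrome difference. The sketch below uses the binary Hamming(7,4) parity-check matrix, which corrects a single bit of noise; the helper names are illustrative assumptions, not the paper's actual construction.

```python
# Parity-check matrix of the [7,4] Hamming code: column j (1-based)
# is the binary representation of j, so a single-error syndrome
# directly encodes the error position.
H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(bits):
    return [sum(h * b for h, b in zip(row, bits)) % 2 for row in H]

def enroll(template):
    # Store only the 3-bit syndrome, never the 7-bit template itself.
    return syndrome(template)

def recover(noisy, stored_syndrome):
    # syndrome(noisy) XOR stored_syndrome = syndrome of the error pattern
    s = [a ^ b for a, b in zip(syndrome(noisy), stored_syndrome)]
    pos = s[0] * 4 + s[1] * 2 + s[2]  # 1-based error position, 0 = no error
    out = list(noisy)
    if pos:
        out[pos - 1] ^= 1
    return out
```

A real system would operate on much longer templates with stronger codes, and would compare a hash of the recovered template rather than the template itself.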
The determination of the number of neurons (H) in hidden layers is very important, as it affects the training time and generalization ability of neural networks. A higher value of H may force the network to memorize (as opposed to generalize) the patterns it has seen during ...
A PIN or password used for authentication can be easily attacked. This limitation has triggered the use of biometrics for secure transactions. A biometric is unique to each individual and is reliable. Among the biometric modalities in current use, the iris is the most accurate, and it remains stable throughout a person's life. However, the major challenge with iris and other biometrics for authentication is intra-user variability in the acquired identifiers: irises of the same person captured at different times may differ due to signal noise from the iris camera. Traditional cryptographic methods cannot encrypt and store a biometric template and then perform matching directly, because minor changes in the bits of the feature set extracted from the iris may lead to a huge difference in the encrypted result. In our approach, an iris biometric template is secured using the iris biometric together with passwords. An Error Correction Code (ECC) is introduced to reduce the variability and noise of the iris data. Experimental results show that this approach can assure higher security with a low false rejection and false acceptance rate. The successful iris recognition rate using this approach is up to 97%.
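One standard way an ECC absorbs intra-user variability is the fuzzy-commitment scheme: a key (which could be derived from the user's password) is encoded, XOR-masked with the iris bits at enrollment, and unmasked-then-decoded at verification so that small bit differences are corrected away. The sketch below uses a 3-fold repetition code for clarity; the helper names, code choice, and key handling are illustrative assumptions, not the paper's exact method.

```python
import hashlib

def encode(key_bits, r=3):
    # Repetition code: each key bit is repeated r times.
    return [b for b in key_bits for _ in range(r)]

def decode(code_bits, r=3):
    # Majority vote per r-bit group corrects up to (r-1)//2 flips per group.
    return [int(sum(code_bits[i:i + r]) > r // 2)
            for i in range(0, len(code_bits), r)]

def commit(iris_bits, key_bits):
    # Store the masked codeword plus a hash of the key (never the key itself).
    c = [a ^ b for a, b in zip(encode(key_bits), iris_bits)]
    return c, hashlib.sha256(bytes(key_bits)).hexdigest()

def verify(iris_noisy, c, key_hash):
    # Unmask with the fresh reading; residual bit errors land on the
    # codeword, where the ECC removes them before the key is re-hashed.
    key = decode([a ^ b for a, b in zip(c, iris_noisy)])
    return hashlib.sha256(bytes(key)).hexdigest() == key_hash
```

A deployed system would use far longer iris codes (e.g., 2048-bit IrisCodes) and a stronger code such as a BCH or Reed-Solomon code, tuned so the correction capacity matches the expected intra-user bit-error rate.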