Encoding Research Papers - Academia.edu
A Parallel Genetic Algorithm for Rule Discovery in Large Databases. Dieferson Luis Alves de Araujo¹, Heitor S. Lopes¹, Alex A. Freitas². CEFET-PR - Centro Federal de Educação Tecnológica do Paraná, CPGEI - Curso ...
Cognitive radio has the ability to sense the environment and adapt its behavior to optimize communication features such as quality of service in the presence of interference and noise. To achieve this goal in the physical layer, different phases of sensing, channel estimation, and configuration selection are necessary. The sensing part measures the interference level, recognizes the spectrum holes, and sends this information to the channel estimator. In the next step, channel state information (CSI) is used for data detection and is also sent to the transmitter through a limited feedback link. CSI feedback consists of the achievable rate, the SNR value, and the modulation and coding scheme (MCS). The feedback link improves system performance at the cost of complexity and delay. In this paper, we present and compare different feedback schemes for cognitive radio and study the channel capacity when an imperfect feedback link is corrupted by noise and delay.
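As a rough illustration of why feedback quality matters, the following sketch simulates how outdated CSI degrades throughput over a Rayleigh fading channel. The first-order Gauss-Markov outdating model, the crude outage test, and all parameter values are illustrative assumptions, not the schemes compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                    # Monte Carlo channel realizations
snr = 10.0                     # average SNR, linear scale (= 10 dB)

# Rayleigh fading channel gains.
h = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

def throughput(rho):
    """Throughput when the fed-back CSI is outdated by correlation rho."""
    e = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)
    h_fb = rho * h + np.sqrt(1 - rho**2) * e     # what the transmitter sees
    rate = np.log2(1 + snr * np.abs(h_fb) ** 2)  # rate adapted to feedback
    ok = np.abs(h) ** 2 >= np.abs(h_fb) ** 2     # crude outage test
    return np.mean(rate * ok)

for rho in (1.0, 0.9, 0.5):    # rho = 1: perfect feedback; lower = more delay
    print(f"rho = {rho:.1f}: ~{throughput(rho):.2f} bit/s/Hz")
```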
With the technological advancement in body area sensor networks (BASNs), low-cost, high-quality electrocardiographic (ECG) diagnosis systems have become important equipment for healthcare service providers. However, energy consumption and data security in ECG systems in BASNs are still two major challenges to tackle. In this study, we investigate the properties of compressed ECG data for energy saving as an effort to devise a selective encryption mechanism and a two-rate unequal error protection (UEP) scheme. The proposed selective encryption mechanism provides a simple yet effective security solution for an ECG sensor-based communication platform, where only one percent of the data is encrypted without compromising ECG data security. This encrypted part of the data is essential to ECG data quality because of its unequally important contribution to distortion reduction. The two-rate UEP scheme achieves a significant additional energy saving through its unequal investment of communication energy in the outcomes of the selective encryption, and thus maintains a high ECG data transmission quality. Our results show improvements in communication energy saving of about 40% and demonstrate higher transmission quality and security measured in terms of wavelet-based weighted percent root-mean-squared difference.
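A minimal sketch of the selective-encryption idea, under stated assumptions: a DCT stands in for the paper's wavelet transform, AES-CTR from the third-party cryptography package encrypts only the ~1% largest-magnitude coefficients, and the key/nonce handling is deliberately simplistic.

```python
import os
import numpy as np
from scipy.fft import dct
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Stand-in ECG frame (a real system would read sensor samples).
ecg = np.sin(np.linspace(0, 20 * np.pi, 2048)) + 0.05 * np.random.randn(2048)

coeffs = dct(ecg, norm="ortho")           # energy-compacting transform
k = max(1, int(0.01 * coeffs.size))       # ~1% of the coefficients
top = np.argsort(np.abs(coeffs))[-k:]     # indices of the largest ones

key, nonce = os.urandom(32), os.urandom(16)   # illustrative key handling
enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
ciphertext = enc.update(coeffs[top].tobytes()) + enc.finalize()

# Two-rate UEP idea: the small encrypted part gets strong protection, the
# remaining 99% of coefficients travel in cleartext with light protection.
print(f"{k} of {coeffs.size} coefficients encrypted "
      f"({len(ciphertext)} bytes of ciphertext)")
```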
During the last decades, steganography has found many applications. Many steganographic systems have been developed and used in various areas, e.g., digital rights management (DRM), telecommunications, and medicine. In this paper, we prove that the set CF, which is a union of a certain set of Fibonacci numbers and a certain set of Catalan numbers, satisfies conditions similar to those of Zeckendorf's theorem. Therefore, it can be used for the encoding of data. Using this result, we propose a method that improves the Fibonacci data hiding technique.
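A sketch of the underlying mechanism: a greedy Zeckendorf-style decomposition over a combined base. Which Fibonacci and Catalan numbers belong to CF, and the exact non-adjacency conditions, are defined in the paper; the plain union below is only an illustrative stand-in.

```python
def fibonacci(limit):
    """Fibonacci numbers 1, 2, 3, 5, ... up to limit (Zeckendorf base)."""
    a, b, out = 1, 2, []
    while a <= limit:
        out.append(a)
        a, b = b, a + b
    return out

def catalan(limit):
    """Catalan numbers 1, 1, 2, 5, 14, ... up to limit."""
    c, n, out = 1, 0, []
    while c <= limit:
        out.append(c)
        c = c * 2 * (2 * n + 1) // (n + 2)   # C(n+1) from C(n)
        n += 1
    return out

def greedy_decompose(n, base):
    """Greedily write n as a sum of distinct members of 'base'."""
    parts = []
    for b in sorted(set(base), reverse=True):
        if b <= n:
            parts.append(b)
            n -= b
    assert n == 0, "base does not cover this value"
    return parts

cf = fibonacci(1000) + catalan(1000)
print(greedy_decompose(100, cf))   # [89, 8, 3]
```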
Data stored in databases keeps growing as a result of business requirements for more information. A big portion of the cost of keeping large amounts of data lies in the cost of disk systems and the resources utilized in managing that data. This paper introduces various compression techniques for data stored in row-oriented as well as column-oriented databases. Keeping ...
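A toy example of why layout matters for compression: run-length encoding collapses the long runs that a column store naturally produces, while the same attribute's values are interleaved with others in a row store.

```python
from itertools import groupby

rows = [("US", 2021), ("US", 2021), ("US", 2022), ("DE", 2022)]

# Column-oriented: one contiguous array per attribute -> long runs.
country_col = [r[0] for r in rows]
rle = [(v, len(list(g))) for v, g in groupby(country_col)]
print(rle)   # [('US', 3), ('DE', 1)] -- four values stored as two pairs

# Row-oriented: values of different attributes interleave on disk, so runs
# of any single attribute are broken up and RLE gains little.
```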
A multidisciplinary approach is being used to identify and validate virulence factors and determinants of bacteria of the Burkholderia cepacia complex (Bcc). Bcc is a group of problematic opportunistic pathogenic bacteria, particularly among cystic fibrosis patients, ...
Nonlinear color models such as Hue-Saturation-Value/Brightness/Luminance/Intensity (HSV/HSB/HSL/HSI) have distinct characteristics in each channel. In this paper we therefore propose a new hybrid compression system that treats each channel with a compression technique suited to it, to obtain encoded images of smaller size and higher decoding quality than traditional encoding methods.
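A minimal sketch of the per-channel idea (the paper's actual codec choices may differ): convert to HSV and quantize each channel at a bit depth suited to its perceptual role. matplotlib is used only for the color-space conversion; the bit allocations are assumptions.

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb

rgb = np.random.rand(64, 64, 3)          # stand-in image, values in [0, 1]
hsv = rgb_to_hsv(rgb)

def quantize(channel, bits):
    """Uniform scalar quantization to 2**bits levels."""
    levels = 2 ** bits - 1
    return np.round(channel * levels) / levels

hsv[..., 0] = quantize(hsv[..., 0], 4)   # hue: coarse, 4 bits
hsv[..., 1] = quantize(hsv[..., 1], 4)   # saturation: coarse, 4 bits
hsv[..., 2] = quantize(hsv[..., 2], 6)   # value: finer, 6 bits (luminance)

decoded = hsv_to_rgb(hsv)
print("mean abs error:", np.abs(decoded - rgb).mean())
```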
A coding theory approach to error control in redundant residue number systems (RRNSs) is presented. The concepts of Hamming weight, minimum distance, weight distribution, and error detection and correction capabilities in redundant residue number systems are introduced. The necessary and sufficient conditions for the desired error control capability are derived from the minimum distance point of view. Closed-form expressions are ...
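To make the detection mechanism concrete, here is a toy RRNS sketch with information moduli {3, 5} and one redundant modulus 7 (values chosen purely for illustration): a single corrupted residue pushes the CRT reconstruction outside the legitimate range, which flags the error.

```python
from math import prod

def crt(residues, moduli):
    """Chinese Remainder Theorem reconstruction."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)      # pow(., -1, m): modular inverse
    return x % M

moduli = [3, 5, 7]                        # last modulus is redundant
legit_range = 3 * 5                       # valid values are 0..14

x = 11
residues = [x % m for m in moduli]        # encode: [2, 1, 4]

residues[1] = (residues[1] + 2) % 5       # inject a single residue error

decoded = crt(residues, moduli)           # 53 -- outside [0, 15)
print(decoded, "-> error detected" if decoded >= legit_range else "-> ok")
```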
The design and implementation of lossless audio signal processing using finite field transforms is discussed. Finite field signal processing techniques are described. The effects of filter length and coefficient accuracy are also discussed. Finite field transform algorithms that would be suitable for lossless signal processing are presented.
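A minimal sketch of the kind of finite field tool involved: a naive number-theoretic transform (NTT) that computes an exact circular convolution with no floating-point rounding at any step. The prime, root, and length are toy values, not the paper's parameters.

```python
p = 17                       # prime with 4 | p - 1
w = pow(3, (p - 1) // 4, p)  # primitive 4th root of unity mod p (3 generates)

def ntt(a, root):
    """Naive O(n^2) number-theoretic transform mod p."""
    n = len(a)
    return [sum(a[j] * pow(root, i * j, p) for j in range(n)) % p
            for i in range(n)]

def intt(A):
    """Inverse NTT: forward transform with the inverse root, scaled by 1/n."""
    n = len(A)
    inv_n = pow(n, -1, p)
    return [(x * inv_n) % p for x in ntt(A, pow(w, -1, p))]

a = [1, 2, 0, 0]             # two short "signals" (zero-padded)
b = [3, 1, 0, 0]
A, B = ntt(a, w), ntt(b, w)
C = [(x * y) % p for x, y in zip(A, B)]   # pointwise product in the NTT domain
print(intt(C))               # circular convolution: [3, 7, 2, 0] -- exact
```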
MP3 is a standard used for encoding and decoding audio data. The standard can lower the audio bit rate significantly with little perceptible loss of quality. For this reason, it is important to understand how it does so and whether this is feasible in practice. Raw audio signals carry large quantities of data and are suitable neither for transmission nor for storage [2]. It is therefore necessary to compress audio while maintaining its quality as required by the International Organization for Standardization (ISO). MP3 was developed by the Moving Picture Experts Group (MPEG) for audio and video compression. The audio standard comprises three layers, the third of which is referred to as Layer III. It is this layer that lowers raw audio bit rates from, e.g., 1.4 Mbit/s to just 128 kbit/s while still reconstructing the signal to a level comparable to the original [3]. The objective of this paper is to review and provide a simple idea of what Layer III does in relation to audio compression and decompression.
- by Ijarcsee Journal
- Audio, Decoding, Mp3, Encoding
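A quick check of the bit-rate figures quoted in the abstract above, assuming standard CD parameters (44.1 kHz sampling, 16-bit samples, two channels):

```python
# Raw bit rate of CD-quality stereo audio vs. a 128 kbit/s MP3 stream.
raw_bps = 44_100 * 16 * 2                    # sample rate * bit depth * channels
print(f"raw: {raw_bps:,} bit/s")             # 1,411,200 bit/s, i.e. ~1.4 Mbit/s
print(f"ratio: {raw_bps / 128_000:.1f}:1")   # ~11:1 compression
```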
One of the main issues in machine learning is the large number of input variables that arise when working from raw data. Too many inputs make models hard to train and can often lead to overfitting. Autoencoders were originally proposed as a method for reducing dimensionality and extracting higher-level features that retain most, if not all, of the information. An autoencoder can be thought of as a way of storing information in a compressed, more efficient, and more meaningful form.
- by Amir Ali and +1
- Neural Networks, Decoding, Encoding, Auto-Encoder
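To make the bottleneck idea concrete, here is a minimal undercomplete autoencoder sketch in PyTorch; the 784-to-32 dimensions, layer sizes, and training loop are illustrative assumptions rather than anything from the text above.

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, n_in=784, n_code=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, 128), nn.ReLU(),
                                     nn.Linear(128, n_code))
        self.decoder = nn.Sequential(nn.Linear(n_code, 128), nn.ReLU(),
                                     nn.Linear(128, n_in))

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(64, 784)             # stand-in batch of flattened images

for _ in range(5):                  # tiny demo training loop
    loss = nn.functional.mse_loss(model(x), x)   # reconstruction error
    opt.zero_grad()
    loss.backward()
    opt.step()

codes = model.encoder(x)            # 64 x 32 compressed representation
print(codes.shape)
```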
‘Avestan’ is the name of the ritual language of Zoroastrianism, which was the state religion of the Iranian empire in Achaemenid, Arsacid and Sasanid times, covering a time span of more than 1200 years. It is named after the ‘Avesta’, i.e., the collection of holy scriptures that form the basis of the religion which was allegedly founded by Zarathushtra, also known as Zoroaster, by about the beginning of the first millennium B.C. Together with Vedic Sanskrit, Avestan represents one of the most archaic witnesses of the Indo-Iranian branch of the Indo-European languages, which makes it especially interesting for historical-comparative linguistics. This is why the texts of the Avesta were among the first objects of electronic corpus building undertaken in the framework of Indo-European studies, leading to the establishment of the TITUS database (‘Thesaurus indogermanischer Text- und Sprachmaterialien’). Today, the complete Avestan corpus is available, together with elaborate search functions and an extended version of the subcorpus of the so-called ‘Yasna’, which covers a great deal of the attestation of variant readings.
Right from the beginning of their computational work concerning the Avesta, the compilers had to cope with the fact that the texts contained in it have been transmitted in a special script written from right to left, which was also used for printing them in the scholarly editions used until today. It goes without saying that there was no way in the middle of the 1980s to encode the Avestan scriptures exactly as they are found in the manuscripts. Instead, we had to rely upon transcriptional devices that were dictated by the restrictions of character encoding as provided by the computer systems used. As the problems we had to face in this respect and the solutions we could apply are typical of the development of computational work on ancient languages, it seems worthwhile to sketch them out here.
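The situation described above has since been resolved at the standards level: Unicode 5.2 (2009) added an Avestan block starting at U+10B00, so transliterated text can now be mapped to native right-to-left codepoints instead of ASCII workarounds. A tiny sketch, assuming a Python build with a current Unicode database:

```python
import unicodedata

letter = chr(0x10B00)                       # first codepoint of the block
print(f"U+{ord(letter):04X}", unicodedata.name(letter))  # AVESTAN LETTER A
print("bidi class:", unicodedata.bidirectional(letter))  # 'R' = right-to-left
```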
This paper discusses the concept of appropriation, currently very popular in (especially German-language) social-scientific media and reception research, which goes back in particular to work in cultural studies accentuating the activity, creativity, and self-will of recipients. Cultural studies thereby did pioneering work for qualitative and reconstructive reception research. However, there have been tendencies to romanticize and idealize the activity of the viewer, which invite an inflationary use of the appropriation concept. In contrast to such (primarily post-structuralist and interactionist) positions, and following approaches within cultural studies that conceive of appropriation as a specific practice of reception, the paper proposes a refinement of the appropriation concept informed by the sociology of knowledge and the author's own studies. This leads to a differentiation between productive and reproductive appropriation, which extends and refines Stuart Hall's typology of readings (dominant, negotiated, oppositional).
- by Kathleen Hourihan and +1
- Psychology, Cognitive Science, Semantics, Production
In this paper, an extensive review of video encoding and decoding schemes is presented. Further, a novel architecture is laid out as a direction for future work. The literature reviewed spans from 1960 to date, showcasing the benchmark methods proposed by eminent researchers in the domain of video compression. The timeline of the review is split into three categories: classical methods, conventional heuristic methods, and modern deep learning techniques utilized for video compression. For each category, the milestone contributions are discussed. The approaches are summarized in tables along with their advantages and disadvantages, and specific remarks on individual approaches are also provided. The limitations of existing works are described in depth so that prospective researchers have a path for future work. Finally, concluding remarks along with prospective directions are presented.
The inspiration for this research was the natural bias in university paper checking. When a paper is checked, it is either checked by a professor who teaches the subject or by someone who has no knowledge of the subject. When checked by the latter, answers cannot be appropriately marked unless they are obviously highlighted. This paper aims to check long answers without human intervention using artificial intelligence and regular expressions. It checks a student's or examinee's digitally written answer by comparing it to an answer key provided by the exam host. The proposed methodology combines two techniques to obtain a faster and more accurate system for checking long answers. Long answers are evaluated by breaking them into their simplest sentences and then encoding them into high-density vectors using a Deep Averaging Network (DAN) to analyze the semantic similarity between the examinee's answer and the provided answer key. This system does not look only for keywords in the answer but considers each sentence as a whole and whether it evaluates similarly to the content of the answer key. This research relies on the availability of an answer key and does not check the relevance of content written by the examinee: as long as the examinee writes the points mentioned in the answer key, he or she will be marked correct. This system of evaluation does not deduct marks for wrong points (i.e., there is no negative marking).
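A minimal sketch of the core similarity step. The choice of the publicly available Universal Sentence Encoder (a DAN-based embedder on TensorFlow Hub) is an assumption standing in for the paper's model, and the two sentences are invented examples:

```python
import numpy as np
import tensorflow_hub as hub

# DAN-based sentence embedder (assumed stand-in for the paper's model).
embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")

key = "Photosynthesis converts light energy into chemical energy."
answer = "Plants turn sunlight into chemical energy through photosynthesis."

k, a = embed([key, answer]).numpy()         # two 512-dimensional vectors
similarity = np.dot(k, a) / (np.linalg.norm(k) * np.linalg.norm(a))
print(f"semantic similarity: {similarity:.2f}")  # near 1.0 = likely correct
```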
This paper proposes a multilevel non-return-to-zero (NRZ) coding technique for the transmission of digital signals. The multilevel technique presented here helps in removing certain problems associated with bipolar and Manchester coding techniques. This multilevel technique utilizes different DC levels for representing a ‘0’ and a ‘1’ with an NRZ method. The power spectral density (PSD) of the encoded signal is analyzed, and a possible generation method is also shown.
- by Satria Lazuardi
- Encoding
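A small sketch of the line-coding idea and its spectrum, with assumed level values (the paper's exact levels and PSD analysis are not reproduced here):

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(1)
bits = rng.integers(0, 2, 10_000)
levels = {0: -1.0, 1: 3.0}          # two distinct DC levels (assumed values)
sps = 8                             # samples per symbol

# NRZ: hold each bit's level for the whole symbol, never returning to zero.
signal = np.repeat([levels[b] for b in bits], sps)

f, psd = welch(signal, fs=sps, nperseg=1024)   # PSD of the line signal
print(f"PSD at DC: {psd[0]:.1f}  (nonzero mean level -> strong DC content)")
```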
- by Makii Muthalib and +1
- Bioinformatics, Life Sciences, Biomedical Research, fNIRS
An introduction with applications to mobile communications and digital audio modulation.