What is Shannon information?
Related papers
viXra, 2017
Information Theory, as developed by Claude Shannon in 1948, was about the communication of messages as electronic signals via a transmission channel. Only the physical properties of the signal and the channel were taken into account, while the meaning of the message was ignored entirely. Such an approach to information met the requirements of a data communication channel very well, but recent advances in almost all sciences place an urgent demand for the inclusion of meaningful information in the body of a communicated message. To meet this demand, I have proposed a new definition of information. In this definition, information is seen as a complex notion composed of two inseparable parts: physical information and semantic information. The classical information measures, such as Shannon's, Fisher's, and Rényi's, Kolmogorov's complexity, and Chaitin's algorithmic information, are all variants of physical information. Semantic information is a new concept, and it deserves to be properly studied, treated, and used.
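For orientation, the measures named above are all defined over probability distributions or program lengths, without reference to meaning; two representative formulas, in standard notation (my addition, not the paper's), are:

```latex
% Shannon entropy of a discrete random variable X with pmf p(x)
H(X) = -\sum_{x} p(x) \log_2 p(x)

% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1);
% H_\alpha(X) \to H(X) as \alpha \to 1
H_\alpha(X) = \frac{1}{1-\alpha} \log_2 \left( \sum_{x} p(x)^\alpha \right)
```

Kolmogorov's complexity and Chaitin's algorithmic information replace probabilities with the length of the shortest program that produces the message, but they are equally silent about semantics, which is the point of the physical/semantic distinction drawn above.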
Information theory after Shannon
Communications of The ACM, 2011
Researchers have established a Science of Information Center with the goal of moving beyond Shannon. They aim to develop principles that encompass such concepts as structure, time, space, and semantics. These principles might help design better mobile networks, lead to new insights in biology and neuroscience, drive research in quantum computing, and even aid our understanding of social networks and economic behavior.
From Philosophy to Theory of Information
2011
This is an attempt to develop a systematic formal theory of information based on philosophical foundations adequate for the broad context of the pre-systematic concept of information. The existing formalisms, in particular the one commonly called information theory, consider only some aspects of information, such as its measure. In spite of the spectacular successes of Shannon's entropy and its generalizations, the quantitative description has not helped in the development of a formal description of the concept of information itself. In this paper, a brief review of the contexts in which the term information is used is followed by a similarly brief presentation of philosophical foundations incorporating such aspects of information as its selective and structural manifestations, information integration, and the semantics of information, presented in more extensive form in other publications of the author. Finally, based on these foundations, a mathematical formalism is proposed with an expl...
On Definitions of Information in Physics
Foundations of Science, 2011
During the refereeing procedure of Anthropomorphic Quantum Darwinism by Thomas Durt, it became apparent in the dialogue between him and me that the definition of information in physics is something on which not all authors agree. This text aims at describing the concepts associated with information that are accepted as standard in the worldwide physics community.
What is Information? A Multidimensional Concern
tripleC-Cognition, Communication, Co-operation, 2010
Looking for an answer to the posed question, we first go through a brief historical enquiry aiming at exploring the development of the uses given to the Latin word “information”, from its Greek roots until its scientific formalisation at the hands of the Mathematical Theory of Communication. Secondly, starting from the conceptual limitations of Shannon’s theory, we put forward the most important theoretical demands claimed by many scientific and technical fields directly concerned with the usage of information concepts. Such claims eventually entail an open critique of Shannon’s definition, with different degrees of radicality, proposing a change of perspective in which the different uses and disciplinary interests might be better represented. In order to foster an interdisciplinary approach aiming at gathering together the competing views of information and at bridging their theoretical and practical interests, a sketched glossary of concepts concerning information is proposed as an interdisciplinary tool. Keywords: Information concepts, Information theories, Interdisciplinarity, Mathematical Theory of Communication…
Shannon's information is not entropy
Physics Letters A, 1991
Contrary to widespread belief, Shannon's information and entropy are not in general equivalent. The purpose of this Letter is to discuss their conceptual difference and to pinpoint the mathematical reason for it. This fact is further illustrated through a toy model consisting of a harmonic oscillator in a coherent state, showing explicitly the dependence of Shannon's information on the class of quantum states the system is in. The interrelation between Shannon's information and entropy is re-established after proving a theorem that identifies the class of states that maximizes the (missing) information. We conclude that entropy is the maximum amount of missing information.
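The coherent-state example can be made numerically concrete. A coherent state has Poissonian photon-number statistics, while, at fixed mean photon number, the geometric (thermal) distribution maximizes the Shannon entropy of the number distribution, consistent with the claim that entropy is the maximum missing information. A minimal sketch (my illustration, not the Letter's calculation):

```python
import math

def shannon_entropy(probs):
    """Shannon information H = -sum p log2 p of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def poisson_pmf(mean, n_max):
    """Photon-number distribution of a coherent state with mean photon number `mean`."""
    probs = []
    p = math.exp(-mean)            # P(n = 0)
    for n in range(n_max + 1):
        probs.append(p)
        p *= mean / (n + 1)        # recurrence: P(n+1) = P(n) * mean / (n+1)
    return probs

def thermal_pmf(mean, n_max):
    """Geometric (thermal) distribution with the same mean; maximizes H at fixed mean."""
    q = mean / (1.0 + mean)
    return [(1.0 - q) * q**n for n in range(n_max + 1)]

mean_n = 4.0    # mean photon number |alpha|^2 (arbitrary choice)
cutoff = 100    # truncation of the infinite sum; ample for convergence here

print(f"H(coherent, Poisson)  = {shannon_entropy(poisson_pmf(mean_n, cutoff)):.4f} bits")
print(f"H(thermal, geometric) = {shannon_entropy(thermal_pmf(mean_n, cutoff)):.4f} bits")
```

At mean photon number 4 this gives roughly 3.05 bits for the coherent state against roughly 3.61 bits for the thermal distribution: the same mean energy, but strictly more missing information in the entropy-maximizing class of states.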
Information Theory, Relative Entropy and Statistics
Lecture Notes in Computer Science, 2009
It is commonly assumed that computers process information. But what is information? In a technical, important, but nevertheless rather narrow sense, Shannon's information theory gives a first answer to this question. This theory focuses on measuring the information content of a message. Essentially, this measure is the reduction of uncertainty obtained by receiving a message; the uncertainty of a situation of ignorance is in turn measured by entropy. This theory has had an immense impact on the technology of information storage, data compression, and information transmission and coding, and it is still a very active domain of research.
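To make the measure concrete: entropy quantifies the uncertainty before a message arrives, conditional entropy what remains afterwards, and their difference the information conveyed; relative entropy, named in the title, compares two candidate distributions. A minimal sketch in Python, with made-up channel probabilities (all values are illustrative assumptions, not from the lecture notes):

```python
import math

def entropy(p):
    """H(p) = -sum p log2 p: uncertainty of a discrete distribution, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """D(p || q) = sum p log2(p/q): how badly q models data actually drawn from p."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Joint distribution of (X, Y): a binary source X sent through a noisy channel
# with output Y (hypothetical numbers chosen for illustration).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

p_x = [joint[0, 0] + joint[0, 1], joint[1, 0] + joint[1, 1]]  # marginal of X
h_x = entropy(p_x)                                            # uncertainty before receiving Y

# Conditional entropy H(X|Y): uncertainty remaining once the message Y has arrived.
h_x_given_y = 0.0
for y in (0, 1):
    p_y = joint[0, y] + joint[1, y]
    h_x_given_y += p_y * entropy([joint[0, y] / p_y, joint[1, y] / p_y])

print(f"H(X)   = {h_x:.4f} bits")                      # 1.0000: one fair bit of ignorance
print(f"H(X|Y) = {h_x_given_y:.4f} bits")              # what the channel noise leaves behind
print(f"I(X;Y) = {h_x - h_x_given_y:.4f} bits")        # the reduction of uncertainty
print(f"D([0.8,0.2] || [0.5,0.5]) = {relative_entropy([0.8, 0.2], [0.5, 0.5]):.4f} bits")
```

The mutual information I(X;Y) = H(X) - H(X|Y) is exactly the "reduction of uncertainty obtained by receiving a message" described above, which is why it, rather than raw entropy, governs channel capacity.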