Modelling informational entropy
Related papers
One sense of 'information': A quick tutorial to Information-Theoretic Logic
tripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society, 2009
One of the multiple meanings of the word ‘information’ is given implicitly in the postulates and conditions of information-theoretic logic (I-T-L). The tradition of looking at logical phenomena from an informational stance goes back as far as the nineteenth century: logicians such as Boole, De Morgan, Jevons, and Venn already suggested that deducing is a kind of unpacking of the information already contained in given premises. In the twentieth century this tradition was recovered by Carnap and Bar-Hillel, Cohen and Nagel, and more recently by Corcoran. John Corcoran has articulated a specific information-theoretic viewpoint of logic with its own particular characteristics. I intend to explain the basic ideas of I-T-L by motivating their philosophical underpinnings. One desideratum is to complement, and to shed light on, some of the philosophical shortcomings of the now-paradigmatic model-theoretic concept of logical consequence. Another is to provide a brief sample of questions to be newly address...
Information and Meaning (Entropy 2003)
Entropy, 2003
We propose here to clarify some of the relations between information and meaning by showing how meaningful information can be generated by a system subjected to a constraint. We build up definitions and properties for meaningful information, a meaning generator system, and the domain of efficiency of a meaning (to cover cases of meaningful information transmission). Basic notions of information processing are used.
Logical Entropy of Information Sources
Entropy
In this paper, we present the concepts of the logical entropy of order m, logical mutual information, and the logical entropy of information sources. We find upper and lower bounds for the logical entropy of a random variable by using convex functions. We show that the logical entropy of the joint distribution of X1 and X2 is always less than the sum of the logical entropies of the variables X1 and X2. We define the logical Shannon entropy and the logical metric permutation entropy for an information system and examine the properties of these kinds of entropy. Finally, we examine the amount of logical metric entropy and logical permutation entropy for maps.
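The subadditivity claim in this abstract — that the logical entropy of a joint distribution never exceeds the sum of the marginal logical entropies — can be checked numerically. The sketch below is illustrative only (the distribution and the formula h(p) = 1 − Σ pᵢ² are assumptions drawn from the standard definition of logical entropy, not from this paper's own apparatus):

```python
# Numeric sketch (not from the paper): checks the subadditivity property
# stated in the abstract. Logical entropy of a distribution p is taken to
# be h(p) = 1 - sum(p_i^2), the standard definition.

def logical_entropy(probs):
    """h(p) = 1 - sum p_i^2: the probability that two independent
    draws from p land in different outcomes."""
    return 1.0 - sum(p * p for p in probs)

# A small example joint distribution over X1 x X2 (rows: X1, cols: X2).
joint = [
    [0.20, 0.10],
    [0.05, 0.30],
    [0.15, 0.20],
]

flat = [p for row in joint for p in row]
marginal_x1 = [sum(row) for row in joint]
marginal_x2 = [sum(row[j] for row in joint) for j in range(2)]

h_joint = logical_entropy(flat)
h_x1 = logical_entropy(marginal_x1)
h_x2 = logical_entropy(marginal_x2)

print(h_joint, h_x1 + h_x2)
assert h_joint <= h_x1 + h_x2  # subadditivity, as the abstract states
```

Any non-degenerate joint distribution gives the same qualitative outcome; here h_joint = 0.795 against a marginal sum of 1.185.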
AN INTRODUCTION TO LOGICAL ENTROPY AND ITS RELATION TO SHANNON ENTROPY
The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition: an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set, just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g. via the inclusion-exclusion principle), just like the corresponding notions of probability. The usual Shannon entropy of a partition is obtained by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
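The dits-versus-bits contrast in this abstract is easy to make concrete. The sketch below (the six-element set and the particular partition are illustrative assumptions) counts distinctions directly, checks that the normalized count agrees with the block-size formula 1 − Σ (|B|/n)², and computes the Shannon entropy of the same partition for comparison:

```python
# Minimal sketch, assuming a uniform counting measure on a finite set U.
# Logical entropy of a partition = normalized count of its "distinctions"
# (dits): ordered pairs of elements lying in distinct blocks.
from itertools import product
from math import log2

U = range(6)
partition = [{0, 1, 2}, {3, 4}, {5}]  # an example partition of U

def block_of(x):
    return next(i for i, B in enumerate(partition) if x in B)

# Count ordered pairs (x, y) lying in distinct blocks.
dits = sum(1 for x, y in product(U, U) if block_of(x) != block_of(y))
n = len(U)
h_logical = dits / (n * n)            # normalized dit count: 22/36

# Same value via the block-size formula 1 - sum (|B|/n)^2.
h_formula = 1 - sum((len(B) / n) ** 2 for B in partition)

# Shannon entropy of the same partition replaces dit counts by the
# average number of binary partitions (bits) needed to make all distinctions.
h_shannon = -sum((len(B) / n) * log2(len(B) / n) for B in partition)

print(h_logical, h_formula, h_shannon)
```

For this partition the two computations of logical entropy agree (22/36 ≈ 0.611), while the Shannon entropy of the same partition is larger (≈ 1.459 bits), illustrating that the two quantities measure distinction-making differently.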
In memory of Alfred Tarski 1901-1983 and Alonzo Church 1903-1995, on the 40th anniversary of their classic works Logic, Semantics, Metamathematics and Introduction to Mathematical Logic. Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of the information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically, and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and a false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate the current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience of applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective, or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
Natural Science, 2014
In this paper, a form of information entropy is presented axiomatically in both crisp and fuzzy settings. Information entropy is the unavailability of information about a crisp or fuzzy event. It uses a measure of information defined without any probability or fuzzy measure; for this reason it is called general information.
From Philosophy to Theory of Information
2011
This is an attempt to develop a systematic formal theory of information based on philosophical foundations adequate for the broad context of the pre-systematic concept of information. The existing formalisms, in particular the one commonly called information theory, consider only some aspects of information, such as its measure. In spite of the spectacular successes of Shannon's entropy and its generalizations, the quantitative description has not helped in the development of a formal description of the concept of information itself. In this paper, a brief review of the contexts in which the term information is used is followed by a similarly brief presentation of philosophical foundations incorporating such aspects of information as its selective and structural manifestations, information integration, and the semantics of information, presented in more extensive form in other publications of the author. Finally, based on these foundations, a mathematical formalism is proposed with an expl...
The combination of logic and information is popular as well as controversial. It is, in fact, not even clear what their juxtaposition, for instance in the title of this chapter, should mean, and indeed different authors have given different interpretations to what a (or the) logic of information might be. Throughout this chapter, I will embrace the plurality of ways in which logic and information can be related and try to individuate a number of fruitful lines of research. In doing so, I want to explain why we should care about the combination, where the controversy comes from, and how certain common themes emerge in different settings.
A quantitative-informational approach to logical consequence
DOI: 10.1007/978-3-319-15368-1_3. http://link.springer.com/chapter/10.1007/978-3-319-15368-1_3, 2015
In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon’s quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of information for the formulae of these languages and introduce the concept of informational logical consequence, identifying some important results, among them: certain arguments that have traditionally been considered valid, such as modus ponens, are not valid from the informational perspective; the logic underlying informational logical consequence is not classical, and is at least paraconsistent sensu lato; informational logical consequence is not a Tarskian logical consequence.
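The abstract's starting point, Shannon's quantitative notion of information defined through a probability value and a logarithm, can be illustrated outside the paper's own formal apparatus. The sketch below assumes a uniform distribution over truth-value assignments (an assumption for illustration, not necessarily the probabilistic semantics the paper constructs) and computes inf(φ) = −log₂ p(φ) for a few propositional formulas:

```python
# Illustrative sketch only: assigns each formula the Shannon-style quantity
# inf(phi) = -log2 p(phi), where p(phi) is the fraction of truth-value
# assignments satisfying phi (uniform probabilistic semantics -- an
# assumption here, not the paper's own construction).
from itertools import product
from math import log2

def prob(formula, variables):
    """p(formula) under the uniform distribution on valuations."""
    sat = sum(1 for vals in product([False, True], repeat=len(variables))
              if formula(dict(zip(variables, vals))))
    return sat / 2 ** len(variables)

def info(formula, variables):
    """inf(phi) = -log2 p(phi): rarer truth means more information."""
    return -log2(prob(formula, variables))

vs = ["A", "B"]
a = lambda v: v["A"]
a_and_b = lambda v: v["A"] and v["B"]
a_implies_b = lambda v: (not v["A"]) or v["B"]

print(info(a, vs))            # p = 1/2 -> 1 bit
print(info(a_and_b, vs))      # p = 1/4 -> 2 bits
print(info(a_implies_b, vs))  # p = 3/4 -> about 0.415 bits
```

Note that less probable formulas carry more information; how such quantities are aggregated over a premise-set, and why modus ponens then fails informationally, is the substance of the paper itself and is not reproduced here.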