On Defining Partition Entropy by Inequalities

An axiomatization of partition entropy

IEEE Transactions on Information Theory, 2002

The aim of this paper is to present an axiomatization of a generalization of Shannon's entropy starting from partitions of finite sets. The proposed axiomatization yields as special cases the Havrda-Charvát entropy, and thus provides axiomatizations for the Shannon entropy, the Gini index, and for other types of entropy used in classification and data mining.
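For reference, a common normalization of the Havrda-Charvát entropy of order β (an assumed form; the paper's axiomatization may use a different scaling) is:

```latex
\mathcal{H}_{\beta}(p_1,\dots,p_n) \;=\; \frac{1}{2^{1-\beta}-1}\left(\sum_{i=1}^{n} p_i^{\beta} - 1\right), \qquad \beta > 0,\ \beta \neq 1 .
```

Under this normalization, the limit β → 1 recovers the Shannon entropy −∑ᵢ pᵢ log₂ pᵢ, while β = 2 gives 2(1 − ∑ᵢ pᵢ²), i.e. twice the Gini index, which is how both appear as special cases.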

A Generalization of Conditional Entropy

We introduce an extension of the notion of Shannon conditional entropy to a more general form of conditional entropy that captures both the conditional Shannon entropy and a similar notion related to the Gini index. The proposed family of conditional entropies generates a collection of metrics over the set of partitions of finite sets, which can be used to construct decision trees. Experimental results suggest that by varying the parameter that defines the entropy it is possible to obtain smaller decision trees for certain databases without sacrificing accuracy.

An axiomatization of generalized entropy of partitions

Proceedings 31st IEEE International Symposium on Multiple-Valued Logic, 2001

The aim of this paper is to present an axiomatization of a generalization of Shannon's entropy. The newly proposed axiomatization yields as special cases the Havrda-Charvát entropy, and thus provides axiomatizations for the Shannon entropy, the Gini index, and for other types of entropy used in classification and data mining.

Generalized Entropy and Decision Trees

We introduce an extension of the notion of Shannon conditional entropy to a more general form of conditional entropy that captures both the conditional Shannon entropy and a similar notion related to the Gini index. The proposed family of conditional entropies generates a collection of metrics over the set of partitions of finite sets, which can be used to construct decision trees. Experimental results suggest that by varying the parameter that defines the entropy it is possible to obtain smaller decision trees for certain databases without sacrificing accuracy.

Generalized Conditional Entropy and a Metric Splitting Criterion for Decision Trees

Lecture Notes in Computer Science, 2006

We introduce an extension of the notion of Shannon conditional entropy to a more general form of conditional entropy that captures both the conditional Shannon entropy and a similar notion related to the Gini index. The proposed family of conditional entropies generates a collection of metrics over the set of partitions of finite sets, which can be used to construct decision trees. Experimental results suggest that by varying the parameter that defines the entropy it is possible to obtain smaller decision trees for certain databases without sacrificing accuracy.
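The splitting criterion described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the Havrda-Charvát entropy with the usual 1/(2^(1−β) − 1) normalization and a (|B|/n)^β block weighting for the conditional form; the paper's exact definitions may differ.

```python
from collections import Counter


def h_beta(counts, beta):
    """Havrda-Charvat entropy of a partition, given its block sizes.

    beta = 2 yields twice the Gini index; beta -> 1 approaches
    the Shannon entropy (this sketch requires beta != 1).
    """
    n = sum(counts)
    if n == 0 or beta == 1:
        raise ValueError("need a nonempty partition and beta != 1")
    s = sum((c / n) ** beta for c in counts)
    return (s - 1) / (2 ** (1 - beta) - 1)


def cond_h_beta(labels, attr, beta):
    """Generalized conditional entropy of the class partition given the
    partition induced by an attribute, using (|B|/n)**beta block weights
    (an assumed weighting for illustration)."""
    n = len(labels)
    # Group class labels by attribute value: each group is a block
    # of the partition induced by the attribute.
    blocks = {}
    for y, a in zip(labels, attr):
        blocks.setdefault(a, []).append(y)
    total = 0.0
    for block in blocks.values():
        weight = (len(block) / n) ** beta
        total += weight * h_beta(list(Counter(block).values()), beta)
    return total
```

As a splitting criterion, one would choose the attribute minimizing `cond_h_beta` at each node; a pure split (each block has one class) scores 0, and varying `beta` interpolates between Gini-like and Shannon-like behavior.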

Conditional Entropy for the Union of Fuzzy and Crisp Partitions

In this paper we introduce the conditional entropy without a fuzzy measure for the union of fuzzy partitions. We recall its properties and we solve the system of functional equations which derives from these conditions, taking into account the locality principle and the independence axiom.

Entropy: An Inequality

Tokyo Journal of Mathematics, 1988

improved on the assumption that the terms are not too often of comparable size. As an application, we derive a general, optimal bound for the entropy of a probability distribution.

Functional Entropy and Decision Trees

1998

We introduce a technique to compute several information estimates for Boolean and multivalued functions. Special features of these estimates for completely and incompletely specified logic functions, including symmetric logic functions, are investigated. Finally, we give an algorithm for determining various information measures for logical functions based on decision trees.
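One basic information estimate of the kind mentioned above is the entropy of a Boolean function's output distribution over all input assignments. The following is a small sketch under that assumption, not the paper's algorithm:

```python
from collections import Counter
from itertools import product
from math import log2


def function_entropy(f, n_vars):
    """Shannon entropy (in bits) of a Boolean function's output
    distribution over all 2**n_vars input assignments."""
    outputs = [f(*bits) for bits in product((0, 1), repeat=n_vars)]
    n = len(outputs)
    return -sum((c / n) * log2(c / n) for c in Counter(outputs).values())
```

For example, XOR on two variables is balanced (half the inputs map to 1), so its output entropy is 1 bit, whereas AND is skewed toward 0 and has lower entropy.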