entropy — SciPy v1.15.3 Manual

scipy.stats.Mixture.entropy

Mixture.entropy(*, method=None)

Differential entropy

In terms of probability density function \(f(x)\) and support \(\chi\), the differential entropy (or simply “entropy”) of a continuous random variable \(X\) is:

\[h(X) = - \int_{\chi} f(x) \log f(x) dx\]
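
For example, for a uniform random variable on \([a, b]\) (the distribution used in the Examples below), the density is the constant \(f(x) = 1/(b - a)\), and the integral reduces to a closed form:

\[h(X) = -\int_a^b \frac{1}{b - a} \log\frac{1}{b - a} \, dx = \log(b - a)\]

With \(a = -1\) and \(b = 1\), this gives \(\log 2 \approx 0.6931\), matching the value computed in the Examples section.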

Parameters:

method : {None, ‘formula’, ‘logexp’, ‘quadrature’}

The strategy used to evaluate the entropy. By default (None), the infrastructure chooses between the available options (‘formula’, ‘logexp’, and ‘quadrature’), listed above in order of precedence.

Not all method options are available for all distributions. If the selected method is not available, a NotImplementedError will be raised.

Returns:

out : array

The entropy of the random variable.

Notes

This function calculates the entropy using the natural logarithm, i.e. the logarithm with base \(e\). Consequently, the value is expressed in (dimensionless) “units” of nats. To convert the entropy to different units (i.e. corresponding to a different base), divide the result by the natural logarithm of the desired base.
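
As a minimal illustration of this conversion (using the Uniform distribution from the Examples below), dividing by \(\log 2\) expresses the entropy in bits rather than nats:

>>> import numpy as np
>>> from scipy import stats
>>> X = stats.Uniform(a=-1., b=1.)
>>> h_bits = X.entropy() / np.log(2)  # convert from nats to bits

Since the entropy of this distribution is \(\log 2\) nats, h_bits is approximately 1 bit.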


Examples

Instantiate a distribution with the desired parameters:

>>> from scipy import stats
>>> X = stats.Uniform(a=-1., b=1.)

Evaluate the entropy:

>>> X.entropy()
0.6931471805599454
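
The evaluation strategy can also be selected explicitly via the method keyword. A minimal sketch, assuming ‘quadrature’ is implemented for this distribution (if the requested method is not available, NotImplementedError is raised, as noted above):

>>> h = X.entropy(method='quadrature')  # numerically integrate -f(x)*log(f(x)) over the support

The result should agree with the default strategy to within numerical precision.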