Information Theory
The paper discusses foundational concepts in information theory, focusing on probability distributions, random variables, and the relationships between them. It introduces key concepts such as joint distributions, conditional distributions, and mutual information to quantify how knowledge of one random variable reduces uncertainty about another. The concept of entropy is explored through practical examples, including fair and biased coins, highlighting how entropy quantifies the amount of information required to describe a random variable and how it relates to coding theory.
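As a minimal sketch of the quantities described above, the Python snippet below computes the Shannon entropy of fair and biased coins and the mutual information of a small joint distribution. The joint distribution values are illustrative assumptions, not taken from the paper.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p * log2(p), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Entropy of a single coin flip: a fair coin needs a full bit per flip,
# while a biased coin is more predictable and so carries less information.
print(entropy([0.5, 0.5]))  # 1.0 bit
print(entropy([0.9, 0.1]))  # ~0.47 bits

# Mutual information from a hypothetical joint distribution over two binary
# variables X and Y, using I(X;Y) = H(X) + H(Y) - H(X,Y).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
p_x = [sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)]
p_y = [sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)]
mi = entropy(p_x) + entropy(p_y) - entropy(list(joint.values()))
print(mi)  # ~0.28 bits: knowing X reduces uncertainty about Y
```

The coin entropies also illustrate the link to coding theory: an optimal code for a fair coin needs one bit per flip on average, while a sequence of flips from the biased coin can, in principle, be compressed to roughly 0.47 bits per flip.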