- $\mathcal{S}$ is the set of source symbols
- $s = (s_1, \dots, s_k) \in \mathcal{S}^k$ (message)
- $x = (x_1, \dots, x_n) \in \mathcal{X}^n$ (channel input)
- $\mathcal{X}$ is the set of channel input symbols
- $c = (c_1, \dots, c_k)$ is a sequence of codewords
- $y = (y_1, \dots, y_n) \in \mathcal{Y}^n$ (channel output)
- $\mathcal{Y}$ is the set of channel output symbols
- $X$ is a discrete random variable
- $p(x) = \Pr[X = x]$ is the probability of an outcome $x$
- $\mathcal{X}$ is the alphabet of $X$.
- The self-information (or information content) of an outcome $x$ is defined as $I(x) := -\log_2 p(x)$
- (It measures how surprising or informative the specific outcome is. If $p(x)$ is small, then observing $x$ is rare and thus carries more information, so $I(x)$ is large. Conversely, if $p(x)$ is large, then observing $x$ is common and carries less information, so $I(x)$ is small.)
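A quick numerical illustration (a minimal Python sketch, not part of the original notes; the helper name `self_information` is ours): evaluating $I(x)$ for a few probabilities shows how rarer outcomes carry more bits.

```python
import math

def self_information(p: float) -> float:
    """Self-information I(x) = -log2 p(x) of an outcome with probability p, in bits."""
    return -math.log2(p)

# Rarer outcomes carry more information.
for p in (0.5, 0.25, 0.01):
    print(f"p = {p:<4} -> I = {self_information(p):.3f} bits")
# p = 0.5  -> I = 1.000 bits
# p = 0.25 -> I = 2.000 bits
# p = 0.01 -> I = 6.644 bits
```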
- The (Shannon) entropy of $X$ is defined as $H(X) := -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x) = \mathbb{E}[I(X)]$
- $0 \le H(X) \le \log_2 |\mathcal{X}|$, where the upper bound holds with equality if and only if $X$ is uniformly distributed over $\mathcal{X}$
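As a sanity check on the bound, here is a small Python sketch of the entropy formula; the distribution is an arbitrary example we chose, and the printed upper bound $\log_2 |\mathcal{X}|$ is attained only by the uniform distribution.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0 (0 log 0 := 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]   # arbitrary example distribution
print(entropy(probs))               # 1.75
print(math.log2(len(probs)))        # 2.0, the upper bound log2 |X|
```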
- (examples)
- (deterministic variable) Let $p(x_0) = 1$ for some $x_0 \in \mathcal{X}$, then $H(X) = 0$
- (uniform distribution) Let $p(x) = 1/|\mathcal{X}|$ for all $x \in \mathcal{X}$, then $H(X) = \log_2 |\mathcal{X}|$
- (fair die) If $X$ is the outcome of a fair six-sided die, then $H(X) = \log_2 6 \approx 2.585$ bits
- (binary variable) Let $\mathcal{X} = \{0, 1\}$ and $p(1) = p$, then $H(X) = -p \log_2 p - (1 - p) \log_2 (1 - p) =: H_b(p)$ (the binary entropy function)
- (fair coin) If $p = 1/2$, then $H(X) = 1$ bit
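The examples above can be checked numerically; a minimal sketch (the helper `H` just restates the entropy formula):

```python
import math

def H(probs):
    # Entropy in bits (0 log 0 taken as 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(H([1.0]))        # 0.0               (deterministic variable)
print(H([1/6] * 6))    # ~2.585 = log2 6   (fair die)
print(H([0.5, 0.5]))   # 1.0               (fair coin)
p = 0.9
print(H([p, 1 - p]))   # ~0.469 = H_b(0.9) (biased binary variable)
```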
- (Shannon’s source coding theorem) The minimum expected codeword length $L$ of a uniquely decodable binary code for $X$ satisfies $H(X) \le L < H(X) + 1$, so $H(X)$ is the fundamental limit of lossless compression in bits per symbol (see the sketch below).
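One standard way to see the upper bound is via Shannon code lengths $\ell(x) = \lceil -\log_2 p(x) \rceil$: these satisfy the Kraft inequality, so a prefix code with these lengths exists, and its expected length lands in $[H(X), H(X) + 1)$. A sketch with an arbitrary example distribution:

```python
import math

# Shannon code lengths: l(x) = ceil(-log2 p(x)) for each symbol.
probs = [0.4, 0.3, 0.2, 0.1]                       # arbitrary example distribution

H = -sum(p * math.log2(p) for p in probs)          # entropy
lengths = [math.ceil(-math.log2(p)) for p in probs]
L = sum(p * l for p, l in zip(probs, lengths))     # expected codeword length

assert sum(2 ** -l for l in lengths) <= 1          # Kraft inequality: a prefix code exists
print(f"H = {H:.3f} bits, L = {L:.3f} bits")       # H = 1.846 bits, L = 2.400 bits
```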
- $H_{\max} := \log_2 |\mathcal{X}|$ (maximum possible entropy)
- $R := H_{\max} - H(X)$ (absolute redundancy)
- $r := R / H_{\max} = 1 - H(X) / H_{\max}$ ((relative) redundancy)
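Putting the redundancy definitions together for a biased binary source (an illustrative sketch; the distribution is arbitrary):

```python
import math

probs = [0.9, 0.1]                          # biased binary source
H = -sum(p * math.log2(p) for p in probs)
H_max = math.log2(len(probs))               # maximum possible entropy log2 |X|

R = H_max - H                               # absolute redundancy
r = R / H_max                               # relative redundancy
print(f"H = {H:.3f}, H_max = {H_max:.1f}, R = {R:.3f}, r = {r:.1%}")
# H = 0.469, H_max = 1.0, R = 0.531, r = 53.1%
```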