Information Theory

Discipline: Economics

The mathematical study of information, its storage by codes, and its transmission through channels of limited capacity.

A fundamental idea is the entropy of a set of events,

$$H = -\sum_{k=1}^{n} p_k \log p_k,$$

where $p_k$ is the probability of the $k$-th event.
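As a minimal illustration (not part of the original entry), the entropy of a probability distribution can be computed directly from this formula; the function name and the choice of base-2 logarithm (entropy measured in bits) are assumptions made for the example:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p_k * log2(p_k)), in bits.

    Terms with p_k = 0 are skipped, since p*log(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one bit of uncertainty; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.469
```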

Information theory was introduced by CLAUDE ELWOOD SHANNON in 1948, with entropy serving as a measure of the uncertainty of the set of events.

Channels are modelled as mechanisms that take a letter of an input alphabet and emit letters of an output alphabet with various probabilities, depending on how prone the channel is to error or noise.
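As a sketch of this view (an illustration, not drawn from the original entry), a noisy channel can be described by the conditional probabilities of each output letter given each input letter. The binary symmetric channel below, with an assumed error rate of 0.1, flips each transmitted bit with that probability:

```python
import random

def bsc(bit, error_rate=0.1):
    """Binary symmetric channel: flip the input bit with probability error_rate."""
    return bit ^ (random.random() < error_rate)

# Transmit a message and observe the (possibly corrupted) output.
message = [1, 0, 1, 1, 0]
received = [bsc(b) for b in message]
print(received)
```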

SHANNON was then able to measure the effectiveness of a channel, which he called its capacity, using entropy and ergodic theory.
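For the binary symmetric channel sketched above, the capacity has a well-known closed form, $C = 1 - H(p)$, where $H$ is the binary entropy of the crossover probability $p$. The worked example below (reusing the assumed error rate of 0.1) illustrates this special case; it is not Shannon's general derivation:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), the entropy of a biased coin."""
    if p in (0, 1):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(error_rate):
    """Capacity of the binary symmetric channel, in bits per use: C = 1 - H(p)."""
    return 1 - binary_entropy(error_rate)

print(bsc_capacity(0.1))  # ~0.531 bits per channel use
```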
