"Communicating over a noisy channel" and "The Noisy-Channel Coding Theorem"
Two chapters from David MacKay's book (2001)
Jonathan Ledlie (jonathan@eecs)
October 30, 2001

Binary Symmetric Channel

In a BSC, the channel flips each bit (independently) with probability p. If p = 1/2, the output is independent of the input and the message is indecipherable; for p > 1/2, the receiver can simply invert every received bit, making the channel equivalent to one with flip probability 1 - p. The channel is memoryless because flipping one bit has no effect on whether another is flipped. Unlike a BEC, a BSC never outputs a third, unknown state, where presumably the voltage is neither high enough nor low enough to claim the bit as a one or a zero.

The capacity of the BSC is

    C = 1 + p log2 p + (1 - p) log2 (1 - p)

where p = probability of a bit flip.

Binary Erasure Channel

A BEC has two inputs and three outputs. When 0 is transmitted, the receiver can receive either 0 or ?; the latter output means that the receiver could not demodulate the symbol. Similarly, when 1 is transmitted, the receiver can receive either 1 or ?. Since each channel output Y depends only on its corresponding input X, the channel is said to be memoryless.

The capacity of the BEC is

    C = 1 - p

where p = probability that a transmitted bit (whether 0 or 1) is erased, i.e. received as ?.

Capacity of a Channel

The capacity of a channel is the limiting rate at which it can carry information, expressed in bits per channel use (or bits per second, for a fixed symbol rate). If a source with a given entropy feeds information to a channel with a given capacity, and if the source entropy is less than the channel capacity, a code exists for which the frequency of errors may be reduced as low as desired. If the channel capacity is less than the source entropy, no such code exists.
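The two capacity formulas can be checked numerically. The sketch below is my own (the function names are not from MacKay's text); it just evaluates C for each channel:

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with flip probability p:
    C = 1 + p log2 p + (1-p) log2 (1-p), in bits per channel use."""
    if p == 0 or p == 1:
        return 1.0  # output determines input exactly (invert bits if p == 1)
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

def bec_capacity(p):
    """Capacity of a binary erasure channel with erasure probability p."""
    return 1 - p

print(bsc_capacity(0.1))   # ~0.531 bits per use
print(bsc_capacity(0.5))   # 0.0: the output carries no information
print(bec_capacity(0.1))   # 0.9 bits per use
```

Note that for the same p, the BEC has higher capacity than the BSC (0.9 vs. ~0.531 at p = 0.1): an erasure tells the receiver *which* bit is uncertain, whereas a flip does not.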
Shannon's Theorem

It states that for any coded message Mc, the uncoded original message M, and probability q of any single bit being flipped in passing through the channel, the following inequality holds:

    M/Mc <= f(q) = 1 - (q log2(1/q) + (1-q) log2(1/(1-q)))

That is, the rate M/Mc can be no greater than the capacity 1 - H2(q) of the channel. In other words, as long as the rate stays below this bound, by letting the coded message Mc become arbitrarily long, the probability of decoding error can be made arbitrarily small: as small as we need it to be (but never zero).
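A small numerical illustration of the rate/reliability trade-off (my own, not from the chapters): a length-n repetition code over a BSC with flip probability q, decoded by majority vote, has rate 1/n, and its error probability falls as n grows. It only reaches vanishing error at vanishing rate, though; the theorem's remarkable claim is that better codes achieve vanishing error at any fixed rate below 1 - H2(q).

```python
from math import comb

def repetition_error(q, n):
    """Probability that majority-vote decoding of an n-bit repetition
    code fails over a BSC with flip probability q (n odd): the chance
    that more than half of the n copies get flipped."""
    return sum(comb(n, k) * q**k * (1 - q)**(n - k)
               for k in range((n + 1) // 2, n + 1))

# Rate falls as 1/n while the error probability shrinks:
for n in (1, 3, 5, 9):
    print(f"n={n}: rate={1/n:.3f}, P(error)={repetition_error(0.1, n):.5f}")
```

At q = 0.1 this gives roughly P(error) = 0.1 at n = 1, 0.028 at n = 3, and under 0.001 by n = 9, but the rate has dropped to 1/9.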