Fundamentals of Error-Correcting Codes, W. Cary Huffman and Vera Pless
Cambridge University Press | English | ISBN 0511077793 / 0521782805 | 662 pages | DJVU | 5.23 MB | 2003
In 1948 Claude Shannon published a landmark paper, "A mathematical theory of communication," that marked the beginning of both information theory and coding theory. Given a communication channel that may corrupt the information sent over it, Shannon identified a number called the capacity of the channel and proved that arbitrarily reliable communication is possible at any rate below that capacity. For example, when images of planets are transmitted from deep space, it is impractical to retransmit them; if portions of the data describing the images are altered by noise during transmission, the data may prove useless. Shannon's results guarantee that the data can be encoded before transmission so that the corrupted data received can still be decoded to any specified degree of accuracy. Other communication channels include magnetic storage devices, compact discs, and electronic communication devices such as cellular telephones.
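The idea of trading rate for reliability can be illustrated with a toy example that is not taken from the book: a rate-1/3 repetition code over a binary symmetric channel. Each bit is transmitted three times and the decoder takes a majority vote, so any single flipped copy of a bit is corrected. The function names and the flip probability below are illustrative choices, not the book's notation.

```python
import random

def encode(bits):
    """Repeat every message bit three times (rate-1/3 repetition code)."""
    return [b for b in bits for _ in range(3)]

def noisy_channel(codeword, flip_prob=0.05):
    """Binary symmetric channel: flip each bit independently with probability flip_prob."""
    return [b ^ 1 if random.random() < flip_prob else b for b in codeword]

def decode(received):
    """Majority vote over each block of three received bits."""
    blocks = [received[i:i + 3] for i in range(0, len(received), 3)]
    return [1 if sum(block) >= 2 else 0 for block in blocks]

if __name__ == "__main__":
    message = [random.randint(0, 1) for _ in range(1000)]
    received = noisy_channel(encode(message))
    decoded = decode(received)
    errors = sum(m != d for m, d in zip(message, decoded))
    print(f"bit errors after decoding: {errors} out of {len(message)}")
```

With a 5% flip probability, a block is decoded incorrectly only when two or three of its copies are flipped, so the residual error rate drops well below the raw channel error rate, at the cost of tripling the transmission length. The codes studied in the book achieve far better trade-offs than this naive scheme.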