The Science of Information: From Language to Black Holes
2015
Abstract
This book, a course guidebook for a series of lectures on the science of information, argues that information is a fundamental concept for understanding both the natural and the man-made world. Information is ubiquitous, appearing in language, communication systems, computation, biology, and even the fundamental laws of physics. The book identifies several distinct properties of information. First, information can be transformed from one physical form to another – from sound to electrical signals, to magnetic fields, and back again. Second, information can be measured in bits. Third, information can be lost in a noisy channel, but such errors can be corrected with error-correcting codes. Fourth, information can be compressed by encoding common messages with shorter codewords. Fifth, erasing information has a thermodynamic cost: it requires the production of waste heat. Finally, algorithmic information theory, which is based on computation rather than communication, measures the information in an output by the length of the shortest computer program that can produce it. – AI-generated abstract
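The claim that information can be measured in bits is usually made precise via Shannon entropy. As a minimal illustrative sketch (not taken from the book; the helper name `entropy_bits` is hypothetical), the entropy of a message's symbol distribution gives the average number of bits per symbol, and a low-entropy source is exactly what shorter codewords can compress:

```python
from collections import Counter
from math import log2

def entropy_bits(message: str) -> float:
    """Shannon entropy of the message's symbol distribution, in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A fair coin flip carries exactly 1 bit per symbol.
print(entropy_bits("HTHTHTHT"))  # 1.0
# A heavily biased source carries less than 1 bit per symbol,
# which is what makes compression with short codewords possible.
print(entropy_bits("AAAAAAAB"))
```

A perfectly predictable message (e.g. `"AAAA"`) has entropy 0: it conveys no information at all.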