Information theory


Information theory is the scientific study of the quantification, storage, and communication of information, drawing on probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering.

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a die (with six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy. Important sub-fields of information theory include source coding, algorithmic complexity theory, algorithmic information theory and information-theoretic security.
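
As a concrete illustration of the coin/die comparison above, the following short Python sketch (added here for illustration and not part of the original text; the helper name entropy_bits is ours) computes the entropy of each distribution in bits:

    # Minimal sketch: Shannon entropy H = -sum(p * log2 p), in bits.
    # The helper name entropy_bits is illustrative, not from the text.
    import math

    def entropy_bits(probabilities):
        # Zero-probability outcomes contribute nothing and are skipped.
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    fair_coin = [1/2, 1/2]          # two equally likely outcomes
    fair_die = [1/6] * 6            # six equally likely outcomes

    print(entropy_bits(fair_coin))  # 1.0 bit
    print(entropy_bits(fair_die))   # log2(6), about 2.585 bits

The die, having more equally likely outcomes, has the higher entropy, matching the comparison in the paragraph above.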

Applications of fundamental topics of information theory include source coding/data compression (e.g. for ZIP files), and channel coding/error detection and correction (e.g. for DSL). Its impact has been crucial to the success of the Voyager missions to deep space, the invention of the compact disc, the feasibility of mobile phones and the development of the Internet. The theory has also found applications in other areas, including statistical inference, cryptography, neurobiology, perception, linguistics, the evolution and function of molecular codes (bioinformatics), thermal physics, molecular dynamics, quantum computing, black holes, information retrieval, intelligence gathering, plagiarism detection, pattern recognition, anomaly detection and even art creation.

Historical background


The landmark event establishing the discipline of information theory and bringing it to immediate worldwide attention was the publication of Claude E. Shannon's classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in July and October 1948.

Prior to this paper, limited information-theoretic ideas had been developed at Bell Labs, all implicitly assuming events of equal probability. Harry Nyquist's 1924 paper, Certain Factors Affecting Telegraph Speed, gave the relation W = K log m (recalling Boltzmann's constant), where W is the speed of transmission of intelligence, m is the number of different voltage levels to choose from at each time step, and K is a constant. Ralph Hartley's 1928 paper, Transmission of Information, uses the word information as a measurable quantity, reflecting the receiver's ability to distinguish one sequence of symbols from any other, thus quantifying information as H = log S^n = n log S, where S was the number of possible symbols, and n the number of symbols in a transmission. The unit of information was therefore the decimal digit, which has since sometimes been called the hartley in his honor as a unit or scale or measure of information. Alan Turing in 1940 used similar ideas as part of the statistical analysis of the breaking of the German second world war Enigma ciphers.
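
To make Hartley's measure concrete, the short Python sketch below (our illustration, not from the historical papers; the helper name hartley_information is ours) evaluates H = n log10 S in decimal digits (hartleys):

    # Illustrative sketch of Hartley's measure H = log10(S**n) = n * log10(S),
    # expressed in decimal digits (hartleys). The function name is ours.
    import math

    def hartley_information(num_symbols, sequence_length):
        # Information of a sequence of equally likely symbols, in hartleys.
        return sequence_length * math.log10(num_symbols)

    print(hartley_information(26, 10))  # 10 symbols from a 26-letter alphabet: about 14.15 hartleys
    print(hartley_information(10, 1))   # a single decimal digit carries exactly 1 hartley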

Much of the mathematics behind information theory with events of different probabilities was developed for the field of thermodynamics by Ludwig Boltzmann and J. Willard Gibbs. Connections between information-theoretic entropy and thermodynamic entropy, including the important contributions by Rolf Landauer in the 1960s, are explored in Entropy in thermodynamics and information theory.

In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion:

"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."

With it came the ideas of