Information Theory

Information theory is a branch of applied mathematics, engineering, and computer science concerned with the quantification of information. It was developed by Claude E. Shannon to find fundamental limits on signal-processing operations such as compressing data and on reliably storing and communicating data. Information is modelled as a random sequence; no model of meaning is associated with this concept of information, which contrasts with the semantic concept of information.

In Shannon's theory, the key measure of information is entropy, usually expressed as the average number of bits needed to store or communicate one symbol in a message. Entropy quantifies the uncertainty involved in predicting the value of a random variable. For example, specifying the outcome of a fair coin flip (two equally likely outcomes) provides less information (lower entropy) than specifying the outcome of a roll of a fair six-sided die (six equally likely outcomes).
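As a concrete illustration (a minimal sketch, not part of the original text), the entropy of a discrete distribution is H(X) = −Σ p(x) log₂ p(x); evaluating it confirms that the fair coin carries exactly 1 bit per outcome and the fair die log₂ 6 ≈ 2.585 bits. The helper name shannon_entropy is ours.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Fair coin: two equally likely outcomes -> exactly 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Fair six-sided die: six equally likely outcomes -> log2(6) ≈ 2.585 bits.
print(shannon_entropy([1/6] * 6))    # 2.584962500721156
```

With six equally likely outcomes instead of two, more bits are needed on average to identify the result, which is exactly what the higher entropy value reflects.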

Since its inception, the theory's applications have broadened to many areas, including statistical inference, natural language processing, cryptography, neurophysiology, the evolution and function of molecular codes, model selection in ecology, thermal physics, quantum computing, plagiarism detection, gambling schemes, and other forms of data analysis.

In the semantic conception, the General Definition of Information (GDI) is:

σ is an instance of information, understood as semantic content, if and only if:

(GDI.1) σ consists of one or more data (what information is made of)

(GDI.2) the data in σ are well-formed (correct syntax)

(GDI.3) the well-formed data in σ are meaningful (with reference to a system that has a recognised model of meaning)

Here, a datum is a generally accepted fact regarding some difference, distinction, or lack of uniformity within some recognised context. This is known as the diaphoric definition of data.
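Purely as a schematic reading of the definition above (the function and predicate names here are hypothetical illustrations, not prescribed by the GDI), the three conditions compose as a conjunction over a candidate σ:

```python
def is_semantic_information(data, is_well_formed, is_meaningful):
    """Hypothetical check of the three GDI conditions for a candidate sigma.

    data           -- the collection of data that sigma consists of (GDI.1)
    is_well_formed -- predicate: do the data obey the chosen syntax? (GDI.2)
    is_meaningful  -- predicate: are the well-formed data meaningful with
                      respect to a recognised model of meaning? (GDI.3)
    """
    return (
        len(data) >= 1                 # GDI.1: one or more data
        and is_well_formed(data)       # GDI.2: well-formed (correct syntax)
        and is_meaningful(data)        # GDI.3: meaningful in the chosen model
    )

# Example with trivial placeholder predicates (assumptions for illustration only).
print(is_semantic_information(
    data=["It is raining"],
    is_well_formed=lambda d: all(isinstance(x, str) for x in d),
    is_meaningful=lambda d: True,  # assume an interpreter with a model of meaning
))  # True
```

The point of the sketch is only that the GDI is a conjunction: remove any one of the three conditions and σ no longer counts as semantic information.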