
Physical information

Physical information refers generally to the information that is contained in a physical system. But first, what is information?
For our purposes in this article, information itself may be loosely defined as "that which can distinguish one thing from another." The information embodied by a thing can thus be said to be the identity of the particular thing itself, that is, all of its properties, all that makes it distinct from other (real or potential) things. It is a complete description of the thing, but in a sense that is divorced from any particular language. We might even consider the sum total of the information in a thing to be the ideal essence of the thing itself, i.e., its form in the sense of Plato's eidos (The Forms).

When discussing information, we should take care to distinguish between the following specific usages that are related to the word:

  - A pattern of information is the particular form or arrangement that some information takes, considered apart from any particular embodiment of it (the sense invoked below in connection with Plato's eidos).
  - A nugget of information is a particular concrete instance of information, such as the information contained in a given physical system (the sense used below when we speak of a system's state).
  - An amount of information is a quantification of how large a given instance or pattern of information is (the sense developed below under "Quantifying Classical Physical Information").

The above usages are clearly all conceptually distinct from each other. However, many people insist on overloading the word "information" (by itself) to denote (or connote) several of these concepts simultaneously. Since this may lead to confusion, we recommend instead using more specific phrases (such as those listed above) whenever the intended meaning is not made clear by the context.

Etymology

According to the Oxford English Dictionary (http://dictionary.oed.com/), the earliest historical meaning of the word information in English was the act of informing, or giving form or shape to the mind, as in education, instruction, or training. A quote from 1387: "Five books come down from heaven for information of mankind." It was also used for an item of training, e.g., a particular instruction: "Melibee had heard the great skills and reasons of Dame Prudence, and her wise informations and techniques." (1386)

The English word was apparently derived by adding the common "noun of action" ending "-ation" (descended through French from Latin "-tio") to the earlier verb to inform, in the sense of to give form to the mind, to discipline, instruct, or teach: "Men so wise should go and inform their kings." (1330) Inform itself comes (via French) from the Latin verb informare, to give form to, to form an idea of. Latin also already contained the word informatio, meaning concept or idea, but the extent to which this may have influenced the development of the word information in English is unclear.

Finally, the ancient Greek word for form was eidos, and this word was famously used in a technical philosophical sense by Plato (and later Aristotle) to denote the ideal identity or essence of something, similar to what we mean when we refer to the pattern of information that is instantiated in the nugget of information that composes a given object.

Classical vs. Quantum Information

The nugget of information that is contained in a physical system is generally considered to specify that system's "true" state. (In many practical situations, a system's true state may be largely unknown, but a realist would insist that a physical system always has, in principle, a true state of some sort, whether classical or quantum.)

When discussing the information that is contained in physical systems according to modern quantum physics, we must distinguish between classical information and quantum information. Quantum information specifies the complete quantum state vector (or equivalently, wavefunction) of a system, whereas classical information, roughly speaking, only picks out a definite (pure) quantum state if we are already given a prespecified set of distinguishable (orthogonal) quantum states to choose from; such a set forms a basis for the vector space of all possible pure quantum states (see pure state). Quantum information could thus be expressed by providing (1) a choice of basis such that the actual quantum state is equal to one of the basis vectors, together with (2) the classical information specifying which of these basis vectors is the actual one. (However, the quantum information by itself does not include a specification of the basis; indeed, an uncountable number of different bases will include any given state vector.)
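
As a concrete, purely illustrative sketch of this basis-relative distinction, the following Python snippet computes measurement-outcome probabilities for the single-qubit pure state |+> in two different orthogonal bases. The names plus, z_basis, x_basis, and outcome_probs are our own, not a standard API:

    import numpy as np

    # Illustrative sketch: the classical information extractable from a qubit
    # depends on the choice of measurement basis.
    plus = np.array([1.0, 1.0]) / np.sqrt(2)                 # the pure state |+>

    z_basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # |0>, |1>
    x_basis = [np.array([1.0, 1.0]) / np.sqrt(2),
               np.array([1.0, -1.0]) / np.sqrt(2)]           # |+>, |->

    def outcome_probs(state, basis):
        # Born rule: the probability of each outcome is |<b|state>|^2.
        return [abs(np.vdot(b, state)) ** 2 for b in basis]

    print(outcome_probs(plus, z_basis))   # [0.5, 0.5]: no definite classical value
    print(outcome_probs(plus, x_basis))   # [1.0, 0.0]: definite, in the right basis

Measured in a basis that contains it, the state yields one definite classical outcome; measured in another basis, no classical value can be extracted with certainty.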

Note that the amount of classical information in a quantum system gives the maximum amount of information that can actually be measured and extracted from that quantum system for use by external classical (decoherent) systems, since only basis states are operationally distinguishable from each other. The impossibility of differentiating between non-orthogonal states is a fundamental principle of quantum mechanics, equivalent to Heisenberg's uncertainty principle. Because of its more general utility, the remainder of this article will deal primarily with classical information, although quantum information theory also has some potential applications (quantum computing, quantum cryptography, quantum teleportation) that are currently being actively explored by both theorists and experimentalists [1].

Quantifying Classical Physical Information

An amount of (classical) physical information may be quantified, as in information theory, as follows [2]. For a system S, defined abstractly in such a way that it has N distinguishable states (orthogonal quantum states) that are consistent with its description, the amount of information I(S) contained in the system's state can be said to be log(N). The logarithm is selected for this definition since it has the advantage that this measure of information content is additive when concatenating independent, unrelated subsystems; e.g., if subsystem A has N distinguishable states (I(A)=log(N) information content) and an independent subsystem B has M distinguishable states (I(B)=log(M) information content), then the concatenated system has NM distinguishable states and an information content I(AB) = log(NM) = log(N) + log(M) = I(A) + I(B). We expect information to be additive from our everyday associations with the meaning of the word, e.g., that two pages of a book can contain twice as much information as one page.
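
As a minimal numerical check of this additivity property (an illustration with invented values, not taken from reference [2]), using base-2 logarithms:

    import math

    N, M = 4, 8                  # distinguishable states of subsystems A and B
    I_A = math.log2(N)           # 2.0 bits
    I_B = math.log2(M)           # 3.0 bits
    I_AB = math.log2(N * M)      # 5.0 bits for the concatenated system

    assert math.isclose(I_AB, I_A + I_B)   # log(NM) = log(N) + log(M)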

The base of the logarithm used in this definition is arbitrary, since it affects the result by only a multiplicative constant, which determines the unit of information that is implied. If the log is taken base 2, the unit of information is the binary digit or bit (so named by John Tukey); if we use a natural logarithm instead, we might call the resulting unit the "nat." In magnitude, a nat equals Boltzmann's constant k (or, per mole, the ideal gas constant R), although these particular quantities are usually reserved for measuring physical information that happens to be entropy, and are expressed in physical units such as joules per kelvin, or kilocalories per mole per kelvin.
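
The following sketch shows the unit conversions implied by the choice of base; the variable names are our own, and k_B is the SI (CODATA) value of Boltzmann's constant:

    import math

    # One nat corresponds to log2(e) ≈ 1.4427 bits, since ln(x) = log2(x) * ln(2).
    bits_per_nat = 1 / math.log(2)
    print(bits_per_nat)                    # ≈ 1.4427

    # Expressing one bit of information as thermodynamic entropy (in J/K):
    k_B = 1.380649e-23                     # Boltzmann's constant, J/K
    entropy_per_bit = k_B * math.log(2)    # ≈ 9.57e-24 J/K
    print(entropy_per_bit)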

Physical Information and Entropy

An easy way to understand the underlying unity between physical (thermodynamic) entropy and information-theoretic entropy is as follows: entropy is simply that portion of the (classical) physical information contained in a system of interest (whether the entire physical system, or just a subsystem delineated by a set of possible messages) whose identity (as opposed to amount) is unknown, from the point of view of a particular knower. This informal characterization corresponds both to von Neumann's formal definition of the entropy of a mixed quantum state (which is just a statistical mixture of pure states; see Quantum statistical mechanics#Von Neumann entropy) and to Claude Shannon's definition of the entropy of a probability distribution over classical signal states or messages (see information entropy) [2].

Incidentally, the credit for Shannon's entropy formula (though not its use in an information-theory context) really belongs to Boltzmann, who derived it much earlier for use in his H-theorem of statistical mechanics [6]. (Shannon himself references Boltzmann in his monograph [7].)
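
To make the correspondence concrete, here is a brief sketch in Python (the function names are our own); the von Neumann entropy of a mixed state reduces to the Shannon entropy of its eigenvalue distribution:

    import numpy as np

    def shannon_entropy(p):
        # H(p) = -sum_i p_i log2 p_i, in bits; terms with p_i = 0 contribute 0.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def von_neumann_entropy(rho):
        # S(rho) = -Tr(rho log2 rho), computed from rho's eigenvalues.
        return shannon_entropy(np.linalg.eigvalsh(rho))

    print(shannon_entropy([0.5, 0.5]))         # 1.0 bit: state fully unknown
    print(shannon_entropy([1.0]))              # 0.0 bits: state fully known
    print(von_neumann_entropy(np.eye(2) / 2))  # 1.0 bit: maximally mixed qubit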

Furthermore, even when the state of a system is known, we can say that the information in the system is still effectively entropy if that information is effectively incompressible, that is, if there are no known or feasibly determinable correlations or redundancies between different pieces of information within the system. Note that this definition of entropy can even be viewed as equivalent to the previous one (unknown information) if we take a meta-perspective, and say that for observer A to "know" the state of system B means simply that there is a definite correlation between the state of observer A and the state of system B; this correlation could thus be used by a meta-observer (that is, whoever is discussing the overall situation regarding A's state of knowledge about B) to compress his own description of the joint system AB [3].
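
A small numerical sketch of this meta-perspective, using invented joint distributions: when observer A's state is perfectly correlated with system B's, the joint system AB carries fewer bits of entropy than its two state spaces could hold, and that redundancy is exactly what a meta-observer could exploit to compress a description of AB:

    import numpy as np

    def joint_entropy(p_joint):
        # Shannon entropy (in bits) of a joint distribution over pairs (a, b).
        p = np.asarray(p_joint, dtype=float).ravel()
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    independent = np.full((2, 2), 0.25)    # A and B uniform and unrelated
    correlated = np.array([[0.5, 0.0],
                           [0.0, 0.5]])    # A's state always matches B's

    print(joint_entropy(independent))   # 2.0 bits: no redundancy to exploit
    print(joint_entropy(correlated))    # 1.0 bit: correlation halves the description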

Due to this connection with algorithmic information theory, entropy can be said to be that portion of a system's information capacity which is "used up," that is, unavailable for storing new information (even if the existing information content were to be compressed). The rest of a system's information capacity (aside from its entropy) might be called extropy, and it represents the part of the system's information capacity which is potentially still available for storing newly derived information. The fact that physical entropy is basically "used-up storage capacity" is a direct concern in the engineering of computing systems; e.g., a computer must first remove the entropy from a given physical subsystem (eventually expelling it to the environment, and emitting heat) in order for that subsystem to be used to store some newly computed information.
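
Under the article's informal definitions, this accounting might be sketched as follows (extropy is not a standard physical quantity, and the register size and entropy figure here are invented for illustration):

    import math

    num_states = 256                    # a hypothetical 8-bit register
    capacity = math.log2(num_states)    # total information capacity: 8.0 bits

    entropy = 3.0                       # suppose 3 bits' worth are incompressible
    extropy = capacity - entropy        # 5.0 bits still free for new information
    print(extropy)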


References

  1. Michael A. Nielsen and Isaac L. Chuang, Quantum Computation and Quantum Information, Cambridge University Press, 2000.
  2. Michael P. Frank, "Physical Limits of Computing", Computing in Science and Engineering, 4(3):16-25, May/June 2002. http://www.cise.ufl.edu/research/revcomp/physlim/plpaper.html.
  3. W. H. Zurek, "Algorithmic randomness, physical entropy, measurements, and the demon of choice," in [4], pp. 393-410, and reprinted in [5], pp. 264-281.
  4. Anthony J. G. Hey, ed., Feynman and Computation: Exploring the Limits of Computers, Perseus, 1999.
  5. Harvey S. Leff and Andrew F. Rex, Maxwell's Demon 2: Entropy, Classical and Quantum Information, Computing, Institute of Physics Publishing, 2003.
  6. Carlo Cercignani, Ludwig Boltzmann: The Man Who Trusted Atoms, Oxford University Press, 1998.
  7. Claude E. Shannon and Warren Weaver, The Mathematical Theory of Communication, University of Illinois Press, 1963.