# Entropy

*For other uses of the term **entropy**, see Entropy (disambiguation).*

The **thermodynamic entropy** *S*, often simply called the **entropy** in the context of chemistry and thermodynamics, is a measure of the amount of energy in a physical system that cannot be used to do work. It is also a measure of the disorder present in a system.


## Thermodynamic definition of entropy

The concept of entropy was introduced in 1865 by Rudolf Clausius. He defined the *change in entropy* of a thermodynamic system, during a *reversible process* in which an amount of heat δ*Q* is applied at constant absolute temperature *T*, as

d*S* = δ*Q* / *T*.

He gave *S* the name "entropy", from the Greek word τροπή, "transformation". Note that this equation involves only a change in entropy, so the entropy itself is only defined up to an additive constant. Later, we will discuss an alternative definition which uniquely defines the entropy.
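As a quick numerical illustration of this definition, consider melting ice at its melting point, a process that absorbs heat at constant temperature. The figure used for the heat absorbed below is a standard textbook value for the molar enthalpy of fusion of water, assumed here for illustration; it does not come from this article.

```python
# Entropy change for melting 1 mol of ice at 0 degrees C (reversible, constant T).
# The molar enthalpy of fusion (~6010 J/mol) is a standard textbook value,
# assumed for illustration.
Q = 6010.0   # heat absorbed, J
T = 273.15   # absolute temperature, K

delta_S = Q / T  # dS = dQ/T at constant temperature
print(f"dS = {delta_S:.2f} J/K")  # roughly 22 J/K
```

Because the temperature is constant during the phase change, the integral of δ*Q*/*T* reduces to a simple quotient.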

### Entropy change in heat engines

A thermodynamic transformation is a change in a system's thermodynamic properties, such as its temperature and volume. A transformation is said to be reversible if, at each successive step of the transformation, the system is infinitesimally close to equilibrium; otherwise, the transformation is said to be irreversible. As an example, consider a gas enclosed in a piston chamber, whose volume may be changed by moving the piston. A reversible volume change is one that takes place so slowly that the density of the gas always remains homogeneous. An irreversible volume change is one that takes place so quickly that pressure waves are created within the gas, which is a state of disequilibrium. Reversible processes are sometimes referred to as quasi-static processes.

A heat engine is a thermodynamic system that can undergo a sequence of transformations which ultimately return it to its original state. This sequence is called a *cycle*. During some transformations, the engine may exchange heat with large systems known as heat reservoirs, which have a fixed temperature and can absorb or provide an arbitrary amount of heat. The net result of a cycle is (i) work done by the system (which may be negative, which is the same as positive work done *on* the system), and (ii) heat transferred between the heat reservoirs. By the conservation of energy, the heat lost by the heat reservoirs is exactly equal to the work done by the engine plus the heat gained by the heat reservoirs. (See cyclic process.)

If every transformation in the cycle is reversible, the cycle is reversible. This means that it can be run in reverse, i.e. the heat transfers occur in the opposite direction and the amount of work done switches sign. The simplest reversible cycle is a Carnot cycle, which exchanges heat with two heat reservoirs.

In thermodynamics, absolute temperature is *defined* in the following way. Suppose we have two heat reservoirs. If a Carnot cycle absorbs an amount of heat *Q* from the first reservoir and delivers an amount of heat *Q′* to the second, then the respective temperatures *T* and *T′* are given by

*Q* / *Q′* = *T* / *T′*.

Now suppose a heat engine exchanges heats *Q*_{1}, *Q*_{2}, ..., *Q*_{N} with a sequence of *N* heat reservoirs that have temperatures *T*_{1}, ..., *T*_{N}. We take each *Q*_{j} to be positive if it represents heat received by the system, and negative if it represents heat emitted by the system. We will show that

*Q*_{1}/*T*_{1} + *Q*_{2}/*T*_{2} + ... + *Q*_{N}/*T*_{N} ≤ 0.
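This inequality can be checked numerically. The sketch below uses illustrative values (the temperatures, heats, and the 100 J of extra waste heat are all assumed, not from this article): a reversible Carnot cycle between two reservoirs makes the sum exactly zero, while a less efficient, irreversible engine operating between the same reservoirs makes it negative.

```python
# Sign convention as in the text: Q > 0 means heat received by the engine.
T_hot, T_cold = 500.0, 300.0          # reservoir temperatures, K (illustrative)
Q_hot = 1000.0                        # heat absorbed from the hot reservoir, J

# Reversible (Carnot) cycle: Q_hot / Q_cold = T_hot / T_cold.
Q_cold_rev = -Q_hot * T_cold / T_hot  # heat emitted, hence negative
clausius_rev = Q_hot / T_hot + Q_cold_rev / T_cold

# Irreversible engine: does less work, dumps extra waste heat (100 J assumed).
Q_cold_irrev = Q_cold_rev - 100.0
clausius_irrev = Q_hot / T_hot + Q_cold_irrev / T_cold

print(clausius_rev)    # 0.0: a reversible cycle saturates the bound
print(clausius_irrev)  # negative, consistent with the sum being <= 0
```

The reversible case saturating the bound is exactly the equality for reversible cycles derived below.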

To prove this, we introduce an additional heat reservoir at some arbitrary temperature *T*_{0}, as well as *N* Carnot cycles that have the following property: the *j*-th such cycle operates between the *T*_{0} reservoir and the *T*_{j} reservoir, transferring heat *Q*_{j} to the latter. From the above definition of temperature, this means that the heat extracted from the *T*_{0} reservoir by the *j*-th cycle is

*Q*_{0,j} = *T*_{0} (*Q*_{j}/*T*_{j}).

Now consider one cycle of the heat engine, accompanied by one complete cycle of each of the *N* Carnot cycles. At the end of this process, each of the reservoirs *T*_{1}, ..., *T*_{N} has no net heat loss, since the heat extracted by the heat engine is replaced by one of the Carnot cycles. The net result is (i) an unspecified amount of work done by the heat engine, and (ii) a total amount of heat extracted from the *T*_{0} reservoir, equal to

*Q*_{0} = *T*_{0} (*Q*_{1}/*T*_{1} + *Q*_{2}/*T*_{2} + ... + *Q*_{N}/*T*_{N}).

If this quantity were positive, the combined process would do nothing but extract heat from a single reservoir and convert it entirely into work, which is forbidden by the second law of thermodynamics. Therefore *Q*_{1}/*T*_{1} + ... + *Q*_{N}/*T*_{N} ≤ 0, as claimed.

It is important to note that we have used *T*_{j} to refer to the temperature of each heat reservoir with which the system comes into contact, not the temperature of the system itself. If the cycle is not reversible, then heat always flows from higher temperatures to lower temperatures, so that

*Q*_{j}/*T*_{j} ≤ *Q*_{j}/*T*,

where *T* is the temperature of the system while it is in thermal contact with the heat reservoir.

However, if the cycle is reversible, the system is always infinitesimally close to equilibrium, so its temperature must be equal to that of any reservoir with which it is in contact. In that case, we may replace each *T*_{j} with *T*. Moreover, the reversed cycle is also reversible and simply changes the sign of each *Q*_{j}, so the inequality must hold in both directions, forcing the equality

*Q*_{1}/*T*_{1} + ... + *Q*_{N}/*T*_{N} = 0 (reversible cycles).

In the limiting case of a reversible cycle consisting of a *continuous* sequence of transformations, this becomes

∮ δ*Q*/*T* = 0 (reversible cycles),

where *T* is the temperature of the system at each step of the cycle.

### Entropy as a state function

For a reversible cycle, we have just shown that the integral of δ*Q*/*T* around the cycle vanishes. It follows that the integral of δ*Q*/*T* along any reversible path between two equilibrium states *A* and *B* is independent of the path, so we may define a *state function* *S*, the entropy, by

Δ*S* = *S*(*B*) − *S*(*A*) = ∫ δ*Q*/*T* (along any reversible path from *A* to *B*).

Since the integral is independent of the particular transformation taken, this equation is well-defined.

We now consider irreversible transformations. It is straightforward to show that the entropy change during any transformation between two *equilibrium* states satisfies

Δ*S* ≥ ∫ δ*Q*/*T*,

with equality if and only if the transformation is reversible.

Notice that if δ*Q* = 0, then Δ*S* ≥ 0. The second law of thermodynamics is sometimes stated as this result: *the total entropy of a thermally isolated system can never decrease*.

Suppose a system is thermally isolated but remains in *mechanical* contact with the environment. If it is not in mechanical equilibrium with the environment, it will do work on the environment, or vice versa. For example, consider a gas enclosed in a piston chamber whose walls are perfect thermal insulators. If the pressure of the gas differs from the pressure applied to the piston, it will expand or contract, and work will be done. Our above result indicates that the entropy of the system will increase during this process (it could in principle remain constant, but this is unlikely). Typically, there exists a maximum amount of entropy the system may possess under the circumstances. This entropy corresponds to a state of *stable equilibrium*, since a transformation to any other equilibrium state would cause the entropy to decrease, which is forbidden. Once the system reaches this maximum-entropy state, no more work may be done.

## Statistical definition of entropy: Boltzmann's principle

In 1877, Boltzmann realised that the entropy of a system may be related to the number of possible "microstates" (microscopic states) consistent with its thermodynamic properties. Consider, for example, an ideal gas in a container. A microstate is specified by the positions and momenta of each constituent atom. Consistency requires us to consider only those microstates for which (i) the positions of all the particles are located within the volume of the container, (ii) the kinetic energies of the atoms sum up to the total energy of the gas, and so forth. Boltzmann then postulated that

*S* = *k* ln Ω,

where *k* is known as Boltzmann's constant and Ω is the number of microstates that are consistent with the given macroscopic state. This postulate, which is known as Boltzmann's principle, may be regarded as the foundation of statistical mechanics, which describes thermodynamic systems using the statistical behaviour of their constituents. It relates a microscopic property of the system (Ω) to one of its thermodynamic properties (*S*).

Under Boltzmann's definition, the entropy is clearly a function of state. Furthermore, since *Ω* is a natural number (1, 2, 3, ...), the entropy must be *zero or positive*; this is simply a property of the logarithm, since ln 1 = 0 and ln *Ω* > 0 for *Ω* > 1.

### Entropy as a measure of disorder

We can view *Ω* as a measure of the disorder in a system. This is reasonable because what we think of as "ordered" systems tend to have very few configurational possibilities, and "disordered" systems have very many. Consider, for example, a set of 10 coins, each of which is either heads up or tails up. The most "ordered" macroscopic states are 10 heads or 10 tails; in either case, there is exactly one configuration that can produce the result. In contrast, the most "disordered" state consists of 5 heads and 5 tails, and there are ^{10}C_{5} = 252 ways to produce this result (see combinatorics.)
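The coin example can be checked directly with Boltzmann's formula. The sketch below counts the microstates of each macrostate with a binomial coefficient and applies *S* = *k* ln Ω:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(omega: int) -> float:
    """S = k ln(Omega), Boltzmann's principle."""
    return k_B * math.log(omega)

# Microstates per macrostate for 10 coins (macrostate = number of heads up).
omega_all_heads = math.comb(10, 10)  # 1 configuration: the most "ordered" state
omega_half = math.comb(10, 5)        # configurations with 5 heads and 5 tails

print(omega_half)                          # 252, as in the text
print(boltzmann_entropy(omega_all_heads))  # 0.0: a unique configuration has zero entropy
print(boltzmann_entropy(omega_half) > 0)   # True: "disorder" raises the entropy
```

Note that the all-heads macrostate, with its single configuration, gets exactly zero entropy, which foreshadows the discussion of Nernst's theorem below.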

Under the statistical definition of entropy, the second law of thermodynamics states that the disorder in an isolated system tends to increase. This can be understood using our coin example. Suppose that we start off with 10 heads, and re-flip one coin at random every minute. If we examine the system after a long time has passed, it is *possible* that we will still see 10 heads, or even 10 tails, but that is not very likely; it is far more probable that we will see approximately as many heads as tails.
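This tendency is easy to see in a quick simulation, sketched below with assumed parameters (the seed, step count, and update rule are illustrative choices, not from this article): start with 10 heads and re-flip one randomly chosen coin per step, then tally how much time the system spends in each macrostate.

```python
import random

random.seed(0)                       # fixed seed so the run is repeatable
coins = [1] * 10                     # start in the most "ordered" state: 10 heads
visits = {n: 0 for n in range(11)}   # time spent in each macrostate (number of heads)

for _ in range(100_000):
    coins[random.randrange(10)] = random.randint(0, 1)  # re-flip one coin at random
    visits[sum(coins)] += 1

# The "disordered" macrostate (5 heads) dominates the fully ordered one (10 heads):
print(visits[5] > visits[10])  # True
```

In a long run the fraction of time spent in each macrostate approaches Ω/2^10, so the 5-heads state (252 configurations) is visited hundreds of times more often than the all-heads state (1 configuration).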

Since its discovery, the idea that disorder tends to increase has been the focus of a great deal of thought, some of it confused. A chief point of confusion is the fact that the result Δ*S* ≥ 0 applies only to *isolated* systems; notably, the Earth is not an isolated system because it is constantly receiving energy in the form of sunlight. Nevertheless, it has been pointed out that the universe may be considered an isolated system, so that its total disorder should be constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.

### Counting of microstates

In classical statistical mechanics, the number of microstates is actually infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. Therefore, a method of "classifying" the microstates must be invented if we are to define *Ω*. In the case of the ideal gas, we count two states of an atom as the "same" state if their positions and momenta are within *δx* and *δp* of each other. Since the values of *δx* and *δp* can be chosen quite arbitrarily, the entropy is not uniquely defined; as before, it is still defined only up to an additive constant. This grouping of microstates is called coarse graining, and has its counterpart in the choice of basis states in quantum mechanics.

This ambiguity is partly resolved with quantum mechanics. The quantum state of a system can be expressed as a superposition of *basis states*, which are typically chosen to be eigenstates of the unperturbed Hamiltonian. In quantum statistical mechanics, Ω refers to the number of basis states consistent with the thermodynamic properties. Since the set of basis states is generally countable, we can define Ω.

However, the choice of the set of basis states is still somewhat arbitrary; it is the quantum counterpart of the coarse graining of classical microstates into distinct macrostates.

This leads to Nernst's theorem, sometimes referred to as the third law of thermodynamics, which states that the entropy of a system at zero absolute temperature is a well-defined constant. This is due to the fact that a system at zero temperature exists in its ground state, so that its entropy is determined by the degeneracy of the ground state. Many systems, such as crystal lattices, have a unique ground state, and therefore have zero entropy at absolute zero (since ln(1) = 0).

## Graphing entropy

*Main article*: adiabatic process

The following equation can be used to graph entropy on a *P*-*V* diagram for an ideal gas:

*S* = *n* *C*_{V} ln(*P* *V*^{γ}) + constant,

where γ = *C*_{P}/*C*_{V}. This assumes that *C*_{V} and *C*_{P} are constant, which they really are not, as seen below.
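A short sketch (assuming a monatomic ideal gas with constant heat capacities, per the caveat just stated) shows that this expression is constant along an adiabat *P* *V*^{γ} = constant, which is why adiabats are the curves of constant entropy on a *P*-*V* diagram:

```python
import math

C_V = 3 / 2        # molar heat capacity at constant V, in units of R (monatomic ideal gas, assumed)
C_P = 5 / 2        # C_P = C_V + R for an ideal gas
gamma = C_P / C_V  # 5/3

def entropy(P: float, V: float, n: float = 1.0) -> float:
    """S = n C_V ln(P V**gamma), up to an additive constant (ideal gas)."""
    return n * C_V * math.log(P * V**gamma)

# Two states on the same adiabat: P V**gamma is held fixed.
P1, V1 = 100.0, 1.0
V2 = 2.0
P2 = P1 * (V1 / V2)**gamma  # adiabatic expansion law

print(abs(entropy(P1, V1) - entropy(P2, V2)) < 1e-9)  # True: equal entropy along an adiabat
print(entropy(2 * P1, V1) > entropy(P1, V1))          # True: heating at fixed V raises S
```

The units of *P* and *V* only shift the additive constant, so entropy *differences* computed this way are unambiguous.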

## Measuring entropy

In real experiments, it is quite difficult to measure the entropy of a system. The techniques for doing so are based on the thermodynamic definition of the entropy, and require extremely careful calorimetry.

For simplicity, we will examine a mechanical system whose thermodynamic state may be specified by its volume *V* and pressure *P*. In order to measure the entropy of a specific state, we must first measure the heat capacities at constant volume and at constant pressure (denoted *C*_{V} and *C*_{P} respectively), for a successive set of states intermediate between a reference state and the desired state. The heat capacities are related to the entropy *S* and the temperature *T* by

*C*_{X} = *T* (∂*S*/∂*T*)_{X},

where the *X* subscript refers to either constant volume or constant pressure. This may be integrated numerically to obtain a change in entropy:

Δ*S* = ∫ (*C*_{X}/*T*) d*T*.

We can thus obtain the entropy of any state (*P*, *V*) with respect to a reference state (*P*_{0}, *V*_{0}). The exact formula depends on our choice of intermediate states. For example, if the reference state has the same pressure as the final state,

*S*(*P*, *V*) = *S*(*P*, *V*_{0}) + ∫ (*C*_{P}/*T*) d*T*,

with the integral taken along the constant-pressure path from (*P*, *V*_{0}) to (*P*, *V*).
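The numerical integration can be sketched with a simple trapezoidal rule. In the example below a constant heat capacity is assumed purely for illustration (the value is roughly that of liquid water; real heat capacities vary with temperature, which is the whole point of measuring them state by state), so the result can be compared against the closed form Δ*S* = *C* ln(*T*₂/*T*₁):

```python
import math

def delta_S(temps, heat_capacity):
    """Trapezoidal integration of dS = C(T)/T dT over a list of temperatures."""
    total = 0.0
    for T_a, T_b in zip(temps, temps[1:]):
        f_a = heat_capacity(T_a) / T_a
        f_b = heat_capacity(T_b) / T_b
        total += 0.5 * (f_a + f_b) * (T_b - T_a)
    return total

C_P = 75.3  # J/(mol K), roughly liquid water; assumed constant for illustration
temps = [300.0 + 0.1 * i for i in range(501)]  # 300 K -> 350 K in 0.1 K steps

numerical = delta_S(temps, lambda T: C_P)
exact = C_P * math.log(350.0 / 300.0)  # closed form, valid only for constant C_P

print(abs(numerical - exact) < 1e-3)  # True: the quadrature matches the exact result
```

In a real measurement, `heat_capacity` would interpolate calorimetric data rather than return a constant, and the same quadrature would apply.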

The entropy of the reference state must be determined independently. Ideally, one chooses a reference state at an extremely high temperature, at which the system exists as a gas. The entropy in such a state would be that of a classical ideal gas plus contributions from molecular rotations and vibrations, which may be determined spectroscopically. Choosing a *low* temperature reference state is sometimes problematic since the entropy at low temperatures may behave in unexpected ways. For instance, a calculation of the entropy of ice by the latter method, assuming no entropy at zero temperature, falls short of the value obtained with a high-temperature reference state by 3.41 J/K/mol. This is due to the fact that the molecular crystal lattice of ice exhibits geometrical frustration, and thus possesses a non-vanishing "zero-point" entropy at arbitrarily low temperatures.


## References

- Fermi, E., *Thermodynamics*, Prentice Hall (1937)
- Reif, F., *Fundamentals of Statistical and Thermal Physics*, McGraw-Hill (1965)
