Entropy

This article is about entropy in thermodynamics. For other uses, see Entropy (disambiguation).
Not to be confused with Enthalpy.
For a more accessible and less technical introduction to this topic, see Introduction to entropy.

In statistical mechanics, entropy (usual symbol S) is related to the number Ω of microscopic configurations that a thermodynamic system can have when in a state as specified by some macroscopic variables. Specifically, assuming for simplicity that each of the microscopic configurations is equally probable, the entropy of the system is the natural logarithm of that number of configurations, multiplied by the Boltzmann constant kB. Formally,

S = kB ln Ω (assuming equiprobable states).

This is consistent with 19th-century formulas for entropy in terms of heat and temperature, as discussed below. Boltzmann's constant, and therefore entropy, have dimensions of energy divided by temperature.

For example, gas in a container with known volume, pressure, and energy could have an enormous number of possible configurations of the collection of individual gas molecules. At equilibrium, each instantaneous configuration of the gas may be regarded as random. Entropy may be understood as a measure of disorder within a macroscopic system. The second law of thermodynamics states that an isolated system's entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment's entropy increases by at least that amount. Since entropy is a function of the state of the system, a change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.
In the mid-19th century, the change in entropy (ΔS) of a system undergoing a thermodynamically reversible process was defined by Rudolf Clausius as:

ΔS = Qrev / T

where T is the absolute temperature of the system, dividing an incremental reversible transfer of heat into that system (Q). (If heat is transferred out, the sign would be reversed, giving a decrease in entropy of the system.)

The above definition is sometimes called the macroscopic definition of entropy because it can be used without regard to any microscopic description of the contents of a system. The concept of entropy has been found to be generally useful and has several other formulations. Entropy was discovered when it was noticed to be a quantity that behaves as a function of state, as a consequence of the second law of thermodynamics.

Entropy is an extensive property. It has the dimension of energy divided by temperature, which has a unit of joules per kelvin (J K⁻¹) in the International System of Units (or kg m² s⁻² K⁻¹ in terms of base units). But the entropy of a pure substance is usually given as an intensive property: either entropy per unit mass (SI unit: J K⁻¹ kg⁻¹) or entropy per unit amount of substance (SI unit: J K⁻¹ mol⁻¹).

The absolute entropy (S rather than ΔS) was defined later, using either statistical mechanics or the third law of thermodynamics; an otherwise arbitrary additive constant is fixed such that the entropy of a pure substance at absolute zero is zero. In statistical mechanics this reflects that the ground state of a system is generally non-degenerate and that only one microscopic configuration corresponds to it.

In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final state. It is often said that entropy is an expression of the disorder, or randomness, of a system, or of our lack of information about it. The second law is now often seen as an expression of the fundamental postulate of statistical mechanics through the modern definition of entropy.
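A short numerical sketch in Python may help make the two definitions above concrete. The constants are standard SI values, but the microstate count and heat transfer are assumed, illustrative numbers, not figures from the article:

```python
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

# Statistical definition: S = k_B ln(Omega). Microstate counts of
# macroscopic systems are astronomically large, so one works with
# ln(Omega) directly; the value below is assumed for illustration.
ln_Omega = 1.0e23
S = k_B * ln_Omega
print(f"S = k_B ln(Omega) = {S:.3f} J/K")

# Clausius definition: dS = Q_rev / T for a reversible transfer of
# heat Q_rev at constant absolute temperature T (assumed values).
Q_rev = 100.0   # J of heat absorbed reversibly
T = 298.15      # K
print(f"dS = Q_rev / T = {Q_rev / T:.4f} J/K")
```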
1 History

Main article: History of entropy

The French mathematician Lazare Carnot proposed in his 1803 paper Fundamental Principles of Equilibrium and Movement that in any machine the accelerations and shocks of the moving parts represent losses of moment of activity.
... when, in fact, QH is greater than QC.[9][10] Through the efforts of Clausius and Kelvin, it is now known that the maximum work that a heat engine can produce is the product of the Carnot efficiency and the heat absorbed from the hot reservoir:

Wmax = (1 − TC/TH) QH

To derive the Carnot efficiency, which is 1 − (TC/TH) (a number less than one), Kelvin had to evaluate the ratio of the work output to the heat absorbed during the isothermal expansion with the help of the Carnot–Clapeyron equation, which contained an unknown function known as the Carnot function. The possibility that the Carnot function could be the temperature as measured from a zero temperature was suggested by Joule in a letter to Kelvin. This allowed Kelvin to establish his absolute temperature scale.[11] It is also known that the work produced by the system is the difference between the heat absorbed from the hot reservoir and the heat given up to the cold reservoir:

W = QH − QC

When the second equation is used to express the work as a difference in heats, we get

QH − QC = (1 − TC/TH) QH

... state of the system. Clausius wrote that he intentionally formed the word Entropy as similar as possible to the word Energy, basing the term on the Greek τροπή (tropē), "transformation".[13][note 1]
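A minimal sketch (with assumed reservoir temperatures and heat intake, not values from the article) of the Carnot relations quoted above:

```python
# Carnot efficiency and maximum work for an ideal heat engine.
T_H = 500.0   # hot reservoir temperature, K (assumed)
T_C = 300.0   # cold reservoir temperature, K (assumed)
Q_H = 1000.0  # heat absorbed from the hot reservoir, J (assumed)

eta_carnot = 1.0 - T_C / T_H   # Carnot efficiency, always < 1
W_max = eta_carnot * Q_H       # maximum work obtainable
Q_C = Q_H - W_max              # heat rejected to the cold reservoir

print(f"Carnot efficiency = {eta_carnot:.2f}")  # 0.40
print(f"Maximum work      = {W_max:.0f} J")     # 400 J
print(f"Heat rejected     = {Q_C:.0f} J")       # 600 J
```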
2 Definitions and descriptions

2.1 Function of state

There are many thermodynamic properties that are functions of state. This means that at a particular thermodynamic state (which should not be confused with the microscopic state of a system), these properties have a certain value. Often, if two properties of the system are determined, then the state is determined and the other properties' values can also be determined. For instance, a quantity of gas at a particular temperature and pressure has its state fixed by those values and thus has a specific volume that is determined by those values. As another instance, a system composed of a pure substance of a single phase at a particular uniform temperature and pressure is determined (and is thus a particular state) and is at not only a particular volume but also at a particular entropy.[5] The fact that entropy is a function of state is one reason it is useful. In the Carnot cycle, the working fluid returns to the same state it had at the start of the cycle, hence the line integral of any state function, such as entropy, over this reversible cycle is zero.

In statistical mechanics, the entropy can be written in terms of the density matrix as

S = −kB Tr(ρ log ρ)

where ρ is the density matrix, Tr is trace (linear algebra) and log is the matrix logarithm. This density matrix formulation is not needed in cases of thermal equilibrium so long as the basis states are chosen to be energy eigenstates. For most practical purposes, this can be taken as the fundamental definition of entropy since all other formulas for S can be mathematically derived from it, but not vice versa.

In a thermodynamic system, pressure, density, and temperature tend to become uniform over time because the equilibrium state has higher probability (more possible combinations of microstates) than any other state. As an example, for a glass of ice water in air at room temperature, the difference in temperature between a warm room (the surroundings) and a cold glass of ice and water (the system and not part of the room) begins to equalize ...
[Figure: Temperature–entropy chart for steam, in US units. The chart spans the liquid, saturated, vapor, and supercritical regions, with temperature ticks from 500 to 2200, isobars from .01 psi to 100,000 psi, isochores from .02 ft^3/lbm to 50,000 ft^3/lbm, and steam-quality contours from 0% to 100% in the saturated region.]

Unlike many other functions of state, entropy cannot be directly observed but must be calculated. Entropy can be calculated for a substance as the standard molar entropy from absolute zero (also known as absolute entropy) or as a difference in entropy from some other reference state defined as zero entropy. Entropy has the dimension of energy divided by temperature, with a unit of joules per kelvin (J/K) in the International System of Units. While these are the same units as heat capacity, the two concepts are distinct.[28] Entropy is not a conserved quantity: for example, in an isolated system with non-uniform temperature, heat might irreversibly flow and the temperature become more uniform such that entropy increases. The second law of thermodynamics ...
4 Second law of thermodynamics

Main article: Second law of thermodynamics
See also: Thermodynamic equilibrium and Non-equilibrium thermodynamics

The second law of thermodynamics requires that, in general, the total entropy of any system can't decrease other than by increasing the entropy of some other system. Hence, in a system isolated from its environment, the entropy of that system tends not to decrease. It follows that heat can't flow from a colder body to a hotter body without the application of work (the imposition of order) to the colder body. Secondly, it is impossible for any device operating on a cycle to produce net work from a single temperature reservoir; the production of net work requires flow of heat from a hotter reservoir to a colder reservoir, or a single expanding reservoir undergoing adiabatic cooling, which performs adiabatic work. As a result, there is no possibility of a perpetual motion system. It follows that a reduction in the increase of entropy in a specified process, such as a chemical reaction, means that it is energetically more efficient.

It follows from the second law of thermodynamics that the entropy of a system that is not isolated may decrease. An air conditioner, for example, may cool the air in a room, thus reducing the entropy of the air of that system. The heat expelled from the room (the system), which the air conditioner transports and discharges to the outside air, always makes a bigger contribution to the entropy of the environment than the decrease of the entropy of the air of that system. Thus, the total of entropy of the room plus the entropy of the environment increases, in agreement with the second law of thermodynamics.

In mechanics, the second law in conjunction with the fundamental thermodynamic relation places limits on a system's ability to do useful work.[30] The entropy change of a system at temperature T absorbing an infinitesimal amount of heat δq in a reversible way is given by δq/T. More explicitly, an energy TR ΔS is not available to do useful work, where TR is the temperature of the coldest accessible reservoir or heat sink external to the system. For further discussion, see Exergy.

Statistical mechanics demonstrates that entropy is governed by probability, thus allowing for a decrease in disorder even in an isolated system. Although this is possible, such an event has a small probability of occurring, making it unlikely.[31]

The applicability of the second law of thermodynamics is limited to systems which are near or in an equilibrium state.[32] At the same time, laws governing systems which are far from equilibrium are still debatable. One of the guiding principles for such systems is the maximum entropy production principle.[33][34] It claims that a non-equilibrium system evolves so as to maximize its entropy production.[35][36]

4.1 The fundamental thermodynamic relation

Main article: Fundamental thermodynamic relation

The entropy of a system depends on its internal energy and its external parameters, such as its volume. In the thermodynamic limit, this fact leads to an equation relating the change in the internal energy U to changes in the entropy and the external parameters. This relation is known as the fundamental thermodynamic relation. If external pressure P bears on the volume V as the only external parameter, this relation is:

dU = T dS − P dV

Since both internal energy and entropy are monotonic functions of temperature T, implying that the internal energy is fixed when one specifies the entropy and the volume, this relation is valid even if the change from one state of thermal equilibrium to another with infinitesimally larger entropy and volume happens in a non-quasistatic way (so during this change the system may be very far out of thermal equilibrium and then the entropy, pressure and temperature may not exist).

The fundamental thermodynamic relation implies many thermodynamic identities that are valid in general, independent of the microscopic details of the system. Important examples are the Maxwell relations and the relations between heat capacities.
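As an illustration, the sketch below checks the fundamental relation numerically for one mole of monatomic ideal gas. The ideal-gas expressions used for the cross-check (dU = (3/2)nR dT and dS = (3/2)nR dT/T + nR dV/V) are standard results rather than text from this article, and all state values are assumed:

```python
# Numerical check of dU = T dS - P dV along a small reversible step.
n, R = 1.0, 8.314          # mol, J/(mol K)
T, V = 300.0, 0.025        # K, m^3 (assumed state)
P = n * R * T / V          # ideal-gas pressure at this state

dV = 1e-6                  # small volume change, m^3
dT = 0.01                  # small temperature change, K

# Ideal-gas entropy change for the same small step.
dS = 1.5 * n * R * dT / T + n * R * dV / V

dU_fundamental = T * dS - P * dV   # from the fundamental relation
dU_direct = 1.5 * n * R * dT       # from the ideal-gas energy U(T)
print(dU_fundamental, dU_direct)   # agree to first order in dT, dV
```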
4.2 Entropy in chemical thermodynamics

Thermodynamic entropy is central in chemical thermodynamics, enabling changes to be quantified and the outcome of reactions predicted. The second law of thermodynamics states that entropy in an isolated system (the combination of a subsystem under study and its surroundings) increases during all spontaneous chemical and physical processes. The Clausius equation of δqrev/T = ΔS introduces the measurement of entropy change, ΔS. Entropy change describes the direction and quantifies the magnitude of simple changes such as heat transfer between systems, always from hotter to cooler spontaneously.

The thermodynamic entropy therefore has the dimension of energy divided by temperature, and the unit joule per kelvin (J/K) in the International System of Units (SI).

Thermodynamic entropy is an extensive property, meaning that it scales with the size or extent of a system. In many processes it is useful to specify the entropy as an intensive property independent of the size, as a specific entropy characteristic of the type of system studied. Specific entropy may be expressed relative to a unit of mass, typically the kilogram (unit: J kg⁻¹ K⁻¹). Alternatively, in chemistry, it is also referred to one mole of substance, in which case it is called the molar entropy with a unit of J mol⁻¹ K⁻¹.

Thus, when one mole of substance at about 0 K is warmed by its surroundings to 298 K, the sum of the incremental values of qrev/T constitutes each element's or compound's standard molar entropy, an indicator of the amount of energy stored by a substance at 298 K.[37][38] Entropy change also measures the mixing of substances as a summation of their relative quantities in the final mixture.[39]

Entropy is equally essential in predicting the extent and direction of complex chemical reactions. For such applications, ΔS must be incorporated in an expression that includes both the system and its surroundings, ΔSuniverse = ΔSsurroundings + ΔSsystem. This expression becomes, via some steps, the Gibbs free energy equation for reactants and products in the system: ΔG [the Gibbs free energy change of the system] = ΔH [the enthalpy change] − T ΔS [the entropy change].[37]

4.3 Entropy balance equation for open systems

[Figure: During steady-state continuous operation, an entropy balance applied to an open system accounts for system entropy changes related to heat flow and mass flow across the system boundary. Labels: heat added Q̇, shaft work Ẇshaft performed external to the boundary, enthalpy flows Ḣin and Ḣout, and the (open) system boundary.]

... distinct from the paths of entry and exit of matter from the system.[40][41]

To derive a generalized entropy balance equation, we start with the general balance equation for the change in any extensive quantity Θ in a thermodynamic system, a quantity that may be either conserved, such as energy, or non-conserved, such as entropy. The basic generic balance expression states that dΘ/dt, i.e. the rate of change of Θ in the system, equals the rate at which Θ enters the system at the boundaries, minus the rate at which Θ leaves the system across the system boundaries, plus the rate at which Θ is generated within the system. For an open thermodynamic system in which heat and work are transferred by paths separate from the paths for transfer of matter, using this generic balance equation, with respect to the rate of change with time t of the extensive quantity entropy S, the entropy balance equation is:[42][note 2]

dS/dt = Σ(k=1..K) Ṁk Ŝk + Q̇/T + Ṡgen

where

Σ(k=1..K) Ṁk Ŝk = the net rate of entropy flow due to the flows of mass into and out of the system (where Ŝ = entropy per unit mass),

Q̇/T = the rate of entropy flow due to the flow of heat across the system boundary, and

Ṡgen = the rate of entropy production within the system. This entropy production arises from processes within the system, including chemical reactions, internal matter diffusion, internal heat transfer, and frictional effects such as viscosity occurring within the system from mechanical work transfer to or from the system.

Note, also, that if there are multiple heat flows, the term Q̇/T is replaced by Σj Q̇j/Tj, where Q̇j is the heat flow and Tj is the temperature at the jth heat flow port into the system.
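A minimal sketch of this balance at steady state; the stream data, heat flow, and boundary temperature are hypothetical values chosen for illustration, not figures from the article:

```python
# Entropy balance for an open system:
#   dS/dt = sum_k Mdot_k * S_k + Qdot/T + Sdot_gen
# Streams carry specific entropy S_k in J/(kg K); Mdot_k > 0 for
# inflow and < 0 for outflow.
streams = [
    (+2.0, 1500.0),   # (mass flow kg/s, specific entropy): inlet
    (-2.0, 1650.0),   # outlet
]
Q_dot = 50_000.0      # heat flow into the system, W (assumed)
T_boundary = 400.0    # temperature where heat crosses the boundary, K

entropy_flow_mass = sum(m_dot * s for m_dot, s in streams)  # J/(K s)
entropy_flow_heat = Q_dot / T_boundary                      # J/(K s)

# At steady state dS/dt = 0, so generation balances the net flows.
S_dot_gen = -(entropy_flow_mass + entropy_flow_heat)
print(f"Entropy generation rate = {S_dot_gen:.1f} W/K")  # must be >= 0
```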
5 Entropy change formulas for simple processes

... and pressure P at any constant temperature, the change in entropy is given by:

ΔS = nR ln(V/V0) = −nR ln(P/P0)

Here n is the number of moles of gas and R is the ideal gas constant. These equations also apply for expansion into a finite vacuum or a throttling process, where the temperature, internal energy and enthalpy for an ideal gas remain constant.

5.2 Cooling and heating

... change for the transition, and the entropy change is the enthalpy change divided by the thermodynamic temperature. For fusion (melting) of a solid to a liquid at the melting point Tm, the entropy of fusion is

ΔSfus = ΔHfus / Tm

Similarly, for vaporization of a liquid to a gas at the boiling point Tb, the entropy of vaporization is

ΔSvap = ΔHvap / Tb
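The sketch below evaluates these formulas with assumed values; the enthalpies of fusion and vaporization of water are handbook figures quoted from memory, so treat them as approximate:

```python
import math

# Isothermal ideal-gas expansion: dS = n R ln(V/V0) = -n R ln(P/P0).
n, R = 1.0, 8.314                     # mol, J/(mol K)
dS_expansion = n * R * math.log(2.0)  # doubling the volume
print(f"dS (volume doubled) = {dS_expansion:.2f} J/K")   # ~5.76 J/K

# Entropy of fusion and vaporization, using water as the example.
dH_fus, T_m = 6010.0, 273.15          # J/mol, K (assumed handbook values)
dH_vap, T_b = 40660.0, 373.15         # J/mol, K (assumed handbook values)
print(f"dS_fus = {dH_fus / T_m:.1f} J/(mol K)")   # ~22 J/(mol K)
print(f"dS_vap = {dH_vap / T_b:.1f} J/(mol K)")   # ~109 J/(mol K)
```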
... energy transformation from one state or form to another. In this direction, several recent authors have derived exact entropy formulas to account for and measure disorder and order in atomic and molecular assemblies.[48][49][50] One of the simpler entropy order/disorder formulas is that derived in 1984 by thermodynamic physicist Peter Landsberg, based on a combination of thermodynamics and information theory arguments. He argues that when constraints operate on a system, such that it is prevented from entering one or more of its possible or permitted states, as contrasted with its forbidden states, the measure of the total amount of "disorder" in the system is given by:[49][50]

Disorder = CD / CI

Similarly, the total amount of "order" in the system is given by:

Order = 1 − CO / CI

In which CD is the "disorder" capacity of the system, which is the entropy of the parts contained in the permitted ensemble, CI is the "information" capacity of the system, an expression similar to Shannon's channel capacity, and CO is the "order" capacity of the system.[48]

6.3 Energy dispersal

Main article: Entropy (energy dispersal)

The concept of entropy can be described qualitatively as a measure of energy dispersal at a specific temperature.[51] Similar terms have been in use from early in the history of classical thermodynamics, and with the development of statistical thermodynamics and quantum theory, entropy changes have been described in terms of the mixing or "spreading" of the total energy of each constituent of a system over its particular quantized energy levels.

Ambiguities in the terms disorder and chaos, which usually have meanings directly opposed to equilibrium, contribute to widespread confusion and hamper comprehension of entropy for most students.[52] As the second law of thermodynamics shows, in an isolated system internal portions at different temperatures tend to adjust to a single uniform temperature and thus produce equilibrium. A recently developed educational approach avoids ambiguous terms and describes such spreading out of energy as dispersal, which leads to loss of the differentials required for work even though the total energy remains constant in accordance with the first law of thermodynamics[53] (compare discussion in next section). Physical chemist Peter Atkins, for example, who previously wrote of dispersal leading to a disordered state, now writes that "spontaneous changes are always accompanied by a dispersal of energy".[54]

6.4 Relating entropy to energy usefulness

Following on from the above, it is possible (in a thermal context) to regard entropy as an indicator or measure of the effectiveness or usefulness of a particular quantity of energy.[55] This is because energy supplied at a high temperature (i.e. with low entropy) tends to be more useful than the same amount of energy available at room temperature. Mixing a hot parcel of a fluid with a cold one produces a parcel of intermediate temperature, in which the overall increase in entropy represents a "loss" which can never be replaced.

Thus, the fact that the entropy of the universe is steadily increasing means that its total energy is becoming less useful: eventually, this will lead to the "heat death of the Universe".[56]

6.5 Entropy and adiabatic accessibility

A definition of entropy based entirely on the relation of adiabatic accessibility between equilibrium states was given by E. H. Lieb and J. Yngvason in 1999.[57] This approach has several predecessors, including the pioneering work of Constantin Carathéodory from 1909[58] and the monograph by R. Giles.[59] In the setting of Lieb and Yngvason one starts by picking, for a unit amount of the substance under consideration, two reference states X0 and X1 such that the latter is adiabatically accessible from the former but not vice versa. Defining the entropies of the reference states to be 0 and 1 respectively, the entropy of a state X is defined as the largest number λ such that X is adiabatically accessible from a composite state consisting of an amount λ in the state X1 and a complementary amount, (1 − λ), in the state X0. A simple but important result within this setting is that entropy is uniquely determined, apart from a choice of unit and an additive constant for each chemical element, by the following properties: it is monotonic with respect to the relation of adiabatic accessibility, additive on composite systems, and extensive under scaling.

6.6 Entropy in quantum mechanics

Main article: von Neumann entropy

In quantum statistical mechanics, the concept of entropy was developed by John von Neumann and is generally referred to as "von Neumann entropy",

S = −kB Tr(ρ log ρ)

where ρ is the density matrix and Tr is the trace operator.
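A minimal numerical sketch of the von Neumann entropy, working in units of kB and diagonalizing the density matrix rather than calling a matrix logarithm (the two states chosen are standard textbook cases, not examples from the article):

```python
import numpy as np

k_B = 1.0  # work in units of k_B

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S = -k_B Tr(rho log rho), via eigenvalues of the density matrix."""
    eigenvalues = np.linalg.eigvalsh(rho)
    p = eigenvalues[eigenvalues > 1e-12]   # drop numerical zeros
    return float(-k_B * np.sum(p * np.log(p)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])  # pure state: zero entropy
mixed = np.eye(2) / 2.0                    # maximally mixed qubit
print(von_neumann_entropy(pure))           # 0.0
print(von_neumann_entropy(mixed))          # ln 2, about 0.693
```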
"... for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage."

Conversation between Claude Shannon and John von Neumann regarding what name to give to the attenuation in phone-line signals[60]

Main articles: Entropy (information theory), Entropy in thermodynamics and information theory, and Entropic uncertainty

When viewed in terms of information theory, the entropy state function is simply the amount of information (in the Shannon sense) that would be needed to specify the full microstate of the system. This is left unspecified by the macroscopic description.

In information theory, entropy is the measure of the amount of information that is missing before reception and is sometimes referred to as Shannon entropy.[61] Shannon entropy is a broad and general concept which finds applications in information theory as well as thermodynamics. It was originally devised by Claude Shannon in 1948 to study the amount of information in a transmitted message. The definition of the information entropy is, however, quite general, and is expressed in terms of a discrete set of probabilities pi so that

H = −Σi pi log pi

... and if entropy is measured in units of k per nat, then the entropy is given[67] by:

H = k log(W)

which is the famous Boltzmann entropy formula when k is Boltzmann's constant, which may be interpreted as the thermodynamic entropy per nat. There are many ways of demonstrating the equivalence of "information entropy" and "physics entropy", that is, the equivalence of "Shannon entropy" and "Boltzmann entropy". Nevertheless, some authors argue for dropping the word entropy for the H function of information theory and using Shannon's other term "uncertainty" instead.[68]
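A small sketch of the definition, including its reduction to H = k log(W) when all W outcomes are equally probable (the probability distributions are assumed examples):

```python
import math

def shannon_entropy(probs, base=2):
    """H = -sum_i p_i log p_i; bits by default, nats with base=math.e."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p_biased = [0.5, 0.25, 0.25]
print(shannon_entropy(p_biased))          # 1.5 bits

W = 8
p_uniform = [1.0 / W] * W
print(shannon_entropy(p_uniform))         # log2(8) = 3 bits
# In nats, the uniform case reduces to log(W), i.e. Boltzmann's form:
print(shannon_entropy(p_uniform, math.e), math.log(W))
```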
7 Interdisciplinary applications of entropy

Although the concept of entropy was originally a thermodynamic construct, it has been adapted in other fields of study, including information theory, psychodynamics, thermoeconomics/ecological economics, and evolution.[69][70][71][72][73] For instance, an entropic argument has been recently proposed for explaining the preference of cave spiders in choosing a suitable area for laying their eggs.[74]
• Standard molar entropy is the entropy content of one mole of substance, under conditions of standard temperature and pressure.

• Residual entropy: the entropy present after a substance is cooled arbitrarily close to absolute zero.

• Entropy of mixing: the change in the entropy when two different chemical substances or components are mixed.

• Loop entropy is the entropy lost upon bringing together two residues of a polymer within a prescribed distance.

• Conformational entropy is the entropy associated with the physical arrangement of a polymer chain that assumes a compact or globular state in solution.

• Entropic force: a microscopic force or reaction tendency related to system organization changes, molecular frictional considerations, and statistical variations.

• Free entropy: an entropic thermodynamic potential analogous to the free energy.

• Entropic explosion: an explosion in which the reactants undergo a large change in volume without releasing a large amount of heat.

• Entropy change: a change in entropy dS between two equilibrium states is given by the heat transferred δQrev divided by the absolute temperature T of the system in this interval.

• Sackur-Tetrode entropy: the entropy of a monatomic classical ideal gas determined via quantum considerations.

Since a finite universe is an isolated system, the Second Law of Thermodynamics states that its total entropy is constantly increasing. It has been speculated, since the 19th century, that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy so that no more work can be extracted from any source.

If the universe can be considered to have generally increasing entropy, then, as Roger Penrose has pointed out, gravity plays an important role in the increase because gravity causes dispersed matter to accumulate into stars, which collapse eventually into black holes. The entropy of a black hole is proportional to the surface area of the black hole's event horizon.[76] Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes, if they are totally effective matter and energy traps. However, the escape of energy from black holes might be possible due to quantum activity (see Hawking radiation). In 2014 Hawking changed his stance on some details, in a paper which largely redefined the event horizons of black holes, positing that black holes do not exist.[77]

The role of entropy in cosmology remains a controversial subject since the time of Ludwig Boltzmann. Recent work has cast some doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly, moving the universe further from the heat death with time, not closer.[78][79][80] This results in an "entropy gap" pushing the system further away from the posited heat death equilibrium.[81] Other complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.[82]
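The Bekenstein–Hawking formula S = kB c³ A / (4Għ) makes the area proportionality stated above concrete. The formula and physical constants are standard, but the following worked example (a 10-solar-mass black hole) is an illustration added here, not text from the sources cited above:

```python
import math

G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8      # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s
k_B  = 1.381e-23    # Boltzmann constant, J/K
M_sun = 1.989e30    # solar mass, kg

M = 10 * M_sun                       # assumed black-hole mass
r_s = 2 * G * M / c**2               # Schwarzschild radius
A = 4 * math.pi * r_s**2             # event-horizon area
S = k_B * c**3 * A / (4 * G * hbar)  # Bekenstein-Hawking entropy, J/K
print(f"r_s = {r_s / 1e3:.1f} km, S = {S:.2e} J/K")  # ~29.5 km, ~1.5e56 J/K
```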
Current theories suggest the entropy gap to have been originally opened up by the early rapid exponential expansion of the universe.[83]

7.4 Economics

...

• Entropy and life
• Entropy (order and disorder)
• Entropy rate
• Harmonic entropy
• Heat death of the universe
• Laws of thermodynamics
• Multiplicity function

References

[6] "6.5 Irreversibility, Entropy Changes, and 'Lost Work'". web.mit.edu. Retrieved 21 May 2016.

[7] Lower, Stephen. "What is entropy?". www.chem1.com. Retrieved 21 May 2016.
[9] Carnot, Sadi (1986). Fox, Robert, ed. Reflexions on the motive power of fire. New York, NY: Lilian Barber Press. p. 26. ISBN 978-0-936508-16-0.

[10] Truesdell, C. (1980). The tragicomical history of thermodynamics 1822–1854. New York: Springer. pp. 78–85. ISBN 978-0-387-90403-0.

[11] Clerk Maxwell, James (2001). Pesic, Peter, ed. Theory of Heat. Mineola: Dover Publications. pp. 115–158. ISBN 978-0-486-41735-6.

[12] Rudolf Clausius (1867). The Mechanical Theory of Heat: With Its Applications to the Steam-engine and to the Physical Properties of Bodies. J. Van Voorst. p. 28. ISBN 978-1-4981-6733-8.

[13] Clausius, Rudolf (1865). Ueber verschiedene für die Anwendung bequeme Formen der Hauptgleichungen der mechanischen Wärmetheorie: vorgetragen in der naturforsch. Gesellschaft den 24. April 1865. p. 46.

[14] Atkins, Peter; De Paula, Julio (2006). Physical Chemistry (8th ed.). Oxford University Press. p. 79. ISBN 0-19-870072-5.

[15] Engel, Thomas; Reid, Philip (2006). Physical Chemistry. Pearson Benjamin Cummings. p. 86. ISBN 0-8053-3842-X.

[16] Licker, Mark D. (2004). McGraw-Hill concise encyclopedia of chemistry. New York: McGraw-Hill Professional. ISBN 978-0-07-143953-4.

[17] Sethna, James P. (2006). Statistical mechanics: entropy, order parameters, and complexity. Oxford: Oxford University Press. p. 78. ISBN 978-0-19-856677-9.

[18] Clark, John O.E. (2004). The essential dictionary of science. New York: Barnes & Noble. ISBN 978-0-7607-4616-5.

[19] Frigg, R. and Werndl, C. "Entropy: A Guide for the Perplexed". In Probabilities in Physics; Beisbart, C. and Hartmann, S., Eds.; Oxford University Press, Oxford, 2010.

[20] Schroeder, Daniel V. (2000). An introduction to thermal physics. San Francisco, CA: Addison Wesley. p. 57. ISBN 978-0-201-38027-9.

[21] "EntropyOrderParametersComplexity.pdf" (PDF). www.physics.cornell.edu. Retrieved 2012-08-17.

[22] Jaynes, E. T., "The Gibbs Paradox". In Maximum Entropy and Bayesian Methods; Smith, C. R.; Erickson, G. J.; Neudorfer, P. O., Eds.; Kluwer Academic: Dordrecht, 1992, pp. 1–22 (PDF). Retrieved 2012-08-17.

[23] Sandler, Stanley I. (2006). Chemical, biochemical, and engineering thermodynamics (4th ed.). New York: John Wiley & Sons. p. 91. ISBN 978-0-471-66174-0.

[24] McQuarrie, Donald A.; Simon, John D. (1997). Physical chemistry: a molecular approach (Rev. ed.). Sausalito, Calif.: Univ. Science Books. p. 817. ISBN 978-0-935702-99-6.

[25] Haynie, Donald T. (2001). Biological Thermodynamics. Cambridge University Press. ISBN 0-521-79165-0.

[26] Daintith, John (2005). A dictionary of science (5th ed.). Oxford: Oxford University Press. ISBN 978-0-19-280641-3.

[27] de Rosnay, Joel (1979). The Macroscope: a New World View (written by an M.I.T.-trained biochemist). Harper & Row, Publishers. ISBN 0-06-011029-5.

[28] J. A. McGovern, "Heat Capacities". Archived from the original on 2012-08-19. Retrieved 2013-01-27.

[29] Ben-Naim, Arieh (21 September 2007). "On the So-Called Gibbs Paradox, and on the Real Paradox" (PDF). Entropy. 9 (3): 132–136. doi:10.3390/e9030133.

[30] Daintith, John (2005). Oxford Dictionary of Physics. Oxford University Press. ISBN 0-19-280628-9.

[31] Saha, Arnab; Lahiri, Sourabh; Jayannavar, A. M. (2009). "Entropy production theorems and some consequences". Physical Review E. 80: 1–10. doi:10.1103/PhysRevE.80.011117.

[32] Martyushev, L. M.; Seleznev, V. D. (2014). "The restrictions of the maximum entropy production principle". Physica A: Statistical Mechanics and its Applications. 410: 17–21. arXiv:1311.2068. doi:10.1016/j.physa.2014.05.014.

[33] Ziegler, H. (1983). An Introduction to Thermomechanics. North Holland, Amsterdam.

[34] Onsager, Lars (1931). "Reciprocal Relations in Irreversible Processes". Phys. Rev. 37: 405. doi:10.1103/PhysRev.37.405.

[35] Kleidon, A.; et al. (2005). Non-equilibrium Thermodynamics and the Production of Entropy. Heidelberg: Springer.

[36] Belkin, Andrey; et al. (2015). "Self-assembled wiggling nano-structures and the principle of maximum entropy production". Scientific Reports. 5. doi:10.1038/srep08323.

[37] Moore, J. W.; Stanistski, C. L.; Jurs, P. C. (2005). Chemistry, The Molecular Science. Brooks Cole. ISBN 0-534-42201-2.

[38] Jungermann, A.H. (2006). "Entropy and the Shelf Model: A Quantum Physical Approach to a Physical Property". Journal of Chemical Education. 83 (11): 1686–1694. Bibcode:2006JChEd..83.1686J. doi:10.1021/ed083p1686.

[39] Levine, I. N. (2002). Physical Chemistry (5th ed.). McGraw-Hill. ISBN 0-07-231808-2.

[40] Born, Max (8 August 2015). Natural Philosophy of Cause and Chance. BiblioLife. pp. 44, 146–147. ISBN 978-1-298-49740-6.

[41] Haase, R. (1971). Thermodynamics. New York: Academic Press. pp. 1–97. ISBN 0-12-245601-7.
[42] Sandler, Stanley I. (1989). Chemical and Engineering Thermodynamics. John Wiley & Sons. ISBN 0-471-83050-X.

[43] "GRC.nasa.gov". GRC.nasa.gov. 2000-03-27. Retrieved 2012-08-17.

[44] Franzen, Stefan. "Third Law" (PDF). ncsu.edu.

[45] "GRC.nasa.gov". GRC.nasa.gov. 2008-07-11. Retrieved 2012-08-17.

[46] Gribbin, John (1999). Gribbin, Mary, ed. Q is for quantum: an encyclopedia of particle physics. New York: Free Press. ISBN 0-684-85578-X.

[47] "Entropy: Definition and Equation". Encyclopædia Britannica. Retrieved 22 May 2016.

[48] Brooks, Daniel R.; Wiley, E. O. (1988). Evolution as entropy: toward a unified theory of biology (2nd ed.). Chicago: University of Chicago Press. ISBN 0-226-07574-5.

[49] Landsberg, P.T. (1984). "Is Equilibrium always an Entropy Maximum?". J. Stat. Physics. 35: 159–169. Bibcode:1984JSP....35..159L. doi:10.1007/bf01017372.

[50] Landsberg, P.T. (1984). "Can Entropy and 'Order' Increase Together?". Physics Letters. 102A (4): 171–173. Bibcode:1984PhLA..102..171L. doi:10.1016/0375-9601(84)90934-4.

[51] Lambert, Frank L. "A Student's Approach to the Second Law and Entropy". entropysite.oxy.edu. Archived from the original on 17 July 2009. Retrieved 22 May 2016.

[52] Watson, J.R.; Carson, E.M. (May 2002). "Undergraduate students' understandings of entropy and Gibbs free energy" (PDF). University Chemistry Education. 6 (1): 4. ISSN 1369-5614.

[53] Lambert, Frank L. (February 2002). "Disorder: A Cracked Crutch for Supporting Entropy Discussions". Journal of Chemical Education. 79 (2): 187. doi:10.1021/ed079p187.

[54] Atkins, Peter (1984). The Second Law. Scientific American Library. ISBN 0-7167-5004-X.

[55] Saary, Sandra (Head of Science, Latifa Girls' School, Dubai) (23 February 1993). "Book Review of 'A Science Miscellany'". Khaleej Times. Galadari Press, UAE: XI.

[56] Lathia, R; Agrawal, T; Parmar, V; Dobariya, K; Patel, A (2015-10-20). "Heat Death (The Ultimate Fate of the Universe)". doi:10.13140/rg.2.1.4158.2485.

[57] Lieb, Elliott H.; Yngvason, Jakob (March 1999). "The physics and mathematics of the second law of thermodynamics". Physics Reports. 310 (1): 1–96. doi:10.1016/S0370-1573(98)00082-9.

[58] Carathéodory, C. (September 1909). "Untersuchungen über die Grundlagen der Thermodynamik". Mathematische Annalen (in German). 67 (3): 355–386. doi:10.1007/BF01450409.

[59] Giles, R. (22 January 2016). Mathematical Foundations of Thermodynamics: International Series of Monographs on Pure and Applied Mathematics. Elsevier Science. ISBN 978-1-4831-8491-3.

[60] Tribus, M.; McIrvine, E.C., "Energy and information", Scientific American, 224 (September 1971), pp. 178–184.

[61] Balian, Roger (2004). "Entropy, a Protean concept". In Dalibard, Jean. Poincaré Seminar 2003: Bose-Einstein condensation, entropy. Basel: Birkhäuser. pp. 119–144. ISBN 978-3-7643-7116-6.

[62] Brillouin, Leon (1956). Science and Information Theory. ISBN 0-486-43918-6.

[63] Georgescu-Roegen, Nicholas (1971). The Entropy Law and the Economic Process. Harvard University Press. ISBN 0-674-25781-2.

[64] Chen, Jing (2005). The Physical Foundation of Economics: an Analytical Thermodynamic Theory. World Scientific. ISBN 981-256-323-7.

[65] Kalinin, M.I.; Kononogov, S.A. (2005). "Boltzmann's constant". Measurement Techniques. 48 (7): 632–636. doi:10.1007/s11018-005-0195-9.

[66] Ben-Naim, Arieh (2008). Entropy demystified: the second law reduced to plain common sense (Expanded ed.). Singapore: World Scientific. ISBN 9789812832269.

[67] "Edwin T. Jaynes: Bibliography". Bayes.wustl.edu. 1998-03-02. Retrieved 2009-12-06.

[68] Schneider, Tom, DELILA system (Deoxyribonucleic acid Library Language), (Information Theory Analysis of binding sites), Laboratory of Mathematical Biology, National Cancer Institute, FCRDC Bldg. 469, Rm 144, P.O. Box B, Frederick, MD 21702-1201, USA.

[69] Brooks, Daniel R.; Wiley, E.O. (1988). Evolution as Entropy: Towards a Unified Theory of Biology. University of Chicago Press. ISBN 0-226-07574-5.

[70] Avery, John (2003). Information Theory and Evolution. World Scientific. ISBN 981-238-399-9.

[71] Yockey, Hubert P. (2005). Information Theory, Evolution, and the Origin of Life. Cambridge University Press. ISBN 0-521-80293-8.

[72] Chiavazzo, Eliodoro; Fasano, Matteo; Asinari, Pietro (2013). "Inference of analytical thermodynamic models for biological networks". Physica A: Statistical Mechanics and its Applications. 392 (5): 1122–1132. Bibcode:2013PhyA..392.1122C. doi:10.1016/j.physa.2012.11.030.

[73] Chen, Jing (2015). The Unity of Science and Economics: A New Foundation of Economic Theory. Springer. https://www.springer.com/us/book/9781493934645

[74] Chiavazzo, Eliodoro; Isaia, Marco; Mammola, Stefano; Lepore, Emiliano; Ventola, Luigi; Asinari, Pietro; Pugno, Nicola Maria (2015). "Cave spiders choose optimal environmental factors with respect to the generated entropy when laying their cocoon". Scientific Reports. 5: 7611. Bibcode:2015NatSR...5E7611C. PMC 5154591. PMID 25556697. doi:10.1038/srep07611.
[75] IUPAC, Compendium of Chemical Terminology, 2nd ed. (the "Gold Book") (1997). Online corrected version: (2006) "Entropy unit".

[76] von Baeyer, Hans Christian (2003). Information: the New Language of Science. Harvard University Press. ISBN 0-674-01387-5. Srednicki, M. (August 1993). "Entropy and area". Phys. Rev. Lett. 71 (5): 666–669. Bibcode:1993PhRvL..71..666S. PMID 10055336. arXiv:hep-th/9303048. doi:10.1103/PhysRevLett.71.666. Callaway, D. J. E. (April 1996). "Surface tension, hydrophobicity, and black holes: The entropic connection". Phys. Rev. E. 53 (4): 3738–3744. Bibcode:1996PhRvE..53.3738C. PMID 9964684. arXiv:cond-mat/9601111. doi:10.1103/PhysRevE.53.3738.

[77] Buchan, Lizzy. "Black holes do not exist, says Stephen Hawking". Cambridge News. Retrieved 27 January 2014.

[78] Layzer, David (1988). Growth of Order in the Universe. MIT Press.

[79] Chaisson, Eric J. (2001). Cosmic Evolution: The Rise of Complexity in Nature. Harvard University Press. ISBN 0-674-00342-X.

[80] Lineweaver, Charles H.; Davies, Paul C. W.; Ruse, Michael, eds. (2013). Complexity and the Arrow of Time. Cambridge University Press. ISBN 978-1-107-02725-1.

[81] Stenger, Victor J. (2007). God: The Failed Hypothesis. Prometheus Books. ISBN 1-59102-481-1.

[82] Benjamin Gal-Or (1987). Cosmology, Physics and Philosophy. Springer Verlag. ISBN 0-387-96526-2.

[83] Albrecht, Andreas (2004). "Cosmic inflation and the arrow of time" (PDF). In Barrow, John D.; Davies, Paul C.W.; Harper, Charles L., Jr. Science and Ultimate Reality: From Quantum to Cosmos (in honor of John Wheeler's 90th birthday). Cambridge, UK: Cambridge University Press. arXiv:astro-ph/0210527. Retrieved 28 June 2017.

[84] Georgescu-Roegen, Nicholas (1971). The Entropy Law and the Economic Process (full book accessible in three parts at SlideShare). Cambridge, Massachusetts: Harvard University Press. ISBN 0-674-25780-4.

[85] Cleveland, Cutler J.; Ruth, Matthias (1997). "When, where, and by how much do biophysical limits constrain the economic process? A survey of Nicholas Georgescu-Roegen's contribution to ecological economics" (PDF). Ecological Economics. Amsterdam: Elsevier. 22 (3): 203–223. doi:10.1016/s0921-8009(97)00079-7. Archived from the original (PDF) on 2015-12-08.

[86] Daly, Herman E.; Farley, Joshua (2011). Ecological Economics. Principles and Applications (PDF contains full book) (2nd ed.). Washington: Island Press. ISBN 978-1-59726-681-9.

[87] Schmitz, John E.J. (2007). The Second Law of Life: Energy, Technology, and the Future of Earth As We Know It (link to the author's science blog, based on his textbook). Norwich: William Andrew Publishing. ISBN 0-8155-1537-5.

[88] Ayres, Robert U. (2007). "On the practical limits to substitution" (PDF). Ecological Economics. Amsterdam: Elsevier. 61: 115–128. doi:10.1016/j.ecolecon.2006.02.011.

[89] Kerschner, Christian (2010). "Economic de-growth vs. steady-state economy" (PDF). Journal of Cleaner Production. Amsterdam: Elsevier. 18 (6): 544–551. doi:10.1016/j.jclepro.2009.10.019.

11 Further reading

• Adam, Gerhard; Hittmair, Otto (1992). Wärmetheorie. Vieweg, Braunschweig. ISBN 3-528-33311-1.
• Atkins, Peter; De Paula, Julio (2006). Physical Chemistry (8th ed.). Oxford University Press. ISBN 0-19-870072-5.
• Baierlein, Ralph (2003). Thermal Physics. Cambridge University Press. ISBN 0-521-65838-1.
• Ben-Naim, Arieh (2007). Entropy Demystified. World Scientific. ISBN 981-270-055-2.
• Callen, Herbert B. (2001). Thermodynamics and an Introduction to Thermostatistics (2nd ed.). John Wiley and Sons. ISBN 0-471-86256-8.
• Chang, Raymond (1998). Chemistry (6th ed.). New York: McGraw Hill. ISBN 0-07-115221-0.
• Cutnell, John D.; Johnson, Kenneth J. (1998). Physics (4th ed.). John Wiley and Sons, Inc. ISBN 0-471-19113-2.
• Dugdale, J. S. (1996). Entropy and its Physical Meaning (2nd ed.). Taylor and Francis (UK); CRC (US). ISBN 0-7484-0569-0.
• Fermi, Enrico (1937). Thermodynamics. Prentice Hall. ISBN 0-486-60361-X.
• Goldstein, Martin; Goldstein, Inge F. (1993). The Refrigerator and the Universe. Harvard University Press. ISBN 0-674-75325-9.
• Gyftopoulos, E.P.; Beretta, G.P. (2010). Thermodynamics: Foundations and Applications. Dover. ISBN 0-486-43932-1.
• Haddad, Wassim M.; Chellaboina, VijaySekhar; Nersesov, Sergey G. (2005). Thermodynamics: A Dynamical Systems Approach. Princeton University Press. ISBN 0-691-12327-6.
• Kroemer, Herbert; Kittel, Charles (1980). Thermal Physics (2nd ed.). W. H. Freeman Company. ISBN 0-7167-1088-9.
• Lambert, Frank L.; entropysite.oxy.edu
• Müller-Kirsten, Harald J.W. (2013). Basics of Statistical Physics (2nd ed.). Singapore: World Scientific. ISBN 978-981-4449-53-3.
• Moriarty, Philip; Merrifield, Michael (2009). "S Entropy". Sixty Symbols. Brady Haran for the University of Nottingham.

12 External links

• "Entropy and the Second Law of Thermodynamics": an A-level physics lecture with detailed derivation of entropy based on the Carnot cycle.