The concept of Entropy in Physical Science



Presented by: Shubhadeep Nag, Suchetan Das, Sarif Khan, Ayan
Kumar Pal, Anirban Chatterjee and Nilanjan Biswas

(2nd Year Physics Honours)

Department of Physics, Maulana Azad College

Supervisor: Prof. Debashis Sen

Date: 02.11.11
"If someone points out to you that your pet theory of the
universe is in disagreement with Maxwell's equations - then
so much the worse for Maxwell's equations. If it is found to be
contradicted by observation - well, these experimentalists do
bungle things sometimes. But if your theory is found to be
against the second law of thermodynamics I can offer you no
hope; there is nothing for it but to collapse in deepest
humiliation." - Sir Arthur Eddington
Entropy (s) is defined phenomenologically by the second law of
thermodynamics (Carnot's principle), as a function of other
thermodynamic variables. The second law states that the entropy of an
isolated system always increases or remains constant. The term entropy
was coined in 1865 by Rudolf Clausius from the Greek word 'entropia',
meaning "a turning toward".

Combining the first and second laws of thermodynamics, for a simple
thermodynamic system, we get the basic thermodynamic equation involving
entropy: Tds = du + pdv, where u, T, p and v respectively represent the
internal energy, temperature, pressure and volume of the system.
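As a concrete illustration (an added sketch, not part of the original
slides): for an ideal gas with constant heat capacity, the relation
Tds = du + pdv integrates to Δs = n cv ln(T2/T1) + n R ln(v2/v1). A
minimal Python calculation, assuming one mole of a monatomic ideal gas:

import math

R = 8.314          # gas constant, J/(mol K)
n = 1.0            # amount of gas in moles (assumed)
cv = 1.5 * R       # molar heat capacity at constant volume (monatomic ideal gas)

def delta_s(T1, T2, v1, v2):
    """Entropy change (J/K) between equilibrium states (T1, v1) and (T2, v2)."""
    return n * cv * math.log(T2 / T1) + n * R * math.log(v2 / v1)

# Isothermal doubling of the volume: delta_s = n R ln 2, about 5.76 J/K
print(delta_s(300.0, 300.0, 1.0e-3, 2.0e-3))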
Today the word entropy is used in different branches of
physical science with varied connotations:

1. In thermodynamics: a function of the thermodynamic variables of a
system, which provides a measure of the energy unavailable for work
during a thermodynamic process. A closed system evolves toward a state
of maximum entropy. Thus, the fact that the entropy of the universe is
steadily increasing means that its total energy is becoming less and
less useful: eventually, this will lead to the 'heat death of the
Universe'.

Entropy has the dimension of energy divided by temperature, and is an
extensive thermodynamic property, meaning that it scales with the size
of the system.
... contd.
From the basic thermodynamic equation it follows that the temperature
of the system on the absolute scale is determined by the partial
derivative of the entropy with respect to the internal energy.
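Explicitly (a one-line step added here for clarity), setting dv = 0 in
Tds = du + pdv gives

\[
\left(\frac{\partial s}{\partial u}\right)_{v} = \frac{1}{T}.
\]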

2. In statistical mechanics: a measure of the randomness of the
microscopic constituents of a thermodynamic system.

3. In data transmission and information theory: a measure of the loss
of information in a transmitted signal or message.

4. In cosmology: a hypothetical tendency for the universe to attain a
state of maximum homogeneity in which all matter is at a uniform
temperature (heat death).
Entropy in Thermodynamics and classical statistical mechanics:
The (thermodynamic) entropy function introduced earlier by Clausius was
recast as a statistical entropy using probability theory. The
statistical perspective of entropy was introduced in the 1870s by the
Austrian physicist Ludwig Boltzmann and further elaborated by Max
Planck and J. Willard Gibbs.

[Portraits: Rudolf Clausius (1822 – 1888), Ludwig Boltzmann (1844 – 1906),
Max Planck (1858 – 1947), J. Willard Gibbs (1839 – 1903)]

Entropy, in Boltzmann's definition, is a measure of the number of
possible microstates of a system in thermodynamic equilibrium,
consistent with its macroscopic thermodynamic properties (or
macrostate). To understand what microstates and macrostates are,
consider the example of a gas in a container.
... contd.
At a microscopic level, the gas consists of a vast number of
freely moving atoms, which occasionally collide with one
another and with the walls of the container. The microstate of
the system is a description of the positions and momenta of all
the atoms. In principle, all the physical properties of the system
are determined by its microstate. However, the motion of
individual atoms is mostly irrelevant to the behaviour of the
system as a whole, if the number of atoms is very large.
Fortunately, very close to thermodynamic equilibrium, the
system can be adequately described by a handful of
macroscopic quantities, called 'thermodynamic variables': the
internal energy u, volume v, pressure p, temperature T, and so
forth. The macrostate of the system is a description of its
thermodynamic variables.
Three important points to note:

To specify any one microstate, we need to write down an impractically
long list of numbers, whereas specifying a macrostate requires only a
few numbers (p, v, etc.).

However, and this is the second point, the usual thermodynamic
equations only describe the macrostate of a system adequately when the
system is in equilibrium; non-equilibrium situations cannot in general
be described by a small number of variables. For example, if a gas is
sloshing around in its container, even a macroscopic description would
have to include, e.g., the velocity of the fluid at each point. In
fact, the macroscopic state of the system is described by a small
number of variables only if the system is at global thermodynamic
equilibrium.

Thirdly, more than one microstate can correspond to a single
macrostate. In fact, for any given macrostate, there will be a huge
number of microstates that are consistent with the given values of u,
v, etc.
We are now ready to provide a definition of entropy.
The entropy s may be defined (according to Planck) as

s = kB ln Ω + s0,

where kB is Boltzmann's constant, Ω is the number of microstates
consistent with the given macrostate, and s0 is an arbitrary additive
constant.

According to the above definition, the minimum value of entropy
corresponds to Ω = 1, and entropy can never be negative. If Ω is the
thermodynamic probability corresponding to the most probable
distribution under the given constraint, then s is the equilibrium
value of the entropy of the system.
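As an added illustration (a toy model, not part of the original
slides): for N independent two-level particles, the macrostate with k
particles excited is compatible with Ω = C(N, k) microstates, and
s = kB ln Ω can be evaluated directly; the logarithmic form below
avoids overflow for very large N.

import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, k):
    """s = kB ln(Omega) for the macrostate 'k of N two-level particles excited',
    where Omega = C(N, k); computed via log-gamma to handle very large N."""
    ln_omega = math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)
    return kB * ln_omega

print(boltzmann_entropy(4, 2))                 # Omega = 6, so s = kB ln 6
print(boltzmann_entropy(6.022e23, 3.011e23))   # Avogadro scale: s is roughly kB N ln 2, about 5.8 J/K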
... contd.
In classical thermodynamics it is usually the equilibrium value of the
entropy that is defined. But if Ω is taken to correspond to some
non-equilibrium distribution, then s may be interpreted as the
corresponding non-equilibrium entropy.

However, the macroscopic state of the system is defined by a
distribution on the microstates that are accessible to the system in
the course of its thermal fluctuations. Thus, the entropy is defined
over two different levels of description of the given system.
In the Gibbs formulation (J. W. Gibbs), for a classical system (i.e., a
collection of classical particles) with a discrete set of microstates
i, if ωi is the probability of occurrence of microstate i, then the
entropy of the system is

s = −kB ∑i ωi ln ωi.

This definition remains valid even when the system is far away from
equilibrium. Other definitions assume that the system is in thermal
equilibrium, either as an isolated system or as a system in exchange
with its surroundings. The set of microstates over which the sum is
taken is called a statistical ensemble.
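A minimal numerical sketch of this formula (added here; the probability
values below are arbitrary and purely illustrative):

import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """s = -kB * sum_i w_i ln(w_i) for a normalised distribution over microstates."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -kB * sum(w * math.log(w) for w in probs if w > 0.0)

uniform = [0.25, 0.25, 0.25, 0.25]   # equally likely microstates
biased  = [0.70, 0.10, 0.10, 0.10]   # non-uniform distribution

print(gibbs_entropy(uniform))   # equals kB ln 4 (the Boltzmann value)
print(gibbs_entropy(biased))    # strictly smaller than kB ln 4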
… contd.
The statistical (Gibbs) entropy reduces to Boltzmann's entropy when all
the accessible microstates of the system are equally likely. This
equiprobable distribution is also the one that maximises the system's
entropy for a given set of accessible microstates; in other words, it
is the macroscopic configuration in which the lack of information is
maximal. As such, according to the second law of thermodynamics, it is
the equilibrium configuration of an isolated system. Boltzmann's
entropy is thus the expression of the entropy at thermodynamic
equilibrium in the microcanonical ensemble.
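Explicitly (a one-line check added here): with ωi = 1/Ω for each of the
Ω accessible microstates,

\[
s = -k_B \sum_{i=1}^{\Omega} \frac{1}{\Omega}\,\ln\frac{1}{\Omega} = k_B \ln \Omega .
\]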
Entropy and the third law of thermodynamics: The behaviour of a number
of thermodynamic quantities agrees with the assumption that cooling of
the system is accompanied by a decrease of the entropy, and that its
value at absolute zero is zero. Consequently, the arbitrary constant in
the Boltzmann-Planck formula must be chosen to satisfy s(T=0) = 0. This
condition is called the third law of thermodynamics, and it is not
difficult to show that it follows directly from the Gibbs definition.
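A sketch of that argument (added here, assuming a non-degenerate ground
state): as T → 0 the system settles into its ground state, so the
ground-state probability tends to 1 while all other ωi tend to 0, and

\[
s = -k_B \sum_i \omega_i \ln \omega_i \;\longrightarrow\; -k_B\,(1 \cdot \ln 1) = 0 .
\]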
Entropy and the arrow of time: Entropy is the only physical quantity
(apart from certain interactions, e.g. the weak interaction in particle
physics) that requires a particular direction for time, sometimes
called an arrow of time. As one goes 'forward' in time, the second law
of thermodynamics says, the entropy of an isolated system will
increase. Hence, from one perspective, entropy measurement is a way of
distinguishing the past from the future. However, for thermodynamic
systems that are not closed, entropy can decrease with time: many
systems, including living systems, reduce local entropy at the expense
of an environmental increase, resulting in a net increase in entropy.
Examples of such systems and phenomena include the formation of certain
crystals, the workings of a refrigerator and living organisms.
... Contd.
By contrast, the physical laws governing processes at the microscopic
level, such as mechanics, do not pick out an arrow of time. Going
forward in time, an atom might move to the left, whereas going backward
in time the same atom might move to the right; the behaviour of the
atom is not qualitatively different in either case. On the other hand,
it would be an astronomically improbable event if a macroscopic amount
of gas that originally filled a container evenly were to spontaneously
shrink to occupy only half the container.

Unlike most other laws of physics, the second law of thermodynamics is
statistical in nature, and therefore its reliability arises from the
huge number of particles present in macroscopic systems.
... Contd.
It is not impossible, in principle, for all 6×10²³ atoms in a mole of a
gas to spontaneously migrate to one half of a container; it is only
fantastically unlikely, so unlikely that no macroscopic violation of
the second law has ever been observed. T-symmetry is the symmetry of
physical laws under a time-reversal transformation. Although in
restricted contexts one may find this symmetry, the observable universe
itself does not show symmetry under time reversal, primarily due to the
second law of thermodynamics.
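To put a number on "fantastically unlikely" (an illustrative
calculation added here): if each atom is independently equally likely
to be found in either half of the container, the probability that all N
atoms are in one given half is (1/2)^N.

import math

def log10_prob_all_in_one_half(N):
    """Base-10 logarithm of (1/2)**N, the probability that all N atoms
    occupy one given half of the container."""
    return -N * math.log10(2.0)

print(log10_prob_all_in_one_half(10))        # about -3: roughly 1 in 1000 for 10 atoms
print(log10_prob_all_in_one_half(6.022e23))  # about -1.8e23 for a mole: utterly negligible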

The thermodynamic arrow is often linked to the cosmological arrow of
time, because it is ultimately about the boundary conditions of the
early universe. According to the Big Bang theory, the universe was
initially very hot, with energy distributed uniformly.
... Contd.
As the universe expands, its temperature drops, which leaves less
energy available to perform useful work in the future than was
available in the past. Additionally, perturbations in the energy
density grow (eventually forming galaxies and stars). Thus the universe
itself has a well-defined thermodynamic arrow of time. But this does
not address the question of why the initial state of the universe was
one of low entropy. If cosmic expansion were to halt and reverse due to
gravity, the temperature of the universe would once again grow hotter,
but its entropy would also continue to increase, due to the continued
growth of perturbations and eventual black hole formation, until the
latter stages of the Big Crunch, when entropy would be higher than it
is now.
Quote: “Just as the constant increase of entropy is
the basic law of the universe, so it is the basic law of
life to be ever more highly structured and to struggle
against entropy.”
- Václav Havel
Entropy in Cosmology: Since a finite universe is an isolated system,
the second law of thermodynamics states that its total entropy is
constantly increasing. It has been speculated that the universe is
fated to a heat death, in which all the energy ends up as a homogeneous
distribution of thermal energy, so that no more work can be extracted
from any source.

If the entropy of the universe is considered to be generally
increasing, then, as R. Penrose has pointed out, gravity plays an
important role in the increase. Gravity causes dispersed matter to
accumulate into stars, which eventually collapse into black holes. The
entropy of a black hole is proportional to the surface area of the
black hole's event horizon. J. Bekenstein and S. Hawking have shown
that black holes have the maximum possible entropy of any object of
equal size. This makes them likely end points of all entropy-increasing
processes, if they are totally effective matter and energy traps.
(Hawking has, however, recently changed his stance on this aspect.)
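For reference, the proportionality mentioned above is the standard
Bekenstein-Hawking formula S = kB c³ A / (4 ħ G), where A is the area
of the event horizon; the numerical evaluation below (for a
Schwarzschild black hole of one solar mass) is an added illustration,
not part of the original slides.

import math

G    = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8     # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
kB   = 1.381e-23   # Boltzmann constant, J/K

def bh_entropy(M):
    """Bekenstein-Hawking entropy (J/K) of a Schwarzschild black hole of mass M (kg)."""
    r_s = 2.0 * G * M / c**2        # Schwarzschild radius
    A = 4.0 * math.pi * r_s**2      # horizon area
    return kB * c**3 * A / (4.0 * hbar * G)

M_sun = 1.989e30  # solar mass, kg
print(bh_entropy(M_sun))  # about 1.5e54 J/K for one solar mass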
... Contd.
The role of entropy in cosmology remains a controversial
subject. Recent work has cast some doubt on the heat
death hypothesis and the applicability of any simple
thermodynamic model to the universe in general.
Although entropy does increase in the model of an
expanding universe, the maximum possible value of
entropy rises much more rapidly, moving the universe
further from the heat death with time, not closer. This
results in an 'entropy gap' pushing the system further
away from the posited heat death equilibrium. Other
complicating factors, such as the energy density of the
vacuum and macroscopic quantum effects, are difficult to
reconcile with thermodynamic models, making any
predictions of large-scale thermodynamics extremely
difficult. The entropy gap is widely believed to have been
originally opened up by the early rapid exponential
expansion of the universe.
Other definitions of entropy:

• von Neumann entropy, named after John von Neumann, is the extension
of classical entropy concepts to the field of quantum statistical
mechanics (a small numerical sketch follows below).

• Shannon entropy and Rényi entropy (a generalisation of Shannon
entropy) in information theory, and Kolmogorov-Sinai
(measure-theoretic) entropy.

The entropy of a state of a system is also used as a measure of quantum
uncertainty, and the proposed 'entropic' uncertainty relation is
stronger than the variance-based relation [3].
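As a small illustration of the von Neumann definition mentioned above,
S = -Tr(ρ ln ρ) (here in units of kB; the two qubit density matrices
are arbitrary examples added for this sketch):

import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of the density matrix
    (Hermitian, positive semidefinite, trace 1); result in units of kB."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]   # discard numerically zero eigenvalues
    return float(-np.sum(eigvals * np.log(eigvals)))

pure  = np.array([[1.0, 0.0], [0.0, 0.0]])   # pure qubit state: S = 0
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed qubit: S = ln 2

print(von_neumann_entropy(pure))    # 0.0
print(von_neumann_entropy(mixed))   # about 0.693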
Entropy and Evolution-related concepts:

Negentropy: a shorthand colloquial phrase for negative entropy.

Ectropy: a measure of the tendency of a dynamical system to do useful
work and grow more organized.

Extropy: a metaphorical term defining the extent of a living or
organizational system's intelligence, functional order, vitality,
energy, life, experience, and capacity and drive for improvement and
growth.

Ecological entropy: a measure of biodiversity in the study of
biological ecology.
References:

1. J. S. Dugdale, Entropy and Its Physical Meaning (Taylor and Francis,
1996).

2. A. M. Vasilyev, Introduction to Statistical Physics (Mir Publishers,
1983).

3. I. Bialynicki-Birula and J. Mycielski, Uncertainty relations for
information entropy in wave mechanics, Commun. Math. Phys. 44, 129
(1975); D. Sen, The Uncertainty Relations in Quantum Mechanics,
manuscript under preparation.
THANK YOU
For your kind attention
