What Is Entropy? Definition and Examples

Entropy is defined as a measure of a system’s disorder or the energy unavailable to do work.

Entropy is a key concept in physics and chemistry, with application in other disciplines, including cosmology, biology, and economics. In physics, it is part of thermodynamics. In chemistry, it is part of physical chemistry. Here is the entropy definition, a look at some important formulas, and examples of entropy.

  • Entropy is a measure of the randomness or disorder of a system.
  • Its symbol is the capital letter S. Typical units are joules per kelvin (J/K).
  • Change in entropy can have a positive (more disordered) or negative (less disordered) value.
  • In the natural world, entropy tends to increase. According to the second law of thermodynamics, the entropy of a system only decreases if the entropy of another system increases.

Entropy Definition

The simple definition is that entropy is a measure of the disorder of a system. An ordered system has low entropy, while a disordered system has high entropy. Physicists often state the definition a bit differently: entropy is the energy of a closed system that is unavailable to do work.

Entropy is an extensive property of a thermodynamic system, which means it depends on the amount of matter that is present. In equations, the symbol for entropy is the letter S. It has SI units of joules per kelvin (J·K⁻¹), equivalently kg·m²·s⁻²·K⁻¹.

Examples of Entropy

Here are several examples of entropy:

  • As a layman’s example, consider the difference between a clean room and a messy room. The clean room has low entropy. Every object is in its place. A messy room is disordered and has high entropy. You have to input energy to change a messy room into a clean one. Sadly, it never just cleans itself.
  • Dissolving increases entropy. A solid goes from an ordered state into a more disordered one. For example, stirring sugar into coffee increases the entropy of the system as the sugar molecules become less organized.
  • Diffusion and osmosis are also examples of increasing entropy. Molecules naturally move from regions of high concentration to those of low concentration until they reach equilibrium. For example, if you spray perfume in one corner of a room, eventually you smell it everywhere. But, after that, the fragrance doesn’t spontaneously move back toward the bottle.
  • Some phase changes between the states of matter are examples of increasing entropy, while others demonstrate decreasing entropy. A block of ice increases in entropy as it melts from a solid into a liquid. Ice consists of water molecules bonded to each other in a crystal lattice. As ice melts, molecules gain more energy, spread further apart, and lose structure to form a liquid. Similarly, the phase change from a liquid to a gas, as from water to steam, increases the entropy of the system. Condensing a gas into a liquid or freezing a liquid into a solid decreases the entropy of the matter. Molecules lose kinetic energy and assume a more organized structure.

Entropy Equation and Calculation

There are several entropy formulas:

Entropy of a Reversible Process

Calculating the entropy of a reversible process assumes that each microstate of the system is equally probable (which may not actually be the case). Given equal probability of outcomes, entropy equals Boltzmann’s constant (kB) multiplied by the natural logarithm of the number of possible microstates (W):

S = kB ln W
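As a numerical sketch of this formula, the short Python snippet below evaluates S = kB ln W for a few illustrative microstate counts; the value of kB is the CODATA constant, and the microstate counts are made up for demonstration.

```python
import math

# Boltzmann's constant in joules per kelvin (CODATA value)
K_B = 1.380649e-23

def boltzmann_entropy(num_states: int) -> float:
    """Return S = kB * ln(W) for W equally probable microstates."""
    if num_states < 1:
        raise ValueError("number of microstates must be at least 1")
    return K_B * math.log(num_states)

# A single microstate (perfect order) has zero entropy.
print(boltzmann_entropy(1))  # 0.0
# Doubling the number of microstates adds kB * ln(2), roughly 9.57e-24 J/K.
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

Note that because kB is so small, entropy values computed per microstate count are tiny; macroscopic entropies arise from astronomically large values of W.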

Entropy of an Isothermal Process

For an isothermal process, the change in entropy (ΔS) equals the change in heat (ΔQ) divided by the absolute temperature (T):

ΔS = ΔQ / T

Applying calculus, the change in entropy is the integral of dQ/T from the initial state to the final state, where Q is heat and T is the absolute (Kelvin) temperature of the system.
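Melting ice is a convenient isothermal example, since the temperature stays at 273.15 K while heat flows in, so ΔS = ΔQ/T applies directly. The sketch below uses a standard textbook figure for the latent heat of fusion of water; the function name is illustrative.

```python
# Entropy change for melting ice at its melting point (isothermal process).
LATENT_HEAT_FUSION = 334_000.0  # J/kg, approximate latent heat of fusion of water
T_MELT = 273.15                 # K, melting point of ice at 1 atm

def entropy_of_melting(mass_kg: float) -> float:
    """Return delta-S = delta-Q / T for melting a given mass of ice."""
    heat_absorbed = mass_kg * LATENT_HEAT_FUSION  # delta-Q in joules
    return heat_absorbed / T_MELT                 # entropy change in J/K

# Melting 1 kg of ice increases its entropy by roughly 1.2 kJ/K.
print(round(entropy_of_melting(1.0), 1))
```

The positive sign of the result matches the phase-change discussion above: melting moves water from an ordered lattice to a disordered liquid, so its entropy increases.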

Entropy and Internal Energy

In physical chemistry and thermodynamics, one useful entropy formula relates entropy to the internal energy (U) of a system:

dU = T dS − p dV

Here, the change in internal energy dU equals the absolute temperature T multiplied by the change in entropy dS, minus the external pressure p multiplied by the change in volume dV.
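A quick numerical sketch, with made-up state changes, illustrates the sign conventions in this relation: the T dS term raises the internal energy when entropy increases, while expansion against external pressure lowers it.

```python
def internal_energy_change(temp_k: float, d_entropy: float,
                           pressure_pa: float, d_volume: float) -> float:
    """Return dU = T*dS - p*dV for small (differential-style) changes."""
    return temp_k * d_entropy - pressure_pa * d_volume

# Illustrative values only: 300 K, a small entropy gain of 0.01 J/K,
# atmospheric pressure, and a slight expansion of 1e-5 m^3.
dU = internal_energy_change(300.0, 0.01, 101_325.0, 1e-5)
print(dU)  # roughly 3.0 - 1.0 = about 1.99 J
```

Here the heating term (T dS ≈ 3 J) dominates the expansion work (p dV ≈ 1 J), so the internal energy rises overall.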

Entropy and the Second Law of Thermodynamics

The second law of thermodynamics states that the total entropy of an isolated system cannot decrease. For example, a scattered pile of papers never spontaneously orders itself into a neat stack. The heat, gases, and ash of a campfire never spontaneously re-assemble into wood.

However, the entropy of one system can decrease by raising the entropy of another system. For example, freezing liquid water into ice decreases the entropy of the water, but the entropy of the surroundings increases because the phase change releases energy as heat. There is no violation of the second law of thermodynamics because the water is not an isolated system. When the entropy of the system being studied decreases, the entropy of the environment increases.

Entropy and Time

Physicists and cosmologists often call entropy “the arrow of time” because matter in isolated systems tends to move from order to disorder. When you look at the universe as a whole, its entropy increases. Over time, ordered systems become more disordered and energy changes forms, ultimately getting lost as heat.

Entropy and Heat Death of the Universe

Some scientists predict the entropy of the universe eventually increases to the point where useful work becomes impossible. When only evenly distributed thermal energy remains, the universe reaches heat death. However, other scientists dispute the heat death theory. An alternative theory views the universe as part of a larger system.
