Entropy 2

A bit more on Entropy.

We observed in the first entropy post that, for a spontaneous (i.e. natural, non-coerced) thermodynamic process, the net change in entropy is always positive.

In the example we cited, this was because the entropy decrease caused in a warm body by the withdrawal of heat was less than the entropy increase caused in a cool body absorbing the same quantity of heat.

Unfortunately, I don’t think these words convey an intuitive sense of what entropy is to most beginning engineers.  Clearly temperature somehow relates to entropy, but in what way? It’s kind of a head scratcher.

While I have seen many attempts at plain-English analogies for entropy, I have not personally found one that really worked for me, so I am offering my own analogy here.  I imagine this analogy has been used elsewhere, though I have not run across it.  Also, I am working without a net, so there may be some conceptual errors that I am not seeing.  Still, even if imperfect, it promotes thinking about entropy.

Let’s ignore work energy for now, and recall the expression for internal energy and heat flow from our last post:

dU = dQ = T * dS

And if heat is transferred exclusively between two bodies at different temperatures, the law of conservation of energy dictates that (working with the magnitudes of the heat and entropy changes):

T(1) * dS(1) = T(2) * dS(2)

If T(1) is greater than T(2), we see that dS(1) must be smaller than dS(2) for the equality to hold.  The warm body loses less entropy than the cool body gains, so the net change in entropy must be greater than zero as the system tends toward thermodynamic equilibrium.
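
To make this concrete, here is a quick numeric sketch (the temperatures and heat quantity are made-up values; each body’s entropy change is computed as dQ/T with the appropriate sign):

```python
# Net entropy change when a small quantity of heat dQ moves from a
# warm body at T1 to a cool body at T2 (temperatures in kelvin,
# assumed constant for this tiny transfer).
def net_entropy_change(dQ, T1, T2):
    dS_warm = -dQ / T1   # warm body loses heat: its entropy decreases
    dS_cool = +dQ / T2   # cool body gains heat: its entropy increases
    return dS_warm + dS_cool

# Example: 100 J flows from a body at 400 K to one at 300 K.
print(net_entropy_change(100.0, 400.0, 300.0))  # positive
```

Whenever T1 is greater than T2, the cool body’s entropy gain outweighs the warm body’s loss, so the result is always positive.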


Here is our analogy.

Imagine a country of altruistic and rational people.

Imagine that the country has a fixed amount of money – money is never created or destroyed.

Imagine that people in this country initially tend to have either lots of money, or not much money.

The people who have lots of money are “Wealthy”, those without much money are “Less Wealthy”.


As I said, our imaginary citizens are altruistic and rational.

As citizens interact and become friends, the Wealthy citizens become inclined to give some of their money to their less well off friends.

The wealthiest (those with the “densest” bank accounts) can afford to be the most generous.

Now, to the Wealthy citizens, giving away some money will diminish their wealth slightly, but it improves the Wealth of their less well off companions much more noticeably.  Thus, the overall Wealth of society increases when Wealthy citizens share their wealth with the Less Wealthy.  Perhaps it prevents children from going hungry in Less Wealthy families, for instance.

In addition, our citizens are rational, so Less Wealthy citizens do not give money to their Wealthier friends.  It would make no financial sense, and it diminishes the overall Wealth of society at large since it beggars the Less Wealthy at the expense of the already Wealthy.  So overall, the Wealth of society tends to increase in all “natural” transactions.

Observe that as these altruistic citizens continue to exchange money, they will eventually reach a point where everyone will have the same amount of wealth.  Transactions between the citizens will no longer occur, and the wealth of the average citizen will have been maximized.
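
A toy simulation (the balances are made-up values) shows the same endpoint: money flows from richer to poorer, the money supply never changes, and activity stops once everyone is equal:

```python
# Toy simulation of the analogy: each round, the richest citizen
# gives a dollar to the poorest, until all balances are equal.
# The money supply is conserved throughout; transfers then cease.
def equalize(balances):
    balances = list(balances)
    transfers = 0
    while max(balances) - min(balances) > 1:
        rich = balances.index(max(balances))
        poor = balances.index(min(balances))
        balances[rich] -= 1   # the Wealthy citizen gives a dollar
        balances[poor] += 1   # the Less Wealthy citizen receives it
        transfers += 1
    return balances, transfers

final, n = equalize([100, 10, 40, 50])
print(final, sum(final))  # [50, 50, 50, 50] 200 -- total money unchanged
```

This is the money-world version of a system reaching thermal equilibrium: once the “temperatures” (bank accounts) are equal, no further “heat flow” (giving) occurs.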

Note that when a Wealthy citizen shares their wealth with a Less Wealthy person, the amount of money in the society remains unchanged even though societal wealth can increase.


As is hopefully obvious from the above, the analogy components are:

  • Money = Energy
  • Bank Account = Temperature
  • Giving Money = Heat Flow
  • Wealth = Entropy

If this analogy is valid, what it suggests is that entropy somehow relates to the “wealth” of a substance, but it is different from the specific amount of energy in a substance (the number of dollars in the bank account.)

Can we think of entropy as the “wealth” of a substance, denoting a capability to give, whereas energy is like the exact amount of money?  I would say that, loosely speaking, yes we can.

We can notice something else.  Over time, the money will be evenly distributed, the sharing will cease and the wealth of the citizens will be maximized overall.  This is equivalent to our system reaching thermal equilibrium.  At this point, heat flow stops and entropy is at a maximum for the system.

Note too that as citizens give away money and their Bank Accounts become more similar, the effects on society’s wealth become smaller.  For instance, when a citizen with $1,000,001 gives $1 to a citizen with $999,999, the change in wealth to either one and to society is vanishingly small.  So too, when the temperature differences in a thermodynamic system are very small, so is the creation of entropy.  If this difference could be made zero, the entropy change would be zero and the process would be reversible (see last post.)
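
We can put a number on that.  If, by analogy with dS = dQ/T, a one-dollar transfer changes each party’s “wealth” by one dollar divided by their bank account, then (using the made-up balances above):

```python
# "Wealth" change from a $1 transfer, by analogy with dS = dQ/T:
# the giver loses 1/(giver's balance), the receiver gains
# 1/(receiver's balance).
def wealth_change(giver_balance, receiver_balance, amount=1.0):
    return amount / receiver_balance - amount / giver_balance

# Large gap between accounts: a noticeable gain in societal wealth.
print(wealth_change(1_000_000, 100))      # roughly 0.01
# Near-equal accounts: the gain is vanishingly small.
print(wealth_change(1_000_001, 999_999))  # roughly 2e-12
```

Both transfers increase societal wealth, but the near-equal one barely does, just as entropy creation vanishes as temperature differences shrink toward zero.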


As a physical example, if you combine a pot of cold water and a pot of hot water, the net entropy will increase as the temperatures tend toward each other.  But it will be at a maximum only when the combined water reaches a fixed, stable equilibrium temperature and no further heat flow occurs.
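
Putting made-up numbers on the two pots, and using the standard constant-specific-heat result dS = m * c * ln(T_final/T_initial) for each one:

```python
import math

# Entropy change of mixing a hot and a cold pot of water, using
# dS = m * c * ln(T_final / T_initial) for each pot (constant
# specific heat assumed; temperatures in kelvin, masses in kg).
def mixing_entropy(m1, T1, m2, T2, c=4.186):  # c of water, kJ/(kg*K)
    Tf = (m1 * T1 + m2 * T2) / (m1 + m2)      # equilibrium temperature
    dS_hot = m1 * c * math.log(Tf / T1)       # hot pot: negative
    dS_cold = m2 * c * math.log(Tf / T2)      # cold pot: positive
    return Tf, dS_hot + dS_cold

Tf, dS = mixing_entropy(1.0, 360.0, 1.0, 280.0)
print(Tf, dS)  # 320.0 K, and a small positive net entropy change
```

The hot pot’s entropy falls and the cold pot’s rises, but the logarithms guarantee the gain always wins, so the net change is positive.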


A few closing comments.

An excellent, though somewhat more complicated (at least to me), analogy is provided by Van Ness in his great book Understanding Thermodynamics.  Worth a look see.

It is worth mentioning the units we are working with.

The internal energy of a substance is reported as energy per unit mass.  Examples include kilojoules/kilogram (kJ/kg) and Btu/pound-mass (Btu/lbm).

As you might expect (since dQ = T * dS), entropy is reported as energy per unit mass per degree.  Examples include kilojoules/kilogram-kelvin (kJ/(kg·K)) and Btu/pound-mass-rankine (Btu/(lbm·°R)).
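
For what it’s worth, the two unit systems convert with the same factor as specific heat, since both carry units of energy per mass per degree.  A small sketch (the steam value is approximate):

```python
# Convert specific entropy between Btu/(lbm*R) and kJ/(kg*K).
# 1 Btu/(lbm*R) = 4.1868 kJ/(kg*K) -- the same factor as for
# specific heat, since both are energy/(mass*degree).
BTU_PER_LBM_R_TO_KJ_PER_KG_K = 4.1868

def to_si(s_english):
    return s_english * BTU_PER_LBM_R_TO_KJ_PER_KG_K

# Specific entropy of saturated steam at 1 atm is about 1.7566 Btu/(lbm*R):
print(to_si(1.7566))  # about 7.35 kJ/(kg*K)
```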


Entropy 1

I have hesitated to post about entropy for a few reasons.

First, there are ubiquitous “standard” descriptions out there already (both technical and general), so writing about entropy seems redundant and likely to only reveal the inadequacy of my own understanding of it.

Also, entropy is not usually invoked in the day-to-day work of most people working with renewables and/or energy conservation, and they’re kind of my target audience (at least in theory – I don’t really have an audience.)

Lastly, entropy is a tricky concept, and I don’t want to say anything too dumb.

But I guess I’m dumb enough to think my approach might provide a simplified overview that could be helpful to some folks.

There’s a bit of a long wind up to this one, but eventually I’ll get to the point…


You may recall that energy manifests itself as either work or heat.   More properly, energy in transition manifests itself as work or heat.  Formally, we can write this for a substance as:

Change in Internal Energy = Work done (to or by) the system + heat (added to or taken from) the system


Delta-U = Delta-W + Delta-Q

or for more brevity

dU = dW + dQ

Where U is the internal energy of the substance, W is the work, and Q is the heat.  Two technical asides:  The equation is most often written with the work term being negative, meaning that work done by the system reduces the internal energy.  Also, this equation assumes a contained, non-flow system.  If there is flow work, internal energy is replaced with enthalpy.

Work is formally defined as the application of a force through a distance.  Thus, as expected, one unit of work is the newton-meter (the joule.)  Another is the foot-pound.  Confusingly, the pound-foot is a common unit of torque (twisting.)

Let’s call small changes “differential” changes and designate them with a lower case “d”.  For instance, a differential change in volume would be designated dV.

You will note that pressure times area equals force.  This allows us to represent differential work performed by a substance as shown below.  Here, W is work, p is pressure, A is the area under pressure, x is the “distance” the surface of the substance moved, and dV is the change in volume, equal to the area times the displacement x (yeah, I know this is a little sloppy, but the idea is there.  The standard texts describe this idea more gracefully.)

dW = p * A * dx  =  p * (A * dx) = p * dV.

The bottom line is, pressure times change in volume equals work.  Which also means, no change in volume, no work.
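
A minimal sketch with made-up numbers (SI units, so pressure in pascals and volume change in cubic meters gives work in joules):

```python
# Boundary work dW = p * dV for a gas expanding at constant pressure.
# Pressure in pascals (N/m^2), volume change in m^3, work in joules.
def boundary_work(pressure, dV):
    return pressure * dV

# A gas at 101,325 Pa (1 atm) expands by 0.01 m^3:
print(boundary_work(101_325, 0.01))  # about 1013 J
# No change in volume, no work:
print(boundary_work(101_325, 0.0))   # 0.0
```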

Let us now consider the heat term.  You may recall that adding or removing heat from a substance will result in a temperature change, and that the specific heat of the material provides a relationship between them.   (It is worth noting that if this is true, then temperature must represent the internal energy of a substance somehow.  And this is correct.)  However, heat flow dQ can also be expressed as the product of temperature T and change in a property called entropy S as follows:

dQ = T * dS

You will note the similarity to

dW = p * dV

A big difference between these little equations is that we can visualize a change in volume resulting in work.  For example, we can imagine water freezing behind masonry and damaging the masonry because of the tremendous pressure generated by the small increase in volume that takes place when water freezes.  That’s serious work.

But what the heck does a changing entropy mean?  Although I think this is an unorthodox suggestion, one can think of entropy as an invisible property akin to volume, that changes in correspondence with heat flow into or out of a substance.  The more heat that is added or removed, the “bigger” or “smaller” our substance’s entropy becomes, like the heat was pushed in or squeezed out of it.

Let’s go back to our original equation but substituting our new terms for work and heat:

dU = p * dV + T * dS

If we assume no work is done (i.e. volume is held fixed), the first term vanishes, and the relationship between a substance’s internal energy, temperature and entropy is found with

dU = T * dS

Hmm.  Now the first thing a careful reader might ask is, how is this possible?  We know that when heat is added or removed from a substance, the temperature changes in accordance with the specific heat.  What gives?  Well, the trick here is that the change is assumed to be so tiny, so slow, so incremental, that temperature is presumed to be unchanged for the tiny changes in entropy.  Calculus actually resolves this computationally, but we don’t want to get that deep into things.  So just assume tiny, tiny differentials and go with it.  Imagine, for instance, a teeny-tiny bit of heat being removed from a bowling ball.  The temperature of the bowling ball would not change appreciably.
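
The bowling ball picture in (made-up) numbers: for a tiny dQ at essentially constant T, the entropy change is just dQ/T.

```python
# For a tiny heat transfer at (essentially) constant temperature,
# the entropy change is simply dS = dQ / T.
def entropy_change(dQ, T):
    return dQ / T

# Remove 0.1 J of heat from a bowling ball at 293 K (roughly room
# temperature); its temperature barely budges, but its entropy drops.
dS = entropy_change(-0.1, 293.0)
print(dS)  # small and negative: entropy decreases slightly
```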

The second thing you might observe is that as entropy increases or decreases, so too does the internal energy of the material.  But we recall that energy cannot be created or destroyed per the law of conservation of energy, so if the internal energy of a system changes, that internal energy must be accounted for.  An example is the easiest way to see how this works.

Consider two blocks of material at different temperatures.  We put them in contact.  We know that heat will flow from the warmer block to the colder block.  If we ignore any expansion or contraction of the blocks due to changes in their temperature resulting from the heat flow (which eliminates the p * dV work term), then the law of conservation of energy provides us with:

T(1) * dS(1) = T(2) * dS(2)

Okay, energy has been conserved as evidenced by the equal sign (remember, T * dS is heat) with the heat leaving block 1 going into block 2.  But what’s going on with the entropy?

Well, we know from experience that heat always flows from a warm body to a cold body, never the reverse.  And so, for example, we expect a hot cup of coffee to always cool over time as it sits on our desk.  Never does a cup of coffee do the reverse, spontaneously drawing heat from the surroundings and heating itself up.

Now let’s look at our equation.  Assume the temperature of block 1 was higher than block 2.  If we rearrange that equation, we can learn something about the entropy changes:

T(1)/T(2) * dS(1) = dS(2)

Remember, T(1) was warmer than T(2), so the ratio T(1)/T(2) is greater than 1.  For the equation to hold, dS(1) must therefore be smaller than dS(2) in magnitude.

Now note that block 1 gave up heat (negative heat flow), meaning that dS(1) is actually negative, while block 2 absorbed that heat (positive heat flow), meaning that dS(2) is positive.  The sum of these entropy changes gives the total entropy change for the process.  Since dS(2) is necessarily larger in magnitude than dS(1), and since dS(1) is negative, we find in all cases:

dS(1)  +  dS(2)  >  0

Yep, when heat flows from hot to cold, the net entropy of the system always increases in a spontaneous process.

It’s worth noting here that if T(1) were indistinguishable from T(2), the net entropy gain would approach zero.  Such a process is known as “reversible” and represents the limit of energy’s utility to us.

By the way, we know we can re-heat things that have cooled, like our cup of coffee.  But it is a fact that if you look at the entropy decrease of the heating element versus the entropy increase of the coffee, the net result is always greater than zero.

We can now circle back to something.  I mentioned that you could think about entropy as being akin to volume in that it is a real property of matter.  But now we see that what it is really associated with is the “quality” of a substance’s internal energy.  By quality, we mean the ability the substance has to transfer heat (and work, which is heat’s equivalent) to cooler surroundings.

Think of it this way.  Gasoline contains very high quality energy.  When we burn gasoline and utilize that energy, it manifests itself as motion (kinetic energy) and heat (from the tail pipe, and through the heating of the air and road via frictional forces.)  No energy has been created or destroyed, but the energy that has transferred to the atmosphere is far less capable of being put to good use.  Some of the tailpipe heat might be captured instead as heat for the cabin, but most of the left over heat is not useful.   And once that cabin heat dissipates, the useful energy from the gasoline has pretty much been extinguished.  Energy has been conserved, but the quality of the energy has been degraded.  And we measure this degradation as an increase in entropy.

In a real sense, then, energy conservation might be better called entropy conservation.

Notice, though, that in one way entropy is rather like volume.  It can change.  It is not conserved.

You may have noticed something else about entropy.  Since it must always increase in real, spontaneous processes, then it tells us what is possible in the real world.  If someone describes a spontaneous (“natural”) process that results in a net decrease of total entropy, it is impossible.  They would be describing a perpetual motion machine.

Lastly, while we are speaking in generalities, entropy is a real, measurable property of matter, just like internal energy and energy’s close cousin enthalpy.

There are many other ways to think about entropy, some of which I hope to think about in a future post.

If you made it this far and have any corrections or suggestions to make, by all means leave me a note.