Did you ever notice that it can take you hours to clean your room, organize everything and put everything in its place, yet it seems like it takes no effort and no time at all for it to fall back into disarray?
That's entropy, the tendency of the universe to move toward a state of disorder, and it's a very real thing, a driver of much that occurs around and within us.
Entropy plays a crucial role in determining which processes are spontaneous (like spreading or burning) and which are not (like condensing or freezing).
We've seen that the amount of heat absorbed or emitted by a process is not a good determiner of spontaneity. For example, if we add sodium hydroxide (NaOH) to water and mix, the solid dissolves on its own (spontaneously) and the temperature of the liquid rises significantly.
On the other hand, if we dissolve a quantity of urea (H2NCONH2), the solid still dissolves (urea is quite soluble in water), but the temperature of the liquid drops so much that often water from the air condenses on the outside of the cold beaker.
The key to being able to predict the spontaneity of a process like a chemical reaction is the combination of heat loss or gain (that's enthalpy if we're working at constant pressure) and entropy.
In chemistry, spontaneous means that a process can occur without any apparent input of energy from the surroundings.
Sometimes a process needs a nudge to get going, such as when we ignite a fire, but afterward, a spontaneous process maintains itself without any further input.
Entropy is a measure of the disorder or randomness of a system. The symbol S is used for entropy, and we usually are most concerned with entropy changes, ∆S = Sfinal − Sinitial. The absolute entropy of a substance at some temperature is designated by S°.
Measuring absolute entropy is difficult (how do you put a number on disorder?) and we'll tackle that below. We can, however, work with the change in entropy, ∆S.
Here are a couple of examples of entropy changes in chemical reactions. First consider a synthesis reaction,
In this reaction, three moles of reactant "particles" (H2 and O2) are converted into two moles of product (H2O). In running this reaction, we've imposed more order on these molecules: The products are more ordered than the reactants, so their entropy is lower, and ∆S = Sfinal − Sinitial < 0. This reaction is actually known to be spontaneous and quite exothermic (it's the reaction that lifts some rockets into space), so there is something about negative ∆S and negative ∆H that might predict that this reaction is spontaneous.
This reaction converts one mole of a substance into three moles of two different compounds. The entropy of this system has clearly increased, so ∆S = Sfinal − Sinitial > 0. Such decomposition reactions can be either endothermic or exothermic. It will be the combination of ∆H and ∆S that will be our ultimate predictor of spontaneity.
Phase changes are another instance where entropy changes are usually obvious. When water ice melts, a very well-ordered tetrahedral lattice of water molecules, each sharing four hydrogen bonds, is disrupted.
While the water molecules in liquid water are spaced just about as far apart as in ice (they're actually a little closer), they are relatively free to rotate, and those bonds are transient – breaking and reforming frequently. There's more chaos in water, and that's higher entropy.
We know that if we leave an ice cube out on the table at room temperature, it will spontaneously melt, and that the liquid water that results is more disordered than the crystalline solid. And it takes a lot of energy to run a freezer to remove heat from water to restore it to its organized, crystallized form.
It's similar for boiling. Compared to liquids, the spacing between molecules in a gas is far larger. For example, in liquid water the average O–O spacing is about 3 Å (3 × 10⁻¹⁰ m), whereas in the gas at room temperature, it's more like 3 × 10⁻⁷ m, a factor of about 1000 larger.
Spreading out is a way to generate disorder, and the large distance between water molecules in the gas phase means that intermolecular forces can't really produce any meaningful alignment or organization.
As a general rule, entropy increases as a substance moves from solid to liquid to gas. In chemical processes, entropy also increases when we move from fewer moles of particles to more moles. Think about a decomposition reaction, for example:
In this reaction, part of what was solid calcium carbonate is entering the gas phase and one mole of CaCO3 turns into two moles of products. Entropy increases in this system.
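The mole-counting rule of thumb can be sketched as a tiny helper function (the name delta_s_sign and the example stoichiometry are just illustrative):

```python
# Rough sign prediction for the entropy change of a reaction,
# based only on the change in moles of gas (a rule of thumb, not a law).

def delta_s_sign(gas_moles_reactants, gas_moles_products):
    """Return '+', '-', or '?' for the likely sign of the entropy change."""
    dn = gas_moles_products - gas_moles_reactants
    if dn > 0:
        return '+'
    if dn < 0:
        return '-'
    return '?'  # equal moles of gas: too subtle to call by inspection

# CaCO3(s) -> CaO(s) + CO2(g): zero moles of gas become one mole of gas
print(delta_s_sign(0, 1))  # prints '+': entropy increases
```

When the moles of gas are equal on both sides, only tabulated entropy values can settle the sign, as we'll see below.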
Transient means impermanent or lasting only for a short time.
A system that is endothermic absorbs heat from the surroundings. The melting of ice is endothermic: Heat flows from the surroundings into the ice.
A system that is exothermic releases heat to the surroundings. For example, when sodium hydroxide (NaOH) is dissolved in water, the solution heats up.
These processes lead predictably to an increase in entropy of a system:
Predict whether the process shown leads to an increase in entropy (∆S > 0) or a decrease (∆S < 0).
The second law of thermodynamics says simply that the entropy of any isolated system increases in any spontaneous process. Often we note that the universe itself is an isolated system, one in which nothing enters or leaves – it's all included – and we say that the entropy of the universe always increases.
Now this might be a little troubling because we know we can do things like make ice cubes, clearly a process in which order increases, therefore entropy decreases. What we have to remember is that we must be careful about defining what the "system" is. In fact, a freezer has to expend a great deal of energy extracting heat from water to turn it to ice, and that heat, plus the heat of running a compressor pump, is given off to the surroundings, heating them up and therefore increasing the entropy of the surroundings.
But it goes further. The second law says that the entropy of the universe increases, always. So it's not a zero-sum game: We don't trade an amount of entropy decrease in the water-ice system for an equal amount of entropy increase in the surroundings; the increase in the surroundings is always larger.
People still sometimes try to sell things that rely on the concept of "perpetual motion." Here's an example of such a device, a motor attached to and driving a generator:
It seems like this should work. If the generator is capable of generating enough power to turn the motor, which is capable of turning the generator at sufficient speed to generate the power it needs, ... and so on, it should work. But it never has and never will.
It's just not possible to avoid losing some of the generated power to heat from friction, heat from electric current flowing through wires, sound, and other losses. It wouldn't matter if you replaced the ball bearings with magnetic levitation or replaced the wires with superconducting wires. The second law would still hold, and our device would eventually just wind down and stop. The best we can do is hold that off as long as possible.
The entropy of an isolated system (or the universe) always increases.
Sometimes the first two laws of thermodynamics are expressed, jokingly, like this:
1. You can't win.
2. You can't even break even.
The first law says that you can't make energy out of nothing, and the second says that you always lose some to the fact that energy "spreads out" (entropy).
What we haven't done yet is to define some scale on which to place entropy. What are the units and where is the zero? (or is there a zero?) That's easy for enthalpy where we can measure the heat given off or taken in by a reaction using instruments like a calorimeter. But how do we measure disorder?
We'll begin with the third and final law of thermodynamics: The entropy of a substance at absolute zero temperature (0 K = -273.15°C) is zero. At absolute zero, all atomic motion stops, and we therefore can't have any disorder because it would be possible to know where everything is and how it's arranged, and it wouldn't change without addition of energy.
Now absolute zero is a theoretical thing that we can't ever actually achieve, mostly because it violates the uncertainty principle of quantum mechanics.
Modern experiments have achieved a temperature of −273.144°C, only 0.006°C away from absolute zero, and the record will probably be pushed lower, but as far as we know, it can't actually get to 0 K. OK?
Absolute entropies of substances have been calculated based on the third law, and they've been tabulated as standard molar entropies, S°, usually at a particular temperature. To obtain the energy related to that entropy, we simply multiply by the temperature, as we'll see later.
I won't go into the details of how absolute entropies are calculated here; I'll save that for a later section. But we can use absolute entropies to calculate ∆S, the entropy change of a chemical process in the same manner as we used standard enthalpies of formation to calculate the enthalpy change of a reaction. That was called thermochemistry.
The entropy of a substance at the absolute zero of temperature (0 K = −273.15°C) is zero. Entropy measured up from that zero point is denoted S° and called the absolute entropy.
The standard molar entropies can be looked up in a table. Here they are for these reactants and products.
Note that it's important to get the phase right. The standard molar entropy for liquid water is 69.9 J/mol·K – lower, as we'd expect for a liquid.
Just as in the thermochemistry of enthalpies of formation, we use standard molar entropies to calculate the overall entropy change:
The result is
It's worth checking to make sure that makes sense. In this reaction, where three total moles of gas are combined to make two moles of gas, we'd expect the overall entropy to decrease, and thus to have ∆S = Sfinal - Sinitial < 0.
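The reaction isn't reproduced in this text, but the description (three moles of gas combining to make two, with gaseous water as product) is consistent with 2 H2(g) + O2(g) → 2 H2O(g). Here's a minimal sketch of the sum under that assumption, using typical tabulated S° values in J/(mol·K) at 298 K:

```python
# ΔS(rxn) = Σ n·S°(products) − Σ n·S°(reactants), mirroring the
# Hess's-law-style bookkeeping used for enthalpies of formation.
# S° values below are typical textbook numbers (check a real table).
S = {"H2(g)": 130.7, "O2(g)": 205.2, "H2O(g)": 188.8}

dS = 2 * S["H2O(g)"] - (2 * S["H2(g)"] + S["O2(g)"])
print(round(dS, 1))  # -89.0 J/K: three moles of gas -> two moles, so ΔS < 0
```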
We can look up the standard molar entropies, and they are:
This is also a gas-phase reaction. It's one of the exercises in the gray boxes above, the one that was too subtle to determine a sign for the entropy change.
Getting quantitative is the way to go here. The entropy change is
Plugging in our standard molar entropies gives
So in this very subtle case, the entropy of the reaction mix increases as reactants are turned to products. It's not much, compared to many other reactions, but we can say with confidence that this reaction is "favored by entropy."
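The "subtle" reaction itself isn't shown in this text; as a hypothetical stand-in with equal moles of gas on both sides, consider N2(g) + O2(g) → 2 NO(g), where mole counting can't determine the sign and only the tabulated values can:

```python
# Equal moles of gas on each side, so the sign of ΔS must be computed.
# S° values in J/(mol·K) at 298 K are typical tabulated numbers.
S = {"N2(g)": 191.6, "O2(g)": 205.2, "NO(g)": 210.8}

dS = 2 * S["NO(g)"] - (S["N2(g)"] + S["O2(g)"])
print(round(dS, 1))  # 24.8 J/K: a small positive change, favored by entropy
```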
Looking up the standard molar entropies, we get
Now the entropy change is easily calculated.
This one is not surprising. Four moles of reactants, one a liquid, are transformed into five moles of gas. The entropy should increase based on the liquid → gas transition and the fact that the number of moles of particles increases.
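The reaction isn't reproduced here; a hypothetical example matching the description (four moles of reactants, one a liquid, becoming five moles of gas) is the combustion of liquid ethanol, C2H5OH(l) + 3 O2(g) → 2 CO2(g) + 3 H2O(g). With typical tabulated S° values:

```python
# Liquid -> gas plus an increase in moles: expect a large positive ΔS.
# S° values in J/(mol·K) at 298 K are typical tabulated numbers.
S = {"C2H5OH(l)": 160.7, "O2(g)": 205.2, "CO2(g)": 213.8, "H2O(g)": 188.8}

dS = (2 * S["CO2(g)"] + 3 * S["H2O(g)"]) - (S["C2H5OH(l)"] + 3 * S["O2(g)"])
print(round(dS, 1))  # 217.7 J/K: large and positive, as predicted
```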
On the surface, dissolving a solute in a solvent should be "entropically favored," meaning that the entropy change of dissolving is positive. Let's dig a little deeper.
The entropy of mixing might be best viewed in this way. Here's a beaker with a removable partition. There are ions A+ and B- on the left and ions C+ and D- on the right.
When the barrier is removed, ions A+ and B- have double the solution volume to spread into, so it's reasonable to conclude that their entropy increases.
The same thing happens for ions C+ and D-.
In general, we expect the entropy of mixing, whether mixing two solutions or mixing a solid solute in a solvent, to be positive.
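If we assume the dilute solution behaves ideally, the volume-doubling picture can be made quantitative by analogy with an ideal-gas expansion, ΔS = nR ln(V2/V1) per species (the one-mole amounts here are illustrative):

```python
import math

R = 8.314  # gas constant, J/(mol·K)

def dS_expansion(n_moles, volume_ratio):
    """Entropy gain when n moles spread into volume_ratio times the volume."""
    return n_moles * R * math.log(volume_ratio)

# One mole each of A+, B-, C+ and D-, each doubling its available volume
total = 4 * dS_expansion(1.0, 2.0)
print(round(total, 1))  # 23.1 J/K: mixing increases entropy
```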
What's interesting is that some spontaneous mixing processes are exothermic and some are endothermic. We're right back to where we started at the beginning of this section. In the next section, we'll combine enthalpy and entropy to form a new state function that will finally be predictive about the spontaneity of reactions.
Which of the following physical processes would lead to an increase in entropy?
Calculate the entropy change, ∆S, for each of these reactions. See if you can predict ahead of your calculation whether the change will be positive or negative.
xaktly.com by Dr. Jeff Cruzan is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License. © 2016, Jeff Cruzan. All text and images on this website not specifically attributed to another source were created by me and I reserve all rights as to their use. Any opinions expressed on this website are entirely mine, and do not necessarily reflect the views of any of my employers. Please feel free to send any questions or comments to firstname.lastname@example.org.