That is an everyday observation: we can cool objects in a refrigerator, which involves transferring heat from them and depositing it in the warmer surroundings, but to do so we have to do work. The refrigerator must be driven by connecting it to a power supply, and the ultimate change elsewhere in the surroundings that drives the refrigeration is the combustion of fuel in a power station that may be far away.
The Kelvin and the Clausius statements are both summaries of observations. No one has ever built a working heat engine without a cold sink, although they might not have realized one was present. Nor have they observed a cool object spontaneously becoming hotter than its surroundings.
- The Laws of Thermodynamics: A Very Short Introduction - Peter Atkins - Google Books.
As such, their statements are indeed laws of Nature in the sense in which I am using the term: summaries of exhaustive observations. But are there two second laws? First, imagine coupling two engines together (Figure). The two engines share the same hot source. Engine A has no cold sink, but engine B does. We use the work produced by engine A to drive engine B in reverse, as a refrigerator: that work drives the transfer of heat from the cold sink of engine B into the shared hot source. The net effect is the restoration of the energy to the hot source in addition to whatever engine B transferred out of its cold sink. In other words, heat has in effect passed from a cooler body to a hotter one with no other change, contrary to the Clausius statement.
The answer is that the two statements are logically equivalent, and I shall now demonstrate both sides of this equivalence. (The figure illustrating the equivalence of the Kelvin and Clausius statements depicts, on the left, the fact that the failure of the Kelvin statement implies the failure of the Clausius statement and, on the right, the fact that the failure of the Clausius statement implies the failure of the Kelvin statement.) For the second half of the demonstration, we build an engine with a hot source and a cold sink, and run the engine to produce work. In the process we discard some heat into the cold sink.
The heat absorbed by the engine can also, in principle at least, be measured by measuring the fall in a weight. Thus, as we saw in Chapter 2, the transfer of energy as heat can be measured by observing how much work must be done to achieve a given change of state in an adiabatic container, then measuring the work that must be done to achieve the same change in a diathermic container, and identifying the difference of the two amounts of work as the heat transaction in the second process.
We use this standard system as the hot source in the engine. Thus, at atmospheric pressure, water is found to freeze at 273 K. The latter name comes from expressing temperature in terms of the properties of a perfect gas, a hypothetical gas in which there are no interactions between the molecules. In short, it is important not to stir up any turbulent regions of thermal motion.
It is inelegant, but of practical utility, to have alternative statements of the second law. To capture them in a single statement, we follow Clausius and introduce a new thermodynamic function, the entropy, S. For our initial encounter with the concept, we shall identify entropy with disorder: if matter and energy are distributed in a disordered way, as in a gas, then the entropy is high; if the energy and matter are stored in an ordered manner, as in a crystal, then the entropy is low.
A quiet library is the metaphor for a system at low temperature, with little disorderly thermal motion. A sneeze corresponds to the transfer of energy as heat. In a quiet library a sudden sneeze is highly disruptive: there is a big increase in disorder, a large increase in entropy. On the other hand, a busy street is a metaphor for a system at high temperature, with a lot of thermal motion.
Now the same sneeze will introduce relatively little additional disorder: there is only a small increase in entropy. Now we are ready to express the second law in terms of the entropy and to show that a single statement captures the Kelvin and Clausius statements. We begin by proposing the following as a statement of the second law: the entropy of the universe increases in the course of any spontaneous change. When heat leaves the hot source, there is a decrease in the entropy of the system.
We shall understand that point more fully later, when we turn to the molecular nature of entropy. There is no other change. Therefore, the overall change is a decrease in the entropy of the universe, which is contrary to the second law. It follows that an engine with no cold sink cannot produce work. To see that an engine with a cold sink can produce work, we think of an actual heat engine.
As before, there is a decrease in entropy when energy leaves the hot source as heat, and there is no change in entropy when some of that heat is converted into work. However, the key word here is universe: it means, as always in thermodynamics, the system together with its surroundings. There is no prohibition of the system or the surroundings individually undergoing a decrease in entropy, provided that there is a compensating change elsewhere.
On the right is shown the consequence of providing a cold sink and discarding some heat into it. The increase in entropy of the sink may outweigh the reduction of entropy of the source, and overall there is an increase in entropy. Such an engine is viable, provided we do not convert all the energy into work but discard some into the cold sink as heat.
There will now be an increase in the entropy of the cold sink, and provided its temperature is low enough—that is, it is a quiet enough library—even a small deposit of heat into the sink can result in an increase in its entropy that cancels the decrease in entropy of the hot source. Overall, therefore, there can be an increase in entropy of the universe, but only provided there is a cold sink in which to generate a positive contribution.
That is why the cold sink is the crucial part of a heat engine: entropy can be increased only if the sink is present, and the engine can produce work from heat only if overall the process is spontaneous. It is worse than useless to have to drive an engine to make it work! It turns out, as may be quite readily shown, that the fraction of energy withdrawn from the hot source that must be discarded into the cold sink, and which therefore is not available for converting into work, depends only on the temperatures of the source and sink.
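For an idealized (Carnot) engine, that temperature-only dependence takes a simple form: of each joule drawn from the hot source, the fraction T_cold/T_hot must be discarded into the cold sink, leaving the efficiency 1 - T_cold/T_hot for conversion into work. A small sketch in Python (the temperatures are invented for the example):

```python
# Ideal (Carnot) heat engine: of each joule drawn from the hot source,
# the fraction T_cold/T_hot must be discarded into the cold sink,
# leaving the efficiency 1 - T_cold/T_hot available as work.

def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of extracted heat convertible into work."""
    return 1.0 - t_cold / t_hot

# Illustrative temperatures in kelvins: a steam source and a cool river.
t_hot, t_cold = 500.0, 300.0
eta = carnot_efficiency(t_hot, t_cold)
discarded_fraction = t_cold / t_hot

print(f"efficiency = {eta:.2f}")                         # 0.40
print(f"fraction discarded = {discarded_fraction:.2f}")  # 0.60
```

Raising the source temperature (or lowering the sink temperature) shrinks the discarded fraction, which is the design moral drawn in the text.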
Thus, we see that the concept of entropy captures the two equivalent phenomenological statements of the second law and acts as the signpost of spontaneous change. The second law and entropy identify the spontaneous changes among the feasible changes: a feasible process is spontaneous only if the total entropy of the universe increases. It is of some interest that the concept of entropy greatly troubled the Victorians: they could understand the conservation of energy, for they could presume that at the Creation God had endowed the world with a fixed amount of it. Now consider the Clausius statement in terms of entropy.
If a certain quantity of energy leaves the cold object as heat, the entropy decreases. This is a large decrease, because the object is cold—it is a quiet library.
The same quantity of heat enters the hot object. The entropy increases, but because the temperature is higher—the object is a busy street—the resulting increase in entropy is small, and certainly smaller than the decrease in entropy of the cold object. What were they to make of entropy, though, which somehow seemed to increase ineluctably?
Where did this entropy spring from? Why was there not an exact, perfectly and eternally judged amount of the God-given stuff? To resolve these matters and to deepen our understanding of the concept, we need to turn to the molecular interpretation of entropy and its interpretation as a measure, in some sense, of disorder.

Images of disorder

With entropy as a measure of disorder in mind, the change in entropy accompanying a number of processes can be predicted quite simply, although the actual numerical change takes more effort to calculate than we need to display in this introduction.
For example, the isothermal (constant temperature) expansion of a gas distributes its molecules and their constant energy over a greater volume; the system is correspondingly less ordered in the sense that we have less chance of predicting successfully where a particular molecule and its energy will be found, and the entropy correspondingly increases. The central result is that as the walls of the box are moved apart, the energy levels fall and become less widely separated (Figure). At room temperature, billions of these energy levels are occupied by the molecules, the distribution of populations being given by the Boltzmann distribution characteristic of that temperature.
As the box expands, the Boltzmann distribution spreads over more energy levels and it becomes less probable that we can specify which energy level a molecule would come from if we made a blind selection of molecules. In terms of the figure: the increase in entropy of a collection of particles in an expanding box-like region arises from the fact that as the box expands, the allowed energies come closer together; provided the temperature remains the same, the Boltzmann distribution then spans more energy levels, so the chance of choosing a molecule from one particular level in a blind selection decreases. That is, the disorder and the entropy increase as the gas occupies a greater volume. A similar picture accounts for the change in entropy as the temperature of a gaseous sample is raised, and leads us to expect an increase in entropy with temperature. That increase can be understood in molecular terms, because as the temperature increases at constant volume, the Boltzmann distribution acquires a longer tail, corresponding to the occupation of a wider range of energy levels.
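Both trends — a smaller level spacing on expansion, and a longer tail on heating — can be illustrated with a toy Boltzmann distribution. A sketch in Python, assuming a ladder of evenly spaced levels and arbitrary units (the spacings and temperatures are invented for the illustration):

```python
import math

def entropy_per_particle(spacing, temperature, k=1.0, nlevels=2000):
    """Gibbs entropy -k * sum(p ln p) for evenly spaced levels E_i = i*spacing,
    with Boltzmann populations p_i proportional to exp(-E_i / kT)."""
    weights = [math.exp(-i * spacing / (k * temperature)) for i in range(nlevels)]
    z = sum(weights)                      # partition function (normalization)
    probs = [w / z for w in weights]
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# Expanding the box lowers the level spacing: the entropy increases.
s_small_box = entropy_per_particle(spacing=1.0, temperature=1.0)
s_big_box   = entropy_per_particle(spacing=0.5, temperature=1.0)

# Raising the temperature at fixed spacing: the entropy also increases.
s_hot = entropy_per_particle(spacing=1.0, temperature=2.0)

print(s_small_box < s_big_box, s_small_box < s_hot)   # True True
```

Note that only the ratio spacing/kT enters, which is why halving the spacing and doubling the temperature have the same effect in this toy model.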
Once again, the decrease in the probability that we can predict which energy level a molecule comes from in a blind selection corresponds to an increase in disorder and therefore to a higher entropy. At the absolute zero of temperature, only the single lowest level, the ground state, is occupied. That means that we can be absolutely certain that in a blind selection we will select a molecule from that single ground state: there is no uncertainty in the distribution of energy, and the entropy is zero. This way of counting arrangements is summarized in Boltzmann's expression for the entropy, S = k log W, where W is the number of arrangements of the molecules that correspond to the same total energy and k is Boltzmann's constant. This expression is much harder to implement than the classical thermodynamic expression, and really belongs to the domain of statistical thermodynamics, which is not the subject of this volume.
He deserves his constant even if we do not. There are various little wrinkles in the foregoing about which we now need to own up.

Degenerate solids

In some cases there is a discrepancy, and the thermodynamic entropy differs from the statistical entropy. The reasoning above presumed that only a single state of the system corresponds to the lowest energy; in some cases that is not true, and there may be many different states of the system corresponding to the lowest energy. We could say that the ground states of these systems are highly degenerate, and denote the number of states that correspond to that lowest energy as D.
I give a visualizable example in a moment. If there are D such states, then even at absolute zero we have only 1 chance in D of predicting which of these degenerate states a molecule will come from in a blind selection. Solid carbon monoxide provides one of the simplest examples of residual entropy. A carbon monoxide molecule, CO, has a highly uniform distribution of electric charge (technically, it has only a very tiny electric dipole moment), and there is little difference in energy whether, in the solid, a given molecule lies one way round or the other: the orientations of the molecules are almost random.
In other words, the ground state of a solid sample of carbon monoxide is highly degenerate: each of its N molecules can lie in either of two nearly equivalent orientations, so D = 2^N. Try calculating the value of D for even a milligram of solid. The value of the residual entropy is k log D, which for 1 g of carbon monoxide works out to about 0.2 J K⁻¹. There is one common substance, though, of considerable importance that is also highly degenerate in its ground state: ice.
We do not often think—perhaps ever—of ice being a degenerate solid, but it is, and the degeneracy stems from the location of the hydrogen atoms around each oxygen atom. The molecule is electrically neutral overall, but the electrons are not distributed uniformly, and each oxygen atom has patches of net negative charge on either side of the molecule, and each hydrogen atom is slightly positively charged on account of the withdrawal of electrons from it by the electron-hungry oxygen atom.
In ice, each water molecule is surrounded by others in a tetrahedral arrangement, but the slightly positively charged hydrogen atoms of one molecule are attracted to one of the patches of slight negative charge on the oxygen atom of a neighbouring water molecule. Although each oxygen atom is closely attached to two hydrogen atoms and makes a more distant link to a hydrogen atom of each of two neighbouring water molecules, there is some freedom in the choice of which links are close and which are distant. Two of the many possible arrangements are shown in the figure; the choice of which links are short and which are long is almost random.
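This counting can be turned into numbers with Boltzmann's formula S = k log D. A sketch in Python, assuming two orientations per CO molecule, Pauling's classic count of 3/2 effective arrangements per water molecule in ice, and molar masses of 28 g/mol and 18 g/mol:

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1 (Boltzmann's constant per mole)

def residual_entropy_per_gram(ways_per_molecule, molar_mass):
    """Residual entropy S = k log D with D = ways^N, i.e. R * ln(ways)
    per mole of molecules, converted to 1 g of solid."""
    return R * math.log(ways_per_molecule) / molar_mass

s_co  = residual_entropy_per_gram(2.0, 28.0)   # CO: two near-equivalent orientations
s_ice = residual_entropy_per_gram(1.5, 18.0)   # ice: Pauling's 3/2 per H2O

print(f"CO:  {s_co:.2f} J/K per gram")   # ≈ 0.21
print(f"ice: {s_ice:.2f} J/K per gram")  # ≈ 0.19
```

The point of the per-molecule factor is that D grows exponentially with the number of molecules, yet the logarithm brings the result back to a modest entropy per gram.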
When the statistics of this variability is analysed, it turns out that the residual entropy of 1 g of ice should be about 0.19 J K⁻¹.

Refrigerators and heat pumps

The concept of entropy is the foundation of the operation of heat engines, heat pumps, and refrigerators. We have already seen that a heat engine works because heat is deposited in a cold sink and generates disorder there that compensates, and in general more than compensates, for any reduction in entropy due to the extraction of energy as heat from the hot source.
The fundamental reason for making the hot source of an engine as hot as possible is that a high source temperature minimizes the entropy reduction accompanying the withdrawal of heat (for a sneeze to go unnoticed, it is best to sneeze in a very busy street), so that the least entropy has to be generated in the cold sink to compensate for that decrease, and therefore more of the energy can be used to do the work for which the engine is intended. A refrigerator is a device for removing heat from an object and transferring that heat to the surroundings.
This process does not occur spontaneously, because it corresponds to a reduction in total entropy. Thus, when a given quantity of heat is removed from a cool body (a quiet library, in our sneeze analogy), there is a large decrease in entropy. When that heat is released into warmer surroundings, there is an increase in entropy, but the increase is smaller than the original decrease because the temperature is higher (it is a busy street).
Therefore, overall there is a net decrease in entropy. In order to achieve a net increase of entropy, we must release more energy into the surroundings than is extracted from the cool object (we must sneeze more loudly in the busy street). This we can do by doing work on the system, for the work we do adds to the energy stream (Figure). If enough work is done on the system, the release of a large amount of energy into the warm surroundings gives a large increase in entropy, and overall there is a net increase in entropy and the process can occur.
Of course, to generate the work to drive the refrigerator, a spontaneous process must occur elsewhere, as in a distant power station. (The figure shows the processes involved in a refrigerator and a heat pump: in a heat pump, on the right, the same net increase in entropy is achieved, but in this case the interest lies in the energy supplied to the interior of the house.) A calculation very similar to the one just sketched indicates why air conditioning, which is essentially refrigeration, is so expensive, and environmentally damaging, to run.
When a refrigerator is working, the energy released into the surroundings is the sum of that extracted from the cooled object and that used to run the apparatus. This remark is the basis of the operation of a heat pump, a device for heating a region such as the interior of a house by pumping heat from the outside into the interior.
A heat pump is essentially a refrigerator, with the cooled object being the outside world and the heat transferred to the region to be heated. That is, our interest is in the back of the refrigerator, not its interior. Thus, to release 1000 J into the interior, we need do only about 67 J of work; in other words, a heat pump rated at 1 kW behaves like a 15 kW heater. Wherever structure is to be conjured from disorder, it must be driven by the generation of greater disorder elsewhere, so that there is a net increase in the disorder of the universe, with disorder understood in the sophisticated manner that we have sketched. That is clearly true for an actual heat engine, as we have seen. However, it is in fact universally true. For instance, in an internal combustion engine, the combustion of a hydrocarbon fuel results in the replacement of a compact liquid by a mixture of gases that occupies a vastly greater volume, and a volume still considerably greater even when we allow for the oxygen consumed.
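The heat-pump figures quoted above follow from the same entropy bookkeeping. A sketch in Python, assuming indoor and outdoor temperatures of about 293 K and 273 K (these temperatures are my assumption, chosen because they reproduce the quoted ratio):

```python
# Ideal heat pump: to deliver q_hot into a room at t_hot while drawing heat
# from outdoors at t_cold, the minimum work is w = q_hot * (1 - t_cold/t_hot).

def min_work_for_delivery(q_hot, t_hot, t_cold):
    return q_hot * (1.0 - t_cold / t_hot)

t_inside, t_outside = 293.0, 273.0   # ≈ 20 °C indoors, 0 °C outdoors (assumed)
w = min_work_for_delivery(1000.0, t_inside, t_outside)
amplification = 1000.0 / w           # joules delivered per joule of work

print(f"work for 1000 J delivered ≈ {w:.0f} J")   # ≈ 68 J
print(f"amplification ≈ {amplification:.0f}x")    # ≈ 15x
```

The amplification shrinks as the outdoor temperature falls, which is why heat pumps perform best in mild climates.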
In every case, the energy of the fuel disperses into the surroundings. The fuel might be food. The dispersal that corresponds to an increase in entropy is the metabolism of the food and the dispersal of energy and matter that that metabolism releases. The structure that taps into that dispersal is not a mechanical chain of pistons and gears, but the biochemical pathways within the body. The structure that those pathways cause to emerge may be proteins assembled from individual amino acids. Thus, as we eat, so we grow.
The structures may be of a different kind: they may be works of art. Another structure that can be driven into existence by coupling to the energy released by ingestion and digestion consists of organized electrical activity within the brain, constructed from random electrical and neuronal activity.
Thus, as we eat, we create: we create works of art, of literature, and of understanding. The steam engine, in its abstract form as a device that generates organized motion (work) by drawing on the dissipation of energy, accounts for all the processes within our body. Moreover, that great steam engine in the sky, the Sun, is one of the great fountains of construction.
We all live off the spontaneous dissipation of its energy, and as we live so we spread disorder into our surroundings: we could not survive without our surroundings. In his seventeenth century meditation, John Donne was unknowingly expressing a version of the second law when he wrote, two centuries before Carnot, Joule, Kelvin, and Clausius, that no man is an island.
Free energy: The availability of work

Surely not! How can energy be free? Of course, the answer lies in a technicality: by free energy we do not mean that it is monetarily free. In thermodynamics, the freedom refers to the energy that is free to do work rather than just tumble out of a system as heat. We have seen that when a combustion occurs at constant pressure, the energy that may be released as heat is given by the change of enthalpy of the system. Although there may be a change in internal energy of a certain value, the system in effect has to pay a tax to the surroundings, in the sense that some of that change in internal energy must be used to drive back the atmosphere in order to make room for the products.
In such a case, the energy that can be released as heat is less than the change in internal energy. It is also possible for there to be a tax refund in the sense that if the products of a reaction occupy less volume than the reactants, then the system can contract. In this case, the surroundings do work on the system, energy is transferred into it, and the system can release more heat than is given by the change in internal energy: the system recycles the incoming work as outgoing heat.
The enthalpy, in short, is an accounting tool for heat that automatically takes into account the tax payable or repayable as work, and lets us calculate the heat output without having to calculate the contributions of work separately. The question that now arises is whether a system must pay a tax to the surroundings in order to produce work.
Can we extract the full change in internal energy as work, or must some of that change be transferred to the surroundings as heat, leaving less to be used to do work? Must there be a tax, in the form of heat, that a system has to pay in order to do work? Could there even be a tax refund in the sense that we can extract more work than the change in internal energy leads us to expect? In short, by analogy with the role of enthalpy, is there a thermodynamic property that instead of focusing on the net heat that a process can release focuses on the net work instead?
To identify spontaneous processes we must note the crucially important aspect of the second law that it refers to the entropy of the universe, the sum of the entropies of the system and the surroundings. According to the second law, a spontaneous change is accompanied by an increase in entropy of the universe. An important feature of this emphasis on the universe is that a process may be spontaneous, and work producing, even though it is accompanied by a decrease in entropy of the system provided that a greater increase occurs in the surroundings and the total entropy increases.
Whenever we see the apparently spontaneous reduction of entropy, as when a structure emerges, a crystal forms, a plant grows, or a thought emerges, there is always a greater increase in entropy elsewhere that compensates for the reduction in entropy of the system. It is inconvenient to have to do two separate calculations, one for the system and one for the surroundings. Provided we are prepared to restrict our interest to certain types of change, there is a way to combine the two calculations into one and to carry out the calculation by focusing on the properties of the system alone.
The clever step is to realize that if we limit changes to those taking place at constant volume and temperature, then the change in entropy of the surroundings can be expressed in terms of the change in internal energy of the system. That is because at constant volume, the only way that the internal energy can change in a closed system is to exchange energy as heat with the surroundings, and that heat can be used to calculate the change in entropy of the surroundings by using the Clausius expression for the entropy.
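This bookkeeping can be written out in a few lines. The following is a sketch, using only the Clausius expression for the entropy of the surroundings and the constant-volume condition that the heat entering the surroundings is the negative of the change in the system's internal energy:

```latex
\Delta S_{\text{total}}
  = \Delta S + \Delta S_{\text{surroundings}}
  = \Delta S - \frac{\Delta U}{T}
  = -\,\frac{\Delta U - T\,\Delta S}{T}
  = -\,\frac{\Delta A}{T},
\qquad A = U - TS \quad (\text{constant } T \text{ and } V).
```

So the total entropy increases exactly when A decreases, which is the criterion the text goes on to state.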
The resulting combination is called the Helmholtz energy, A, defined as A = U − TS; this expression is in terms of the properties of the system alone. The German physiologist and physicist Hermann von Helmholtz (1821–1894), after whom this property is named, was responsible for formulating the law of conservation of energy as well as making other major contributions to the science of sensation, colour blindness, nerve propagation, hearing, and thermodynamics in general.
So, a change in A is just a disguised form of the change in total entropy of the universe when the temperature and volume of the system are constant. The important implication of this conclusion is that, because spontaneous changes correspond to positive changes (increases) in the total entropy of the universe, provided we limit our attention to processes at constant temperature and volume, spontaneous changes correspond to a decrease in the Helmholtz energy of the system.
The restriction of the conditions to constant temperature and volume has allowed us to express spontaneity solely in terms of the properties of the system: its internal energy, temperature, and entropy. It probably seems more natural that a spontaneous change corresponds to a decrease in a quantity: in the everyday world, things tend to fall down, not up. You might then jump to the conclusion that systems tend towards lower internal energy and higher entropy. That would be a wrong interpretation: the only criterion of spontaneous change in thermodynamics is the increase in the total entropy of the universe.

As well as being a signpost of spontaneous change, the Helmholtz energy has another important role: it tells us the maximum work that can be extracted when a process occurs at constant temperature. More commonly, A is called a free energy, suggesting that it indicates the energy in a system that is free to do work. The last point becomes clearer once we think about the molecular nature of the Helmholtz energy. As we saw in Chapter 2, work is uniform motion in the surroundings, as in the moving of all the atoms of a weight in the same direction. The term TS that appears in A = U − TS can be thought of as a measure of the energy stored in a disordered way, so the difference U − TS is the energy stored in an orderly way. We can then think of only the energy stored in an orderly way as being available to cause orderly motion, that is, work, in the surroundings. For the total entropy of the universe to increase, some of the change in internal energy must be released as heat, for only heat transactions result in changes in entropy.
Some numbers might give these considerations a sense of reality.
When 1 L of gasoline is burned it produces carbon dioxide and water vapour. The change in internal energy is 33 MJ, which tells us that if the combustion takes place at constant volume (in a sturdy, sealed container), then 33 MJ will be released as heat. The change in enthalpy is slightly smaller in magnitude, and is the heat released when the combustion takes place at constant pressure, in a container open to the atmosphere. Notice that less heat is released in the second arrangement, because some of the energy has been used to drive back the atmosphere to make room for the gaseous products. However, suppose the entropy of the system happens to increase in the course of the process.
In that case the process is already spontaneous, and no tax need be paid to the surroundings. In fact, it is better than that: the surroundings can be allowed to supply energy as heat to the system, for they can tolerate a decrease in entropy yet the entropy of the universe will still increase. In other words, the system can receive a tax refund. Thus, if the combustion took place in an engine, the maximum amount of work that could be obtained is 35 MJ.
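These figures can be tied together with the maximum-work relation. Note that the 2 MJ entropy contribution below is inferred from the quoted 33 MJ and 35 MJ; it is not an independently tabulated value:

```python
# Maximum work at constant temperature: w_max = -ΔA = -(ΔU - TΔS).
# The text quotes 33 MJ of internal energy released and 35 MJ of maximum
# work for 1 L of gasoline, which implies a TΔS "refund" of about 2 MJ
# (an inference from those two figures, not a measured datum).

delta_U = -33.0   # MJ, change in internal energy on combustion
T_delta_S = 2.0   # MJ, inferred entropy contribution at the working temperature

w_max = -(delta_U - T_delta_S)
print(f"maximum work ≈ {w_max:.0f} MJ")   # 35 MJ
```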
Introducing the Gibbs energy

The discussion so far refers to all kinds of work. In many cases we are not interested in expansion work but in the work, for example, that can be extracted electrically from an electrochemical cell, or the work done by our muscles as we move around. The property we need is named after the American physicist Josiah Willard Gibbs, who worked at Yale University throughout his life and was noted for his public reticence. His extensive and subtle work was published in what we now consider to be an obscure journal, The Transactions of the Connecticut Academy of Science, and was not appreciated until it was interpreted by his successors.
Just as it is not really possible to give a molecular interpretation of the enthalpy, which is really just a clever accounting device, it is not possible to give a simple explanation of the molecular nature of the Gibbs energy, G, which is obtained by replacing the internal energy in A = U − TS by the enthalpy, so that G = H − TS. It is good enough for our purposes to think of it, like the Helmholtz energy, as a measure of the energy that is stored in an orderly way and is therefore free to do useful work.
In each case, the underlying origin of the spontaneity is the increase in entropy of the universe, but in each case we can express that increase in terms of the properties of the system alone and do not have to worry about doing a special calculation for the surroundings.

The thermodynamics of freezing

There are three applications that I shall discuss here. The Gibbs energy of a pure substance decreases as the temperature is raised: as T increases, TS becomes larger and subtracts more and more from H, and G consequently falls.
The Gibbs energy of ice behaves similarly. The entropy of 1 g of water vapour is much greater than that of the liquid, because the molecules of a gas occupy a much greater volume and are distributed randomly over it. That is why we have drawn the Gibbs energies starting in their relative positions on the left of the illustration, which shows the decrease in Gibbs energy with increasing temperature for the three phases of a substance.
The Gibbs energy of the liquid remains the lowest of the three phases until the steeply falling line for the vapour intersects it. If the vapour line falls steeply enough, it might intersect the solid line before the liquid line does, in which case the liquid is never the stable phase and the solid sublimes directly to a vapour.
In such a case, the substance will make a direct transition from solid to vapour without melting to an intermediate liquid phase. This is the process called sublimation. Dry ice (solid carbon dioxide) behaves in this way, and converts directly to carbon dioxide gas. All phase changes can be expressed thermodynamically in a similar way, including melting, freezing, condensation, vaporization, and sublimation. More elaborate discussions also enable us to discuss the effect of pressure on the temperatures at which phase transitions occur, for pressure affects the locations of the lines showing the dependence of Gibbs energy on temperature in different ways, and the intersection points move accordingly.
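The picture of intersecting lines can be sketched numerically. The enthalpy and entropy values below are illustrative, water-like numbers of my choosing, not data from the text; the stable phase at each temperature is simply the one with the lowest G = H − TS:

```python
# Which phase is stable? The one with the lowest G = H - T*S.
# H (J/mol) and S (J/K/mol) below are illustrative, water-like values.

phases = {
    "solid":  (0.0,     41.0),
    "liquid": (6010.0,  63.0),    # fusion: ΔH ≈ 6.01 kJ/mol, ΔS ≈ 22 J/K/mol
    "gas":    (46710.0, 172.1),   # vaporization: ΔH ≈ 40.7 kJ/mol, ΔS ≈ 109 J/K/mol
}

def stable_phase(T):
    """Return the phase whose Gibbs energy H - T*S is lowest at temperature T."""
    return min(phases, key=lambda p: phases[p][0] - T * phases[p][1])

for T in (250.0, 300.0, 400.0):
    print(T, stable_phase(T))   # solid, then liquid, then gas
```

With these numbers the solid/liquid lines cross near 273 K and the liquid/gas lines near 373 K, reproducing the familiar melting and boiling points; steepening the gas line (raising its entropy) makes it cut the solid line first, which is the sublimation case described above.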
Living off Gibbs energy

Our bodies live off Gibbs energy. Many of the processes that constitute life are non-spontaneous reactions, which is why we decompose and putrefy when we die and these life-sustaining reactions no longer continue. A simple (in principle) example is the construction of a protein molecule by stringing together, in an exactly controlled sequence, numerous individual amino acid molecules. The construction of a protein is not a spontaneous process, as order must be created out of disorder. A helpful analogy is that of a weight which can be raised by coupling it to a heavier weight that raises the lighter weight as it falls (Figure): a process that corresponds to a large increase in total entropy, represented by an increase in disorder on the left, can drive a process in which order emerges from disorder, on the right.

In the body, the role of the heavier weight is played by the molecule adenosine triphosphate, ATP (Figure 18 shows a molecular model, with some of the phosphorus, P, and oxygen, O, atoms marked). When a terminal phosphate group is snipped off by reaction with water, at the location shown by the line in the figure, to form adenosine diphosphate (ADP), there is a substantial decrease in Gibbs energy, arising in part from the increase in entropy when the group is liberated from the chain. Enzymes in the body make use of this change in Gibbs energy to drive the construction of proteins. It takes the effort of about three ATP molecules to link two amino acids together, so the construction of a typical protein of several hundred amino acid groups needs the energy released by roughly three times that many ATP molecules. The ADP molecules are converted back into ATP by coupling to reactions that release even more Gibbs energy (they act as even heavier weights) and which reattach a phosphate group to each one.
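The arithmetic behind the three-ATP-per-link figure is simple; the protein length used here is a hypothetical example, not a figure from the text:

```python
# About three ATP molecules are spent per peptide link (the figure quoted
# in the text); a chain of n amino acids has n - 1 links.

def atp_for_protein(n_amino_acids, atp_per_link=3):
    """ATP molecules needed to string together n amino acids."""
    return atp_per_link * (n_amino_acids - 1)

print(atp_for_protein(150))   # 447 ATP molecules for a hypothetical 150-residue protein
```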
These heavy-weight reactions are the reactions of metabolism of the food that we need to ingest regularly. That food may be the material that has been driven into existence by even heavier reactions, reactions that release even more Gibbs energy, drawing ultimately on the nuclear processes that occur in the Sun.

Chemical equilibrium

The Gibbs energy is the key to understanding chemical equilibrium.
Once again we note that at constant temperature and pressure a system tends to change in the direction corresponding to decreasing Gibbs energy. Some reactions appear to go to completion: the explosive reaction of hydrogen and oxygen to form water is an example. Nevertheless, even in this case there are one or two molecules of reactants among the myriad product molecules at equilibrium. On the other hand, some reactions do not appear to go at all: the dissolution of gold in water is an example. Nevertheless, at equilibrium there are one or two product molecules among the myriad reactant molecules. A lot of reactions lie between these extremes, with reactants and products both in abundance, and it is a matter of great interest in chemistry to account for the composition corresponding to equilibrium and how it responds to the conditions, such as the temperature and the pressure. An important point about chemical equilibrium is that when it is achieved the reaction does not simply grind to a halt.
At a molecular level all is turmoil: reactants form products and products decompose into reactants, but both processes occur at matching rates, so there is no net change. Chemical equilibrium is dynamic equilibrium, so it remains sensitive to the conditions: the reaction is not just lying there dead.
The Gibbs energy of the reaction mixture depends on the composition of the mixture. That dependence has two origins. One is the difference in Gibbs energies of the pure reactants and the pure products: as the composition changes from pure reactants to pure products, so the Gibbs energy changes from one to the other. The other is the mixing of the reactants and products: mixing increases the entropy, and so lowers the Gibbs energy. This contribution is zero for pure reactants and for pure products (where there is nothing to mix), and is a maximum when the reactants and products are both abundant and the mixing is extensive. When both contributions are taken into account, it is found that the Gibbs energy passes through a minimum at an intermediate composition. This composition corresponds to equilibrium. Any composition to the left or right of the minimum has a higher Gibbs energy, and the system tends spontaneously to migrate to lower Gibbs energy and attain the composition corresponding to equilibrium. If the composition is at equilibrium, the reaction has no tendency to run in either direction. (The figure plots the Gibbs energy against the progress of the reaction, from pure reactants to pure products; in each case the equilibrium composition, which shows no further net tendency to change, occurs at the minimum of the curve.)
In some cases (Figure 19), the minimum lies far to the left, very close to pure reactants, and the Gibbs function reaches its minimum value after only a few molecules of products are formed (as for gold dissolving in water). In other cases, the minimum lies far to the right, and almost all the reactants must be consumed before the minimum is reached (as for the reaction between hydrogen and oxygen).
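The shape of this curve is easy to reproduce numerically. The following is a minimal sketch in Python, under invented values: G_react and G_prod are illustrative Gibbs energies of the pure substances, and the mixing contribution is taken as the ideal term RT[x ln x + (1 − x) ln(1 − x)].

```python
import math

R = 8.314        # gas constant, J/(K*mol)
T = 298.0        # temperature, K
G_react = 0.0    # Gibbs energy of pure reactants (illustrative)
G_prod = -20e3   # Gibbs energy of pure products (illustrative)

def gibbs(x):
    """Gibbs energy of the reaction mixture at composition x
    (x = 0: pure reactants; x = 1: pure products)."""
    mixing = R * T * (x * math.log(x) + (1 - x) * math.log(1 - x))
    return (1 - x) * G_react + x * G_prod + mixing

# Scan compositions; the minimum of the curve is the equilibrium composition.
xs = [i / 1000 for i in range(1, 1000)]
x_eq = min(xs, key=gibbs)
print(x_eq)  # close to 1, but not exactly 1: the mixing term keeps
             # a few reactant molecules present at equilibrium
```

Because the products here lie far below the reactants in Gibbs energy, the minimum sits very near pure products, yet never exactly at them: that is the numerical counterpart of the "one or two reactant molecules among the myriad products".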
In a battery, a chemical reaction drives electrons through an external circuit by depositing electrons in one electrode and extracting them from another. This process is spontaneous in the thermodynamic sense, and we can imagine it taking place as the reactants sealed into the battery convert to products and the composition migrates from left to right in the figure. The Gibbs energy of the system falls, and in due course reaches its minimum value.
The chemical reaction has reached equilibrium. It has no further tendency to change into products, and therefore no further tendency to drive electrons through the external circuit. The reaction has reached the minimum of its Gibbs energy and the battery—but not the reactions still continuing inside—is dead.

Chapter 5
The third law: The unattainability of zero

I have introduced the temperature, the internal energy, and the entropy. Essentially the whole of thermodynamics can be expressed in terms of these three quantities. I have also introduced the enthalpy, the Helmholtz energy, and the Gibbs energy; but they are just convenient accounting quantities, not new fundamental concepts.
The third law is not quite of the same kind as the other three. For one thing, it does not inspire the introduction of a new thermodynamic function. However, it does make possible the application of the functions we already have. Hints of the third law are already present in our discussion of the second law, where we considered its implications for refrigeration.

Extreme cold

As usual in classical thermodynamics, we focus on observations made outside the system of interest, in its surroundings, and close our minds, initially at least, to any knowledge or preconceptions we might have about the molecular structure of the system.
That is, to establish a law of classical thermodynamics, we proceed wholly phenomenologically. The third law, as we shall see, makes it possible to use data obtained by thermal measurements, such as heat capacities, to predict the composition of reacting systems that correspond to equilibrium. It also has some troublesome implications, especially for those seeking very low temperatures. Interesting things happen to matter when it is cooled to very low temperatures.
For instance, superconductivity, the ability of certain substances to conduct electricity with zero resistance, was discovered when it became possible to cool matter to the temperature of liquid helium (about 4 K).
The challenge, partly because it is there, is to cool matter to absolute zero itself. Another challenge, to which we shall return, is to explore whether it is possible—and even meaningful—to cool matter to temperatures below absolute zero; to break, as it were, the temperature barrier. In due course, it was conceded that it is impossible to attain absolute zero using a conventional thermal technique; that is, a refrigerator based on the heat-engine design we discussed in Chapter 3.
Note that it refers to a cyclic process: there might be other kinds of process that can cool an object to absolute zero, but the apparatus used would not be found in the same state as it was initially. There must be more to the third law than appearances suggest. To see what that might be, we need to think about how low temperatures are achieved in practice. We need to know that a single electron has the property of spin, which for our purposes we can think of as an actual spinning motion; the spin makes each electron behave like a tiny bar magnet, and in a magnetic field the two orientations of the spin have different energies. These spins are in thermal contact with the rest of the material in the sample and share the same temperature. When a magnetic field is applied while the sample remains in contact with its surroundings, the spins tend to fall into the orientation of lower energy: because the sample can give up energy to its surroundings, the electron spin populations can adjust, and the entropy of the spins decreases. The sample becomes magnetized isothermally. The sample is then isolated thermally and the magnetic field is slowly reduced to zero; as the field falls, the two orientations return to the same energy, the spins randomize, and their entropy rises. Because the process is adiabatic, the entropy of the entire sample—the spins and their immediate surroundings—remains the same. However, because there is no change in the overall entropy of the sample, the entropy of the molecules that carry the electrons must be lowered, which corresponds to a lowering of temperature. Isothermal magnetization followed by adiabatic demagnetization has cooled the sample.
In principle, we can repeat this cyclic process; each cycle lowers the temperature of the sample a little more, and gradually we can cool the sample to any desired temperature. In practice, however, it has not proved possible to achieve absolute zero in this way. (The figure depicts the process of adiabatic demagnetization for reaching low temperatures; the arrows depict the spin alignment of the electrons in the sample.) Other cooling techniques run into the same barrier. For instance, we could take a gas, compress it isothermally, and then allow it to expand adiabatically to its initial volume. The adiabatic expansion of a gas does work, and as no heat enters the system, the internal energy falls.
As we have seen, the internal energy of a gas arises largely from the kinetic energy of its molecules, so adiabatic expansion must result in their slowing down and therefore a lowering of the temperature. However, it turns out that the effect of adiabatic expansion on the temperature diminishes as the temperature falls, so the possibility of using this technique to reach absolute zero is thwarted.
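The diminishing return can be made concrete with the textbook relation for a reversible adiabatic expansion of an ideal gas, T·V^(γ−1) = constant. The numbers below are illustrative, not from the text:

```python
def adiabatic_final_T(T1, V1, V2, gamma=5.0 / 3.0):
    """Final temperature after a reversible adiabatic expansion of an
    ideal (here monatomic) gas, from T * V**(gamma - 1) = constant."""
    return T1 * (V1 / V2) ** (gamma - 1)

# Doubling the volume always scales T by the same factor (about 0.63),
# so the absolute drop in temperature shrinks as T itself falls:
print(adiabatic_final_T(300.0, 1.0, 2.0))  # about 189 K: a drop of ~111 K
print(adiabatic_final_T(3.0, 1.0, 2.0))    # about 1.9 K: a drop of ~1.1 K
```

Each expansion removes only a fixed fraction of the remaining temperature, so absolute zero recedes like the horizon.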
Once again, careful analysis shows that such a technique will fail to reach absolute zero because the entropies of the two substances involved, A and B, converge on the same value as the temperature approaches zero. The common feature of this collective failure is traced to the convergence of the entropies of substances to a common value as T approaches zero. So we can replace the phenomenological statement of the third law with a slightly more sophisticated version expressed in terms of the entropy: the entropy of every pure, perfectly crystalline substance approaches the same value as the temperature approaches zero.
The third law does not introduce a new thermodynamic function, and is therefore not the same type of law as the other three: it simply implies that the entropy can be expressed on an absolute scale. The law would seem to be irrelevant to the everyday world, unlike the other three laws of thermodynamics, which govern our daily lives with such fearsome relevance.
There are indeed no pressing consequences of the third law for the everyday world, but there are serious consequences for those who inhabit laboratories.

Some technical consequences

There are technical salves to what might seem fatal injuries to the fabric of thermodynamics, so the subject does survive this onslaught from its own laws.
The third law provides the key to this application (the use of thermal data, such as heat capacities, to predict equilibrium compositions), which could not be done if the entropies of substances were different at absolute zero.

Temperatures below zero

Absolute zero is unattainable—in a sense. Too much should not be read into the third law, because in the form that expresses the unattainability of absolute zero it concerns processes that maintain thermal equilibrium and are cyclic. It leaves open the possibility that there are non-cyclic processes that can reach absolute zero. It will be simplest, and in practice most readily realizable, to consider a system that has only two energy levels: a ground state and a second state above it in energy.
As we have already remarked, because these two spin states correspond to opposite orientations of the bar magnet, they have two different energies. As the temperature is raised, electrons migrate into the upper state, and the internal energy and the entropy both increase. Now suppose that T is negative. Immediately above zero the population is entirely in the lower state; in fact, just below zero, the population is entirely in the upper state.
(The figure shows the variation of the internal energy, on the left, and the entropy, on the right, for a two-level system; the expressions for these two properties can be calculated for negative as well as positive temperatures.) The entropy tracks the changes in the distribution of populations: S increases from zero towards log 2 as the populations of the two states approach equality. The big question is whether the inversion of a thermal-equilibrium (that is, Boltzmann) population can be contrived.
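The populations and entropy of the two-level system can be checked directly from the Boltzmann distribution. In this minimal sketch the level spacing is set to 1 in units of Boltzmann's constant (so T is in the same units) and the entropy is reported in units of k; both conventions are choices for the illustration, not from the text:

```python
import math

def two_level(T, gap=1.0):
    """Boltzmann populations and entropy (in units of k) of a two-level
    system with level spacing `gap` (in units of k) at temperature T."""
    boltz = math.exp(-gap / T)
    p_up = boltz / (1 + boltz)              # upper-state population
    p_lo = 1 - p_up                         # lower-state population
    s = -(p_lo * math.log(p_lo) + p_up * math.log(p_up))
    return p_up, s

p_hot, s_hot = two_level(1e6)   # very high positive T: populations nearly equal
p_neg, s_neg = two_level(-2.0)  # negative T: population inverted
print(round(s_hot, 4))          # approaches log 2, about 0.6931
print(p_neg > 0.5)              # True: the majority sits in the upper state
```

A very large positive temperature equalizes the populations and drives S towards log 2, while any negative temperature puts the majority of the population in the upper state, exactly the inversion discussed above.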
It can, but not by thermodynamic procedures. There are a variety of experimental techniques, using pulses of radiofrequency energy, for polarizing (as it is called) a collection of electron or nuclear spins. In fact, there is an everyday device that makes use of negative temperatures: the laser. The essential principle of a laser is to produce large numbers of atoms or molecules in an excited state and then to stimulate them to discard their energy collectively.
All the laser-equipped devices we use around the home, as in CD and DVD players, operate at temperatures below zero.

Thermodynamics below zero

The concept of negative temperature really applies in practice only to systems that possess two energy levels. Moreover, negative temperatures effectively take us outside the domain of classical thermodynamics because they have to be contrived and in general do not persist for more than very short periods. The first law, however, is robust: in a region of negative temperature, energy is conserved and the internal energy may be changed by doing work or making use of a temperature difference.
Suppose heat leaves the system: its entropy increases, as we have just seen. If that energy enters the surroundings at a positive temperature, their entropy also increases. We can understand that conclusion at a molecular level by thinking about a two-level system: the inverted population, which has a high energy but low entropy, loses some of its energy and the population returns towards equality, a high-entropy (log 2) condition, so the entropy increases as energy is lost.
In short, the second law implies that there will be a spontaneous transfer of heat from a system of negative temperature in contact with one of positive temperature, and that the process will continue until the temperatures of the two systems are equal. Overall, the total entropy of the two systems increases. The extra energy actually comes from the cold sink because, as we have seen, extracting heat from a source with a negative temperature increases its entropy. In a sense, as the inverted population in the cold negative sink tumbles back down towards equality, the energy released contributes to the work that the engine produces.
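The sign argument can be checked with elementary entropy bookkeeping, ΔS = −q/T_source + q/T_sink, for a transfer of heat q between two bodies large enough that their temperatures stay fixed. The numbers are illustrative:

```python
def total_entropy_change(q, T_source, T_sink):
    """Total entropy change when heat q leaves a body at T_source and
    enters a body at T_sink (both bodies large, temperatures fixed)."""
    return -q / T_source + q / T_sink

# Heat flowing from a negative-temperature source to a positive-temperature sink:
# -q/T_source is positive because T_source is negative, so BOTH terms are positive.
dS = total_entropy_change(100.0, -200.0, 300.0)
print(dS > 0)  # True: the transfer is spontaneous
```

With a negative source temperature, losing heat raises the source's entropy and warming the sink raises its entropy too, so the total entropy necessarily increases and the flow is spontaneous, as the text argues.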
Conclusion

We are at the end of our journey. We have seen that thermodynamics, the study of the transformations of energy, is a subject of great breadth that underlies and elucidates many of the most common concepts of the everyday world, such as temperature, heat, and energy. The third law brought the molecular and empirical formulations of thermodynamics into coincidence, uniting the two rivers. Where I have feared to tread is in two domains that spring from or draw analogies with thermodynamics. I have not touched on the still insecure world of non-equilibrium thermodynamics, where attempts are made to derive laws relating to the rate at which a process produces entropy as it takes place.
What I have sought to cover are the core concepts, concepts that effectively sprang from the steam engine but reach out to embrace the unfolding of a thought. This mighty little handful of laws truly drives the universe, touching and illuminating everything we know.

Further reading

If you would like to take any of these matters further, then here are some suggestions.
In The Second Law (W. H. Freeman) I have treated these ideas at greater length, and more serious accounts will be found in my various textbooks. Others, of course, have written wonderfully about the laws. I can direct you to that most authoritative account, Thermodynamics, by G. Lewis and M. Randall (McGraw-Hill; revised by K. Pitzer and L. Brewer). Other useful and reasonably accessible texts on my shelves are The Theory of Thermodynamics, and works by J. Denbigh and J. Widom (Cambridge University Press).