How to Understand the Relationship Between Entropy and Heat Engines


Class Notes 9, Phyx: Entropy and the Second Law

I. THERMODYNAMIC DEFINITION OF ENTROPY

Entropy is a term that crops up from time to time in everyday conversations, usually because of its intimate association with disorder. Less commonly known is the association of an increase in entropy with a lessened ability to do work. We will discuss both of these ideas here. To begin, we consider the relationship of entropy to our previous discussions of the first and second laws of thermodynamics and heat engines.

First of all, entropy is a function of state (or state function) of the system, similar to the internal energy U of the system. For example, if we know the volume V, number of particles N, and temperature T of an ideal monatomic gas, then we know the internal energy, which can be expressed as U = (3/2) N k_B T. Similarly, if we know V, N, and T of the ideal monatomic gas, we also know the entropy, which can be written as

S = N k_B ln[ (V/N) (U/N)^{3/2} C ].    (1)

Here C is a constant that depends upon the mass of the atoms that make up the gas. This is all well and good, but you are probably wondering how entropy relates to the laws of thermodynamics. The basic thermodynamic definition of entropy is as follows. Suppose the system (which could be the working substance of a heat engine) is at a temperature T and we add a small amount of heat ΔQ to it. The entropy S of the system changes as a result of this heat flow:

ΔS = ΔQ / T.    (2)

[Note that if ΔQ is negative (heat flows out), then the entropy decreases.]

Let's consider what Eq. (2) means in terms of the P-V diagram for an ideal gas. Figure 1 shows the familiar P-V diagram with isotherms and adiabats indicated as solid and dotted lines, respectively. Recall that along an adiabat the heat flow ΔQ is zero. Equation (2) then tells us that the entropy S of the system does not change along an adiabat. That is, an adiabat is a constant-entropy curve, in the same way that an isotherm is a constant-temperature curve. The fact that entropy is defined by Eq. (2) and that it is a state function [given by Eq. (1)] means that heat flow (which is not a state function) is, in fact, directly related to the change of a state function of the system: the change in entropy! Practically speaking, this puts heat flow on the same level as the internal energy.

Figure 1. P-V diagram for an ideal monatomic gas (pressure in atm vs volume in m^3). Path 1 is along an isotherm. Path 2 is along an isochor. Path 3 is along an adiabat, which has a constant value of entropy. Because entropy is a state function, each point in the diagram has a particular value of entropy.

We can now think about the P-V diagram with this understanding of entropy. Keep in mind that because entropy is a state function, each point on the P-V diagram has a certain value of entropy S associated with it (just as each point also has a certain value of U, T, P, and V). Let's now suppose that we start with the system at point A in Fig. 1. If the gas is expanded along an adiabat (along path 3 from point A), the entropy of the system remains constant, but the temperature decreases (because the internal energy is being converted into work during the expansion). If, however, the system expands along an isotherm (along path 1 from point A), the temperature remains constant, but the entropy increases (because heat flows into the system).
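To make the state-function idea concrete, here is a minimal Python sketch (with illustrative, made-up numbers for N, T, and the volumes) that computes the entropy change of an isothermal expansion two ways: from the heat flow via Eq. (2), using the ideal-gas result Q = N k_B T ln(V_f/V_i), and from the difference of the state function in Eq. (1). The unknown constant C cancels in the difference.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def S(N, V, U, C=1.0):
    """Entropy of an ideal monatomic gas, Eq. (1); C is unknown but
    cancels whenever we take differences between two states."""
    return N * k_B * math.log((V / N) * (U / N) ** 1.5 * C)

N, T = 1e22, 300.0            # illustrative values
U = 1.5 * N * k_B * T         # U = (3/2) N k_B T, fixed along an isotherm
V_i, V_f = 0.010, 0.025       # initial and final volumes, m^3

dS_state = S(N, V_f, U) - S(N, V_i, U)   # difference of the state function
Q = N * k_B * T * math.log(V_f / V_i)    # heat absorbed isothermally
dS_heat = Q / T                          # Eq. (2)

print(dS_state, dS_heat)      # identical: both equal N k_B ln(V_f/V_i)
```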
If the system completes a thermodynamic cycle, for example starting at A and following paths 1, 2, and 3 back to A, then the entropy of the system is the same as when it left point A, precisely because entropy is a state function. Another way to say this is that because S is a state function, ΔS = 0 for any thermodynamic cycle starting at A and ending back at A. Notice, however, that for a thermodynamic cycle ΔQ = Q_in − Q_out is not necessarily zero, because the net work is not necessarily zero (remember, W = ΔQ for a closed path): so heat flow is not a state function!
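As a numerical check on this claim, the following sketch (again with made-up numbers) evaluates Eq. (1) around an isotherm-isochor-adiabat cycle like paths 1, 2, and 3 of Fig. 1 and confirms that the entropy changes sum to zero, even though the heat flows do not.

```python
import math

k_B = 1.380649e-23  # J/K

def S(N, V, T):
    """Eq. (1) with U = (3/2) N k_B T substituted in; the constant C
    is dropped since it cancels around any closed cycle."""
    U = 1.5 * N * k_B * T
    return N * k_B * math.log((V / N) * (U / N) ** 1.5)

N, T_A, V_A = 1e22, 300.0, 0.010   # point A (illustrative values)
V_B = 2 * V_A                      # path 1: isothermal expansion A -> B
T_C = T_A * (V_A / V_B) ** (2/3)   # point C chosen so the adiabat through C hits A

dS1 = S(N, V_B, T_A) - S(N, V_A, T_A)   # isotherm: entropy increases
dS2 = S(N, V_B, T_C) - S(N, V_B, T_A)   # isochor: cooling, entropy decreases
dS3 = S(N, V_A, T_A) - S(N, V_B, T_C)   # adiabat: exactly zero

print(dS1 + dS2 + dS3)   # ~0 (up to floating-point roundoff)
```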

Now, even though we have a simple relationship between heat added and the change in entropy, Eq. (2), it is not necessarily trivial to use Eq. (2) to calculate ΔS for any given path on the P-V diagram. This is because for most paths the temperature is constantly changing along the path. In such a case we must resort to integral calculus and use the infinitesimal version of Eq. (2), dS = dQ/T. However, there are two types of paths where Eq. (2) can be directly applied: isotherms (where T is constant) and adiabats (where S is constant).

II. ENTROPY AND THE CARNOT CYCLE

With this in mind, let's think about the Carnot engine, which consists only of paths that lie along either an isotherm or an adiabat. Referring to Fig. 2, because ΔS = 0 around the whole cycle (from A back to A, say), and because ΔS is zero on paths 2 and 4, it must be that ΔS_path 1 = −ΔS_path 3. Thus from Eq. (2) we have

Q_in / T_H = Q_out / T_C,    (3)

where T_H is the temperature along path 1 and T_C is the temperature along path 3. Rearranging Eq. (3) we thus have

Q_out / Q_in = T_C / T_H,    (4)

which is why the efficiency [= 1 − (Q_out/Q_in)] for the Carnot heat engine can be written as 1 − (T_C/T_H).

Figure 2. P-V diagram for an ideal monatomic gas, showing an isotherm-adiabat (Carnot) thermodynamic cycle. Paths 1 and 3 are along isotherms; along path 1 the temperature is T_H, along path 3 it is T_C. Heat flow in (Q_in) occurs along path 1; heat flow out (Q_out) occurs along path 3. Paths 2 and 4 are along adiabats (constant-entropy curves).

III. THE ENTROPY FORM OF THE SECOND LAW OF THERMODYNAMICS

When a Carnot engine runs around a complete cycle, no net entropy change occurs to the system (working substance), because the final state equals the initial state and entropy depends only on the state of the system. What about the entropy change of the system's surroundings? In each cycle the high-temperature reservoir loses some heat (= Q_in); thus its entropy decreases. Similarly, in each cycle the entropy of the low-temperature reservoir increases due to the heat flow Q_out into that reservoir. Using Eq. (2) we can easily calculate the entropy change of the two reservoirs during a complete cycle, assuming that the temperature of each reservoir is constant. Let's assume that the hot-reservoir temperature is constant at T_HR (which must be larger than the system temperature T_H along path 1 in Fig. 2, else heat would not flow from the reservoir to the system) and that the cold-reservoir temperature is T_CR. Using Eq. (2) we can thus write down the total entropy change of the universe (= system + surroundings) for the complete cycle as

ΔS_total = ΔS_system + ΔS_surroundings = (Q_in/T_H − Q_out/T_C) − Q_in/T_HR + Q_out/T_CR.    (5)

Rearranging the right-hand side of Eq. (5), we can write the change in the entropy of the universe as

ΔS_total = Q_in (1/T_H − 1/T_HR) + Q_out (1/T_CR − 1/T_C).    (6)

This equation tells us something very important. Because T_HR > T_H and T_C > T_CR, we must have ΔS_total > 0. That is, the total entropy of the universe increases for a complete thermodynamic cycle. Whoa!

In fact, we do not have to think about a complete thermodynamic cycle to come to a similar conclusion. Let's say we have two objects in thermal contact, one at temperature T_hot and one at temperature T_cold, and we let a small amount of heat ΔQ flow from the hotter to the colder. The entropy change of the universe is then

ΔS_total = ΔQ (1/T_cold − 1/T_hot).    (7)
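A short numerical sketch of Eqs. (4) and (6), with hypothetical reservoir and system temperatures, makes the sign of ΔS_total easy to see:

```python
# Hypothetical values; the only requirements are T_HR > T_H and T_C > T_CR.
T_HR, T_H = 600.0, 580.0   # hot reservoir vs. system temperature on path 1 (K)
T_C, T_CR = 320.0, 300.0   # system temperature on path 3 vs. cold reservoir (K)
Q_in = 1000.0              # heat absorbed by the engine per cycle (J)

Q_out = Q_in * T_C / T_H              # from Eq. (4)
efficiency = 1 - Q_out / Q_in         # = 1 - T_C/T_H, about 0.45 here

# Eq. (6): each term is positive, so the universe's entropy grows each cycle.
dS_total = Q_in * (1/T_H - 1/T_HR) + Q_out * (1/T_CR - 1/T_C)

print(efficiency, dS_total)           # dS_total > 0 (about 0.17 J/K here)
```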

Again, this must be greater than zero, since T_hot > T_cold. Thus for any spontaneous heat-flow process between two objects the entropy of the universe increases. This is one consequence of the entropy form of the second law of thermodynamics, which can be stated as follows: for any process, the entropy of the universe never decreases.

Processes where ΔS = 0 are known as reversible, because the universe can be put back into the same state as it was before the process occurred (by reversing the process). Processes where ΔS > 0 are known as irreversible, because the universe cannot be put back into its original state. In other words, suppose an irreversible process occurs. The entropy of the universe will necessarily have increased. Because entropy is a state function, the universe will obviously not be in the same state as it was before the process. The second law then states that we cannot get back to the initial state of the universe, because ΔS cannot be negative.

For the thermodynamic-cycle situation described by Eq. (6), the only way that the process could be reversible (so that ΔS = 0) would be if T_HR were only infinitesimally (in the mathematical sense) larger than T_H, and T_C were infinitesimally larger than T_CR (in other words, if T_HR = T_H + dT and T_C = T_CR + dT). Then we could ignore the difference between T_HR and T_H and the difference between T_C and T_CR in Eq. (6), and the total entropy change of the universe would be zero. In reality, however, reversibility is only a theoretical limit on what practically happens: in all real-world cases thermodynamic processes are irreversible and the entropy of the universe increases. Note that the entropy form of the second law directly implies the heat-flow form of the second law, which we discussed in our previous heat-engine handout: heat always spontaneously flows from a hotter to a colder substance.

Keep in mind that entropy can decrease in a particular region of the universe. For the Carnot cycle, as heat flows out of the system its entropy decreases. However, the increase in entropy of the low-temperature reservoir is greater than the decrease in entropy of the system, resulting in a net increase in entropy for the universe. Similarly, one sometimes hears that life exists in the face of the second law of thermodynamics, meaning, presumably, that the orderly arrangement of molecules required to make living matter is a low-entropy state and that this somehow violates the second law. Nothing could be further from the truth. Consider, for example, a beaker of water that has been supercooled to a temperature below 0 °C. (That can be done if there are no dust particles and the beaker isn't vibrated.) The liquid water wants to become ice and will do so if given the slightest encouragement, such as a gentle tap. When the ice forms, the atoms are arrayed in a closely packed crystalline structure, a beautifully ordered array. Certainly the ice is in a lower state of entropy than when it was a liquid, with its atoms free to move about helter-skelter. How did this happen if the second law says entropy has to increase? The answer is that when the water freezes it releases energy (called latent heat), and this heat flow to the surroundings increases the entropy of the surroundings. If the net change in entropy of the universe is positive, the process has the possibility to occur.
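The following sketch evaluates Eq. (7) for a hypothetical pair of objects and shows the reversible limit: as T_hot approaches T_cold, the entropy produced by transferring the same small amount of heat shrinks toward zero.

```python
def dS_universe(dQ, T_hot, T_cold):
    """Eq. (7): entropy change of the universe when heat dQ (J)
    flows spontaneously from T_hot to T_cold (K)."""
    return dQ * (1/T_cold - 1/T_hot)

dQ = 1.0  # a small heat flow, in joules (hypothetical)
for T_hot in (400.0, 310.0, 301.0, 300.01):
    print(T_hot, dS_universe(dQ, T_hot, T_cold=300.0))
# dS > 0 in every case, and it goes to zero as T_hot -> T_cold:
# the reversible (dS = 0) process is the limiting, idealized case.
```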
Thus, crystals (and all other low-entropy states) are perfectly acceptable to the second law as long as their appearance increases the entropy of something else.

So, is the entropy of the entire universe continually increasing? Thermodynamics was born from observations of systems on the length scale of a few meters interacting with surroundings of a similar size, so it makes sense to ask whether the second law holds for much larger systems, such as our galaxy. From one perspective, the question about the entropy of the entire universe is probably impossible even to ask, because the entropy of the whole universe is probably not definable (neither is the total energy!). However, if we define the surroundings to be the region of space that is interacting with the system, then it is probably safe to say that if the entropy of the system decreases (even if the system is a galaxy or a cluster of galaxies or something even larger), then the entropy of the surroundings must increase by a greater amount.

Related to this discussion about the entropy of the universe, it is certainly true that gravity tends to reduce entropy by forming planets, stars, solar systems, galaxies, clusters of galaxies, etc. So if this order is appearing, what part of the universe is paying for the order by becoming disordered? The answer at least partially lies in the gravitational field itself. As matter is organized into planets, stars, and so forth, energy and entropy are produced in the gravitational field (in the form of gravitational waves). At present most gravitational theorists believe that the entropy increase, partially present in the gravitational field, is greater than the decrease due to the organized matter, satisfying the second law.

IV. ENTROPY AND ENERGY AVAILABLE TO DO WORK

A consequence of this new form of the second law is that a heat engine that utilizes irreversible steps (that is, any real engine) must be even less efficient than a heat engine that operates reversibly (which already cannot be 100% efficient). This can be seen by thinking about the Carnot cycle, illustrated in Fig. 2. Let's assume that we have a hot reservoir at T_HR and a cold reservoir at T_CR, which are both fixed temperatures. The most efficient engine that can utilize these two reservoirs is the reversible engine, where T_H is just (infinitesimally) below T_HR and T_C is just above T_CR. Since the efficiency of the engine is given by e = 1 − (T_C/T_H), the reversible engine has an efficiency equal to e = 1 − (T_CR/T_HR). Any other engine that can utilize these two reservoirs must have T_H < T_HR and T_C > T_CR. Therefore T_C/T_H is larger for any irreversible engine operating between these two reservoirs, and the efficiency is smaller than for the reversible engine. This is a generic result: whenever entropy increases, the amount of microscopic energy available to do work is decreased. In other words, an increase of entropy degrades the usefulness of microscopic energy.

A very instructive example of this result is the so-called free expansion of a gas. Suppose an ideal gas is enclosed in a container with rigid, thick, insulating walls. Initially the gas is in equilibrium and is confined to half the container, say, by a removable partition. The other half of the container is evacuated. Suddenly the partition is removed and the gas rushes freely into the vacuum. Of course, this process leads to turbulent swirling of the gas, during which pressure, density, and temperature rapidly change from place to place in the container. That is, during this turbulent phase the thermodynamic properties of the gas are not well defined (so that its changing state does not even exist on the P-V diagram!). Eventually, however, the magic of internal equilibrium asserts itself and the gas re-establishes a state of thermodynamic equilibrium, with well-defined T, P, V, etc.

The rapid expansion of the gas is highly irreversible. Certainly, merely reintroducing the partition will not cause the gas to spontaneously compress back to its original state. Nonetheless, the gas has well-defined initial and final states, so U_f − U_i and S_f − S_i can be calculated by connecting the initial and final states with any process, including a reversible process. We first argue that U_f − U_i is zero for free expansion. This is because during the actual intervening events the gas moves no walls, so no work is done, and the walls are insulating, so no heat flows. (We assume that the partition is so light that removing it doesn't cost any energy.) If the gas is ideal, its internal energy depends on temperature only, so the temperature of the final state must equal the temperature of the initial state. Using PV = N k_B T, we can state that if P_i, T_i, and V_i are the gas's initial thermodynamic properties, then P_f = P_i/2, T_f = T_i, and V_f = 2V_i are its final ones.

There are infinitely many processes that could connect these two states. The most direct, perhaps, is a reversible isothermal expansion. By looking at the P-V diagram we know that isothermal expansion is accompanied by an increase in entropy. Thus the entropy of the freely expanded gas must have increased. We can use Eq. (1) to calculate this increase in entropy. Using ln(ab) = ln(a) + ln(b), we can rewrite Eq. (1) as

S = N k_B { ln(V) + ln[ (1/N)(U/N)^{3/2} C ] },    (8)

so that

ΔS = N k_B ln(V_f/V_i) = N k_B ln(2),    (9)

which is positive.

Figure 3. Before and after removal of the partition that separates the two halves of the container; the gas is initially in the left-hand side. Left panel: "gas ready to do work". Right panel: "gas will never do work".

So how is this example of free expansion connected with the degradation of the usefulness of energy? Remember, the initial and final states have exactly the same internal energy. But, as we will now argue, the final state's energy isn't as useful as the initial state's. Here's the story. Go back to the initial unexpanded state. Insert a paddle-wheel contraption in the evacuated portion of the chamber, as shown in Fig. 3. Pull the partition down halfway. As the gas rushes into the vacuum it can turn the paddle wheel. If the paddle wheel is connected to an electric generator, for example, its motion can produce an electric current.
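Here is a quick check of Eq. (9) with hypothetical numbers: the entropy increase of a freely expanded ideal gas computed from the state function [Eqs. (1) and (8)], and again from the heat absorbed in the reversible isothermal process connecting the same two states [Eq. (2)]. Since U (and hence T) is unchanged, only the volume term contributes.

```python
import math

k_B = 1.380649e-23  # J/K
N = 1e22            # hypothetical particle number

# Free expansion doubles the volume at fixed U, so by Eq. (9):
dS = N * k_B * math.log(2)

# Equivalent route: heat absorbed in the reversible isothermal expansion
# that connects the same two states, divided by T [Eq. (2)]. Any T works.
T = 300.0
Q_reversible = N * k_B * T * math.log(2)

print(dS, Q_reversible / T)   # same number: about 0.096 J/K
```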
That is, the rushing gas can do useful work. Of course, in this case after it fully expands it won't be in the same final state as before, because some of its internal energy will have been transferred to the surroundings via work. The gas will be cooler. On the other hand, suppose we let the gas freely expand as before. Then insert into it a half-pulled-down partition and a paddle wheel, as in the second panel of the figure. In the expanded state the gas has the same internal energy as in the initial compressed state, but now it will never turn the paddle wheel. The increase in entropy of the gas has resulted in a loss of usefulness of the internal energy.

V. MICROSCOPIC DEFINITION OF ENTROPY

As we discussed in Section I, the purely thermodynamic (or macroscopic) definition of entropy states that (1) entropy is a state function, and that (2) its change can be calculated via ΔS = ΔQ/T. Entropy can also be defined from a microscopic viewpoint. In fact, as we see below, entropy connects what we know macroscopically about the system to what we do not know microscopically about the system. The microscopic definition of entropy, due to Boltzmann, is

S = k_B ln(Z),    (10)

where Z is the number of microstates consistent with a given macrostate of the system. What does this mean? Well, for a system in thermodynamic equilibrium the macrostate is the thermodynamic state defined by values of P, V, N, T, U, and S, for example. Of course, for any given macrostate there are typically an incredibly large number of microstates that could result in that particular macrostate, as we now show.

To understand Eq. (10), let's consider the simulation discussed in class where we had N sites, each starting out with an equal amount of energy, illustrated in Fig. 4(a) with one energy unit on each site. Recall that we can define the energy distribution function for this system, which is a histogram of the number of sites with a given amount of energy. In the simulation, the initial (microscopic) state of the system is one where all of the sites have one energy unit, so the energy distribution function, shown in Fig. 4(b), is a spike at one energy unit that is N sites high. Let's also recall what happens when we turn on the simulation and let the sites exchange energy. After a while there is a much broader distribution function, with some sites having no energy, some having one energy unit, some with two, and so forth. A plot of the system and its distribution function at some later time are shown in Fig. 5.

For this system P and V aren't really important, and N is fixed, so the only interesting thermodynamic variables are T, U, and S. You might think that U isn't too interesting, since it is fixed at N energy units, but energy is very interesting because it is constantly being transferred among the individual sites.

OK, so what is the macrostate and what is the microstate for the simulated system? The macrostate is defined by the distribution function [parts (b) of Figs. 4 and 5], while the microstate is defined by the exact placement of the energy bits [parts (a) of Figs. 4 and 5]. Here is how the entropy enters into all of this. In thermodynamics we only ever know the macrostate of the system, not the exact microstate. To calculate the entropy in the simulation, we take the given distribution function (macrostate) and ask how many exact partitionings of energy among the sites (microstates) are consistent with that macrostate. That number is Z in Eq. (10). In general Z is an incredibly large number: for the distribution function shown in Fig. 5(b), Z is astronomically large, and this is for a system with only N sites and one energy unit per site. For a macroscopic system with 10^23 atoms, the number of energy configurations is even more staggering.

Figure 4. Initial conditions of the energy exchange simulation. (a) Initial energy (= 1 energy unit) at each site. (b) Energy distribution function (histogram of number of sites vs energy).
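To see how Z is computed from a histogram, here is a small sketch (a hypothetical 100-site example; the specific histograms are made up) that counts the microstates consistent with a given distribution function using the multinomial coefficient Z = N! / (n_0! n_1! n_2! ...), where n_E is the number of sites holding E energy units, and then evaluates Eq. (10).

```python
import math

def ln_Z_from_hist(hist):
    """ln(Z), where Z = N! / prod(n_E!) is the number of distinct
    assignments of energies to labeled sites that reproduce `hist`,
    with hist[E] = number of sites holding E energy units.
    We work with ln(Z) because Z itself overflows easily."""
    N = sum(hist)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in hist)

# Macrostate 1: the special initial spike (all 100 sites hold 1 unit).
spike = [0, 100]
# Macrostate 2: a broader, roughly exponential histogram with the same
# 100 sites and the same 100 total energy units (hypothetical numbers).
broad = [50, 24, 13, 6, 4, 2, 1]

for hist in (spike, broad):
    print(ln_Z_from_hist(hist))   # Eq. (10): S/k_B = ln Z

# The spike gives ln Z = 0 (a single microstate, S = 0), while the
# broad distribution gives ln Z ~ 127, i.e. Z ~ 10^55 microstates.
```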
The simulation also illustrates very nicely the principle that any spontaneous process results in an increase in entropy. In the initial, very special state, the distribution function is a spike at one energy unit. A little thought reveals that there is only one microstate corresponding to this macrostate, that shown in Fig. 4(a), where each site has exactly one energy unit. Thus in Eq. (10) Z = 1, which implies ln(Z) = 0, and thus the entropy is initially zero. As the system evolves in time we end up with a distribution function that has a maximal number of microstates associated with it, the distribution function shown in Fig. 5(b). At this point the entropy has increased to its maximum value. The graph in Fig. 6 shows the time evolution of the entropy for the simulated system.

Figure 5. Energy exchange simulation after 7 time steps. (a) Energy at each site. (b) Energy distribution function (histogram of sites vs energy). The green curve is the theoretical calculation for the long-time average of the distribution function.

Figure 6. Entropy S (in units of k_B) vs time for the simulated system. Initially S = 0; it then increases to its thermodynamic value within a relatively small number of time steps.

One last point associated with this simulation. The simulated system has well-defined values of N, U, and S (which is initially increasing) during all stages of evolution to the maximum-entropy macrostate, but it is only after the entropy has reached its maximum value that the temperature T is well defined. As we discussed earlier in class, the temperature is related to the slope of the late-time distribution function. Thus the second law is also consistent with the evolution of an isolated system to a well-defined set of thermodynamic variables. This is exactly what is meant by thermodynamic equilibrium. (Referring to the language of dynamical systems and chaos, we can say that the attractor state of the system is a macroscopic state with well-defined thermodynamic variables.)

We point out that the evolution to thermodynamic equilibrium was already discussed with regard to the freely expanding gas. Initially, the gas is in thermodynamic equilibrium at T_i, V_i, and P_i. When the partition is first removed and the gas expands into the evacuated space, it cannot be characterized by these simple thermodynamic variables. However, if we wait a while we will find that, again, it has well-defined values of T, V, and P: thermodynamic equilibrium is again achieved as the entropy increases to its maximal thermodynamic value in the larger volume.
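The in-class simulation is easy to reproduce. Below is a minimal sketch under assumed rules (100 sites, one energy unit each, and a random exchange move in which a randomly chosen site donates one unit to another; the actual class simulation may have used a different update rule). It tracks S = k_B ln Z of the instantaneous histogram and reproduces the qualitative rise and plateau of Fig. 6.

```python
import math, random
from collections import Counter

def ln_Z(energies):
    """ln of the number of microstates consistent with the current
    histogram: Z = N! / prod(n_E!), as in Eq. (10)."""
    N = len(energies)
    counts = Counter(energies)
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts.values())

random.seed(1)
N = 100
energies = [1] * N        # the special initial state: Z = 1, so S = 0

for step in range(201):
    if step % 50 == 0:
        print(step, ln_Z(energies))   # S/k_B = ln Z at this time step
    # One "time step": N random pairwise transfers of one energy unit.
    for _ in range(N):
        i, j = random.randrange(N), random.randrange(N)
        if energies[i] > 0:           # donor must have energy to give
            energies[i] -= 1
            energies[j] += 1
# ln Z starts at 0 and climbs to a plateau near its maximum value,
# while the total energy (100 units) is conserved throughout.
```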

VI. CHILDREN OF ENTROPY?

As we discussed above, low-entropy states (such as life) are certainly allowed if the state is paid for by an increase in entropy elsewhere. Interestingly, life is not only compatible with the second law; it may in fact be actively helping it out. There are many examples of physical systems with remarkably orderly behavior in space and time that exist because they facilitate the making of entropy elsewhere. These are generically called dissipative structures, because they help dissipate organization of energy.

One example occurs when a liquid in a container is heated from below and cooled from above. When the temperature difference between bottom and top is maintained at a small value, the most efficient way of creating an entropy increase is via random atomic collisions: heat conduction (diffusion of heat). In conduction there is no net motion of the fluid; atoms remain in incoherent motion only. When the temperature difference is increased, however, convection begins. In convection, large blobs of the liquid rise to the top (because of buoyancy: they are less dense than their surroundings), where they cool and fall back down. Throughout the liquid, atoms are still moving incoherently, but in the blobs there is coherent motion as well. Thus, under the right conditions, the overall increase in entropy leads to coherent motion.

When the temperature difference is made a little bigger, it is possible for the convective motion to organize even further into cylindrical cell-like structures. In these cells, fluid moves up the center and falls on the outsides. These beautiful cells (they have hexagonal cross-sections) facilitate the production of entropy in the universe and are thus a favorite trick of nature. Dissipation of organization can make wonderfully ordered structures, as long as those structures help make more entropy elsewhere. Ultimately, when the temperature difference is made too large, turbulent roiling of the liquid destroys the convection cells: at too high a temperature difference, convection cells just don't do as well at making entropy as turbulence does.

Consider now our planet. There is heat energy coming in from a hot source (the sun) and heat transmitted out to a cold sink (the rest of the universe). So the interesting question is: are the organized structures we see on our planet (i.e., life) the result of the overall increase in entropy that is occurring all around us? That is, are we indeed children of entropy?