03. Boltzmann Entropy, Gibbs Entropy, Shannon Information. I. Entropy in Statistical Mechanics.


Goal: To explain the behavior of macroscopic systems in terms of the dynamical laws governing their microscopic constituents. In particular: to provide a micro-dynamical explanation of the 2nd Law.

1. Boltzmann's Approach.

Consider different "macrostates" of a gas. [Figure: a gas in a box passing through a sequence of macrostates, ending at equilibrium. Portrait: Ludwig Boltzmann.] Why does the gas prefer to be in the equilibrium macrostate (the last one)?

Suppose the gas consists of N identical particles governed by Hamilton's equations of motion (the micro-dynamics).

Def. 1. A microstate X of a gas is a specification of the position (3 values) and momentum (3 values) of each of its N particles.

- Ω = phase space = the 6N-dimensional space of all possible microstates.
- Ω_E = the region of Ω consisting of all microstates with constant energy E.

Hamiltonian dynamics maps an initial microstate X_i to a final microstate X_f. Can the 2nd Law be explained by recourse to this dynamics? [Figure: a trajectory from X_i to X_f inside Ω_E.]

Def. 2. A macrostate Γ of a gas is a specification of the gas in terms of macroscopic properties (pressure, temperature, volume, etc.).

Relation between microstates and macrostates: macrostates supervene on microstates.

- To each microstate there corresponds exactly one macrostate.
- Many distinct microstates can correspond to the same macrostate.

So: Ω_E is partitioned into a finite number of regions corresponding to macrostates, with each microstate X belonging to exactly one macrostate Γ(X). [Figure: Ω_E partitioned into macrostate regions, with the microstate X lying inside its macrostate Γ(X).]

Claim #1: Different macrostates have vastly different sizes.

Def. 3. The Boltzmann entropy is defined by

S_B(Γ(X)) = k log |Γ(X)|,   where |Γ(X)| = the volume of Γ(X).

So S_B(Γ(X)) is a measure of the size of Γ(X).

Claim #2: The equilibrium macrostate Γ_eq is vastly larger than any other macrostate (in other words, S_B obtains its maximum value for Γ_eq). [Figure: Ω_E almost entirely filled by Γ_eq, with very small nonequilibrium macrostates at the edges.]

Thus: S_B increases over time because, for any initial microstate X_i, the dynamics will map X_i into Γ_eq very quickly, and then keep it there for an extremely long time.
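To make Claim #2 concrete, here is a minimal toy sketch (my addition, not from the notes): coarse-grain N labeled particles by how many sit in the left half of a box, so that the "volume" of a macrostate is just a binomial count. Even for N = 100, the macrostates near the even split contain almost all microstates.

```python
from math import comb, log

# Toy coarse-graining (hypothetical, for illustration): a "microstate" assigns
# each of N labeled particles to the left or right half of a box (2**N total);
# the macrostate Gamma_n records only the number n on the left, so its size
# |Gamma_n| is the binomial coefficient C(N, n).
N = 100
size = {n: comb(N, n) for n in range(N + 1)}
total = 2 ** N

# Fraction of all microstates lying in the near-equilibrium macrostates:
near_eq = sum(size[n] for n in range(45, 56)) / total
print(f"fraction with 45 <= n <= 55: {near_eq:.3f}")   # ~0.73

# Boltzmann entropy (k = 1): maximal at the even split, 0 at the extremes.
print(f"S_B(n=50) = {log(size[50]):.1f}, S_B(n=0) = {log(size[0]):.1f}")
```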

Two Ways to Explain the Approach to Equilibrium:

(a) Appeal to Typicality

Claim: A system approaches equilibrium because equilibrium microstates are typical and nonequilibrium microstates are atypical.

Why? For large N, Ω_E is almost entirely filled up with equilibrium microstates. Hence they are "typical".

- But: What is it about the dynamics that evolves atypical states to typical states? "If a system is in an atypical microstate, it does not evolve into an equilibrium microstate just because the latter is typical." (Frigg 2009)
- Need to identify properties of the dynamics that guarantee that atypical states evolve into typical states.
- And: need to show that these properties are themselves typical.
- Ex: If the dynamics is chaotic (in an appropriate sense), then (under certain conditions) any initial microstate X_i will quickly be mapped into Γ_eq and remain there for long periods of time. (Frigg 2009)

(b) Appeal to Probabilities

Claim: A system approaches equilibrium because it evolves from states of lower toward states of higher probability, and the equilibrium state is the state of highest probability.

Associate probabilities with macrostates: the larger the macrostate, the greater the probability of finding a microstate in it.

"In most cases, the initial state will be a very unlikely state. From this state the system will steadily evolve towards more likely states until it has finally reached the most likely state, i.e., the state of thermal equilibrium."

[Figure: the single-particle phase space Ω_µ partitioned into cells w_1, w_2, w_3, ..., with particle P_6's state in w_1 and particle P_89's state in w_3. Arrangement #1: state of P_6 in w_1, state of P_89 in w_3, etc.]

- Start with the 6-dimensional phase space Ω_µ of a single particle. (Ω_E = N copies of Ω_µ.)
- Partition Ω_µ into l cells w_1, w_2, ..., w_l of size δw.
- A state of an N-particle system is given by N points in Ω_µ. (A point in Ω_µ = a single-particle microstate.)

Def. 4. An arrangement is a specification of which points lie in which cells.

[Figure: the same cells, now with P_89's state in w_1 and P_6's state in w_3. Arrangement #2: state of P_89 in w_1, state of P_6 in w_3, etc. Both arrangements have the same distribution: (1, 0, 2, 0, 1, 1, ...).]

Def. 5. A distribution is a specification of how many points (regardless of which ones) lie in each cell. It takes the form (n_1, n_2, ..., n_l), where n_j = the number of points in cell w_j.

Note: More than one arrangement can correspond to the same distribution.

How many arrangements G(D_i) are compatible with a given distribution D_i = (n_1, n_2, ..., n_l)?

Answer:

G(D_i) = N! / (n_1! n_2! ⋯ n_l!)

(Recall: n! = n(n−1)(n−2)⋯1 = the number of sequences in which n distinguishable objects can be arranged, and 0! = 1.)

Check: Let D_1 = (N, 0, ..., 0) and D_2 = (N−1, 1, 0, ..., 0).

- G(D_1) = N!/N! = 1. (There is only one way for all N particles to be in w_1.)
- G(D_2) = N!/(N−1)! = N(N−1)(N−2)⋯1 / ((N−1)(N−2)⋯1) = N. (There are N different ways w_2 could have one point in it; namely, if P_1 was in it, or if P_2 was in it, or if P_3 was in it, etc.)

"The probability of this distribution [D_i] is then given by the number of permutations of which the elements of this distribution are capable, that is by the number [G(D_i)]. As the most probable distribution, i.e., as the one corresponding to thermal equilibrium, we again regard that distribution for which this expression is maximal..."
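A quick sketch (my addition, not from the notes) that computes G(D) directly and reproduces the two checks, plus one more data point: an evenly spread distribution is compatible with vastly more arrangements.

```python
from math import factorial

def G(distribution):
    """Number of arrangements compatible with a distribution (n_1, ..., n_l)."""
    count = factorial(sum(distribution))
    for n in distribution:
        count //= factorial(n)   # exact integer division: N! / (n_1! ... n_l!)
    return count

N = 10
print(G([N, 0, 0, 0, 0]))      # D_1 = (N, 0, ..., 0)   -> 1
print(G([N - 1, 1, 0, 0, 0]))  # D_2 = (N-1, 1, 0, ...) -> N = 10
print(G([2, 2, 2, 2, 2]))      # even spread            -> 113400
```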

Note: Each distribution D_i corresponds to a macrostate Γ_{D_i}. Why? Because a system's macroscopic properties (volume, pressure, temperature, etc.) depend only on how many particles are in particular microstates, and not on which particles are in which microstates.

What is the size of this macrostate?

- Note: A point in Ω_E corresponds to an arrangement in Ω_µ.
- And: The size of a macrostate Γ_{D_i} in Ω_E is given by the number of points it contains (i.e., the number of arrangements compatible with D_i) multiplied by a volume element of Ω_E.
- And: A volume element of Ω_E is given by N copies of a volume element δw of Ω_µ.

So: the size of Γ_{D_i} is

|Γ_{D_i}| = (number of arrangements compatible with D_i) × (volume element of Ω_E) = G(D_i) δw^N

Thus: the Boltzmann entropy of Γ_{D_i} is given by:

S_B(Γ_{D_i}) = k log(G(D_i) δw^N) = k log G(D_i) + Nk log δw = k log G(D_i) + const.

"... S_B is a measure of how much we can infer about the arrangement of a system on the basis of its distribution. The higher S_B, the less information a distribution conveys about the arrangement of the system." (Frigg & Werndl 2011)

S_B(Γ_{D_i}) = k log G(D_i) + const.
= k log [N! / (n_1! n_2! ⋯ n_l!)] + const.
= k log N! − k log n_1! − ... − k log n_l! + const.
≈ (Nk log N − Nk) − (n_1 k log n_1 − n_1 k) − ... − (n_l k log n_l − n_l k) + const.   [Stirling's approximation: log n! ≈ n log n − n, together with n_1 + ... + n_l = N]
= −k Σ_{j=1}^l n_j log n_j + const.   [the Nk log N term is absorbed into the constant, since N is fixed]

Let p_j = n_j/N = the probability of finding a randomly chosen microstate in cell w_j. Then:

S_B(Γ_{D_i}) = −Nk Σ_{j=1}^l p_j log p_j + const.

The biggest value of S_B is for the distribution D_i for which the p_j's are all 1/l (i.e., the n_j's are all N/l), which is the equilibrium distribution when no further constraints are imposed. Boltzmann (1872) shows: the D_i that maximizes S_B subject to the constraints on total particle number and total energy is characterized by n_j = µe^{−λε_j} (the Maxwell equilibrium distribution, where ε_j = the energy of microstates in cell w_j).
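As a sanity check (my addition, not in the notes), one can compare the exact k log G(D) with the Stirling form −Nk Σ_j p_j log p_j. With k = 1 the two already agree closely at N = 1000, and the relative error shrinks as N grows.

```python
from math import factorial, log

def log_G(ns):
    """Exact log of the number of arrangements for distribution ns."""
    return log(factorial(sum(ns))) - sum(log(factorial(n)) for n in ns)

def stirling_form(ns):
    """The Stirling-approximated form -N * sum_j p_j log p_j (k = 1)."""
    N = sum(ns)
    return -N * sum((n / N) * log(n / N) for n in ns if n > 0)

ns = [500, 250, 125, 125]   # N = 1000 particles over l = 4 cells
print(log_G(ns))            # exact value
print(stirling_form(ns))    # approximation; the gap is O(log N), ~1% here
```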

Relation between Boltzmann Entropy S_B and Thermodynamic Entropy S_T

First: let's derive the Maxwell-Boltzmann equilibrium distribution.

Assume: a system of N weakly interacting particles described by:

Σ_j n_j = N,   Σ_j ε_j n_j = U

where n_j = the number of states in cell w_j, N = the total number of particles, ε_j = the energy associated with states in w_j, and U = the total internal energy.

Recall: S_B(n_j) = (Nk log N − Nk) − k Σ_j (n_j log n_j − n_j) + const.

So:

dS_B = 0 − k Σ_j [d(n_j log n_j − n_j)/dn_j] dn_j + 0
     = −k Σ_j (log n_j + n_j/n_j − 1) dn_j
     = −k Σ_j log n_j dn_j

Now: S_B takes its maximum value for the values n_j* that solve

dS_B = −k Σ_j log n_j* dn_j = 0   [small changes in S_B due only to small changes dn_j]

subject to the constraints on the small changes dn_j:

dN = Σ_j dn_j = 0,   dU = Σ_j ε_j dn_j = 0

Note: We can add arbitrary multiples of the constraints to our equation and still get a zero result:

dS_B = −Σ_j (k log n_j* + α + βε_j) dn_j = 0

Or: k log n_j* + α + βε_j = 0, for appropriate choices of α and β.

Now solve for n_j*:

n_j* = e^{−(α + βε_j)/k}

This is the Maxwell-Boltzmann equilibrium distribution for weakly interacting, distinguishable particles (the most probable distribution, corresponding to the largest macrostate).
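Here is a minimal numerical sketch (my addition; the cell energies and totals are made-up values) that recovers n_j* ∝ e^{−βε_j} by tuning β until the energy constraint Σ_j ε_j n_j = U is met, with k set to 1 and α absorbed into the normalization.

```python
from math import exp

eps = [0.0, 1.0, 2.0, 3.0, 4.0]   # hypothetical cell energies ε_j
N, U = 1000.0, 800.0              # total particle number and total energy

def occupations(beta):
    """n_j* = N e^{-β ε_j} / Σ_k e^{-β ε_k} (k = 1, α folded into Z)."""
    weights = [exp(-beta * e) for e in eps]
    Z = sum(weights)
    return [N * w / Z for w in weights]

def energy(beta):
    return sum(e * n for e, n in zip(eps, occupations(beta)))

# Bisection on β: energy(β) decreases monotonically from N*mean(ε) toward N*min(ε).
lo, hi = 0.0, 50.0
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (lo, mid) if energy(mid) < U else (mid, hi)

beta = (lo + hi) / 2
print("beta =", beta)
print("n_j* =", [round(n, 1) for n in occupations(beta)])
print("checks:", round(sum(occupations(beta)), 1), round(energy(beta), 1))
```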

What is α? Substitute n_j* into Σ_j n_j = N to get:

N = Σ_j e^{−(α + βε_j)/k} = e^{−α/k} Σ_j e^{−βε_j/k}

Or: α = −k log (N / Σ_j e^{−βε_j/k}). So α is a normalization constant that enforces the correct particle number N.

More importantly: what is β? Consider small changes in the internal energy of a reversible process:

Macroscopic point of view: dU = δQ − δW = T dS_T − P dV
Microscopic point of view: dU = d(Σ_j ε_j n_j) = Σ_j ε_j dn_j + Σ_j n_j dε_j

Note: If −P dV = Σ_j n_j dε_j, then dS_T = (1/T) Σ_j ε_j dn_j.

Recall:

dS_B(n_j*) = −k Σ_j log n_j* dn_j
           = Σ_j (α + βε_j) dn_j,   since log n_j* = log e^{−(α + βε_j)/k} = −(α + βε_j)/k
           = β Σ_j ε_j dn_j,   since α Σ_j dn_j = 0

So: For the equilibrium distribution n_j*, S_B = S_T provided β = 1/T.

Macroscopic point of view: dU = δQ − δW = T dS_T − P dV
Microscopic point of view: dU = d(Σ_j ε_j n_j) = Σ_j ε_j dn_j + Σ_j n_j dε_j

Recap: dS_B(n_j*) = β Σ_j ε_j dn_j. So, for the equilibrium distribution n_j*, S_B = S_T provided β = 1/T.

What this shows: For a reversible process involving a large number of distinguishable particles characterized by their positions and velocities, it is consistent to identify the Boltzmann entropy with the thermodynamic entropy.

But: Are we forced to?

- S_T measures absolute changes in heat per temperature in a reversible process.
- For thermally isolated processes, S_T absolutely increases or remains constant.
- S_B measures how likely a given distribution of states is.
- There is no absolute law that requires S_B to increase or remain constant.

2. Gibbs' Approach.

Boltzmann's schtick: analysis of a single system. Each point of phase space Ω represents a possible state of the system. [Portrait: Willard Gibbs]

Gibbs' schtick: analysis of an ensemble of infinitely many systems. Each point of phase space Ω represents a possible state of one member of the ensemble.

The state of the entire ensemble is given by a density function ρ(x, t) on Ω: ρ(x, t) dx is proportional to the number of systems in the ensemble whose states lie in the region (x, x + dx), normalized so that

p_t(R) = ∫_R ρ(x, t) dx = the probability at time t of finding the state of a system in region R.

The Gibbs entropy is then given by:

S_G(ρ) = −k ∫_Ω ρ(x, t) log(ρ(x, t)) dx
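A small numerical sketch (my addition, not from the notes): for a concrete ρ, the Gibbs integral can be evaluated directly. Taking a one-dimensional Gaussian density and k = 1, the quadrature matches the known closed form ½ log(2πeσ²).

```python
import numpy as np

# Gibbs entropy S_G = -∫ ρ log ρ dx for a Gaussian density (k = 1).
sigma = 1.5
x = np.linspace(-12, 12, 4001)
rho = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

S_numeric = -np.trapz(rho * np.log(rho), x)        # trapezoidal quadrature
S_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)  # closed form for a Gaussian
print(S_numeric, S_exact)   # agree to several decimal places
```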

Interpretive Issues:

(1) Why do low-probability states evolve into high-probability states? Characterizations of the dynamics are, again, required to justify this.

(2) How are the probabilities to be interpreted?

(a) Ontic probabilities = properties of physical systems. Long-run frequencies? Single-case propensities?
(b) Epistemic probabilities = measures of degrees of belief. Objective (rational) degrees of belief? Subjective degrees of belief?

II. Entropy in Classical Information Theory.

Goal: To construct a measure of the amount of information associated with a message.

The amount of information gained from the reception of a message depends on how likely it is: the less likely a message is, the more information is gained upon its reception. [Portrait: Claude Shannon]

Let X = {x_1, x_2, ..., x_l} = a set of l messages.

Def. 1: A probability distribution P = (p_1, p_2, ..., p_l) on X is an assignment of a probability p_j = p(x_j) to each message x_j. (Recall: this means p_j ≥ 0 and p_1 + p_2 + ... + p_l = 1.)

Def. 2: A measure of information for X is a real-valued function H(X): {probability distributions on X} → R that satisfies:

- Continuity. H(p_1, ..., p_l) is continuous.
- Additivity. H(p_1 q_1, ..., p_l q_l) = H(P) + H(Q), for probability distributions P, Q.
- Monotonicity. Information increases with l for uniform distributions: if m > l, then H(Q) > H(P), for any P = (1/l, ..., 1/l) and Q = (1/m, ..., 1/m).
- Branching. H(p_1, ..., p_l) is independent of how the process is divided into parts.
- Bit normalization. The average information gained for two equally likely messages is one bit: H(½, ½) = 1.

Claim (Shannon 1949): There is exactly one function that satisfies these criteria; namely, the Shannon entropy (or Shannon information):

H(X) = −Σ_{j=1}^l p_j log₂ p_j

- H(X) is maximal for p_1 = p_2 = ... = p_l = 1/l.
- H(X) = 0 just when one p_j is 1 and the rest are 0.
- The logarithm is to base 2: log₂ x = y ⟺ x = 2^y. Bit normalization requires: if X = {x_1, x_2} and P = (½, ½), then H(X) = 1. Note: H(X) = −(½ log ½ + ½ log ½) = log 2, and log 2 = 1 if and only if the log is to base 2.

In what sense is this a measure of information?
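A minimal implementation of H (my addition, not from the notes), with spot checks of bit normalization, the zero case, and maximality at the uniform distribution.

```python
from math import log2

def H(ps):
    """Shannon entropy in bits of a probability distribution."""
    assert abs(sum(ps) - 1.0) < 1e-9 and all(p >= 0 for p in ps)
    return -sum(p * log2(p) for p in ps if p > 0)

print(H([0.5, 0.5]))            # bit normalization: 1.0
print(H([1.0, 0.0, 0.0]))       # certainty carries no information: 0.0
print(H([0.25] * 4))            # maximal for the uniform distribution: 2.0
print(H([0.7, 0.1, 0.1, 0.1]))  # any non-uniform P on 4 messages gives less: ~1.36
```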

1. H(X) as Maximum Amount of Message Compression

Let X = {x_1, ..., x_l} be a set of letters from which we construct messages. Suppose the messages are N letters apiece. The probability distribution P = (p_1, ..., p_l) is now over the letter set. What this means:

- Each letter x_i has a probability p_i of occurring in a message.
- In other words: a typical message will contain p_1 N occurrences of x_1, p_2 N occurrences of x_2, etc.

Thus:

the number of distinct typical messages = N! / ((p_1 N)! (p_2 N)! ⋯ (p_l N)!)

So:

log₂(number of distinct typical messages) = log₂ [N! / ((p_1 N)! (p_2 N)! ⋯ (p_l N)!)]

Let's simplify the RHS...

log₂ [N! / ((p_1 N)! (p_2 N)! ⋯ (p_l N)!)]
= log₂(N!) − {log₂((p_1 N)!) + ... + log₂((p_l N)!)}
≈ (N log₂ N − N) − {(p_1 N log₂(p_1 N) − p_1 N) + ... + (p_l N log₂(p_l N) − p_l N)}
= N {log₂ N − 1 − p_1 log₂ p_1 − p_1 log₂ N + p_1 − ... − p_l log₂ p_l − p_l log₂ N + p_l}
= −N Σ_{j=1}^l p_j log₂ p_j
= N H(X)

Thus: log₂(number of distinct typical messages) = N H(X).

So: the number of distinct typical messages = 2^{N H(X)}.
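A numerical check (my addition, not from the notes): count the typical messages exactly with the multinomial coefficient and compare against 2^{NH(X)} on the log scale. The two agree up to an O(log N) correction that is negligible next to NH(X).

```python
from math import factorial, log2

ps = [0.5, 0.25, 0.125, 0.125]   # letter probabilities
N = 240                          # chosen so each p_j * N is an integer

counts = [int(p * N) for p in ps]
n_typical = factorial(N)
for c in counts:
    n_typical //= factorial(c)   # N! / ((p_1 N)! ... (p_l N)!)

H = -sum(p * log2(p) for p in ps)
print(log2(n_typical))   # exact log2 of the typical-message count
print(N * H)             # N H(X) = 420; the small gap is O(log N)
```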

So: there are only 2^{NH(X)} typical messages with N letters. This means that, at the message level, we can encode them using only NH(X) bits.

Check: 2 possible messages require 1 bit: 0, 1. 4 possible messages require 2 bits: 00, 01, 10, 11. Etc.

Now: at the letter level, how many bits are needed to encode a message of N letters drawn from an n-letter alphabet? First: how many bits are needed to encode each letter of an n-letter alphabet?

n = # letters   x = # bits per letter
2 letters       1 bit: 0, 1
4 letters       2 bits: 00, 01, 10, 11
8 letters       3 bits: 000, 001, 010, 011, 100, 101, 110, 111

So: n = 2^x, i.e., x = log₂ n. Note: log₂ n bits per letter entails N log₂ n bits for a sequence of N letters.

Thus: instead of requiring N log₂ n bits to encode our messages, we can get by with only NH(X) bits. Thus: H(X) represents the maximum amount that (typical) messages drawn from a given set of letters can be compressed.

Ex: Let X = {A, B, C, D} (n = 4). Then we need log₂ 4 = 2 bits per letter; for instance, A = 00, B = 01, C = 10, D = 11. So we need 2N bits to encode a message with N letters.

Now: suppose the probabilities for each letter to occur in a typical N-letter message are the following:

p_A = 1/2, p_B = 1/4, p_C = p_D = 1/8

Then the number of bits needed to encode such typical N-letter messages is:

NH(X) = N (½ log₂ 2 + ¼ log₂ 4 + ⅛ log₂ 8 + ⅛ log₂ 8) = 1.75N

Thus: instead of requiring 2N bits to encode the N-letter messages, we can get by with only 1.75N.

Note: If all letters are equally likely (the equilibrium distribution), then p_A = p_B = p_C = p_D = 1/4, and NH(X) = N (¼ log₂ 4 + ¼ log₂ 4 + ¼ log₂ 4 + ¼ log₂ 4) = 2N.
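The 1.75-bit rate is actually achievable letter by letter with a prefix code whose codeword lengths equal each letter's surprisal −log₂ p. The sketch below (my addition, not from the notes) uses a standard such code and measures the rate on a randomly generated typical message.

```python
import random

# Prefix code matched to p = (1/2, 1/4, 1/8, 1/8): lengths 1, 2, 3, 3 bits.
code = {"A": "0", "B": "10", "C": "110", "D": "111"}
probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}

expected_len = sum(probs[s] * len(code[s]) for s in code)
print(expected_len)   # 1.75 bits per letter, versus 2 for the fixed-length code

# Encode a random typical message and measure the actual rate.
random.seed(0)
letters = random.choices(list(probs), weights=list(probs.values()), k=10_000)
encoded = "".join(code[s] for s in letters)
print(len(encoded) / len(letters))   # ~1.75 in practice
```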

2. H(X) as a Measure of Uncertainty

Suppose P = (p_1, ..., p_l) is a probability distribution over a set of values {x_1, ..., x_l} of a random variable X.

Def. 1. The expected value E(X) of X is given by E(X) = Σ_{j=1}^l p_j x_j.

Def. 2. The information gained if X is measured to have the value x_j is given by −log₂ p_j. (Motivation: the greater p_j is, the more certain x_j is, and the less information should be associated with it.)

Then the expected value of −log₂ p_j is just the Shannon information:

E(−log₂ p_j) = −Σ_{j=1}^l p_j log₂ p_j = H(X)

What this means: H(X) tells us our expected information gain upon measuring X.

3. Conditional Entropy

Def. 1: A communication channel is a device with a set of input states X = {x_1, ..., x_l} that are mapped to a set of output states Y = {y_1, ..., y_l}.

Def. 2: Given probability distributions p(x_i) and p(y_j) on X and Y, and a joint distribution p(x_i ∧ y_j), the conditional probability p(x_i | y_j) of input x_i, given output y_j, is defined by:

p(x_i | y_j) = p(x_i ∧ y_j) / p(y_j)

Def. 3: The conditional Shannon entropy of X given Y is given by:

H(X|Y) = −Σ_j p(y_j) Σ_i p(x_i | y_j) log₂ p(x_i | y_j)

What this means: H(X|Y) measures the average uncertainty about the value of an input, given a particular output value.

Suppose we use our channel a very large number N of times.

- Then: there will be 2^{NH(X)} typical input strings.
- And: there will be 2^{NH(Y)} typical output strings.
- Thus: there will be 2^{NH(X∧Y)} typical strings of X and Y values.

So: the number of typical input strings that could result in a given output Y is

2^{NH(X∧Y)} / 2^{NH(Y)} = 2^{N(H(X∧Y) − H(Y))}

Claim: H(X∧Y) = H(Y) + H(X|Y).

So: the number of typical input strings that could result in a given output Y = 2^{NH(X|Y)}.

If one is trying to use a noisy channel to send a message, then the conditional entropy specifies the number of bits per letter that would need to be sent by an auxiliary noiseless channel in order to correct all the errors due to noise.

Check the Claim:

H(X∧Y) = −Σ_{i,j} p(x_i ∧ y_j) log₂ p(x_i ∧ y_j)
= −Σ_{i,j} p(x_i ∧ y_j) log₂ [p(y_j) p(x_i | y_j)]
= −Σ_{i,j} p(x_i ∧ y_j) log₂ p(y_j) − Σ_{i,j} p(x_i ∧ y_j) log₂ p(x_i | y_j)
= −Σ_j p(y_j) log₂ p(y_j) − Σ_{i,j} p(y_j) p(x_i | y_j) log₂ p(x_i | y_j)   [using Σ_i p(x_i ∧ y_j) = p(y_j)]
= H(Y) + H(X|Y)
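A numerical check of the claim (my addition, not from the notes), using a hypothetical binary symmetric channel: the input bit is 1 with probability 0.3, and the channel flips the bit with probability 0.1.

```python
from math import log2

p_x = {0: 0.7, 1: 0.3}   # input distribution
flip = 0.1               # channel flip probability
joint = {(x, y): p_x[x] * (flip if x != y else 1 - flip)
         for x in (0, 1) for y in (0, 1)}          # p(x ∧ y)

p_y = {y: sum(joint[(x, y)] for x in (0, 1)) for y in (0, 1)}  # marginal p(y)

H_joint = -sum(p * log2(p) for p in joint.values())            # H(X ∧ Y)
H_Y = -sum(p * log2(p) for p in p_y.values())                  # H(Y)
H_X_given_Y = -sum(joint[(x, y)] * log2(joint[(x, y)] / p_y[y])
                   for (x, y) in joint)                        # H(X|Y)

print(H_joint)             # H(X ∧ Y)
print(H_Y + H_X_given_Y)   # identical, confirming H(X ∧ Y) = H(Y) + H(X|Y)
```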

Interpretive Issues:

1. How should the probabilities p(x_i) be interpreted?

Emphasis is on uncertainty: the information content of a message x_i is a function of how uncertain it is, with respect to the receiver. So perhaps the probabilities are epistemic; in particular, p(x_i) is a measure of the receiver's degree of belief in the accuracy of message x_i.

But: the probabilities are set by the nature of the source. If the source is not probabilistic, then p(x_i) can be interpreted epistemically. If the source is inherently probabilistic, then p(x_i) can be interpreted as the ontic probability that the source produces message x_i.

2. How is Shannon Information/Entropy related to other notions of entropy?

Thermodynamic entropy: ΔS = S_f − S_i = ∫_i^f δQ_R / T   (where R indicates a reversible process)
Boltzmann entropy: S_B(Γ_{D_i}) = −Nk Σ_{j=1}^l p_j ln p_j + const.
Gibbs entropy: S_G(ρ) = −k ∫_Ω ρ(x, t) ln(ρ(x, t)) dx
Shannon entropy: H(X) = −Σ_{j=1}^l p_j log₂ p_j

Can statistical mechanics be given an information-theoretic foundation? Can the 2nd Law be given an information-theoretic foundation?

Relation between Boltzmann Entropy S_B and Shannon Entropy H:

Boltzmann gas:
- N = number of single-particle microstates.
- (n_1, ..., n_l) = l-cell distribution.
- (p_1, ..., p_l) = probability distribution over cells; p_j = n_j/N = the probability that a microstate occurs in cell w_j.
- N p_j = number of microstates in cell w_j.
- S_B(Γ_{D_i}) = −Nk Σ_{j=1}^l p_j ln p_j + const. = the size of the macrostate corresponding to (p_1, ..., p_l).

Shannon message:
- N = number of letters in a message.
- {x_1, ..., x_l} = l-letter alphabet.
- (p_1, ..., p_l) = probability distribution over letters; p_j = n_j/N = the probability that x_j occurs in a given message.
- N p_j = number of x_j's in a message.
- H(X) = −Σ_{j=1}^l p_j log₂ p_j = the maximum amount an N-letter message drawn from {x_1, ..., x_l} can be compressed.
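The parallel is exact up to units: with k = 1 and the additive constant dropped, the Boltzmann form is just the Shannon form rescaled by N ln 2. A last check (my addition, not from the notes):

```python
from math import log, log2

ps = [0.5, 0.25, 0.125, 0.125]   # one distribution, read both ways
N = 1000

S_B = -N * sum(p * log(p) for p in ps)   # Boltzmann form (natural log, k = 1)
H = -sum(p * log2(p) for p in ps)        # Shannon form (base-2 log)

print(S_B)              # ~1213
print(N * log(2) * H)   # identical: S_B = (N ln 2) * H + const.
```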
