04.02 The Microscopic View of Entropy

Principles of Probability.

This is supplemental material from "Molecular Driving Forces" by K.A. Dill and S. Bromberg (Garland Science, New York, NY, 2003), Chapter 1. See the next three screencasts. This content is useful for graduate-level courses that go into more depth, or for students who want more background on probability.

Download Handout Notes to Accompany Screencasts (msu.edu)

Principles of Probability I, General Concepts, Correlated and Conditional Events. (msu.edu, 17min) (Flash)
Comprehension Questions:
1. Estimate the probability of drawing a king from a randomly shuffled deck of 52 cards.
2. A coin is flipped 5 times. Estimate the probability that heads is observed exactly 3 of the 5 times.
3. A die (singular of dice) is a cube with the numbers 1-6 inscribed on its 6 faces. If you roll the die 7 times, what is the probability that a 5 will be observed on all 7 rolls?
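If you want to check your answers numerically, a short Python sketch using only the standard library evaluates each probability directly (the variable names are illustrative; `math.comb` requires Python 3.8+):

```python
from math import comb

# Q1: probability of drawing a king from a shuffled 52-card deck
p_king = 4 / 52          # 4 kings among 52 equally likely cards

# Q2: probability of exactly 3 heads in 5 fair coin flips
# binomial: C(5,3) favorable sequences out of 2^5 equally likely sequences
p_heads = comb(5, 3) / 2**5     # 10/32 = 0.3125

# Q3: probability that a fair die shows 5 on all 7 independent rolls
p_all_fives = (1 / 6) ** 7      # ≈ 3.6e-6

print(p_king, p_heads, p_all_fives)
```

Note how the coin-flip answer counts arrangements (order matters in listing outcomes, but all 3-head sequences are lumped together), while the die answer is a simple product of independent probabilities.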

Principles of Probability II, Counting Events, Permutations and Combinations. This part discusses the binomial and multinomial coefficients for putting particles in boxes. The binomial and multinomial coefficients are used in Section 4.2 to quantify configurational entropy. (msu.edu, 16min) (Flash) You might like to check out the sample calculations below before attempting the comprehension questions.
Comprehension Questions:
1. Write the formulas for the binomial coefficient, the multinomial coefficient, and the multinomial with repetition.
2. Ten particles are distributed between two boxes. Compute the number of possible ways of achieving 7 particles in Box A and 3 particles in Box B.
3. Note that the binomial distribution is a special case of the multinomial distribution where the number of categories is 2. Also note that the total number of events for a multinomial distribution is given by M^N, where M is the number of categories (a.k.a. outcomes, e.g., boxes) and N is the number of objects (a.k.a. trials, e.g., particles). The probability of a particular observation is given by the number of combinations divided by the total number of events. Compute the probability of observing 7 particles in Box A and 3 particles in Box B.
4. Ten particles are distributed between three boxes. Compute the probability of observing 7 particles in Box A, 3 particles in Box B, and zero particles in Box C.
5. Ten particles are distributed between three boxes. Compute the probability of observing 3 particles in Box A, 3 particles in Box B, and 4 particles in Box C.
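The multinomial coefficient and the M^N counting rule described in question 3 are easy to check numerically. Here is a small Python sketch (the `multinomial` helper is written here for illustration, not a library function) that evaluates questions 2 through 5:

```python
from math import factorial

def multinomial(*counts):
    """Multinomial coefficient N! / (n1! n2! ... nM!), with N = sum of counts."""
    result = factorial(sum(counts))
    for c in counts:
        result //= factorial(c)
    return result

# Q2: ways of placing 7 of 10 particles in Box A and 3 in Box B
ways = multinomial(7, 3)                 # 120

# Q3: probability = combinations / M^N (M boxes, N particles)
p_73 = multinomial(7, 3) / 2**10         # 120/1024 ≈ 0.117

# Q4: 7 in A, 3 in B, 0 in C, with three boxes
p_730 = multinomial(7, 3, 0) / 3**10     # 120/59049 ≈ 0.0020

# Q5: 3 in A, 3 in B, 4 in C
p_334 = multinomial(3, 3, 4) / 3**10     # 4200/59049 ≈ 0.071

print(ways, p_73, p_730, p_734 if False else p_334)
```

Comparing Q3 and Q4 shows how the same occupancy (7, 3) becomes far less probable when a third box is available, because the total number of events grows from 2^10 to 3^10.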

Principles of Probability III, Distributions, Normalizing, Distribution Functions, Moments, Variance. This screencast extends beyond material covered in the textbook, but may be helpful if you study statistical mechanics in another course. (msu.edu, 15min) (Flash)

Connecting Microstates, Macrostates, and the Relation of Entropy to Disorder (uakron.edu, 14min). For small systems, we can count the number of ways of arranging molecules in boxes to understand how the entropy changes with increasing number of molecules. By studying the patterns, we can infer a general mathematical formula that avoids having to enumerate all the possible arrangements of 10^23 molecules (which would be impossible within several lifetimes). A surprising conclusion of this analysis is that entropy is maximized when the molecules are most evenly distributed between the boxes (meaning that their pressures are equal). Is it really so "disordered" to say that all the molecules are neatly arranged into equal numbers in each box? Maybe not in a literary world, but it is the only logical conclusion of a proper definition of "entropy." It is not necessary to watch the videos on probability before watching this one, but it may help. And it might help to re-watch the probability videos after watching this one. Moreover, you might like to see how the numbers relate to the equations through sample calculations (uakron.edu, 15min). These sample calculations show how to compute the number of microstates and probabilities given the particles in Boxes A, B, etc., and also the change in entropy.

Comprehension Questions:

1. What is the number of total possible microstates for: (a) 2 particles in 2 boxes (b) 5 particles in 2 boxes (c) 10 particles in 3 boxes.

2. What is the probability of observing: (a) 2 particles in Box A and 3 particles in Box B? (b) 6 particles in Box A and 4 particles in Box B? (c) 6 particles in Box A and 4 particles in Box B and 5 particles in Box C?
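The counting described in the screencast can be sketched in a few lines of Python; the helper names below (`microstate_count`, `probability`) are illustrative, not from the screencast:

```python
from math import factorial

def microstate_count(*counts):
    """Multinomial coefficient: ways of arranging the given particle
    counts into labeled boxes."""
    total = factorial(sum(counts))
    for c in counts:
        total //= factorial(c)
    return total

def probability(*counts):
    """Probability of a particular macrostate: combinations / M^N,
    where M = number of boxes and N = total number of particles."""
    m, n = len(counts), sum(counts)
    return microstate_count(*counts) / m**n

# Q1: total possible microstates M^N
print(2**2, 2**5, 3**10)        # 4, 32, 59049

# Q2: probabilities of specific macrostates
print(probability(2, 3))        # 10/32 = 0.3125
print(probability(6, 4))        # 210/1024 ≈ 0.205
print(probability(6, 4, 5))     # ≈ 0.044
```

Note that the most even distributions, like (6, 4) out of 10, have the largest microstate counts, which is the sense in which entropy favors them.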

Relating the microscopic perspective on entropy to macroscopic changes in volume (uakron.edu, 11min) Through the introduction of Stirling's approximation, we arrive at a remarkably simple conclusion for changes in entropy relative to the configurations of ideal gas molecules at constant temperature: ΔS = R ln(V2/V1). This makes it easy to compute changes in entropy for ideal gases, even for subtle changes like mixing.

Comprehension Questions:

1. Estimate ln(255!).

2. A system goes from 6 particles in Box A and 4 particles in Box B to 5 particles in each. Estimate the change in S (J/K).

3. A system goes from 6 moles in Box A and 4 moles in Box B to 5 moles in each. Estimate the change in S (J/mol-K).
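Stirling's approximation and the resulting entropy changes can be checked with a short Python sketch. The variable names are ours; `math.lgamma(N+1)` gives the exact ln(N!) for comparison, and R is rounded to 8.314 J/(mol K):

```python
from math import comb, lgamma, log

k_B = 1.380649e-23   # Boltzmann constant, J/K
R = 8.314            # gas constant, J/(mol K)

# Q1: Stirling's approximation ln(N!) ≈ N ln N - N, versus the exact value
N = 255
stirling = N * log(N) - N        # ≈ 1158.0
exact = lgamma(N + 1)            # lgamma(N+1) = ln(N!) ≈ 1161.7
print(stirling, exact)

# Q2: 10 particles go from (6, 4) to (5, 5); ΔS = k ln(Ω2/Ω1)
omega1, omega2 = comb(10, 6), comb(10, 5)   # 210 and 252 microstates
dS_particles = k_B * log(omega2 / omega1)
print(dS_particles)              # ≈ 2.5e-24 J/K

# Q3: same redistribution with moles. Applying Stirling's approximation,
# the N ln N terms cancel, leaving ΔS/k = ΣNi,old ln Ni,old - ΣNi,new ln Ni,new;
# dividing by the 10 total moles gives the answer per mole of gas.
dS_moles = R * (6*log(6) + 4*log(4) - 10*log(5)) / 10
print(dS_moles)                  # ≈ 0.17 J/(mol K)
```

The comparison in Q1 shows that the simple N ln N - N form is already accurate to within about 0.3% at N = 255, which is why it is safe to use for 10^23 molecules.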

Molecular Nature of S: Thermal Entropy (uakron.edu, 20min) We can explain configurational entropy by studying particles in boxes, but only at constant temperature. How does the entropy change if we change the temperature? Why should it change if we change the temperature? The key is to recognize that energy is quantized, as best exemplified in the Einstein Solid model. We learned in Chapter 1 that energy increases when temperature increases. If we have a constant number of particles confined to lattice locations, then the only way for the energy to increase is if some of the molecules are in higher energy states. These "higher energy states" correspond to faster (higher frequency) vibrations that stretch the bonds (Hookean springs) to larger amplitudes. We can count the number of molecules in each energy state similar to the way we counted the number of molecules in boxes. Then we supplement the formula for configurational entropy changes to arrive at the following simple relation for all changes in entropy for ideal gases: ΔS = Cv ln(T2/T1) + R ln(V2/V1). Note that we have related the entropy to changes in state variables. This observation has two significant implications: (1) entropy must also be a state function, and (2) we can characterize the entropy by specifying any two state variables. For example, substituting V = RT/P into the above equation leads to: ΔS = Cp ln(T2/T1) - R ln(P2/P1).

Comprehension Questions:
1. Show the steps required to derive ΔS = Cp ln(T2/T1) - R ln(P2/P1) from ΔS = Cv ln(T2/T1) + R ln(V2/V1).
2. We derived a memorable equation for adiabatic, reversible, ideal gases in Chapter 2. Hopefully, you have memorized it by now! Apply this formula to compute the change in entropy for adiabatic, reversible, ideal gases as they go through any change in temperature and pressure.
3. Make a table enumerating all the possibilities for 3 oscillators with 4 units of energy. 
4. Compute the change in entropy (J/K) for 100 oscillators going from 3 units of energy to 50 units of energy.
5. Compute the change in entropy (J/K) for 100 particles going from 3 boxes to 50 boxes. (This is a review of configurational entropy.)
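The Einstein-solid count, Ω = (q+N-1)!/(q!(N-1)!) for N oscillators sharing q units of energy, and the configurational count M^N can both be evaluated directly. This Python sketch addresses questions 3-5 (the helper name `einstein_omega` is ours):

```python
from math import comb, log

k_B = 1.380649e-23   # Boltzmann constant, J/K

def einstein_omega(N, q):
    """Microstates for N harmonic oscillators sharing q units of energy:
    Ω = (q + N - 1)! / (q! (N - 1)!)."""
    return comb(q + N - 1, q)

# Q3: 3 oscillators with 4 units of energy
print(einstein_omega(3, 4))                      # 15 arrangements

# Q4: 100 oscillators going from 3 to 50 units of energy; ΔS = k ln(Ω2/Ω1)
dS_thermal = k_B * (log(einstein_omega(100, 50)) - log(einstein_omega(100, 3)))
print(dS_thermal)                                # ≈ 1.1e-21 J/K

# Q5: configurational review: 100 particles, 3 boxes → 50 boxes.
# Total microstates are M^N, so ΔS = k ln(50^100 / 3^100) = 100 k ln(50/3).
dS_config = 100 * k_B * log(50 / 3)
print(dS_config)                                 # ≈ 3.9e-21 J/K
```

For Q3, printing the 15 arrangements by hand (e.g., energy splits 4-0-0, 3-1-0, 2-2-0, 2-1-1 and their permutations) is a good check that the formula counts what you think it counts.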
