Last modified on 2 December 2013, at 01:34

Biological Physics/Probability, Entropy, & the Second Law

Entropy of an Ideal Gas

When a reaction releases heat, it is said to be exothermic, with enthalpy change \Delta H < 0. If a reaction absorbs heat from its surroundings, it is said to be endothermic, with \Delta H > 0. Many endothermic reactions proceed because they rearrange themselves into more entropic final states. Entropy, in its most basic definition, is the amount of disorder a system contains. Total entropy always increases: a reaction will not proceed unless the combined entropy of the system and its environment increases.

When considering the entropy of an ideal gas, one can ask how the gas molecules can be arranged in a volume. Let's take, for example, a sphere. The entropy of an ideal gas involves its multiplicity (represented by the symbol \Omega), defined as the number of microstates corresponding to each macrostate. A microstate is a specific configuration of the system, whereas a macrostate is described by bulk quantities (such as total energy or particle number) and can be realized by many different microstates. Dividing the volume of the sphere by its smallest resolvable cell gives the spatial multiplicity \Omega_{vol} = \frac{V}{\Delta x \Delta y \Delta z}. To derive the total multiplicity for an ideal gas in the sphere, we combine the multiplicity pertaining to the volume (how many places a particle can occupy) with the multiplicity of the momentum (how many momenta that particle can have). To set the size of the smallest cell, we use the Heisenberg Uncertainty Principle, which tells us the smallest way position and momentum can be divided up: \Delta x \Delta p \approx h, where h is Planck's constant and \Delta p is the uncertainty in momentum. The uncertainty principle also implies that the particles cannot be packed too closely together, or their wave nature becomes important and they exhibit quantum mechanical behavior; this treatment essentially requires that the gas not be too dense. The kinetic energy of a particle is U = \frac{1}{2} mv^{2}, or equivalently U = \frac{p^{2}}{2m}. In three dimensions, U = \frac{p_{x}^{2} + p_{y}^{2} + p_{z}^{2}}{2m}, which is the squared length of a vector in momentum space. The multiplicity of the sphere in relation to momentum is then \Omega_{p} = \frac{V_{p}}{\Delta p_{x}\Delta p_{y}\Delta p_{z}}.
The total multiplicity of the ideal gas is then the number of ways the gas can be arranged spatially multiplied by the number of ways the momenta can be divided among the gas particles. More succinctly, \Omega_{total} = \Omega_{p} \Omega_{V}. By substitution, \Omega_{total} = \frac{V V_{p}}{\Delta x \Delta p_{x} \, \Delta y \Delta p_{y} \, \Delta z \Delta p_{z}}. We consider both position and momentum because each particle can have a different position and a different momentum. Since \Delta x \Delta p_{x} \approx h in each of the three dimensions, this simplifies to \Omega = \frac{V V_{p}}{h^{3}}.

The formula for entropy is S = k_{B}\ln(\Omega). For an ideal gas this gives the Sackur-Tetrode Equation: S = Nk_{B}\left[\ln\left(\frac{V}{N}\left(\frac{4\pi mU}{3Nh^{2}}\right)^{3/2}\right) + \frac{5}{2}\right]. When an ideal gas goes from a smaller volume to a larger one at constant temperature, working through the derivation gives \Delta S = Nk_{B}\ln\frac{V_{f}}{V_{i}}. By the first law, \Delta U = Q + W, and from the Equipartition Theorem \Delta U = \frac{f}{2}Nk_{B}\Delta T. For an isothermal process \Delta T = 0, so \Delta U = Q + W = 0 and therefore Q = -W. The work done on the gas during the expansion is W = Nk_{B}T\ln\frac{V_{i}}{V_{f}}, and since a property of natural logarithms lets us flip the argument by negating it, -\ln\frac{V_{i}}{V_{f}} = \ln\frac{V_{f}}{V_{i}}, we have Q = Nk_{B}T\ln\frac{V_{f}}{V_{i}} = T\Delta S. The resulting general formula is \Delta S = \frac{Q}{T}. But remember: this is only true for an isothermal process.
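The isothermal result \Delta S = Nk_{B}\ln\frac{V_{f}}{V_{i}} is easy to evaluate numerically. A minimal sketch in Python (the mole of gas and the doubling of volume are illustrative choices, not values from the text):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def delta_S_isothermal(N, V_i, V_f):
    """Entropy change Delta S = N k_B ln(V_f / V_i) for an isothermal ideal gas."""
    return N * k_B * math.log(V_f / V_i)

# One mole of gas doubling its volume gains N k_B ln(2), about 5.76 J/K;
# compressing instead (V_f < V_i) would give a negative Delta S for the gas.
dS = delta_S_isothermal(6.022e23, 1.0, 2.0)
```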

What happens when temperature is not held constant? By the definition of heat capacity, Q = mC_{V}\Delta T. Since dS = \frac{dQ}{T}, it follows that \Delta S = \int \frac{mC_{V}}{T}\,dT = mC_{V}\ln\frac{T_{f}}{T_{i}} (for constant C_{V}).

Example

Take 1 g of water at 293 K and another 1 g of water at 323 K and mix them together. In thermal equilibrium, the 2 g of water are at temperature 308 K. What are the entropy changes involved?

Begin with \Delta S = mC_{V}\int \frac{dT}{T} = mC_{V}\ln\frac{T_{f}}{T_{i}}. For the cooler system, \Delta S_{cooler} = (1\,g)(1\frac{cal}{g \cdot K})\ln\frac{308}{293} = 0.0499 \frac{cal}{K}. For the warmer system, \Delta S_{warmer} = (1\,g)(1\frac{cal}{g \cdot K})\ln\frac{308}{323} = -0.0476 \frac{cal}{K}. Now, to determine how the total entropy of the system behaves, add the entropy changes of the cooler and warmer systems together: \Delta S_{total} = \Delta S_{cooler} + \Delta S_{warmer} = 0.0499 - 0.0476 = 0.0023 \frac{cal}{K} \approx 0.0096 \frac{J}{K}. The total entropy increases, as the second law requires.
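This mixing calculation can be reproduced in a few lines; a short sketch (the function name is ours):

```python
import math

def entropy_change(m, c, T_i, T_f):
    """Delta S = m * c * ln(T_f / T_i) for mass m with specific heat c."""
    return m * c * math.log(T_f / T_i)

c_water = 1.0  # cal / (g K)
dS_cool = entropy_change(1.0, c_water, 293.0, 308.0)  # about +0.0499 cal/K
dS_warm = entropy_change(1.0, c_water, 323.0, 308.0)  # about -0.0476 cal/K
dS_total = dS_cool + dS_warm  # positive, as the second law demands
```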

Two State Systems

Two state systems are systems that can only exist in one of two states at a time. Examples of this are monomers that can be linked into polymers either in a straight line or with a 180° turn, the spin up or spin down state of electrons, or a coin flip. Let's consider the coin flip situation for a moment. The following chart displays the relationship between the number of coins flipped, the macrostates, the probability, the microstates, and the multiplicity. The macrostate is the total number of heads (or tails) that turn up for a given system. A microstate is any particular ordering of heads and tails that produces a given macrostate. Since each coin has two equally likely faces, the probability of any particular microstate is \left(\frac{1}{2}\right)^{N} for N coins; more generally, it is the reciprocal of the number of states per particle raised to the power of the number of particles. The multiplicity is the number of microstates for a given macrostate. This will make more sense when looking at the table below.

# Coins  Macrostate (# heads & # tails)  Probability of Microstate  Microstates      \Omega = Multiplicity
0        0H 0T                           1                          (none flipped)   1
1        1H 0T                           1/2                        H                1
         0H 1T                           1/2                        T                1
2        2H 0T                           1/4                        HH               1
         1H 1T                           1/4                        HT, TH           2
         0H 2T                           1/4                        TT               1
3        3H 0T                           1/8                        HHH              1
         2H 1T                           1/8                        HHT, HTH, THH    3
         1H 2T                           1/8                        TTH, THT, HTT    3
         0H 3T                           1/8                        TTT              1

You might be able to see a pattern forming in the \Omega column: it follows Pascal's Triangle.

              1
            1   1
          1   2   1
        1   3   3   1
      1   4   6   4   1
   1   5   10   10   5   1

etc.

The total probability for each macrostate is the probability of a single microstate multiplied by the multiplicity. The probability of a single microstate depends only on the number of coins flipped:

# Coins  Probability of each microstate
0        1
1        1/2
2        1/4
3        1/8

For example, with two coins the macrostate 1H 1T has total probability 2 \times \frac{1}{4} = \frac{1}{2}.

We will now move to more general systems. Say you have N coins. How many ways can you get n heads? The choose function (the binomial coefficient) gives this number:

\Omega = {N \choose n} = \frac{N!}{n!(N-n)!}

and S = k_B ln(\Omega)

Taking the logarithm converts a quantity that multiplies (multiplicity) into one that adds, just as we know entropy is additive. For a mole of coin flips (N \approx 6 \times 10^{23}), the multiplicity is essentially a single sharp spike at n = N/2, i.e. heads occurring about 3 \times 10^{23} times.
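The sharp peak at n = N/2 can already be seen for modest N; a small sketch (N = 100 is our arbitrary choice):

```python
import math

def multiplicity(N, n):
    """Number of ways to get n heads in N coin flips: the binomial coefficient."""
    return math.comb(N, n)

N = 100
omegas = [multiplicity(N, n) for n in range(N + 1)]
peak = omegas.index(max(omegas))       # the macrostate with the most microstates
entropy_peak = math.log(omegas[peak])  # S / k_B for that macrostate
# The peak sits at n = N/2 and sharpens rapidly as N grows toward a mole.
```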

The Einstein Ideal Solid

Einstein came up with a simple way to model "ideal solids". Imagine that the particles in a solid are connected to one another by springs. If a spring is compressed or stretched, the energy stored in it is that of a simple harmonic oscillator: U = \frac{1}{2}kx^{2}. Imagine that the particles are arranged in a cubic lattice. Then the total energy in three dimensions is kinetic energy plus potential energy: U = \frac{1}{2}kx^{2} + \frac{1}{2}ky^{2} + \frac{1}{2}kz^{2} + \frac{1}{2}mv_{x}^{2} + \frac{1}{2}mv_{y}^{2} + \frac{1}{2}mv_{z}^{2}. By the Equipartition Principle, U = \frac{f}{2}Nk_{B}T = \frac{6}{2}Nk_{B}T = 3Nk_{B}T. From Quantum Mechanics (just take this one on faith), each oscillator has energy U = (n + \frac{1}{2})hf, where h is Planck's constant, f \approx 10^{13} Hz, and n is a nonnegative integer; essentially, the units of energy that can be stored are not continuous but quantized.

[Figure: Einstein's ideal solid]

Now, let's look at the multiplicity of the Einstein solid. How many atoms are in the solid? N. How many units of energy does the solid have? n. How many ways can the energy be arranged? \Omega, the multiplicity, tells us how many different ways the energy can be shared among the different particles. However, calculating the multiplicity of this system is more difficult than for 2-state systems: instead of having only 2 states per particle (heads and tails), each particle can now hold any number of units of energy.

Example

Let's look at a system of N = 3 particles holding up to n = 3 total energy packets.

Example: N = 3

n  Microstates (atom 1, atom 2, atom 3)                                                        \Omega
0  (0,0,0)                                                                                     1
1  (1,0,0), (0,1,0), (0,0,1)                                                                   3
2  (2,0,0), (0,2,0), (0,0,2), (1,1,0), (1,0,1), (0,1,1)                                        6
3  (3,0,0), (0,3,0), (0,0,3), (2,1,0), (2,0,1), (1,2,0), (1,0,2), (0,2,1), (0,1,2), (1,1,1)    10

As you can see, these calculations are much more involved than for the 2-state system of coin flips. There is, however, a formula for the multiplicity, a close relative of the binomial coefficient, that counts how many ways n identical units of energy can be distributed among N particles: \Omega = \frac{(N + n - 1)!}{n! \, (N - 1)!}. This formula generalizes the two-state counting to any number of energy units per particle.
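The counting formula can be checked by brute force against the N = 3 table above; a short sketch:

```python
import math
from itertools import product

def omega(N, n):
    """Ways to distribute n identical energy units among N atoms."""
    return math.factorial(N + n - 1) // (math.factorial(n) * math.factorial(N - 1))

# Enumerate every assignment of energy units to 3 atoms and compare with the formula.
for n in range(4):
    count = sum(1 for state in product(range(n + 1), repeat=3) if sum(state) == n)
    assert count == omega(3, n)  # 1, 3, 6, 10 as in the table
```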

Example

Let's look at the case where we have 10 particles (N = 10) with 4 units of energy (n = 4). The multiplicity formula gives \Omega = \frac{(10 + 4 - 1)!}{(4!)(10 - 1)!} = \frac{13!}{(4!)(9!)} = 715.

How many ways can we give the first atom (a_{1}) 0 units of energy? How the rest of the energy is divided up doesn't matter. We then have N = 9 (9 atoms left over) and n = 4 (4 units of energy to split among themselves). So \Omega_{a_{1} = 0} = \frac{(9 + 4 - 1)!}{(4!)(9 - 1)!} = \frac{12!}{(4!)(8!)} = 495.

Now, let the first atom have 1 unit of energy. Then, \Omega_{a_{1} = 1} = \frac{(9 + 3 - 1)!}{(3!)(9 - 1)!} = \frac{11!}{(3!)(8!)} = 165.

A similar argument shows that \Omega_{a_{1} = 2} = 45, \Omega_{a_{1} = 3} = 9, and \Omega_{a_{1} = 4} = 1. If we plot the multiplicity versus the number of energy packets held by the first atom, the graph has the shape of the Boltzmann Distribution for the system. A Boltzmann Distribution says that when a system's energy is constantly being passed around, the most likely measurement for any one atom in this particular system is an energy of 0: because the energy is shared among many atoms, most of the time a given atom holds none of it.

[Figure: Dispersion of energy]

Let's now find the probability of each state:

P_{a_{1} = 0} = \frac{\Omega_{a_{1} = 0}}{\Omega_{total}} = \frac{495}{715}

P_{a_{1} = 1} = \frac{\Omega_{a_{1} = 1}}{\Omega_{total}} = \frac{165}{715}

P_{a_{1} = 2} = \frac{\Omega_{a_{1} = 2}}{\Omega_{total}} = \frac{45}{715}

P_{a_{1} = 3} = \frac{\Omega_{a_{1} = 3}}{\Omega_{total}} = \frac{9}{715}

P_{a_{1} = 4} = \frac{\Omega_{a_{1} = 4}}{\Omega_{total}} = \frac{1}{715}

As a sanity check, the sum of the probabilities should equal 1, and it does.

Let's calculate the average energy of each particle. You may think of this as a simple calculation: \frac{4 \text{ units of energy}}{10 \text{ atoms}} = 0.4. However, let's look at the average energy from a more statistical approach. If we take the probabilities calculated above, multiply each by the corresponding energy of the first atom, and add them together, we get P_{a_{1} = 0} \cdot 0 + P_{a_{1} = 1} \cdot 1 + P_{a_{1} = 2} \cdot 2 + P_{a_{1} = 3} \cdot 3 + P_{a_{1} = 4} \cdot 4 = 0.4, the same answer as the previous calculation.

Thus, the average energy is 0.4, but the most probable energy is zero! This means that if you randomly choose a particle and measure its energy, it is most likely to have an energy of zero. However, if you add up all of the energies and divide by the total number of particles, you get an average energy of 0.4.
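The conditional multiplicities, the probabilities, and the 0.4 average above can all be verified in a few lines; a sketch:

```python
import math

def omega(N, n):
    """Einstein-solid multiplicity: (N + n - 1) choose n."""
    return math.comb(N + n - 1, n)

N, n_total = 10, 4
total = omega(N, n_total)  # 715
# If atom 1 holds k units, the remaining 9 atoms share the other n_total - k units.
conditional = [omega(N - 1, n_total - k) for k in range(n_total + 1)]
probs = [w / total for w in conditional]
avg = sum(k * p for k, p in enumerate(probs))  # statistical average energy of atom 1
```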

A 2-Body System

Imagine having 2 solids (A and B) that are right next to each other and can freely exchange energy. Both A and B have the same number of particles: 10. A is hotter with 4 units of energy and B is cooler with only 2 units of energy.


Intuition tells us that A will cool and B will heat up until each has 3 units of energy. But why does that work? Entropy will show us the answer. The next table shows the multiplicity for each number n_{A} of energy packets in A. To find the total multiplicity, we multiply the multiplicities of A and B together.

n_{A}  \Omega_{A}(n_{A})  \Omega_{B}(6 - n_{A})  \Omega_{total} = \Omega_{A} \Omega_{B}
0      1                  5005                   5005
1      10                 2002                   20020
2      55                 715                    39325
3      220                220                    48400
4      715                55                     39325
5      2002               10                     20020
6      5005               1                      5005

As you can see, the highest multiplicity is when the energy packets are evenly divided between A and B. Thus, we're most likely to see both groups sharing energy equally.

Now, let's look at another 2-body system in which the numbers of particles are not equal. This one has a less intuitive answer than the previous. Let A have 10 particles with 4 packets of energy and B have 6 particles with 4 packets of energy, so n = 8 packets in total.

n_{A}  \Omega_{A}(n_{A})  \Omega_{B}(8 - n_{A})  \Omega_{total} = \Omega_{A} \Omega_{B}
0      1                  1287                   1287
1      10                 792                    7920
2      55                 462                    25410
3      220                252                    55440
4      715                126                    90090
5      2002               56                     112112
6      5005               21                     105105
7      11440              6                      68640
8      24310              1                      24310

So, the one we'll most likely observe is the state where A has 5 packets of energy and B has 3.
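Both tables can be generated mechanically; a sketch that reproduces the unequal-solids case (the function names are ours):

```python
import math

def omega(N, n):
    """Einstein-solid multiplicity: (N + n - 1) choose n."""
    return math.comb(N + n - 1, n)

def joint_multiplicities(N_A, N_B, n_total):
    """Total multiplicity for every split of n_total packets between solids A and B."""
    return {n_A: omega(N_A, n_A) * omega(N_B, n_total - n_A)
            for n_A in range(n_total + 1)}

table = joint_multiplicities(10, 6, 8)
most_likely = max(table, key=table.get)  # n_A = 5: A holds 5 packets, B holds 3
```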

Energy, Entropy, and Temperature

Recall from the Equipartition Theorem that U = \frac{f}{2} Nk_{B}T, and let f = 6. So U = 3Nk_{B}T, where 3k_{B} is a constant. Then, T \propto \frac{U}{N}. In the previous example, U corresponds to the number of energy packets n_{A/B} and N to the number of particles N_{A/B}, so T \propto \frac{n}{N}. So initially,

T_{A} \propto \frac{n_{A}}{N_{A}} = \frac{4}{10} = 0.4

T_{B} \propto \frac{n_{B}}{N_{B}} = \frac{4}{6} = 0.67

And the final conditions are

T_{A} \propto \frac{n_{A}}{N_{A}} = \frac{5}{10} = 0.5

T_{B} \propto \frac{n_{B}}{N_{B}} = \frac{3}{6} = 0.5

In the final conditions, the temperatures are equal! Thus, the system is at equilibrium not only when the entropy has maximized but also when the temperatures are equal. Now knowing S = k_{B}ln(\Omega), let's determine \frac{S}{k_{B}} = ln(\Omega). From this relationship:

n_{A}  \frac{S_{A}}{k_{B}}  \frac{S_{B}}{k_{B}}  \frac{S_{Total}}{k_{B}}
0      0                    7.16                 7.16
1      2.30                 6.67                 8.97
2      4.01                 6.14                 10.15
3      5.39                 5.53                 10.93
4      6.57                 4.84                 11.41
5      7.60                 4.03                 11.63
6      8.52                 3.04                 11.56
7      9.34                 1.79                 11.14
8      10.10                0                    10.10

Now, let's plot these entropy values versus the number of energy packets in A, n_{A}.


The slope of the entropy from A and the slope of the entropy from B are equal and opposite exactly when entropy is at its maximum. Or written more succinctly,  \frac{dS_A}{dU_A} = - \frac{dS_B}{dU_A}. Since the total change in entropy when energy changes a little is \frac{dS}{dU_A} = \frac{dS_A}{dU_A} + \frac{dS_B}{dU_A}, then this implies that \frac{dS}{dU_A} = 0 at equilibrium.

Is it true that \frac{dS}{dU_A} = T? No. If you do some dimensional analysis, you find that \frac{dS}{dU_A} has units of \frac{1}{Kelvin} (entropy has units of energy per temperature), so actually \frac{dS}{dU_A} = \frac{1}{T}, or dU = TdS when N and V are held constant (i.e. no work is done on the system).
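The slope argument can be checked with discrete differences of S/k_{B}; a sketch using the same two solids as above:

```python
import math

def omega(N, n):
    """Einstein-solid multiplicity: (N + n - 1) choose n."""
    return math.comb(N + n - 1, n)

N_A, N_B, n_total = 10, 6, 8
S = [math.log(omega(N_A, n) * omega(N_B, n_total - n)) for n in range(n_total + 1)]

# Discrete slope of the total entropy with respect to the energy held by A.
slopes = [S[i + 1] - S[i] for i in range(n_total)]
# The slope changes sign between n_A = 5 and n_A = 6, so the total entropy peaks
# at n_A = 5, where dS/dU_A passes through zero: the equilibrium split.
```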

Next Page: Stirling's Approximation | Previous Page: Charles' Law

Home: Biological Physics