Structural Biochemistry/Second law

Overview

The first law of thermodynamics states that energy is conserved; however, it only describes the transformations observed and imposes no restriction on the direction of a process. Nevertheless, such a restriction has been observed and shown to exist in all thermodynamic applications. The need for a law describing this phenomenon gives rise to the second law of thermodynamics.

The Second Law of Thermodynamics states that the entropy of an isolated system increases over time. [1]
It is often said jokingly that the first law states that one cannot win and that the second law states that one cannot even break even.

The second law of thermodynamics may be expressed in two related statements as follows:

Statement 1: It is impossible to operate a system in such a way that heat absorbed by the system is completely converted into work done by the system.

Statement 2: It is impossible for a process to consist solely in the transfer of heat from one temperature level to a higher one.

Statement 1 does not contradict the first law of thermodynamics. Nor does it imply that heat cannot be converted into work done by the system; it implies only that either the system or the surroundings must be changed or modified when such a conversion takes place. As a corollary, any continuous production of work from heat alone is impossible. Consider a gas expanding isothermally and doing work on the surroundings: to compress the gas back to its initial state, energy must be drawn from the surroundings in the form of work, and heat is transferred to the surroundings at the same time to maintain constant temperature. The work gained from the expansion is exactly what this reverse process requires, so no net work is produced.
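The bookkeeping behind this argument can be sketched for an ideal gas, where the reversible isothermal work done by the gas is w = nRT ln(V2/V1). This is a minimal sketch; the 1 mol / 298 K / volume-doubling numbers are illustrative assumptions, not values from the text:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def isothermal_work(n_mol, temp_K, v_initial, v_final):
    """Reversible isothermal work done BY an ideal gas: w = nRT ln(V2/V1)."""
    return n_mol * R * temp_K * math.log(v_final / v_initial)

# 1 mol at 298 K doubling its volume, then compressed back to the start:
w_out = isothermal_work(1.0, 298.0, 1.0, 2.0)  # work produced by expansion
w_in = isothermal_work(1.0, 298.0, 2.0, 1.0)   # work consumed by compression
print(round(w_out + w_in, 10))  # 0.0 -- no net work over the full cycle
```

The expansion yields about 1717 J of work, but restoring the initial state consumes exactly the same amount, which is the point of the corollary above.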

 
Heat Conversion to Work

Carnot's theorem follows from Statement 2: no engine can have a higher thermal efficiency than a Carnot engine operating between the same two temperature levels. Because a Carnot engine is reversible, it can also be run backward, transferring heat from one temperature level to a higher one. Although such an engine does not exist in the real world, it sets the maximum efficiency allowed by the laws of thermodynamics.

The study of heat engines, devices that convert heat to work in a cyclic fashion, is the classical approach to the second law. This macroscopic viewpoint requires no knowledge of the structure or behavior of molecules. Every heat engine carries out the following steps in each cycle: absorption of heat into the system at a relatively high temperature, production of work, and rejection of heat to the surroundings at a relatively low temperature. The two temperature levels are often referred to as heat reservoirs: the higher temperature level is the hot reservoir, and the lower is the cold reservoir. In thermodynamic applications, the working fluid, a liquid or a gas, connects the two reservoirs in the sense that it absorbs heat from the hot reservoir, produces work, discards heat to the cold reservoir, and returns to its initial state, ready for a new cycle.
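The cycle described above can be expressed as simple bookkeeping: over one cycle the fluid returns to its initial state, so the work produced is the difference between the heat absorbed and the heat rejected, and Carnot's theorem caps the efficiency at 1 − Tc/Th. A minimal sketch, with assumed heat and temperature values:

```python
def thermal_efficiency(q_hot, q_cold):
    """Fraction of absorbed heat converted to work in one cycle.
    The working fluid returns to its initial state, so W = Q_hot - Q_cold."""
    return (q_hot - q_cold) / q_hot

def carnot_efficiency(t_hot, t_cold):
    """Upper bound from Carnot's theorem; temperatures in kelvin."""
    return 1.0 - t_cold / t_hot

# An engine absorbing 1000 J at 500 K and rejecting 700 J at 300 K (assumed):
eta = thermal_efficiency(1000.0, 700.0)    # 0.30
eta_max = carnot_efficiency(500.0, 300.0)  # 0.40 -- no engine between these
                                           # reservoirs can do better
```

Any real engine between these reservoirs falls somewhere below the Carnot bound, consistent with Statement 2.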

 
Reversible Carnot Cycle

Spontaneous Processes and Entropy

Some processes proceed spontaneously in one direction but not in the other. A gas, for example, diffuses to fill its container but never spontaneously collects at one end. Such processes are said to be spontaneous; in other words, they occur without outside intervention.

Entropy, denoted by the symbol S, is the thermodynamic property that describes the spontaneity of a process. It is a macroscopic property of randomness or disorder, and is also a function that describes the number of arrangements (positions and/or energy levels) that are available to a system existing in a given state.

The Second Law of Thermodynamics can be illustrated with a simple example. Consider throwing a deck of ordered playing cards into the air and picking them all up at random. It is extremely improbable that the cards will be picked up in their original order; the probability is so minute that we never observe it. Entropy is closely associated with probability: the more ways a particular state can be achieved, the greater the likelihood (probability) of finding that state. This does not mean it is impossible for the cards to return to their original order, or for the gas to collect at one end of the container. It is merely improbable.
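The card example can be made quantitative: the number of orderings of a 52-card deck is 52!, and only one of them is the original. A small sketch (Python's arbitrary-precision integers handle 52! directly):

```python
import math

# Number of distinct orderings of a 52-card deck
arrangements = math.factorial(52)

# Probability of recovering the one original order at random
probability = 1 / arrangements

print(f"{arrangements:.3e}")  # ~8.066e+67 orderings
print(f"{probability:.3e}")   # ~1.240e-68
```

With roughly 10^68 equally likely orderings, the ordered state is never observed in practice, even though it is not forbidden.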

There is a natural tendency toward disorder in isolated systems because the state with the highest probability of existing is the state the system occupies at equilibrium. For a container of gas, this equilibrium state is the disordered state: the gas molecules occupy the largest volume available and are dispersed evenly throughout it.[2]


The change in entropy of the universe can be represented as

ΔSuniv=ΔSsys+ΔSsurr

where ΔSsys and ΔSsurr represent the changes in entropy that occur in the system and the surroundings, respectively.

If ΔSuniv is positive, the entropy of the universe increases and the process is spontaneous in the direction written. A negative ΔSuniv indicates that the process is spontaneous in the opposite direction. If ΔSuniv is zero, the system is at equilibrium and the process has no tendency to occur.
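The three cases above can be sketched as a simple sign check on ΔSuniv (the numeric values in the example are assumptions for illustration):

```python
def spontaneity(ds_sys, ds_surr):
    """Classify a process by the sign of dS_univ = dS_sys + dS_surr (J/K)."""
    ds_univ = ds_sys + ds_surr
    if ds_univ > 0:
        return "spontaneous as written"
    if ds_univ < 0:
        return "spontaneous in the opposite direction"
    return "at equilibrium"

# Assumed values: the system loses entropy, but the surroundings gain more,
# so the overall process is still spontaneous.
print(spontaneity(-22.0, 80.0))  # spontaneous as written
```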

In some cases, particularly in biological systems, it is hard to see how the entropy of the universe is increasing. For example, when a leaf uses carbon dioxide and nutrients to produce cellulose, the randomness, and consequently the entropy, of the system decreases. This does not contradict the second law of thermodynamics, however, because the process is accompanied by the release of heat to the environment, which increases the entropy of the surroundings. [3]

Phase Changes and Entropy[4]


Entropy is related to the freedom of particle motion. An increase in the entropy of a system therefore accompanies a phase change from solid to liquid, and from liquid to gas. In the solid state, the particles' movements are restricted and they have little freedom to move within a fixed arrangement. In the liquid state, particles have more freedom to move around, and in the gas state they have far greater freedom still. Consequently, entropy increases as a substance goes from solid to liquid to gas.

When such disorder occurs, the energy of motion becomes more dispersed. For instance, when a salt dissolves in liquid water, more ions and solvent molecules interact with one another, so the energy of motion in the solution is more dispersed. The greater the freedom the particles have, the more dispersed their energy of motion.

                                solid ---> liquid ----> gas
less freedom for particles interaction --------> much greater freedom for particles interaction
                fixed energy of motion --------> dispersed energy of motion

Therefore, the change in phase states and the freedom of motion of particles can help to determine whether a reaction is spontaneous or not.
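The entropy jump at a phase change can be estimated as ΔS = ΔH/T, since the heat is absorbed reversibly at the constant transition temperature. A sketch using commonly tabulated approximate values for water (the exact figures vary slightly between sources):

```python
def phase_change_entropy(dh_joules_per_mol, temp_K):
    """Entropy change for a phase transition at its equilibrium temperature:
    dS = dH / T, because the heat is absorbed reversibly at constant T."""
    return dh_joules_per_mol / temp_K

# Approximate textbook values for water (assumed here):
ds_fusion = phase_change_entropy(6010.0, 273.15)    # melting, ~22 J/(mol*K)
ds_vapor = phase_change_entropy(40700.0, 373.15)    # boiling, ~109 J/(mol*K)
```

The much larger entropy of vaporization reflects the far greater freedom of motion gained on going from liquid to gas, matching the solid → liquid → gas trend above.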

The Number of Microstates and Entropy[5]

As stated in the previous section, freedom of motion and the dispersal of the energy of motion are the two factors that determine the direction of a spontaneous reaction. Silberberg defines microstates as "the quantized states of the whole system of gas molecules". A microstate describes the state of the gas molecules as they interact with one another in the system; as the molecules vibrate and rotate around one another, the energy of motion increases. Different conditions in the system correspond to different sets of microstates. In thermodynamic terms, microstates are related to entropy, the measure of disorder, because the number of microstates is the number of ways the thermal energy can be dispersed in the system. The equation is

                                       S = k ln W

     S = entropy
     W = the number of microstates
     k = Boltzmann constant = R (gas constant) / Avogadro's number
       = 1.38 × 10⁻²³ J/K

Therefore, entropy depends on the number of microstates.

           small number of microstates -----> much greater number of microstates
                           low entropy -----> high entropy
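The relationship S = k ln W can be sketched directly; because S grows with the logarithm of W, multiplying the number of microstates by a fixed factor always adds the same amount of entropy. The microstate counts below are illustrative assumptions:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_microstates):
    """S = k ln W."""
    return K_B * math.log(n_microstates)

# Doubling the number of microstates adds the same k ln 2 of entropy,
# no matter how large W already is:
delta = boltzmann_entropy(2_000_000) - boltzmann_entropy(1_000_000)
print(math.isclose(delta, K_B * math.log(2)))  # True
```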

The Effect of Temperature on Spontaneity

Entropy changes in the surroundings ΔSsurr are primarily determined by heat flow.

The sign of ΔSsurr depends on the direction of the heat flow. In an exothermic process, the resulting energy flow increases the random motions in the surroundings, increasing their entropy (ΔSsurr is positive). Similarly, the tendency of systems to undergo changes that lower their energy can be explained by the fact that when a system at constant temperature moves to a lower energy state, the energy it gives up is transferred to the surroundings, increasing the entropy there.

The magnitude of ΔSsurr depends on the temperature. At high temperatures, atoms in the surroundings are already in rapid motion, so a given quantity of energy transferred to the surroundings does not produce a large percentage change in their motion. The impact of transferring a given quantity of heat to or from the surroundings is therefore greater at lower temperatures, where the randomness of the surroundings undergoes a larger percentage change. In other words, ΔSsurr depends directly on the quantity of heat transferred and inversely on the temperature.
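This inverse temperature dependence can be sketched numerically: the same heat delivered to cooler surroundings produces a larger entropy change. The 5000 J figure is an assumed value for illustration:

```python
def ds_surroundings(q_into_surr, temp_K):
    """Entropy change of the surroundings when heat q (J) flows into them
    at temperature T (K): dS_surr = q / T."""
    return q_into_surr / temp_K

# The same 5000 J released by an exothermic process (assumed value):
cold = ds_surroundings(5000.0, 300.0)   # ~16.7 J/K in cool surroundings
hot = ds_surroundings(5000.0, 1500.0)   # ~3.3 J/K in hot surroundings
```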

References

  1. Levine, Ira N. (2005). Physical Chemistry (6th ed.). McGraw-Hill. ISBN 0-0-07-049508-4.
  2. Levine, Ira N. (2005). Physical Chemistry (6th ed.). McGraw-Hill. ISBN 0-0-07-049508-4.
  3. Berg, Jeremy M. (2010). Biochemistry (7th ed.). W. H. Freeman and Company. ISBN 0-1-42-922936-5.
  4. Silberberg, Martin S. (2010). Principles of General Chemistry (2nd ed.). McGraw-Hill. ISBN 978-0-07-351108-05.
  5. Silberberg, Martin S. (2010). Principles of General Chemistry (2nd ed.). McGraw-Hill. ISBN 978-0-07-351108-05.

Additional reading: Smith, J.M.; Van Ness, H.C.; Abbott, M.M. (2005). Introduction to Chemical Engineering Thermodynamics. McGraw-Hill. ISBN 978-007-127055-7.