# Second law of thermodynamics

The '''second law of [[thermodynamics]]''' is an expression of the universal principle of [[entropy]]. It states that the entropy of an [[isolated system]] which is not in [[Thermodynamic equilibrium|equilibrium]] will tend to increase over time, approaching a maximum value at equilibrium, and that the entropy change ''dS'' of a system undergoing any infinitesimal reversible process is given by $\delta q/T$, where $\delta q$ is the heat supplied to the system and ''T'' is the absolute temperature of the system. In [[classical thermodynamics]], the second law is taken to be a basic postulate, while in [[statistical thermodynamics]] it is a consequence of applying the [[Statistical_mechanics#Fundamental postulate|fundamental postulate]], also known as the equal ''[[a priori]]'' probability postulate, to the future, while empirically accepting that the past was a low-entropy state, for reasons that are not yet well understood.

The origin of the second law can be traced to the French physicist [[Nicolas Léonard Sadi Carnot|Sadi Carnot]]'s 1824 paper ''[[Reflections on the Motive Power of Fire]]'', which presented the view that [[motive power]] ([[mechanical work|work]]) is due to the flow of [[caloric]] ([[heat]]) from a hot to a cold body through a [[working substance]]. In simple terms, the second law expresses the fact that, over time and ignoring the effects of self-gravity, differences in temperature, pressure, and chemical potential tend to even out in a physical system that is isolated from the outside world. Entropy is a measure of how far this evening-out process has progressed.
There are many versions of the second law, but they all have the same effect, which is to explain the phenomenon of [[irreversibility]] in nature.

==Introduction==

===Versions of the Second Law===
There are many ways of stating the second law of thermodynamics, but all are equivalent in the sense that each form of the second law logically implies every other form.{{cite book |last=Fermi |first=Enrico |authorlink=Enrico Fermi |title=Thermodynamics |year=1956 |origyear=1936 |publisher=Dover Publications, Inc |location=New York |isbn=0-486-60361-X }} Thus, the theorems of thermodynamics can be proved using any form of the second law together with the third law.

The formulation of the second law that refers to entropy directly is as follows:
:''In a [[system]], a process that occurs will tend to increase the total entropy of the universe.''
Thus, while a system can go through some physical process that decreases its own entropy, the entropy of the universe (which includes the system and its surroundings) must increase overall. (An exception to this rule is a reversible or "isentropic" process, such as frictionless adiabatic compression.) Processes that decrease the total entropy of the universe are impossible. If a system is at equilibrium, by definition no spontaneous processes occur, and therefore the system is at maximum entropy.

A second formulation, due to [[Rudolf Clausius]], is the simplest formulation of the second law, the heat formulation or '''Clausius statement''':
:''[[Heat]] generally cannot flow spontaneously from a material at lower temperature to a material at higher temperature.''
Informally, "heat doesn't flow from cold to hot (without work input)", which is evident from ordinary experience. For example, in a refrigerator heat flows from cold to hot, but only when aided by an external agent (the compressor). Note that from the mathematical definition of [[entropy]], a process in which heat flows from cold to hot has decreasing entropy. This can happen in a non-isolated system if entropy is created elsewhere, such that the ''total'' entropy is constant or increasing, as required by the second law. For example, the electrical energy going into a refrigerator is converted to heat and expelled out the back, representing a net increase in entropy.

The exception to this is for statistically unlikely events in which hot particles "steal" the energy of cold particles just enough that the cold side gets colder and the hot side gets hotter, for an instant. Such events have been observed at scales small enough that the likelihood of their happening is significant.G.M. Wang, E.M. Sevick, E. Mittag, D.J. Searles & Denis J. Evans (2002). "Experimental demonstration of violations of the Second Law of Thermodynamics for small systems and short time scales". ''Physical Review Letters'' 89: 050601/1–050601/4. doi:10.1103/PhysRevLett.89.050601 The mathematics involved in such events is described by the [[fluctuation theorem]].

A third formulation of the second law, by [[William Thomson, 1st Baron Kelvin|Lord Kelvin]], is the heat engine formulation, or '''Kelvin statement''':

:''It is impossible for any device that operates on a [[cyclic process|cycle]] to receive heat from a single [[heat reservoir|reservoir]] and produce a net amount of work.''

Formulations of the second law in modern textbooks that introduce entropy from the statistical point of view often contain two parts. The first part states that the entropy of an isolated system cannot decrease or, to be more precise, that the probability of an entropy decrease is exceedingly small. The second part gives the relation between the infinitesimal entropy increase of a system and an infinitesimal amount of absorbed heat in an arbitrary infinitesimal reversible process: $dS = \delta q/T$. These two statements are not combined into a single statement because the first part refers to a general non-equilibrium process, in which temperature is not defined.
==Descriptions==

===Microscopic systems===
Thermodynamics is a theory of macroscopic systems, and therefore the second law applies only to macroscopic systems with well-defined temperatures. For example, in a system of two molecules there is a non-trivial probability that the slower-moving ("cold") molecule transfers energy to the faster-moving ("hot") molecule. Such tiny systems are not part of classical thermodynamics, but they can be investigated by quantum thermodynamics using [[statistical mechanics]]. For any isolated system with a mass of more than a few [[Kilogram#SI multiples|picograms]], the probability of observing a decrease in entropy approaches zero.{{cite book | last = Landau | first = L.D. |coauthors=Lifshitz, E.M.| title = Statistical Physics Part 1 | publisher = Butterworth Heinemann | year = 1996 | isbn = 0-7506-3372-7}}

===Energy dispersal===
The second law is an axiom of thermodynamics concerning heat, entropy, and the direction in which thermodynamic processes can occur. For example, the second law implies that heat does not flow spontaneously from a cold material to a hot material, but it allows heat to flow from a hot material to a cold material. Roughly speaking, the second law says that in an isolated system, concentrated energy disperses over time, and consequently less concentrated energy is available to do useful work.
Energy dispersal also means that differences in temperature, pressure, and density even out. Again roughly speaking, thermodynamic [[entropy]] is a measure of energy dispersal, and so the second law is closely connected with the concept of entropy.

== Overview ==
In a general sense, the second law says that temperature differences between systems in contact with each other tend to equalize and that [[thermodynamic work|work]] can be obtained from these non-equilibrium differences, but that loss of heat occurs, in the form of entropy, when work is done.{{cite book|author=Mendoza, E. |title=Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius|location=New York | publisher=Dover Publications |year=1988|isbn=0-486-44641-7}} Pressure differences, density differences, and particularly temperature differences all tend to equalize if given the opportunity. This means that an [[isolated system]] will eventually have a uniform temperature. A [[heat engine]] is a mechanical device that provides useful work from the difference in temperature of two bodies:

[[Image:800px-Carnot heat engine 2.svg.png|thumb|center|Heat engine diagram]]

During the 19th century, the second law was synthesized, essentially, by studying the dynamics of the [[Carnot heat engine]] in coordination with [[James Prescott Joule|James Joule]]'s [[mechanical equivalent of heat]] experiments. Since any thermodynamic engine requires such a temperature difference, it follows that useful work cannot be derived from an isolated system in equilibrium; there must always be an external energy source and a cold sink. By definition, [[perpetual motion machines of the second kind]] would have to violate the second law to function.

== History ==
{{See also|History of entropy}}

The first theory of the conversion of heat into mechanical work is due to [[Nicolas Léonard Sadi Carnot]] in 1824. He was the first to realize correctly that the efficiency of this conversion depends on the difference of temperature between an engine and its environment.

Recognizing the significance of [[James Prescott Joule]]'s work on the conservation of energy, [[Rudolf Clausius]] was the first to formulate the second law, in 1850, in this form: heat does not flow ''spontaneously'' from cold to hot bodies. While common knowledge now, this was contrary to the [[caloric theory]] of heat popular at the time, which considered heat as a fluid. From there he was able to infer the principle of Sadi Carnot and the definition of entropy (1865).

Established during the 19th century, the [[William Thomson, 1st Baron Kelvin|Kelvin]]–[[Planck]] statement of the second law says, "It is impossible for any device that operates on a [[cyclic process|cycle]] to receive heat from a single [[heat reservoir|reservoir]] and produce a net amount of work." This was shown to be equivalent to the statement of Clausius.

The [[ergodic hypothesis]] is also important for the [[Boltzmann]] approach. It says that, over long periods of time, the time spent in some region of the phase space of microstates with the same energy is proportional to the volume of this region, i.e. that all accessible microstates are equally probable over long periods of time. Equivalently, it says that the time average and the average over the statistical ensemble are the same.

It has been shown that not only classical systems but also [[quantum mechanics|quantum mechanical]] ones tend to maximize their entropy over time. Thus the second law follows, given initial conditions with low entropy. More precisely, it has been shown that the local [[von Neumann entropy]] is at its maximum value with extremely high probability.{{Citation | last1 = Gemmer | first1 = Jochen | last2 = Otte | first2 = Alexander | last3 = Mahler | first3 = Günter | title = Quantum Approach to a Derivation of the Second Law of Thermodynamics | journal = Phys. Rev. Lett. | volume = 86 | issue = 10 | pages = 1927–1930 | year = 2001 | url = http://prola.aps.org/abstract/PRL/v86/i10/p1927_1 | doi = 10.1103/PhysRevLett.86.1927}} The result is valid for a large class of isolated quantum systems (e.g. a gas in a container). While the full system is pure and therefore does not have any entropy, the [[entanglement]] between gas and container gives rise to an increase of the local entropy of the gas. This result is one of the most important achievements of [[quantum thermodynamics]].

Today, much effort in the field is directed at understanding why the initial conditions early in the universe were those of low entropy,[http://arxiv.org/abs/gr-qc/0505037 ''Does Inflation Provide Natural Initial Conditions for the Universe?''], Carroll SM, Chen J, Gen. Rel. Grav. 37 (2005) 1671–1674; Int. J. Mod. Phys. D14 (2005) 2335–2340, arXiv:gr-qc/0505037v1[http://www.sciencedirect.com/science?_ob=ArticleURL&_udi=B6VH6-4JYKMV7-1&_user=48161&_rdoc=1&_fmt=&_orig=search&_sort=d&view=c&_acct=C000005078&_version=1&_urlVersion=0&_userid=48161&md5=f102712b7b3f3977f4caa8c510d97e4e ''The arrow of time and the initial conditions of the universe''], Wald RM, Studies In History and Philosophy of Science Part B, Volume 37, Issue 3, September 2006, Pages 394–398, as this is seen as the origin of the second law (see below).
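The heat-engine reasoning above can be made concrete with a short numeric sketch. All numbers below (reservoir temperatures, heat drawn) are arbitrary example values, not taken from this article; the sketch checks that a reversible engine between two reservoirs attains the Carnot bound and that the Clausius sum of ''δQ''/''T'' over its cycle vanishes.

```python
# Illustrative numbers only (not from the text): a reversible engine
# operating between a hot and a cold reservoir.

def carnot_efficiency(t_hot, t_cold):
    """Upper bound on the fraction of absorbed heat convertible to work."""
    return 1.0 - t_cold / t_hot

t_hot, t_cold = 500.0, 300.0   # reservoir temperatures in kelvin (examples)
q_hot = 1000.0                 # heat drawn from the hot reservoir, in joules

# For a reversible cycle the heats are in the ratio of the temperatures,
# so the Clausius sum of delta-Q/T over the cycle vanishes.
q_cold = q_hot * t_cold / t_hot        # heat rejected to the cold reservoir
work = q_hot - q_cold                  # net work output: 400 J here
clausius_sum = q_hot / t_hot - q_cold / t_cold

print(work / q_hot)                      # realized efficiency: 0.4
print(carnot_efficiency(t_hot, t_cold))  # the Carnot bound: also 0.4
print(clausius_sum)                      # 0.0 for the reversible cycle
```

Rejecting more heat than the temperature ratio requires (an irreversible cycle) lowers the efficiency below the bound and makes the Clausius sum negative.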
=== Informal descriptions ===
The second law can be stated in various succinct ways, including:
* It is impossible to produce work in the surroundings using a cyclic process connected to a single heat reservoir ([[William Thomson, 1st Baron Kelvin|Kelvin]], 1851).
* It is impossible to carry out a cyclic process using an engine connected to two heat reservoirs that will have as its only effect the transfer of a quantity of heat from the low-temperature reservoir to the high-temperature reservoir ([[Rudolf Clausius|Clausius]], 1854).
* If thermodynamic [[work (thermodynamics)|work]] is to be done at a finite rate, [[thermodynamic free energy|free energy]] must be expended.Stoner, C.D. (2000). [http://arxiv.org/pdf/physics/0004055 Inquiries into the Nature of Free Energy and Entropy] – in Biochemical Thermodynamics. ''Entropy, Vol 2''.

=== Mathematical descriptions ===
In 1856, the German physicist [[Rudolf Clausius]] stated what he called the "second fundamental theorem in the [[mechanical theory of heat]]" in the following form:Clausius, R. (1865). ''The Mechanical Theory of Heat – with its Applications to the Steam Engine and to Physical Properties of Bodies.'' London: John van Voorst, 1 Paternoster Row. MDCCCLXVII.

:$\int \frac{\delta Q}{T} = -N$

where ''Q'' is heat, ''T'' is temperature and ''N'' is the "equivalence-value" of all uncompensated transformations involved in a cyclical process. Later, in 1865, Clausius would come to define "equivalence-value" as entropy. On the heels of this definition, that same year, the most famous version of the second law was read in a presentation at the Philosophical Society of Zurich on April 24, at the end of which Clausius concludes:
:''The entropy of the universe tends to a maximum.''
This statement is the best-known phrasing of the second law. Moreover, owing to the broad terminology used here (e.g. [[universe]]) and the lack of specific conditions (e.g. open, closed, or isolated) under which it applies, many people take this simple statement to mean that the second law of thermodynamics applies to virtually every subject imaginable. This is not true; the statement is only a simplified version of a more complex description.

In terms of time variation, the mathematical statement of the second law for an [[isolated system]] undergoing an arbitrary transformation is:

:$\frac{dS}{dt} \ge 0$

where
:''S'' is the [[entropy]] of the system and
:''t'' is [[time]].

Statistical mechanics gives an explanation for the second law by postulating that a material is composed of atoms and molecules which are in constant motion. A particular set of positions and velocities for each particle in the system is called a [[microstate (statistical mechanics)|microstate]] of the system, and because of the constant motion, the system is constantly changing its microstate. Statistical mechanics postulates that, in equilibrium, each microstate that the system might be in is equally likely to occur; this assumption leads directly to the conclusion that the second law must hold in a statistical sense. That is, the second law will hold on average, with a statistical variation on the order of 1/√''N'', where ''N'' is the number of particles in the system. For everyday (macroscopic) situations, the probability that the second law will be violated is practically zero. However, for systems with a small number of particles, thermodynamic parameters, including the entropy, may show significant statistical deviations from the predictions of the second law. Classical thermodynamic theory does not deal with these statistical variations.
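The 1/√''N'' scaling just mentioned can be illustrated with a small Monte Carlo sketch. The model is an assumed toy (each of ''N'' non-interacting particles is found in the left half of a box with probability 1/2); the particle numbers and sample counts are arbitrary choices for illustration.

```python
# Toy model (illustrative, not from the text): relative fluctuations of a
# macroscopic variable (here, the number of particles in the left half of
# a box) shrink like 1/sqrt(N) as the particle number N grows.
import random

random.seed(0)

def relative_fluctuation(n_particles, n_samples=1000):
    """Standard-deviation-to-mean ratio of the left-half occupancy."""
    counts = []
    for _ in range(n_samples):
        left = sum(random.random() < 0.5 for _ in range(n_particles))
        counts.append(left)
    mean = sum(counts) / n_samples
    var = sum((c - mean) ** 2 for c in counts) / n_samples
    return var ** 0.5 / mean

# Expected ratios are about 1/sqrt(100) = 0.1 and 1/sqrt(2500) = 0.02.
print(relative_fluctuation(100))
print(relative_fluctuation(2500))
```

Growing ''N'' by a factor of 25 shrinks the relative spread by about a factor of 5, in line with the 1/√''N'' estimate.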
== Available useful work ==
{{See also|Available useful work (thermodynamics)}}

An important and revealing idealized special case is to consider applying the second law to the scenario of an isolated system (called the total system or universe), made up of two parts: a sub-system of interest, and the sub-system's surroundings. These surroundings are imagined to be so large that they can be considered as an ''unlimited'' heat reservoir at temperature ''T<sub>R</sub>'' and pressure ''P<sub>R</sub>'': no matter how much heat is transferred to (or from) the sub-system, the temperature of the surroundings will remain ''T<sub>R</sub>'', and no matter how much the volume of the sub-system expands (or contracts), the pressure of the surroundings will remain ''P<sub>R</sub>''.

Whatever changes to ''dS'' and ''dS<sub>R</sub>'' occur in the entropies of the sub-system and the surroundings individually, according to the second law the entropy ''S<sub>tot</sub>'' of the isolated total system must not decrease:

:$dS_{\mathrm{tot}}= dS + dS_R \ge 0$

According to the [[First Law of Thermodynamics]], the change ''dU'' in the internal energy of the sub-system is the sum of the heat ''δq'' added to the sub-system, ''less'' any work ''δw'' done ''by'' the sub-system, ''plus'' any net chemical energy entering the sub-system, so that:

:$dU = \delta q - \delta w + d\left(\sum \mu_{iR}N_i\right) \,$

where ''μ<sub>iR</sub>'' are the [[chemical potential]]s of chemical species in the external surroundings.

Now the heat leaving the reservoir and entering the sub-system is

:$\delta q = T_R (-dS_R) \le T_R dS$

where we have first used the definition of entropy in classical thermodynamics (alternatively, in statistical thermodynamics, the relation between entropy change, temperature and absorbed heat can be derived), and then the second-law inequality from above.
It therefore follows that any net work ''δw'' done by the sub-system must obey

:$\delta w \le - dU + T_R dS + \sum \mu_{iR} dN_i \,$

It is useful to separate the work ''δw'' done by the subsystem into the ''useful'' work ''δw<sub>u</sub>'' that can be done ''by'' the sub-system, over and beyond the work ''p<sub>R</sub> dV'' done merely by the sub-system expanding against the surrounding external pressure, giving the following relation for the useful work that can be done:

:$\delta w_u \le -d \left(U - T_R S + p_R V - \sum \mu_{iR} N_i \right)\,$

It is convenient to define the right-hand side as the exact derivative of a thermodynamic potential, called the ''availability'' or [[exergy]] ''X'' of the subsystem,

:$X = U - T_R S + p_R V - \sum \mu_{iR} N_i$

The second law therefore implies that for any process which can be considered as divided simply into a subsystem and an unlimited temperature and pressure reservoir with which it is in contact,

:$d X + \delta w_u \le 0 \,$

i.e. the change in the subsystem's exergy plus the useful work done ''by'' the subsystem (or, the change in the subsystem's exergy less any work, additional to that done by the pressure reservoir, done ''on'' the system) must be less than or equal to zero.

=== Special cases: Gibbs and Helmholtz free energies ===
When no useful work is being extracted from the sub-system, it follows that

:$d X \le 0 \,$

with the [[exergy]] ''X'' reaching a minimum at equilibrium, when ''dX'' = 0.

If no chemical species can enter or leave the sub-system, then the term ∑''μ<sub>iR</sub>N<sub>i</sub>'' can be ignored.
If furthermore the temperature of the sub-system is such that ''T'' is always equal to ''T<sub>R</sub>'', then this gives:

:$X = U - TS + p_R V + \mathrm{const.} \,$

If the volume ''V'' is constrained to be constant, then

:$X = U - TS + \mathrm{const.'} = A + \mathrm{const.'}\,$

where ''A'' is the thermodynamic potential called the [[Helmholtz free energy]], ''A'' = ''U'' − ''TS''. Under constant-volume conditions, therefore, ''dA'' ≤ 0 if a process is to go forward, and ''dA'' = 0 is the condition for equilibrium.

Alternatively, if the sub-system pressure ''p'' is constrained to be equal to the external reservoir pressure ''p<sub>R</sub>'', then

:$X = U - TS + pV + \mathrm{const.} = G + \mathrm{const.}\,$

where ''G'' is the [[Gibbs free energy]], ''G'' = ''U'' − ''TS'' + ''PV''. Therefore, under constant-pressure conditions, if ''dG'' ≤ 0, then the process can occur spontaneously, because the change in system energy exceeds the energy lost to entropy. ''dG'' = 0 is the condition for equilibrium. This is also commonly written in terms of [[enthalpy]], where ''H'' = ''U'' + ''PV''.

=== Application ===
In sum, if a proper ''infinite-reservoir-like'' reference state is chosen as the system surroundings in the real world, then the second law predicts a decrease in ''X'' for an irreversible process and no change for a reversible process:

:$dS_{tot} \ge 0$ is equivalent to $dX + \delta w_u \le 0$

This expression, together with the associated reference state, permits a [[design engineer]] working at the macroscopic scale (above the [[thermodynamic limit]]) to utilize the second law without directly measuring or considering entropy change in a total isolated system (''see also [[process engineer]]''). Those changes have already been considered by the assumption that the system under consideration can reach equilibrium with the reference state without altering the reference state.
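The constant-pressure criterion above (''dG'' ≤ 0 for a spontaneous process, ''dG'' = 0 at equilibrium) can be illustrated with a toy calculation. The enthalpy and entropy changes below are hypothetical numbers, chosen only to show how the sign of the Gibbs free-energy change flips with temperature.

```python
# Hypothetical example values: an exothermic process whose entropy
# decreases, so it is spontaneous only at low enough temperature.

def delta_g(delta_h, delta_s, temperature):
    """Gibbs free-energy change dG = dH - T*dS, in J/mol."""
    return delta_h - temperature * delta_s

delta_h = -50_000.0   # J/mol     (assumed example value)
delta_s = -100.0      # J/(mol*K) (assumed example value)

# The sign of dG flips at T = delta_h / delta_s = 500 K:
for t in (300.0, 700.0):
    dg = delta_g(delta_h, delta_s, t)
    print(t, dg, "spontaneous" if dg < 0 else "non-spontaneous")
# 300.0 -20000.0 spontaneous
# 700.0 20000.0 non-spontaneous
```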
An efficiency for a process or collection of processes that compares it to the reversible ideal may also be found (''see [[Exergy efficiency|second law efficiency]]'').

This approach to the second law is widely utilized in [[engineering]] practice, [[environmental accounting]], [[systems ecology]], and other disciplines.

== Proof of the Second Law ==
As mentioned above, in statistical mechanics the second law is not a postulate; rather, it is a consequence of the [[Statistical_mechanics#Fundamental postulate|fundamental postulate]], also known as the equal prior probability postulate, so long as one is clear that simple probability arguments are applied only to the future, while for the past there are auxiliary sources of information which tell us that it was low entropy. The first part of the second law, which states that the entropy of a thermally isolated system can only increase, is a trivial consequence of the equal prior probability postulate, if we restrict the notion of the entropy to systems in thermal equilibrium. The entropy of an isolated system in thermal equilibrium containing an amount of energy $E$ is:

:$S = k \log\left[\Omega\left(E\right)\right]\,$

where $\Omega\left(E\right)$ is the number of quantum states in a small interval between $E$ and $E +\delta E$. Here $\delta E$ is a macroscopically small energy interval that is kept fixed. Strictly speaking, this means that the entropy depends on the choice of $\delta E$. However, in the thermodynamic limit (i.e. in the limit of infinitely large system size), the specific entropy (entropy per unit volume or per unit mass) does not depend on $\delta E$.

Suppose we have an isolated system whose macroscopic state is specified by a number of variables. These macroscopic variables can, for example, refer to the total volume or the positions of pistons in the system. Then $\Omega$ will depend on the values of these variables. If a variable is not fixed (e.g. we do not clamp a piston in a certain position), then, because all the accessible states are equally likely in equilibrium, the free variable in equilibrium will take the value for which $\Omega$ is maximized, as that is the most probable situation in equilibrium.

If the variable was initially fixed to some value, then upon release, once the new equilibrium has been reached, the fact that the variable will adjust itself so that $\Omega$ is maximized implies that the entropy will have increased or stayed the same (if the value at which the variable was fixed happened to be the equilibrium value).

The entropy of a system that is not in equilibrium can be defined as (see [[entropy]]):

:$S = -k_{B}\sum_{j}P_{j}\log\left(P_{j}\right)$

Here the $P_{j}$ are the probabilities for the system to be found in the states labeled by the subscript ''j''. In thermal equilibrium the probabilities for states inside the energy interval $\delta E$ are all equal to $1/\Omega$, and in that case the general definition coincides with the previous definition of ''S'' that applies to the case of thermal equilibrium.

Suppose we start from an equilibrium situation and we suddenly remove a constraint on a variable. Then, right after we do this, there are a number $\Omega$ of accessible microstates, but equilibrium has not yet been reached, so the actual probabilities of the system being in some accessible state are not yet equal to the prior probability of $1/\Omega$. We have already seen that in the final equilibrium state, the entropy will have increased or stayed the same relative to the previous equilibrium state. Boltzmann's [[H-theorem]], however, proves that the entropy increases continuously as a function of time during the intermediate out-of-equilibrium state.
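A minimal sketch of the Ω-maximization argument above, for an assumed toy system: ''N'' distinguishable particles free to occupy either side of a container, so the number of microstates with ''n'' particles on the left is the binomial coefficient C(''N'', ''n''). Releasing a constraint that pinned ''n'' lets the system settle where Ω, and hence ''S'' = ''k'' log Ω, is maximal.

```python
# Toy system (illustrative, not from the text): Omega(n) = C(N, n)
# microstates with n of N particles on the left side; entropy in units
# of k is log(Omega).
from math import comb, log

N = 20
omega = {n: comb(N, n) for n in range(N + 1)}
n_star = max(omega, key=omega.get)
print(n_star)   # 10: Omega peaks at the even split

# Entropy before and after releasing a constraint that held n = 2.
s_constrained = log(omega[2])
s_equilibrium = log(omega[n_star])
print(s_equilibrium > s_constrained)   # True: releasing the constraint
                                       # cannot lower the entropy
```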
### Proof of $dS =\frac{\delta Q}{T}$ for reversible processes

The second part of the second law states that the entropy change of a system undergoing a reversible process is given by:

$dS =\frac{\delta Q}{T}$

where the temperature is defined as:

$\frac{1}{k T}\equiv\beta\equiv\frac{d\log\left[\Omega\left(E\right)\right]}{dE}$

See the [[microcanonical ensemble]] for the justification of this definition. Suppose that the system has some external parameter, $x$, that can be changed. In general, the energy eigenstates of the system will depend on $x$. According to the [[adiabatic theorem]] of quantum mechanics, in the limit of an infinitely slow change of the system's Hamiltonian, the system will stay in the same energy eigenstate and thus change its energy according to the change in energy of the energy eigenstate it is in.

The generalized force, $X$, corresponding to the external variable $x$ is defined such that $X dx$ is the work performed by the system if $x$ is increased by an amount $dx$. For example, if $x$ is the volume, then $X$ is the pressure. The generalized force for a system known to be in energy eigenstate $E_{r}$ is given by:

$X = -\frac{dE_{r}}{dx}$

Since the system can be in any energy eigenstate within an interval of $\delta E$, we define the generalized force for the system as the expectation value of the above expression:

$X = -\left\langle\frac{dE_{r}}{dx}\right\rangle$

To evaluate the average, we partition the $\Omega\left(E\right)$ energy eigenstates by counting how many of them have a value for $\frac{dE_{r}}{dx}$ within a range between $Y$ and $Y + \delta Y$.
Calling this number $\Omega_{Y}\left(E\right)$, we have:

$\Omega\left(E\right)=\sum_{Y}\Omega_{Y}\left(E\right)$

The average defining the generalized force can now be written:

$X = -\frac{1}{\Omega\left(E\right)}\sum_{Y} Y\Omega_{Y}\left(E\right)$

We can relate this to the derivative of the entropy with respect to $x$ at constant energy $E$ as follows. Suppose we change $x$ to $x + dx$. Then $\Omega\left(E\right)$ will change because the energy eigenstates depend on $x$, causing energy eigenstates to move into or out of the range between $E$ and $E+\delta E$. Let's focus again on the energy eigenstates for which $\frac{dE_{r}}{dx}$ lies within the range between $Y$ and $Y + \delta Y$. Since these energy eigenstates increase in energy by $Y dx$, all such energy eigenstates that are in the interval ranging from $E - Y dx$ to $E$ move from below $E$ to above $E$. There are

$N_{Y}\left(E\right)=\frac{\Omega_{Y}\left(E\right)}{\delta E} Y dx$

such energy eigenstates. If $Y dx\leq\delta E$, all these energy eigenstates will move into the range between $E$ and $E+\delta E$ and contribute to an increase in $\Omega$. The number of energy eigenstates that move from below $E+\delta E$ to above $E+\delta E$ is, of course, given by $N_{Y}\left(E+\delta E\right)$. The difference

$N_{Y}\left(E\right) - N_{Y}\left(E+\delta E\right)$

is thus the net contribution to the increase in $\Omega$. Note that if $Y dx$ is larger than $\delta E$, there will be energy eigenstates that move from below $E$ to above $E+\delta E$. They are counted in both $N_{Y}\left(E\right)$ and $N_{Y}\left(E+\delta E\right)$, so the above expression is also valid in that case.

Expressing the above expression as a derivative w.r.t.
$E$ and summing over $Y$ yields the expression:

$\left(\frac{\partial\Omega}{\partial x}\right)_{E} = -\sum_{Y}Y\left(\frac{\partial\Omega_{Y}}{\partial E}\right)_{x}= \left(\frac{\partial\left(\Omega X\right)}{\partial E}\right)_{x}$

The logarithmic derivative of $\Omega$ with respect to $x$ is thus given by:

$\left(\frac{\partial\log\left(\Omega\right)}{\partial x}\right)_{E} = \beta X +\left(\frac{\partial X}{\partial E}\right)_{x}$

The first term is intensive, i.e. it does not scale with system size. In contrast, the last term scales as the inverse system size and thus vanishes in the thermodynamic limit. We have thus found that:

$\left(\frac{\partial S}{\partial x}\right)_{E} = \frac{X}{T}$

Combining this with

$\left(\frac{\partial S}{\partial E}\right)_{x} = \frac{1}{T}$

gives:

$dS = \left(\frac{\partial S}{\partial E}\right)_{x}dE+\left(\frac{\partial S}{\partial x}\right)_{E}dx = \frac{dE}{T} + \frac{X}{T} dx=\frac{\delta Q}{T}$

#### Proof for systems described by the canonical ensemble

If a system is in thermal contact with a heat bath at some temperature $T$ then, in equilibrium, the probability distribution over the energy eigenvalues is given by the [[canonical ensemble]]:

$P_{j}=\frac{\exp\left(-\frac{E_{j}}{k T}\right)}{Z}$

Here $Z$ is a factor that normalizes the sum of all the probabilities to 1; it is known as the [[Partition function (statistical mechanics)|partition function]]. We now consider an infinitesimal reversible change in the temperature and in the external parameters on which the energy levels depend.
It follows from the general formula for the entropy:

$S = -k\sum_{j}P_{j}\log\left(P_{j}\right)$

that

$dS = -k\sum_{j}\log\left(P_{j}\right)dP_{j}$

Inserting the formula for $P_{j}$ for the canonical ensemble here gives:

$dS = \frac{1}{T}\sum_{j}E_{j}dP_{j}=\frac{1}{T}\sum_{j}d\left(E_{j}P_{j}\right) - \frac{1}{T}\sum_{j}P_{j}dE_{j}= \frac{dE + \delta W}{T}=\frac{\delta Q}{T}$

## Criticisms

Owing to the somewhat abstract nature of the formulation of the second law, i.e. the postulate that the quantity [[heat]] divided by [[Absolute temperature|temperature]] increases in spontaneous natural processes, it has occasionally been subject to criticism, as well as to attempts to dispute or disprove it. Clausius himself noted the abstract nature of the second law. In his 1862 memoir, for example, after mathematically stating the second law by saying that the integral of the differential of a quantity of heat divided by temperature must be less than or equal to zero for every cyclical process which is in any way possible (the [[Clausius theorem|Clausius inequality]]):

$\oint \frac{\delta Q}{T} \le 0$

Clausius then stated:
Although the necessity of this theorem admits of strict mathematical proof if we start from the fundamental proposition above quoted it thereby nevertheless retains an abstract form, in which it is with difficulty embraced by the mind, and we feel compelled to seek for the precise physical cause, of which this theorem is a consequence.
"The law that entropy always increases holds, I think, the supreme position among the [[laws of Nature]]. If someone points out to you that your pet theory of the [[universe]] is in disagreement with [[Maxwell's equations]] — then so much the worse for Maxwell's equations. If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation." — Sir [[Arthur Stanley Eddington]], ''The Nature of the Physical World'' (1927)
The tendency for entropy to increase in isolated systems is expressed in the second law of thermodynamics — perhaps the most pessimistic and amoral formulation in all human thought. — [[Gregory Hill (writer)|Gregory Hill]] and [[Kerry Thornley]], ''[[Principia Discordia]]'' (1965)
There are almost as many formulations of the second law as there have been discussions of it. — Physicist and philosopher [[Percy Williams Bridgman|P.W. Bridgman]] (1941)
## See also

* [[Clausius–Duhem inequality]]
* [[Entropy]]
* [[Entropy (arrow of time)]]
* ''[[Entropy: A New World View]]'' (book)
* [[Final Anthropic Principle]]
* [[First law of thermodynamics]]
* [[Fluctuation theorem]]
* [[Heat death of the universe]]
* [[History of thermodynamics]]
* [[Jarzynski equality]]
* [[Laws of thermodynamics]]
* [[Loschmidt's paradox]]
* [[Maximum entropy thermodynamics]]
* [[Perpetual motion]]
* [[Relativistic heat conduction]]
* [[Exergy efficiency|Second-law efficiency]]
* [[Statistical mechanics]]
* [[Thermal diode]]

## Introduction

### Versions of the Second Law

There are many ways of stating the second law of thermodynamics, but all are equivalent in the sense that each form of the second law logically implies every other form. Thus, the theorems of thermodynamics can be proved using any form of the second law and third law.

The formulation of the second law that refers to entropy directly is as follows:

In a system, a process that occurs will tend to increase the total entropy of the universe.

Thus, while a system can go through some physical process that decreases its own entropy, the entropy of the universe (which includes the system and its surroundings) must increase overall. (An exception to this rule is a reversible or "isentropic" process, such as frictionless adiabatic compression.) Processes that decrease the total entropy of the universe are impossible. If a system is at equilibrium, by definition no spontaneous processes occur, and therefore the system is at maximum entropy.

A second formulation, due to Rudolf Clausius, is the simplest formulation of the second law: the heat formulation, or Clausius statement:

Heat generally cannot flow spontaneously from a material at lower temperature to a material at higher temperature.

Informally, "Heat doesn't flow from cold to hot (without work input)", which is obviously true from everyday experience. For example, in a refrigerator, heat flows from cold to hot, but only when aided by an external agent (i.e. the compressor). Note that from the mathematical definition of entropy, a process in which heat flows from cold to hot has decreasing entropy. This can happen in a non-isolated system if entropy is created elsewhere, such that the total entropy is constant or increasing, as required by the second law. For example, the electrical energy driving a refrigerator is converted to heat and expelled out the back, representing a net increase in entropy.
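The entropy bookkeeping described above can be made concrete with a short calculation. In the sketch below the reservoir temperatures and the amount of heat are arbitrary illustrative values, and the reservoirs are assumed large enough that their temperatures stay fixed:

```python
T_hot, T_cold = 400.0, 300.0   # reservoir temperatures in kelvin (illustrative)
Q = 100.0                      # joules of heat transferred (illustrative)

# Spontaneous direction: heat leaves the hot body and enters the cold body
dS_hot = -Q / T_hot            # hot reservoir loses entropy
dS_cold = +Q / T_cold          # cold reservoir gains more entropy than was lost
dS_total = dS_hot + dS_cold
assert dS_total > 0            # allowed by the second law

# Reversed direction (cold to hot) with no other change would decrease entropy
dS_reversed = (Q / T_hot) - (Q / T_cold)
assert dS_reversed < 0         # forbidden without compensating entropy production
```

The cold reservoir gains more entropy than the hot one loses because it is at a lower temperature, which is why the spontaneous direction is hot to cold.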

The exception to this is for statistically unlikely events in which hot particles "steal" the energy of cold particles, so that for an instant the cold side gets colder and the hot side gets hotter. Such events have been observed at scales small enough that the likelihood of their occurring is significant.[1] The mathematics of such events is described by the fluctuation theorem.

A third formulation of the second law, by Lord Kelvin, is the heat engine formulation, or Kelvin statement:

It is impossible to convert heat completely into work in a cyclic process.

That is, it is impossible to extract energy by heat from a high-temperature energy source and then convert all of the energy into work. At least some of the energy must be passed on to heat a low-temperature energy sink. Thus, a heat engine with 100% efficiency is thermodynamically impossible. This also means that it is impossible to build solar panels that generate electricity solely from the infrared band of the electromagnetic spectrum without consideration of the temperature on the other side of the panel (as is the case with conventional solar panels that operate in the visible spectrum).
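The Kelvin statement implies the Carnot bound η = 1 − T_cold/T_hot on the efficiency of any cyclic heat engine operating between two reservoirs. A minimal numeric sketch with illustrative reservoir temperatures:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum fraction of input heat convertible to work by a cyclic engine
    operating between two reservoirs (temperatures in kelvin)."""
    return 1.0 - t_cold / t_hot

# Illustrative reservoir temperatures
eta = carnot_efficiency(t_hot=500.0, t_cold=300.0)
assert 0 < eta < 1              # some heat must always be rejected to the cold sink
assert abs(eta - 0.4) < 1e-12   # 1 - 300/500
```

The efficiency reaches 1 only in the limit T_cold → 0 (or T_hot → ∞), which is why a 100%-efficient cyclic engine is impossible.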

A fourth version of the second law was deduced by the Greek mathematician Constantin Carathéodory. The Carathéodory statement:

In the neighbourhood of any equilibrium state of a thermodynamic system, there are equilibrium states that are adiabatically inaccessible.

A final version of the second law was put to rhyme by Flanders and Swann[2], based on the Clausius statement:

Heat won't pass from a cooler to a hotter
You can try it if you like but you far better notter
'cos the cold in the cooler will get hotter as a ruler
'cos the hotter body's heat will pass to the cooler!

Formulations of the second law in modern textbooks that introduce entropy from the statistical point of view often contain two parts. The first part states that the entropy of an isolated system cannot decrease, or, more precisely, that the probability of an entropy decrease is exceedingly small. The second part gives the relation between the infinitesimal entropy increase of a system and an infinitesimal amount of absorbed heat in the case of an arbitrary infinitesimal reversible process: dS = δq / T. The reason these two statements are not combined into one is that the first part refers to a general non-equilibrium process, for which a temperature is not defined.
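The reversible relation dS = δq/T can be checked numerically on a simple model. The sketch below uses a two-level system in the canonical ensemble (the energy gap and temperatures are arbitrary illustrative choices, with the Boltzmann constant set to 1); since the energy levels are held fixed, no work is done and the absorbed heat equals the energy change:

```python
import math

eps = 1.0  # energy gap of a two-level system (illustrative units, k_B = 1)

def energy_and_entropy(T):
    """Canonical-ensemble mean energy and Gibbs entropy for levels {0, eps}."""
    z = 1.0 + math.exp(-eps / T)               # partition function
    p0, p1 = 1.0 / z, math.exp(-eps / T) / z   # Boltzmann probabilities
    return p1 * eps, -(p0 * math.log(p0) + p1 * math.log(p1))

# A small reversible temperature step; the levels are fixed, so no work is done
# and the absorbed heat is just the energy change: dq = dE.
T, dT = 2.0, 1e-4
E1, S1 = energy_and_entropy(T)
E2, S2 = energy_and_entropy(T + dT)
dS, dq = S2 - S1, E2 - E1
Tm = T + dT / 2                                # midpoint temperature of the step
assert abs(dS - dq / Tm) < 1e-6 * abs(dS)      # dS ≈ δq/T for this step
```

As the step size dT shrinks, the agreement between dS and δq/T becomes exact, which is the content of the second part of the law.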

## Descriptions

### Informal descriptions

The second law can be stated in various succinct ways, including:

• It is impossible to produce work in the surroundings using a cyclic process connected to a single heat reservoir (Kelvin, 1851).
• It is impossible to carry out a cyclic process using an engine connected to two heat reservoirs that will have as its only effect the transfer of a quantity of heat from the low-temperature reservoir to the high-temperature reservoir (Clausius, 1854).
• If thermodynamic work is to be done at a finite rate, free energy must be expended.[3]

### Mathematical descriptions

In 1856, the German physicist Rudolf Clausius stated what he called the "second fundamental theorem in the mechanical theory of heat" in the following form:[4]

$\int \frac{\delta Q}{T} = -N$

where Q is heat, T is temperature, and N is the "equivalence-value" of all uncompensated transformations involved in a cyclical process. Later, in 1865, Clausius would come to define "equivalence-value" as entropy. On the heels of this definition, that same year, the most famous version of the second law was read in a presentation at the Philosophical Society of Zurich on April 24, at the end of which Clausius concludes:

The entropy of the universe tends to a maximum.

This statement is the best-known phrasing of the second law. Moreover, owing to the broadness of the terminology used here (e.g. "universe"), as well as the lack of specific conditions (e.g. open, closed, or isolated) under which this statement applies, many people take this simple statement to mean that the second law of thermodynamics applies to virtually every subject imaginable. This, of course, is not true; the statement is only a simplified version of a more complex description.
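Clausius's cyclic integral can be checked for the idealized reversible Carnot cycle, for which the rejected and absorbed heats satisfy Q_cold/Q_hot = T_cold/T_hot, so the integral vanishes (N = 0); any extra, uncompensated heat rejection makes it negative. A small numeric sketch with illustrative values:

```python
T_hot, T_cold = 500.0, 300.0   # reservoir temperatures in kelvin (illustrative)
Q_hot = 1000.0                 # heat absorbed from the hot reservoir (illustrative)

# Reversible Carnot cycle: rejected heat satisfies Q_cold/Q_hot = T_cold/T_hot
Q_cold = Q_hot * T_cold / T_hot
cyclic_integral = Q_hot / T_hot - Q_cold / T_cold
assert abs(cyclic_integral) < 1e-12            # reversible: the integral vanishes

# An irreversible cycle rejects extra heat, making the integral negative (N > 0)
Q_cold_irrev = Q_cold + 50.0
assert Q_hot / T_hot - Q_cold_irrev / T_cold < 0
```

This is the discrete, two-reservoir version of the Clausius inequality quoted in the Criticisms section.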

In terms of time variation, the mathematical statement of the second law for an isolated system undergoing an arbitrary transformation is:

$\frac{dS}{dt} \ge 0$

where

S is the entropy and
t is time.
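This monotone growth can be illustrated with a toy relaxation model: two finite bodies (the heat capacities, temperatures, conductance, and time step below are arbitrary illustrative choices) exchange heat at a rate proportional to their temperature difference, and the total entropy, S = C ln T per body up to an additive constant, never decreases:

```python
import math

# Two finite bodies exchanging heat inside an isolated enclosure (illustrative values)
C1 = C2 = 1.0                    # heat capacities
T1, T2 = 400.0, 300.0            # initial temperatures in kelvin
k, dt = 0.01, 0.1                # conductance and time step

def total_entropy(t1, t2):
    """Total entropy of the two bodies, up to an additive constant (k_B = 1)."""
    return C1 * math.log(t1) + C2 * math.log(t2)

S_prev = total_entropy(T1, T2)
for _ in range(2000):
    dQ = k * (T1 - T2) * dt      # heat flowing from body 1 to body 2
    T1 -= dQ / C1
    T2 += dQ / C2
    S = total_entropy(T1, T2)
    assert S >= S_prev - 1e-12   # dS/dt >= 0 throughout the relaxation
    S_prev = S
```

Each step adds dS = dQ(1/T2 − 1/T1) ≥ 0, and the entropy levels off only as the two temperatures converge to their common equilibrium value.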

Statistical mechanics gives an explanation for the second law by postulating that a material is composed of atoms and molecules which are in constant motion. A particular set of positions and velocities for each particle in the system is called a microstate of the system, and because of the constant motion, the system is constantly changing its microstate. Statistical mechanics postulates that, in equilibrium, each microstate that the system might be in is equally likely to occur; with this assumption, it follows directly that the second law must hold in a statistical sense. That is, the second law holds on average, with statistical variations on the order of 1/√N, where N is the number of particles in the system. For everyday (macroscopic) situations, the probability that the second law will be violated is practically zero. However, for systems with a small number of particles, thermodynamic parameters, including the entropy, may show significant statistical deviations from the second law's predictions. Classical thermodynamic theory does not deal with these statistical variations.
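The 1/√N scaling of the fluctuations can be seen in a toy model: place N particles independently in the left or right half of a box and measure the spread of the left-half fraction across many trials. A sketch (the trial count and seed are arbitrary choices, the seed fixed for reproducibility):

```python
import random
import statistics

def imbalance_spread(n_particles, trials=500, seed=0):
    """Std. dev. of the fraction of particles found in the left half of a box."""
    rng = random.Random(seed)
    fractions = [
        sum(rng.random() < 0.5 for _ in range(n_particles)) / n_particles
        for _ in range(trials)
    ]
    return statistics.pstdev(fractions)

# Theory: spread = 1/(2*sqrt(N)), so fluctuations vanish for macroscopic N
small, large = imbalance_spread(100), imbalance_spread(10000)
assert large < small                      # bigger systems fluctuate less
assert abs(small / large - 10.0) < 2.0    # roughly the sqrt(10000/100) ratio
```

For a macroscopic N of order 10²³ the relative fluctuation is of order 10⁻¹², which is why violations are never seen in everyday situations.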

## References

1. G.M. Wang, E.M. Sevick, E. Mittag, D.J. Searles & Denis J. Evans (2002). "Experimental demonstration of violations of the Second Law of Thermodynamics for small systems and short time scales". Physical Review Letters 89: 050601/1–050601/4. doi:10.1103/PhysRevLett.89.050601
2. http://iankitching.me.uk/humour/hippo/entropy.html
3. Stoner, C.D. (2000). Inquiries into the Nature of Free Energy and Entropy – in Biochemical Thermodynamics. Entropy, Vol 2.
4. Clausius, R. (1865). The Mechanical Theory of Heat – with its Applications to the Steam Engine and to Physical Properties of Bodies. London: John van Voorst, 1 Paternoster Row. MDCCCLXVII.

Faghri, A., and Zhang, Y., 2006, Transport Phenomena in Multiphase Systems, Elsevier, Burlington, MA.