The concept of '''[[entropy]]''' developed in response to the observation that a certain amount of functional energy released from [[combustion reactions]] is always lost to dissipation or friction and is thus not transformed into [[Work (thermodynamics)|useful work]]. Early heat-powered engines such as [[Thomas Savery]]'s (1698), the [[Newcomen engine]] (1712) and the Cugnot [[steam tricycle]] (1769) were inefficient, converting less than two percent of the input energy into useful [[work output]]; a great deal of useful energy was dissipated or lost.  Over the next two centuries, physicists investigated this puzzle of lost energy; the result was the concept of [[entropy]].
 
In the early 1850s, [[Rudolf Clausius]] set forth the concept of the [[thermodynamic system]] and posited the argument that in any [[irreversible process]] a small amount of [[heat]] energy ''δQ'' is incrementally dissipated across the system boundary. Clausius continued to develop his ideas of lost energy, and coined the term ''entropy''.
 
Since the mid-20th century the concept of entropy has found application in the field of [[information theory]], describing an analogous loss of data in information transmission systems.
 
==Classical thermodynamic views==
{{Main|classical thermodynamics}}
 
A statement by [[Thomas Aquinas]] in the ''[[Summa Theologica]]'' (1274) has sometimes been read as an early anticipation of the Second Law: "It is impossible for an effect to be stronger than its cause."<ref>http://www.newadvent.org/summa/2029.htm#article3</ref> Here, "be stronger than" corresponds in modern terminology to "have less entropy than." Another early formulation in this spirit is that "a cause must be equal to or greater than its effect."<ref>http://www.jstor.org/discover/10.2307/4181986</ref>
 
In 1803, mathematician [[Lazare Carnot]] published a work entitled ''Fundamental Principles of Equilibrium and Movement''. This work includes a discussion on the efficiency of fundamental machines, i.e. pulleys and inclined planes. Lazare Carnot saw through all the details of the mechanisms to develop a general discussion on the conservation of mechanical energy. Over the next three decades, Lazare Carnot’s theorem was taken as a statement that in any machine the accelerations and shocks of the moving parts all represent losses of ''moment of activity'', i.e. the [[Work (thermodynamics)|useful work]] done. From this Lazare drew the inference that [[perpetual motion]] was impossible. This ''loss of moment of activity'' was the first-ever rudimentary statement of the [[second law of thermodynamics]] and the concept of 'transformation-energy' or ''entropy'', i.e. energy lost to dissipation and friction.<ref>{{Cite book|author=Mendoza, E. |title=Reflections on the Motive Power of Fire – and other Papers on the Second Law of Thermodynamics by E. Clapeyron and R. Clausius|location=New York | publisher=Dover Publications |year=1988|isbn=0-486-44641-7}}</ref>
 
Lazare Carnot died in exile in 1823. During the following year Lazare’s son [[Nicolas Léonard Sadi Carnot|Sadi Carnot]], having graduated from the [[École Polytechnique]] training school for engineers, but now living on half-pay with his brother Hippolyte in a small apartment in Paris, wrote ''Reflections on the Motive Power of Fire''. In this paper, Sadi visualized an [[Carnot heat engine|ideal engine]] in which any heat (i.e., [[Caloric theory|caloric]]) converted into work could be reinstated by reversing the motion of the cycle, a concept subsequently known as [[thermodynamic reversibility]]. Building on his father's work, Sadi postulated that “some caloric is always lost” in the conversion into work, even in his idealized reversible heat engine, which excluded frictional losses and other losses due to the imperfections of any real machine. He also discovered that this idealized efficiency depended only on the temperatures of the heat reservoirs between which the engine was working, and not on the type of working fluid. No real [[heat engine]] could realize the [[Carnot cycle|Carnot cycle's]] reversibility, and so was condemned to be even less efficient. This loss of usable caloric was a precursory form of the increase in entropy as we now know it. Though formulated in terms of caloric rather than entropy, this was an early insight into the [[second law of thermodynamics]].
 
==1854 definition==
[[Image:Clausius.jpg|225px|thumb|right|[[Rudolf Clausius]] - originator of the concept of '''"entropy"''']]
In his 1854 memoir, Clausius first develops the concepts of ''interior work'', i.e. that "which the atoms of the body exert upon each other", and ''exterior work'', i.e. that "which arise from foreign influences [to] which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston.  He then discusses the three categories into which heat ''Q'' may be divided:
 
#Heat employed in increasing the heat actually existing in the body.
#Heat employed in producing the interior work.
#Heat employed in producing the exterior work.
 
Building on this logic, and following a mathematical presentation of the ''first fundamental theorem'', Clausius then presented the first-ever mathematical formulation of entropy, although at this point in the development of his theories he called it "equivalence-value", perhaps in reference to the concept of the [[mechanical equivalent of heat]] then being developed; the term ''entropy'' came into use only later.<ref>''Mechanical Theory of Heat'', by [[Rudolf Clausius]], 1850–1865</ref>  He stated:<ref>Published in Poggendorff’s ''Annalen'', December 1854, vol. xciii, p. 481; translated in the ''Journal de Mathématiques'', vol. xx, Paris, 1855, and in the ''Philosophical Magazine'', August 1856, s. 4, vol. xii, p. 81.</ref>
 
<blockquote>the ''second fundamental theorem'' in the mechanical [[theory of heat]] may thus be enunciated:
 
If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat ''Q'' from [[work (thermodynamics)|work]] at the temperature ''T'', has the ''equivalence-value'':
 
::<math> \frac {Q}{T}</math>
 
and the passage of the quantity of heat ''Q'' from the [[temperature]] ''T<sub>1</sub>'' to the temperature ''T<sub>2</sub>'', has the equivalence-value:
 
::<math> Q \left( \frac {1}{T_2} - \frac {1}{T_1}\right)</math>
 
wherein ''T'' is a function of the temperature, independent of the nature of the process by which the transformation is effected.</blockquote>
 
In modern terminology, we think of this equivalence-value as "entropy", symbolized by ''S''.  Thus, using the above description, we can calculate the entropy change Δ''S'' for the passage of the quantity of [[heat]] ''Q'' from the [[temperature]] ''T<sub>1</sub>'', through the "working body" of fluid (see [[heat engine]]), which was typically a body of steam, to the temperature ''T<sub>2</sub>'' as shown below:
[[Image:Entropy-diagram.png|right|325px|thumb|Diagram of Sadi Carnot's [[heat engine]], 1824]]
If we make the assignment:
 
:<math> S= \frac {Q}{T}</math>
 
Then, the entropy change or "equivalence-value" for this transformation is:
 
:<math> \Delta S = S_{\rm final} - S_{\rm initial} \, </math>
 
which equals:
 
:<math> \Delta S = \left(\frac {Q}{T_2} - \frac {Q}{T_1}\right)</math>
 
and by factoring out Q, we have the following form, as was derived by Clausius:
 
:<math> \Delta S = Q\left(\frac {1}{T_2} - \frac {1}{T_1}\right)</math>
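
As a numerical illustration (a hypothetical example, not taken from Clausius's memoir), letting a quantity of heat ''Q'' = 1000 J pass from a reservoir at ''T''<sub>1</sub> = 400 K to one at ''T''<sub>2</sub> = 300 K gives

:<math> \Delta S = 1000\ \mathrm{J}\left(\frac {1}{300\ \mathrm{K}} - \frac {1}{400\ \mathrm{K}}\right) \approx 0.83\ \mathrm{J/K},</math>

a positive equivalence-value, consistent with heat passing spontaneously from the hotter to the colder body.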
 
==1856 definition==
In 1856, Clausius stated what he called the "second fundamental theorem in the [[mechanical theory of heat]]" in the following form:
 
:<math>\int \frac{\delta Q}{T} = -N</math>
 
where ''N'' is the "equivalence-value" of all uncompensated transformations involved in a cyclical process.  This equivalence-value was a precursory formulation of entropy.<ref>Clausius, Rudolf. (1856). "''On the Application of the Mechanical theory of Heat to the Steam-Engine''." as found in: Clausius, R. (1865). [http://books.google.com/books?id=8LIEAAAAYAAJ The Mechanical Theory of Heat – with its Applications to the Steam Engine and to Physical Properties of Bodies]. London: John van Voorst, 1 Paternoster Row. MDCCCLXVII.</ref>
 
==1862 definition==
{{Main|disgregation}}
In 1862, Clausius stated what he called the “theorem respecting the equivalence-values of the transformations”, or what is now known as the [[second law of thermodynamics]], as follows:
 
:''The algebraic sum of all the transformations occurring in a cyclical process can only be positive, or, as an extreme case, equal to nothing.''
 
Quantitatively, Clausius states that the mathematical expression for this theorem is as follows. Let ''δQ'' be an element of the heat given up by the body to any reservoir of heat during its own changes (heat which it may absorb from a reservoir being here reckoned as negative), and ''T'' the [[absolute temperature]] of the body at the moment of giving up this heat; then the equation:
 
:<math>\int \frac{\delta Q}{T} = 0</math>
 
must be true for every reversible cyclical process, and the relation:
 
:<math>\int \frac{\delta Q}{T} \ge 0</math>
 
must hold good for every cyclical process which is in any way possible.  This was an early formulation of the second law and one of the original forms of the concept of entropy.
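
In modern treatments, where ''δQ'' is instead taken to be the heat ''absorbed'' by the body, the same relation is usually written as the [[Clausius theorem|Clausius inequality]],

:<math>\oint \frac{\delta Q}{T} \le 0 ,</math>

with equality holding for reversible cycles.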
 
==1865 definition==
In 1865, Clausius gave irreversible heat loss, or what he had previously been calling "equivalence-value", a name:<ref>{{Cite book| last = Laidler | first = Keith J. | title = The Physical World of Chemistry | publisher = Oxford University Press | year = 1995 | isbn = 0-19-855919-4 | pages = 104–105}}</ref><ref>[[OED]], Second Edition, 1989, "''Clausius (Pogg. Ann. CXXV. 390), assuming (unhistorically) the etymological sense of energy to be ‘work-contents’ (werk-inhalt), devised the term entropy as a corresponding designation for the ‘transformation-contents’ (verwandlungsinhalt) of a system"''</ref>
 
{{cquote|I propose to name the quantity ''S'' the entropy of the system, after the Greek word [τροπη ''trope''], the transformation.  I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.}}
 
Although Clausius did not specify why he chose the symbol "S" to represent entropy, it is arguable that Clausius chose "S" in honor of [[Nicolas Léonard Sadi Carnot|Sadi Carnot]], to whose 1824 article Clausius devoted over 15 years of work and research.  On the first page of his original 1850 article "On the Motive Power of Heat, and on the Laws which can be Deduced from it for the Theory of Heat", Clausius calls Carnot the most important of the researchers in the [[theory of heat]].<ref name="Clausius">{{Cite book| last = Clausius | first = Rudolf | title = On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat | publisher = Poggendorff's ''Annalen der Physik'', LXXIX (Dover Reprint) | year = 1850 | isbn = 0-486-59065-8}}</ref>
 
==Later developments==
In 1876, physicist [[J. Willard Gibbs]], building on the work of Clausius, [[Hermann von Helmholtz]] and others, proposed that the "available energy" Δ''G'' in a thermodynamic system could be accounted for mathematically by subtracting the "energy loss" ''T''Δ''S'' from the total energy change of the system, Δ''H''.  These concepts were further developed by [[James Clerk Maxwell]] (1871) and [[Max Planck]] (1903).
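
Expressed in modern notation, this relation for a process at constant temperature and pressure is the change in [[Gibbs free energy]]:

:<math>\Delta G = \Delta H - T \Delta S .</math>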
 
==Statistical thermodynamic views==
{{Main|statistical thermodynamics}}
 
In 1877, [[Ludwig Boltzmann]] formulated an alternative, statistical definition of entropy ''S'':
 
:<math>S = k_{\rm B} \ln \Omega \!</math>
where
:''k''<sub>B</sub> is [[Boltzmann constant|Boltzmann's constant]] and
:''Ω'' is the number of microstates consistent with the given macrostate.
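
As a schematic illustration (not tied to any particular physical system), a collection of ''N'' independent two-state components has Ω = 2<sup>''N''</sup> equally likely microstates, so its entropy on this definition is

:<math>S = k_{\rm B} \ln 2^N = N k_{\rm B} \ln 2 .</math>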
 
Boltzmann saw entropy as a measure of statistical "mixedupness" or disorder.  This concept was soon refined by [[J. Willard Gibbs]], and is now regarded as one of the cornerstones of the theory of [[statistical mechanics]].
 
==Information theory==
An analog to ''thermodynamic entropy'' is '''information entropy'''.  In 1948, while working at [[Bell Telephone Company|Bell Telephone]] Laboratories, electrical engineer [[Claude Elwood Shannon|Claude Shannon]] set out to quantify mathematically the statistical nature of “lost information” in phone-line signals. To do this, Shannon developed the very general concept of [[information entropy]], a fundamental cornerstone of [[information theory]].  Although accounts of the story vary, it seems that Shannon was initially not particularly aware of the close similarity between his new quantity and earlier work in thermodynamics.  In 1949, however, after Shannon had been working on his equations for some time, he happened to visit the mathematician [[John von Neumann]].  Regarding what Shannon should call the “measure of uncertainty” or attenuation in phone-line signals in his new information theory, one source recounts their discussion as follows:<ref>M. Tribus, E.C. McIrvine, “Energy and information”, ''Scientific American'', 224 (September 1971).</ref>
 
:{{cquote|My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.’}}
 
According to another source, when von Neumann asked him how he was getting on with his information theory, Shannon replied:<ref>{{Cite book|author= Avery, John|title=Information Theory and Evolution|publisher=World Scientific|year=2003|isbn=981-238-400-6}}</ref>
 
:{{cquote|The theory was in excellent shape, except that he needed a good name for “missing information”.  “Why don’t you call it entropy”, von Neumann suggested.  “In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage.”}}
 
In 1948 Shannon published his famous paper ''A Mathematical Theory of Communication'', in which he devoted a section to what he calls Choice, Uncertainty, and Entropy.<ref>C.E. Shannon, "A Mathematical Theory of Communication", ''[[Bell System Technical Journal]]'', vol. 27, pp. 379-423, 623-656, July, October, 1948, [http://cm.bell-labs.com/cm/ms/what/shannonday/paper.html Eprint], [http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf PDF]</ref> In this section, Shannon introduces an ''H function'' of the following form:
 
:<math>H = -K\sum_{i=1}^k p(i) \log p(i),</math>
 
where ''K'' is a positive constant.  Shannon then states that “any quantity of this form, where ''K'' merely amounts to a choice of a unit of measurement, plays a central role in information theory as measures of information, choice, and uncertainty.”  Then, as an example of how this expression applies in a number of different fields, he references R.C. Tolman’s 1938 ''Principles of Statistical Mechanics'', stating that “the form of ''H'' will be recognized as that of entropy as defined in certain formulations of statistical mechanics where ''p<sub>i</sub>'' is the probability of a system being in cell ''i'' of its phase space… ''H'' is then, for example, the ''H'' in Boltzmann’s famous [[H-theorem|H theorem]].”  In the decades since this statement was made, the two concepts have often been treated as overlapping, or even asserted to be exactly the same.
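
As a minimal illustration (a sketch in Python, not drawn from Shannon's paper), the function below evaluates ''H'' for a discrete probability distribution; the constant ''K'' and the base of the logarithm together merely fix the unit of measurement (bits for ''K'' = 1 and base 2):

<syntaxhighlight lang="python">
import math

def shannon_entropy(probs, K=1.0, base=2):
    """H = -K * sum(p_i * log(p_i)); zero-probability outcomes contribute nothing."""
    return -K * sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin (two equally likely outcomes) carries the maximal uncertainty of 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # about 0.47
</syntaxhighlight>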
 
Shannon's information entropy is a much more general concept than statistical thermodynamic entropy.  Information entropy is present whenever there are unknown quantities that can be described only by a probability distribution.  In a series of papers by [[E. T. Jaynes]] starting in 1957,<ref>E. T. Jaynes (1957) [http://bayes.wustl.edu/etj/articles/theory.1.pdf Information theory and statistical mechanics], ''Physical Review'' '''106''':620</ref><ref>E. T. Jaynes (1957) [http://bayes.wustl.edu/etj/articles/theory.2.pdf Information theory and statistical mechanics II], ''Physical Review'' '''108''':171</ref> the statistical thermodynamic entropy can be seen as just a particular application of Shannon's information entropy to the probabilities of particular microstates of a system occurring in order to produce a particular macrostate.
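
In this reading, the [[Gibbs entropy]] formula of statistical mechanics,

:<math>S = -k_{\rm B} \sum_i p_i \ln p_i ,</math>

is Shannon's ''H'' with the constant ''K'' set to Boltzmann's constant, the natural logarithm used, and the probabilities ''p<sub>i</sub>'' taken over the microstates of the system.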
 
==Popular use==
The term entropy is often used in popular language to denote a variety of unrelated phenomena. One example is the concept of '''corporate entropy''' as put forward somewhat humorously by authors Tom DeMarco and Timothy Lister in their 1987 classic publication ''Peopleware'', a book on growing and managing productive teams and successful software projects.  Here, they treat red tape and business-team inefficiency as a form of entropy, i.e. energy lost to waste. This concept has caught on and is now common jargon in business schools.
 
In another example, entropy is the main villain in [[Isaac Asimov]]'s short story [[The Last Question]] (first copyrighted in 1956). The story turns on the idea that, under the second law of thermodynamics, entropy must always increase, and asks whether that increase can ever be reversed.
 
==Terminology overlap==
When necessary, to disambiguate between the statistical thermodynamic concept of entropy, and entropy-like formulae put forward by different researchers, the statistical thermodynamic entropy is most properly referred to as the '''[[Gibbs entropy]]'''. The terms ''Boltzmann-Gibbs entropy'' or ''BG entropy'', and ''Boltzmann-Gibbs-Shannon entropy'' or ''BGS entropy'' are also seen in the literature.
 
==See also==
*[[Entropy]]
*[[Enthalpy]]
*[[Thermodynamic free energy]]
 
==References==
{{Reflist}}
 
==External links==
* [[Max Jammer]] (1973). [http://etext.lib.virginia.edu/cgi-local/DHI/dhi.cgi?id=dv2-12 ''Dictionary of the History of Ideas'': Entropy]
 
{{DEFAULTSORT:History Of Entropy}}
[[Category:Thermodynamic entropy]]
[[Category:History of thermodynamics]]
