“The entropy introduced in information theory is NOT a thermodynamical quantity.”

— Dirk ter Haar (1954), Anglo-Dutch physicist, in his Statistical Mechanics textbook, on the fact that Shannon entropy is not a thermodynamical quantity, contrary to modern popular opinion (for many) [3]

Thermodynamics | Confusion

The following is a set of collected statements, in chronological order, on the position that Shannon entropy (information entropy) has absolutely nothing to do with the entropy of thermodynamics, an apocryphal connection that remains one of the greatest ongoing confusions in the history of science.

“The entropy introduced in information theory is NOT a thermodynamical quantity.” — Dirk ter Haar (1954), Anglo-Dutch physicist

“Everyone knows that Shannon’s derivation is in error.” — Benoit Mandelbrot (1961), audience comment to Myron Tribus on Claude Shannon’s 1948 information entropy derivation [6]

“The similarity of the formal expressions in the two cases has misled many authors to identify entropy of information with negative physical entropy.” — Josef Jauch and Julius Baron (1972), Swiss theoretical physicist and American mathematician

“The bare truth is that one does not meet the concept of physical entropy in communication theory.” — Nicholas Georgescu-Roegen (1977), Romanian-born American mathematician

“There is an obvious relation between [message transmission] and what happens, for instance, when a solid melts and the orderly crystal structure is replaced by the more random motion of the molecules in the liquid. This has led [Shannon and Weaver (1949) and Szilard (1925)] to propose that the production of information is in effect the production of negative entropy (increase in order). In spite of the obvious analogy between the two kinds of processes, the attempt to embody this general idea in quantitative terms runs into serious difficulties and Popper (1976) and others hold it to be invalid.” — John Edsall and Hanoch Gutfreund (1983), Biothermodynamics: the Study of Biochemical Processes [7]

“Although information theory is more comprehensive than is statistical mechanics, this very comprehensiveness gives rise to objectionable consequences when it is applied in physics and chemistry. It remains true, nevertheless, that information theory can be of value in a heuristic sense. Notions about ‘loss of information’ can sometimes be intuitively useful. But they can also, like the comparable concept of ‘disorder’, give rise to mistakes. It needs to be kept in mind that thermodynamic entropy is fully objective and the same must apply to any other ‘entropy’ which is used as surrogate.” — Kenneth Denbigh and John Denbigh (1985), Entropy in Relation to Incomplete Knowledge [5]

“It is a mistaken belief that Shannon’s function—called ‘entropy’ by namesake misadoption in information theory (Shannon and Weaver 1949)—has resulted in a true generalization of the Carnot-Clausius state function [dQ/T] treatment and the Boltzmann-Gibbs statistical [H function] treatment of the original [thermodynamic] entropy formulation of heat [Q], thus, in a sense, freeing it from the disciplinary framework of thermodynamics for use in probability distributions in general—which is a major impediment to productive communication [in science].” — Jeffrey Wicken (1987), American evolutionary biochemist

“The two—information theoretic ideas and thermodynamic entropy—have been repeatedly confused since the time of von Neumann.” — Peter Coveney and Roger Highfield (1990), English science historians

“[The] ‘entropies’ in contexts where temperature T is absent have NOTHING to do with entropy of thermodynamics and NOTHING to do with the second law of thermodynamics.” — Harold Morowitz (1991), American biological thermodynamicist

“Information ‘entropy’ in all of its myriad nonphysicochemical forms as a measure of information or abstract communication has no relevance to the evaluation of thermodynamic entropy change.” — Frank Lambert (1999), American inorganic chemist

“Shannon entropy and Boltzmann entropy are completely unrelated.” — Stephen Kline (1999), American thermodynamicist

“Neumann’s proposal to call Shannon’s function defining information by the name ‘entropy’ opened a Pandora’s box of intellectual confusion.” — Antoine Danchin (2001), French geneticist

“Having now followed the three principal scholars’ work, regarded as responsible for the alleged equivalence between information and negative entropy, it is now clear that this alleged equivalence is physically baseless.” — Kozo Mayumi (2001), Japanese ecological thermodynamics economist

“The use of the term ‘Shannon entropy’, although Shannon himself did this, is a mistake because it leads to thinking that thermodynamic entropy is the same as the Shannon entropy. Shannon entropy, however, is NOT identical to ‘entropy’ because they have different units: bits per symbol and joules per kelvin, respectively.” — Thomas Schneider (2002), American biomolecular chemist
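The unit mismatch Schneider points to can be made explicit by setting the two standard textbook definitions side by side (these formulas are supplied here for comparison; they are not part of the quote itself):

```latex
% Shannon entropy of a discrete source with symbol probabilities p_i:
% a dimensionless count of binary choices (bits per symbol)
H = -\sum_{i} p_i \log_2 p_i \qquad [\text{bits/symbol}]

% Boltzmann entropy of a macrostate with W microstates:
% carries Boltzmann's constant k_B, hence units of joules per kelvin
S = k_B \ln W \qquad [\text{J/K}]
```

Only the second expression involves a physical constant and hence physical units; equating the two requires an extra, physically motivated identification that the quoted authors reject.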

“Shannon’s theory of information has set in motion one of the most farcical trains of misconceptions and misunderstandings in the modern history of the sciences, namely, that ‘information’ is a palpable thing with sufficient integrity to be measured and parceled out like so many dollops of butterscotch.” — Philip Mirowski (2002), American physical economist

“This [calling Shannon’s H function by the name ‘entropy’] created a situation of enormous confusion lasting up to date in many areas [and] led to an immediate and unjustified identification with thermodynamic entropy.” — Alberto Solana-Ortega (2002), Spanish mathematician

“Thermodynamic entropy and Shannon’s entropy are two different things.” — Erico Guizzo (2003), Brazilian electrical engineer

“Shannon entropy should not be confused with the term entropy as it is used in chemistry and physics. Shannon entropy does not depend on temperature. Therefore, it is not the same as thermodynamic entropy.” — Stuart Pullen (2005), American biochemist

“Von Neumann’s argument [basis of Shannon entropy] does not establish the desired conceptual linkage between Tr ρ log ρ and thermodynamic entropy.” — Meir Hemmo and Orly Shenker (2006), Israeli science philosophers

“For level-headed physicists, entropy—or order and disorder—is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature, heat, and work.”— Ingo Muller (2007), German thermodynamicist

“The analogy [of Shannon entropy] to thermodynamic entropy breaks down because Shannon’s concept is a logical (or structural) property, not a dynamical property. Shannon entropy, for example, does not generally increase spontaneously in most communication systems, so there is no equivalent to the second law of thermodynamics when it comes to the entropy of information. The arrangement of units in a message doesn’t spontaneously ‘tend’ to change toward equiprobability.” — Terrence Deacon (2011), American neurological anthropologist

“Information defined by Shannon and thermodynamic entropy are not quantitatively related. I recommend [that we] restrict the meaning of ‘entropy’ to its thermodynamic one as originally intended by Clausius and Boltzmann and remove the term ‘entropy’ from all discussions on information as defined by Shannon.” — Sungchul Ji (2012), American cellular pathologist

“Do not confuse [entropy] with entropy in information theory (Shannon, 1948). This name is only the result of a joke by von Neumann and Shannon; thermodynamic entropy has nothing to do with so-called information entropy (Muller, 2007).” — Ichiro Aoki (2012), Japanese systems engineer [2]

“I agree that thermodynamic entropy (thermal energy displacement, J/K) is not the same as other ‘extrapolations’ based on very limited, statistical and stochastic modeling similarity, like mislabeled information entropy. Natural phenomena are not governed nor caused by simulation tools, like mathematical analysis, the latter being simplified description of unique natural phenomena, but not more than that.” — Milivoje Kostic (2013), Serbian-born American mechanical engineer and thermodynamicist [4]

Related

The following are related quotes:

“Information is information, not matter or energy.” — Norbert Wiener (1948), Cybernetics [1]

“From a physical point of view, information theory has nothing to do with physics.” — Daniel Ueltschi (c.2011), ‘Shannon Entropy’ [1]

See also

● 2.332746 bits/elements distinguishability constant (19 Jan 2013) – Hmolpedia threads.

References

1. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (url).

2. Aoki, Ichiro. (2012).

3. (a) Zuzemsky, A.L. (2006). “Dirk ter Haar (1919-2002)”, Jinr.ru.

(b) Ter Haar, Dirk. (1954). Statistical Mechanics.

4. THETIT (reviews) - EoHT Beta wiki.

5. (a) Denbigh, Kenneth and Denbigh, John S. (1985). Entropy in Relation to Incomplete Knowledge.

(b) Labinger, Jay A. (1995). “Metaphoric Usage of the Second Law: Entropy as Time's (double-headed) Arrow in Tom Stoppard's

6. Tribus, M. (1998). “A Tribute to Edwin T. Jaynes”. In

7. (a) Popper, Karl. (1976).

(b) Edsall, John T. and Gutfreund, Hanoch. (1983). Biothermodynamics: the Study of Biochemical Processes.