“The entropy introduced in information theory is NOT a thermodynamical quantity.”
Dirk ter Haar (1954), Elements of Statistical Mechanics

Dirk ter Haar
Anglo-Dutch physicist Dirk ter Haar, whose 1954 textbook Elements of Statistical Mechanics states that Shannon entropy is not a thermodynamical quantity, contrary to a widespread modern opinion [3]
In quotes, information entropy quotes are quotes about Claude Shannon’s 1948 H formula, to which the name ‘entropy’ was attached, a measure of the choice among alternatives (e.g., 0 or 1) in a signal.
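For reference, Shannon’s H function for a source emitting symbols with probabilities p_1, ..., p_n is, in standard form (given here for orientation, not quoted from any source below):

$$H = -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{(bits per symbol)}$$

For a single equiprobable binary choice (p_1 = p_2 = 1/2), H = 1 bit.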

Thermodynamics | Confusion
The following is a set of collected statements, in chronological order, on the position that Shannon entropy (information entropy) has nothing to do with the entropy of thermodynamics, an apocryphal connection that ranks among the greatest ongoing confusions in the history of science, the situation being, in a sense, a buried Sokal affair: [1]

“The entropy introduced in information theory is NOT a thermodynamical quantity.”
Dirk ter Haar (1954), Anglo-Dutch physicist

“Everyone knows that Shannon’s derivation is in error.”
Benoit Mandelbrot (1961), audience comment to Myron Tribus on Claude Shannon’s 1948 information entropy derivation [6]

“The similarity of the formal expressions in the two cases has misled many authors to identify entropy of information with negative physical entropy.”
Josef Jauch and Julius Baron (1972), Swiss theoretical physicist and American mathematician

“The bare truth is that one does not meet the concept of physical entropy in communication theory.”
Nicholas Georgescu-Roegen (1977), Romanian-born American mathematician

“There is an obvious relation between [message transmission] and what happens, for instance, when a solid melts and the orderly crystal structure is replaced by the more random motion of the molecules in the liquid. This has led [Shannon and Weaver (1949) and Szilard (1925)] to propose that the production of information is in effect the production of negative entropy (increase in order). In spite of the obvious analogy between the two kinds of processes, the attempt to embody this general idea in quantitative terms runs into serious difficulties and Popper (1976) and others hold it to be invalid.”
John Edsall and Hanoch Gutfreund (1983), Biothermodynamics: the Study of Biochemical Processes [7]

“Although information theory is more comprehensive than is statistical mechanics, this very comprehensiveness gives rise to objectionable consequences when it is applied in physics and chemistry. It remains true, nevertheless, that information theory can be of value in a heuristic sense. Notions about ‘loss of information’ can sometimes be intuitively useful. But they can also, like the comparable concept of ‘disorder’, give rise to mistakes. It needs to be kept in mind that thermodynamic entropy is fully objective and the same must apply to any other ‘entropy’ which is used as surrogate.”
Kenneth Denbigh and John Denbigh (1985), Entropy in Relation to Incomplete Knowledge [5]

“It is a mistaken belief that Shannon’s function—called ‘entropy’ by namesake misadoption in information theory (Shannon and Weaver 1949)—has resulted in a true generalization of the Carnot-Clausius state function [dQ/T] treatment and the Boltzmann-Gibbs statistical [H function] treatment of the original [thermodynamic] entropy formulation of heat [Q], thus, in a sense, freeing it from disciplinary framework of thermodynamics for use in probability distributions in general—which is a major impediment to productive communication [in science].”
Jeffrey Wicken (1987), American evolutionary biochemist
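For comparison, the two thermodynamic formulations Wicken references are, in standard form:

$$dS = \frac{\delta Q_{\text{rev}}}{T} \quad \text{(Clausius)} \qquad\qquad S = k_B \ln W \quad \text{(Boltzmann)}$$

both carrying units of joules per kelvin, fixed in the first case by the heat Q and in the second by the Boltzmann constant k_B.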

“The two—information theoretic ideas and thermodynamic entropy—have been repeatedly confused since the time of von Neumann.”
Peter Coveney and Roger Highfield (1990), English science historians

“[The] ‘entropies’ in contexts where temperature T is absent have NOTHING to do with entropy of thermodynamics and NOTHING to do with the second law of thermodynamics.”
Harold Morowitz (1991), American biological thermodynamicist

“Information ‘entropy’ in all of its myriad nonphysicochemical forms as a measure of information or abstract communication has no relevance to the evaluation of thermodynamic entropy change.”
Frank Lambert (1999), American inorganic chemist

“Shannon entropy and Boltzmann entropy are completely unrelated.”
Stephen Kline (1999), American thermodynamicist

“Neumann’s proposal to call Shannon’s function defining information by the name ‘entropy’ opened a Pandora’s box of intellectual confusion.”
Antoine Danchin (2001), French geneticist

“Having now followed the three principal scholars’ work, regarded as responsible for the alleged equivalence between information and negative entropy, it is now clear that this alleged equivalence is physically baseless.”
Kozo Mayumi (2001), Japanese ecological thermodynamics economist

“The use of the term ‘Shannon entropy’, although Shannon himself did this, is a mistake because it leads to thinking that thermodynamic entropy is the same as the Shannon entropy. Shannon entropy, however, is NOT identical to ‘entropy’ because they have different units: bits per symbol and joules per kelvin, respectively.”
Thomas Schneider (2002), American biomolecular chemist
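Schneider’s units point can be made explicit by setting the Gibbs and Shannon expressions side by side (a standard comparison, not drawn from Schneider):

$$S = -k_B \sum_i p_i \ln p_i \;\; [\text{J/K}] \qquad\qquad H = -\sum_i p_i \log_2 p_i \;\; [\text{bits/symbol}]$$

The formal similarity is what invites the identification; the Boltzmann constant k_B, which carries the joules per kelvin, and the restriction of the p_i to physical microstate probabilities are what the Shannon expression lacks.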

“Shannon’s theory of information has set in motion one of the most farcical trains of misconceptions and misunderstandings in the modern history of the sciences, namely, that ‘information’ is a palpable thing with sufficient integrity to be measured and parceled out like so many dollops of butterscotch.”
Philip Mirowski (2002), American physical economist

“This [calling Shannon’s H function by the name ‘entropy’] created a situation of enormous confusion lasting up to date in many areas [and] led to an immediate and unjustified identification with thermodynamic entropy.”
Alberto Solana-Ortega (2002), Spanish mathematician

“Thermodynamic entropy and Shannon’s entropy are two different things.”
Erico Guizzo (2003), Brazilian electrical engineer

“Shannon entropy should not be confused with the term entropy as it is used in chemistry and physics. Shannon entropy does not depend on temperature. Therefore, it is not the same as thermodynamic entropy.”
Stuart Pullen (2005), American biochemist

“Von Neumann’s argument [basis of Shannon entropy] does not establish the desired conceptual linkage between Tr ρ log ρ and thermodynamic entropy.”
Meir Hemmo and Orly Shenker (2006), Israeli science philosophers
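The quantity at issue is the von Neumann entropy of a density matrix ρ, in standard form:

$$S_{\text{vN}} = -k_B \, \mathrm{Tr}(\rho \ln \rho)$$

which reduces to the Gibbs form $-k_B \sum_i p_i \ln p_i$ when ρ is diagonal with eigenvalues p_i.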

“For level-headed physicists, entropy—or order and disorder—is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature, heat, and work.”
Ingo Muller (2007), German thermodynamicist

“The analogy [of Shannon entropy] to thermodynamic entropy breaks down because Shannon’s concept is a logical (or structural) property, not a dynamical property. Shannon entropy, for example, does not generally increase spontaneously in most communication systems, so there is no equivalent to the second law of thermodynamics when it comes to the entropy of information. The arrangement of units in a message doesn’t spontaneously ‘tend’ to change toward equiprobability.”
Terrence Deacon (2011), American neurological anthropologist
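Deacon’s point that Shannon’s H is a structural rather than dynamical property can be illustrated with a minimal Python sketch (illustrative only; the function name and sample message are hypothetical, not from Deacon):

    from collections import Counter
    from math import log2

    def shannon_entropy(message: str) -> float:
        # H = -sum(p_i * log2(p_i)) over the symbol frequencies of the message
        counts = Counter(message)
        n = len(message)
        return -sum((c / n) * log2(c / n) for c in counts.values())

    # H depends only on the frequency distribution of the symbols, not on
    # their order, and it does not evolve in time: rearranging the message
    # leaves H unchanged, so nothing here 'tends' toward equiprobability.
    msg = "in the beginning"
    print(shannon_entropy(msg))                    # some value H
    print(shannon_entropy("".join(sorted(msg))))   # identical value H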

“Information defined by Shannon and thermodynamic entropy are not quantitatively related. I recommend [that we] restrict the meaning of ‘entropy’ to its thermodynamic one as originally intended by Clausius and Boltzmann and remove the term ‘entropy’ from all discussions on information as defined by Shannon.”
Sungchul Ji (2012), American cellular pathologist

“Do not confuse [entropy] with entropy in information theory (Shannon, 1948). This name is only the result of a joke by von Neumann and Shannon; thermodynamic entropy has nothing to do with so-called information entropy (Muller, 2007).”
Ichiro Aoki (2012), Japanese systems engineer [2]

“I agree that thermodynamic entropy (thermal energy displacement, J/K) is not the same as other ‘extrapolations’ based on very limited statistical and stochastic modeling similarity, like mislabeled information entropy. Natural phenomena are not governed nor caused by simulation tools, like mathematical analysis, the latter being a simplified description of unique natural phenomena, but not more than that.”
Milivoje Kostic (2013), Serbian-born American mechanical engineer and thermodynamicist [4]

Related
The following are related quotes:

“Information is information, not matter or energy.”
Norbert Wiener (1948), Cybernetics [1]

“From a physical point of view, information theory has nothing to do with physics.”
Daniel Ueltschi (c.2011), ‘Shannon Entropy’ [1]

See also
● 2.332746 bits/elements distinguishability constant (19 Jan 2013) – Hmolpedia threads.

References
1. Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (url), Journal of Human Thermodynamics, 8(1): 1-120, Dec 19.
2. Aoki, Ichiro. (2012). Entropy Principle for the Development of Complex Biotic Systems: Organisms, Ecosystems, the Earth (pg. 2). Elsevier.
3. (a) Kuzemsky, A.L. (2006). “Dirk ter Haar (1919-2002)”, Jinr.ru.
(b) Ter Haar, Dirk. (1954). Elements of Statistical Mechanics (pg. 232). Rinehart.
4. THETIT (reviews) - EoHT Beta wiki.
5. (a) Denbigh, Kenneth and Denbigh, John S. (1985). Entropy in Relation to Incomplete Knowledge (abs) (pg. 117). Cambridge University Press.
(b) Labinger, Jay A. (1995). “Metaphoric Usage of the Second Law: Entropy as Time's (double-headed) Arrow in Tom Stoppard's Arcadia”, presented at the Nov meeting of the Society for Literature and Science, Los Angeles; in: The Chemical Intelligencer (Oct 1996), pgs. 31-36; quote, pg. 32.
6. Tribus, M. (1998). “A Tribute to Edwin T. Jaynes”; in: Maximum Entropy and Bayesian Methods, Garching, Germany 1998: Proceedings of the 18th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis (pgs. 11-20; quote, pg. 13), eds. Wolfgang von der Linden, Volker Dose, Rainer Fischer, and Roland Preuss. Springer, 1999.
7. (a) Popper, Karl. (1976). Unended Quest: an Intellectual Autobiography. Routledge, 2005.
(b) Edsall, John T. and Gutfreund, Hanoch (1983). Biothermodynamics: the Study of Biochemical Processes at Equilibrium (pg. 27). John Wiley & Sons, Inc.
