Claude Shannon
In existographies, Claude Shannon (1916-2001) (IQ:175|#212) [CR:180] was an American electrical engineer and mathematician noted for his 1948 article “A Mathematical Theory of Communication”, in which he introduced a new logarithmic measure of communication information transmission which, on the spurious suggestion of his associate John Neumann, he called "entropy" (a term borrowed from statistical mechanics, specifically Ludwig Boltzmann's H-function), and to which he assigned the symbol H (the same as Boltzmann's H). The measure, which Shannon described as quantifying “information, choice and uncertainty”, is a telegraphy signal transmission measurement, completely unrelated to thermodynamics (except in the sense of being a mathematical isomorphism). In effect, Shannon, for no discernible reason (except possibly theory promotion), convoluted two unrelated subjects, i.e. the general theory of heat engines and the general theory of information transmission and reception. The backlash was such that by 1956 Shannon had to put forward the following clarifying defense of his action: [4]

“Workers in other fields should realize that the basic results of the subject [communication channels] are aimed in a very specific direction, a direction that is not necessarily relevant to such fields as psychology, economics, and other social sciences.”

Very few people in the decades to follow, however, ever received this memo, the result being that many people in modern times actually believe that Shannon's information theory is rooted in thermodynamics, in the sense that his so-called "Shannon entropy" (a measure of the highs and lows in information transmissions) is exactly the same as "Boltzmann entropy" (a particle velocity/position interpretation of "Clausius entropy"), which is incorrect.

Genius rankings
As a boy, he rigged a telegraph machine to the barbed-wire fence that ran along his country road, so that he could talk to his friend ½-mile away via Morse code; his 1937 (age 22) master’s thesis, which single-handedly founded the digital age, via applying Boolean algebra (0,1 number logic) to telecommunication signals and computer circuitry (switches and relays), has been described as "possibly the most important, and also the most famous, master's thesis of the century" (Howard Gardner); quote: “apparently, Shannon is a genius” (Vannevar Bush, 1939) [16]; in 1948, he founded information theory; he described John Neumann as the ‘smartest person he had ever met’, above Einstein (IQ=220), whom he occasionally bumped into at the Institute for Advanced Study, Princeton, where the three of them worked from 1940 to 1941; quote: “I think [Norbert Wiener] (see: Ex-Prodigy: My Childhood and Youth, 1953) had a great brilliance. I’m not putting down his great mind. I think he really did have a tremendous IQ and a tremendous grasp of many things” (Shannon, 1982). [17] quote: "There were many at Bell Labs and MIT who compared Shannon's insight to Einstein's. Others found the comparison unfair—unfair to Shannon"; quote: "Shannon's genius was like Leonardo's (IQ=205), skipping restlessly from one project to another, leaving few unfinished"; friends rated John L. Kelly, Jr., a gun collector who predicted football results by computer, as the “second smartest man at Bell Labs—next to Shannon himself.” [16]

The Bandwagon
See main: Shannon bandwagon
In 1956, Shannon penned a now-infamous editorial, “The Bandwagon”, wherein he makes a plea for everyone to stop using his new so-called information entropy theory outside of communications engineering proper. [10] Bandwagon-styled reaction articles followed in the aftermath, including one from Norbert Wiener entitled "What is Information Theory?". [13] In 1958, American electrical engineer Peter Elias (1923-2001) published "Two Famous Papers", a parody of Shannon's bandwagon editorial. [12] See the 2001 MIT Project History article “Information Theory and the Digital Age” for an inside look at the early bandwagon years. [11]

Thermodynamics
A huge misconception in the minds of many modern scientists is the view that Shannon's information theory is based on or derived from thermodynamics. This could not be farther from the truth. The truth behind the myth is that sometime in the 1940s, while Shannon was developing his theory, he happened to visit Hungarian chemical engineer John Neumann, a close friend of Leo Szilard, who had previously published a 1929 article on Maxwell's demon containing a calculation of the entropy generated during the information collection and storage processing of the demon's mind. Owing to this influence, Neumann suggested, humorously, that Shannon call his voltage and current type information measure (highs and lows in telephone wires) by the name entropy. On this suggestion, in his 1948 paper, Shannon infamously states, in a subtle yet very influential way that seemed to connect his work to thermodynamics, that:

"The form of H will be recognized as that of entropy as defined in statistical mechanics.”

Then, after explaining his formulation of H as a function of a set of probabilities involved in the transmission of information (line currents), he concludes, in reference to Austrian physicist Ludwig Boltzmann's famous 1872 paper Further Studies on the Thermal Equilibrium of Gas Molecules, that "H is then ... the H of Boltzmann's famous H theorem." [1] Boltzmann's H theorem, of course, was a statistical formulation of Clausius entropy (1865), for an ideal gas system.
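For reference, the formal resemblance at issue can be sketched side by side in the standard textbook forms (a sketch, not either author's original notation):

H_{Boltzmann}(t) = \int f(\vec{v},t) \ln f(\vec{v},t) \, d^3v

H_{Shannon} = -K \sum_{i=1}^n p_i \log p_i

where f is the molecular velocity distribution of an ideal gas (the gas entropy being proportional to −H, up to additive constants) and p_1, ..., p_n are the symbol probabilities of a message source; both expressions are averages of a logarithm over a distribution, which is the full extent of the "mathematical isomorphism" noted above.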
In short, what Shannon did, for whatever motive, was convolute telegraph theory together with heat engine theory as though they were the same subject matter, via a very crude, unsubstantiated (verbalized) derivation. In 1961, to cite one opinion on this matter, French mathematician Benoit Mandelbrot commented that: [4]

“Everyone knows that Shannon’s derivation is in error.”

One of the greatest misconceptions in modern science, perpetuated most likely by the fragmentation and relative isolation of the various modern branches of science, is the view that Shannon was a great thermodynamicist. This is a hugely erroneous myth. Shannon was an electrical engineer who studied the transmission and coding of voltages and electrical currents in telephone wires, no more, no less. Shannon never had any formal education or training in thermodynamics and his work had nothing to do with thermodynamics. Yet, for reasons which require further discussion, in online polls asking who is the greatest thermodynamicist of all time, Shannon’s name pops up, which is a puzzling phenomenon. [6]

Questionable applications
After hearing of Mandelbrot’s 1961 criticism, Shannon continued to express “misgivings about using his definition of entropy for applications beyond communication channels.” [3] In any event, Shannon's warnings didn't help, and in the decades to follow his euphemistic, verbalized derivation of his communication theory H function in relation to Boltzmann’s statistical thermodynamic H function led hundreds of individuals (examples: James Coleman (1964), Stephen Coleman (1975), Orrin Klapp (1978), Jay Teachman (1980), Kenneth Bailey (1990), etc.) to write theoretical papers and books on connections between communication, information, entropy, and thermodynamics, all of which, of course, are unsubstantiated. Shannon's formulation has come to be known as information entropy, Shannon entropy, and information theoretic entropy, among other names.

Tribus
To cite one dominant example of the influence of Shannon's thermodynamics-borrowed terminology, in 1948 American engineer Myron Tribus was asked during his examination for his doctoral degree, at UCLA, to explain the connection between Shannon entropy and Clausius entropy.
In retrospect, in 1998, Tribus commented that he went on to spend ten years on this issue: [3]

“Neither I nor my committee knew the answer. I was not at all satisfied with the answer I gave. That was in 1948 and I continued to fret about it for the next ten years. I read everything I could find that promised to explain the connection between the entropy of Clausius and the entropy of Shannon. I got nowhere. I felt in my bones there had to be a connection; I couldn’t see it.”

Information entropy
See main: Information entropy, Shannon entropy, etc.
Shannon’s revolutionary idea of digital representation was to sample the information source at an appropriate rate, and convert the samples to a bit stream. He characterized the source by a single number, the entropy, adapting a term from statistical mechanics, to quantify the information content of the source. For English language text, Shannon viewed entropy as a statistical parameter that measured how much information is produced on the average by each letter. [2] The equation for H that Shannon defines as entropy is:

H = -K \sum_{i=1}^{n} p_i \log p_i

in which H is a measure of "information, choice and uncertainty", the constant K is a variable that "merely amounts to a unit of measurement", and p1, ..., pn are a "set of probabilities." Through this equation, dozens of writers have, unknowingly, jumped from verbal descriptions of all varieties of information, e.g. genetic, computer, knowledge, etc., to mixtures of Boltzmann-Clausius statements of the second law of thermodynamics, in attempts to explain innumerable aspects of life and evolution, among other subjects.
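As a minimal illustrative sketch (the sample string below is a hypothetical example, not Shannon's data), the formula can be evaluated directly from symbol counts, here with K = 1 and log base 2 so that H comes out in bits per symbol:

from collections import Counter
from math import log2

def shannon_entropy(text, K=1.0):
    # H = -K * sum(p_i * log2(p_i)), with p_i the relative frequency of each symbol
    counts = Counter(text)
    total = sum(counts.values())
    return -K * sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical sample; spaces count as symbols too.
sample = "a mathematical theory of communication"
print(round(shannon_entropy(sample), 2))  # about 3.69 bits per symbol, versus log2(27) ≈ 4.75 if all 27 symbols (letters plus space) were equally likely

Shannon regarded such per-letter statistics as only a first approximation; accounting for inter-letter correlations (digrams, words) lowers the figure further.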

Difficulties on theory
The essential difficulty of Shannon’s idea of entropy is that its terminology is a verbal-crossover theory, culled from statistical thermodynamics, but having little connection, if any at all, to thermodynamics. This has led countless writers, having little training in thermodynamics, to proffer up some of the most illogical, backwardly reasoned papers ever written, however novel the intentions. These papers, from a thermodynamic point of view, are almost a strain on the mind to read. The 2007 views of German physicist Ingo Müller summarize the matter to a tee: [3]

“No doubt Shannon and von Neumann thought that this was a funny joke, but it is not, it merely exposes Shannon and von Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost. And if von Neumann had a problem with entropy, he had no right to compound that problem for others, students and teachers alike, by suggesting that entropy had anything to do with information.”

Müller clarifies that “for level-headed physicists, entropy (or order and disorder) is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature, heat, and work.”

OMNI interview
In a 1987 interview by Anthony Liversidge of Omni magazine, Shannon was asked several decisive questions; a few telling Q&A responses are shown below: [8]

OMNI: Before you wrote your classic paper on The Mathematical Theory of Communication, Norbert Wiener went around the office at Bell Labs announcing ‘information is entropy’. Did that remark provoke you in any way to come up with information theory?

Shannon: No, I hadn’t even heard of that remark when I started my work. I don’t think Wiener had much to do with information theory. He wasn’t a big influence on my ideas there, though I once took a course from him.

OMNI: Do you agree with Norbert Wiener, who is reported to have denied any basic distinction between life and non-life, man and machine?

Shannon: That’s a loaded question! Let me say this. I am an atheist to begin with. I believe in evolution theory and that we are basically machines, but a very complex type.

OMNI: Does your theory give a hint of how life might have evolved, seemingly in the face of the second law of thermodynamics, which says that order should slowly disintegrate?

Shannon: The evolution of the universe is certainly a puzzling thing to me as well as to everybody else. It’s fantastic we’ve ever come to the level of organization we have, starting from a big bang. Nonetheless, I believe in the big bang.


Education
Shannon completed a BS in electrical engineering and a BS in mathematics at the University of Michigan in 1936. He completed an MS in 1937, with the thesis "A Symbolic Analysis of Relay and Switching Circuits", wherein he showed how the mathematics of British mathematician George Boole, which deals with such concepts as “if X or Y happens but not Z, then Q results”, could represent the workings of switches and relays in electronic circuits. Circuit design could thus be treated mathematically, and Shannon’s master’s thesis, written at the age of 22, came to be heralded as “possibly the most important master’s thesis in the century.” [9]
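As a toy sketch of this mapping (an illustrative example, not drawn from the thesis itself): switches wired in series conduct only when both are closed, which behaves as Boolean AND, while switches wired in parallel conduct when either is closed, which behaves as Boolean OR, so a relay network such as “X or Y but not Z” can be written and checked as a Boolean expression:

# Series connection of switches = Boolean AND; parallel connection = Boolean OR.
def series(a, b):
    return a and b

def parallel(a, b):
    return a or b

# Circuit for the phrase "if X or Y happens but not Z, then Q results":
def q(x, y, z):
    return series(parallel(x, y), not z)

# Truth table: the circuit conducts exactly when the Boolean expression is true.
for x in (False, True):
    for y in (False, True):
        for z in (False, True):
            print(f"X={x!s:5} Y={y!s:5} Z={z!s:5} -> Q={q(x, y, z)}")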

Shannon then completed his PhD in 1940, under the direction of Vannevar Bush, with the dissertation "An Algebra for Theoretical Genetics", at the Massachusetts Institute of Technology, in which he tried to do for genetics what he had done for electronic circuits, but, according to physical economics historian Philip Mirowski, the attempt was premature. [15] It may be that this is from where Lila Gatlin gleaned her misaligned inspiration to apply information theory to DNA transcription and gene expression, but this is only a conjecture.

In 1940, Shannon became a National Research Fellow at the Institute for Advanced Study in Princeton, New Jersey, staying there for one year, during which time the famous Neumann-Shannon anecdote occurred. [14]

In 1941, Shannon joined Bell Labs, remaining affiliated with this group, university appointments aside, until 1972.

Quotes | On

The following are quotes on Shannon:

“Apparently, Shannon is a genius.”
Vannevar Bush (1939), written comment

“Shannon is the most important genius you’ve never heard of, a man whose intellect was on par with Albert Einstein and Isaac Newton.”
— Jimmy Soni (2017), “10,000 Hours With Claude Shannon” (Ѻ)

Quotes | By
The following are quotes by Shannon:

“Neumann is the smartest person I have ever met.”
— Claude Shannon (c.1941)

“Although this wave of popularity is certainly pleasant and exciting for those of us working in the [information science] field, it carries at the same time an element of ‘danger’. Information theory is no panacea to solve nature’s secrets. It is too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few ‘exciting words’ like information, entropy, redundancy, do NOT solve all our problems. Workers in other fields should realize that the basic results of the subject [information science] are aimed in a very specific direction, a direction that is NOT necessarily relevant to such fields as psychology, economics, and other social sciences.”
— Claude Shannon (1956), “The Bandwagon” (Ѻ), Mar

“I think Wiener had a great brilliance. I’m not putting down his great mind. I think he really did have a tremendous IQ and a tremendous grasp of many things.”
— Claude Shannon (1982)

References
1. Shannon, Claude E. (1948). "A Mathematical Theory of Communication", Bell System Technical Journal, vol. 27, pp. 379-423, 623-656, July, October.
2. Claude Shannon 1916-2001 – Research.ATT.com.
3. Muller, Ingo. (2007). A History of Thermodynamics - the Doctrine of Energy and Entropy (ch 4: Entropy as S = k ln W, pgs: 123-126). New York: Springer.
4. Tribus, M. (1998). “A Tribute to Edwin T. Jaynes”. In Maximum Entropy and Bayesian Methods, Garching, Germany 1998: Proceedings of the 18th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis (pgs. 11-20) by Wolfgang von der Linde, Volker Dose, Rainer Fischer, and Roland Preuss. 1999. Springer.
5. (a) IEEE Transactions on Information Theory, December 1955.
(b) Hillman, Chris. (2001). “Entropy in the Humanities and Social Sciences.” Hamburg University.
6. Orzel, Chad. (2009). “Historical Physicist Smackdown: Thermodynamics Edition”, Science Blogs, Uncertainty Principles.
7. Avery, John (2003). Information Theory and Evolution (pg. 81). New Jersey: World Scientific.
8. (a) Shannon, Claude. (1987). “Claude Shannon: Interview: Father of the Electronic Information Age”, by Anthony Liversidge, Omni, August
(b) Liversidge, Anthony. (1987). “Profile of Claude Shannon”, reprinted in: Claude Elwood Shannon: Collected Papers (pgs. xix-xxxiv), N.J.A. Sloane and Aaron D. Wyner, editors. IEEE Press, 1993.
(c) Kahre, Jan. (2002). The Mathematical Theory of Information (pg. 218). Springer.
9. Horgan, J. (1992). “Claude Shannon” (abs), Spectrum, IEEE, 29(4): 72-75.
10. (a) Shannon, Claude. (1956). “The Bandwagon”, IRE Transactions: on Information Theory, 2(1):3, March.
(b) Mitra, Partha and Bokil, Hemant. (2007). Observed Brain Dynamics (§1.3.1: Reversible and Irreversible Dynamics; Entropy, pgs. 9-; Appendix A: The Bandwagon by C.E. Shannon, pgs. 343-44; Appendix B: Two Famous Papers by Peter Elias, pgs. 345-46). Oxford University Press.
11. Aftab, O., Cheung, P., Kim, A., Thakkar, S., and Yeddanapudi, N. (2001). “Information Theory and the Digital Age”, Project History, Massachusetts Institute of Technology.
12. (a) Elias, Peter. (1958). “Two Famous Papers” (pdf), IRE Transactions: on Information Theory, 4(3):99.
(b) Mitra, Partha, and Bokil, Hemant. (2008). Observed Brain Dynamics (Appendix A: The Bandwagon by C.E. Shannon, pg. 343; Appendix B: The Two Famous Papers by Peter Elias, pg. 345). Oxford University Press.
13. Wiener, Norbert. (1956). “What is Information Theory?”, IRE Transactions on Information Theory, 48, June.
14. Claude Shannon – History-Computer.com.
15. Mirowski, Philip. (2002). Machine Dreams: Economics Becomes a Cyborg Science (pg. 68). Cambridge University Press.
16. Poundstone, William. (2005). Fortune’s Formula: the Untold Story of the Scientific Betting System that Beat the Casinos and Wall Street (pgs. 15, 17, 148-49; Bush quote, pg. 21; Thorp, pgs. 38-40). Macmillan.
17. Price, Robert. (1982). “Interview: Claude E. Shannon”, Winchester, MA, Jul 28.

Further reading
● Shannon, Claude E. (1950). "Prediction and Entropy of Printed English", Bell Sys. Tech. J (3) p. 50-64.

External links
Claude Shannon – Wikipedia.
