Neumann-Shannon anecdote
A 2010 parody of the Neumann-Shannon anecdote, by Israeli physical chemist Arieh Ben-Naim, from his three-page chapter section “Snack: Who’s Your Daddy?”, in which he retells the story such that Shannon has a new baby and he and his wife are deciding which name to pick: ‘information’ or ‘uncertainty’? Neumann suggests they name their son after the name of Rudolf Clausius’ son, ‘entropy’, which Shannon decides to do, only to find, in the years that follow, that people continually confuse his son with Clausius’ son and also misuse and abuse the name; after which it is suggested to Shannon that he change his son’s name to Smomi, a short acronym for “Shannon’s Measure Of Missing Information”. [2]
In anecdotes, the Neumann-Shannon anecdote, or "Shannon-Neumann anecdote", is a famous conversation, or "widely circulated story" (Mirowski, 2002), said to have occurred between fall 1940 and spring 1941 between American electrical engineer and mathematician Claude Shannon and Hungarian-born American chemical engineer and mathematician John Neumann, during Shannon’s postdoctoral research fellowship year at the Institute for Advanced Study, Princeton, New Jersey, where Neumann was one of the main faculty members. At the time, Shannon was wavering on whether to call his new logarithmic statistical formulation of data in signal transmission by the name ‘information’ (of the Hartley type) or ‘uncertainty’ (of the Heisenberg type). Neumann suggested that Shannon use neither name, but rather the name ‘entropy’ from thermodynamics, because: (a) the statistical mechanics version of the entropy equation is mathematically isomorphic to Shannon’s formula, and (b) nobody really knows what entropy really is, so he would have the advantage in winning any arguments that might erupt. [1]
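The isomorphism referenced in point (a) can be made concrete by placing the two formulas side by side; the following is a minimal rendering, with Shannon’s arbitrary positive constant K and the Boltzmann constant k_B written out explicitly (the constants and summation notation are standard textbook forms, not wording from the anecdote itself):

$$ H = -K \sum_{i=1}^{n} p_i \log p_i \qquad \text{(Shannon's measure of information or uncertainty, 1948)} $$

$$ S = -k_B \sum_{i} p_i \ln p_i \qquad \text{(Gibbs entropy of statistical mechanics)} $$

Up to the choice of constant and logarithm base, the two expressions are identical in form, which is the formal identity Neumann is said to have pointed to.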

Date
The date of the infamous Neumann-Shannon conversation is predominantly attributed to the year 1940, and the conversation is said to have occurred at the Institute for Advanced Study, Princeton, New Jersey, where John Neumann was one of the main faculty members. [12]

To put the infamous conversation in chronological context, Time staff reporter Robert Slater, in his 1989 computer history book Portraits in Silicon, written following his travels to Silicon Valley where he interviewed the designers, entrepreneurs, hardware engineers, and software writers behind the modern computer, states specifically that Shannon completed his PhD in mathematics in the spring of 1940, with a dissertation entitled “An Algebra for Theoretical Genetics”, while at the same time completing an MS in electrical engineering. Then, in the summer of 1940, he worked under T.C. Fry, director of the mathematics department at Bell Labs, on switching circuits, resulting in the paper “The Synthesis of Two-Terminal Switching Circuits”, wherein he outlined a new design method that reduced the number of contacts needed to synthesize complex switching functions.

Then, in the fall of 1940, Shannon accepted a National Research Fellowship that allowed him to work under German mathematician Hermann Weyl at the Institute for Advanced Study, Princeton, New Jersey. Then, in the spring of 1941, he was back at Bell Laboratories. [11]

This would place the conversation between the fall of 1940 and the spring of 1941. It seems likely that the conversation, being something important on Shannon’s mind, would have occurred early upon his arrival at the Institute, which would place it in September to December of 1940.
Neumann-Shannon anecdote (2010)
A 2010 snapshot summary of the 1940 Neumann-Shannon anecdote, from the 2010 lecture notes turned book Biomedical Informatics of German computer scientist and cognitive theorist Andreas Holzinger. [15]

American information theory historians Jorge Schement and Brent Ruben, however, state that Shannon “spent a year (1939-1940) as a National Research Fellow at the Institute for Advanced Study, studying mathematics and Boolean algebra with Hermann Weyl.” [1] This 1939 date, however, seems to be incorrect, as most sources state that Shannon did not finish his PhD at MIT until 1940. [12] In agreement with this, Schement, in 2012, issued a correction on the "1939" date, stating that they may have jumped the gun in regard to this timeframe. [14]

Shannon used the word "entropy" once in his classified 1945 “A Mathematical Theory of Cryptography”. [16]

Overview
The Neumann-Shannon anecdote has been retold so many times that it has been classified by some as an urban legend of science.

In April 1961, the story of the incident became public knowledge when American engineer Myron Tribus was invited to give a seminar at MIT on a new way to derive thermodynamics based on information theory. The audience, according to Tribus, was a critical one, composed of students of American mechanical engineer Joseph Keenan, founder of the MIT school of thermodynamics, who “tried to rip it apart”; French mathematician Benoit Mandelbrot was also in the audience and quickly attacked the MaxEnt interpretation, saying: “Everyone knows that Shannon’s derivation is in error.”

It also happened to be the case that Shannon was in residence at MIT that week, so naturally enough Tribus went to see him. Shannon, according to Tribus, “was immediately able to dispel Mandelbrot’s criticism, but went on to lecture me on his misgivings about using his definition of entropy for applications beyond communication channels.” [3] During this meeting, Tribus queried Shannon as to his reason for choosing to call his information function by the name ‘entropy’, the details of which were first made public in his 1963 Helvetica Physica Acta article “Information Theory and Thermodynamics”, and in many symposiums and publications to follow. The most-cited version comes from Tribus’ 1971 article “Energy and Information”, co-written with American physicist Edward Charles McIrvine (1933-), wherein they state: [4]

“What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.’”

Likewise, Tribus, in his 1987 article “An Engineer Looks at Bayes”, recounts his discussion with Shannon on this question as follows: [5]

“The same function appears in statistical mechanics and, on the advice of John von Neumann, Claude Shannon called it ‘entropy’. I talked with Dr. Shannon once about this, asking him why he had called his function by a name that was already in use in another field. I said that it was bound to cause some confusion between the theory of information and thermodynamics. He said that von Neumann had told him: ‘No one really understands entropy. Therefore, if you know what you mean by it and you use it when you are in an argument, you will win every time.’”

Truncated variations of the above have been retold ever since.

In 1982, Shannon, in a recorded interview, commented, rather hazily, on this past event as follows:

Shannon:
Well, let me also throw into this pot, Szilard, the physicist. And von Neumann, and I’m trying to remember the story. Do you know the story I’m trying to remember?
Price:
Well, there are a couple of stories. There’s the one that Myron Tribus says that von Neumann gave you the word entropy, saying to use it because nobody, you’d win every time because nobody would understand what it was.
Shannon:
[laughs]
Price:
And furthermore, it fitted p*log(p) perfectly. But that, but then I’ve heard . . .
Shannon:
von Neumann told that to me?
Price:
That’s what you told Tribus that von Neumann told that to you.
Shannon:
[laughs – both talking at once]
Price:
Bell Labs too, that entropy could be used. That you already made that identification. And furthermore in your cryptography report in 1945, you actually point out, you say the word entropy exactly once in that report. Now this is 1945, and you liken it to Statistical Mechanics. And I don’t believe you were in contact with von Neumann in 1945, were you? So it doesn’t sound to me as though von Neumann told you entropy.
Shannon:
No, I don’t think he did.
Price:
This is what Tribus quoted.
Shannon:
Yeah, I think this conversation, it’s a very odd thing that this same story that you just told me was told to me at Norwich in England. A fellow —
Price:
About von Neumann, you mean?
Shannon:
Yeah, von Neumann and me, this conversation, this man, a physicist there, and I’ve forgotten his name, but he came and asked me whether von Neumann, just about the thing that you told me, that Tribus just told you, about this fellow. . .
Price:
That was Jaynes, I imagine the physicist might have been [Edwin] Jaynes.
Shannon:
Yes, I think it was, I think so. Do you know him?
Price:
Well, he’s published in the same book as Tribus, you see. This is a book called The Maximum Entropy Formalism. You’ve probably seen that book, but they have chapters in it, and Jaynes, the physicist —
Shannon:
Now, I’m not sure where I got that idea, but I think I, somebody had told me that. But anyway, I think I can, I’m quite sure that it didn’t happen between von Neumann and me.
Price:
Right. Well, I think that the fact that it’s in your 1945 cryptography report establishes that, well, you didn’t get it from von Neumann, that you had made the p*log(p) identification with entropy by some other means. But you hadn’t been —
Shannon:
Well, that’s an old thing anyway, you know.
Price:
You knew it from thermodynamics.
Shannon:
Oh, yes, from thermodynamics. That goes way back.
Price:
That was part of your regular undergraduate and graduate education of thermodynamics and the entropy?
Shannon:
Well, not in class, exactly, but I read a lot, you know.


Restated versions of the anecdote
The following are restated versions of the conversation, originally reported by Tribus (above), which vary depending on the source and point of view. Looking closely at the versions reported by Tribus himself, one sees him shifting from reporting that Neumann said the function was already used in "statistical thermodynamics" (1964) to "statistical mechanics" (1971) to "thermodynamics" (1983), which is somewhat humorous in itself:

Evolution of the Neumann-Shannon Conversation

The conversation between John Neumann and Claude Shannon occurred between fall 1940 and spring 1941 at the Institute for Advanced Study, Princeton, New Jersey. In April 1961, Myron Tribus visited Shannon at his office at MIT and questioned him about the reason behind his adoption of the "entropy" name. Tribus recounted the Shannon interview as follows:

"When Shannon discovered this function he was faced with the need to name it, for it occurred quite often in the theory of communication he was developing. He considered naming it "information" but felt that this word had unfortunate popular interpretations that would interfere with his intended uses of it in the new theory. He was inclined towards naming it "uncertainty" and discussed the matter with the late John Von Neumann. Von Neumann suggested that the function ought to be called "entropy" since it was already in use in some treatises on statistical thermodynamics (e.g. ref. 12). Von Neumann, Shannon reports, suggested that there were two good reasons for calling the function "entropy". "It is already in use under that name," he is reported to have said, "and besides, it will give you a great edge in debates because nobody really knows what entropy is anyway." Shannon called the function "entropy" and used it as a measure of "uncertainty," interchanging the two words in his writings without discrimination."
Myron Tribus (1963) [8]

“What’s in a name? In the case of Shannon’s measure the naming was not accidental. In 1961 one of us (Tribus) asked Shannon what he had thought about when he had finally confirmed his famous measure. Shannon replied: ‘My greatest concern was what to call it. I thought of calling it ‘information’, but the word was overly used, so I decided to call it ‘uncertainty’. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, ‘You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage.’”
Myron Tribus and Edward McIrvine (1971) [4]

“You should call it ‘entropy’ for two reasons: First, the function is already used in thermodynamics under that name; second, and more importantly, most people don’t know what entropy really is, and if you use the word ‘entropy’ in an argument you will win every time.”
Myron Tribus (1983), as quoted by Philip Mirowski (2002) [13]

At first, Shannon did not intend to use such a highly charged term for his information measure. He thought “uncertainty” would be a safe word. But he changed his mind after a discussion with John von Neumann, the mathematician whose name is stamped upon some of the most important theoretical work of the first half of the twentieth century. Von Neumann told Shannon to call his measure entropy, since “no one really knows what entropy is, so in a debate you will always have the advantage.”
Jeremy Campbell (1982) [9]

Shannon, the pioneer of information theory, was only persuaded to introduce the word 'entropy' into his discussion by the mathematician John von Neumann, who is reported to have told him: “it will give you a great edge in debates because nobody really knows what entropy is anyway.”
Peter Coveney and Roger Highfield (1992) [10]

“The theory was in excellent shape, except that he needed a good name for ‘missing information’. ‘Why don’t you call it entropy’, von Neumann suggested. ‘In the first place, a mathematical development very much like yours already exists in Boltzmann’s statistical mechanics, and in the second place, no one understands entropy very well, so in any discussion you will be in a position of advantage’.”
John Avery (2003) [7]

By 1990, Shannon’s Mathematical Theory of Communication had sold more than 51,000 copies (Stockanes, 1990), and the anecdote has since become something of an established legend, apocryphal or not. [1]

Discussion
There are some references, to note, in which it is claimed that Shannon said he did not get the concept of entropy from John Neumann, who had been working with entropy formulations since 1927, or from Alan Turing (Ellersick, 1984), with whom Shannon frequently met for lunch at Bell Labs in the 1940s, or from Norbert Wiener (Omni, 1987), who in 1946 was equating entropy with information, but rather that he derived his equation for the amount of information and found that it was identical in form to the formula that physicists use for the quantity known as entropy in thermodynamics (Bello, 1953). [6]

In his 1968 Encyclopedia Britannica article “Information Theory”, Shannon wrote the following telling statement: [20]

“The formula for the amount of information is identical in form with equations representing entropy in statistical mechanics, and suggests that there may be deep-lying connections between thermodynamics and information theory. Some scientists believe that a proper statement of the second law of thermodynamics requires a term related to information. These connections with physics, however, do not have to be considered in the engineering and other [fields?].”

Shannon, during his 1977 interview with graduate student and science historian Friedrich-Wilhelm Hagemeyer, also recorded on tape, commented that when he wrote his 1948 paper he was not aware of Leo Szilard’s 1929 work. He also told Hagemeyer that he had read Norbert Wiener's 1942 Yellow Peril report, which contained discussions of entropy and communication. [1]

This, however, conflicts to some extent with the fact that Shannon was a close associate of Warren Weaver, who was the first to alert Leon Brillouin to the so-called importance of Szilard’s paper, and who, in 1950, at the Rockefeller Institute, formally introduced Szilard and Brillouin. [19]

Shannon, during his 1982 oral history interview recorded on tape, conducted with communication engineer Robert Price and reported in the 1984 article “A Conversation with Claude E. Shannon”, said that he did not get the concept of entropy from either John Neumann or Alan Turing, but did so in an evasive way mixed with a seeming lack of memory. [16] [17] As Brazilian electrical engineer Erico Guizzo, who did his 2003 MS thesis on the origins of information theory, conducting many interviews himself and listening to the Hagemeyer interview tapes, put it, "Shannon seemed to always evade this kind of question" as to the origins of information theory. [18]

References
1. Schement, Jorge R. and Ruben, Brent D. (1993). Information Theory and Communication, Volume 4 (pgs. 43, 53). Transaction Publishers.
2. Ben-Naim, Arieh. (2012). Discover Entropy and the Second Law of Thermodynamics: a Playful Way of Discovering a Law of Nature (pg. 12). World Scientific.
3. Tribus, Myron. (1998). “A Tribute to Edwin T. Jaynes”. In Maximum Entropy and Bayesian Methods, Garching, Germany 1998: Proceedings of the 18th International Workshop on Maximum Entropy and Bayesian Methods of Statistical Analysis (pgs. 11-20) by Wolfgang von der Linde, Volker Dose, Rainer Fischer, and Roland Preuss. 1999. Springer.
4. (a) Tribus, Myron and McIrvine, Edward C. (1971). “Energy and Information”, Scientific American, 225: 179-88.
(b) McIrvine, Edward C. (1959). “A Quantum Theory of Thermal Transport Phenomena in Metals” (abs), PhD Dissertation, Cornell University, Feb.
(c) Ben-Naim, Arieh. (2010). Discover Entropy and the Second Law of Thermodynamics: a Playful Way of Discovering a Law of Nature (pg. 12). World Scientific.
5. Tribus, Myron. (1987). “An Engineer Looks at Bayes”, Seventh Annual Workshop: Maximum Entropy and Bayesian Methods, Seattle University August, in: Foundations (editor: Gary J. Erickson) (pgs. 31-32, etc.), Springer, 1988.
6. (a) Shannon, Claude. (1987). “Claude Shannon: Interview: Father of the Electronic Information Age”, by Anthony Liversidge, Omni, August
(b) Liversidge, Anthony. (1987). “Profile of Claude Shannon”, reprinted in: Claude Elwood Shannon: Collected Papers (pgs. xix-xxxiv), N.J.A. Sloane and Aaron D. Wyner, editors. IEEE Press, 1993.
(c) Kahre, Jan. (2002). The Mathematical Theory of Information (pg. 218). Springer.
(d) Schement, Jorge R. and Ruben, Brent D. (1993). Information Theory and Communication, Volume 4 (pg. 43). Transaction Publishers.
(e) Bello, Francis. (1953). “The Information Theory”, Fortune, 48(6):136-58, Dec.
7. Avery, John (2003). Information Theory and Evolution (pg. 81). World Scientific.
8. Tribus, Myron. (1963). "Information theory and thermodynamics", in: Heat Transfer, Thermodynamics and Education: Boelter Anniversary Volume (editors: Harold Johnson and Llewellyn Boelter) (pg. 348-68; quote, pg. 354). New York: McGraw-Hill, 1964.
9. Campbell, Jeremy. (1982). Grammatical Man: Information, Entropy, Language, and Life (pg. 22). Simon and Schuster.
10. Coveney, Peter V. and Highfield, Roger. (1992). The Arrow of Time: A Voyage Through Science to Solve Time’s Greatest Mystery (pgs. 178, 253). Fawcett Columbine.
11. Slater, Robert. (1989). Portraits in Silicon (pg. 36). MIT Press.
12. (a) Claude Shannon – History-Computer.com.
(b) Mirowski, Philip. (2002). Machine Dreams: Economics Becomes a Cyborg Science (pg. 68). Cambridge University Press.
(c) Day, Lance and McNeil, Ian. (1998). Biographical Dictionary of the History of Technology (pg. 1097). Taylor & Francis.
13. (a) Tribus, Myron. (1983). “Thirty Years of Information Theory”, in: The Study of Information: Interdisciplinary Messages (editors: Fritz Machlup and Una Mansfield) (pg. 476). Wiley.
(b) Mirowski, Philip. (2002). Machine Dreams: Economics Becomes a Cyborg Science (pg. 68). Cambridge University Press.
14. Email communication with Libb Thims (20 Nov 2012).
15. Holzinger, Andreas. (2010). Biomedical Informatics: Lecture Notes to LV 444.152 (pg. 65). Books on Demand.
16. (a) Price, Robert. (1982). “Interview: Claude E. Shannon”, Winchester, MA, Jul 28.
(b) Shannon, Claude. (1945). “A Mathematical Theory of Cryptography”, Memorandum MM 45-110-02, Sep 1, Bell Laboratories; declassified and published in 1949 in the Bell System Technical Journal (BSTJ) as the “Communication theory of secrecy systems” (though this seems to be an expanded second edition).
17. (a) Price, Robert. (1982). “A Conversation with Claude E. Shannon” (conducted: 28 July) (interview by Robert Price; edited by Fred Ellersick), in: IEEE Communications Magazine, 22(5): 123-26, 1984; reprinted as: “A Conversation with Claude Shannon: One Man’s Approach to Problem Solving”, in Cryptologia (abs), 9(2):167-75, 1985.
(b) Rogers, Everett M. (1994). A History of Communication Study (pg. 422). The Free Press.
(c) Schement, Jorge R. and Ruben, Brent D. (1993). Information Theory and Communication, Volume 4 (pg. 43-44). Transaction Publishers.
18. (a) Guizzo, Erico M. (2003). “The Essential Message: Claude Shannon and the Making of Information Theory” (pg. 44). MS thesis, University of Sao Paulo, Brazil.
(b) Hagemeyer, Friedrich-Wilhelm. (1977). “Interview: Claude Shannon”, in: PhD dissertation “Die Entstehung von Informationskonzepten in der Nachrichtentechnik”, Free University of Berlin, Germany.
(c) Erico Guizzo (2003) states that he has digital MP3 files, mailed to him on a CD by Hermann Rotermund, who transformed Hagemeyer’s tape-recorded analog interviews into MP3, and that these were used in the writing of his MS thesis.
19. (a) Lanouette, William. (1992). Genius in the Shadows: a Biography of Leo Szilard: the Man Behind the Bomb (pg. 64). C. Scribner’s Sons.
(b) Mirowski, Philip. (2002). Machine Dreams: Economics Becomes a Cyborg Science (pg. 50). Cambridge University Press.
20. (a) Shannon, Claude E. (1968). “Information Theory”, Encyclopedia Britannica, 14th edition; in: Claude Elwood Shannon: Collected Papers (§Article on Information Theory for Encyclopedia Britannica, pgs. 212-20). IEEE Press, 1993.
(b) Guizzo, Erico M. (2003). “The Essential Message: Claude Shannon and the Making of Information Theory” (pg. 47). MS thesis, University of Sao Paulo, Brazil.

Further reading
● Dutta, Mahadev. (1968). “A Hundred Years of Entropy” (abs), Physics Today, Jan.
● Skagerstam, Bo-Sture K. (1975). “On the Notions of Entropy and Information”, Journal of Statistical Physics, 12(6): 449-62.
● Thims, Libb. (2012). “Thermodynamics ≠ Information Theory: Science’s Greatest Sokal Affair” (url), Journal of Human Thermodynamics, 8(1):1-##.
