Schmitz human entropy diagram
Danish chemist John Schmitz's 2007 so-called "relative entropy of a human", or “human entropy”, diagram, which conceptualizes a person's entropy as being lowest (low entropy) during the last decades of existence prior to death (dereaction), e.g. as an adult aged 50-70, after which, some 50 years later, following decomposition, the person's so-called afterlife entropy is as high (high entropy) as it was before his or her birth (reaction synthesis). [9]
In hmolscience, human entropy is the value of entropy associated with an individual human molecule (person), in a given state, or entropy of a system of human molecules (social configuration or social system) in a given state.

Entropy | Free energy of bodies
In 1866, German physicist Rudolf Clausius, in Schlomilch’s Zeitschrift fur Mathematik und Physik, published a "note" entitled “On the Determination of the Energy and Entropy of a Body”, later appended to chapter nine of the first edition of his The Mechanical Theory of Heat, wherein, building on the work of German physicist Gustav Kirchhoff, who had previously done research on the calculation of the energy of bodies, he set forth a methodology for calculating and measuring both the energy U and entropy S of bodies. This calculation methodology is theoretically applicable to "any" body in the universe conceptualized as a system, e.g. a volume of gas, an iron bar, a human, a society, a black hole, or the universe as a whole; though at this stage the point about human applications remained implicit. [18]

In 1882, Hermann Helmholtz, in his "On the Thermodynamics of Chemical Processes", building on the earlier work of Willard Gibbs (1876), disproved the now defunct thermal theory of affinity, showing that what matters in determining the thermodynamics of formation of a chemical or body produced via chemical synthesis, i.e. its free energy of formation (see: standard Gibbs free energy of formation), is not the entropy of the body alone, but the calculation of its "free energy", either Gibbs free energy (isothermal-isobaric) or Helmholtz free energy (isothermal-isochoric), depending on reaction conditions, i.e. the measure of the work of the forces of the chemical affinities of the reactants involved in producing products.

In 1905, German chemist Fritz Haber, in his Thermodynamics of Technical Gas Reactions, presented the first systematic study of all the thermodynamic data necessary for the calculation of the free energy, the Helmholtz free energy (U – TS) in particular, of chemical substances in a group of important reactions. [19]

In 1914, American physical chemist Gilbert Lewis and his assistant Merle Randall, building on the methodology introduced by Haber, published the first so-called “table of free energies”, giving free energies of formation values for: oxygen, hydrogen, and a few oxides of hydrogen. [20] This formed the basis for their expanded follow-up 1923 “Table of Standard Free Energies of Formation at 25 °C”, giving free energies of formation for 28 cations and a few metallic compounds and 111 non-metallic compounds and anions. [21]

In 1957, English electrical engineer and physicist Keith Burton, in Krebs and Kornberg's Energy Transformations in Living Matter, produced the first thermodynamic table of free energies for biochemical species, containing about 100 species. [22]
Human free energy | Free energy of formation
Left: American physical chemist Martin Goldstein, in his 1993 section "Entropy of a Mouse", argues that to determine the free energy of formation of a mouse, we need to ask: [11]

“What net energy and entropy changes would have been if simple chemical substances, present when the earth was young, were converted into [the mouse]. To answer this question, we must determine the energies and entropies of everything in the initial state and final state.”

This very same logic, by extrapolation, can be applied to humans in the calculation of the standard human free energy of formation. Right: an artistic rendition of the relationship between the Gibbs free energy G and the formation, “creation”, or synthesis of a human (human molecule) from standard state atoms and molecules of the periodic table and earlier earth conditions; from American physicist Daniel Schroeder’s 2000 Thermal Physics textbook, wherein he comments: [23]

“To create a [human] out of nothing and place it on the table, the magician need not summon up the entire enthalpy, H = U + PV. Some energy, equal to TS, can flow in spontaneously as heat; the magician must provide only the difference, G = H – TS, as work.”

The original text and depiction, to note, showed a "rabbit", but the same principles apply.
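Schroeder's accounting can be illustrated with a short numerical sketch; all values below (H, S, T) are hypothetical placeholders, chosen only to show how G = H – TS splits the enthalpy into heat that flows in spontaneously (TS) and work the magician must supply (G).

```python
# Hypothetical illustration of Schroeder's G = H - TS bookkeeping.
# H, S, and T values are placeholders, not measured data.

T = 300.0    # ambient temperature, K
H = 100.0e6  # hypothetical enthalpy of formation, J
S = 150.0e3  # hypothetical entropy, J/K

heat_in = T * S            # energy that can flow in spontaneously as heat, J
work_needed = H - heat_in  # Gibbs free energy G = H - TS, J

print(f"TS (spontaneous heat in): {heat_in/1e6:.1f} MJ")
print(f"G = H - TS (work needed): {work_needed/1e6:.1f} MJ")
```

The point of the split is simply that the work requirement G is smaller than the full enthalpy H by the TS term.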

Bridgman paradox
See main: Bridgman paradox
In 1946, American physicist Percy Bridgman, during the famous 1946 Harvard "what is life in terms of physics and chemistry?" debate, pointed out the paradox that while a so-called living thing, i.e. a human defined as a powered CHNOPS+ molecule, has an entropy, as does any body in the universe, there is, apparently, no way to calculate this entropy, since, according to standard entropy-calculation methods (e.g. reaction calorimetry), one would have to either synthesize (create) or destroy (analyze) the organism in a reversible way. Bridgman commented that he saw a fundamental difficulty in the possibility of applying the laws of thermodynamics to any system containing living organisms (chnopsological organisms). French-born American physicist Leon Brillouin, in his “Life, Thermodynamics, and Cybernetics” (1949), summarized the “Paradox of Bridgman”, as he referred to it, stating Bridgman's view as follows: [1]

“How can we compute or even evaluate the entropy of a living being? In order to compute the entropy of a system, it is necessary to be able to create or to destroy it in a reversible way. We can think of no reversible process by which a living organism can be created or killed: both birth and death are irreversible processes. There is absolutely no way to define the change of entropy that takes place in an organism at the moment of death.”

Bridgman’s view of this seeming paradox can also be compared to American physical chemist Martin Goldstein's 1993 chapter subsection on the entropy of a mouse, which led into modern human free energy theories of human synthesis. [2]

Human affinities | Entropies
In circa 1808, Johann Goethe worked out in his mind, and likely on paper (although, contrary to common practice, he destroyed all his notes), so-called human affinity tables (see: Goethe's affinity table), showing reaction affinity differences between the characters (reactants) in his 1809 physical chemistry based novella Elective Affinities. These tables are the forerunner to "human free energy tables" (a future subject), which would tabulate "human free energies of formation" per person per state, e.g. the free energy of formation of Thomas Jefferson in 1802, or per bound state of persons, e.g. a married couple (i.e. dihumanide molecule) during their honeymoon.

In 1914, American chemical engineer William Fairburn, in his Human Chemistry, discussed the verbal idea that an individual person might be associated with a value of relative "energy" but also "entropy", and therein applied human chemical theory to the effect that workers in a factory were types of chemicals requiring efficient and intelligent handling by the foremen. [1]

In 1931, psychologists Siegfried Bernfeld and Sergei Feitelberg, in their “The Principle of Entropy and the Death Instinct”, presented the results of a study in which they attempted to measure a paradoxical pulsation of entropy within a living organism, specifically in the nervous system of a man. [16] By comparing the brain temperature to the rectal temperature of a man, they sought to acquire evidence of paradoxical variations, i.e. variations not conforming to the principle of entropy as it functions in physics for inanimate systems. [17]

In the late 1980s, Japanese systems engineer Ichiro Aoki began to make theoretical estimates of the entropy production in plant leaves and white-tailed deer (1987), during the day and at night; eventually applying these methods to humans, physiologically, into the 1990s. [14] The end result of Aoki’s work (2012), according to his conclusion, is that “entropy itself cannot be measured and calculated for biological systems, even for very small systems”, rather only “process variables, entropy flow, and entropy production can be quantified by the use of energetic data and physical methods.” [15]

In 1995, mining engineer Raj Singhal defined "human entropy" as the effect of individual variations in the efficiency of work of individuals and managers on the system. [6]

In 2002, American physicist Jack Hokikian defined the concept of the entropy of a human as follows: [10]

“Human beings can be classified into low-entropic and high-entropic people.”

This view, to note, although in the right direction, is very elementary. Measuring the entropy of a living structure, such as a mouse or a human, as American chemist Martin Goldstein explains, encounters numerous difficulties, but would, in principle, be obtained in the same manner as the entropies of simple chemical species are obtained via laboratory experiments. [11]

In a 2004 article “Entropy and Information of Human Organisms”, Hungarian astrophysicist Attila Grandpierre claims to have been the first person to determine the entropy content of a human being. [12] Likewise, in the 2007 article “Thermodynamic Measure for Nonequilibrium Processes”, Grandpierre, in association with Hungarian physicist Katalin Martinas, estimated the entropy of a 70-kg human to be 202 kJ/K and from this value estimated the extropy of a human to be 2.31 MJ/K. The calculation, although a good first attempt, is nearly baseless in that its value is ascertained using entropy estimates of things such as glucose and water. [13] They even attempt a calculation of human enthalpy, using data such as the heat of combustion of fat, and use these estimates of S and H to calculate a human Gibbs free energy G, using the formula G = H – TS (see: human free energy). These types of calculations are far off the mark in that the Gibbs free energy of a human molecule is the summation of the Gibbs free energies of the component reactions involved in the synthesis of human beings over evolutionary time periods, starting from elementary components on an extent-of-reaction timeline approaching millions or billions of years.
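For orientation, a G = H – TS estimate in the Martinas-Grandpierre style can be sketched numerically; the entropy value (202 kJ/K) is their published figure, while the enthalpy and temperature below are hypothetical placeholders, since their H estimate is not reproduced here.

```python
# Sketch of a G = H - TS estimate in the Martinas-Grandpierre style.
# S is their published 70-kg human figure; H and T are hypothetical placeholders.

T = 310.0    # approximate body temperature, K
S = 202.0e3  # entropy estimate for a 70-kg human, J/K (Martinas and Grandpierre)
H = 70.0e6   # hypothetical enthalpy estimate, J (placeholder, not their value)

TS = T * S   # entropy term, J
G = H - TS   # Gibbs free energy estimate, J

print(f"TS = {TS/1e6:.2f} MJ; G = H - TS = {G/1e6:.2f} MJ")
```

Whatever placeholder H is chosen, the arithmetic makes plain how strongly the result hinges on the underlying S and H estimates, which is precisely the criticism raised above.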
Human entropy table (Thims)
American electrochemical engineer Libb Thims's 2007 tabulation of the entropy components of a human, attributing the measure largely to neurological attributes. [2]

In 2007, American electrochemical engineer Libb Thims outlined the basic definition of the human chemical bond, i.e. the electromagnetic attachments between people, as being comprised of individual measures of enthalpy and entropy, according to which he defined the entropy of an average human, considering entropy as an ordering magnitude parameter of a human chemical reaction, formulaically, as follows: [2]

S = SP + SO + SI + SS + SN

where SP is the entropy associated with the personality (social graces + character + dependability), SO the entropy associated with the occupation (possessions + money), SI the entropy associated with the intelligence (information + education + knowledge), SS the entropy associated with status (prestige), and SN the entropy associated with the inner nature of a person (values + ambition).
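Formulaically this is a plain additive model; a minimal sketch, with made-up component values purely for illustration, is:

```python
# Additive human-entropy model per Thims (2007): S = SP + SO + SI + SS + SN.
# Component values are invented for illustration only.

components = {
    "SP": 1.0,  # personality: social graces + character + dependability
    "SO": 2.0,  # occupation: possessions + money
    "SI": 3.0,  # intelligence: information + education + knowledge
    "SS": 4.0,  # status: prestige
    "SN": 5.0,  # inner nature: values + ambition
}

S = sum(components.values())  # total human entropy in this model
print(f"S = {S}")
```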

In the 2007 book The Second Law of Life, Danish chemist John Schmitz estimated the so-called "relative entropy" of a human body over the course of its lifespan to decrease with age, reaching a minimum in the final decades before death (reaction end), after which, following decomposition, it returns to its high pre-birth value, as shown above. [9]

Literature
In literature thermodynamics, the term “human entropy” is often associated, in an unsubstantiated manner, with a gradual but cosmic dissolution of life. [7] In 1932, English writer Aldous Huxley explicitly used the term "human entropy" in relation to the energy of expansion released due to sexual restraint. [5]

Into the 1950s, literature definitions of entropy likely began to stem from Austrian physicist Erwin Schrödinger’s 1944 conception of “positive entropy” and death, in connection with information theory, such as found in the work of Thomas Pynchon. In other cases, however, different definitions can be found.


Human computer systems
In computer science, the conception of a human entropy related to the interactions involved in a computer-human system was introduced in 1992 by Polish-born American industrial engineer Waldemar Karwowski, in what seems to be based on a type of fuzzy entropy logic. [3] Strangely, Karwowski uses the symbol “E” for entropy and "S" for system. In any event, according to Karwowski, via a short argument, the “system entropy” E(S), e.g. of a person in their office interacting with a computer, is bounded below by the difference between the human entropy E(H) and the entropy of a system regulator E(R), which he defines as “ergonomic intervention efforts”, or in equation form: [4]

E(S) ≥ E(H) – E(R)

This view, to note, seems to have little connection to actual thermodynamics.
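As a minimal sketch, Karwowski's inequality can be rendered as a lower-bound check; the numeric values below are hypothetical illustrations, not measured fuzzy-entropy data:

```python
# Karwowski-style bound: E(S) >= E(H) - E(R).
# All numeric values here are hypothetical illustrations.

def system_entropy_lower_bound(e_human: float, e_regulator: float) -> float:
    """Lower bound on the system entropy E(S), per E(S) >= E(H) - E(R)."""
    return e_human - e_regulator

E_H = 0.8  # hypothetical human entropy E(H)
E_R = 0.3  # hypothetical regulator (ergonomic intervention) entropy E(R)

bound = system_entropy_lower_bound(E_H, E_R)
print(f"E(S) >= {bound:.2f}")
```

The reading is simply that greater ergonomic intervention E(R) lowers the floor on the residual system entropy E(S).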

See also
Shannon Entropy and Thermodynamic Love – my thoughts (2009) – Hmolpedia threads.

References
1. Fairburn, William Armstrong. (1914). Human Chemistry, (entropy, pgs. 34-35). The Nation Valley Press, Inc.
2. Thims, Libb. (2007). Human Chemistry (Volume One) (entropy components of the human chemical bond, pgs. 270-72). Morrisville, NC: LuLu.
3. (a) Karwowski, Waldemar. (1992). “The human world of fuzziness, human entropy, and the need for the general fuzzy systems theory.” Journal of Japan Society for Fuzzy Theory and Systems, 4, 591-609.
(b) Karwowski, Waldemar. (1995). “A general modeling framework for the human-computer interaction based on the principle of ergonomic compatibility requirements and human entropy.” In Grieco, A., Molteni, G., Occhipinti, E. and Piccoli, B. (eds.) Work with Display Units 94 (Amsterdam: North-Holland), pgs. 473-8.
4. Jacko, Julie A. and Sears, Andrew. (2003). The Human-computer Interaction Handbook: Fundamentals, Evolving Technologies, (pgs. 1229-30). Lawrence Erlbaum Associates.
5. (a) Huxley, Aldous. (1938). Ends and Means: An Enquiry Into the Ideals and Into the Methods Employed for their Realization (pg. 368). Chatto & Windus.
(b) Word Study (1969). G. C. Merriam Co.
6. Singhal, Raj K. (1995). Mine Planning and Equipment Selection 1995, (pg. 928: “human entropy”). Taylor & Francis.
7. Docherty, Thomas. (1986). John Donne, Undone. (pg. 19, 23, 76). Routledge.
8. Smith, Sam. (1992). “Global Dumbing: the Politics of Entropy”, Progressive Review, April.
9. Schmitz, John E.J. (2007). The Second Law of Life: Energy, Technology, and the Future of Earth as We Know It (pg. 119). William Andrew Publishing.
10. Hokikian, Jack. (2002). The Science of Disorder: Understanding the Complexity, Uncertainty, and Pollution in Our World (pg. 48). Los Feliz Publishing.
11. Goldstein, Martin and Goldstein, Inge F. (1993). The Refrigerator and the Universe: Understanding the Laws of Energy (section: Entropy of a mouse, pgs. 297-99). Harvard University Press.
12. Grandpierre, Attila. (2004). “Entropy and Information of Human Organisms and the Nature of Life.” Frontier Perspectives, Vol. 13, pg. 16. Mar. 22.
13. Martinas, Katalin and Grandpierre, Attilia. (2007). “Thermodynamic Measure for Nonequilibrium Processes”, Interdisciplinary Description of Complex Systems, 5(1): 1-13.
14. (a) Aoki, Ichiro. (1987). “Entropy Balance of White-Tailed Deer During Winter Night” (abs), Bulletin of Mathematical Biology, 49(3): 321-27.
(b) Aoki, Ichiro. (1992). “Entropy Physiology of Swine: a Macroscopic Viewpoint”, Journal of Theoretical Biology, 157(3):363-71.
(c) Aoki, Ichiro. (1994). “Entropy Production in Human Life Span: A Thermodynamic Measure for Aging” (abstract). Age 1: 29-31.
(d) Aoki, Ichiro. (1997). “Introduction to Entropy Physiology” (abs), Siebutso Butsuri, 37(3): 106-10.
15. Aoki, Ichiro. (2012). Entropy Principle for the Development of Complex Biotic Systems: Organisms, Ecosystems, the Earth. Elsevier.
16. (a) Bernfeld, Siegfried, Feitelberg, Sergei. (1931). "The Principle of Entropy and the Death Instinct" ("Der Entropiesatz und der Todestrieb"). Int. J. Psycho-Anal., 12:61-81.
(b) Kapp, R.O. (1931). “Comments on Bernfeld and Feitelberg's 'The Principle of Entropy and the Death Instinct”. J. Psycho-Anal., 12:82-86.
(c) Spring, W.J. (1934). “A Critical Consideration of Bernfeld and Feitelberg's Theory of Psychic Energy”. Psychoanal Q., 3:445-473.
17. Lacan, Jacques and Miller, Jacques-Alain. (1991). The Ego in Freud’s Theory and in the Technique of Psychoanalysis, 1954-1955 (entropy, pgs. 77, 81, 83, 95, 114, 327, 334; Bernfeld and Feitelberg, pg. 115). W.W. Norton & Co.
18. Clausius, Rudolph. (1866). “On the Determination of the Energy and Entropy of a Body”, Schlomilch’s Zeitschrift fur Mathematik und Physik, Bd. Xi. S. 31 (note); in (English): Philosophical Magazine, 4(32):1; in: The Mechanical Theory of Heat: with its Applications to the Steam Engine and to Physical Properties of Bodies (§:Appendix to Ninth Memoir, pgs. 366-74). London: John van Voorst.
19. Lewis, Gilbert N. and Randall, Merle. (1923). Thermodynamics and the Free Energy of Chemical Substances (pgs. 5-6; Table of Free Energies, pgs. 607-08). McGraw-Hill Book Co., Inc.
20. (a) Lewis, Gilbert and Randall, Merle. (1914). “The Free Energy of Oxygen, Hydrogen, and the Oxides of Hydrogen”, Journal of American Chemical Society, 35:1964.
(b) Randall, Merle and Young, Leona E. (1942). Elementary Physical Chemistry (note, pg. 302). Randall and Sons.
21. Kim, Mi G. (2003). Affinity, That Elusive Dream: a Genealogy of the Chemical Revolution. Cambridge, Mass.: MIT Press.
22. (a) Krebs, H.A. and Kornberg, H.L. (1957). Energy Transformations in Living Matter (with an Appendix by K. Burton with 21 figures). Berlin: Springer-Verlag.
(b) Alberty, Robert A. (2003). Thermodynamics of Biochemical Reactions (pg. 2). Hoboken, New Jersey: John Wiley & Sons, Inc.
23. Schroeder, Daniel V. (2000). An Introduction to Thermal Physics (pg. 150). Addison Wesley Longman.

Further reading
● Guskin, Dave. (2010). “Entropy and People”, GuskIntelligence, Jan 15.
