In thermodynamics, essergy, short for ‘essence of energy’, is a hypothetical, information-theory-styled type of free energy.

Overview
In 1968, American engineer Robert Evans introduced essergy, a rather blurry, Shannon-bandwagon-riding concept posited as a replacement for all measures of potential work, such as availability, exergy, available work, Gibbs free energy, Gibbs chemical potential, Helmholtz free energy, negentropy, and other common energy expressions. [1] The idea of essergy is best expressed in Evans' own words: [2]

“An attempt is made to prove that all of the many seemingly independent measures of potential work, such as availability, exergy, available work, Gibbs free energy, Gibbs chemical potential, Helmholtz free energy, and other common energy expressions are necessarily all special cases of a unique quantity that is called essergy, a contraction of the term essence of energy. The proof is attempted rigorously for chemical systems, and then is extended. If correct, the proof will be of consequence to the design of any engineering system in which potential work is a significant factor, since it will show that by evaluating the one quantity, essergy, the designer will have taken account of the other seemingly independent considerations. A possible consequence of the proof may be a more general formulation for the concept of information based on Brillouin's principle of the equivalence of thermodynamic information and potential work. The proof indicates that negentropy is not as general a measure of potential work as is the quantity, essergy. This result could imply that essergy is a more general measure of thermodynamic information than negentropy, an implication that might lead to a broader formulation about information and, thus, new insight into the foundations of information theory.”

It is difficult to even describe what a mess this entire statement is.
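
For reference, in the standard availability (exergy) literature the quantity Evans has in mind is usually written relative to a reference environment at temperature T0, pressure P0, and chemical potentials μi0; the notation below is the common textbook form, not necessarily Evans' own:

 \varepsilon = E + P_0 V - T_0 S - \sum_i \mu_{i0} N_i

When the system is held at T = T0 and P = P0 with no matter exchange, this differs from the Gibbs free energy G = E + PV - TS only by the constant term Σi μi0 Ni, which is the sense in which the familiar free energies would fall out as special cases.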

In 1974, James Lovelock and Lynn Margulis cited Evans' publication to conclude that it had been proved that "the classical properties of entropy and free energy have exact information theoretic equivalents", where information I is defined as:

 I = S_0 - S

where S0 is the entropy of the components of the system at thermodynamic equilibrium and S the entropy of the assembled system. They then state that this relation can be transferred directly from information-theoretic to classical thermodynamic terms as follows:

 I = \frac{E + PV - TS - \sum_i \mu_i N_i}{T}

where E is the energy, P the pressure, V the volume, S the entropy, μi the chemical potential, and Ni the particle count of the molecular species present. It is difficult to see how they made the jump from the first equation to the second, although it appears to involve the Gibbs fundamental equation; one possible reconstruction is sketched below. In any event, the pair naively conclude that "it follows that information is a measure of disequilibrium in the classic sense and recognizable in the information theoretic sense." This is the gist of their barely tenable derivation, and they even go on to cite Lewis and Randall (1923), among others, to allude to the idea that Evans has proved that chemical thermodynamics can be entirely rewritten in terms of information. [3]
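
One plausible route from the first equation to the second, assuming the Euler (integrated) form of the Gibbs fundamental equation, is the following reconstruction, which is not spelled out in the paper:

 E = TS - PV + \sum_i \mu_i N_i

Solving this for the entropy of the fully equilibrated components gives

 S_0 = \frac{E + PV - \sum_i \mu_i N_i}{T}

and subtracting the entropy S of the assembled system reproduces their second equation, provided E, P, V, and μi are evaluated at the equilibrium reference state. Dividing by T puts the quantity in entropy units, which, via Brillouin's equivalence of kB ln 2 of entropy per bit, is presumably what licenses calling it "information".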

References
1. Dincer, Ibrahim and Rosen, Marc A. (2002). Thermal Energy Storage: Systems and Applications, (pg. 22). John Wiley and Sons.
2. Evans, Robert B. (1969). A Proof that Essergy is the Only Consistent Measure of Potential Work, PhD thesis, Dartmouth College, Hanover, NH: Thayer School of Engineering.
3. Lovelock, James and Margulis, Lynn. (1974). “Atmospheric Homeostasis by and for the Biosphere: the Gaia Hypothesis”, Tellus, 26(1-2): 2-10.
