The equation S = k log W engraved (closeup) on the Boltzmann tombstone in Vienna Central Cemetery.
In equation form, the formula:

 S = k \ln W \,

written with the natural logarithm, or in equivalent notation (see: logarithm) as:

 S = k \log W \,

with base e assumed, is called the Planck entropy, Boltzmann entropy, Boltzmann entropy formula, or Boltzmann-Planck entropy formula. It is the statistical mechanics, i.e. particle position, interpretation of Clausius entropy, where S is the entropy of an ideal gas system; k is the Boltzmann constant (the ideal gas constant R divided by Avogadro's number N); and W, from the German Wahrscheinlichkeit (var-SHINE-leash-kite), meaning probability, often referred to in English as multiplicity, is the number of “states” (often modeled as quantum states), or "complexions", in which the particles or entities of the system can be found, according to the various energies with which they may each be assigned. The particles of the system are assumed to have uncorrelated velocities and thus to abide by the Boltzmann chaos assumption.
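
As a quick numerical illustration of these definitions, the following sketch computes k as R divided by Avogadro's number and evaluates S = k ln W for a toy multiplicity; the choice W = 2^N for N two-state particles is an illustrative assumption, not part of the formula itself:

 # Illustrative check of S = k ln W, with k = R / N_A
 import math

 R = 8.314462618         # ideal gas constant, J/(mol K)
 N_A = 6.02214076e23     # Avogadro's number, 1/mol
 k = R / N_A             # Boltzmann constant, ~1.380649e-23 J/K

 # Toy multiplicity: N independent two-state particles give W = 2^N complexions
 N = 100
 ln_W = N * math.log(2)  # ln(2^N), computed without forming the huge integer 2^N
 S = k * ln_W            # entropy, J/K

 print(f"k = {k:.6e} J/K")
 print(f"S = {S:.3e} J/K for W = 2^{N}")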

Some consider S = k ln W to be easily the second most important formula of physics, next to E = mc², or on par with it. [1] This formula is sometimes called the "Boltzmann formula" (or Boltzmann entropy formula) and entropy calculated from this formula is sometimes called Boltzmann entropy.

Einstein
In 1901, German physicist Max Planck, in his "On the Law of Distribution of Energy in the Normal Spectrum", introduced the S = k log W + c formula, in semi deus ex machina fashion, via citation of Austrian physicist Ludwig Boltzmann and his H-theorem (1872).

In 1904, and thereafter, German physicist Albert Einstein began to repeatedly criticize the formula S = k log W; in 1910, Einstein had the following to say on the matter:

“The equation S = k log W + const appears without an elementary theory — or however one wants to say it — devoid of any meaning from a phenomenological point of view.”
— Albert Einstein (1910), popular 2007+ re-quote version

“Usually W is set equal to the number of ways (complexions) in which a state, which is incompletely defined in the sense of a molecular theory (i.e. coarse grained), can be realized. To compute W one needs a complete theory (something like a complete molecular-mechanical theory) of the system. For that reason it appears to be doubtful whether Boltzmann's principle alone, i.e. without a complete molecular-mechanical theory (Elementary theory) has any real meaning. The equation S = k log W + const. appears [therefore] without an Elementary theory—or however one wants to say it—devoid of any meaning from a phenomenological point of view.”
— Albert Einstein (1910), Ezechiel Cohen 2005 abbreviated translation

“Usually W is put equal to the number of complexions.... In order to calculate W, one needs a complete (molecular-mechanical) theory of the system under consideration. Therefore it is dubious whether the Boltzmann principle has any meaning without a complete molecular-mechanical theory or some other theory which describes the elementary processes. [The formula] seems without content, from a phenomenological point of view, without giving in addition such an Elementartheorie.”
— Albert Einstein (1910), Abraham Pais 1982 abbreviated translation

(add discussion)

Derivation
The actual full rigorous step-by-step derivation of the logarithmic formulation or interpretation of entropy (S = k ln W) is a bit difficult to track down, for a number of reasons, the first being that it was originally done in German and readily available English translations are wanting.

The start of the derivation traces to Austrian physicist Ludwig Boltzmann's 1872 H-theorem, a formula to approximate the velocity distributions of the particles of a body of gas, as described in his "Further Studies on the Thermal Equilibrium of Gas Molecules".

The "famous expression" that equates entropy with the logarithm of the state probability, supposedly in the form of S = k ln W, was first presented, according to science historians Helge Kragh and Stephen Weininger, in 1877 (fact check); possibly in Boltzmann's article “On the Relation of a General Mechanical Theorem to the Second Law of Thermodynamics”. [13]

An absorbing (left) and radiating (right) "resonator": a sphere with a hole in it (originally a type of iron stove with a hole in it, sooted black on the inside), which absorbs electromagnetic radiation when cold, acting like a black body, or, conversely, emits electromagnetic radiation when hot. This "black body radiation" is the physical model on which the S = k ln W equation was derived by German physicist Max Planck in 1901. [1]

The modern formulation, S = k ln W (or S = k log W, base e assumed), now a staple of science, was stated matter-of-factly by German physicist Max Planck as the standard formula for the entropy of black bodies in his 1901 "On the Law of Distribution of Energy in the Normal Spectrum". It was given without derivation and without clear justification as to why this formulation applies to the measurement of the entropy of what Planck defined as an "irradiated, monochromatic, vibrating resonator".

A crude approximate derivation is as follows, as given in 1946 by Belgian-born English thermodynamicist Alfred Ubbelohde. [8] First we start with a body of ideal gas containing an Avogadro number N of particles (atoms or molecules), i.e. one mole. Then suppose that we compress the body, using a small hand pump, down to a small initial volume v1. This will be the initial state. We then connect the compressed volume to an evacuated second volume v2, e.g. a deflated football, and let the gas expand into the larger volume. This will be the final state.

The first condition we stipulate is that the process is carried out very slowly, on the model of an infinitely slow rate, idealizing the view that the temperature of the gas never differs from that of the surroundings. The process is thus defined as an isothermal process. A second assumption behind the "infinitely slow" idealization is that we model the process as reversible, and thus use the equals sign " = " rather than the Clausius inequality in writing the first law for this process. Therefore, according to Joule's second law, which states that the internal energy of an ideal gas is solely a function of temperature:

 U = f(T) \,
The Boltzmann entropy equation tattooed, circa 2006, on the stomach of a man who says he is "not very good at math", done by Kimsu at Body Graphics, New Jersey. [12]

the internal energy change for this process will be zero and, thus, according to the first law:

 dU = dQ - dW \,

the heat change dQ occurring in the process will be converted completely into the work dW of expansion:

 dQ = dW \,

To then calculate this work W (not to be confused with the multiplicity W introduced further below), we use French physicist Emile Clapeyron's 1834 expression for the pressure-volume work:

 W = \int_{v_1}^{v_2} P dv \,

Classical ideal gas law
The pressure function for this body will be the ideal gas equation, which in classical ideal gas law notation is:

P V = n R T \,

and with substitution:

 W = nRT \int_{v_1}^{v_2} \frac{1}{v} dv \,

Statistical ideal gas law
Alternatively, as was introduced in the 1900 work of German physicist Max Planck, the statistical version of the ideal gas law can be written as:

 P V = N k T \,

where N is the actual number of molecules in the system and k is the Boltzmann constant. With substitution, this gives the alternative work integral function:

 W = NkT \int_{v_1}^{v_2} \frac{1}{v} dv \,

Definite integral of 1/x
The above integral has the form of what is called a "definite integral", one with upper and lower limits, which integrates according to the following rule (a rule which, notably, seems to be how the logarithm found its way into thermodynamics): [9]

 \int_{x_1}^{x_2} \frac{1}{x} dx = \ln \left\vert x_2 \right\vert - \ln \left\vert x_1 \right\vert \,

For more on the graphical nature of this rule, see the WordPress blog: “why is the integral of 1/x equal to the natural logarithm of x?”. [14]
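
A quick numerical sanity check of this rule, as a minimal sketch (the limits 1 and 5 and the step count are arbitrary illustrative choices):

 # Check that the definite integral of 1/x from x1 to x2 equals ln(x2) - ln(x1)
 import math

 x1, x2, steps = 1.0, 5.0, 100000
 dx = (x2 - x1) / steps

 # midpoint Riemann sum of 1/x over [x1, x2]
 riemann = sum(1.0 / (x1 + (i + 0.5) * dx) for i in range(steps)) * dx

 print(riemann)                      # ~1.60944
 print(math.log(x2) - math.log(x1))  # ln(5) - ln(1) = 1.60944...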

Integration
Therefore, using the above rule for the definite integral of 1/x, we have:

 W = NkT (\ln v_2 - \ln v_1 ) \,

This can be reduced, using the rule that the logarithm of a ratio is the difference of the two logarithms:

 \ln \frac{x}{y} = \ln x - \ln y \,

to the following form:

 W = NkT \ln \frac{v_2}{v_1} \,

Substituting this into the reduced first law expression, dQ = dW (above), we have:

      \Delta Q = NkT \ln \frac{v_2}{v_1} \,

Then, dividing through by the temperature, we have:

 \frac{\Delta Q}{T} = Nk \ln \frac{v_2}{v_1} \,

And, by definition (Rudolf Clausius, 1865), this is the entropy change of the body of gas during the expansion:

      \Delta S = Nk \ln \frac{v_2}{v_1} \,
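
To get a feel for the magnitudes involved, the following sketch evaluates this result for one mole of ideal gas doubling its volume isothermally at 300 K; the mole amount, volume ratio, and temperature are illustrative choices, not values from Ubbelohde's text:

 # Delta S = N k ln(v2/v1) for one mole of ideal gas doubling its volume isothermally
 import math

 k = 1.380649e-23     # Boltzmann constant, J/K
 N = 6.02214076e23    # number of molecules (one mole)
 T = 300.0            # temperature, K (needed only for the heat/work figure)
 ratio = 2.0          # v2 / v1

 delta_S = N * k * math.log(ratio)  # entropy change, J/K
 Q = T * delta_S                    # reversible isothermal heat = work of expansion, J

 print(f"Delta S = {delta_S:.3f} J/K  (= R ln 2, about 5.76 J/K)")
 print(f"Q = W  = {Q:.1f} J at {T:.0f} K")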

Sticky point
It is at this point that the derivation becomes a bit murky, in particular because it makes speculations about atomic motion and position. It is assumed that the molecules of the gas will always spread themselves as uniformly as possible throughout the space available to them, because of the thermal motion which drives them hither and thither. On this so-called "principle of uniform spread", the ratio of the probability W2 of the molecules occupying uniformly the larger volume v2 which is made available to them, to the probability W1 that they will all remain in the volume v1, is given by:

 \frac{W_2}{W_1} = \frac{v_2}{v_1} \,

and with substitution:

 \Delta S = Nk \ln \frac{W_2}{W_1} \,

Here we have introduced Boltzmann's notion of Wahrscheinlichkeit, symbol W, which has no exact English equivalent but is best understood as a cross or blend between "probability" and "multiplicity", so to speak, referring to the number of “states” in which the particles of the system can be found, according to the various energies with which they may each be assigned, in quantum mechanical terms.
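
One way to tighten this step, not spelled out in Ubbelohde's crude sketch as reproduced here and added only as a standard textbook-style completion, is to take W as the probability for all N molecules at once rather than per molecule. Uniform spreading then gives:

 \frac{W_2}{W_1} = \left( \frac{v_2}{v_1} \right)^N \,

so that the expansion entropy carries only a single factor of k in front of the logarithm:

 \Delta S = k \ln \frac{W_2}{W_1} = N k \ln \frac{v_2}{v_1} \,

which reproduces the result above and has exactly the tombstone form S = k ln W, up to an additive constant.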

To note, if our integral had the form of what is called an "indefinite integral", one without upper and lower limits, an additive constant would appear: [10]

 \int \frac{1}{x} dx = \ln \left\vert x \right\vert + c \,

and our formulation of the entropy of our expanding gas body would be:

 S = Nk (\ln v + c )  \,

or, distributing the Nk factor and absorbing it into a new additive constant c':

 S = Nk \ln v + c'  \,

Sticky point
This version, although a step in the derivation seems to be missing (somewhere?), appears to be the sense in which Planck first introduced the natural logarithm formulation of entropy in 1901, in his own notation as follows:

 S_N = k \log W + c  \,

where SN is the entropy of a system or black body composed of N resonators, which, according to Planck, "depends on the disorder with which the total energy U is distributed among the individual resonators". In his 1909 lectures, Planck called the following equation, without the added constant, the ‘general definition of entropy’, albeit discussed in his chapter on the equation of state of a monoatomic gas: [6]

 S = k \log W  \,

This is the formula that is famously displayed on the Boltzmann tombstone (pictured above), which was erected in the 1930s, at the Central Cemetery (Zentralfriedhof), Vienna, Austria. [2]

In any event, this "crude derivation", as Ubbelohde calls it, is missing a few end steps and details needed to arrive at the exact formulation introduced (without proof) by Planck.
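
For concreteness, the kind of counting Planck used for W in the resonator case is combinatorial: the number of "complexions" for P indivisible energy elements distributed over N resonators is W = (N + P - 1)! / (P! (N - 1)!), the standard complexion count from his 1901 paper. The sketch below evaluates this count and the corresponding S = k ln W; the particular values of N and P are illustrative only:

 # Planck-style complexion count: P energy elements distributed over N resonators
 import math

 N = 10              # resonators (illustrative)
 P = 100             # energy elements (illustrative)
 k = 1.380649e-23    # Boltzmann constant, J/K

 W = math.comb(N + P - 1, P)  # (N + P - 1)! / (P! (N - 1)!)
 S = k * math.log(W)          # entropy via S = k ln W

 print(f"W = {W}")
 print(f"S = {S:.3e} J/K")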

Note
In the decades to follow, Planck's probabilistic "disorder-dependent" model of entropy seems to have been adopted, with abandon, as the universal measure of entropy, often being applied in biothermodynamics and in human thermodynamics. Ubbelohde even seems to corroborate this, in his concluding statement that "the probability increase accompanying any process is quite general, and holds for all the diversity of spontaneous happenings for which the measurement of entropy has any meaning."
Issue
The discerning theorist, however, should hold reserve in using this formulation as an absolute universal measure of entropy, for a number of obvious reasons, the foremost being that the equation contains the gas constant R hidden in its body:

 S = \frac{R}{N} \log W  \,

When, therefore, one sees entropy calculations using this equation in non-ideal-gas situations, e.g. the rubber band model of entropy, entropy measures in cells, entropy measures of systems of humans, etc., one begins to question the validity of such calculations.

ln vs log
The "log" in this formula, in the historical sense, refers to the natural logarithm (base e), which seems to have been the convention when, in 1900, German physicist Max Planck first scripted this version of the statistical entropy formula, as displayed on Boltzmann's tomb. It was only sometime afterwards (date needed) that log came to denote the base 10 logarithm and ln the base e logarithm.

Discussion
The statement that Boltzmann defined entropy as a function of the number of microstates, rather than of a probability, is, according to some arguments, a misrepresentation, supposedly owing to the fact that Boltzmann's writing was fairly difficult and few people understood it at the time, and that most of Boltzmann's original papers have never been translated from German, so most information about them is second- or third-hand interpretation. According to American physicist Robert Swendsen, Boltzmann interpreted the second law to mean that a system tends toward more probable macrostates. Boltzmann, supposedly, originally identified the entropy, incorrectly some have argued, as the probability (Wahrscheinlichkeit) of a system's macrostate. Later, in the same paper, he considers an ideal gas as a special case and calculates the logarithm of the macrostate's probability as proportional to the volume in phase space. In later papers he re-identifies the entropy as the logarithm of a macrostate's probability, but since he is usually considering the special case where all microstates are equally likely, he calculates it as proportional to the volume in phase space. Readers likely misidentified this special case as the definition of entropy, since he seems to have rarely stated what his definition was. [15]
Morse key telegraphy system; Hartley voltage transmissions.
The keen scientist should be aware that American electronics researcher Ralph Hartley's 1927 H-formula, quantifying the information content or Boolean logic content of telegraphy messages (depicted above), has absolutely nothing to do with Austrian physicist Ludwig Boltzmann's 1872 H-theorem, a kinetic-theory result concerning the velocity distributions of the particles of an ideal gas system, or with German physicist Max Planck's 1901 logarithmic definition of entropy, an attempt to quantify heat in terms of the energy states of the particles of bodies.

Warning | Information theory
In 1927, American electronics researcher Ralph Hartley published his "Transmission of Information", in which he outlined a way to mathematically quantify telegraph signals being sent down a telegraph wire using the following formula:

H = n \log s \,

where s is the number of symbols (e.g. distinct current or voltage levels) available at each selection, n is the number of selections made in the message, and H is the amount of information associated with the n selections for a particular system.
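
A minimal sketch of Hartley's count for a concrete case; the two-level alphabet and the message length below are illustrative choices, not values from Hartley's paper:

 # Hartley's H = n log s for a toy telegraph line
 import math

 s = 2    # symbols available at each selection (e.g. current on / current off)
 n = 20   # number of selections (pulses) in the message

 H_nats = n * math.log(s)   # natural-log units
 H_bits = n * math.log2(s)  # base-2 units (bits)

 print(f"H = {H_nats:.3f} (natural-log units) = {H_bits:.0f} bits")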

Issue
Hartley's H-formula telegraph-wire information function later came to be conflated with Austrian physicist Ludwig Boltzmann's 1872 H-theorem in the 1948 paper "A Mathematical Theory of Communication" by American electrical engineer Claude Shannon, and ever since people have been confusing the mathematical subject of information theory with the physical science subject of thermodynamics, to no end. [4]

This mess was then compounded when information theory researchers discovered Szilard's demon, as described in Hungarian physicist Leo Szilard's 1929 “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings”, an attempt at disproving the existence of a Maxwell's demon, wherein he argued that the logarithmic interpretation of entropy could be used to determine the entropy produced during the ‘measurement’ of information the demon discerns when viewing the speeds and positions of the particles in his two compartments. [5]
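
For reference, the quantity Szilard's argument turns on is an entropy of k ln 2 per binary measurement; the numerical evaluation below is our own illustration, with T = 300 K chosen arbitrarily for the energy figure:

 # Szilard's per-measurement entropy k ln 2, and the corresponding energy at 300 K
 import math

 k = 1.380649e-23   # Boltzmann constant, J/K
 T = 300.0          # K (illustrative)

 dS = k * math.log(2)                 # ~9.57e-24 J/K
 print(f"k ln 2   = {dS:.3e} J/K")
 print(f"T k ln 2 = {T * dS:.3e} J")  # ~2.87e-21 J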

Thus, in sum, these two topics (the mathematics of telegraphy and hypothetical demons) have justified to many that entropy has an information interpretation. Some scientists, such as physical chemist Arieh Ben-Naim, have gone to extreme lengths to embed the assertion that entropy, or rather heat, is simply information measurable in units of bits instead of joules, and have even proclaimed that the entire SI unit system should be thrown out the window and that the entire field of science, all the way down to the sub-atomic and Planck length scale, should be re-written in units of bits. This is one of the most absurd things in all of science, yet one that has many pied piper followers.

A proof or disproof of the relation of entropy to information, in a modern sense, requires a formal treatise, which has not been done. The modern reader should be warned that the use of logarithms to measure entropy for systems other than ideal gas systems is mostly baseless conjecture.

Human thermodynamics
In human thermodynamics, owing to a mixed information-and-multiplicity view of entropy, Boltzmann's entropy formula has since been used to model the entropy of any number of anthropomorphic quantities or qualities, with abandon, in nearly every scenario or situation conceivable. A random example, one of many, is American physicist Edwin Jaynes' 1991 article “How Should we Use Entropy in Economics”, in which he introduces some tentative outlines of how an economic system can be modeled as a thermodynamic system, such as how Willard Gibbs' 1873 graphical thermodynamic ideas on entropy convexity can be mixed with logarithmic interpretations of what he calls "economic entropy", which he defines as follows:

 S(X, Y, Z, \dots) = k \log W(X, Y, Z, \dots) \,

where (X, Y, Z, ...) are some type of macroeconomic variables, which he doesn't really go into, and W is the multiplicity factor of the macroeconomic state, which he describes as the "number of different microeconomic ways in which it can be realized", whatever that means, and which he tries to connect in some way to French mathematician Rene Thom’s 1960s catastrophe theory and to the thermodynamics of a ferromagnet and the Curie temperature. [11]

The justification for these types of theories or models, however, is nearly baseless, in that multiplicity approximations of entropy are, generally, only good for ideal gas systems, and information interpretations of entropy are based on Maxwell demon and telegraph-wire proofs, which are fictionalized abstractions having almost nothing to do with Clausius' logic of 'transformation equivalents' (entropy) or uncompensated transformations (entropy change). [7]

References
1. (a) Planck, Max. (1901). “On the Law of Distribution of Energy in the Normal Spectrum.” Annalen der Physik, Vol. 4, pg. 553 ff.
(b) Muller, Ingo. (2007). A History of Thermodynamics - the Doctrine of Energy and Entropy, (ch. 4: "Entropy as S = k ln W," pg. 101-02). New York: Springer.
2. (a) Planck, Max. (1901). “On the Law of Distribution of Energy in the Normal Spectrum,” Annalen der Physik, Vol. 4, pg. 553 ff.
(b) Schmitz, John E.J. (2007). The Second Law of Life: Energy, Technology, and the Future of Earth as We Know It, (pg. 83). William Andrew Publishing.
(c) Boltzmann equation – Eric Weisstein’s World of Physics (states the year was 1872).
3. Photo of Boltzmann tomb (Vienna, 2005).

4. Hartley, Ralph V.L. (1928). “Transmission of Information”, Bell System Technical Journal, July, pgs. 535-64; presented at the International Congress of Telegraphy and Telephony, Lake Como, Italy, Sept. 1927.
5. Szilárd, Leó. (1929). “On the Decrease in Entropy in a Thermodynamic System by the Intervention of Intelligent Beings” (“Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen”), Zeitschrift für Physik, 53: 840-56.
6. Planck, Max. (1909). Eight Lectures on Theoretical Physics (1: Reversibility and Irreversibility, pgs. 1-20; 2: Thermodynamic States of Equilibrium in Dilute Solutions, pgs. 21-40; 3: Atomic Theory of Matter, pgs. 41-57; 4: Equation of State for a Monoatomic Gas, pgs. 58-69; 5: Heat Radiation: Electrodynamic Theory, pgs. 70-86; 6: Heat Radiation: Statistical Theory, pgs. 87-96; 7: General Dynamics: Principle of Least Action, pgs. 97-111; 8: General Dynamics: Principle of Relativity, pgs. 112-). (Trans. A.P. Wills.). BiblioBazaar, 2009.
7. Gladyshev, Georgi. (2010). “On the Thermodynamics of the Evolution and Aging of Biological Matter.” Journal of Human Thermodynamics, 6: 26-38.
8. Ubbelohde, Alfred René. (1947). Time and Thermodynamics (pgs. 83-87). Oxford University Press.
9. Definite integral – Math World.
10. Indefinite integral – Math World.
11. Jaynes, Edwin. (1991). “How Should we Use Entropy in Economics: Some Half-baked Ideas in Need of Criticism”, Feb 01.
12. Boltzmann equation (stomach tattoo) Capo3433, 19 Dec 2006 – News.BMEzine.com.
13. (a) Boltzmann, Ludwig. (1877). “On the Relation of a General Mechanical Theorem to the Second Law of Thermodynamics” (“Über die Beziehung eines allgemeinen mechanischen Satzes zum zweiten Hauptsatze der Wärmetheorie”), Sitzungsberichte Akad. Wiss., Vienna, Part II, 75: 67-73.
(b) Kragh, Helge and Weininger, Stephen J. (1996). “Sooner Silence than Confusion: the Tortuous Entry of Entropy into Chemistry” (abs), Historical Studies in the Physical and Biological Sciences, 27(1): 91-130.
14. Author. (2011). “Why is the integral of 1/x equal to the natural logarithm of x?”, ArcSecond.WordPress.com, Dec 17.
15. Swendsen, Robert H. (2006). “Statistical Mechanics of Colloids and Boltzmann’s Definition of the Entropy” (abs), American Journal of Physics, 74(3): 187.
16. (a) Einstein, Albert. (1910). “The Theory of the Opalescence of Homogeneous Fluids and Liquid Mixtures near the Critical State” (“Theorie der Opaleszenz von homogenen Flüssigkeiten und Flüssigkeitsgemischen in der Nähe des kritischen Zustandes” (pdf)), Annalen der Physik, 33:1275-1298.
(b) Pais, Abraham. (1982). Subtle is the Lord (pg. 72). Oxford University Press.
(c) Cohen, Ezechiel G.D. (2005). “Boltzmann and Einstein: Statistics and Dynamics – An Unsolved Problem”, Pramana Journal of Physics (abs) (pdf), 64(5):635-42.
(d) Tsallis, Constantino, Gell-Mann, Murray and Sato, Yuzuru. (2005). “Asymptotically Scale-Invariant Occupancy of Phase Space Makes Entropy Sq Extensive” (Ѻ), Proceedings of the National Academy of Science, 102(43):15377-382.
(e) Bais, F. Alexander and Farmer, J. Doyne. (2007). “Physics of Information” (pdf), Santa Fe Institute working paper; in: Philosophy of Information (quote, pg. 649). Elsevier, 2008.
(f) Klyce, Brig. (2013). “The Second Law of Thermodynamics” (Ѻ), Panspermia.org.

Further reading

● Braunstein, Jerry. (1969). “States, Indistinguishability, and the Formula S = k ln W in Thermodynamics” (abs), J. Chem. Educ. 46(11): 719.
● Johnson, Eric. (2018). Anxiety and the Equation: Understanding Boltzmann’s Entropy (Amz). MIT Press.

External links
Boltzmann’s entropy formula – Wikipedia.
