In thermodynamics, order, in contrast to disorder, in a system or structure is often correlated with a low value of entropy. A perfectly ordered crystalline structure at absolute zero temperature, for instance, is considered to have zero entropy, the lowest possible value.

Overview
In 1862, Rudolf Clausius associated entropy with the aggregation (coming together) and disgregation, i.e. dis-aggregation (coming apart), of bodies.

In 1936, British biochemist and historian Joseph Needham, in his Order and Life, stated: [3]

“For the astronomers and the physicists the world is, in popular words, continually ‘running down’ to a state of dead inertness when heat has been uniformly distributed through it. For the biologists and sociologists, a part of the world, at any rate, is undergoing a progressive development in which an upward trend is seen, lower states of organization being succeeded by higher states. For the ordinary man the contradiction, if such it is, is serious, because many physicists, in expounding the former of these principles, the second law of thermodynamics, employ the word ‘organization’ and say it is always decreasing.”

Needham followed up his own query with the publication of the 1942 article “Evolution and Thermodynamics: a Paradox with Social Significance”. [4]

In 1944, Austrian physicist Erwin Schrödinger, in his What is Life?, chapter "Order, Disorder and Entropy", gave the following equivalence for systems "in a uniform environment":

High (↑) levels of orderliness = Low (↓) levels of entropy

This stated equivalence, for systems that are not isolated and are not composed of ideal gas phase particles, i.e. those not bound by the Boltzmann chaos assumption, becomes rather illogical as system complexity increases. This becomes apparent in the concept of material entropy, or when one tries to model human social organizations via entropy qualifications, as in measures of "human entropy" or "social entropy". Schrödinger went on to argue that, via the statistical thermodynamics of Austrian physicist Ludwig Boltzmann and American engineer Willard Gibbs, disorder is quantified by the following expression:

 \text{entropy} = k \log D \,

where k is the Boltzmann constant and D is a "quantitative measure of the atomistic disorder of the body in question". [5]
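As a rough numerical illustration (the value of D below is hypothetical, chosen only to make the arithmetic concrete, and the logarithm is taken as natural), a body with D = 10^6 would have:

 \text{entropy} = k \ln D = (1.381 \times 10^{-23} \, \text{J/K}) \times \ln(10^{6}) \approx 1.9 \times 10^{-22} \, \text{J/K}

so that a larger D (more atomistic disorder) corresponds to a larger entropy, and a smaller D to a smaller one.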
On this basis, Schrödinger states that the reciprocal of the measure of disorder, 1/D, can be regarded as a direct measure of order. He then notes that, since the logarithm of 1/D is minus the logarithm of D, by the rule:

 \log_b(x^y) = y \log_b(x) \!\,

then, in his own words, we can write Boltzmann's equation as:

 -(\text{entropy}) = k \log \left( \frac{1}{D} \right)
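Written out, this is the logarithm rule above applied with x = D and y = -1:

 k \log \left( \frac{1}{D} \right) = k \log \left( D^{-1} \right) = -k \log D = -(\text{entropy})

so the two forms of the equation carry the same content; only the sign convention changes.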

Hence, via this derivation, according to Schrödinger:

“The awkward expression ‘negative entropy’ can be replaced by: entropy, taken with the negative sign, is itself a measure of order.”

Thus, as he states, "the device by which an organism maintains itself stationary at a fairly high level of orderliness ( = fairly low level of entropy) really consists in continually sucking orderliness from its environment." [1]

In 1973, American physicist Robert Lindsay argued that “the statistical interpretation of entropy and the principles of thermodynamics leads at once to another important point of view, namely, that of the role of order in the direction of naturally occurring processes.” He continues, “we can therefore introduce another interpretation of entropy: increase in entropy means a transition from a more orderly state to a less orderly state”, a principle which he oversimplifies as applicable to all systems. In particular, he concludes that the second law of thermodynamics states: [2]

“In any naturally occurring process the tendency is for all systems to proceed from order to disorder, and the maximum entropy of Clausius is the state of complete disorder or thorough randomness, out of which no return to order is practically possible.”

This sloppy argument is an example of the unsubstantiated over-extrapolation of ideal gas system logic into an all-system logic.

References
1. Schrödinger, Erwin. (1944). What is Life? (ch. 6: “Order, Disorder, and Entropy”, pgs. 67-75). Cambridge: Cambridge University Press.
2. Lindsay, Robert B. (1973). The Role of Science in Civilization (pgs. 161-62). Westport: Greenwood Press.
3. (a) Needham, Joseph. (1936). Order and Life. New Haven: Yale University Press.
(b) Layzer, David. (1990). Cosmogenesis: the Growth of Order in the Universe (section: “Biological Order”, pgs. 28-32). Oxford: Oxford University Press.
4. Needham, Joseph. (1942). “Evolution and Thermodynamics: a Paradox with Social Significance” (abs), Science and Society, 352-75.
