In computer science, a bit is one binary digit, i.e. a 0 or a 1, in Boolean algebra. In a physical sense, a bit is typically represented by a low or high voltage, respectively, stored in a storage element such as a flip-flop circuit or a relay and read out using logic gates. [1]
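The two states of a bit map directly onto the binary representation of integers. As a minimal illustration (plain Python, not tied to any particular hardware), the individual bits of a number can be read out with shift and mask operations:

```python
def bits_of(n: int, width: int = 8) -> list[int]:
    """Return the bits of n, most significant first, as 0/1 integers."""
    return [(n >> i) & 1 for i in range(width - 1, -1, -1)]

# The decimal number 6 is 110 in binary: high, high, low voltage.
print(bits_of(6, width=3))  # -> [1, 1, 0]
```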
Etymology
The term "bit", a shortened form of "binary digit", was introduced in 1948 by American electrical engineer Claude Shannon, who adopted a suggestion made earlier by American mathematician John Tukey. [4] Shannon later estimated the "entropy" of written language, defined via a mixture of statistical mechanics and information transmission arguments, to be 0.6 to 1.3 bits per character.
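Shannon's per-character figure can be illustrated with a simpler quantity: the zeroth-order entropy H = −Σ p·log₂ p of single-character frequencies. The sketch below is an illustration only, not Shannon's actual method, which modeled correlations between characters and thereby arrived at the lower 0.6 to 1.3 bits per character range:

```python
from collections import Counter
from math import log2

def zeroth_order_entropy(text: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)) over character frequencies in text."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Single-character frequencies of English text give roughly 4 bits per
# character; Shannon's lower estimate comes from longer-range structure.
sample = "the quick brown fox jumps over the lazy dog"
print(round(zeroth_order_entropy(sample), 2))
```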
Zeilinger’s principle
A related term is Zeilinger's principle, which states that any elementary system carries just one bit of information. This principle was put forward by Austrian physicist Anton Zeilinger in 1999 and subsequently developed by him to derive several aspects of quantum mechanics. Some have argued that this principle links thermodynamics with information theory. [2]
Cessation thermodynamics
In 1998, in the context of cessation thermodynamics, American physician Gerry Nahum presented a derivation aiming to quantify consciousness energetically, in which he stated: [3]
“The change in heat that has to be liberated per bit of information lost is about three times ten to the minus-twenty-one joules.”

or:

ΔQ ≈ 3 × 10⁻²¹ J per bit
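Nahum's figure matches the order of magnitude of the Landauer limit, kT·ln 2, the minimum heat dissipated per bit of information erased at temperature T. A quick check at body temperature (T ≈ 310 K is an assumption here; the quoted passage does not specify a temperature):

```python
from math import log

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T_BODY = 310.0              # approximate human body temperature, K (assumed)

# Landauer limit: minimum heat released per bit of information erased.
heat_per_bit = K_BOLTZMANN * T_BODY * log(2)
print(f"{heat_per_bit:.2e} J per bit")  # ~3e-21 J, matching Nahum's figure
```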