The measure of "quality or grade of energy" (L. BRILLOUIN, 1962, p.116).
Negentropy is, of course, the opposite of entropy, and represents the availability of useful energy.
It is also the opposite of entropy taken as statistical uniformity (so-called disorder). Negentropy can thus be interpreted as a measure of order. This led BRILLOUIN to connect the concept of negentropy with the existence of information (see information).
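This statistical reading can be given a minimal numerical sketch: if negentropy is taken as departure from uniformity, one may write J = H_max − H, where H is the Shannon entropy of a distribution over n states and H_max = log₂ n is the entropy of the uniform distribution. The function names below are illustrative, not BRILLOUIN's notation:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    """Statistical negentropy J = H_max - H: the distribution's
    departure from uniformity, i.e. its 'order'."""
    h_max = math.log2(len(probs))  # entropy of the uniform distribution
    return h_max - shannon_entropy(probs)

uniform = [0.25, 0.25, 0.25, 0.25]   # maximal statistical "disorder"
ordered = [0.97, 0.01, 0.01, 0.01]   # highly concentrated, "ordered"
print(negentropy(uniform))  # 0.0 -- no order beyond chance
print(negentropy(ordered))  # positive -- measurable order
```

A uniform distribution yields zero negentropy in this sense; the more concentrated the distribution, the larger J becomes, up to H_max for a certain outcome.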
BRILLOUIN writes, for example: "Whether he be a Maxwell's Demon or a physicist, the observer can obtain a bit of information only when a certain quantity of negentropy is lost. Information gain means an increase of entropy in the laboratory. Vice-versa, with the help of this information the observer may be in a position to make an appropriate decision, which decreases the entropy of the laboratory. In other words, he may use the information to recuperate part of the loss in negentropy, but the over-all balance is always an increase in entropy (a loss in negentropy)… This leads to the conclusion that both quantities are of similar nature" (1968, p.161).
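BRILLOUIN's balance is quantitative: acquiring one bit of information costs at least k ln 2 of thermodynamic entropy, where k is Boltzmann's constant. A small sketch of this lower bound (the function name is ours):

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)

def min_entropy_cost(bits):
    """Minimum entropy increase of the laboratory, in J/K, required
    to acquire `bits` of information: Delta S >= bits * k * ln 2."""
    return bits * K_BOLTZMANN * math.log(2)

print(min_entropy_cost(1))  # on the order of 9.57e-24 J/K per bit
```

The smallness of k ln 2 explains why this cost is negligible in everyday observation, yet it guarantees that the overall balance, as BRILLOUIN insists, is always a loss in negentropy.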
The subject is, however, as BRILLOUIN himself emphasizes, a thorny one: negentropy cannot simply be equated with information. For a thorough discussion, see the works of BRILLOUIN and of SHANNON & WEAVER.
K. KRIPPENDORFF evaluates the term as "a non-recommendable near synonym for information. (It) has created considerable confusion suggesting that information processes negate the 2d. law of thermodynamics by producing order from chaos" (in view of the recent development of "chaos theory", KRIPPENDORFF would now possibly write "disorder"). He proposes as a "meaningful interpretation of negentropy… that it measures the complexity of a physical structure in which quantities of energy are invested, e.g. buildings, technical devices, organisms, but also nuclear reactor fuel, the infrastructure of a society" (1986, p.52).
From a more specific systemic viewpoint, V.G. DROZIN wrote: "… negentropy is a measure of complexity of a system expressed in the number of its parts or subsystems and in their variety" (1975, p.9). This seems somewhat insufficient: constraints on interrelations are also aspects of negentropy taken in this sense.
To cite this page, please use the following information:
Bertalanffy Center for the Study of Systems Science (2020). Title of the entry. In Charles François (Ed.), International Encyclopedia of Systems and Cybernetics (2). Retrieved from www.systemspedia.org/[full/url]