|Peter Fleissner, Wolfgang Hofkirchner (1)
Entropy and Its Implications for Sustainability
In: Dragan, J.C., Demetrescu, M.C., Seifert, E.K. (Eds.), Implications and Applications of Bioeconomics, Proceedings of the Second International Conference of the E.A.B.S., Palma de Mallorca, 1994, Edizioni Nagard, Milano 1997, 147-155
The paper deals with the concept of entropy in different contexts. The mathematical description is the common denominator for entropy in all of these contexts: in physics entropy is a measure of the usability of energy (Clausius' macro-concept as well as Boltzmann's combined micro-macro approach); in communication theory (Shannon) entropy is a measure of the degree of surprise or novelty of a message; in biology and sociology entropy is closely connected with the concepts of order and structure. Thus, it is important to clarify how the concept used depends on the context of application. Therefore we have to analyze how the above concepts can be linked to economics and the environmental sciences. Although it is evident that all economic processes of production, distribution and consumption necessarily transform free energy into dissipated heat, it still has to be asked whether the second law of thermodynamics really restricts economic activities and thus whether Georgescu-Roegen's fourth law holds.
The paradigm of self-organization shows how the evolution of dynamic systems can be explained scientifically in spite of the law of entropy. Guided by this idea, the feasibility of a sustainable society has to be investigated.
The roots of entropy in thermodynamics
Since the times of Clausius (1822-1888) the concept of entropy has been the subject of intensive debates. These debates were to a great extent prompted by a radical change in the world view of physics. While classical mechanics as well as quantum mechanics had perceived the world as reversible (solving the basic equations towards the future or the past yields the same result), entropy focuses on the irreversibility of physical processes. It represents a quantitative measure for the tendency of a closed system towards thermodynamic equilibrium: thermal gradients are leveled out while entropy grows. Consequently one had to accept that most real processes are connected with an overall increase in entropy. Clausius' analysis of the Carnot (1796-1832) process indicated that reversibility may still hold for idealized processes, but any real activity has to be regarded as irreversible. Hence his statement of the Second Law of Thermodynamics: the entropy of a closed system can only increase, never decrease. To state it in the economists' word game: "There is no such thing as a free lunch." Any activity degrades its energy intake to a certain degree and is necessarily linked to waste heat. In the long-term perspective this means the "heat death of the universe".
Schrödinger (2) and later on Prigogine (3) and his school qualified the Second Law of Thermodynamics by stressing that its range of application covers closed systems only. They proved the opposite tendency (an entropy decrease over time) for an open system which eats up energy of low entropy and dissipates energy of higher entropy to its environment. The overall system does not violate the Second Law; nevertheless, more unlikely material structures can arise locally. They are the basis for the development of any higher forms of evolution which counteract the overall tendency of entropy increase. In addition to Schrödinger, who had formulated a more static view, Ilya Prigogine was able to present self-organizing mechanisms which definitely produce more highly ordered macro-structures. An element of uncertainty still remains which seems to be essential for evolutionary processes: while the occurrence of the macro-structure can be predicted with certainty, its precise location in space and time remains fuzzy.
The next step of the debate started with Boltzmann (1844-1906) who, by analyzing heat as a statistical phenomenon, was able to base the (macro-) concept of entropy on the possible (micro-) states of the particles in a thermodynamic system. His famous function for entropy
S = k log W,
where k = 1.38 · 10**-23 J/K is the Boltzmann constant, and W - roughly speaking - is the number of possible micro-states which correspond to one macro-state, reaches a maximum if all the micro-states are equally probable. Micro-states in this respect refer to particles which are in a particular state of energy or carry a particular momentum (mass times velocity). They are energy-related. The investigation of the consequences of the different distributions of the micro-states in the phase space is one of the main goals of statistical thermodynamics (4). Each of these states can be interpreted as being representative of a "region" in an abstract "space". By his definition of entropy Boltzmann established a link between the heat- (and thus energy-) related notion of entropy and a measure of structure of abstract micro-states.
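A minimal numerical sketch of Boltzmann's formula, assuming natural logarithms and today's SI value of k: when two independent systems are combined, their micro-state counts multiply, so their entropies add.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (present-day SI value)

def boltzmann_entropy(W: float) -> float:
    """S = k log W: entropy of a macro-state realized by W micro-states."""
    return K_B * math.log(W)

# Two independent subsystems with W1 and W2 micro-states have W1 * W2
# joint micro-states, so S(W1 * W2) = S(W1) + S(W2).
W1, W2 = 10**6, 10**9
s_joint = boltzmann_entropy(W1 * W2)
s_sum = boltzmann_entropy(W1) + boltzmann_entropy(W2)
print(s_joint)                      # a few 10**-22 J/K
print(abs(s_joint - s_sum) < 1e-30)  # True: entropy is additive
```

The logarithm is precisely what turns the multiplicative combination of micro-state counts into an additive, extensive quantity.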
Entropy becomes a measure of structure
In their famous booklet "The Mathematical Theory of Communication" Shannon and Weaver (5) exploited the second meaning of entropy by using the identical mathematical formulation to characterize the average information of a message transmitted from a source to a sink through a channel - thus giving rise to a source of confusion which has not come to an end yet. Entropy in this context reaches its maximum if each of the single pieces of information can occur with equal probability, but there is no connection to physical reality: the "micro-states" no longer represent momenta or energy levels. The only things in common are the name and the mathematical formula.
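The maximum property mentioned here can be checked with a short computation (a sketch using base-2 logarithms, so that entropy is measured in bits): a uniform distribution over four messages yields the maximal 2 bits, and any skewed distribution yields less.

```python
import math

def shannon_entropy(probs):
    """H = -sum(p_i * log2(p_i)), the average information in bits per message."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [0.25] * 4            # four equally probable messages
skewed = [0.7, 0.1, 0.1, 0.1]   # same four messages, unequal probabilities

print(shannon_entropy(uniform))  # 2.0 bits: the maximum for four messages
print(shannon_entropy(skewed) < shannon_entropy(uniform))  # True
```

Note that nothing physical enters this calculation: the probabilities may describe letters, votes or market shares equally well, which is exactly the source of the confusion the text describes.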
It may be characteristic of the contemporary situation that the choice of the term entropy was suggested by John von Neumann, who advised Claude Shannon to use this concept. The reason: the term would promote the discussion of his theory "because nobody would understand its meaning" (6).
Consequently the US physicist Jeffrey Wicken stated: "As a result of independent developments in thermodynamics and in information theory, today there exist two 'entropies' in science, this is one too many" (7).
Since then, an inflation of the concept of entropy can be found in different branches of science and social science. We shall just give a few examples: in mathematical statistics and econometrics the formula of entropy in its logarithmic shape is used to define a goal function for maximum likelihood estimators (8); in sociology it is used to measure social equality; in political science, to describe voting behavior (9); in economics, to characterize market activities (10); in the theory of neural networks it is applied as a tool for finding the global minimum instead of local minima by the method of "simulated annealing", which simulates an entropy function (11); in biology the entropy measure is used (with reversed sign) as a measure of the complexity (12) of organisms; etc.
Georgescu-Roegen has used the entropy concept to construct a Fourth Law of Thermodynamics, in which he extended the entropy concept to matter and arrived at very pessimistic conclusions (13): there is no possibility of a complete recycling of matter once it is dispersed. He states that in a system like the Earth (with nearly no exchange of matter with its environment) mechanical work cannot proceed at a constant rate forever; in other words, there is a law of increasing material entropy. This means that it is not possible to recover all the dissipated matter of, for instance, tires worn out by friction.
Of course such a statement could not pass unchallenged. The inherent contradiction between the Fourth Law and the Second Law was recently revealed (14). We will come back to Georgescu-Roegen's Law later on.
While Boltzmann's step of linking the energy-oriented measure of entropy to an energy-structural measure of micro-states was correct, many epigones applied the entropy concept to other dimensions of reality without checking whether linking the two aspects remains correct in the new field of application as well. Order of any material structure on the macroscopic level was more and more identified with negentropy, and its connection to available energy was taken for granted. Thus it is no surprise that Werner Ebeling, the "Prigogine of the East", comes to the following conclusion: "It has to be mentioned that for economic-technological processes the quantification of flows of entropy is not yet solved today" (15). This can be illustrated by a recent paper by T. H. Dung (16). In describing consumption and production he separated entropy completely from energy or heat: the term "energy" is not used even once throughout his paper, and the link to micro-states is cut, too. Entropy in Dung's context means some state of disorder of macro-structures. The question remains open why, then, he still believes in the applicability of the Second Law of Thermodynamics to his concept of entropy.
The background of the unending controversies
While it is correct that the evolution of any ordered material structure needs free energy and creates waste heat, the reverse statement is not true. It is not possible to regain the energy used during the construction of a house, although an ordered material structure was formed. On the contrary, dismantling the house once again needs free energy (to blow it up or to break it down into bricks and other parts). The order produced (falsely identified with low thermodynamic entropy) does not give rise to any amount of free energy. The same holds for the familiar Shannon entropy: there is no longer any connection to thermodynamics, but merely pure structures of signals, devoid of any material basis.
Nevertheless one link remains: if one wants to realize a physical structure which carries information (let's say 1 bit), the minimum amount of energy can be computed by means of the well-known Boltzmann entropy (17):
S(1 bit) = k log 2.
Since entropy on a thermodynamic level can be described by Clausius' formula
S = Q/T,
where Q represents a heat difference which can be expressed in energy units, and T represents temperature measured in Kelvin, one ends up with the amount of energy needed to create the smallest possible measurable difference in matter:
E(1 bit) = k T log 2.
As one can see from the formula, the lower the temperature, the lower the level of energy needed. Thermal noise has to be overcome by such an amount of energy that the micro-state of the particle to be coded remains fixed. It is even more speculative to extend the formula further to the amount of mass necessarily connected with one bit of information. One could apply Einstein's energy equivalent E = mc**2 to the formula above such that
m(1 bit) = (k T log 2)/c**2
represents the necessary mass equivalent to encode 1 bit in a material structure.
Maybe these formulae will become real restrictions for chip technologies if they shrink towards orders of magnitude of the size of elementary particles. At the contemporary level of technology they are simply not yet applicable: the current memory chips for computers, e. g. the 64 MBit chip on an area of about 1 cm**2, carry flip-flops of an average length of about 1/1000000 m = 10**-6 m. To compare this figure with atomic dimensions: the Bohr diameter of the oxygen atom is of the order of 10**-10 m.
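The magnitudes involved can be made concrete with a short computation. This sketch assumes room temperature (300 K) and present-day values of the constants; the logarithm in the formulae is the natural one.

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light in vacuum, m/s

def landauer_energy(T: float) -> float:
    """Minimum energy to fix one bit: E(1 bit) = k T log 2 at temperature T (K)."""
    return K_B * T * math.log(2)

def bit_mass_equivalent(T: float) -> float:
    """Mass equivalent of one bit, m(1 bit) = (k T log 2)/c**2, via E = mc**2."""
    return landauer_energy(T) / C**2

T_ROOM = 300.0  # K, an assumed ambient temperature
print(f"E(1 bit) at 300 K: {landauer_energy(T_ROOM):.2e} J")       # ~2.9e-21 J
print(f"m(1 bit) at 300 K: {bit_mass_equivalent(T_ROOM):.2e} kg")  # ~3.2e-38 kg
```

The resulting mass equivalent is tens of orders of magnitude below the mass of any particle used for storage today, which underlines why these formulae are not yet practical restrictions.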
Entropy and techno-economic processes
In order to describe all our actual economic activities (production and consumption of consumer goods and services and/or the production and use of means of production like machinery, constructions and intermediary goods) it seems sufficient to measure the activities by the concepts of available (free) energy and waste heat. The majority of economic processes use all the energy intake for the production of the desired commodity (a material structure) or service (a material process) and, in the end, transform the energy into waste heat. The exceptions are the production processes in food and agriculture, the conversion processes in the energy sector, and, quantitatively less important, in chemistry. In these cases the output of production can be used as an energy source again, either for consumption (where chemical energy is used to maintain a temperature gradient between the body and the environment of mammals and human beings, until finally that gradient is transformed into waste heat) or for starting new production activities. If enough energy is available, the Laws of Thermodynamics impose no direct restrictions on the production process.
In fact Georgescu-Roegen did not base his Fourth Law on theoretical grounds, but on empirical data, which weakens the persuasive power of his argument. He connected his Fourth Law to the limited availability of energy to humankind. Since humankind cannot rely on nonrenewable energy resources in the long run, the energy base has to be reconstructed towards renewable sources, namely solar energy. In the quoted article he repeats the findings of the SOLAREX Corporation that the energy input for the construction of photovoltaic cells is higher than their output over their lifetime. His main argument rests on the rather outdated information by SOLAREX, whose final report on "The Energy Requirement for the Production of Silicon Solar Arrays" stated that "the harnessed energy is insufficient for reproducing the array even if all the materials necessary are supplied gratis".
Implicitly Georgescu-Roegen expects that humankind will run short of energy and that therefore matter cannot be recycled.
More recent sources unanimously tell a different story: photovoltaic devices need 3 to 7 years to pay back the energy consumed in their production, while their lifetime is about 20 years (18); thermal collectors of solar energy have an energy pay-back period of only 2 to 5 years (the lifetime is, again, about 20 years) (19). The pay-back period depends on the amount of recycled aluminum used for the device.
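A back-of-the-envelope check of these figures makes the surplus explicit: dividing lifetime by pay-back period gives the factor by which a device returns its embodied energy (a simple illustrative calculation using only the numbers quoted above).

```python
def energy_return_factor(lifetime_years: float, payback_years: float) -> float:
    """How many times a device returns its embodied energy over its lifetime."""
    return lifetime_years / payback_years

# Figures from the text: photovoltaics pay back in 3-7 years,
# thermal collectors in 2-5 years; both last about 20 years.
print(energy_return_factor(20, 7))  # worst-case photovoltaics: ~2.9
print(energy_return_factor(20, 2))  # best-case thermal collector: 10.0
```

Even in the worst quoted case the factor is well above 1, i.e. every device delivers a net energy surplus, contradicting the SOLAREX result.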
What is true for the energy surplus seems to be true for the economic surplus as well. Recently the US company United Solar Systems, Virginia, announced a breakthrough in the production of photovoltaics. While the energy produced by earlier technologies cost approx. 50 cents per kilowatt-hour, the recent technology brings costs down to less than 15 cents. Nicholas Lenssen, energy expert of the Worldwatch Institute in Washington, D. C.: "For the fabrication of these photovoltaic cells by a kind of sputtering the metal to glass, fewer resources are needed and less waste is produced ... The efficiency of these cells will probably be further improved ... Prices will decline and there will be more funds for research." (20)
The photovoltaic use of solar energy is not the only option. Side by side we find the solar-chemical and the solar-thermal options. Solar-chemical processes would either allow for a hydrogen-based economy which exploits the photolytic effect (21) (breaking up water molecules into hydrogen and oxygen by photons, in analogy to electrolysis), or biomass production as a renewable energy resource. It can be shown that a mere 5% of the biomass would be sufficient to satisfy the world's energy demand; at the moment about 1.5% of the biomass is used for human and animal food, and 2% is consumed as fibers or wood. While the photolytic effect does not yet have any applications in our economy, the solar-thermal option locally plays an increasing role. Particularly for low temperatures (central heating and hot water) and for medium temperatures (tower concepts based on light-concentrating mirrors, e. g. in Francia, Italy, or in Barstow, California, or Ocean Thermal Energy Conversion = OTEC) practical solutions are available (22). All the solar options show an additional advantage: they will not increase carbon dioxide in the atmosphere.
From the above the following understanding can be derived (23):
Running out of free energy is no actual threat to the economy
Such empirical findings break down Georgescu-Roegen's Fourth Law as well as the pessimistic mood about the use of available energy. As far as we can see, production on Earth, now and in the foreseeable future, is not restricted by the Laws of Thermodynamics. Nevertheless difficulties may arise for other reasons, for example by changing the major feedback loops of our biosphere or by poisoning the atmosphere, but there is no limit with respect to thermodynamic entropy.
Methodologically speaking, the manifold laws determining the scope of human action compose a hierarchy of encapsulated cones of possible developments, the cosmic evolution being a chain of intertwined cycles of ever more sophisticated, qualitatively different self-organizing systems. The laws of physics apply to the entire universe and cannot be disregarded. They define the ultimate boundaries that cannot be transcended by any feasible development. However, they may be - and have been - prerequisites for realizing possibilities of further developments. These new developments constitute only a fraction of the range of physically possible developments. The phenomena belonging to the new cone are subject to laws that are specific to these phenomena exclusively and are not applicable to phenomena outside of this cone. Though these phenomena are restricted to the existence of well-defined conditions essential for this segment they increase the variety in which reality appears.
Biotic phenomena are governed by biotic laws that are far more specific to them than the laws ruling both inorganic and living matter. Societal developments, in turn, are made possible by biotic laws, but not fully determined by them. They are subordinate to more specific laws whose range of validity strictly covers the cone of cultural phenomena. So the boundaries of each cone are far more restrictive than those of the cones in which it is embedded. The more general the laws are, the less specific they are in explaining any particular phenomenon.
The laws concerning the degradation of energy in a physical sense are applicable to every open and dynamic system with regard to its physical aspects. But they do not determine the specific way in which a dissipative structure, a living system or a human society obeys them. If the law of entropy holds for the universe as a whole, it is by far the most distant boundary mankind will ever come close to.
There are urgent challenges in much greater proximity. The human race has not yet learned to control its interference in the biosphere, thereby causing a lot of environmental problems: a poisoned atmosphere, changes in the natural cycles of matter and energy caused by man-made substances, the possible depletion of the ozone layer, the dying of the forests, the greenhouse effect, land degradation, reduced biodiversity and so forth. Most of these problems arise from the fact that humans overtax the buffer capacity of given natural global cycles and feedback loops beyond which they threaten to stop working. These problems are in no way entropic, but bio-geo-chemical. Biotic laws give necessary conditions to be fulfilled if metabolism is to take place, but they do not define in particular how it has to be done and which strategy the organisms will pursue. Likewise there are no definite ways in which mankind can meet the challenges; instead there is a variety of ways to solve environmental problems, depending on the state of historically accumulated knowledge. The solution of environmental problems therefore cannot be reduced to physical or biological considerations - it remains a societal, that is mainly an economic and political, task. The question arises how production can be designed so that it is in harmony with our natural environment - and this is a political question.
So we arrive at the understanding that, in principle, plenty of usable energy is available in our environment. There is no need to believe that humankind will come to an end because of the Laws of Thermodynamics. All these effects point to the fact that it is our task to restructure our economy towards a sustainable one. The question remains whether the social decision process can be directed towards making energy resources available to everybody all over the world.
1 c/o Institut für Gestaltungs- und Wirkungsforschung, Department for Social Cybernetics, Vienna University of Technology, Möllwaldplatz 5, A-1040 Vienna, Austria
6 my translation, for the original see Sietmann, R., Schöpfer der Kommunikationstheorie - C. Shannons Schrift sorgt auch 50 Jahre nach der Veröffentlichung für Diskussionen, in: VDI-Nachrichten 45, Nr. 27, VDI-Verlag, Düsseldorf 1991. We have learned about this nice piece of information at Prof. Hans-Joachim Dubrau's lecture, TU Dresden, at the Workshop "Der Informationsbegriff aus interdisziplinärer Sicht", March 1st - 4th, 1994, at the Cottbus Technical University, Cottbus, FRG.
13 Georgescu-Roegen, N., Thermodynamics and We, the Humans, in: Dragan, J. C., E. K. Seifert, M. C. Demetrescu (eds), Entropy and Bioeconomics, Proceedings of the First International Conference of the E.A.B.S., Nagard, Milano 1993: 184-201.
15 "Es muß jedoch darauf hingewiesen werden, daß das Problem einer Quantifizierung von Entropieflüssen für ökonomisch-technische Prozesse heute noch nicht gelöst ist". Source: Ebeling, W., Das Neue in der natürlichen und in der technischen Evolution, in: Parthey, H. (Hg.), Das Neue - Seine Entstehung und Aufnahme in Natur und Gesellschaft, Akademie Verlag Berlin 1990: 19-44, 38.
17 In this case the concept of entropy, which is useful for a large number of particles, has been extended to its limit: only one particle is considered. The question arises whether this application of a statistical concept to a system of only one particle with two states (yes or no) does not create the same problems as are well known from quantum mechanics at the level of a single particle (a photon crossing a double slit etc.).
23 for similar conclusions see Stephan, G., Entropie, Umweltschutz und Rohstoffverbrauch: Ein thermodynamischer Ansatz in der Umweltökonomik, in: M. Hauff, U. Schmid (eds), Ökonomie und Ökologie, Schäffer-Poeschel Verlag, Stuttgart 1992: 275-292.