Edited by Emanuel Lottem
Copyright 2013 Kafri Nihul VeHashkaot Ltd.
All rights reserved.
ISBN-13: 978-1482687699
eBook ISBN: 978-1-63003-433-7
Library of Congress Control Number (LCCN): 201390681
Life is much more successfully looked at from a single window after all.
-F. Scott Fitzgerald, The Great Gatsby
Acknowledgements
This book is dedicated to the late Ofra Meir, our sister and sister-in-law, who took part in numerous discussions before the actual writing of this book began, and whose untimely death came before the Hebrew manuscript could be edited.
We would like to thank the editor of this book, the indefatigable, intense, and knowledgeable Emanuel Lottem, who left no stone unturned: he fixed, polished, checked, and even added examples of his own. He belongs to a rare breed in danger of extinction.
One of us (O. K.) would like to thank his colleagues for discussions in which much was learned before this book was written: Rafi Levine, Joseph Agassi, Yehuda Band, Yariv Kafri, and especially Joseph Oreg, who even corrected a derivation.
We would like to thank those who read the original manuscript and made significant suggestions: Alina Tal, Joshua Maor, Daniel Rosenne, Noa Meir-Leverstein, Rachel Alexander, Zvi Aloni, Moshe Pereg, Galit Kafri, and Ehud Kafri. Special thanks to Dan Weiss for the numerical calculations and figures.
The physicist Arnold Sommerfeld (1868–1951), one of the founding fathers of atomic theory, said this about thermodynamics, the branch of physics that incorporates the concept of entropy:
Thermodynamics is a funny subject. The first time you go through it, you don't understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don't understand it, but by that time you are so used to it, it doesn't bother you anymore.
This is true of many more things than we would like to imagine, but it is especially true of our subject matter, namely entropy.
Entropy is a physical quantity, yet it is different from any other quantity in nature. It is defined only for systems in a state of equilibrium, and it tends to increase: in fact, entropy's tendency to increase is the source of all change in our universe.
Since the concept of entropy was first discovered during a study of the efficiency of heat engines, the law that defines entropy and its properties is called the second law of thermodynamics. (This law, as its name implies, is part of the science called thermodynamics, which deals with energy flows.) Despite its nondescript name, however, the second law is actually something of a super-law, governing nature as it does.
It is generally accepted that thermodynamics has four laws, designated by numbers, though the numbers reflect neither their order of importance nor the order of their discovery.
- The first law, also known as the law of conservation of energy, states that the energy in an isolated system remains constant.
- The second law is the one with which this book is concerned, and we shall soon go back to discuss it.
- The third law states that at the temperature of −273°C, an object is devoid of all energy (meaning that it is impossible to cool anything to a lower temperature; see the note after this list).
- There is another law, called the zeroth law, which deals with the meaning of thermal equilibrium between bodies.
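A note on the temperature quoted in the third law: it is absolute zero, the bottom of the Kelvin (absolute) scale. The Celsius and Kelvin scales differ only by a fixed offset, so the conversion is simple arithmetic:

\[ T_{\text{K}} = T_{^\circ\text{C}} + 273.15, \qquad \text{hence } -273.15\,^\circ\text{C} = 0\ \text{K}. \]

The figure of −273°C in the list above is this value rounded to the nearest degree.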
The second law of thermodynamics concerns the inherent uncertainty in nature, which is independent of forces. Because the nature of particles and the nature of forces are not intrinsically part of the second law, we may conclude that any calculation based on this law in one area of physics will apply to other areas as well.
The main quantity behind the second law of thermodynamics is entropy, which is a measure of uncertainty. The forces deployed by entropy are not as strictly determined as those described by other laws of nature; they are expressed as a tendency of changes to occur, in a manner somewhat analogous to human will. In order for entropy to increase, energy must flow; and any flow of energy that leads to change increases entropy. This means that any action in nature, whether natural or man-made, increases entropy. In other words, entropy's tendency to increase is the tendency of nature, including us humans, to make energy flow.
The importance of entropy is immeasurable, yet it is a safe bet that the average educated person, doubtless familiar with such terms as relativity, gravity, evolution, and other scientific concepts, may never have heard of it, or else misunderstands it. Even those familiar with the concept of entropy admit that while they may understand its mathematical properties, they cannot always comprehend its meaning. So what, then, is entropy? We hope that after reading this book you will understand what it is, why it is significant, and how it is reflected in everything around us.
This book has two main sections:
The first section deals with the historical development of the second law of thermodynamics, from its beginnings in the early 19th century to the first decades of the 20th. Here, the physical aspects of the concept of entropy will be discussed, along with the energy distributions that arise from entropy's propensity to increase.
The second section deals with the effects of entropy in communications, computers, and logic (studied since the beginning of the 1940s), along with the influence entropy has had on various social phenomena.
The first section is concerned with thermodynamics, which is, in essence, the study of energy flows. Every physical entity, whatever its nature, ultimately involves energy. We consume energy in order to live and we release energy when we die. Our thoughts are energy flowing through our neurons. We expend energy to communicate and consume energy to keep warm. Since energy cannot be created (or destroyed), it has to come, or flow, from elsewhere. Why does energy flow, and what principles guide it? The reason is this: there is a theoretical quantity, not directly measurable, called entropy, and it tends to increase. In order for it to increase, energy must flow.
The first person to quantify this flow of energy, in 1824, was a young French aristocrat and engineer named Sadi Carnot. He understood that in order to obtain work (for example, to apply a force that moves an object from one place to another), heat must flow from a hot area to a colder one. He calculated the maximum amount of work that can be obtained by transferring a given amount of energy between objects at two different temperatures, and thus laid the foundations of thermodynamics.
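Carnot's bound can be stated compactly in modern notation (which uses the absolute temperature scale, introduced only later): of the heat \(Q_{\text{hot}}\) drawn from a reservoir at absolute temperature \(T_{\text{hot}}\), the largest fraction that any engine can convert into work, while discharging the rest to a colder reservoir at \(T_{\text{cold}}\), is

\[ \eta_{\max} = \frac{W_{\max}}{Q_{\text{hot}}} = 1 - \frac{T_{\text{cold}}}{T_{\text{hot}}}. \]

For example, an engine working between 500 K and 300 K can turn at most \(1 - 300/500 = 40\%\) of its intake heat into work, however ingeniously it is built.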
About forty years later, in 1865, the physicist Rudolf Clausius, a fanatical Prussian nationalist, formulated the laws of thermodynamics and defined a mysterious quantity that he named entropy. Entropy, as Clausius defined it, is the ratio between energy and temperature.
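In modern textbook form, Clausius's definition concerns the change in entropy: when a small quantity of heat \(\delta Q\) flows reversibly into a body at absolute temperature \(T\), the body's entropy \(S\) changes by

\[ dS = \frac{\delta Q_{\text{rev}}}{T}, \]

which is precisely the ratio between energy and temperature mentioned above. In this language, the second law states that the total entropy of an isolated system can never decrease: \( \Delta S \ge 0 \).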
About twelve years later, in 1877, an eccentric Austrian, Ludwig Boltzmann, and an unassuming American, Josiah Willard Gibbs, concurrently but independently derived an equation for entropy. They described entropy as the lack of information associated with a statistical system. At that point, the second law acquired an unexpected significance: it was now understood that uncertainty in nature has a tendency to increase.
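Both formulations are worth quoting. Boltzmann tied the entropy of a macroscopic state to the number \(W\) of microscopic arrangements consistent with it, while Gibbs expressed the same idea through the probabilities \(p_i\) of the individual microstates:

\[ S = k_B \ln W, \qquad S = -k_B \sum_i p_i \ln p_i, \]

where \(k_B\) is Boltzmann's constant. The two agree when all \(W\) arrangements are equally likely (\(p_i = 1/W\)); in either form, the more arrangements are possible, and hence the less we know about which one is actually realized, the larger the entropy.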
At approximately the same time, the Scottish estate-owner James Clerk Maxwell, an illustrious figure in the history of modern science, comparable only to Galileo, Newton, or Einstein, calculated the energy distribution in gases and demonstrated how energy is distributed among the gas particles in accordance with the second law.
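Maxwell's result, later generalized by Boltzmann, can be summarized as follows: in a gas in equilibrium at absolute temperature \(T\), the probability of finding a particle in a state of energy \(E\) falls off exponentially,

\[ p(E) \propto e^{-E/k_B T}, \]

so that low energies are common and high energies are exponentially rare. This exponential weighting, known as the Boltzmann factor, is the equilibrium distribution singled out by the second law.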