BARBELITH underground
 



Entropy

 
 
Private Frost
22:55 / 20.04.04
I'm sure this question has been knocked around to the point of being taboo, but I have to ask: what is entropy? It seems to pop up occasionally in philosophical conversations and debates, and I've gone through my life hearing this word without ever really understanding what it means. It's gotten to the point where I just "smile and nod", because I question the relevance of the term (as used by party A in an argument or statement to party B) but don't have enough concrete understanding to point it out.

Anyone care to take a shot at a good explanation?
 
 
Smoothly
00:28 / 21.04.04
Usually, it refers to the level of disorder in a closed system.
See here or here.
 
 
Tom Coates
15:09 / 21.04.04
To be honest, it's more of a scientific principle than a philosophical one, and the scientists among us would probably be better able to give you a detailed exposition of it.
 
 
Private Frost
15:26 / 21.04.04
Right-O. I've read the dictionary definitions (thanks for that google link - I'd never used "define: xxx" before).

I guess a lot of people have different viewpoints on it. I asked about six people in my workplace what they thought entropy was. Two of them said they didn't know. One said "the tendency for something to return to its natural state". One said "the tendency of a system to return to maximum randomness". One said something about atoms moving apart from each other. Another pointed me to webster.com.

The first one really bothered me. He gave an example:

"If the building we are standing in right now were to be left alone, and the plants and vines were to grow through it and turn it into rubble, that would be entropy. The place where we are standing returning to its natural state."

That just seems like a bunch of crud to me. What exactly is a natural state? How could it be "returning" to anything when everything about it has changed?

He said I was being contrary. *shrug*

So I guess another way to phrase my question might be, "Can anyone give me a good example of entropy?"
 
 
sdv (non-human)
15:49 / 21.04.04
Order Out of Chaos by Ilya Prigogine & Isabelle Stengers (1984) is by far the most interesting text around on this, but the material on the Web simply doesn't do it justice... It is remarkably non-technical (few equations) but will explain how entropy enters into non-linear chaotic systems as well... a classic, I think.
 
 
sdv (non-human)
15:52 / 21.04.04
in case you don't believe me about the web - look up entropy+stengers on google... (shudder...)
 
 
Scrubb is on a downward spiral
18:59 / 21.04.04
Okily-dokily.

As Tom said, entropy is a scientific principle. There are two main definitions of entropy that I'll try to explain:

1. Thermodynamic entropy: This ties in to the 2nd law of thermodynamics, a principle which says that a physical process will not finish with as much available energy as there was to start with.
eg. You cannot create a heat engine which extracts heat and converts all of it to useful work.
In this case, entropy is a measure of the unavailable energy and can be expressed in physical units (eg. joules per kelvin); according to the second law, it always increases with each physical process.

2. Logical entropy: In this instance, entropy is used to mean chaos or disorder - "a measure of disorder or randomness in a closed system". This explains why, in a closed system, two gases will always mix even though no heat may be exchanged. Because thermodynamic units do not pertain to it, it is known as logical entropy.

On a practical, basic level, entropy is why a house will deteriorate without maintenance; why a broken glass never gets fixed if there's no-one to mend it; why those Dorito crumbs will stay on your jeans unless you brush them off.

If sciencey lingo makes your brain hurt, there are two pieces of literature/theatre that deal with and explain entropy in a joycore way: "The Last Question" by Isaac Asimov (short story) and "Arcadia" by Tom Stoppard (play).
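To put rough numbers on both definitions, here's a minimal Python sketch; the temperatures, heat, and amounts are invented purely for illustration:

    import math

    # 1. Thermodynamic entropy: let Q = 1000 J of heat flow from a hot
    #    body (400 K) to a cold one (300 K). Each body's entropy
    #    changes by +/- Q/T.
    Q, T_hot, T_cold = 1000.0, 400.0, 300.0
    dS_hot = -Q / T_hot                # hot body loses entropy: -2.5 J/K
    dS_cold = Q / T_cold               # cold body gains entropy: +3.33 J/K
    print(dS_hot + dS_cold)            # net change ~ +0.83 J/K

    # 2. Logical/mixing entropy: mix 1 mol each of two ideal gases.
    #    For mole fractions x_i, dS_mix = -n * R * sum(x_i * ln x_i).
    R = 8.314                          # gas constant, J/(mol K)
    n, x1, x2 = 2.0, 0.5, 0.5
    dS_mix = -n * R * (x1 * math.log(x1) + x2 * math.log(x2))
    print(dS_mix)                      # ~ +11.5 J/K

Both changes come out positive, which is the second law doing its thing.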
 
 
Cheap. Easy. Cruel.
20:05 / 21.04.04
I don't have much to add to the above posts.

Entropy is the tendency of a system or object to deteriorate over time without intervention to the contrary.
 
 
Smoothly
23:36 / 21.04.04
according to the second law, entropy always increases with each physical process.

What about the process of life? That exists in a closed system, but isn't it an exception to this rule?
If you plant an apple seed, it will pull atoms and energy from disorder into order. Disparate carbon atoms form complex carbon molecules. Thermal energy from the sun is turned into chemical energy in the form of complex carbohydrates.
Does life run contrary to the second law of thermodynamics?
 
 
Cloned Christ on a HoverDonkey
03:08 / 22.04.04
I think when you take into consideration the heat dissipated by the plant/animal and the disordered state of its effluent (solid, liquid and gaseous), the entropy will be found to have increased in the closed system (i.e. the bounds of the atmosphere).

As far as thermodynamic entropy is concerned, ultimately this will lead to all energy in the Universe becoming useless, untappable energy (the kind that cannot be converted to a more useful form). This state is known as the heat death of the Universe: everything at the same temperature, only fundamental particles, and therefore no chemical reactions available.

Quote from here, which explains heat death and also gives a pretty good illustration of how entropy can only ever increase:

The heat death of the universe will only occur if the universe lasts for an infinite amount of time (i.e. there will be no big crunch).

It will occur because, according to the second law of thermodynamics, the amount of entropy in a system must always increase. The amount of entropy in a system is a measure of how disordered the system is - the higher the entropy, the more disordered it is.

It is sometimes easier to imagine if you think of an experiment on earth. A chemical reaction will only occur if it results in an increase of entropy. Let us imagine burning petrol. We start off with a liquid that contains atoms arranged in long chains - fairly ordered. When we burn it, we create a lot of heat, as well as water vapour and carbon dioxide. Both of these are small gaseous molecules, so the disorder of the atoms has increased, and the temperature of the surroundings has also increased.

Now let's think about what this means for the universe. Any reaction that takes place will either result in the products becoming less ordered, or heat being given off. This means at some time far in the future, when all the possible reactions have taken place, all that will be left is heat (i.e. electromagnetic radiation) and fundamental particles. No reactions will be possible, because the universe will have reached its maximum entropy. The only reactions that could take place would result in a decrease of entropy, which is not possible, so in effect the universe will have died.


Quite a bleak outlook, really.
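If you want ballpark numbers on the burning-petrol example, here's a back-of-envelope Python sketch. It uses octane as a stand-in for petrol and approximate textbook standard molar entropies; none of these figures come from the quoted page:

    # Standard molar entropies at ~298 K, in J/(mol K) - approximate values.
    S = {"octane(l)": 361.0, "O2": 205.2, "CO2": 213.8, "H2O(g)": 188.8}

    # C8H18 + 12.5 O2 -> 8 CO2 + 9 H2O
    dS_reaction = (8 * S["CO2"] + 9 * S["H2O(g)"]
                   - S["octane(l)"] - 12.5 * S["O2"])
    print(dS_reaction)   # ~ +480 J/K per mole burned: more gas, more disorder

    # The roughly 5e6 J of heat released per mole also raises the entropy
    # of the surroundings by about Q/T at room temperature:
    print(5e6 / 298)     # ~ +17,000 J/K on top of that

So both effects the quote mentions - products becoming less ordered, and heat being given off - push entropy the same way.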
 
 
Gyan
20:27 / 22.04.04
Entropy is quite a subjective concept.

Order or disorder has to be perceived by some chain of apparatus, of which the last rung is always one's brain. Who's to say that you can't superimpose a hitherto unknown organizational schema onto a "high-entropy" system?
 
 
Perfect Tommy
06:02 / 24.04.04
[Life] exists in a closed system, but isn't it an exception to this rule?

When we're talking life on earth, we have the constant energy infusion from the sun. When the sun burns out, disorder on earth will increase once all the fuel and food is used up. (I mean, it would except that by that time the sun will have turned into a red giant and consumed the earth. But, y'know.)
 
 
SiliconDream
05:26 / 26.04.04
Order or disorder has to be perceived by some chain of apparatus, of which the last rung is always one's brain. Who's to say that you can't superimpose a hitherto unknown organizational schema onto a "high-entropy" system?

Scientific definitions of entropy don't have that much to do with ordinary notions of order and disorder. The entropy of a system in a given macrostate is a function of the number of possible microstates that system could be in, not of the subjective amount of "order" the macrostate represents.

I think, though, that we tend to view macrostates as well-ordered if there exists a brief but complete description of them--for instance, a crystal could be described as "an arrangement of atoms such that every position satisfying the following periodic equations is occupied by an atom". Such states tend to have relatively few microstates, since if a brief description is to be completely accurate, it doesn't allow for much subtle variation.

But, in short, if you decide to consider a high-entropy macrostate as "ordered", it doesn't affect the entropy calculation at all.
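To make that concrete, here's a minimal Python sketch of the statistical definition being described here (Boltzmann's S = k log W); the microstate counts are toy numbers:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K

    def entropy(W):
        # Entropy of a macrostate with W microstates: S = k_B * ln(W).
        return k_B * math.log(W)

    # A "crystal-like" macrostate with few microstates vs a "gas-like"
    # one with many. Whether you choose to *call* either state ordered
    # never enters the calculation - only W does.
    print(entropy(10))      # low entropy
    print(entropy(10**6))   # higher entropy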
 
 
tom-karika nukes it from orbit
07:19 / 26.04.04
What you are saying about microstates is the underlying theoretical basis of entropy: the number of ways of arranging the base components of a system.

For a gas, there are many ways of rearranging the atoms. A gas is highly disordered.

For a crystal, the atoms can be arranged in relatively few ways. The crystal is less disordered.

Counting the number of ways in which the components can be arranged is just a way of 'counting' disorder.

A human being will conceive of order and disorder in slightly different ways to this 'counting' method. Often it correlates (a person will usually say that a gas is more disordered than a crystal, for instance). But disorder is rigorously defined in scientific terms, and the human brain has little to do with it.
 
 
Gyan
00:37 / 27.04.04
SiliconDream: Your reply actually corroborates my notion:

Scientific definitions of entropy don't have that much to do with ordinary notions of order and disorder. The entropy of a system in a given macrostate is a function of the number of possible microstates that system could be in, not of the subjective amount of "order" the macrostate represents.

And who or what decides what's possible?
 
 
tom-karika nukes it from orbit
08:20 / 27.04.04
Ummm... well surely what's possible is just determined by simple laws of physics. You can't have two objects in the same place.

So if there are ten atoms and ten places, there is one way to arrange them all. A high-order system. If there are ten atoms and twenty places, there are... lots... of ways to arrange them all, and so a low-order system.
I don't think the human mind comes in to that.
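Putting a number on "lots" - assuming the atoms are indistinguishable, so an arrangement is just a choice of which places are occupied, the count is "places choose atoms":

    from math import comb

    print(comb(10, 10))   # 10 atoms, 10 places: 1 arrangement
    print(comb(20, 10))   # 10 atoms, 20 places: 184,756 arrangements

Take the logarithm of those counts and you have the entropy, up to Boltzmann's constant.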
 
 
Gyan
15:46 / 27.04.04
tom-karika: That's where the unknown schemas come into play. Let's say there is a method of arranging the elements that you currently don't know about. Then your calculation of entropy doesn't factor in that method as a *possible* microstate. Generalising this, you can't ever know all of what you currently don't know. Since entropy is a formal statement of "order/disorder" within a system and relies on your knowledge and conceptions of possibility, it's necessarily a subjective concept.
 
 
Lurid Archive
17:29 / 27.04.04
But in order to seriously maintain your point, Gyan, you have to ignore the last 300 years of physics. Of course there are things we don't know, but there are two distinct senses in which this applies to entropy. First, some unknown force may be acting to order a system. That means your system is not closed and the second law is not violated. This can usefully identify forces and sources of energy.

Second, the concept of entropy may be invalid because matter arranges itself in some mysterious way that we have no experience of. Like planes turning into brie. This is possible, but implausible. It is also impossible to embrace as an organising principle of knowledge. Moreover, if you count knowledge that *only* relies on a wealth of previous experience as "subjective", then all you are really saying is that you think everything is subjective. Which is fine, of course, but then for the sake of clarity you should make that explicit.
 
 
Gyan
17:47 / 27.04.04
Lurid Archive: Moreover, if you count knowledge that *only* relies on a wealth of previous experience as "subjective", then all you are really saying is that you think everything is subjective.

Indeed.

The reason I adopt this tack is that the entropy of the universe is said to be directional. Which seems a reckless statement to make, given the subjectivity.
 
 
Lurid Archive
18:09 / 27.04.04
Do you think that all statements, perhaps all scientific statements, are "reckless"? Because it sounds like you are using a cheap rhetorical trick: namely, taking a universal position and, with scant regard for specifics, applying it in a particular instance. The second law of thermodynamics is as well supported as any science - it is provisional and subject to revision, and its utility is completely misrepresented by a charge of subjectivity (whose meaning you are employing against common usage).
 
 
SiliconDream
20:00 / 27.04.04
Gyan,

It's true that we could be "wrong"--or at least disagree with one another--about how to identify and arrange microstates. I can think of three possible ways; none of these really poses a problem in terms of "subjectivity", though.

1) We could choose to classify microstates differently, by defining our macrostates with reference to a new set of properties. For instance, we could define the macrostates of a gas not with reference to temperature and pressure, but with reference to the first n Fourier components of the density and velocity distributions.

This would give us a new set of macrostates, and entropic considerations would give us a new set of predictions about the system's thermodynamics. These would not contradict our old predictions, because those were made about a differently-defined set of macrostates. In fact, both sets of predictions together give us more information about the system than either set alone. So "imposing a new organizational schema" in this sense simply lets us work the thermodynamic equations for all they're worth.


2) There could be more microstates than we realize, because of some unknown degree of freedom (e.g. "spin" or "color") which is independent of the known ones. For instance, in a free-electron gas we might be defining microstates by electron position and velocity (or their quantum equivalents), but not by spin.

This would not affect entropic considerations, because the entropy of a macrostate is a function of the logarithm of the number of microstates. An independent degree of freedom would simply multiply the number of microstates in each macrostate by a constant--which would raise the entropy of each macrostate by a constant. Since the system's behavior only depends on the entropy of one macrostate relative to another, our predictions would not change.
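A toy numerical check of that invariance (the microstate counts are invented; entropy is in units of Boltzmann's constant):

    import math

    W_A, W_B = 1000, 5000    # microstate counts of two macrostates
    dS_before = math.log(W_B) - math.log(W_A)
    # An unnoticed spin degree of freedom doubles *every* count:
    dS_after = math.log(2 * W_B) - math.log(2 * W_A)
    print(dS_before, dS_after)   # identical - the log(2) terms cancel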


3) There could be an unknown degree of freedom which is not independent of the known ones--or, equivalently, a degree of freedom we believe in might not actually exist. This would change the number of microstates per macrostate by a non-constant factor, so that the relative entropies of macrostates would be different.

This would change our predictions. And that gives us a way to test them! If a system evolves in a way such that entropy seems to decrease, our assumptions about its possible microstates must be wrong. Experiments on this issue can advance our understanding of a system by suggesting new degrees of freedom, or new ways that known degrees of freedom might correlate. In this case, observed experimental results will tell us whether our "subjective" microstate arrangement method is right or wrong.


Are you thinking of instances of subjectivity which lie outside these three categories? If so, could you outline them?
 
 
SiliconDream
20:26 / 27.04.04
Concerning "entropy of the universe is said to be directional":

This statement does not depend on precisely how we calculate entropy. Given the definition of entropy, it will hold true. It doesn't matter if we "subjectively" change our method of entropy calculation... if that results in the universe's entropy decreasing, then we were simply wrong in our method. We didn't use a method which correctly matches the definition of entropy.

However, a couple of caveats should be mentioned concerning the unidirectionality of entropy. First, it's a statistical law and can therefore be violated in small ways, in the same way that virtual particles in the vacuum temporarily violate the law of conservation of energy. Big violations are highly improbable but still not impossible over the long term; some have suggested that the universe itself is a vacuum fluctuation.
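To get a feel for how improbable a big violation is, take the chance that random motion momentarily puts all N molecules of a gas in the left half of its box - a quick Python sketch:

    # Each molecule is in the left half with probability 1/2, independently.
    for N in (2, 10, 100, 6.02e23):
        print(N, 0.5 ** N)
    # For a handful of molecules this happens all the time; for a mole
    # (~6e23 molecules) the probability underflows to 0.0 - improbable
    # in the extreme, but not strictly impossible.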

Second, all the arguments about entropy increasing in the future can, as far as I can see, be inverted to state that entropy should have decreased in the past. That is, if the universe is currently in a low-entropy state, not only will it evolve to a more probable high-entropy state, but it probably came from one as well. The only reason we reject this, as far as I can tell, is that we know empirically that entropy has been even lower in the past (due to the higher proportion of hydrogen to helium, etc.) But AFAIK this doesn't apply to the pre-Big Bang universe, which statistically we ought to assume was high-entropy.

If someone can tell me why I'm wrong on this last bit, and why entropic arguments should not be time-symmetric, I'd greatly appreciate it. My thermo professor only ever said, "Well, we don't try to predict the behavior of a closed system far in the past, because we assume that if it's not steady-state already it must not have been closed," which seems like a bit of a cop-out to me.

Comments? Have you heard a better argument for entropy being unidirectional?
 
  