6 Things That Blew My F*****g Mind – 1\ Entropy

The increase of entropy is THE law of the universe.

 

What is it?

 

As described before:

It is the integral of the infinitesimal reversible heat transferred into a system divided by the absolute temperature of the system at which that heat is transferred.
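To make that definition concrete, here's a back-of-the-envelope Python sketch (the water example and numbers are mine, just for illustration): for reversibly heating a substance at constant pressure, dS = dQ_rev/T with dQ = m·c·dT integrates to m·c·ln(T2/T1).

```python
import math

def entropy_change(mass_kg, specific_heat, t1_kelvin, t2_kelvin):
    """Entropy change for reversibly heating a substance at constant
    pressure: dS = dQ_rev / T with dQ = m * c * dT, which integrates
    to m * c * ln(T2 / T1)."""
    return mass_kg * specific_heat * math.log(t2_kelvin / t1_kelvin)

# Heating 1 kg of water (c ~ 4186 J/kg*K) from 293 K to 373 K
# increases its entropy by roughly 1010 J/K:
dS = entropy_change(1.0, 4186, 293, 373)
```

The key point is the division by T: the same joule of heat raises entropy more when it lands in a cold system than a hot one, which is exactly why heat flowing downhill in temperature produces a net entropy increase.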

In an isolated system, entropy always increases or remains static. In a closed system, entropy always increases or remains static unless there is an input of work, in which case a decrease in entropy is possible. It should be noted that entropy can decrease locally, but the entropy of the surroundings must increase as a natural consequence, such that there is a net increase in the entropy of the closed system and its surroundings combined.

Part of the reason I’m being intentionally evasive and technical is that when the topic of entropy comes up, seasoned professionals tend to become pedantic and exacting. One of the biggest “Weeeeell acktchuallllly…” inducing statements is connecting entropy to disorder. I should point out it was Clausius who did this and everyone else is just following suit.

Clausius also made the statement that it is impossible to construct a device which operates on a cycle and produces no other effect than the transfer of heat from a cooler body to a hotter body. This is what we would call the second law of thermodynamics.

There are actually multiple ways to state this law. Lord Kelvin stated that no process is possible whose sole result is the absorption of heat from a reservoir and the conversion of this heat into work.

The short version of all of these is, in an isolated system the entropy of the system must increase or remain constant.

It’s pretty simple: if something is left alone, it will fall apart. This seems like it’s making a bigger statement than it really is, but it is really just an observation. What happens to your room if you don’t clean it? It gets dusty. It tends to thermodynamic equilibrium. “Disorder” increases.

A good way to think of it is if you had two plates next to each other, one hot and one cold, you would expect the hot plate to get cooler and the cool plate to get warmer. Unless you are actively pumping the heat from the cool plate to the warm plate i.e. a refrigerator, which requires work, the cool plate will eventually get warmer.
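If you want to watch the plates equilibrate numerically, here's a toy Python sketch (the heat capacity, temperatures, and step size are made-up illustration values, not anything physical in particular): shuttle small parcels of heat from hot to cold and tally each plate's entropy change along the way.

```python
# Two identical plates with heat capacity C exchanging small parcels
# of heat per step. All parameters here are illustrative assumptions.
C = 1000.0            # heat capacity of each plate, J/K
t_hot, t_cold = 400.0, 300.0
total_dS = 0.0

while t_hot - t_cold > 0.1:
    q = 10.0          # joules moved from the hot plate to the cold plate
    # The hot plate loses entropy q/T_hot; the cold plate gains q/T_cold.
    # Since T_hot > T_cold, the net change is always positive.
    total_dS += q / t_cold - q / t_hot
    t_hot -= q / C
    t_cold += q / C
```

By the end both plates sit near 350 K and `total_dS` is positive: the cold plate gained more entropy than the hot plate lost, every single step. Run the heat the other way and every step would be negative, which is Clausius's point.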

Suppose I were to put an engine between the hot plate, which we call a heat source, and the cool plate, which we can call a heat sink. Heat will flow through the heat engine into the heat sink; it must do so because of the second law of thermodynamics. Why would we do this? To extract work, which in the case of the heat engine is what lets us generate power. This is (a very, very high-level sketch of) how mechanical power generation works.

What if heat flowed from the cold to the warm plate through the engine? Then, we could run that guy forever and never worry about our energy needs! Nice idea, but impossible. Remember Clausius? Ok, but what if we were just really efficient? Could we turn every btu, every joule of heat into work? Alas, no. That violates Kelvin’s statement.
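Kelvin's statement has a quantitative face: the Carnot efficiency, η = 1 − Tc/Th, which caps how much of the absorbed heat any engine between two reservoirs can turn into work. A quick sketch (the temperatures are illustrative):

```python
def carnot_efficiency(t_hot_kelvin, t_cold_kelvin):
    """Maximum fraction of absorbed heat that any engine operating
    between these two reservoirs can convert to work: 1 - Tc/Th."""
    return 1.0 - t_cold_kelvin / t_hot_kelvin

# Even an idealized engine running between 800 K and 300 K can
# convert at most 62.5% of the heat it absorbs into work:
eta = carnot_efficiency(800, 300)
```

Notice that η only reaches 1 if the cold reservoir sits at absolute zero, which is why "turn every joule into work" never happens.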

What I just gave you was a quick rundown of “classical thermodynamics”.

One way I think of it is the deviation from perfect symmetry. This is my personal understanding, so it’s probably flawed, but at 0 Kelvin we would expect a crystal lattice to be perfectly translationally symmetric. Entropy is the observation that in a closed system, some part of this crystal will eventually deviate from the rigid lattice.

This is closer to what you might consider a “statistical mechanics understanding of thermodynamics”.

Here’s the mind-blowing part for me: the concept of entropy is everywhere.

I don’t mean just in thermodynamics or in the universe in general. There is a version of entropy in information theory called Shannon entropy, which has been generalized into Rényi entropy. Entropy shows up in statistical mechanics and in quantum mechanics as von Neumann entropy. Entropy seems to describe time under the Arrow of Time construct. There is an analogue of entropy, and in fact of all four laws of thermodynamics, in black holes. There are extensions of entropy used to determine the possibility of the evolution of life from primordial soups; entropy in abiogenesis has given rise to the idea of ectropy. There are even attempts at incorporating entropy into economic theory, called ‘thermoeconomics’, although that attempt is not without controversy.
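As a taste of the information-theory version, Shannon entropy is just a few lines of Python (this is the textbook formula, measured in bits):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits; zero-probability outcomes
    contribute nothing to the sum."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of entropy; a biased coin
# carries less; a certain outcome carries none.
fair = shannon_entropy([0.5, 0.5])
biased = shannon_entropy([0.9, 0.1])
certain = shannon_entropy([1.0])
```

Same flavor as the thermodynamic story: maximum "disorder" (uncertainty) when every outcome is equally likely, zero when the state is fully pinned down.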

This is why entropy is so mindblowing to me. It touches just about every field, and seems to be, upon initial inspection, inherent to any system as it scales. The universality of entropy is important not only on microscopic levels, but seems to be important on macroscopic, systems-level scales too.

I think that’s why there is so much confusion about entropy. Every discipline has its own understanding of entropy relevant to its field. Chemists focus more on the Arrow of Time and the disorder sense of entropy because determining when products form and how solutions mix is more relevant than understanding how a piston operates. Both are explained by thermodynamics, but the nuances of the arguments differ and thus shape our understanding in different ways.

Mechanical engineers tend to understand the classical presentation best. I know that entropy is important in statistical mechanics and in information theory, but I don’t understand it to the extent that a physicist or a computer scientist does. Still, that mathematical understanding of the concept of entropy is helpful because it shows that despite the nuances of each individual field, there is a similar underlying structure to all of them.

 

The lesser version: None

 

No lesser version. It’s just the be-all end-all of everything from what I can tell. I dunno, maybe nihilism?

 

Why this thought experiment is more interesting

 

It still just blows my mind how universal it is.

The only thing to add to this quick diatribe on entropy is that I think there is still more to expand on, at least in terms of the theoretical foundations.

For example, there are what are called conservation laws, which state that a certain property cannot be created or destroyed, only transformed. Most relevantly, the conservation of energy states that, as I just wrote, energy cannot be created or destroyed. Same for matter, same for momentum. But why?

There exists a theorem called Noether’s theorem, which shows that conservation laws arise from symmetries. The conservation of energy comes from the symmetry of the action under translations in time; momentum and angular momentum are conserved because the action is symmetric under spatial translations and rotations.

Is there a corollary to Noether’s theorem (an anti-Noether’s theorem?) that shows that non-conserved quantities, like entropy, arise from an asymmetry? What would this anti-Noether charge be? The action’s variance over what quantity could yield entropy?

This is pretty general, but I feel that understanding something like that, if true, would be a truly clarifying insight. It might explain why so many of these entropy analogues have a similar mathematical form: some constant times the natural logarithm of some argument, as in S = k·ln(W) for Boltzmann entropy or H = −Σ pᵢ·ln(pᵢ) for Shannon entropy.
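To make that "similar mathematical form" claim concrete, here's a small sketch showing that Shannon's formula, measured in nats, collapses to exactly Boltzmann's S = k·ln(W) when all W microstates are equally likely (with the constant k set to 1):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(microstates):
    """S = k * ln(W): Boltzmann's entropy for W equally likely microstates."""
    return K_B * math.log(microstates)

def shannon_entropy_nats(probs):
    """H = -sum(p * ln p): Shannon's entropy with the constant set to 1."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# For W equally likely microstates, Shannon entropy in nats is ln(W),
# i.e. Boltzmann's formula with k replaced by 1:
W = 8
assert abs(shannon_entropy_nats([1 / W] * W) - math.log(W)) < 1e-12
```

The only difference between the two is the constant out front, which is really just a choice of units (joules per kelvin versus nats).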

That’s one thing that was very interesting about Noether’s theorem; when I was first exposed to the theorem, my professor said something like “This is the most fundamental thing your basic physics class skipped” and I remember thinking “Well, why did I pay so much money to sit through four years and skip such basic fundamentals until the very end?” But I see why now.

Noether’s theorem is the closest we have to an explanation of a law. With the thermodynamic laws, at least with the second law of thermodynamics, there is a certain amount of faith to be had. We kind of just have to accept that entropy increases and things fall apart. That may be why entropy feels so mysterious to so many of us and has so many hair-splitting, pedantic detractors. Noether’s theorem took so much of that out for energy conservation.

They are our axioms, taken as true and as our baseline assumption.

The other theoretical foundation I believe entropy should be extended into is ecology and anthropology. I’m not an ecologist or an anthropologist, so maybe there already is some version of entropy in either field, but I’ve noticed something interesting about the environmentalists from both of those fields.

There are two schools of thought in environmentalism, bright green and dark green. Bright greens tend to believe that most of our environmental problems can be mitigated by developing technologies to counter our growing problems. Electric Cars. Vertical Farms. Solar Panels. That sort of environmentalist.

Dark Greens, on the other hand, are more nihilistic. They tend to believe that no amount of progress will deflect the inevitable decline of our environment. Their view is that the only thing that might mitigate environmental collapse is an extreme act of uncivilization – a return to the stone ages. This is often associated with extremists like the Unabomber, Earth First!, or John Zerzan.

Most environmentalists from non-STEM fields, like anthropology or ecology, tend to fall into one of these two camps. I think this reflects society’s view of technology. The Bright Greens trust STEM professionals to do right and create world-saving technologies, and their optimism reflects that trust. Dark Greens believe that most efforts at improved technology will inevitably still result in environmental collapse.

I think a lack of awareness about entropy explains this dissonance. The environment will fall apart, eventually. Everything falls apart, eventually. Our ecological understanding is still underpinned by an understanding of energy. Trees still need an energy input (solar) to drive photosynthesis. We know that these fields are still governed by the laws of physics.

But on the bright green side, understanding that there will never be a 100% efficient Carnot engine is not an excuse not to build engines. Just because a refrigerator’s Coefficient of Performance can never reach its Carnot limit doesn’t mean we should give up on refrigerators. It hasn’t stopped us in the past. That point speaks to developing technologies that decrease human suffering while remaining cognizant of physical constraints and environmentally conscious.
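For the record, the refrigerator limit is the Carnot coefficient of performance, COP = Tc/(Th − Tc), which real machines can only approach, never reach. A quick sketch with illustrative temperatures:

```python
def carnot_cop_refrigeration(t_cold_kelvin, t_hot_kelvin):
    """Best possible coefficient of performance for a refrigerator
    pumping heat from a cold space at Tc into surroundings at Th:
    COP = Tc / (Th - Tc)."""
    return t_cold_kelvin / (t_hot_kelvin - t_cold_kelvin)

# A fridge holding 275 K against a 295 K kitchen is limited to
# COP = 13.75; real machines fall well short of that, yet still work.
cop = carnot_cop_refrigeration(275, 295)
```

Note that, unlike engine efficiency, a COP well above 1 is perfectly legal: the refrigerator moves more heat than the work it consumes, it just can never beat the Carnot bound.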

My whole point here is that I firmly believe the second law of thermodynamics and entropy have a place in ecology and possibly anthropology, but they aren’t taught at the most basic, undergraduate levels. Introducing these concepts early on might help contextualize the hard-science aspects of these fields. That way, divisions like the one between dark greens and bright greens might ease away.

 

 
