You are given an insulated cylinder with a barrier in the middle. The left side of the cylinder is filled with ideal gas A, and the right side with gas B. Given a single particle, one can distinguish A from B. The pressure and temperature on both sides are the same. Then you remove the barrier and the gases mix. Question: how much work do you need to do to revert the system to its original state? Hint: the minimum work is proportional to the entropy difference between the two states.
More generally, if you take a properly insulated system and leave it alone for a while, you will suddenly find you have to do some work to get back to the original state, even though the energy conservation law holds.
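For reference, a quick back-of-the-envelope for the mixing question above (my own numbers, assuming n moles of each ideal gas at temperature T): each gas doubles its volume on mixing, so

```latex
\Delta S = 2\,nR\ln 2, \qquad W_{\min} = T\,\Delta S = 2\,nRT\ln 2 .
```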
Given the scenario you just laid out, it seems no work can be extracted just by letting two substances mix that are at the same temperature and pressure. But there is something about it that doesn't quite square with my intuition about symmetry and conservation laws. Could you please elaborate on that?
Note that you're describing equilibrium as a unique situation where the number of possible states is at a maximum. Now how can a situation be unique if it has the maximum number of possible states? Clearly the situation is as far from unique as it can be.
Resolving the contradiction requires distinguishing between features of the probability distribution and features of a random sample (i.e. a possible state), and it also needs an explanation of how it even makes sense to view a deterministic physical system (leaving quantum mechanics aside for now) as a random variable.
The theory that links everything together is ergodic theory, which has a couple of handy theorems. One is that for a certain kind of dynamical system, the average over time and the average over the 'possible states' agree. Such a system can also be assigned an entropy. The theory even suggests that a system will generally be found around states with probability close to 2^-entropy (this is not absolutely always true... but close enough for physicists).
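A minimal numerical sketch of that first theorem (my own toy example, using an irrational rotation of the circle as the ergodic system and cos²(2πx) as the observable):

```python
import numpy as np

# Observable and irrational rotation angle (golden-ratio fraction).
f = lambda x: np.cos(2 * np.pi * x) ** 2
alpha = (np.sqrt(5) - 1) / 2

# Time average: follow one trajectory x_{n+1} = x_n + alpha (mod 1).
n_steps = 100_000
orbit = (0.1 + alpha * np.arange(n_steps)) % 1.0
time_avg = f(orbit).mean()

# Average over the 'possible states': integrate f against the invariant
# (uniform) measure, estimated here by Monte Carlo.
space_avg = f(np.random.default_rng(0).random(n_steps)).mean()

print(time_avg, space_avg)  # both come out ~0.5, as the theorem predicts
```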
Now what does such a system look like? Well, we need a state space (easy) and a measure on it that is invariant as the system evolves (i.e. we can pick a region in the state space, evolve it, and its volume will stay constant). The last part is tricky, but as it turns out classical mechanics gives us phase space and the canonical volume on it (basically the standard notion of volume), which fit the bill. This gives a probability distribution on the state space and an entropy equal to log(volume in phase space), which matches the definitions in statistical physics but also gives a solid foundation for some of the seemingly arbitrary choices.
So there you have it: that's why a system can have a probability distribution attached to it despite being deterministic, why 'high entropy' states are common, and why physical systems have a uniform distribution (and therefore an entropy which is the log of the number of states).
This also explains how physicists got away with using a uniform distribution without worrying about which variables they used. By pure 'coincidence', the standard choice of variables physicists use has this incredibly nice property that makes everything work out. I'm not sure how well known this is; it might be worth abusing it to 'prove' that a perpetuum mobile is possible, just to stop people from using uniform distributions without due deliberation.
Like the best ideas, it’s simple and makes sense if you think about it, but it’s still a really interesting framing that the complex machinery of life is really just the most efficient “entropy converter”.
If there’s something about the arrow of time that speeds towards the heat death of the universe, we’re just helping it go a tiny bit faster here on our floating speck of dust.
Both Newton's and Einstein's descriptions of the universe are time reversible. The physics is valid independent of the direction of time.
Entropy is literally a separate observable phenomenon with its own set of axioms.
Right? I think this could be a really interesting basis for a sci-fi novel - the first space-faring civilization in the entire universe trying to force every form of life to keep "entropy usage" to a minimum, so they can prolong their own life span.
And a space full of children is exponentially faster at increasing entropy.
[0]: https://docs.google.com/document/d/10Vi8s-azYq9auysBSK3SFSWZ...
Prof. Suo also makes entropy the main character of the "play". The other concepts (temperature, etc.) are defined in terms of entropy.
I decided I didn't have the brainpower/mental capacity to think about The Witcher and that this video on Entropy would be easier to digest.
The opposite is true with fiction. You're intentionally trying to have the audience make the connections themselves, like a Sherlock story or The Great Gatsby. The point is in the discovery by the viewer.
"Can you get time out of Quantum Mechanics?": https://youtu.be/nqQrGk7Vzd4
So, the sun is a low-entropy source of energy, and Earth (and everything on it) increases that entropy as it uses and then reradiates that energy. This process is entirely consistent with the second law of thermodynamics.
The relationship between light frequency and entropy comes from the fact that entropy is a measure of disorder or randomness. High-frequency light, such as ultraviolet or visible light, is more ordered and less random than lower-frequency light, such as infrared or microwave light.
This is due to how light is structured. Light is made up of particles called photons, and each photon carries a certain amount of energy. The energy of a photon is directly proportional to its frequency: higher-frequency photons carry more energy than lower-frequency ones.
So, if you have a fixed amount of energy to distribute among photons, you can do so in many more ways (i.e., with higher entropy) if you use low-energy, low-frequency photons. That's because you would need many more of them to carry the same total amount of energy.
On the other hand, if you use high-energy, high-frequency photons, you would need fewer of them to carry the same total amount of energy. There are fewer ways to distribute the energy (i.e., lower entropy), so this arrangement is more ordered and less random.
Therefore, high-frequency light is considered a lower-entropy form of energy compared to low-frequency light, because the energy is concentrated in fewer, more energetic photons.
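To put rough numbers on this (my own illustration, with illustrative frequencies): distributing one joule over infrared photons takes ~100x as many photons as distributing it over ultraviolet photons.

```python
h = 6.626e-34  # Planck's constant, J*s

E_total = 1.0  # one joule of light, split into photons of energy h*f
for name, freq in [("infrared (3e13 Hz)", 3e13), ("ultraviolet (3e15 Hz)", 3e15)]:
    n_photons = E_total / (h * freq)
    print(f"{name}: ~{n_photons:.2e} photons")
```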
Layman to the extreme here, but didn't it? The thing about the low entropy of the universe near the Big Bang, gravity naturally bringing things together, and such?
To my best understanding, to go from a high-entropy state to a low-entropy state you need to do work. The sun is a source of energy to do that work.
I've always referred to the inverse as "information" or "order".
If entropy is the distribution of potential over negative potential, emergence is that where an outcome creates potential (of some discrete domain).
The mystifying relationship with thermodynamic entropy is that through accelerating entropy in other domains (burning), the entropy in the primary domain (eternal drag) is supplanted by the potentials provided by the external domain.
Entropy and energy are orthogonal anyway. You can understand entropy without the need to even use the word "energy."
Entropy is an aspect of probability; it is in fact a numerical phenomenon.
Take any photograph in Photoshop. First, save one copy of it as a compressed JPG.
Now, on the original, add a small densely repeating tiled pattern multiplied on top as a layer. Like a halftone effect, dot texture, whatever. Technically you're adding more order and less chaos. The resulting image won't compress as efficiently.
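A rough way to try this claim without Photoshop (a sketch, not the exact recipe: it uses Pillow and a synthetic gradient-plus-noise "photo" in place of a real one):

```python
import io

import numpy as np
from PIL import Image

rng = np.random.default_rng(0)
h = w = 256
gradient = np.linspace(0, 255, w)[None, :] * np.ones((h, 1))
photo = np.clip(gradient + rng.normal(0, 10, (h, w)), 0, 255)

# Multiply a dense tiled dot pattern on top (the "more ordered" overlay).
yy, xx = np.mgrid[0:h, 0:w]
dots = np.where((xx % 4 < 2) & (yy % 4 < 2), 0.5, 1.0)
patterned = photo * dots

def jpeg_bytes(arr):
    """Size of the image saved as a quality-85 JPEG."""
    buf = io.BytesIO()
    Image.fromarray(arr.astype(np.uint8)).save(buf, format="JPEG", quality=85)
    return buf.getbuffer().nbytes

print("plain:    ", jpeg_bytes(photo))
print("patterned:", jpeg_bytes(patterned))  # typically noticeably larger
```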
I recommend this video as well.
> I don't believe the 2nd law of thermodynamics.
But even this is not the full story, because I can take a mass-spring network, and no matter how I choose to coarse-grain it, I will not see the entropy corresponding to that coarse-graining increase, because the trajectory of a mass-spring system is periodic. Entropy increase requires that the system is ergodic with respect to the chosen coarse-graining operation, i.e. that over long times the trajectory visits the coarse-grained states in a "random" and uniform way. It's not at all obvious to me why the dynamics of particles bouncing around in a box have this property while particles attached in a mass-spring network do not, and neither the Sabine nor the Veritasium video addresses this, or why we should expect all practical real-world physical systems to be ergodic with respect to practical coarse-graining mechanisms.
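To make the distinction concrete, here's a toy experiment of my own (substituting a chaotic map for the box of particles and a rigid rotation for the mass-spring network, since both are easy to simulate): start an ensemble concentrated in one coarse bin and watch the coarse-grained entropy.

```python
import numpy as np

def coarse_entropy(points, n_bins=32):
    """Shannon entropy of the ensemble over a fixed coarse-graining."""
    counts, _ = np.histogram(points, bins=n_bins, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

rng = np.random.default_rng(1)
ensemble = 0.30 + 0.01 * rng.random(10_000)  # all points start in one bin

mixing = ensemble.copy()    # stand-in for particles in a box
rotation = ensemble.copy()  # stand-in for the mass-spring network
alpha = (np.sqrt(5) - 1) / 2

for t in range(1, 31):
    mixing = 4 * mixing * (1 - mixing)   # chaotic logistic map: blob spreads
    rotation = (rotation + alpha) % 1.0  # rotation: blob just translates
    if t % 10 == 0:
        print(t, coarse_entropy(mixing), coarse_entropy(rotation))
# The chaotic system's coarse entropy climbs to near its maximum and stays
# there; the rotation's stays roughly where it started, forever. The increase
# comes from the mixing dynamics, not from coarse-graining alone.
```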
I don't pretend to understand this stuff, but wouldn't a real mass-spring system slowly stop, due to friction, air resistance, heat dissipation, ...? So a real system wouldn't be periodic.
I posted it without realizing somebody else had already posted it.
Gotta love Prof. Hossenfelder.
Of course I can't give him a pass on how crass it was telling that woman he has a PhD in physics (he does not). The video would have been so much better without those two seconds of footage...
To be clear, he does have a PhD but it is in physics education research, not physics.
Why do you think so? (Most of them seem reasonable to me, but the one on the speed of electricity stands out as badly done. He redid that video, but the redo could have been better too.)
The worst example I remember, which is actually what drove me to unsubscribe, was when he said that the golden ratio was "a pretty five-y number" because it can be written as 0.5 + 0.5 * (5^0.5). Anyone with a good mathematical background could tell you there's nothing five-y about 0.5 at all. I'll grant him that the golden ratio is still a little bit five-y because of the sqrt(5).
The whole context and presentation seemed like it was designed to make the viewer feel like they'd learnt something even though nothing of substance was really delivered in those 20 seconds. He does that a lot.
Kind of sad that an expensive camera + clickbait thumbnail/title > experts communicating clearly and accurately.
I imagine he/his team is scouring YouTube for the experts and remaking their videos with more production value.
This sounds quite bitter, as does griping about him mentioning his PhD.
His videos have excellent production quality, and do a great job of communicating advanced STEM concepts to laypeople in an entertaining way.
Maybe you don't like them, but that doesn't mean they are bad. Given their popularity, it would seem they are anything but.
[0] https://en.wikipedia.org/wiki/Holographic_principle#Black_ho...
What makes it weird is that black holes must necessarily* have the maximum amount of entropy for a given volume. So not only is the entropy of a black hole proportional to its surface area, but the entropy of any volume of space cannot grow beyond that. In particular, entropy cannot be proportional to volume without limit; the average entropy density must go to zero for big enough regions.
*: According to some people anyway.
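For reference, this is the Bekenstein-Hawking formula; the scaling argument is that the maximum entropy grows with the bounding area while the volume grows faster, so the average entropy density must vanish for large regions:

```latex
S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar} \propto R^2,
\qquad
\frac{S_{\max}}{V} \propto \frac{R^2}{R^3} = \frac{1}{R} \longrightarrow 0 .
```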
Suppose I show you a snapshot of a random universe. Would you be able to tell whether the entropy of that universe is going to increase or decrease as time progresses?
Let's assume that universe's entropy would increase. Now consider another universe exactly the same as the current one, but with all the particles' velocities reversed. Then that universe's entropy would decrease.
Since you are equally likely to have been shown either universe, the original assumption of increasing entropy is wrong.
Discarding quantum properties of the particles, is it then fair to say that time's direction is unrelated to whether entropy increases or decreases?
> Suppose I show you a snapshot of a random universe. Would you be able to tell whether the entropy of that universe is going to increase or decrease as time progresses?
Yes, if it has low entropy then entropy will probably increase; if it has high entropy then the entropy will probably fluctuate up and down statistically.
> Let's assume that universe's entropy would increase. Now consider another universe exactly the same as the current one, but with all the particles' velocities reversed. Then that universe's entropy would decrease.
The key is that you're exponentially unlikely to find yourself in a universe where all the particles' velocities are reversed. See this: https://en.wikipedia.org/wiki/Fluctuation_theorem
The probability that a system randomly evolves in a way that reduces entropy is very very small.
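Quantitatively, the fluctuation theorem says the odds of observing time-averaged entropy production -A rather than +A over a time t fall off exponentially,

```latex
\frac{P\!\left(\overline{\Sigma}_t = -A\right)}{P\!\left(\overline{\Sigma}_t = +A\right)} = e^{-A t},
```

so for macroscopic systems and timescales the entropy-decreasing branch is hopelessly suppressed.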
> Yes, if it has low entropy then entropy will probably increase
The problem is that it probably increases in both time directions, such that the state of minimum entropy is now. As you said, we have to stipulate that the entropy in the past was low; we can't (yet?) infer it from observation. Which raises the question of what justifies us making this assumption in the first place.
"I don't believe the 2nd law of thermodynamics. (The most uplifting video I'll ever make.)" by Sabine Hossenfelder
[Note that this is intended to be a rhetorical question advanced for the purposes of pedagogy. If you find yourself wanting to post an answer, you have missed the point.]
Something similar could perhaps be said for the video’s approach; “what do we get from the sun?” is an ambiguous question, not necessarily a fair setup to ask a lay person when you have entropy in mind as the answer. We do get energy from the sun, that is a correct answer, and we use some of it before it goes away. But, there is the nice a-ha that all the energy from the sun eventually leaves the earth, right?
[1] “An energy crisis or energy shortage is any significant bottleneck in the supply of energy resources to an economy.“ https://en.wikipedia.org/wiki/Energy_crisis
I think it's not so much a shortage of energy as thermodynamic equilibrium: there is no available energy left to do anything with.
I don't think this will ever happen tho, it's pretty clear to me that making energy more dense is a universal process.
Useful work, aka information, is work that can be employed in dynamics vis-à-vis processing. Useless work, aka heat, is the devil's share of the energy expenditure, which is lost as entropy when undergoing a process.
In essence what you need to realize is that entropy is just a label for an aspect of probability.
Things tend to become disordered over time because disordered states are more probable than ordered states. Entropy is thus simply a phenomenon of probability... of things moving from a low-probability state to a high-probability state. That's it.
That's really all there is to it. That's all you need to digest, all the complicated math and explanations are all just surrounding the above concept.
Entropy is just a high-level abstraction of probability. It allows you to explain things without the intuition of probability bogging you down. For example, explaining life in terms of probability is harder to grasp, as it's akin to rolling 10 dice and having all the dice come up 6.
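For scale, that dice analogy works out to roughly one chance in sixty million:

```latex
\left(\tfrac{1}{6}\right)^{10} = \tfrac{1}{6^{10}} = \tfrac{1}{60{,}466{,}176} \approx 1.7 \times 10^{-8}.
```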
Entropy isn't a property of an object, or a system, or things in physics. Entropy is a property of our _description_ of systems. More precisely, it is a measure of how imprecise a given specification of a physical system is, i.e. given a description of a system, typically the pressure/volume/temperature of a gas or whatnot, how many different physical states correspond to that description.
In particular, _thermodynamic entropy is Shannon entropy_.
In the case where the description specifies a volume of phase space within which the physical state lies, the entropy is the logarithm of the volume of that fragment of phase space. If we take this collection of states and see how they evolve in time, then Liouville's theorem says the volume of phase space will remain constant.
If we want to build a reliable machine, i.e. an engine, that can operate on any initial state bounded by our description and end up in a final state bounded by some other description, then in order for this machine to perform reliably, the volume of the final description needs to be greater than the volume of the description of the initial state. Otherwise, some possible initial states will fail to end up in the desired final state. This is the essence of the second law of thermodynamics.
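In symbols (my paraphrase of the argument): Liouville's theorem preserves the volume of the evolved initial region, and that evolved region must fit inside the final description's region, so

```latex
\mathrm{vol}(\Gamma_{\mathrm{final}}) \ge \mathrm{vol}(\Gamma_{\mathrm{initial}})
\;\Longrightarrow\;
S_{\mathrm{final}} = \log \mathrm{vol}(\Gamma_{\mathrm{final}}) \ge \log \mathrm{vol}(\Gamma_{\mathrm{initial}}) = S_{\mathrm{initial}} .
```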
I want to emphasize this: entropy exists in our heads, not in the world.
E.T. Jaynes illustrated this in section 5, "The Gas Mixing Scenario Revisited", of https://www.damtp.cam.ac.uk/user/tong/statphys/jaynes.pdf, where two imaginary variants of Argon gas are mixed together. If an engineer is ignorant of the different variants of Argon, it is impossible to extract work from the mixing; but armed with knowledge of the difference (which must be exploitable, otherwise they wouldn't actually be different), work can be extracted.
Knowledge _is_ power.
Taking an extreme example, suppose we have two volumes of gas at different volumes/pressures/temperatures. We can compute how much work can be extracted from those gases.
But suppose someone else knows more than just the volume/pressure/temperature of these gases. This someone happens to know the precise position and velocity of every single molecule of gas (more practically, they know the quantum state of the system). This someone now gets to play the role of Maxwell's demon and separate all the high-velocity and low-velocity molecules of each chamber, opening and closing a gate using their perfect knowledge of where each particle is at each moment in time. From this they can now extract far more work than the ignorant person.
In both cases the gas was identical. How much useful work one can extract depends on how precise one's knowledge of the state of that gas is.
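The Szilard engine makes this quantitative (a standard result, not something from the comment above): knowing one bit about a single-molecule gas, namely which half of the box the molecule is in, lets you extract at most

```latex
W_{\max} = k_B T \ln 2 \quad \text{per bit of knowledge.}
```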
If it only existed in our minds and not in physical reality that would mean it would be possible to construct a device that decreases global entropy on average.
This article is also interesting: "THE EVOLUTION OF CARNOT'S PRINCIPLE" https://bayes.wustl.edu/etj/articles/ccarnot.pdf
Building on these ideas, the first five chapters of this (draft of a) book from Ariel Caticha are quite readable: https://www.arielcaticha.com/my-book-entropic-physics
Entropy is a fancy way of explaining probability.
Things with higher probability tend to occur over things of lower probability.
Thus when certain aspects of the world, like the configuration of gas particles in a box, are allowed to change, they will move towards high-probability configurations.
High-probability configurations tend to be disordered. Hence the reason we associate entropy with things becoming increasingly disordered. For example... gas particles randomly and evenly filling up an entire box is more probable than all the gas particles randomly gathering on one side of the box.
If you understand what I just explained, then you understand entropy better than the majority of people.
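A quick sketch of the counting behind this (my own toy model: each of N particles is independently on the left or right half of the box):

```python
from math import comb

N = 100       # number of gas particles
total = 2**N  # every left/right assignment is one microstate

print(f"P(exact 50/50 split) = {comb(N, N // 2) / total:.1e}")  # ~8.0e-02
print(f"P(all on one side)   = {2 / total:.1e}")                # ~1.6e-30
# The even split is ~10^28 times more probable than the fully "ordered"
# state, and the gap explodes as N grows toward Avogadro-scale numbers.
```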
In all the debate over global warming, little is said about why, say, CO2 and other greenhouse gases increase the earth's temperature, and how they shift the wavelength of the energy radiated from earth. In other words, we need to explain in simple terms why the incoming and outgoing energy can remain the same yet the earth's temperature has increased.
Global warming occurs because the previous equilibrium between incoming and outgoing energy has been broken by changes in the composition of the atmosphere.
So until we reach a new equilibrium long after the atmosphere composition ceases to change, the outgoing energy will be less than the incoming energy.
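A back-of-the-envelope energy balance shows the sizes involved (standard textbook numbers, my own sketch):

```python
sigma = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0       # solar constant at Earth, W/m^2
albedo = 0.30     # fraction of sunlight reflected straight back

absorbed = S0 * (1 - albedo) / 4    # incoming flux averaged over the sphere
T_eff = (absorbed / sigma) ** 0.25  # temperature at which outgoing balances
print(f"effective radiating temperature: {T_eff:.0f} K")  # ~255 K

# The surface sits at ~288 K: greenhouse gases absorb outgoing infrared, so
# Earth effectively radiates to space from a higher, colder layer, and the
# surface must warm until the balance at the top is restored.
```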
The Earth is taking on energy every day from the sun. If we didn’t release it all back, the earth would be warming much much faster. It only remains relatively cool because it releases almost as much as it receives.
Another important note is that long-term energy is not only stored as heat on earth. It's also stored as chemical potential energy in the molecules of plants and animals. Think of how cold a gallon of gasoline is, yet how much energy it stores.
For an example think of hot asphalt from a summer day. It gets real hot all day and slowly cools down at night. Sometimes it can be pretty warm to stand on the road even if it’s a cool night.
Within the human timescale, the Earth is retaining some (tiny fraction) of heat. That tiny fraction of heat is a very small window of heat that life can tolerate. It’s not too much and not too little. If the earth were to retain just a tiny bit more, suddenly life can’t tolerate it. On the scale of the universe, the difference between those realities is minuscule, even though it’s enormous to us.
It may help lower the temperature of the debate if they did.
Edit: we're pitching this discussion at the level he has: the lay public. Scientific argument over the minutiae is another matter altogether.
https://www.researchgate.net/publication/228935581_How_physi...
Whether discussing what is over what may be, or thermal equilibrium, potential distribution describes it all!