- "identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" - what exactly is a "conceptual equivalence"? You mean models? Unifying disparate observations into models is basic science. Not sure why it is highlighted here as some important insight.
- "The laws of classical physics emerged as efforts to provide comprehensive, predictive explanations of phenomena in the macroscopic world" - followed by a layman's listing of physical laws, then goes on to claim "conspicuously absent is a law of increasing “complexity.”"
- then a jumble of examples including gravitation, stellar evolution, mineral evolution and biological evolution
- this just feels like a slight generalization of evolution: "Systems of many interacting agents display an increase in diversity, distribution, and/or patterned behavior when numerous configurations of the system are subject to selective pressure."
At this point, I gave up.
I don't really have an issue with any of the points you raised - why do they bother you?
The interesting stuff is the discussion about "functional information" later in the paper, which is their proposed quantitative measure for understanding the evolution of complexity (although it seems like early stages for the theory).
It's "just" a slight generalisation of the ideas of evolution but it applies to nonbiological systems and they can make quantitative predictions. If it turns out to be true then (for me) that is a pretty radical discovery.
I'm looking forward to seeing what can be demonstrated experimentally (the quanta article suggests there is some evidence now, but I haven't yet dug into it).
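For concreteness, here's a minimal sketch of the functional-information measure as I understand it (negative log of the fraction of configurations that achieve a function above some threshold). The bit-string "activity" here is my own toy stand-in, not anything from the paper:

```python
import math
import random

def functional_information(population, activity, threshold):
    """Functional information in bits: -log2 of the fraction of
    configurations whose measured activity meets the threshold."""
    n_functional = sum(1 for cfg in population if activity(cfg) >= threshold)
    if n_functional == 0:
        return float("inf")  # no configuration achieves the function
    return -math.log2(n_functional / len(population))

# Toy system: random 16-bit strings; the "function" is just the 1-bit count.
random.seed(0)
population = [[random.randint(0, 1) for _ in range(16)] for _ in range(10_000)]
fi = functional_information(population, activity=sum, threshold=12)
print(f"{fi:.2f} bits")
```

The interesting property (and the one the paper leans on, as I read it) is that FI rises as the functional threshold gets harder to meet, so it tracks "how selected" a population is rather than raw complexity.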
Indeed, and Natural Philosophy was the precursor to what we now call Science.
I still think the old name better fit what we’re doing because it admits that the work is still a philosophical endeavor.
This is not to question the validity of what we now call science, but it's common these days to believe in the ultimate supremacy of science as the answer to questions that are best explored both philosophically and scientifically, even though pure science still can't answer the important philosophical questions that the entire scientific discipline rests upon.
or in my words: "the first approximation is poetic. the last one is mathematical"
from philosophy to hard science and engineered tooling and other products (and/or services)
similarly to
from poetry as dubious, cloudy, and vague ideas all the way to crystal clear, fixed and unmoving (dead) formalizations
Idk about GP, but bad science writing ("identification of conceptual equivalencies ...") does bother me. It's sloppy, and tends to hide possibly invalid shortcuts taken by the authors by being an impenetrable fog of words. That sort of thing is a very good indicator of bunk, and it tends to peg my BS meter. Which isn't to say that there is no place for that sort of language in a scientific paper, but that one should preface the use of it with an admission of hand-waving for some purpose.
A distributed system can still achieve centralized outcomes as a result of centralizing constraints acting on it. For example, matter under gravity forces leads to celestial bodies, particles under EM forces lead to stable chemical molecules, genes and species under the constraint of replication lead to evolution, language under constraint of usage leads to the evolution of culture, and brains under the constraint of serial action lead to centralized semantics and behavior. In neural nets we have the loss function as a centralizing constraint, moving weights towards achieving a certain functional outcome.
Ok, so what is the relation between centralizing constraints and recursion? Recursion is how distributed activity generates constraints. Every action becomes a future constraint. I think this approach shows great promise. We can link recursive incompressibility and undecidability to explanatory gaps. You can't know a recursive process unless you walk the full path of recursion, you have to be it to know it. There is no shorter description of a recursive process than its full history.
So what looks like constraints when seen top-down, looks like search seen bottom-up. Particles search for minimal energy, genes for survival, markets search for profit, and our actions for goal maximization. Search acts on all levels, but since constraints are emergent, search is also open-ended.
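The loss-function point above can be caricatured with a toy of my own: gradient descent is a centralizing constraint seen top-down and a search seen bottom-up. The numbers are arbitrary illustrations:

```python
# Toy: a one-parameter "model" w, constrained by the loss (3w - 6)^2.
# Many distributed starting points are all pulled to the same outcome
# (w = 2): the constraint centralizes. Equivalently, seen bottom-up,
# each w is searching downhill for minimal loss.

def train(w, lr=0.05, steps=100):
    for _ in range(steps):
        grad = 2 * (3 * w - 6) * 3  # d/dw of (3w - 6)^2
        w -= lr * grad
    return w

for w0 in (-10.0, 0.0, 7.5):
    print(round(train(w0), 6))  # all converge to 2.0
```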
Complexity is probably most formally modeled as entropy in thermodynamics, although entropy behaves in the opposite direction from what these ideas and observations suggest it should.
That still raises questions about the reason for this complexity, and there is no scientific answer aside from "probably accidental complexity".
Science is driven by curiosity, so it probably shouldn't be dismissed for failing formal requirements that were never specified. "Layman" is unspecific, so what would your requirements be, exactly?
No, a model is not an "identification of conceptual equivalencies among disparate phenomena". It's a simplified representation of a system.
"identification of conceptual equivalencies among disparate phenomena were foundational to developing previous laws of nature" could be called an analogy, an isomorphism, a unifying framework, etc.
>Unifying disparate observations into models is basic science. Not sure why it is highlighted here as some important insight.
Perhaps because the most important insights are the most basic ones - everything else sits upon them.
>At this point, I gave up
If you can't be bothered to read beyond the abstract or first paragraph, or are perplexed that the abstract opens with a 10,000 ft simplistic introduction to the basics, then it's better that you gave up :)
This paper just reads like an attempt at sounding smart while actually saying little.
Good examples of these are anything that Kolmogorov-compresses well. For example, by almost any measure the output of a pseudo random number generator has high entropy. Yet it has low information density (low complexity), as the program that generates the sequence, plus its state, is really small.
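A quick sketch of that point, using a textbook LCG and zlib as a crude entropy proxy (the constants are the classic glibc ones; the high bits are emitted because an LCG's low bits cycle quickly):

```python
import zlib

def lcg_bytes(seed, n):
    """Tiny linear congruential generator; emits the high byte of each
    31-bit state, since an LCG's low bits have short periods."""
    state, out = seed, bytearray()
    for _ in range(n):
        state = (1103515245 * state + 12345) % 2**31
        out.append(state >> 23)
    return bytes(out)

data = lcg_bytes(seed=42, n=100_000)
compressed = zlib.compress(data, level=9)
# zlib gains almost nothing: the stream "measures" as high entropy,
# yet this whole program plus one seed regenerates it exactly --
# i.e. low Kolmogorov complexity.
print(len(data), len(compressed))
```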
I wonder if it always increases though? Eventually there will be enough entropy that any change may cause it to reduce or oscillate? (At universe / reachable universe scale).
It always increases in an isolated system. That caveat is almost always missing in pop-sci level of discussions about entropy, but it is crucial.
> Eventually there will be enough entropy that any change may cause it to reduce or oscillate?
Assuming that the universe is actually an isolated system, entropy will reach a maximum (it cannot oscillate). It is interesting to speculate, and of course our theories are imperfect and we are certainly missing something. In particular, the relationship between time and entropy is not straightforward. Very roughly: is the entropy a function of time, which we could define otherwise, or is time a consequence of entropy changes?
In the first case, we can suppose that if the universe reaches an entropy maximum we’d be far enough outside the conditions under which our theories work that we’d just have entropy decrease with time (i.e., the rule that entropy increases with time is only valid close to our usual conditions).
But in the second case, it would mean that the universe reached the end of time. It could evolve in any conceivable way (in terms of the fundamental laws of Physics), and the arrow of time would always point to the same moment. "What comes after?" Would be a question just as meaningless as "what came before the Big Bang?"
In any case, there are a lot of assumptions and uncertainty. The story does not do the subject any justice.
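The isolated-system caveat can be illustrated with the classic Ehrenfest urn model (a standard toy for entropy relaxation; the particle and step counts here are arbitrary):

```python
import math
import random

# Ehrenfest urn model: N particles in two boxes; at each step a random
# particle hops to the other box. The entropy S = log(number of
# microstates with `left` particles on one side) climbs toward its
# maximum at left = N/2 and thereafter only fluctuates around it.

def ehrenfest(n_particles=1000, steps=5000, seed=1):
    random.seed(seed)
    left = n_particles  # start far from equilibrium: all on one side
    trace = []
    for _ in range(steps):
        if random.random() < left / n_particles:
            left -= 1
        else:
            left += 1
        trace.append(math.log(math.comb(n_particles, left)))
    return trace

trace = ehrenfest()
print(trace[0] < trace[-1])  # entropy has risen toward its maximum
```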
The book describes "Assembly Theory", a theory of how life can arise in the universe. The idea is that you can quantitatively measure the complexity of objects (especially chemicals) by the number of recursive steps to create them. (The molecule ATP is 21 for instance.) You need life to create anything over 15; the idea of life is it contains information that can create structures more complex than what can be created randomly. The important thing about life is that it isn't spontaneous, but forms an unbroken chain through time. Explaining how it started may require new physics.
If the above seems unclear, it's because it is unclear to me. The book doesn't do a good job of explaining things. It looks like a mass-market science book, but I found it very confusing. For instance, it's unclear where the number 21 for ATP comes from, although there's an analogy to LEGO. The book doesn't define things and goes into many, many tangents. The author is very, very enthusiastic about the ideas but reading the book is like looking at ideas through a cloud of vagueness.
The writing is also extremely quirky. Everyone is on a first-name basis, from Albert (Einstein) to Johnny (von Neumann) and Erwin (Schrödinger). One chapter is written in the second person, and "you" turn out to be "Albert." The book pushes the idea that physics is great and can solve everything, covering physics "greatest hits" from relativity and quantum mechanics to gravitational waves and the Higgs boson. (The underlying theme is: "Physics is great. This book is physics. Therefore, this book is great.") The book has a lot of discussion of how it is a new paradigm, Kuhn's paradigm shifts, how it will move astrobiology beyond the pre-paradigmatic phase and unify fields of research and so forth. It's not a crackpot book, but there are an uncomfortable number of crackpot red flags.
I'm not rejecting the idea of assembly theory. To be honest, after reading the book, I don't understand it well enough to say which parts seem good and which parts seem flawed. There seem to be interesting ideas struggling to get out but I'm not getting them. (I don't like to be negative about books, but there are a few that I regret reading and feel that I should warn people.)
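For what it's worth, the "number of recursive steps" idea can at least be sketched on strings. This is my own toy upper bound (joining a part to an identical copy of itself costs one step, i.e. reuse is free), not the actual assembly-index algorithm from the book, and it won't reproduce numbers like 21 for ATP:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def toy_assembly_index(s):
    """Minimum join steps to build s from single characters, where
    duplicating an already-built part costs one step. A simplified
    upper bound on an assembly-index-style measure."""
    if len(s) <= 1:
        return 0
    best = len(s) - 1  # worst case: append one character at a time
    for i in range(1, len(s)):
        left, right = s[:i], s[i:]
        if left == right:  # duplication reuses the already-built part
            cost = toy_assembly_index(left) + 1
        else:
            cost = toy_assembly_index(left) + toy_assembly_index(right) + 1
        best = min(best, cost)
    return best

print(toy_assembly_index("abcabc"))  # 3: build "abc" in 2 joins, then duplicate
```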
How do you know it's not a crackpot book? All evidence you mentioned here seems to support that conclusion.
"There is a theory which states that if ever anyone discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another theory which states that this has already happened."
"When one compares a hotplate with and without a Benard cell apparatus on top, there is an overall increase in entropy as energy passes through the system as required by the second law, because the increase in entropy in the environment (at the heat sink) is greater than the decreases in entropy that come about by maintaining gradients within the Benard cell system."
(Think: no heat death!)
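The bookkeeping in that quoted passage is easy to make numerical. A sketch with made-up temperatures and heat flow (the values are purely illustrative):

```python
# Heat Q flows from a hot plate at T_hot into a sink at T_cold. The
# hot reservoir loses entropy Q/T_hot, the cold one gains Q/T_cold,
# and because T_cold < T_hot the net change is positive -- leaving
# room for local entropy decreases (the ordered convection cells)
# without violating the second law.

Q = 100.0       # joules of heat passing through (illustrative)
T_hot = 350.0   # hotplate temperature, kelvin (illustrative)
T_cold = 300.0  # heat-sink temperature, kelvin (illustrative)

dS_plate = -Q / T_hot   # entropy leaving the hot reservoir
dS_sink = Q / T_cold    # entropy entering the cold reservoir
dS_net = dS_plate + dS_sink
print(f"net entropy produced: {dS_net:.4f} J/K")  # -> net entropy produced: 0.0476 J/K
```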
Related to another heresy understated by qmag just this week: https://news.ycombinator.com/item?id=43665831
In that case, qmag didn't (dare to?) shout loud enough that the para-particles are globally ?distinguishable..
That's like a very restricted version of TFA's claim though..
Another take on the issue:
https://scottaaronson.blog/?p=762
*I don't want to say "entropy" because it's not clear to many folks, including experts, whether entropy is uh, "correlated" or "anticorrelated" with complexity.
Also the value of entropy has different signs in thermodynamics and computer science for example. Not helpful either...
[0] <https://onlinelibrary.wiley.com/doi/pdf/10.1002/%28SICI%2910...>
[1] <https://www.liverpooluniversitypress.co.uk/doi/book/10.3828/...>
Seriously, phrases like “selection for function”, unified theories of biology and physics, and big ideas about the second law of thermodynamics are major red flags.
The article talks a lot about biological evolution, but in that case the only claim that is likely to be true is that the complexity of the entire biosphere increases continuously, unless a catastrophe resets the biosphere to a lower complexity.
If you look only at a small part of the biosphere, like one species of living beings, it is extremely common to see it evolve to become simpler, not more complex, because a simpler structure is usually optimal under constant environmental conditions; more complex structures are mainly beneficial for avoiding extinction when the environmental conditions change.
Only if that system isn’t already in thermodynamic equilibrium. A closed system that reaches thermodynamic equilibrium has maximum entropy.
Why the universe as a whole didn’t start out in thermodynamic equilibrium, i.e doesn’t have maximum entropy is something we don’t understand.
If you subscribe to the big bang theory (and the idea that the purpose of a system is what it does), then the universe's purpose is to walk a path from low entropy to high entropy. Of what use is life, in such an endeavor? Well, life tends to seek out bits of stuck energy (food/fuel) and release it (metabolism/economy)--moving the universe further along on its path.
This gives a sort of answer to the question: "why bother having life at all?" And so I think the entropy purpose makes sense - more so than having it just be a side effect. Nobody will ever be absolutely right or wrong about such things (purposes), but they're handy to have around sometimes.
It's my, somewhat lazy, philosophical opinion, that there isn't any purpose and there doesn't need to be one.
I don't see why the universe would need a purpose for anything. Things are what they are. Things changing state. Entropy.
I see reproduction as more of built in motivation to our system than a purpose as such. But that's semantics, and my purpose in life is not to argue about words! ;-)
Though, I'm not sure if life is the best at it, when compared to say a black hole. Some smart apes burning off fossil fuels seems pretty insignificant in comparison -- or even seeing what our own Sun does in a few seconds.
File that under, "The Earth will be fine in the long run, it's humans that are f'd" George Carlin pov. Maybe when we start building Death Stars (plural)
It gets a bit blurry when you start to substitute "life" for any "complex cosmological system" though...
The statement is a category error, but that criticism distracts from the very valuable insight he does provide regarding entropy, life and complexity.
He did a series on minutephysics explaining it quite well, worth a watch. He does explain why complexity increases as entropy increases (with some additional qualification).
https://www.youtube.com/playlist?list=PLoaVOjvkzQtyZF-2VpJrx...
It is puzzling why life isn't more common. Perhaps dissipative self-organizing structures are everywhere - stars, solar systems and galaxies themselves maintain their order by dissipating energy. They just don't look like "life" to us.
I presume the end-state of entropy would be the same (excluding ways to escape the universe).
One of the consequences of that extension was a possibility of a cyclic universe. On expansion one sees that classically defined entropy increases but then it will decrease on contraction.
These days that work is pretty much forgotten, but it still showed that with GR the heat death of the universe was not the only option.
If I had to bet money on it, I would say it's right, especially in light of things like this: https://phys.org/news/2025-03-ai-image-recognition-universe....
So apparent increase in complexity can be attributed to gravity.
How would you test for it though? I've seen enough residual data from RL processes to almost see semblences of patterns that could be extracted and re-applied at a macro scale.
A "new force of nature"? It's just so pretentious. Some interesting biases of a selection process driven by copious excess energy doesn't make for a new force of nature. Otherwise we'd be positing all kinds of absurdities that are not directly explained by particle physics are woo woo a new force of nature--fashion choices (hey, copy, select, mutate there too).
[1] And no, I don't think that the computer simulations of evolution they carry out are any additional evidence. So you made a computer program with a copy/select/mutate loop in it. Big deal. I can make a computer simulation about anything.
Sounds like they're struggling to accept that the cosmos is not conscious and it doesn't design, and possibly confuse the fantasies we construct to, as it might be phenomenologically put, make sense of our environment, with the environment itself.
In ancient Abrahamic cosmology it was proposed that the cosmos was designed, and first it was stone and water and so on, and then the biological matter was put in there, segmenting stone, hippopotamus and human into a kind of cosmological hierarchy of ethical and divine importance. Famous ancient Greek philosophers imagined that there was another world shaping ours, geometrically purer and to people with a particular taste perceived as obviously more beautiful and holy.
Different strains of similar thinking survived in parts of the world for a long time, and had a renaissance due to european colonialism spreading it with a diverse set of tools.
One of the strongest views that followed is a cosmological dualism, the belief that there is something like soul or mind that is different from matter, usually paired with the belief that this is how truth enters the world and that truth is otherworldly, etherical.
Modern physics turned out to be absolutely brutal towards ideas like these. For a hundred years experiment upon experiment just smashed such segmentations and expectations against a growing mountain of experimental evidence. As of yet we have no evidence of the cosmos being governed by laws and selection, it just is what it is and the supposed laws are human interpretations, hopes and fantasies.
Protestant christianity is in an especially bad place due to this development, since it bets all it has on mental phenomena being more real than matter. Catholics and muslims can fall back on arguing that the divine is unknowable and that the effects of certain acts and traditions are socially beneficial, which sometimes puts them at odds with or makes them absolutely incompatible with worldly regimes of power. Protestant ideology on the other hand, can be fitted in with basically any regime, material conditions just aren't that important, ethically or otherwise.
Looking at the micro-perspectives we didn't find geometrical simplicity, instead we found weird, messy fields and almost-existences, putting all sorts of expectations about the foundations of the cosmos into question. Maybe it'll change, but at the moment there's no evidence for some grand principle or cosmic selector or whatever. One might argue something here about cosmic constants or the symmetry Dirac sussed out but that's still just pushing human experience into an algebra.
The expectation that life is somehow special is wrong. There is, as far as we can see, no difference in the quarks in a dog and those in a rock. The argument that 'DNA encodes more information' is childish, there are repetitive structures everywhere, like in the crystalline structures in a piece of rock. Protein sacks carrying their own emulation of a particular old ocean on a particular planet and flubbing around on land, carefully putting in salts and carbon and so on to keep it going, are neither more nor less complex, neither more nor less "information dense" in themselves, than a photovoltaic panel pushing electrons to light up a screen.
There is a good book from the nineties on this topic, https://en.wikipedia.org/wiki/Ilya_Prigogine#The_End_of_Cert.... One should be very suspicious of people that talk about being cosmically selected, or about natural laws.
> The expectation that life is somehow special is wrong. There is, as far as we can see, no difference in the quarks in a dog and those in a rock
But the authors' examples do include the "speciation" of minerals! As I read it, the authors describe:
- some initial set of physical states (organisms, minerals, whatever)
- these states create conditions for new states to emerge, which in turn open up new possibilities or "phase spaces", and so on
- these new phase spaces produce new ad hoc "functions", which are (inevitably, with time and the flow of energy) searched and acted upon by selective processes, driving this increase of "functional information".
I don't think it's saying that living things are more complex or information dense per se, but rather, that this cycle of search, selection, and bootstrapping of new functions is a law-like generality that can be observed outside of living systems.
I'm not endorsing this view! There do seem to be clear problems with it as a testable scientific hypothesis. But to my naive ear, all of this seems to play rather nicely with this fundamentally statistical (vs deterministic) picture of reality that Prigogine described, with the "arrow of time" manifesting not just in thermodynamics and these irreversible processes, but also in this diversification of functions.
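The search-selection-bootstrapping cycle described above can at least be caricatured in code. This is a deliberately minimal copy/mutate/select loop of my own devising (the bit-count "function" is an arbitrary stand-in), showing only that mean function rises under selection, with nothing biology-specific in it:

```python
import random

random.seed(0)
L, POP, GENS = 32, 50, 40

def mutate(genome, rate=0.02):
    # flip each bit independently with probability `rate`
    return [b ^ (random.random() < rate) for b in genome]

pop = [[0] * L for _ in range(POP)]  # start with zero function
history = []
for _ in range(GENS):
    # copy with mutation, then select the best half on the toy "function"
    offspring = [mutate(random.choice(pop)) for _ in range(POP * 2)]
    pop = sorted(offspring, key=sum, reverse=True)[:POP]
    history.append(sum(map(sum, pop)) / POP)

print(history[0] < history[-1])  # mean function increased under selection
```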