I had to go check that this was real - https://www.nature.com/articles/s41557-023-01300-3 - because for all I could ground it in my understanding of experimentation, it could just as easily have been a marketing site for the next Marvel movie.
It's a (very) weird way to write the same equations as Newton. It happens to be easier to solve in some cases, because calculus works like that.
It's conceptually not very different from using a Fourier or Laplace transform to simplify some signal handling. But 400 years ago they didn't have any easy way to understand it, so it got an aura of magic that never went away.
(But you probably already know all that. You probably just didn't internalize it because of that aura.)
Spoiler: E = m_rest c² + E_kinetic, unless you redefine mass as a function of velocity. People used to do that a century ago, but it's unusual nowadays.
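For reference, this is just the standard Taylor expansion of the relativistic energy (nothing beyond textbook special relativity):

```latex
E = \gamma m c^2 = \frac{m c^2}{\sqrt{1 - v^2/c^2}}
  = \underbrace{m c^2}_{\text{rest energy}}
  + \underbrace{\tfrac{1}{2} m v^2 + \tfrac{3}{8}\,\frac{m v^4}{c^2} + \cdots}_{E_{\text{kinetic}}}
```

Here m is the invariant (rest) mass; the old "relativistic mass" convention absorbs the factor γ into m so that E = m(v) c² holds at all speeds.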
I thought I would do things differently and that everyone else was doing it wrong. I was the one doing it wrong. I didn't learn much from all those original manuscripts, and I lost precious years in which I could have earned a real Master's degree. It is super important to understand that original manuscripts carry tons of noise and baggage that only make sense in historical context. They are also full of unrefined ideas that are super hard to digest, if you can digest them at all. Plenty of experts have already spent their lives distilling those original writings into something that fits with everything else, is easy to digest, and doesn't have all that noise. So get a good textbook and follow it. Stop chasing original manuscripts.
Also, the fashion of physics changes. Back in the day, physicists seemed much more comfortable writing long intuitive arguments with lots of words, and were always happy with a strong connection to classical physics, which they obviously understood deeply (since at the time, that was all of physics). These days it's far more common to prefer a mathematically oriented approach; people would rather see an equation than paragraphs of intuition dumping. We also don't care so much about a connection to classical physics anymore, since most physicists are now quite comfortable with quantum mechanics.
Even as a physics researcher, it can be painful and difficult to go back to the original papers for things. This is also reflected in the popularity of Wikipedia among physicists. I'm much more likely to understand the Wikipedia explanation of (for example) Bell's theorem than his original paper, "On the Einstein Podolsky Rosen Paradox." People have come up with better examples, emphasised important points better, and refined the understanding.
We also can't forget the impact that LaTeX (or equation typesetting in general) has had on physics. Reading some of these old typewritten equations can add unnecessary cognitive load.
Note that this is unique to the physical sciences. None of it can be relied on in philosophy (at least, continental philosophy). Sure, there are good modern summaries, like those on the SEP, but you can never really be sure that your interpretation of the author will match the summarizer's.
I can't. I tried reading e.g. Plato and I just want to argue with him forever. He starts with assumptions which were "obvious" then, but are just obviously wrong now. It's much better to read a modern interpretation / summary which takes the bits that are still relevant and contextualizes them with what we know today.
(Note - if you want to study history of something, by all means do read original texts. But if you actually want working knowledge of the subject, it's a horrible horrible way to go).
> The research was supported by grants from the US Office of Naval Research; the US Army Research Office Laboratory for Physical Sciences; the US Intelligence Advanced Research Projects Activity; Lockheed Martin; the Australian Defence Science and Technology Group, Sydney Quantum; a University of Sydney-University of California San Diego Partnership Collaboration Award; H. and A. Harley; and by computational resources from the Australian Government’s National Computational Infrastructure.
This is sponsored by the military.
So what's your point?
Why should we be worried that the military fund the internet, quantum computing, etc? Can there be any reason for concern?
Could it be that the military is less concerned about external threats and far more concerned with managing internal ones? And that funding technology to create a technocratic panopticon has long been in the making? If the goal is to direct technology towards technocratic infrastructure, that goal is really pretty complete when you think about it. Surely only a couple more elections before it is switched on, if that.
Kind of like how simulating anything in detail tends to be a lot slower than the actual thing you're simulating?
Is the difference here that it's basically an analog rather than a digital simulation?
Not following if there was any breakthrough here or not.
No, they didn’t simulate it in the way we typically simulate. They created a physical process that was an analogue of the actual process but 100 billion times slower, so they could directly observe it.
From the article[1]:
Our approach avoids the limitations of direct experiments on molecular systems, where only few observables such as spectra and scattering cross sections can be measured. [...] A further advantage comes from the ratio (r) of the ion’s natural timescale (ms) and the measurement speed (ns), leading to an increase in the observable timing resolution of r ∼ 10^6. This improves the achievable resolution of chemical dynamics measurements relative to ultrafast observations.
So seems this would be more like running a computer simulation with a super-short timestep, allowing you to extract more details of a process. It's only related to the wall-clock in that they're using a physical analog system, rather than a computer.
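A quick sanity check on the timescale mapping described in the excerpt. The two timescales (milliseconds for the ion, nanoseconds for the measurement) are the ones quoted above; nothing else is taken from the paper:

```python
# Timing-resolution gain from mapping fast chemical dynamics onto a slow
# trapped-ion analogue, using the numbers quoted in the paper excerpt.

ion_timescale_s = 1e-3           # ion's natural timescale: milliseconds
measurement_resolution_s = 1e-9  # measurement speed: nanoseconds

# Ratio r of the two timescales = gain in observable timing resolution
r = ion_timescale_s / measurement_resolution_s
print(f"r = {r:.0e}")  # matches the paper's r ~ 10^6
```

So each "tick" of the measurement apparatus resolves a far finer slice of the analogue dynamics than any ultrafast probe could resolve of the original chemistry.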
> Not following if there was any breakthrough here or not.
Again from the paper:
Our approach to quantum simulation using an MQB trapped-ion system makes chemical dynamics that are otherwise unmeasurable directly accessible in the laboratory. This is a key demonstration of the utility of small-scale quantum computational devices to offer practical insights into chemical dynamics and resolve intractable problems in chemical physics.
Seems the measurement itself was a showcase for the techniques developed.
They say they mapped the problem... So is this a model of an observation, or an actual observation?
> “Until now, we have been unable to directly observe the dynamics of ‘geometric phase’; it happens too fast to probe experimentally.
> “Using quantum technologies, we have addressed this problem.”
'cos if it's a model, they are obviously still not observing whatever-it-is directly, right?
PS I'm pretty sure they are talking about their model.
“You raise a good point. There is no absolute certainty that the analog quantum system operated in exactly the same way as the original chemical reaction dynamics it was meant to model. Some key caveats and limitations include:
- The analog system is still an approximation, so there may be small differences in how the dynamics play out compared to the real system.
- Mapping a complex molecular system onto qubits necessarily requires simplifications and abstractions that could influence the outcomes.
- Factors like experimental errors, imperfect state preparation or measurement in the trapped ion system may introduce discrepancies.
- Important details like multi-particle interactions or higher-order effects may not be fully captured.
- Verification that the analog system exhibits the same identifying signatures or phenomena as the natural system would strengthen confidence in the analogy.
So while the researchers aim to design the quantum analog to faithfully mimic the essential physics, perfect equivalence cannot be taken for granted due to modeling approximations and technological limitations. The mapping should be validated by testing for characteristic properties before concluding the slow-motion "observations" definitively represent the original phenomenon. With improvements, analog quantum simulation could provide increasingly accurate models of chemistry.”
Is this a reasonably well grounded statement? And if so, how can anybody hope to verify the analog is exhibiting “the same identifying signatures or phenomena as the natural system” if the whole point is that we can’t observe the natural system with any precision to start with?
It's a direct demonstration of the utility of quantum computation in molecular modeling. The meta-relevance, to me at least, is that it demonstrates real progress in one of the areas where quantum computation is most likely to have an important impact.
In order to describe chemical reactions or atomic arrangements in terms of wave equations, one normally treats the motion of the nuclei (slow/heavy) and the motion of the electrons (fast/light) separately, simplifying the Schrödinger equation via the Born-Oppenheimer approximation.
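For readers who want the equation behind that sentence: the Born-Oppenheimer ansatz factorizes the molecular wavefunction into an electronic part solved at fixed nuclear positions and a nuclear part that moves on the resulting potential energy surface (standard textbook form, not taken from the paper):

```latex
\Psi(\mathbf{r}, \mathbf{R}) \;\approx\; \psi_{\mathrm{el}}(\mathbf{r};\, \mathbf{R})\; \chi_{\mathrm{nuc}}(\mathbf{R})
```

The factorization fails precisely where electronic states become degenerate, which is what makes conical intersections interesting in the first place.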
In introductory chemistry textbooks [0], a diatomic example is mostly used as an illustration; for >2 atoms, usually only the ground state is considered. This is because (1) in a diatomic setting the nuclear vibrational degrees of freedom reduce to one, and (2) the ground state can be well distinguished from other electronic states.
However, when studying (advanced theoretical) chemistry or materials science, polyatomic arrangements with tightly packed electronic states and many nuclear degrees of freedom are the norm, and the theory of so-called conical intersections of electronic energy surfaces is essential in that regard.
Early on this was taken into account as the Jahn-Teller distortion [1]: a kind of spontaneous symmetry breaking which seemed exotic when it was first described in the 1930s. In the same vein, Teller later proposed an ultrarare occurrence, within a few vibrational periods (sub-femtosecond), by which a loss of electronic excitation is not followed by a photon being emitted: radiationless decay. In refined orbital models [2] this now seems to be a normal state of affairs, e.g. in organic chemistry [3].
Because of the tiny time scales involved, theoretically predicted phenomena like the geometric phase/Berry phase (whose mechanical analogue is the Foucault pendulum's dependence on Earth's latitude [4]) have not been observed yet. So, borrowing from a topological analogue (Dirac points) [5], a quantum simulation seemed feasible.
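The Foucault-pendulum analogue in [4] can be stated in one line: after one sidereal day at latitude λ, the swing plane has rotated by an angle fixed purely by the geometry of the path traced on the sphere,

```latex
\Delta\theta = 2\pi \sin\lambda = 2\pi - \Omega, \qquad \Omega = 2\pi\,(1 - \sin\lambda),
```

where Ω is the solid angle enclosed by the latitude circle. The Berry phase is the quantum counterpart of this purely geometric rotation: it depends only on the path traversed in parameter space, not on how fast it is traversed.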
To be honest, the actual paper [6] linked in the article was hard to follow, so I found a similar paper [7] where the presentation of the general idea is clearer and more concise.
[0]https://chem.libretexts.org/Courses/Pacific_Union_College/Qu...
[1]https://en.m.wikipedia.org/wiki/Jahn%E2%80%93Teller_effect
[2]https://core.ac.uk/download/pdf/9426023.pdf
[3]https://en.m.wikipedia.org/wiki/Quenching_(fluorescence)
[4]https://en.m.wikipedia.org/wiki/Geometric_phase#Foucault_pen...
[5]https://condensedconcepts.blogspot.com/2015/08/conical-inter...
https://www.mit.edu/people/dmredish/wwwMLRF/links/Humor/Admi...
There are 86,400 seconds in a day, so 345,600 seconds in 4 days.
This quantum-slowing effect reduces the speed by a factor of 100,000,000,000 (10^11).
Administratium is about as close to the actual speed of the reaction as it is to the slowed-speed reaction.
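Plugging in the numbers from the comments above (4 days of slowed-down observation, a slowdown factor of 10^11; these figures come from this thread, not the paper):

```python
# Back out the real-time duration of the dynamics from the slowed-down
# observation window, using the thread's numbers.

seconds_per_day = 86_400
observed_s = 4 * seconds_per_day   # 345,600 s of slowed-down dynamics
slowdown = 100_000_000_000         # factor of 1e11 quoted above

actual_s = observed_s / slowdown   # real-time duration of the dynamics
print(f"{actual_s:.3e} s")         # a few microseconds
```

On a log scale, that gap of eleven orders of magnitude is the whole joke: almost anything sits "about as close" to one end as the other.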
I understand the desire to make the discovery accessible, but this does not accomplish that. If we measure information by “what predictions can a reader now make that they couldn’t make before,” then this press release is information-free.
Instead we have a lot of words to attempt to create the impression of having read something.
- a quality that makes something seem removed from everyday life, especially in a way that gives delight.
- something that has a delightfully unusual quality.
- very effective in producing results, especially desired ones.
- (informal, British) wonderful; exciting.