https://non-trivial-solution.blogspot.com/2022/04/do-we-have...
Another take many people in the field are expressing is that it's simply infeasible to reliably interpret statistical models at that level (especially for a result dominated by systematic uncertainty), since they rest on approximations and assumptions, e.g. that certain nuisance parameters are "nicely" distributed and uncorrelated. See e.g. comments from Prof. Cranmer [1], one of the people who developed the standard statistical formalism and methods used in modern particle physics experiments.
[1] https://twitter.com/kylecranmer/status/1512222463094140937?s...
I know these people are incredibly smart and conscientious. And the standard model is extremely successful and well confirmed. But that's a lot of degrees of freedom.
That's... cute. I doubt it will stop the theorists from flooding the arXiv with explanations in the coming days/weeks. Recall what happened when there was a barely-3-sigma (local) statistical fluctuation in LHC data:
https://resonaances.blogspot.com/2016/06/game-of-thrones-750...
Edit: Thank you for posting the excellent article!
In particle physics, sigma denotes "significance", not standard deviation. Technically what we're quoting as "sigmas" are "z-values", where z=Phi^{-1}(1 - p), where Phi^{-1} is the inverse CDF of the Normal distribution and p is the p-value of the experimental result. So, 7 sigma is defined to be the level of significance (for an arbitrary distribution) corresponding to the same quantile as 7 standard deviations out in a Normal distribution.
In other words, "z sigma" means: That a result like this occurs as a statistical fluke, is just as likely as a standard-normal distributed variable giving a value above z.
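A quick numeric sketch of that conversion (assuming Python with scipy; the function names are mine):

    # p-value <-> "sigma" conversion as defined above.
    from scipy.stats import norm

    def z_from_p(p):
        # Significance z such that a standard normal exceeds z with probability p.
        return norm.isf(p)  # same as Phi^{-1}(1 - p), numerically stable for tiny p

    def p_from_z(z):
        # One-sided tail probability (p-value) for a given significance z.
        return norm.sf(z)

    print(p_from_z(5))  # ~2.9e-7, the usual "5 sigma" discovery threshold
    print(p_from_z(7))  # ~1.3e-12, the level quoted for this result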
Nitpick: this is still a standard deviation in some (potentially very contrived and nonlinear) coordinate system. (As a simple example, a log-normal distribution might have a mean of 1 and a standard deviation that effectively amounts to multiplying or dividing by 2. Edit: also, multidimensional stuff might have to be shoehorned into a polar coordinate system.) But in practice you'd never bother to construct such a coordinate system, so it's more a mathematical artifact than anything useful.
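For the curious, a small sketch of that log-normal example (assuming numpy; strictly, the median here is 1, close to the "mean of 1" above):

    # In log coordinates a log-normal is just a normal, so "one sigma"
    # turns into a multiplicative factor of 2.
    import numpy as np

    rng = np.random.default_rng(0)
    samples = np.exp(rng.normal(loc=0.0, scale=np.log(2), size=100_000))

    log_s = np.log(samples)
    print(log_s.mean(), log_s.std())  # ~0 and ~0.693 (= ln 2): ordinary mean/sigma in log space
    print(np.exp(log_s.std()))        # ~2: the "multiply or divide by 2" factor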
> In statistics, the standard deviation is a measure of the amount of variation or dispersion of a set of values.
https://en.wikipedia.org/wiki/68%E2%80%9395%E2%80%9399.7_rul...
Known unknowns and unknown unknowns, as Rumsfeld would put it.
About a decade ago I saw a very nice figure of estimates of the speed of light over time showing this effect. Unfortunately I haven't been able to find it since.
https://www.nhn.ou.edu/~johnson/Education/Juniorlab/C_Speed/...
Edit: here are some error bars!
https://www.researchgate.net/figure/Uncertainties-in-Reporte...
So instead of the heavy theory, I'd like to see the stuff that made people scratch their heads in the first place.
Rutherford [1] showed that atoms consist of a tiny, positively charged nucleus and a rather large, negatively charged shell. It was hypothesized that electrons fly around the nucleus like planets around the sun. But we already knew at that point that accelerating charges emit radiation, which would cause the electron to lose energy and spiral closer to the nucleus. So it should pretty much immediately collapse into a point. Bohr then showed that if you assume only certain orbits are allowed, it works out pretty nicely. Nowadays we know that there is such a thing as a ground state, meaning the lowest amount of energy the electron can possibly have around a nucleus is still enough to keep it moving.
The idea for quantizing things came from observing the black body spectrum. If you sum up all contributions classically, you get infinity. Planck tried to see what happens if you assume that energy comes in little packets instead of a continuous spectrum. He didn't have any justification for it, but it matched the observations pretty well.
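To illustrate that divergence (the classical result is the Rayleigh-Jeans law; this is just a sketch with rounded constants):

    # Classical (Rayleigh-Jeans) vs Planck spectral radiance: they agree at low
    # frequency, but the classical curve grows without bound while Planck's
    # quantized version falls off exponentially.
    import math

    h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23  # SI units, rounded
    T = 5000.0  # temperature in K (arbitrary choice)

    def rayleigh_jeans(nu):
        return 2 * nu**2 * k_B * T / c**2

    def planck(nu):
        return (2 * h * nu**3 / c**2) / (math.exp(h * nu / (k_B * T)) - 1)

    for nu in (1e13, 1e14, 1e15):
        print(f"{nu:.0e} Hz: classical={rayleigh_jeans(nu):.3e}  Planck={planck(nu):.3e}")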
This is more up to date and specifically on challenges to the SM: "Where is physics going?" with Sabine Hossenfelder, Bjørn Ekeberg and Sam Henry: https://www.youtube.com/watch?v=b8npmtsfsTU&t=2306s
As a sibling poster commented, the blackbody spectrum was also inexplicable from a classical point of view (see https://www.feynmanlectures.caltech.edu/I_41.html Section 41-2), but I think that the specific-heat problem was known before the blackbody problem.
Theoretical Concepts in Physics by Malcolm Longair is a mix of history and physics, explaining how physicists came to discover their theories. It does cover quantum mechanics, though I don't think it says much about modern particle physics.
Introduction to Elementary Particles by David Griffiths if you just want particle physics. Griffiths also has an intro book on quantum mechanics.
Anyway, the books you proposed look interesting, thanks.
You’re probably thinking of how a proton has a mass of 938 MeV/c^2. This is still a mass and not a voltage. 1 eV (electronvolt) is the amount of kinetic energy that an electron would gain after being accelerated through an electric potential of one volt. By mass-energy equivalence, 1 eV is equivalent to a mass of ~1.783x10^-36 kg, and a proton has a mass of ~1.673x10^-27 kg.
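The arithmetic behind those numbers, as a sketch (the two constants are the SI-exact values):

    # eV -> kg conversion via E = m c^2.
    e_charge = 1.602176634e-19  # joules per eV (exact by SI definition)
    c = 299_792_458.0           # speed of light in m/s (exact)

    kg_per_eV = e_charge / c**2
    print(kg_per_eV)                   # ~1.783e-36 kg, as stated above

    proton_mass_eV = 938.272e6         # proton mass in eV/c^2
    print(proton_mass_eV * kg_per_eV)  # ~1.673e-27 kg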
Yes, that’s what I was thinking. But it seems that there is a problem with the definition of the word “mass”. Clearly there are at least two definitions. First, the weight of an object: here weight is measured, and that weight is called “mass”. There is no equivalence; the same thing is called both weight and mass. Weight and mass are synonyms. This mass has nothing to do with electricity and nothing to do with motion.
The second definition of mass is related to electricity and motion, and has no meaning outside electricity. In this case, they accelerate an electric current, measure its kinetic energy, and call this kinetic energy “mass”. Again these words are synonyms. Why physicists like these silly word games so much, I have no idea.
(Like if you're trying to predict what happens when speeds approach those of light, you have to make weird relativistic corrections stemming from the observed speed of light being the same for all observers, regardless of their relative speeds.)
If you're lucky enough not to be in one of those weird cases, modeling objects as spherical things with a sharp boundary in space and using the normal composition of speeds works just fine.
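For instance, the relativistic composition of parallel speeds looks like this (a sketch, not from the thread):

    # Relativistic velocity addition: w = (u + v) / (1 + u*v/c^2).
    c = 299_792_458.0  # m/s

    def compose_speeds(u, v):
        return (u + v) / (1 + u * v / c**2)

    print(compose_speeds(30.0, 30.0))            # ~60 m/s: everyday speeds just add
    print(compose_speeds(0.9 * c, 0.9 * c) / c)  # ~0.994: two 0.9c speeds never exceed c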
This is important because the mass of that particle was predicted by our generally accepted theory of how the universe works. If the measured mass is different, it means the theory hasn't taken into account everything it should.
https://news.fnal.gov/2022/04/cdf-collaboration-at-fermilab-...
do they stand by the result or is it more of a call for "hey, come have a look at this. we can't explain it."
it's got to be anxiety inducing! (and exciting, of course)
sqrt(6.4^2 + 6.9^2) ≈ 9.4
You can have a look here: http://ipl.physics.harvard.edu/wp-uploads/2013/03/PS3_Error_...
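(That sqrt(6.4^2 + 6.9^2) is the usual quadrature rule for combining independent uncertainties; in Python it's a one-liner:)

    import math

    # Independent uncertainties add in quadrature.
    print(math.hypot(6.4, 6.9))  # ~9.41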
This is from the editor's comment at the top of the article. I'm guessing it was a mistake, but that might be why people are getting thrown off by it.
For years I've argued that foreign symbols and single-letter variable names mainly serve to keep a walled garden around the sciences. This was cemented when I eventually went for a master's degree and was expected to do the same in compsci to get a better grade, even though there was no advantage. If we could just write what we mean, I suspect people would find it more useful, even if it looks less cultivated and more mainstream.
(To be clear, this is not criticism of the person I'm replying to, but of the author of this specific title and of most of the sciences as a whole, since it's a near-universally supported barrier, if only ever implicitly, aside from a few science communicators.)
Edit: scrolled further in the thread. Looks like I'm not the only one, though this person at least knew to name the sigma: https://news.ycombinator.com/item?id=30955621
I know nothing about Quantum though, only maths.
When you add in the "10% chance that some scientist messed up the maths or something in the experiment", then it's impossible to ever reach 7 sigma...
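To make that concrete: if there's a 10% chance the analysis itself is wrong, the total probability that the result is spurious can never fall below roughly 0.1, which caps the effective significance near 1.3 sigma (a sketch, assuming scipy):

    from scipy.stats import norm

    p_stat = norm.sf(7)      # statistical p-value at "7 sigma", ~1.3e-12
    p_mistake = 0.10         # assumed chance of an analysis error
    p_total = p_mistake + (1 - p_mistake) * p_stat

    print(norm.isf(p_total)) # ~1.28 "sigma" effective, nowhere near 7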
Probability is subjective, in this case because it's dependent on the design of the experiment and the quality of the analysis used to determine the p-value of a given result.
The book "Bayesian analysis in high energy physics" is a short and sweet introduction. If I got the title wrong I'll update it later.
I would assume that the implication is that it's 7 sigma, assuming the measurements were done correctly.
EDIT: Yes, because the Gaussian distribution extends to +/- infinity; davrosthedalek explains it best, below.
Alternate reply: Gaussian approximation to the binomial is perfectly valid in all sorts of cases.
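For example (a quick check, assuming scipy):

    # Normal approximation to the binomial tail, with continuity correction.
    from scipy.stats import binom, norm

    n, p = 10_000, 0.5
    mu, sigma = n * p, (n * p * (1 - p)) ** 0.5

    k = 5_100  # how surprising are >= 5100 heads in 10000 fair flips?
    print(binom.sf(k - 1, n, p))            # exact binomial tail
    print(norm.sf((k - 0.5 - mu) / sigma))  # normal approximation, very close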