I found the answer here:
https://en.wikipedia.org/wiki/Atomic_clock#Accuracy
It turns out that the current state of the art is 10^-15. Which immediately raises the second question: how do they measure this? 10^-15 is an error of roughly 30 nanoseconds a year. GR causes a shift within an order of magnitude of that (about 2x10^-16) between your head and your feet when you stand up.
https://www.nist.gov/news-events/news/2010/09/nist-clock-exp...
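A quick back-of-the-envelope check of both numbers (a Python sketch; the constants and the 1.7 m height are round figures I'm assuming):

    SECONDS_PER_YEAR = 3.156e7   # ~365.25 days
    C = 2.998e8                  # speed of light, m/s
    G_SURFACE = 9.81             # Earth's surface gravity, m/s^2

    # Fractional accuracy -> accumulated time error per year
    frac = 1e-15
    print(f"drift: ~{frac * SECONDS_PER_YEAR * 1e9:.0f} ns/year")   # ~32 ns

    # Weak-field GR shift over a height difference h: dnu/nu ~ g*h/c^2
    h = 1.7  # meters, roughly head-to-feet (assumed)
    print(f"head-to-feet shift: ~{G_SURFACE * h / C**2:.1e}")       # ~1.9e-16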
I haven't done the math, but I'm guessing that just standing next to a 10^-15 clock would have noticeable effects due to your own gravitational field.
Also: why does thorium-229 in particular have such a low-energy nuclear transition? That seems kind of random.
Mind-boggling stuff.
The gravitational redshift amounts to around 1E-16 (not 1E-15) for every meter you move the clock closer to or farther from the Earth. You standing next to it is going to have absolutely no effect.
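To put a number on "absolutely no effect": a clock's redshift tracks the gravitational potential, so a nearby mass contributes roughly GM/(r c^2). A sketch, taking an 80 kg person 1 m away as assumed illustrative figures:

    G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
    C = 2.998e8      # speed of light, m/s

    # Redshift from a nearby mass M at distance r: dnu/nu ~ G*M/(r*c^2)
    M, r = 80.0, 1.0   # an 80 kg person 1 m away (assumed)
    print(f"person's shift: ~{G * M / (r * C**2):.1e}")   # ~5.9e-26

That is about ten orders of magnitude below the ~1E-16-per-meter elevation effect.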
Maybe GR messes this up, but at least with Newtonian gravity the difference is not as stark as this comparison would imply: a 1 m change in elevation is about 300 µGal, while an 80 kg person 1 m away is about 0.5 µGal. So it's still not going to show up on a 1e-18 clock, but it's pretty close.
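Checking those gravimetry numbers (the free-air gradient of ~3.086e-6 s^-2 is the standard textbook value; 1 µGal = 1e-8 m/s^2):

    G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
    UGAL = 1e-8              # 1 microgal in m/s^2
    FREE_AIR = 3.086e-6      # free-air gravity gradient, (m/s^2) per meter

    print(f"1 m elevation: ~{FREE_AIR * 1.0 / UGAL:.0f} uGal")     # ~309
    print(f"80 kg at 1 m:  ~{G * 80.0 / 1.0**2 / UGAL:.2f} uGal")  # ~0.53

(Both are accelerations, i.e. what a gravimeter sees; the clock shift itself follows the potential difference, as in the sketch above.)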
This, plus the recent advances in clock precision. In the last few years, cryogenic sapphire clocks have achieved short-term stability on the order of tens of attoseconds.
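For scale, the time error accumulated over an averaging interval tau is about sigma_y(tau) * tau. A sketch assuming (my figure, purely for illustration) a short-term fractional stability of a few parts in 1e17:

    # Accumulated time error ~ sigma_y(tau) * tau
    sigma_y = 5e-17   # assumed illustrative short-term fractional stability
    tau = 1.0         # averaging time, seconds
    print(f"~{sigma_y * tau / 1e-18:.0f} attoseconds over {tau:.0f} s")  # ~50 as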