All they need to do is filter for pure oxygen; nitrogen is fine as well.
It's really just CO2 that's the problem.
Besides, it's not radioactive oxygen isotopes that get into the steel in the first place - they're scant to non-existent in nature, since all three common isotopes of oxygen are stable and most of the rest decay in seconds. It's other radioactive isotopes in the air from the bomb tests.
99+% isn't 100%, and it turns out those tiny fractions of a percent of junk contain the isotopes that are the real problem, namely cobalt-60 created by the nuclear tests. Carbon-14 isn't nearly as big a problem, since it's a pure beta emitter and can be designed around, but the gamma radiation from cobalt-60 contamination is much harder to deal with.
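To put rough numbers on "harder to deal with": carbon-14's low-energy betas are stopped by a fraction of a millimeter of metal, while cobalt-60's ~1.2 MeV gammas punch through serious thicknesses of shielding. A back-of-envelope sketch, assuming the commonly cited half-value layer of roughly 1.2 cm of lead for Co-60 gammas (a textbook approximation, narrow-beam, ignoring buildup):

```python
# Sketch: fraction of Co-60 gammas surviving a lead slab, using the
# approximate textbook half-value layer (~1.2 cm of lead) for Co-60's
# ~1.2 MeV gammas. Narrow-beam estimate, not a real shielding design.

HVL_CM = 1.2  # assumed half-value layer of lead for Co-60 gammas, in cm

def transmitted_fraction(thickness_cm: float) -> float:
    """Fraction of gammas that pass through a lead slab of the given thickness."""
    return 0.5 ** (thickness_cm / HVL_CM)

for t in (1.2, 5.0, 10.0):
    print(f"{t:>5.1f} cm of lead -> {transmitted_fraction(t):.3%} transmitted")
```

Even 10 cm of lead only knocks the gamma flux down by a factor of a few hundred, which is why a gamma emitter baked into the shielding itself is so much worse than a beta emitter.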
Furthermore, because of cobalt's position on the periodic table, and because a small amount of cobalt is desirable in steel anyway for better working properties, it's not something that's easily filtered out, even in processes that reform the steel, like the vacuum remelting used to make mechanically harder, better-quality steel through slow melting and recrystallization. Once the cobalt's in there, it's in there - you just have to wait for it to decay.
As it turns out, we're in luck: most of the fallout from those bomb tests has passed through numerous half-lives and is much less of a problem today than it was in the 1980s and 1990s, when the low-background stuff became such a hot commodity. So it doesn't really matter as much that we're running out. Furthermore, oxygen separation technologies and cryogenic liquid handling have improved, so we can do an even better job keeping contamination out. If someone wanted to set up a low-background mill, they probably could do it today with commodity molecular sieves and centrifugation of the oxygen, rejecting all but the light fraction...
Also, what's in the lead? I didn't think CO2 in lead was the concern, but I'm really not sure what impurities it might have naturally.
So the term "fallout" is actually a pretty good piece of propaganda. While a lot of it did or does indeed "fall out", there's still a lot of radioactivity in the air and on the surface from those nuclear tests in the form of fine particulate. It's in the fine dust all around you as 100 nm and smaller particles, dancing through the air via Brownian motion. It's all over everything all of the time. It's in the water and the ocean. Nanograms here and there and everywhere. Not enough to really cause you health problems anymore, but plenty to raise the background radiation of the entire surface of the planet by a tiny amount.
How it gets into the steel is through the actual blasting of oxygen into the melt: hundreds of cubic meters of oxygen are used per ton of steel made, concentrating those tiny particulates into the steel and dissolving them throughout the melt. This is precisely why ultrapure oxygen and vacuum processes could be used to make lower-background steel today... if there were high enough demand to justify the absurd cost of that kind of handling. Fortunately, though, plenty of steel was made before the 1940s, and demand is not all that high, since it's usually used as a shielding material and not as large structural elements. As long as they don't remelt it, or only do so in a high-vacuum reformer, the metal retains its low-background nature.
Intermediate-lived isotopes (cobalt-60, strontium-90, cesium-137, and so on) are the particular problem children of nuclear fallout in steelmaking. The cesium and strontium are largely removed by the same processes that make the steel in the first place - they're simply reactive enough to bond with the silicon and carbon and aluminum impurities being removed and will happily carry the majority of themselves out as part of the slag. So while they do contribute to the background, they're not the main problem. Cobalt, on the other hand, is right next to iron on the periodic table and it's happy to stay stuck to the iron, even through rounds of recrystallization. Once the cobalt is in the steel, it's in there until it decays away to nickel over the following decades. (This is also why it's much less of a problem now than it was even 20 years ago; the half-life of cobalt-60 is about 5.3 years, which means most of what the nuclear testing produced is already gone - most of what remains comes from nuclear reactor releases and neutron activation products.)
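The decay arithmetic here is simple enough to sketch. The half-lives below are the commonly cited values; picking 1963 (the Partial Test Ban Treaty, after which atmospheric testing largely stopped) as the reference date is just an illustrative assumption:

```python
# Sketch: how much of each fallout isotope's 1963 inventory is left today,
# given its half-life. 1963 as the cutoff is an illustrative assumption.

def fraction_remaining(half_life_years: float, elapsed_years: float) -> float:
    """Fraction of an isotope left after elapsed_years, given its half-life."""
    return 0.5 ** (elapsed_years / half_life_years)

# Commonly cited half-lives, in years
isotopes = {
    "cobalt-60": 5.27,
    "strontium-90": 28.8,
    "cesium-137": 30.1,
}

elapsed = 2024 - 1963  # ~61 years since the atmospheric test ban

for name, t_half in isotopes.items():
    frac = fraction_remaining(t_half, elapsed)
    print(f"{name}: {frac:.3%} of the 1963 inventory remains")
```

The cobalt-60 has been through more than eleven half-lives and is down to a tiny fraction of a percent, while the ~30-year strontium and cesium still have roughly a quarter of their inventory left - which matches the pattern above: the cobalt problem has largely decayed away on its own.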
The more sensitive your instrument needs to be, the more radioactive contamination wrecks your measurements, which is why physics experiments already go to extreme lengths to keep everything clean of dust and debris, and are often located underground or underwater to shield against cosmic rays and atmospheric muons. But the even higher-sensitivity experiments, like dark matter searches and measurements of the cosmic background radiation, have little choice but to reach for low-background steel and lead as shielding material.
Also, maybe this isn't really a concern, but "high temperature and pure oxygen" makes me think "metal fire."