Excellent example! I'm no climate scientist, but I am a former geoscientist, and as it happens, all of our estimates of historic atmospheric conditions rest on various proxies, which in turn rest on assumptions. Take ice cores, for example. The theory is that bubbles of gas are trapped as firn compacts into ice, and a very strong assumption is that the gas ratios in those bubbles are not altered by processes like diffusion over the thousands of years of gradual burial. In a cursory search I could not find any literature questioning, or even exploring, those assumptions. But it is a fact that at least one other indicator contradicts the ice-core estimates: plant stomata report higher historic CO2 concentrations than ice cores do.
Now, a hypothetical thought experiment: suppose a scientist is able to obtain funding for and completes research which suggests that ice cores actually underestimate peak CO2 concentrations, indicating that modern levels are actually not as unprecedented as commonly believed. Which scenario do you think is more plausible:
1. The paper is published, and the field of climate scientists, quite entrenched in a particular dominant viewpoint, becomes much more sceptical.
2. The scientist is shunned as a denier, if they even find an outlet willing to publish, and their career is jeopardized.
Remember, scientists need to eat too.
I do not wish to turn this into a debate about climate science. But there are other perfectly valid indicators, hard to find in "prestigious" journals, which are in desperate need of scrutiny but are not explored because people who spend their best years pursuing a PhD are unlikely to risk destroying their careers and losing their jobs when they have safer questions to ask. That's a large component of this emergent pressure to research in a particular safe direction.
Honestly, given the massive uncertainty we deal with in geoscience, even under strong economic pressure to be precise (oil wells are expensive), it is extremely difficult for me to see the same wildly uncertain proxies, models, and technologies being used to justify treating facts in climate science as unquestionable, particularly since the risk of overestimating warming costs far less than drilling a $200MM dry hole. Climate is a massive, complex, chaotic system that we have only recently begun studying; it requires enormous infrastructure for data collection, processing, and modeling, and the models cannot be empirically verified. To believe the science is settled and beyond questioning is naive. CO2 levels are rising, but how far this deviates from the norm, and how bad it may ultimately be for humans and animals, is still an open question.
It seems to me this should be an easy matter to analyze in detail. Diffusion would blur two distinct bands into each other, right? A sharp change in gas composition would appear more gradual with time, as diffusion blurred the boundary. So, are there sharp changes in the old ice? Lack of sharp changes might not prove anything, but the presence of sharp changes in very old ice should be informative. I'm no statistician but it seems to me a statistician should be able to look at the data and give you an upper bound on diffusion.
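The bound you're describing falls out of the standard analytic solution to the 1-D diffusion equation: a sharp step evolves into an error-function profile whose transition width grows like sqrt(D·t), so an observed sharp transition of known age caps the effective diffusivity. Here's a back-of-envelope sketch of that inversion; the numbers (a 280 vs. 180 ppm step, a 1 cm transition in 50,000-year-old ice) are made up for illustration, not real core data:

```python
import math

def step_after_diffusion(x_m, t_s, D, c_left=280.0, c_right=180.0):
    """CO2 (ppm) at offset x_m (metres) from an initially sharp step,
    after diffusing for t_s seconds with diffusivity D (m^2/s).
    Analytic erf solution for a step at x = 0 under 1-D diffusion."""
    mid = 0.5 * (c_left + c_right)
    half = 0.5 * (c_left - c_right)
    return mid + half * math.erf(-x_m / (2.0 * math.sqrt(D * t_s)))

def diffusivity_upper_bound(width_m, age_s):
    """If a transition with this 10%-90% width (metres) survives for
    age_s seconds, the effective diffusivity can be at most this value.
    For an erf profile the 10-90 width is 2*erfinv(0.8)*2*sqrt(D*t)
    ~= 3.625*sqrt(D*t); invert that for D."""
    k = 4.0 * 0.9062  # 4 * erfinv(0.8)
    return (width_m / k) ** 2 / age_s

# Hypothetical: a 1 cm sharp boundary observed in ~50,000-year-old ice
age_s = 50_000 * 3.15e7  # seconds
print(diffusivity_upper_bound(0.01, age_s))  # cap on D, in m^2/s
```

A real analysis would also have to account for firn densification, gravitational fractionation, and measurement smoothing, but the statistical core is exactly this: sharp features in old ice translate directly into an upper bound on D·t.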
The oil industry is literally the wealthiest industry in the history of the world. Oil companies and petrostates are perfectly capable of funding and feeding any and all contrarian scientists. I find it really hard to believe that the stigma of being labeled a 'denier' is what is holding back researchers. More likely, plenty of this research is being or has been done, but nothing truly compelling has yet been found.