I don't see where you're going, though.
If the compiler is unable to prove a transformation preserves correctness, it should not do the transformation.
To your point below: The compiler is definitely not "forced" to assume that the pointer is not null - that is a choice made by the compiler writers. Even the ridiculous standard does not require the compiler writer to make that assumption. The compiler can simply compile the code as written or, if it is smart enough to see the problem, produce a warning.
In the axiomatic system implied by the standard, the hypothetical compiler being discussed can prove that the null check can be eliminated. The fact that you believe this axiomatic system is inconvenient does not constitute a refutation of the truth of the theorem.
> If the compiler is unable to prove a transformation preserves correctness, it should not do the transformation.
Actually, the compiler is able to prove that the transformation preserves correctness. Eliminating a null check after the pointer has been dereferenced does in fact preserve correctness: either the function is never called with NULL, and the program is correct, or the function is sometimes called with NULL, and the program is already incorrect.
> the bureaucratic mindset holds that the rules are always, by definition, correct
Since you think that the compiler following the standard rigorously is "bureaucratic" and, I imagine, bad, it follows that you would prefer the compiler to sometimes ignore what the standard says and do something different. I suggest that you try compiling your code with a compiler for a different language. Compilers for languages other than C are guaranteed to not follow the C standard.
EDIT: I think I see where you're going. Your argument is that, since on some platforms NULL can be correctly dereferenced, if the compiler were to eliminate the null check, that would change the behavior of the program. If a compiler for such a platform did that, I would agree that it would not be preserving correctness. A compiler for a given platform is only required to generate correct code for that particular platform. Compilers for platforms where dereferencing a null pointer is always invalid can correctly eliminate that late null check.
No. The compiler is forced to assume x isn't null, because int y = *x; has no meaning if x IS null, so the compiler can't possibly generate any code to cover that case. There's no definition of that construct for the compiler to work from that could possibly allow x to be null.
Blame the standard if you want, but you can't blame the compiler for not generating code to cover behaviour that you've made up in your head.
It has long been known that you can get better optimization if you disregard correctness ;)
C compiler writers know that the assumption "pointer is not null because it was dereferenced" doesn't hold in general. C compiler writers know that they're performing incorrect optimizations.
The bureaucracy now doesn't tell them that the transformation is correct. It tells them it is fine to be incorrect because UB.
The bureaucracy gives them a "license to kill" for the greater good.
(What is the greater good, you ask? Can't answer that; ask a bureaucrat.)
You could argue the rules are "wrong", but that's a totally different topic.