I am a huge C fan but this is not true at all. C has tons of pitfalls, especially with modern UB-aggressive optimizing compilers. There are a lot of rules you need to be aware of that are not naturally occurring results of the fundamentals.
You put your finger on the problem: "modern UB-aggressive optimising compilers". C, the language, is actually quite simple (if not easy). The crazy stuff that compiler writers have been doing recently while aggressively misreading the C standard is the problem, and it does make things very complicated.
Why "misreading"?
From 1.1:
"The X3J11 charter clearly mandates the Committee to codify common existing practice."
Their emphasis, not mine. So is there a mandate to use the definitions of the standard to invalidate common existing practice? Clearly not. Yet that is what is happening.
More from the standard (defining UB):
"Undefined behavior gives the implementor license not to catch certain program errors that are difficult to diagnose. It also identifies areas of possible conforming language extension: the implementor may augment the language by providing a definition of the officially undefined behaviour."
Does it say "Undefined behaviour gives implementors license to add new optimisations that break existing programs"? Clearly and unambiguously not.
> More from the standard (defining UB):
Your quote is not from the normative text of the standard, but from the non-normative rationale. Note however that it explicitly says that programs that contain undefined behaviors are erroneous, and that the implementation is not required to emit diagnostics for the UB. Pretty clearly this allows implementations to optimize erroneous programs into whatever they think is funny this week.
The normative text of the standard is pretty unambiguous:
undefined behavior
behavior, upon use of a nonportable or erroneous program construct or of erroneous data,
for which this International Standard imposes no requirements
http://www.iso-9899.info/n1570.html#3.4.3
Utter nonsense. I use that word carefully, but in this case it is absolutely appropriate.
Per an old but very useful definition, compiler optimisations aren't allowed to change the visible behaviour of a program, in terms of its output; obviously they are allowed to change execution times.
For example, even just a couple of years ago the compilers I used would actually execute a loop that sums the first n integers. Nowadays compilers detect this pattern and replace the loop with the result. While this isn't particularly useful (probably the only reason you're summing the first n integers in a loop is to do some measurements), it is (a) a perfectly legal optimisation and (b) one that happened after 1990.
Unsurprisingly, you left out the second part of the (later) definition:
NOTE Possible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message).
Notably absent is "use the undefined behaviour to shave another 0.2% off my favourite benchmark".
No it doesn't say that. It says that they are either "nonportable" or "erroneous". I'll take "nonportable" for 400, please.
Maybe I live in a C reality distortion field. :)
void free_circularly_linked_list(struct node *head) {
    struct node *tmp = head;
    do {
        struct node *next = tmp->next;
        free(tmp);
        tmp = next;
    } while (tmp != head);
}
Can you spot the undefined behavior?
I've written up a demo with your code, running it through several analysers:
https://gist.github.com/technion/1b12c9b4581e915241d9483c5c2...
The tl;dr here is that tis-interpreter is a fantastic new tool, as it correctly complains about this.
Edit: I also note a departure from yesteryear, when every linting tool would only manage to complain about unchecked malloc() returns.