I like that we have a name for this now. Let's keep calling it the "low-code fallacy", because I'm tired of explaining the same idea over and over: semicolons and for loops are not what makes programming hard.
(And semicolons are ugly and I avoid them wherever I can get away with it, but no, they are probably not the reason.)
I also agree that 0.1 + 0.2 != 0.3 is another thing that makes programming hard. This is intrinsic complexity: it is a fundamental consequence of how binary floating-point numbers are represented on essentially every computer. The way around it is -- you guessed it -- better programming languages that help you "fall into the pit of success". Perhaps floating-point equality comparisons should even be a compiler error. Again, low-code goes in the opposite direction by simply pretending this kind of fundamental complexity doesn't exist. You are given no power to avoid it biting you, nor to figure out what's going on when it does. Low-code's entire premise is that you shouldn't need to understand how computers work in order to program them, but of course understanding how floating-point numbers are represented is exactly how you avoid this issue.
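To make the problem concrete, here is a minimal Python sketch (the same behavior shows up in any language using IEEE 754 binary floating point):

```python
import math

# 0.1 and 0.2 cannot be represented exactly in binary floating point,
# so the tiny representation errors add up and exact comparison fails.
a = 0.1 + 0.2
print(a)                      # 0.30000000000000004
print(a == 0.3)               # False

# The usual workaround: compare within a tolerance instead of exactly.
print(math.isclose(a, 0.3))   # True
```

A language that "falls into the pit of success" might steer you toward the tolerance-based comparison by default, rather than letting `==` silently do the wrong thing.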
The SQL `numeric` type makes the right choice here: by making you declare precision and scale up front, it puts the problem right in front of you so you can't ignore it.
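Python's `decimal.Decimal` takes a similar stance to SQL `numeric`, for comparison: values are exact decimals rather than binary approximations, so arithmetic behaves the way the digits on the screen suggest.

```python
from decimal import Decimal

# Constructing from strings preserves the exact decimal value,
# much like storing the number in a SQL numeric column.
a = Decimal("0.1") + Decimal("0.2")
print(a)                     # 0.3
print(a == Decimal("0.3"))   # True
```

The cost is that you must opt in explicitly, which is exactly the kind of decision a tool hiding "how computers work" can never surface.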
That said, I completely agree with your main point: modern software development consists almost entirely of unnecessary complexity.