That's the sort of shift in the environment that the grandparent is talking about. Fundamental CS tech was arguably better in the 1970s and 1980s, because it moved more slowly and you had time to get the details right. That doesn't matter if you're building, say, a mobile Ethereum wallet in 2018, because you're building for the user expectations of today: users don't care about data integrity or security as long as nothing fails during the window when they're deciding which tech to adopt, and software that solves the problem (poorly) now beats software that doesn't exist.
I believe you are a victim of survivorship bias.
There was plenty of shitty software in the 70s and 80s. The difference between then and now is that we haven't had four decades to see which software of 2018 stands the test of time.
That has always been the case and is about as good a one-line summary of the software industry as I can think of.
The "modern" database systems are now going back to the exact design principles that the books you refer to solved long time ago. There is tons of research, dissertations,.. that focuses on this from decades ago.
Its just now that the new systems realize that these problems actually exist.
If you dont know the history of a certain field and what came out, you repeat and make the same mistakes again. This seems to also apply to software engineering.