It took quite a while for personal computers to catch up with the supercomputers of the 50s and 60s. Many tools and concepts may have been conceived 60 years ago, but they became easily accessible only much more recently. In the late 80s and early 90s most people still had to pay a lot of money for tools like a C compiler or a UNIX-like system. The Internet became widely accessible only in the 90s.
Some old technologies are only now becoming easily available, like transactional memory. It used to require specialized hardware, but now many languages have libraries for software transactional memory, and some even implement it as a language feature. This development is likely due to the rise of consumer multi-core devices.
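To make the idea concrete, here is a minimal sketch of how a software transactional memory library can work, written as toy Python (the `TVar`, `atomically`, `read`, and `write` names are my own for illustration, not from any real library): each transaction records the versions of the variables it reads, buffers its writes, and at commit time validates that nothing it read has changed, retrying otherwise.

```python
import threading

class TVar:
    """Transactional variable: a value plus a version counter."""
    def __init__(self, value):
        self.value = value
        self.version = 0

_commit_lock = threading.Lock()  # serializes validation + commit

def atomically(txn):
    """Run txn(read, write) as a transaction; retry on conflict."""
    while True:
        reads = {}   # TVar -> version observed when first read
        writes = {}  # TVar -> buffered new value
        def read(tv):
            if tv in writes:          # read-your-own-writes
                return writes[tv]
            reads.setdefault(tv, tv.version)  # record version first,
            return tv.value                   # then fetch the value
        def write(tv, val):
            writes[tv] = val          # buffer; nothing is visible yet
        result = txn(read, write)
        with _commit_lock:
            # Validate: did anything we read get committed over?
            if all(tv.version == v for tv, v in reads.items()):
                for tv, val in writes.items():
                    tv.value = val
                    tv.version += 1
                return result
        # Conflict: another transaction committed first; retry txn.
```

A transfer between two accounts then either commits as a whole or retries, with no locks visible to the caller:

```python
a, b = TVar(100), TVar(0)

def transfer(read, write):
    write(a, read(a) - 30)
    write(b, read(b) + 30)

atomically(transfer)
# a.value is now 70, b.value is 30
```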
Sure, I'd use crazy new tools in addition, but it's not like the fundamentals of a field have to be constantly reinvented; they're just honed, refined, expanded, and specialized. The space-flight-specific bits would have their own new ways of working specific to them, but at the end of the day I'd still just be bolting a bunch of shit together. If the new tech introduces improvements to the fundamentals to support itself, those improvements will slowly get rolled back into the base systems.
Calling for a revolutionary breakthrough is silly; his car analogy is a good example. The automobile's history has been a slow evolution of better ways to perform the same old tricks. Revolutions within software are going to be smaller because the field already exists. At a high level nothing has changed, while on a smaller scale the last few years have completely redefined source control.
modern programming languages are quite different from those of the 50s
personally, I think the industry missed a beat in not standardizing CASE tools.