TDD has its place, but treating it as incontrovertible dogma is not useful.
> I never spoke out 'against' TDD. What I have said is, life is short and there are only a finite number of hours in a day. So, we have to make choices about how we spend our time. If we spend it writing tests, that is time we are not spending doing something else. Each of us needs to assess how best to spend our time in order to maximize our results, both in quantity and quality. If people think that spending fifty percent of their time writing tests maximizes their results—okay for them. I’m sure that’s not true for me—I’d rather spend that time thinking about my problem. I’m certain that, for me, this produces better solutions, with fewer defects, than any other use of my time. A bad design with a complete test suite is still a bad design. (http://www.codequarterly.com/2011/rich-hickey/)
Another question to ask is: does this help others understand the problem? You often write tests not for yourself, but for other people, including the later you who popped the problem off the mental stack long ago. Writing tests might cost you time right now, but the net time saved by others may make up for it. Let me emphasize the word "may", because you might make the problem clearer by writing documentation in English rather than code. So the only conclusion is that it depends, it depends...
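To make the "tests as documentation for others" point concrete, here is a hedged sketch (the `slugify` helper and its behavior are made up for illustration): each assertion records a design decision that a future reader would otherwise have to reverse-engineer from the implementation.

```python
import re

def slugify(title):
    """Convert a title to a URL-friendly slug."""
    slug = title.strip().lower()
    # Collapse every run of non-alphanumeric characters into one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", slug)
    return slug.strip("-")

def test_slugify_documents_behavior():
    # Punctuation becomes a single separator, not several.
    assert slugify("Hello, World!") == "hello-world"
    # Leading/trailing whitespace never leaks into the slug.
    assert slugify("  spaced  out  ") == "spaced-out"
    # Degenerate input yields an empty slug rather than an error.
    assert slugify("---") == ""

test_slugify_documents_behavior()
```

Whether these facts are better conveyed as tests or as a few sentences of English is exactly the judgment call the paragraph above describes.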
The benefits of using tests as documentation come regardless of religious adherence to test-first, and in my opinion are often hindered by it. I agree that tests can make great documentation, as well as adding value for a number of other reasons (regression, etc).
I severely question the concept of writing them first in all (or even most) cases, though.
In general this means you will be writing mathematical or algorithmic code which can easily be separated from the rest of the system.
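A minimal illustration of that separation (the `compound` function is a made-up example, not from the source): the algorithmic core is a pure function with no I/O or system state, so it can be tested in isolation from the rest of the application.

```python
def compound(principal, rate, periods):
    """Value of `principal` after `periods` periods at per-period `rate`."""
    return principal * (1 + rate) ** periods

# The surrounding system (database reads, UI) is not exercised here;
# only the separable mathematical core gets a direct check.
assert compound(100.0, 0.0, 5) == 100.0
assert abs(compound(100.0, 0.10, 2) - 121.0) < 1e-9
```

Code entangled with databases, networks, or UI frameworks is far harder to test this way, which is one reason test-first pushes you toward separable designs in the first place.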
I dare anyone to provide any example of incontrovertible dogma as being useful.
Always use version control.
The important thing, to me, is that the language's path of least resistance leads you to The Right Thing. This generally improves quality of life for good devs and reduces the amount of harm bad devs can cause.
So is it an objectively bad language? Arguably. It just happens to be better for some tasks than all the alternatives, giving me code that's "safer" than C, almost as fast as C, and portable to all the platforms I need to target.
Microsoft TRIED to get people to move to a "better" language (C#, which IS arguably better than Java) but the .NET runtime is massive and doesn't ship with existing tablets/phones, meaning it (or rather, Mono) would add 10 MB+ to a download size that needs to stay under 20 MB to download over-the-air (2G/3G/4G).
A lot of people did jump on that bandwagon, but C# isn't really great for games, and so the adoption has been lukewarm outside of the Xbox/Windows Phone development community.
Point is, until someone has created something that really is BETTER than C++ in the ways that it needs to be (which probably requires heavy investment in tools as well as libraries), it's going to be an uphill battle to get people to change.
Even if you had the Perfect Ultimate Language designed and implemented today, you'd need support in everyone's editors, debuggers (including source-level remote debugging support and any necessary server modification on all target platforms), cross-platform build support for all the relevant architectures, C/C++ linkage (unless you plan to rewrite all current libraries), JNI linkage (so you could interoperate with Java, which is required on Android), and support for multiple paradigms (NO single-paradigm language will ever be adopted across the board, nor should it be -- different programming problems are solved best with different paradigms).
Honestly "D" is the language that best fits the bill that I'm aware of, though it has a long way to go as far as support on other platforms and mindshare. I haven't yet USED D, though, so I can't really critique it intelligently.