Anyway, Racket has an additional internal CI called DrDr, where some very subtle errors are only detected after the commit is merged.
Some repositories, like CoreCLR, have outer-loop testing that runs on a separate schedule (nightly, I think), but those tests are far less likely to break and are more devoted to finding rare and difficult-to-compute edge cases.
See, for instance, how having your cursor on top of each side of a condition tells you how many times each individual condition was evaluated!
Are there any parallel compilers?
Certainly the Roslyn C# compiler is highly parallel. All files are parsed in parallel, then all classes are bound (semantically analyzed) in parallel, then the IL serialization phase is sequential.
Across different machines, not cores on a chip?
At the recent 2016 US Dev Conf, there was a consensus to move to git, and that the new host would be GitHub.
Really subjective IMO part: In general, there are tons of really smart folks working on really awesome stuff in LLVM+clang+etc. There's a handful of folks also focusing on the general "plumbing" software within and among those projects. The meta-plumbing job of the dev infrastructure is "kinda interesting" to several folks who want to improve the way the project is developed. But "kinda interesting" doesn't pay the bills, so it's a second (or nth) responsibility for the folks volunteering to work on it. Add to that the "no good deed goes unpunished" rule, whereby they'll get the responsibility/blame after making a sweeping change, and any such change will require extreme patience and caution.
Current status: http://lists.llvm.org/pipermail/llvm-dev/2017-January/109015...
When I moved GCC from CVS to SVN, it made life a bit easier, but it wasn't a revolutionary change.
Which is funny, considering how often people argue about VCSes.
Before we moved to Git, Roslyn was on TFS, which was basically Perforce/CVS/SVN.
You're absolutely right that the distinction among the former VCSes is minimal. However, Git offers value that was transformative compared to the former. Namely,
1. Git allows you to easily switch between multiple work items while keeping track of the work done in each item.
2. Git allows you to easily integrate work from people whose clones of your tree have significantly diverged.
3. Git allows you to easily work offline.
(1) is definitely the largest benefit, but was mitigated with tools like g5 when I was at Google. However, the Google gravity well has its own drawbacks.

(2) is very important if you want to host rapid release schedules with divergence of features. It's especially useful if you want to have long stabilization periods and low-risk bug fixes delivered in parallel to multiple customers.
(3) is pretty self-explanatory, but most people underestimate how much downtime their VCS has. I'd bet that for most people, availability is significantly less than five nines. Not only is that wasted time, it's frustrating because it's usually consecutive and removes entire working days at random.
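Points (1) and (3) above can be sketched with a few commands. This is a minimal, self-contained demo; the repo name "demo", the branch names, and the throwaway identity are all made up for illustration:

```shell
# Build a scratch repo so the example stands on its own.
git init demo && cd demo
git config user.email "dev@example.com"   # throwaway identity for the demo
git config user.name  "Dev"
git commit --allow-empty -m "initial"
base=$(git symbolic-ref --short HEAD)     # master or main, depending on git version

# (1) Each work item lives on its own cheap local branch; switching
# between them preserves the committed state of each item.
git checkout -b feature-a
echo "work on A" > a.txt && git add a.txt && git commit -m "A: first pass"
git checkout "$base"
git checkout -b bugfix-b
echo "fix B" > b.txt && git add b.txt && git commit -m "B: fix"

# (3) Everything above (commits, branches, history) happened without a
# server round-trip: the full history is local, so this works offline too.
git log --oneline --all
```

With CVS/SVN/TFS, both the commits and the branch operations above would have required the server to be reachable.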
Compare that to Git's author: "Subversion has been the most pointless project ever started. There is no way to do cvs right."
Git and Hg (+ the many tools that surround them: GitHub, Bitbucket, Gerrit, GitLab, etc.) have a model that makes community contribution far easier than CVS and SVN.
(edit: LLVM is surprisingly small, actually - a git clone comes in at just under 900MB. for more painful examples, though, see repos that commit(ted) binaries, or the scale of Android's repos)
[1]: AFAIK Mercurial still has no built-in support, though extensions exist. Which is probably the right choice for Mercurial.
That's a little bit on the small side, but it's still very manageable. For comparison, Linux's .git folder comes in at 1.3GB on my computer, and LibreOffice's repo, which has git history going back to the year 2000, weighs some 3.6GB. I can happily say that I haven't had any performance or space problems dealing with either full repo, even on my fairly weak laptop.
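If you want to reproduce these size figures on your own clones, git can report them directly. A small sketch, using a tiny throwaway repo ("sizedemo" is a made-up name) so the commands have something to run against; on a real clone you'd run the last two commands at the repo root:

```shell
# Throwaway repo purely so the measurement commands below can run.
git init sizedemo && cd sizedemo
git config user.email "dev@example.com" && git config user.name "Dev"
echo "hello" > f.txt && git add f.txt && git commit -m "initial"

# Size of the object database: this is roughly what the ~900MB (LLVM),
# 1.3GB (Linux), and 3.6GB (LibreOffice) figures in the thread measure.
git count-objects -vH
du -sh .git
```

`git count-objects -vH` breaks the total down into loose objects and packfiles, which is handy for seeing whether a `git gc` would shrink the on-disk footprint.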