The mailing list piece is from 2012 and describes how git was very slow on a synthetic repo with millions of files and commits. Today, my current place of work has a monorepo approaching the size described in that post, but git seems to be holding up just fine. If you check out a branch that's far enough away from master it takes a minute, but add, rebase, commit, status, and blame are all negligibly impacted speed-wise. The only issue we run into is rejected non-conflicting pushes to master during peak hours, when maybe several dozen engineers are trying to merge and push to master simultaneously.
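(For anyone unfamiliar with that failure mode: git refuses any push that isn't a fast-forward of the remote ref, even when the two changes touch completely different files, so concurrent pushers race on the master ref itself rather than on file content. A rough sketch of what the losing pusher sees, assuming a remote named origin; the exact hint text varies by git version:)

```
# Engineers A and B both commit on top of the same master tip.
# A pushes first and wins the race; B's push is then rejected,
# even though the changes don't conflict at the file level:
$ git push origin master
 ! [rejected]        master -> master (fetch first)
error: failed to push some refs to 'origin'

# B has to fetch and replay their commit before retrying:
$ git pull --rebase origin master
$ git push origin master
```

With enough pushers per hour, the retry loop itself starts losing races, which is why this shows up specifically at peak times.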
Does anybody have any insight into what’s changed in git internally since 2012 to support bigger repos?