Is this sort of complexity common? I'm not saying I think it's bad, as I am sure it's all there for a good reason, but it just seems like so many services to bounce through before a web page is even served.
Google, of course, is in a class way above all those. I'm not sure if anyone actually knows how many services a single search hits now - perhaps some of the old-timers who're now VP level or above in search. The last public figures I was aware of were "hundreds of distinct services spread across tens of thousands of machines".
Most startups that actually do something useful require far more than a web frontend and a database.
Could you point me to more complex examples?
As far as complexity goes, I believe Amazon touches on the order of a hundred services before a page is even served. It's no surprise they decided to turn some of that infrastructure into a business and start charging for it.
But, to be fair, it's not always like that. I guess they've had lots of trouble on our server recently and we've just been unlucky.
We have a couple developers more or less dedicated to performance at this point, and we're moderately satisfied with server response time for most areas of the site (the issue tracker is a notable exception -- we're working on that).
I ask about your location because we also have some latency issues as you move further away from the US east coast (Wash. DC) datacenter. We're working on that also.
Would you mind submitting the output of `ping github.com` (along with any other pages you find intolerably slow) at http://support.github.com/? If that's too slow, just dump the info here instead.
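If it helps, here's a minimal sketch of a script that bundles the diagnostics into one file for a support ticket. The hostname and output filename are just illustrative, and `curl`'s timing output is an extra I've added on top of the plain `ping` requested above:

```shell
#!/bin/sh
# Gather basic latency diagnostics to attach to a support ticket.
# HOST and OUT are illustrative; adjust as needed.
HOST=github.com
OUT=latency-report.txt

{
  echo "== latency report: $(date -u) =="

  echo "-- ping $HOST --"
  # -c 5: send five probes, then print summary statistics.
  ping -c 5 "$HOST" 2>&1 || echo "(ping failed or blocked)"

  echo "-- curl timing for https://$HOST/ --"
  # -w prints DNS, connect, and total times after the transfer.
  curl -s -o /dev/null \
    -w "dns=%{time_namelookup}s connect=%{time_connect}s total=%{time_total}s\n" \
    "https://$HOST/" 2>&1 || echo "(curl failed)"
} > "$OUT"

echo "wrote $OUT"
```

Pasting the contents of `latency-report.txt` into the ticket should be enough for someone to see whether the problem is DNS, the network path, or the server itself.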