So needless to say, if you depend on GitHub for critical business operations, you need to start thinking about what a world without GitHub looks like for your business and start working your way toward that. I know my confidence in GitHub's engineering leadership is at rock bottom.
"The evidence is clear: Either you embrace AI, or get out of this career." - GitHub CEO
"Sooner than later, 80% of the code is going to be written by Copilot. And that doesn’t mean the developer is going to be replaced." - GitHub CEO
I like AI but actually not for coding because code quality is correlated to how well you understand the underlying systems you're building on, and AI is not really reasoning on this level at all. It's clearly synthesizing training data and it's useful in limited ways.
Did you hear about the screenwriting school where the professors said to avoid AI for writing but that it's great for storyboards? And the storyboarding school where the professors said the opposite?
The reality is that AI isn't actually "good" at anything. It produces passable ersatz facsimiles of work that can fool those not skilled in the art. The second reality of AI is that everyone is busy cramming it into their products at the expense of what their products are actually useful for.
Once people realise (1), and stop doing (2), the tech industry has a chance of recovering.
Seemingly the decline started with the Microsoft acquisition in 2018 and the subsequent "unlimited private repositories" change in 2019 (to match GitLab's popular offer).
Maybe it wasn't as noticeable when GitHub had fewer features, but our CI runners and other automation using the API a decade ago always had weekly issues caused by GitHub being down or degraded.
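Those weekly degradations force a defensive pattern onto any automation that talks to the API. This is a minimal sketch of our own workaround (not anything GitHub provides): retry with exponential backoff instead of trusting a single call to succeed.

```python
import time

def with_retries(call, attempts=5, base_delay=1.0, sleep=time.sleep):
    """Run `call` until it succeeds, waiting 1s, 2s, 4s, ... between tries."""
    for attempt in range(attempts):
        try:
            return call()
        except Exception:
            if attempt == attempts - 1:
                raise  # degraded for the whole retry window: give up
            sleep(base_delay * 2 ** attempt)
```

The `sleep` parameter is injectable only so the backoff schedule can be tested; in a real runner you'd wrap every API call in `with_retries` and still alert when the final attempt fails.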
"Would you like help?
- Get help with developing the software
- Just develop the software without help
[ ] Don't show me this tip again"
Another site was constantly getting DDoSed by Russians who were mad we took down their scams on forums. That had to go through Verisign back then; not sure who they're using now. They may have enough aggregate pipe that it doesn't matter at this point.
Is all the recent GitHub downtime entirely attributable to GitHub Copilot-related development? How hard can it be to reduce the blast radius of new AI features so they don't affect the core parts of hosting repositories? Because of Copilot everywhere, the UX has become bad, and I had to click all over the place and on my profile just to find repositories.
That helps with Git, but not so much with issues etc.
https://gitlab.com/gabriel.chamon/ci-components/-/tree/main/...
https://www.forbes.com/sites/bernardmarr/2025/07/08/microsof...
Artificial intelligence, Azure integration, many other things.
I think they may need to do that once again. Almost every product of theirs feels like a dumpster fire. GitHub is down constantly, Windows 11 is a nightmare and instead of patching things they're adding stupid features nobody asked for. I think they need to stop and really look closely at what they're prioritizing.
I can’t be specific but we are constantly complaining.
Edit: oh look, their site says all good, but I still have jobs stuck. What a pile of garbage.
I'm so sick of this.
They have not even bothered to implement Entra login when they have had their competitors' logins for years. Do they even know what their product is? Or are they just a middleman for slop?
Might a duplicate stack catch 90% of problems before they make it into the real stack?
E.g. every step of GitHub's migration to Azure could be mimicked on the duplicate stack before it's implemented on the primary stack. Is this just considered too much work? (I doubt cost would be the issue, because even if it costs millions, it would pay for itself in reduced reputational damage from outages).
EDIT: downvotes - why? - I think this is a good idea (I'd do it for my sites if outages were an issue).
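To make the idea concrete, here's a minimal sketch of what I mean, with made-up names (`primary` and `shadow` are stand-in callables, not any real GitHub service): every request is answered by the primary stack, replayed against the duplicate, and any divergence is recorded instead of ever reaching users.

```python
def mirror(request, primary, shadow, divergences):
    """primary and shadow are callables standing in for the two stacks."""
    answer = primary(request)
    try:
        if shadow(request) != answer:
            divergences.append(request)  # shadow disagrees: flag for the migration team
    except Exception:
        divergences.append(request)      # shadow crashed: flag, but never fail the user
    return answer                        # users only ever see the primary's response
```

Each migration step gets applied to the shadow first; a step is "safe" once the divergence log stays empty under real traffic. Feeding the shadow realistic traffic is admittedly the expensive part.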
If you'd ever worked on a codebase as terrible as I imagine GH's internals are and looked at the git history, you'd find two things:
1) fixing it would require rolling back hundreds to thousands of engineer-years of idiocy that make things like testing or refactoring untenable
2) many prior engineers got part of the way through such improvements before leaving or being kicked out. Their efforts mostly just made it worse, because now you never know what sort of terribleness to expect when you open an unfamiliar file.
Because that's a monumental amount of work, and extraordinarily difficult to retrofit into a system that wasn't initially designed that way. Not to mention the unstated requirement of mirroring traffic to actually exercise that system (given the tendency of bugs to not show up until something actually uses the system).
I've been considering it for a while, but I'm definitely now pitching a move away from GitHub at our organization.
If there were a prediction market for whether GitHub experiences an outage in a given week, you would make a lot of money.
There are tens of thousands of stupid scripts hosted on GitHub itself that do scheduled programmatic pushes or pulls to repos via cron jobs, with millions and millions of users -- yeah, LLMs accelerate the fire, but let's not pretend that GH was ever some bastion of real-user-dom.
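The kind of unattended script in question is trivial to write, which is why there are so many of them. A hypothetical example (the file name and commit message are invented), driven by a crontab entry like `*/15 * * * * python sync.py`, with no human anywhere in the loop:

```python
import datetime
import pathlib
import subprocess

def sync(repo: pathlib.Path) -> None:
    """Append a timestamp to a tracked file, then commit and push it."""
    log = repo / "heartbeat.log"
    with log.open("a") as f:
        f.write(datetime.datetime.now(datetime.timezone.utc).isoformat() + "\n")
    subprocess.run(["git", "-C", str(repo), "add", "heartbeat.log"], check=True)
    subprocess.run(["git", "-C", str(repo), "commit", "-m", "automated update"], check=True)
    subprocess.run(["git", "-C", str(repo), "push"], check=True)
```

Multiply that by every dashboard, mirror bot, and backup job pointed at GitHub and "active users" stops meaning people.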
Sorry, I realise this comment isn't up to HN's usual standards for thoughtfulness and it is perhaps a bit inflammatory but... look, I'd bet the majority of us on this site rely on GitHub and I can't be the only one becoming incredibly frustrated with its recent unreliability[0]?
(And, yes, I did enough basic data analysis to confirm that it IS indeed getting worse versus a year, two years, and three years ago, and is particularly bad since the start of this year.)
[0] EDIT: clearly not from looking at the rest of the comments in this discussion.
> And, yes, I did enough basic data analysis to confirm
Perhaps you'd consider showing us that analysis? That sounds like it would make a pretty substantive, thoughtful comment.