It’s also massively more performant
- IT policies around GH accounts make no sense. It's a long story but, in short, you can't use any of your pre-existing GH accounts whether personal or professional (as in, an account I made exclusively for $DAYJOB before The Synergy Mandate) and must create a new one aligned with IT conventions.
- We don't monorepo, so we made extensive use of groups. There is no direct mapping for this concept in GitHub, so we have to manually namespace projects.
- And now of course GH's no-nines availability :(
For my team, profit happens to be sensitive to our release dates: a day or two of delay can really make the difference between hitting the month's projections or not. In another world, I would proactively mirror our profit-essential code, but it's not worth the risk of mounting a skunkworks guerrilla effort. I'd like to think we can blame The Synergy Mandate in a few postmortems in the near future, but of course I did not graduate yesterday; I know that's not gonna happen.
Thoughts and prayers we keep hitting our profit projections and they don't axe our product for underperformance.
(Writing this down, I can really feel how this job has changed since I joined.)
At least give people the option to start moving away from GitHub to contribute to your project. It will, ultimately, be better for the ecosystem.
The difficult part is everything around the code:
* the tickets/PRs (including the closed ones)
* the links referencing the project
* the CI setup
* for large projects, the committer permission setup
* if applicable, the push/commit/branch rules
All that will be deeply annoying to migrate on a per-project basis, or might get lost.
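The raw git history is the easy part, at least: a mirror clone carries every branch and tag (though none of the issue/PR metadata above). A minimal sketch, where git.example.org stands in for whatever new host you pick:

```shell
# Grab every ref (branches, tags, notes), not just the default branch
git clone --mirror https://github.com/example/project.git
cd project.git

# Point the mirror at its new home (git.example.org is a placeholder)
git remote set-url origin https://git.example.org/example/project.git

# Push all refs; rerun periodically (e.g. from cron) to keep it fresh
git push --mirror
```

Rerunning the fetch/push pair on a schedule gives you a warm standby even before you cut over for good.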
But that's not even the worst part, in my opinion; losing the go-to platform for finding software is (fediverse for software when?).
If you want a similar but different experience use GitLab.
If you want something more akin to the kernel experience (i.e. hosting, flexible repository structure, user auth via ssh keys, and a simple web UI) use gitolite with cgit, or alternatively gitweb.
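For a sense of what that looks like in practice, gitolite's entire permission model lives in a single config file checked into an admin repo; a minimal sketch (repo and user names made up):

```
# conf/gitolite.conf -- users map to ssh public keys in keydir/
@devs   = alice bob

repo project-a
    RW+ = alice     # alice may force-push and delete branches
    RW  = @devs     # devs may push
    R   = @all      # any authenticated user may clone
```

gitolite then enforces these rules on every ssh push; cgit or gitweb just needs read access to the same bare repositories to serve the web UI.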
I mean, technically it's a code review platform, not a complete toolbox like GitLab and co., but damn if it isn't the most professional-feeling experience.
Edit: Actually there's Gitea Actions and Forgejo Actions, that might be enough for my use case.
Can't wait for Microsoft to go the IBM way.
https://status.codeberg.org/status/codeberg
https://social.anoxinon.de/@codebergstatus/11647770704799298...
But if not: it's different because Drew DeVault is scathingly anti-AI and has a history of sticking to strong opinions (for better or worse). It seems like the best bet for off-premise source control if you are concerned about AI scraping and downtime.
Yeah, collaboration usually requires some sort of centralisation, whether that is LKML+git.kernel.org, gitlab.gnome.org, salsa.debian.org, Sourcehut, or GitHub. At least Sourcehut isn't completely proprietary and shoving AI down your throat at every possible chance. The same can be said for Codeberg and almost any GitLab CE, Gitea, or Forgejo instance.
"intermittent" is kind of underselling a failure on ~9/10 page loads
I don't think that chart shows what it seems like it shows. There were plenty of pre-2018 outages that don't show up there: https://hn.algolia.com/?dateEnd=1545696000&dateRange=custom&...
An alternate interpretation of that chart is "after the Microsoft acquisition, they got serious about actually tracking outages."
That said, anecdotally, it's felt much worse over the last 6 months. I'd guess it's a combination of MS-induced quality drops and AI-induced scale increases.
- Switching to Azure
- Adding more AI features
- Using AI more for development
- Higher load caused by AI agents
Three of those are top-down direction from MS.
I'm guessing it's a combination of Azure still not being stable enough and a byproduct of trying to move an entire company's operations from a physical DC into a cloud while it's running.
It seems pretty reasonable that the massive surge in AI over the past 6 months has put tremendous strain on GitHub’s infrastructure, and most of these outages are a result of that one way or another.
It’s astonishing how bad their software is now. I guess 20 years of outsourcing and bean-counting will do that.
They dropped Ruby on Rails.
Ruby on Rails got a bad rap, IMO.
It was maybe the epitome of the get-shit-done internet era, and despite AI's purported productivity gains, I actually don't think we've gotten anywhere close to the velocity, stability, and simplicity of the peak Rails era just coming out of those PHP days. And teams were actually way smaller than they are now, even after all these AI cuts!
But in the past year or so, it does feel like outages are becoming commonplace.
I wouldn't rule out them moving away from offering the free tier to stop all the code pushes. I think new code mostly written by AI isn't that appealing a data set to train on.
Came from GitLab, which started pushing out basic users in 2022 with massive price hikes. I weighed GitHub as an option but thought, "no, I don't want to be dealing with this same problem in another 5 years when some other rug pull or degradation happens with that service." So I'm feeling pretty validated for that decision these days.
The speed improvement was massive (super low latency) and was worth the switch on its own, but we also saved 90% in immediate cost... probably more in secondary effects from the git host just not being a pain point. The only long or unplanned downtime we've had in the whole 3 years was 2 hours, when the tiny Linode VPS host had a total hardware failure and got migrated, which is a pretty damn good number of 9s for a simple, easy-to-host single-server solution. We also gained more durable and fast offsite backups (zfs) that GitLab could never offer, but that's more of a custom self-hosted thing, not specific to Gitea.
It's unbelievably snappy and fast. I can't recommend Forgejo enough.
(FWIW https://diversion.dev is at 100% uptime. Different scale, obviously, but also we're not Microsoft.)
I don't trust Microsoft's status page. It might be "fine" overall, but it definitely is not fine for me.
GitHub is in a tight spot right now. The pace of software development is increasing and they are in a load-bearing position. In addition, their GitHub Copilot license was a massive loss leader, both directly costing them money and making the traffic problem even worse. Simply put, they aren't prioritizing scaling and reliability like they need to in this situation, and are instead focusing on feature build-outs that boil down to being Microsoft's AI middleman salesperson.
Their position is hard, but they are potentially fumbling the ball in a big way. I, for one, don't trust them not to be down right before I want to do a production deploy.
It is now being run into the ground.
At this point their chatbots Tay.ai, Zo, and Copilot are wrecking the platform and there is no CEO of GitHub to complain to about this so it now makes no sense to use GitHub at all. (Especially GitHub Actions)
It is now time to self host and not "centralize everything to GitHub". [0]
The Empire may fall ...