An update from GitHub: https://github.com/orgs/community/discussions/159123#discussioncomment-13148279
The rates are here: https://docs.github.com/en/rest/using-the-rest-api/rate-limits-for-the-rest-api?apiVersion=2022-11-28
- 60 req/hour for unauthenticated users
- 5000 req/hour for authenticated - personal
- 15000 req/hour for authenticated - enterprise org
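For anyone wanting to check where they stand against those quotas: GitHub exposes a `/rate_limit` endpoint which, per the docs, does not itself count against the limit. A minimal sketch of parsing its response, using an illustrative payload rather than a live call:

```python
import json

# Illustrative response shape from GET https://api.github.com/rate_limit
# (the numbers here are made up, not a real quota)
sample = """
{
  "resources": {
    "core": {"limit": 60, "remaining": 57, "reset": 1714000000, "used": 3}
  },
  "rate": {"limit": 60, "remaining": 57, "reset": 1714000000, "used": 3}
}
"""

def core_quota(payload: str) -> tuple[int, int]:
    """Return (remaining, limit) for the core REST API resource."""
    data = json.loads(payload)
    core = data["resources"]["core"]
    return core["remaining"], core["limit"]

remaining, limit = core_quota(sample)
print(f"{remaining}/{limit} core requests left this hour")
```

For a live check you'd fetch `https://api.github.com/rate_limit` (optionally with an `Authorization: Bearer <token>` header to see your authenticated quota) and feed the body to the same function.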
No no, no no no no, no no no no, no no there’s no limit
Until there will be.
I think people are grossly underestimating the sheer size and significance of the issue at hand. Forgejo will very likely reach the same point GitHub is at right now, and will have to employ some of the same safeguards.
Except Forgejo is open source and you can run your own instance of it. I do, and it’s great.
That’s a very accurate statement which has absolutely nothing to do with what I said. The fact of the matter is that those who seek out a GitHub alternative generally do so because they dislike Microsoft or closed-source platforms. Which is great, but those platforms’ hosted instances see an overwhelmingly significant portion of users who visit because they choose not to self-host. It’s a lifecycle.
- Create cool software for free
- Cool software gets popular
- Release new features and improve free software
- Lots of users use your cool software
- Running software becomes expensive, monetize
- Software becomes even more popular, single stream monetization no longer possible
- Monetize more
- Get more popular
- Monetize more
By step 30 you’re selling everyone’s data and pushing resource restrictions because it’s expensive to run a popular service that’s generally free. That doesn’t change simply because people can selfhost if they want.
To me, this reads strongly like someone who is confidently incorrect. Your starting premise is incorrect. You are claiming Forgejo will do this. Forgejo is nothing but an open source project designed to self host. If you were making this claim about Codeberg, the project’s hosted version, then your starting premise would be correct. Obviously, they monetize Codeberg because they’re providing a service. That monetization feeds Forgejo development. They could also sell official support for people hosting their own instances of Forgejo. This is a very common thing that open source companies do…
Dude, this is cool!
It works really well too. I have an instance.
No, no limits, we’ll reach for the skyyyy
LOL!!! RIP GitHub
EDIT: trying to compile any project from source that uses git submodules will be interesting, e.g. ROCm has more than 60 submodules to pull in 💀
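To get a feel for the submodule problem: each entry in `.gitmodules` is one more repository to clone, and each clone costs at least one request. A small sketch that counts entries; the snippet below is an invented two-entry example, not ROCm's actual file:

```python
import configparser

# Invented .gitmodules content; real ones (e.g. ROCm's) list dozens of entries
gitmodules = """
[submodule "third_party/foo"]
path = third_party/foo
url = https://github.com/example/foo.git
[submodule "third_party/bar"]
path = third_party/bar
url = https://github.com/example/bar.git
"""

def count_submodules(text: str) -> int:
    """Each [submodule "..."] section means at least one extra clone/fetch."""
    cfg = configparser.ConfigParser()
    cfg.read_string(text)
    return sum(1 for s in cfg.sections() if s.startswith("submodule"))

print(count_submodules(gitmodules))  # 2 here; ROCm reportedly has 60+
```

With 60+ submodules, a single recursive clone of such a project while unauthenticated could burn the entire hourly allowance in one go.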
The Go module system pulls dependencies from their sources. This should be interesting.
Even if you host your project on a different provider, many libraries are on GitHub. All those unauthenticated Arch users trying to install Go-based software that pulls dependencies from GitHub.
How does the Rust module system work? How does pip?
already not looking forward to the next updates on a few systems.
Yeah, this could very well kill some package managers without some real heavy lifting.
scoop relies on git repos to work (scoop.sh - windows package manager)
Compiling any larger Go application would hit this limit almost immediately. For example, podman is written in Go and has around 70 direct dependencies, or about 200 when including transitive ones. Not all of those are hosted on GitHub, but the vast majority are. That means that with a limit of 60 requests per hour, it would take over 3 hours to build podman on a new machine.
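The arithmetic above checks out. A quick back-of-envelope, under the assumption (stated above) that roughly all ~200 transitive dependencies are fetched straight from GitHub:

```python
# Back-of-envelope: how long does an unauthenticated build wait if every
# module fetch is one GitHub request and you get 60 requests per hour?
deps_on_github = 200   # rough transitive dependency count estimated above
limit_per_hour = 60    # unauthenticated quota

hours = deps_on_github / limit_per_hour
print(f"~{hours:.1f} hours just waiting out rate-limit windows")  # ~3.3 hours
```

One mitigating detail: Go's default `GOPROXY=https://proxy.golang.org` serves most `go mod download` traffic from the module proxy rather than from GitHub; it's `GOPROXY=direct` setups and private modules that would feel the full limit.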
The numbers actually seem reasonable…
Not at all if you’re a software developer, which is the whole point of the service. Automated requests from their own tools can easily blow through this limit just building a large project a single time.
…
60 requests
Per hour
How is that reasonable??
You can hit the limits by just browsing GitHub for 15 minutes.
Without login
Probably getting hammered by ai scrapers
you mean, doin’ what microsoft and their ai ‘partners’ do to others?
Yeah but they’re allowed to do it because they have brazillions of dollars.
The funny thing is that rate limits won’t help them with genai scrapers
Everything seems to be. There was a period where you could kind of have a sane experience browsing over a VPN or otherwise from a cloud-service IP range, but especially in the past 6 months or so things have gotten exponentially worse by the week. Everything is moving behind Cloudflare or similar systems.
If Microsoft knows how to do one thing well, it’s killing a successful product.
I came here looking for this comment. They bought the service to destroy it. It’s kind of their thing.
GitHub has literally never been doing better. What are you talking about??
We are talking about EEE
RIP Skype
we could have had bob or clippy instead of ‘cortana’ or ‘copilot’
Microsoft really should have just leaned into it and named it Clippy again.
It was never named Clippy 😉
60 req/hour for unauthenticated users
That’s low enough that it may cause problems for a lot of infrastructure. Like, I’m pretty sure that the MELPA Emacs package repository builds out of git, and a lot of that is on GitHub.
That’s low enough that it may cause problems for a lot of infrastructure.
Likely the point. If you need more, get an API key.
Do you think any infrastructure is pulling that often while unauthenticated? It seems like an easy fix either way (in my admittedly non devops opinion)
It’s gonna be problematic in particular for organisations with larger offices. If you’ve got hundreds of devs/sysadmins under the same public IP address, those 60 requests/hour are shared between them.
Basically, I expect unauthenticated pulls to no longer be possible at my day job, which means repos hosted on GitHub become a pain.
Same problem for CGNAT users
Ah yeah that’s right, I didn’t consider large offices. I can definitely see how that’d be a problem
Quite frankly, companies shouldn’t be pulling willy-nilly from GitHub or npm anyway. It’s trivial to set up something to cache repos or artifacts, and it also guards against outages when GitHub is down.
If I’m using Ansible or something to pull images it might get that high.
Of course the fix is to pull it once and copy the files over, but I could see this breaking prod for folks who didn’t write it that way in the first place
I didn’t think of that - also for nvim you typically pull plugins from git repositories
Crazy how many people think this is okay, yet left Reddit because of their API shenanigans. GitHub is already halfway to requiring sign-in to view anything, like Twitter (X).
They make you sign in to use search, at least for code search.
Which I hate so much any time I want to quickly look for something.
Just browsing GitHub I’ve hit this limit.
i’ve hit it many times so far… even as quick as the second page view (first internal link clicked) after more than a day or two since the last visit (yes, even with cleaned browser data or private window).
it’s fucking stupid how quick they are to throw up a roadblock.
Then login.
Just browse authenticated, you won’t have that issue.
that is not an acceptable ‘solution’ and opens up an entirely different and more significant can o’ worms instead.
Wow so surprising, never saw this coming, this is my surprised face. :-l
The enshittification begins (continues?)…
just now? :)
I see the “just create an account” and “just login” crowd have joined the discussion. Some people will defend a monopolist no matter what. If github introduced ID checks à la Google or required a Microsoft account to login, they’d just shrug and go “create a Microsoft account then, stop bitching”. They don’t realise they are being boiled and don’t care. Consoomer behaviour.
It’s always blocked me from searching in Firefox when I’m logged out, for some reason.
Good thing git is “federated” by default.
& then you have fossil which is github in a box
THIS is why I clone all my commonly used repos to my personal Gitea instance.
I recently switched my instance from gitea to forgejo because everyone said to do it and it was easy to do.
What were the benefits
Mostly people stopped telling them to do it, I guess 🤷‍♂️
That’s actually kind of an interesting idea.
Is there a reasonable way that I could host my own UI that will keep various repos I care about cloned and always up to date automatically?
Afaict, you should be able to follow the instructions for migrating a repo and it will clone it to your instance and track upstream for updates. It’s been a minute since I’ve read up on it though.
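For reference, the migration endpoint on Gitea/Forgejo is `POST /api/v1/repos/migrate`, and setting `mirror` to true creates a pull mirror that periodically re-fetches upstream. A sketch of the request body; the field names follow my reading of Gitea's API and should be checked against your own instance's `/api/swagger`:

```python
import json

def mirror_payload(clone_addr: str, repo_name: str, interval: str = "8h") -> str:
    """Build the JSON body for a Gitea/Forgejo pull-mirror migration."""
    return json.dumps({
        "clone_addr": clone_addr,       # upstream repo to mirror
        "repo_name": repo_name,         # name it will get on your instance
        "mirror": True,                 # keep pulling updates on a schedule
        "mirror_interval": interval,    # how often to re-fetch upstream
    })

print(mirror_payload("https://github.com/example/project.git", "project"))
# POST this to https://<your-instance>/api/v1/repos/migrate with an API token.
```

The web UI's "New Migration" form drives the same endpoint, so doing it by hand first and scripting it later works fine.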