
60 req/hour for unauthenticated users

5000 req/hour for authenticated - personal

15000 req/hour for authenticated - enterprise org

According to https://docs.github.com/en/rest/using-the-rest-api/rate-limi...
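For anyone wanting to check where they stand, the linked docs describe a GET /rate_limit endpoint that reports the current quota (and, per those docs, the call itself doesn't count against the core limit). A minimal stdlib-only Python sketch; the payload below is an illustrative sample in the documented response shape, not a live response:

```python
import json

# Illustrative payload in the shape documented for GET /rate_limit.
# A live check would be something like:
#   urllib.request.urlopen("https://api.github.com/rate_limit")
sample = json.loads("""
{
  "resources": {
    "core": {"limit": 60, "remaining": 12, "reset": 1700000000}
  }
}
""")

core = sample["resources"]["core"]
print(f"{core['remaining']}/{core['limit']} core requests left; resets at epoch {core['reset']}")
```

The "core" bucket is the one that unauthenticated browsing burns through; the reset field is a Unix timestamp for when the quota refills.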

I bump into this just browsing a repo's code (unauthenticated)... seems like it's one of the side effects of the AI rush.



Why would the changelog update not include this? It's the most salient piece of information.

I thought I was just misreading it and failing to see where they stated what the new rate limits were, since that's what anyone would care about when reading it.


> Why would the changelog update not include this?

I don't know. The limits in the comment that you're replying to are unchanged from where they were a year ago.

So far I don't see anything that has changed, and without an explanation from GitHub I don't think we'll know for sure what has changed.


because it will go way lower soon. and because they don't have to.

they already have all your code. they've won.


you are not ... you don't have any part of your body in reality, do you? you have left the room.

If people training LLMs are excessively scraping GitHub, it is well within GitHub's purview to limit that activity. It's their site and it's up to them to make sure that it stays available. If that means that they curtail the activity of abusive users, then of course they're going to do that.


it was never about avoiding scrapers. that's just the excuse. they own the scrapers too, remember.

why do you think they blocked non-logged-in users from even searching before? they need your data and they are getting it exactly on their terms. because as I've said, they have already won.


Embrace, extend, extinguish.


… I… what has been embraced, extended and extinguished?

I see no MS or GitHub specific extension, here. Copilot exists, and so do many other tools. Copilot can use lots of non-Microsoft models, too, including models from non-Microsoft companies. You can also get git repository hosting from other companies. You can even do it yourself.

So, explain yourself. What has been embraced, extended, and extinguished? Be specific. No “vibes”. Cite your sources or admit you have none. I see no extending unique to MS and I see no extinguishing. So explain yourself.


I'm with you, but let's not forget that they haven't started the extinguishing yet. They might yet do it. The extending they've done plenty: issue tracker, wiki, discussions etc.


Those things all existed before Microsoft bought them, and they’re all present in competing products, even free ones.


the entire open source community exists on GitHub.

Microsoft has a more successful social network for programmers than HN or Google Circles (heh) ever dreamed of.

the argument had already dropped scrapers' access to the information, since they own the scrapers and all... why did you bring it back as the main argument? they hijacked what could have been a community hub and turned it into a walled garden to sell a few enterprise licenses.


[flagged]


> What the hell are … no, this is not a drug. This is a mental illness. Get help.

This is an unacceptable comment on HN and we have to ban accounts that do it repeatedly. We've warned you in the past about inappropriate comments. Please remind yourself of the guidelines and take care to observe them in future.

https://news.ycombinator.com/newsguidelines.html


Ban me then.

The person I responded to clearly has a mental illness and needs help.

The people behind this site think it’s some bastion of civility, and it just isn’t. People can be assholes using any words they choose, and they do so continuously here, but you mods don’t care because your rules are followed.

“Orange website bad” isn’t a meme for no reason. It’s because the orange website is bad. So fucken ban me.


We don't need to ban you, we just need you, along with everyone else here, to observe the guidelines, the first of which, in the Comments section, is to "be kind". If everyone made the effort to do that, the site wouldn't be bad. It's no big deal, and it's not that hard to observe the guidelines if you're sincere about making a positive contribution to the site.


I'm 100% kind when people are kind to me.

I am 0% kind when people are unkind to me.

I've lived my entire life rolling over when people are assholes to me because I don't want to make the situation worse, or, as seen here, throw the second punch. The second punch is always the one that gets caught. Never the first.


1 request a minute?!? Wow, that's just absurd; you hit it just from looking through code.


I opened a repo in a spare computer browser and clicked on a couple things and got a rate limit error. It feels effectively unusable unless you're logged in now (couldn't search from before, now you can't even browse).


agreed. when I first read the title I thought "oh, what did they up the rates to" - then I realized it's more of a "downgraded rate limits"

thanks github for the worse experience


I've hit this over the past week browsing the web UI. For some reason, github sessions are set really short and you don't realise you're not logged in until you get the error message.

I really wish github would stop logging me out.


Hmmmm, GitHub keeps me logged in for months, I feel like. Unless I'm misunderstanding the GitHub security logs, my current login dates from March.


GH is Microsoft's most successful social network.

GH now uses the publisher business model, and as such, they lose money when you're logged out. Same reason Google, FB, etc. won't ask you for a password for decades.


Something strange is going on. I think GH has kept me logged in for months at a time. I honestly can’t remember the last time I had to authenticate.


Yes, it's not the rate limits that are the problem per se but GitHub's tendency to log you out and make you go through 2fa.

If they would let me stay logged in for a year then I wouldn't care so much.


You might be afflicted with some SSO or enterprise thing, I haven't logged into Github on my personal account in years.


Nope, just normal GitHub account.

Though GitHub did force me to use 2fa earlier because they said I have a "popular repo", so perhaps my account is considered high risk. Or maybe it's triggered by travelling and changing IP locations? I have no clue, but it's annoying to have to 2fa more than once in a blue moon.


1/min? That’s insanely low.


60/hr is not the same as 1/min, unless you're trying to continually make as many requests as possible, like a crawler. And if that is your use case, then your traffic is probably exactly what they're trying to block.


60/h is obviously well within normal human usage of an app and not bot traffic...

A normal rate limit to separate humans and bots would be something like 60 per minute. So it's about an order of magnitude too low.


Use case: crawling possibly related files based on string search hints in a repo you know nothing about...

Something on the order of 6 seconds a page doesn't sound TOO out of human viewing range depending on how quickly things load and how fast rejects are identified.

I could see ~10 pages / min which is 600 pages / hour. I could also see the argument that a human would get tired at that rate and something closer to 200-300 / hr is reasonable.
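To make the arithmetic above explicit (both inputs are the estimates from this comment, not measured values):

```python
HOURLY_LIMIT = 60        # unauthenticated core limit from the linked docs
CLICKS_PER_MIN = 10      # the ~10 pages/min browsing estimate above

pages_per_hour = CLICKS_PER_MIN * 60                 # pages a human could view per hour
minutes_to_exhaust = HOURLY_LIMIT / CLICKS_PER_MIN   # minutes until the budget runs out

print(pages_per_hour)      # 600
print(minutes_to_exhaust)  # 6.0
```

So even at the commenter's own estimate, the unauthenticated budget is gone after about six minutes of browsing at that pace.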


All of that assuming they're limiting based on human-initiated requests, not the 100x requests actually generated when you click a link.


I bump into these limits just using a few public install scripts for things like Atuin, Babashka, and Clojure on a single machine on my home IP. They're way too low to be reasonable.



