@fatboy Apart from DDoS, I bet this is also very wasteful and not environment friendly. Generating that much unnecessary traffic every day of every month of every year is terrible.

@fatboy good lord, that is a very subtle and sneaky way to DDoS people 😠


For boring technical reasons, reading robots.txt would be a fair bit of extra work for us, so rather than doing that work, we implemented a trivial exclusion list and offered to add sr.ht to it. That offer was ignored at the time, but it is still open.
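For illustration, a "trivial list" like the one described could be as simple as the sketch below. This is a guess at the shape of the approach, not the crawler's actual code; the names `EXCLUDED_HOSTS` and `should_crawl` are made up.

```python
# Hypothetical sketch of a hard-coded exclusion list, as opposed to
# fetching and parsing robots.txt for every site. Names are made up.
EXCLUDED_HOSTS = {"git.sr.ht"}  # hosts that asked to be left alone

def should_crawl(host: str) -> bool:
    # A set lookup: no extra HTTP request, no robots.txt parser needed.
    return host not in EXCLUDED_HOSTS

print(should_crawl("git.sr.ht"))    # False
print(should_crawl("example.com"))  # True
```

The trade-off is obvious: it only helps sites that know to ask, whereas robots.txt lets any site opt out unilaterally.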

one of my least favourite things to do is talk to Google employees, because most of them absolutely believe they are fuckin Leonardo da Vinci-level geniuses. but the contrast between their perceived personas and their public work is hilarious, and sad for our entire engineering profession


> I also suggested keeping the repositories stored on-site and only doing a git fetch, rather than a fresh git clone every time, or using shallow clones

This was my first thought as well when I started reading this: why the f**k are they repeatedly cloning repos when they have all these other options that are way more efficient?

