Is anyone surprised Google doesn’t even try to fix issues that are damaging its users?

  • X_Cli@lemmy.ml · 2 years ago

    I don’t think that a robots.txt file is the appropriate tool here.

    First off, robots.txt files are just hints for respectful crawlers. Go proxies are not crawlers. They are just that: caching proxies for Go modules. If all Go developers were to use direct mode, I think the SourceHut traffic would be higher, not lower.
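    To make the distinction concrete, here is a minimal, hypothetical sketch (the function names and the robots.txt handling are stand-ins I made up): a respectful crawler consults robots.txt before fetching URLs it discovered on its own, while a module proxy downloads exactly the repository a developer asked for, so there is nothing for robots.txt to veto.

        package main

        import (
            "fmt"
            "net/http"
        )

        // crawlerFetch: a respectful crawler checks robots.txt before fetching
        // a URL it discovered by itself.
        func crawlerFetch(host, path string) (*http.Response, error) {
            robots, err := http.Get("https://" + host + "/robots.txt")
            if err == nil {
                defer robots.Body.Close()
                if !allowedByRobots(robots, path) {
                    return nil, fmt.Errorf("%s disallowed by robots.txt", path)
                }
            }
            return http.Get("https://" + host + path)
        }

        // proxyFetch: a module proxy fetches the exact path a developer
        // requested; it is not discovering URLs, so robots.txt never applies.
        func proxyFetch(host, path string) (*http.Response, error) {
            return http.Get("https://" + host + path)
        }

        // allowedByRobots is a placeholder; a real crawler would parse the
        // Disallow rules from the response body here.
        func allowedByRobots(resp *http.Response, path string) bool {
            return true
        }

        func main() {
            // Hypothetical host and path, purely for illustration.
            _, _ = proxyFetch("git.example.org", "/~user/repo/info/refs")
        }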

    Second, let’s assume that Go devs were willing to implement something that respects robots.txt or Retry-After indications. Would attackers do the same? Of course not.
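    Just to show what “being mindful” could even look like on the client side, here is a hedged sketch of a voluntary back-off policy (the fetchPolitely name, the URL, and the 60-second fallback are my own invention): honour a 503 plus Retry-After by waiting before retrying. Nothing in HTTP forces a client, let alone an attacker, to run code like this.

        package main

        import (
            "fmt"
            "net/http"
            "strconv"
            "time"
        )

        // fetchPolitely retries a URL, sleeping for whatever Retry-After the
        // server suggests whenever it answers 503. Purely voluntary politeness.
        func fetchPolitely(url string) (*http.Response, error) {
            for {
                resp, err := http.Get(url)
                if err != nil {
                    return nil, err
                }
                if resp.StatusCode != http.StatusServiceUnavailable {
                    return resp, nil
                }
                resp.Body.Close()
                delay := 60 * time.Second // arbitrary fallback when no Retry-After is sent
                if s := resp.Header.Get("Retry-After"); s != "" {
                    if secs, err := strconv.Atoi(s); err == nil {
                        delay = time.Duration(secs) * time.Second
                    }
                }
                fmt.Println("got 503, backing off for", delay)
                time.Sleep(delay)
            }
        }

        func main() {
            // Hypothetical repository URL, purely for illustration.
            _, _ = fetchPolitely("https://git.sr.ht/~user/somemodule")
        }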

    If legitimate, although quite aggressive, traffic is DDoSing SourceHut, that is primarily a SourceHut issue. Returning a 503 does not have to be respected by the client, because there is nothing for the client to respect: the server simply chooses to say “I don’t want to answer that request. Goodbye.” Such a response is certainly not costly to generate. Now, if the server tries to honor every request and is poorly optimized, the fault lies with the server, not the client.
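    To illustrate how cheap that refusal is, here is a minimal sketch of a server shedding load with a 503 and a Retry-After header (the tooBusy heuristic is a stub I made up; a real server would plug in per-client counters or a token bucket):

        package main

        import "net/http"

        // tooBusy is a placeholder for whatever load or per-client heuristic
        // the server applies; it always answers false to keep the sketch minimal.
        func tooBusy(r *http.Request) bool {
            return false
        }

        func handler(w http.ResponseWriter, r *http.Request) {
            if tooBusy(r) {
                // Emitting this response costs next to nothing compared to
                // serving a git clone or a module zip.
                w.Header().Set("Retry-After", "120")
                http.Error(w, "try again later", http.StatusServiceUnavailable)
                return
            }
            w.Write([]byte("the expensive content would be served here\n"))
        }

        func main() {
            http.HandleFunc("/", handler)
            http.ListenAndServe(":8080", nil)
        }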

    To be truthful, I have not read the Go proxy implementation in detail. I don’t know how it would react if SourceHut answered with a 503 status code every now and then, when the fetching strategy gets too aggressive. I would simply guess that the proxy would retry later and, in the meantime, serve Go developers a stale version of the module.
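    Purely to spell out that guess, here is what “retry later, serve stale in the meantime” might look like (the cache layout, function names, and module path are all invented; I have no idea whether the real proxy is structured this way):

        package main

        import (
            "errors"
            "fmt"
        )

        var cache = map[string]string{}

        var errUpstreamBusy = errors.New("upstream answered 503")

        // fetchUpstream stands in for the real module download; it always
        // simulates an overloaded upstream so the fallback path is exercised.
        func fetchUpstream(module string) (string, error) {
            return "", errUpstreamBusy
        }

        // resolve prefers a fresh copy but falls back to whatever is already
        // cached when the upstream refuses to answer, retrying next time.
        func resolve(module string) (string, error) {
            fresh, err := fetchUpstream(module)
            if err == nil {
                cache[module] = fresh
                return fresh, nil
            }
            if stale, ok := cache[module]; ok {
                return stale, nil
            }
            return "", err
        }

        func main() {
            cache["git.sr.ht/~user/somemodule@v1.2.3"] = "zip from an earlier fetch"
            v, _ := resolve("git.sr.ht/~user/somemodule@v1.2.3")
            fmt.Println("served:", v)
        }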