by jacobgkau 9 hours ago

I'm guessing you don't manage any production web servers?

robots.txt isn't even respected by all of the American companies. Chinese ones (which often also route through what are essentially botnets in Latin America and the rest of the world to evade detection) certainly won't be stopped by anything short of having their packets dropped.
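For concreteness, a minimal sketch of what "dropping their packets" ends up looking like in practice: once you've identified the offending source networks, you push drop rules into the firewall. Everything below is illustrative, not a real blocklist; the CIDRs are reserved documentation ranges and the emitted iptables commands are just one way to do it.

```python
# Sketch: emit iptables rules that silently drop traffic from known-abusive
# source networks. The CIDR ranges here are placeholders (TEST-NET blocks),
# not real attributions of any company's traffic.
import ipaddress

BLOCKED_NETWORKS = [
    "203.0.113.0/24",   # placeholder (TEST-NET-3)
    "198.51.100.0/24",  # placeholder (TEST-NET-2)
]

def drop_rules(networks):
    """Yield iptables commands that drop all packets from each network."""
    for net in networks:
        ipaddress.ip_network(net)  # validate the CIDR before using it
        yield f"iptables -A INPUT -s {net} -j DROP"

for rule in drop_rules(BLOCKED_NETWORKS):
    print(rule)
```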

cpncrunch 5 hours ago

I have been managing production commercial web servers for 28 years.

Yes, there are various bots, and some of the large US companies such as Perplexity do indeed seem to be ignoring robots.txt.

Is that a problem? Not in terms of CPU or network bandwidth (the load is minimal). It may be an issue if you're concerned about scraping, which I'm not.

Cloudflare's "solution" is a much bigger problem: as a user of sites that deploy it, it affects me multiple times a day, and those sites don't seem to need protection against scraping in the first place.

filleduchaos 4 hours ago

It is rather disingenuous to backpedal from "you can easily block them" to "is that a problem? who even cares" when someone points out that you cannot, in fact, easily block them.

cpncrunch 4 hours ago

I was referring to the legitimate ones, which you can easily block. Obviously there are scammy ones as well, and yes, those are an issue, but for most sites I would say the Cloudflare cure is worse than the problem it's trying to solve.
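A sketch of what "easily block" means for the well-behaved crawlers: they send stable, documented user-agent tokens and honor robots.txt, so a Disallow rule or a simple user-agent check at the application layer is enough. The token list below is illustrative, not exhaustive, and a real deployment would usually do this in the web server or CDN rather than in application code.

```python
# Sketch: refuse requests from self-identified AI crawlers by user-agent token.
# The token list is illustrative; well-behaved crawlers publish their tokens
# and also honor robots.txt, so this check is mostly belt-and-braces.
BLOCKED_UA_TOKENS = ("GPTBot", "CCBot", "ClaudeBot")

def is_blocked(user_agent: str) -> bool:
    """Return True if the request's User-Agent matches a blocked crawler token."""
    ua = user_agent.lower()
    return any(token.lower() in ua for token in BLOCKED_UA_TOKENS)

# Example: in a request handler, return 403 before doing any real work.
assert is_blocked("Mozilla/5.0 (compatible; GPTBot/1.0)")          # illustrative UA
assert not is_blocked("Mozilla/5.0 (X11; Linux x86_64) Firefox/128.0")
```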

kvirani 4 hours ago

Security almost always brings inconvenience (to everyone involved, including end users). That is part of its cost.

cpncrunch 4 hours ago

What security issue is actually being solved here though?
