Originally Posted by smontanaro
I imagine that would turn into a game of whack-a-mole. Any web crawler large enough to effectively execute a denial-of-service attack on the BF server(s) would likely be coming from multiple places at once (or nearly at once).
The BF admins should be able to identify the culprits and threaten their ISPs with total blockage. Whatever it is, it certainly seems not to be well-behaved. OTOH, the BF robots.txt file is pretty skimpy: it disallows the ChatGPT bot, a few specific pages, and some Google thing I don't recognize as a typical search engine crawler, and imposes a one-second delay only on Bing's crawler. If clients are ignoring the specific disallow entries (which mostly look like they would cause database activity), they should definitely be blocked strenuously. If crawlers besides Bing might hit the server too hard, I'd add a larger crawl delay and apply it to all crawlers, something like
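A blanket crawl delay might look something like this (a hypothetical robots.txt sketch; the disallow path is a placeholder, not BF's actual entry):

```
# Apply a delay to all crawlers, not just Bing
User-agent: *
Crawl-delay: 10
# Example placeholder for a database-heavy page
Disallow: /search.php
```

Worth noting that Crawl-delay is a non-standard directive: Bing honors it, but Google's crawler ignores it, so misbehaving bots that already skip the disallow entries are unlikely to respect it either.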
Tech did add a 5-second delay over the weekend (which did not appear to do much, as a lot of you were still experiencing the error). It was changed back to a one-second delay yesterday. An IP was banned, but the errors still persisted. Nothing stands out that points to why we were experiencing the db errors, but it seems to have died down today; I at least have not come across any errors...