Anyone who has built a website that isn't 100% static HTML in the last decade or so has probably encountered malicious activity from automated spam bots crawling the web, looking for sites to fill with all sorts of crap. Years ago I added a simple guestbook page to my website that let visitors write a message, which was then displayed in a list. Within weeks I was constantly adding to the list of 'filtered' words just to keep the spam out.
With my current website, a user is required to create an account and sign in before being allowed to post comments, and additional checks are in place to keep the vast majority of the data safe from such pests. Earlier this week I (finally) added some custom error handling code that sent me an email any time a visitor encountered an error on the website. Within minutes of deploying these changes, my inbox started filling up. At first I worried that users were actually hitting errors all over the site, until I looked closer at the request addresses. Spam bots. 99.5% of them. Attempts at submitting page requests with URLs embedded in the query strings and the like. Of course my code was blowing up on that sort of thing, since it was never written to expect an http string. I wasn't so worried that these spammers were getting the error page as I was that my inbox was filling up, and fast.
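For illustration, here's a minimal sketch of that kind of error-to-email notifier in Python (not my site's actual stack); the addresses, the subject line, and the helper name are all made up, and it assumes a local SMTP relay:

```python
import smtplib
import traceback
from email.message import EmailMessage

def email_error_report(request_url, exc):
    """Send a plain-text report for one failed request (illustrative sketch)."""
    details = "".join(
        traceback.format_exception(type(exc), exc, exc.__traceback__)
    )
    msg = EmailMessage()
    msg["Subject"] = f"Site error: {request_url}"
    msg["From"] = "errors@example.com"   # placeholder sender
    msg["To"] = "me@example.com"         # placeholder recipient
    msg.set_content(f"Request: {request_url}\n\n{details}")
    with smtplib.SMTP("localhost") as smtp:  # assumes a relay on localhost
        smtp.send_message(msg)
```

Hooked into a site's catch-all error handler, something like this turns every unhandled exception into an email, which is exactly why the bots made my inbox so noisy so quickly.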
So I've spent the last few evenings making changes to the request handling and error handling on the website. Since I've confirmed that no valid request string on the website will ever contain 'http://', I now redirect those requests to the definition of 'BLARG' at UrbanDictionary.com, without ever letting them reach the error page. Less obviously spam-related bad requests get redirected to Google and a few other destinations. I haven't gotten a new error email all day, other than two I created myself just to verify that legitimate errors are still being caught.
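The site isn't running Python either, but as a rough sketch of the idea, the filter might look something like this WSGI middleware; the function name and redirect mechanics are my own invention here, and the 'BLARG' URL is just the obvious guess at that UrbanDictionary address:

```python
from urllib.parse import unquote

BLARG_URL = "https://www.urbandictionary.com/define.php?term=BLARG"

def spam_filter(app):
    """WSGI middleware: bounce obvious link-spam before it reaches the app."""
    def wrapped(environ, start_response):
        # Decode the query string so percent-encoded URLs don't slip through.
        query = unquote(environ.get("QUERY_STRING", "")).lower()
        if "http://" in query:
            # No legitimate request on this site carries a URL in its query
            # string, so redirect the bot instead of letting it raise an error.
            start_response("302 Found", [("Location", BLARG_URL)])
            return [b""]
        return app(environ, start_response)
    return wrapped
```

The point of checking before the app runs is that the spammy request never triggers the error handler at all, so it never generates an email; the less obvious bad requests can still fall through to the error handling, where they get their redirect to Google instead of an error page.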