My esteemed colleagues often decide to insult me. Occasionally, though, they really pull out all the stops.
A week or two ago, I started noticing search referrers in my site's access logs, from apparent Google searches for phrases like the following:
- short geriatric twunt -attractive
- sleazy ginger midget god's gift to women
- short ginger dude god's gift to women
- bearded geriatric gnome spring chicken
- bearded sleazy dude god's gift to women
- ginger bearded legend god's gift to women
- short bearded man flange magnet
- ginger geriatric gnome spring chicken
The mention of the word “twunt” made me suspect that Derek, Gary et al had more than a little to do with it, especially since those queries don’t produce links to my site when fed back into Google (and I’m aware that mentioning them here may well change that situation before very long). I nevertheless mostly forgot about it for a week or two, periodically noticing again and meaning to follow it up, until today I finally got around to asking the guys. They were indeed responsible, and their method was quite ingenious.
As I mentioned a while back, Derek created a feed aggregator for the blogs of people within the Department of Computing Science. He later also created a WordPress plugin, so that whenever a WordPress blog (such as this one) received a new post or a new comment, his aggregator would be pinged, and thus kept completely up to date. That's all fine. The thing is, that's not all that happened, as explained earlier by Derek:
- Someone pings the aggregator (either with a new post or a new comment).
- As part of the ping handler, a GET request is made to an XML-RPC handler script on Stu's site.
- The PHP script invokes a Perl script residing on one of Stu's less-identifiable servers.
- The Perl script randomly selects two adjectives, a noun, and a negative term (e.g. -"flange magnet").
- It invokes wget, specifying the referrer as http://www.google.com/...$1+$2+$3+-%22$4%22 (or something like that) and the target as http://www.mattgemmell.com/.
It’s also worth mentioning that the user agent for the request is set to that of the spiders from our distributed topic-driven web crawler, which Derek, Gary, Mark, Neil and I wrote last year in Java as our team project.
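For the curious, the query-building step can be sketched roughly as follows. This is just a Python approximation of what Derek's Perl script presumably does; the word lists, function name, and exact URL format are my guesses based on the referrers I saw in my logs:

```python
import random
from urllib.parse import quote_plus

# Hypothetical word lists -- the real ones live in the actual Perl script.
ADJECTIVES = ["short", "ginger", "bearded", "geriatric", "sleazy"]
NOUNS = ["gnome", "dude", "midget", "legend", "twunt"]
NEGATIVES = ["flange magnet", "spring chicken", "attractive"]

def fake_google_referrer():
    """Build a plausible Google search URL from two adjectives,
    a noun, and a quoted negative term, e.g. -"flange magnet"."""
    adj1, adj2 = random.sample(ADJECTIVES, 2)
    noun = random.choice(NOUNS)
    negative = random.choice(NEGATIVES)
    query = f'{adj1} {adj2} {noun} -"{negative}"'
    return "http://www.google.com/search?q=" + quote_plus(query)

# The script then shells out to wget with the spoofed headers,
# something along the lines of:
#   wget --referer="<url above>" --user-agent="<crawler UA>" \
#        http://www.mattgemmell.com/
print(fake_google_referrer())
```

The spoofed request then shows up in my access logs looking exactly like a visitor who arrived via that Google search.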
I’m not entirely sure whether to be extremely flattered at the effort or just very, very frightened, but you’ve got to admit that it’s all pretty goddamn hardcore. Who needs enemies when you have friends who study Comp Sci?