Telling Computers and Humans Apart in web forms without visible CAPTCHAs
Many websites use CAPTCHAs of one form or another to try to tell the difference between humans and computers interacting with web forms. As the BBC have recently commented, these can be cumbersome to implement and also annoying to users.
Automatically distinguishing humans from computer bots submitting data via a web page is relatively easy once you work out how each interacts with web servers.
The basic things that almost all spambots have in common are that they:
- Don't support session cookies,
- Don't run JavaScript code, and
- Try to submit hyperlinks in one form or another (the sketch after this list shows how these traits can translate into checks).
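
As a rough sketch of how those three traits might be turned into server-side checks, the framework-free TypeScript below classifies a submission as reject, moderate or accept. All of the names here (classifySubmission, jsToken, the session= cookie name and so on) are assumptions for illustration, not details of the actual system described in this post.

```typescript
// Minimal sketch (assumed names throughout): classify a form submission
// based on the three spambot traits listed above.

type Verdict = "accept" | "moderate" | "reject";

interface Submission {
  cookieHeader: string | null; // raw Cookie header from the request
  jsToken: string | null;      // hidden field filled in by client-side JavaScript
  expectedToken: string;       // value stored against the server-side session
  body: string;                // the text the visitor submitted
}

// Very rough hyperlink detection: bare URLs, anchor tags and BBCode-style links.
const LINK_PATTERN = /(https?:\/\/|www\.|<a\s|\[url)/i;

function classifySubmission(s: Submission): Verdict {
  // Trait 1: no session cookie means the client never kept the session alive.
  if (!s.cookieHeader || !s.cookieHeader.includes("session=")) {
    return "reject";
  }

  // Trait 2: the hidden token is only present if client-side JavaScript ran.
  if (s.jsToken === null || s.jsToken !== s.expectedToken) {
    return "reject";
  }

  // Trait 3: anything containing a hyperlink is held for human moderation.
  if (LINK_PATTERN.test(s.body)) {
    return "moderate";
  }

  return "accept";
}

// Example: cookies present and token valid, but the body contains a URL,
// so the submission is held for moderation rather than rejected outright.
console.log(classifySubmission({
  cookieHeader: "session=abc123",
  jsToken: "t-42",
  expectedToken: "t-42",
  body: "Great article, see http://example.com/ for more",
})); // -> "moderate"
```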
Unfortunately, a website form that requires both session cookies and JavaScript to submit correctly needs a little more code than a basic <form> tag, but it shouldn't be difficult to make that code generic and manipulate the browser DOM so that the developer-facing implementation stays very simple. On top of that, requiring moderation for anything that includes a hyperlink means that even the human spammers don't get through.
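
As an illustration of what that generic, DOM-manipulating code might look like, here is a minimal browser-side sketch in TypeScript. The hidden-field name js_token and the data-token attribute are my own assumptions for the example, not details taken from the system above.

```typescript
// Sketch only: attach a hidden, JavaScript-generated field to every form on
// the page, so clients that never run scripts submit the form without it.
// The field name "js_token" and the data-token attribute are assumptions.

document.addEventListener("DOMContentLoaded", () => {
  document.querySelectorAll("form").forEach((form) => {
    // The token would normally be issued server-side and tied to the
    // visitor's session; here it is read from a data attribute for brevity.
    const token = form.dataset.token ?? "";

    const hidden = document.createElement("input");
    hidden.type = "hidden";
    hidden.name = "js_token";
    hidden.value = token;
    form.appendChild(hidden);
  });
});
```

Because the field is added only when the script actually runs in a browser that has kept the session alive, the server-side check shown earlier can treat its absence as a strong signal that the submission came from a bot.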
After very close to three years of running such a system on various websites that I look after, it has yet to be cracked by any automated bot and has produced no false positives (apart from flagging a very small number of posts for moderation because people submitted legitimate URLs). The only human (non-automated) spammer who has made it through to moderation comes from one IP address in Russia that regularly tries to include a link to a fake finance blog, which again is very easy to block with the right back end in place.
By Theo Gray on October 6, 2010