I am trying out a method to reduce bot attempts on forms like the one on my contact page, based on fluffy’s example.
On select pages, I now check for a specific cookie. If it is missing or more than 24 hours old, the browser redirects to the “Sentience Check” page. That page is a minimal form with a button to indicate “Yes, I am a hooman.” Submitting the form sets the expected cookie and redirects back to the original page. If JavaScript is enabled, a script submits the form as soon as the page loads, so most hooman visitors will only see the intermediate page for a second and should be able to continue without issues.
Also at fluffy’s suggestion, the sentience check page returns a 429 Too Many Requests status code with a Retry-After header of one hour. I don’t expect many bots to respect that, but maybe the lack of successful response codes will cause some to back off.
The last thing I did was add a noindex meta tag on the page, so search engines should ignore it.
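Put together, the intermediate page’s response might look something like the sketch below: the 429 status, the Retry-After header, and the noindex meta tag in the markup. The handler shape and markup here are hypothetical, assuming a plain WSGI-style setup rather than whatever the site actually runs.

```python
def sentience_check_response() -> tuple[str, list, str]:
    """Build a hypothetical status, header list, and body for the check page."""
    status = "429 Too Many Requests"
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # Hint (likely ignored by bots) that clients should retry in an hour.
        ("Retry-After", "3600"),
    ]
    body = (
        "<!DOCTYPE html><html><head>"
        '<meta name="robots" content="noindex">'  # tell search engines to skip it
        "<title>Sentience Check</title></head><body>"
        '<form method="post"><button>Yes, I am a hooman</button></form>'
        "</body></html>"
    )
    return status, headers, body
```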
If you’d like to view the page, I recommend turning JavaScript off temporarily and then visiting: gregorlove.com/sentience-check/.
I am interested to see how much this will reduce bot attempts on the contact and public sign-in pages. I have had CSRF and honeypot form field protections on both for quite a while, but of course I still see a lot of attempts on them.
Depending on how this goes, I might expand its usage to the “send a webmention” form and explore using it to block LLM bots.
I did consider using “I am a meat popsicle” on the button, but not everyone might get The Fifth Element reference.
