Protection of specific URLs from bots, cutting-off of automated spam
2012-01-01, 13:32

This text describes my Protect URLs plugin, which makes it possible to protect specific URLs of a site from access by bots and malicious scripts. The creation of such a plugin was inspired by my observation that even excellent forum software (for example, Simple Machines Forum — SMF) may be accessed by bots that are able to register accounts and post automated spam.

The plugin was primarily designed for my copy of an SMF forum, but it is coded in a way that makes it possible to use it with any PHP-driven site.

The concept

The Protect URLs plugin consists of three small parts:

  1. Configuration (__protect_urls.conf.php) — this is the part where you define which URLs of your site you would like to protect from access by bots.
  2. Main (__protect_urls.php) — this part analyzes what a visitor requested and takes further action.
  3. Trap (index1.php) — if a visitor requests one of the protected URLs, the main part sends them here; the trap is a small webpage that automatically redirects human visitors back to the site (i.e., to the URL they originally requested), while bots and scripts are not sent back and silently die on this trap minipage.
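A minimal sketch of how the main part could be put together (the three file names come from the article; the variable names, the matching logic, and the back parameter are my assumptions for illustration):

```php
<?php
// __protect_urls.php — hypothetical sketch of the "main" part.
// The config (__protect_urls.conf.php) is assumed to define
// $protected_urls, an array of URL prefixes to guard.

// True when the requested URL starts with one of the protected URLs.
function is_protected_url($requested, array $protected_urls)
{
    foreach ($protected_urls as $url) {
        if (strpos($requested, $url) === 0) {
            return true;
        }
    }
    return false;
}

// Only act when we are actually serving a web request.
if (isset($_SERVER['REQUEST_METHOD'])) {
    require_once '__protect_urls.conf.php';

    // Reconstruct the real URL (script + query string), which stays
    // stable even when mod_rewrite shows a prettier virtual path.
    $requested = $_SERVER['SCRIPT_NAME']
               . (isset($_SERVER['QUERY_STRING']) && $_SERVER['QUERY_STRING'] !== ''
                  ? '?' . $_SERVER['QUERY_STRING'] : '');

    if (is_protected_url($requested, $protected_urls)) {
        // Divert the visitor to the trap, remembering where they came from.
        header('Location: index1.php?back=' . rawurlencode($requested));
        exit;
    }
}
```

Prefix matching is enough here because SMF-style URLs carry the action at the front of the query string; an exact-match or regex check would work just as well.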

In order to cut off automated spam, you need to protect two URLs on your forum: the page where people register/create their accounts and the page where they log in. A side effect of using my Protect URLs plugin is that search engines, too, are unable to access/index the pages under protection. However, it is worth noting that the „login” and „register” pages are not very valuable to search engines anyway — they are primarily designed for humans, and humans can freely access content and functionality regardless of the imposed protection.
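For a stock SMF forum installed in the /forum subfolder, those two entries might look like this in the config (a sketch: the $protected_urls setting name is my assumption; the URLs are SMF's standard register and login actions):

```php
<?php
// __protect_urls.conf.php — example entries for an SMF forum
// installed in /forum. The variable name is an assumption.
$protected_urls = array(
    '/forum/index.php?action=register',  // account-creation page
    '/forum/index.php?action=login',     // login page
);
```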

Short manual

Several simple steps to activate the Protect URLs plugin:

  1. Download and unpack the package.
  2. Adjust/edit the settings in __protect_urls.conf.php file.

    (The config included in the package is a ready-to-use file for users who installed their SMF forum in the /forum subfolder of a domain. If you have installed your SMF forum in the root of a domain, or if you intend to use my plugin with another type of PHP-driven site, you need to make several simple adjustments in the config. Please read my comments on the settings carefully; their names, in most cases, should be self-explanatory.)

  3. Copy files to the same folder where your site’s main file (usually: index.php) is located.
  4. At the top of your site’s main file (index.php) add a statement that loads the plugin’s main part; with the default file names this is:

     require_once('__protect_urls.php');

  5. Done; enjoy your spam-free site!


Many contemporary sites rewrite URLs through Apache’s mod_rewrite module. In order to use my plugin with such a site, you need to put the actual/real URLs to be protected into the config file, not the virtual paths displayed in a browser’s address bar. You need to be either familiar with the rewrite rules used by your site or ask for the assistance of someone who knows what is going on.
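As a hypothetical illustration: if a rewrite rule maps a pretty path onto the real script, the config must list the real URL, not the pretty one (both the rule and the paths below are invented for the example):

```php
<?php
// Suppose .htaccess contains a rule along the lines of:
//   RewriteRule ^register/?$ index.php?action=register [L,QSA]
// The address bar shows the virtual path, but the script PHP
// actually executes is index.php with a query string, so:
$protected_urls = array(
    '/forum/index.php?action=register',  // right: the real URL
    // '/forum/register/',               // wrong: the virtual path
);
```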

There are several advantages to using this plugin. Here I would like to mention the ones I noticed while running my copy of an SMF forum. Despite its low traffic, spam bots found the forum very quickly and started posting automated spam. As a first countermeasure, I switched registration from „confirmation by email” to „acceptance by admin” mode. This stopped the automated posts, but I still kept getting notifications about new users created by bots. Then I started to ban IPs, single addresses or whole ranges. This significantly slowed down the creation of new users by bots, but the error logs reporting unsuccessful attempts started to grow rapidly. In other words, I drifted from one problem into another...

One day my brain was dazzled by a simple idea: why not use a trick from Google Analytics? Google Analytics counts visits by triggering its code from a piece of JavaScript. Visits by humans are counted because their browsers are able to interpret/execute the embedded JavaScript code, while bots are omitted. This was exactly what I needed: I would send all visitors requesting the crucial URLs to nowhere (the plugin’s „trap”), but give real users a chance to be rescued by the „JavaScript lifeboat” and return them to the point from which they were initially thrown out.
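The trap could be sketched like this (the index1.php name comes from the article; the back parameter and the helper functions are my assumptions):

```php
<?php
// index1.php — sketch of the "trap" page. Browsers execute the
// JavaScript below and jump back to the originally requested URL;
// bots that do not run JavaScript stay here and die quietly.

// Accept only local absolute paths, so the trap cannot be abused
// as an open redirect to arbitrary external sites.
function sanitize_back($back)
{
    if (!is_string($back) || strpos($back, '/') !== 0 || strpos($back, '//') === 0) {
        return '/';
    }
    return $back;
}

// Build the minimal page containing the "JavaScript lifeboat".
function trap_page($back)
{
    $target = json_encode(sanitize_back($back));  // safe JS string literal
    return "<!DOCTYPE html>\n"
         . "<html><head><title>One moment...</title></head>\n"
         . "<body><script>window.location.replace($target);</script>\n"
         . "<noscript>Please enable JavaScript to continue.</noscript>"
         . "</body></html>";
}

echo trap_page(isset($_GET['back']) ? $_GET['back'] : '/');
```

Restricting the return URL to local paths is worth the two extra lines: without it, a redirect page that echoes a request parameter becomes an open redirect.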

The whole process is completely transparent to visitors. They do not have to solve any tests, easy or difficult, to prove their humanity — that humanity is already confirmed by their use of a human-made technology (the browser), isn’t it?

I believe it is. Now I have no automated spam, no automated user creation, and clean logs on my forum. You can achieve this, too. If you have read to this point, the solution is very near... ;)

Download This Attachment... Protect URLs ver. 1.0 — the very latest release as tar.bz2 package
Last modified: 2012-01-04, 00:00 — Size: 3,58 KB
Download This Attachment... Protect URLs ver. 1.0 — the very latest release as tar.gz package
Last modified: 2012-01-04, 00:00 — Size: 3,51 KB
Download This Attachment... Protect URLs ver. 1.0 — the very latest release as zip package
Last modified: 2012-01-04, 00:00 — Size: 4,49 KB
Download This Attachment... Protect URLs ver. 1.0 — version with config adjusted for MODx Evolution manager
Last modified: 2012-01-29, 00:00 — Size: 4,29 KB