I woke up this morning to the sound of thousands of 404 requests hitting the server. It’s sad that there are kiddies out there who have nothing better to do than buy some pathetic $50 script and then sit there like imbeciles harassing people for hours on end. But alas, that is the world we live in. Fortunately, it takes only a few lines of good old .htaccess to block the entire scan.

About the scans

Before getting to the code, a bit more about the scans that it protects against. Apparently some desperate lowlife released yet another script that scans sites for vulnerabilities and exploit opportunities. Suddenly there are waves of scans probing for a vulnerability via the following query-string parameter:

?whatever=http://www.google.com/humans.txt?

Unless this means something to your server, thousands of scans for it are simply wasting time, bandwidth, energy, money, and everything else. Responding with 404 is fine if you’ve optimized the response, otherwise they’re basically leeches sucking the lifeblood from your business.
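Optimizing the 404 response mostly means serving something small and static instead of letting your CMS assemble a full page for every probe. A minimal sketch in .htaccess (the /404.html filename is an assumption; point it at whatever lightweight error page you actually have):

```apacheconf
# serve a small static page for 404s so scan traffic stays cheap
ErrorDocument 404 /404.html
```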

In a few months (or less) the script should be much less active, but right now too many idiots are running it. Untold numbers of sites are being assaulted with relentless, malicious URL requests by the thousands because, apparently, it’s not enough to run the script only once per site: they keep it on auto-loop, repeating the same set of several hundred queries over and over.

Here is a sampling of the requests made by the “humans.txt” scans:

http://example.com/admin.php?lang=http://www.google.com/humans.txt
http://example.com/zoomstats/libs/dbmax/mysql.php?GLOBALS['lib']['db']['path']=http://www.google.com/humans.txt?
http://example.com/ytb/cuenta/cuerpo.php?base_archivo=http://www.google.com/humans.txt?
http://example.com/yabbse/Sources/Packages.php?sourcedir=http://www.google.com/humans.txt?
http://example.com/xt_counter.php?server_base_dir=http://www.google.com/humans.txt?
http://example.com/xarg_corner_top.php?xarg=http://www.google.com/humans.txt?
http://example.com/xoopsgallery/init_basic.php?GALLERY_BASEDIR=http://www.google.com/humans.txt?&2093085906=1&995617320=2
http://example.com/xarg_corner_bottom.php?xarg=http://www.google.com/humans.txt?
http://example.com/wsk/wsk.php?wsk=http://www.google.com/humans.txt?
http://example.com/xarg_corner.php?xarg=http://www.google.com/humans.txt?
http://example.com/wp-content/plugins/wp-table/js/wptable-button.phpp?wpPATH=http://www.google.com/humans.txt?
http://example.com/wp-content/plugins/wordtube/wordtube-button.php?wpPATH=http://www.google.com/humans.txt?
http://example.com/wp-content/plugins/mygallery/myfunctions/mygallerybrowser.php?myPath=http://www.google.com/humans.txt?
http://example.com/wp-cache-phase1.php?plugin=http://www.google.com/humans.txt?
http://example.com/worldpay_notify.php?mosConfig_absolute_path=http://www.google.com/humans.txt?
http://example.com/wordpress/wp-content/plugins/sniplets/modules/syntax_highlight.php?libpath=http://www.google.com/humans.txt?
.
.
.

This sort of scan is malicious for a number of reasons, but most annoying is the sheer volume of requests now hitting servers looking for a “humans.txt”-related response. It’s utterly pathetic how many times this scanning script is executed with nothing more than a single potential payload (i.e., the one made possible via the http://www.google.com/humans.txt? vulnerability). What a waste.

Until usage of this scan script slows down, you may want to take a moment and check whether your server logs show any signs of getting hit — and you’ll know immediately, because thousands upon thousands of humans.txt requests are hard to miss.
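A quick way to tally the hits is to grep your access log for humans.txt appearing after a query-string “?”. The log path and sample lines below are assumptions so the example is self-contained; against a real server you’d point the final grep at something like /var/log/apache2/access.log:

```shell
# stand-in for your real access log
cat > /tmp/sample-access.log <<'EOF'
1.2.3.4 - - [10/Jan/2014:00:01:02 +0000] "GET /admin.php?lang=http://www.google.com/humans.txt HTTP/1.1" 404 512
5.6.7.8 - - [10/Jan/2014:00:01:03 +0000] "GET /humans.txt HTTP/1.1" 200 256
1.2.3.4 - - [10/Jan/2014:00:01:04 +0000] "GET /wsk/wsk.php?wsk=http://www.google.com/humans.txt? HTTP/1.1" 404 512
EOF

# count only scan requests: humans.txt somewhere after a "?" in the request
grep -c '?.*humans\.txt' /tmp/sample-access.log
# → 2
```

Note that the legitimate GET /humans.txt request on the second line is not counted, since it carries no query string.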

I could sit here all day and discuss the matter, but my time is limited, and 50-dollar kiddies are simply not worth the effort. Stopping malicious nonsense, however, IS worth my time, so let’s look at a drop-dead simple way to stop the entire “humans.txt” scan dead in its tracks.

Block the humans.txt scanning with .htaccess

As mentioned, all of the requests for this particular scan are targeting, via query string, “humans.txt”, which makes my job super-easy. Crack open your root .htaccess file and add the following snippet:

# block humans.txt scans
<IfModule mod_rewrite.c>
	# match the scan's payload URL anywhere in the query string;
	# dots are escaped so they match literally, and the trailing "?"
	# is omitted because some of the requests arrive without it
	RewriteCond %{QUERY_STRING} http://www\.google\.com/humans\.txt [NC]
	RewriteRule .* - [F,L]
</IfModule>
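If you would rather catch similar remote-file-inclusion probes regardless of which URL the script points at, a broader condition can forbid any full remote URL in the query string. This variant is my own extension, not part of the rule above, and it will need whitelisting if your site legitimately passes URLs via query strings:

```apacheconf
# broader rule: forbid any query string carrying a full remote URL
# (catches the humans.txt scan plus similar RFI probes)
<IfModule mod_rewrite.c>
	RewriteCond %{QUERY_STRING} https?:// [NC]
	RewriteRule .* - [F,L]
</IfModule>
```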

I have this code installed on most of my sites now, and thankfully the pesky scans have stopped completely. Of course, they’ll be back with a new $50 script next month, but until then it’s nice to conserve server resources and keep error logs clear of nonsense.

I can hear it already: “wait, why are you blocking requests for the humans.txt file?” I’m not; this technique blocks requests that include http://www.google.com/humans.txt? in the query string. Consider:

http://example.com/humans.txt == good request, never blocked.
http://example.com/?some_path=http://www.google.com/humans.txt? == bad request, always blocked.
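To sanity-check the distinction, you can try the same pattern the RewriteCond uses against both query strings. This is just a sketch using grep rather than Apache itself; mod_rewrite evaluates the same PCRE-style regex against QUERY_STRING:

```shell
pattern='http://www\.google\.com/humans\.txt'

# query string of the bad request: the pattern matches, so the rule fires
echo 'some_path=http://www.google.com/humans.txt?' | grep -E "$pattern"
# → some_path=http://www.google.com/humans.txt?

# a normal request for /humans.txt has no such query string; the pattern
# finds nothing to match, so the rule leaves it alone
echo 'page=contact' | grep -vE "$pattern"
# → page=contact
```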

Hopefully that clears up any potential confusion regarding this simple yet effective solution.

Note: I’ve also added this snippet to the recently posted 2014 Micro Blacklist, so no need to add both 😉

Happy hunting, people.
