Please note that some examples might be outdated.
Use .htaccess files to secure your website
Sometimes you have no choice but to protect your website yourself, for example if your hosting provider doesn't offer a Web Application Firewall (WAF) security solution. So, what .htaccess rules can you use to protect your website from online threats?
Did you know you can use .htaccess on IIS (Windows Server)? Helicon Ape provides support for Apache .htaccess and .htpasswd configuration files in Microsoft IIS. And it's pretty easy to set up!
But first things first, what is a Web Application Firewall?
A web application firewall (WAF) is an appliance, server plugin, or filter that applies a set of rules to an HTTP conversation. Generally, these rules cover common attacks such as cross-site scripting (XSS) and SQL injection. By customizing the rules to your application, many attacks can be identified and blocked. The effort to perform this customization can be significant and needs to be maintained as the application is modified.
Helicon Ape can be used, in a way, to prevent basic SQL injection attacks too.
But there is a small downside to using Helicon Ape to secure websites: the .htaccess file needs to be read for every HTTP request. Even though it is cached for some time, this may cost some performance.
Using the .htaccess rewrite rules below, you can protect your website somewhat from online threats. An important fact is that you know your own website best: you need to know which requests you can expect, which requests are valid, and which requests need to be blocked.
Here are some simple .htaccess examples and snippets for you. Where possible, I try to keep them cross-platform, for both Windows Server IIS and Linux Apache.
Also see my post 3 Ways of blocking sendmail.php on an IIS web server.
Want to learn everything about Access Control in Apache .htaccess and WordPress? Ever wondered when to use 'Allow/Deny from all' or 'Require All Granted/Denied'? I explain it in my post WordPress .htaccess security best practices in Apache 2.4.6+.
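As a quick reminder of the difference, this is what denying all access looks like in both syntaxes, Apache 2.2 versus Apache 2.4:
# Apache 2.2 and earlier (mod_authz_host):
Order deny,allow
Deny from all
# Apache 2.4+ (mod_authz_core):
Require all denied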
Magento app/etc/local.xml security
Older Magento versions suffered from a directly accessible app/etc/local.xml file containing MySQL database credentials. Block requests to Magento's local.xml file easily:
# Secure Magento's local.xml file which contains MySQL database credentials
# See http://www.saotn.org/magento-appetclocal-xml-beveiliging/
#
RewriteEngine On
RewriteCond %{REQUEST_URI} app/etc/local\.xml$ [NC]
RewriteRule .? - [F,L]
Protect against known SQL injection attacks through HTTP GET
# Secure your website from some known SQL injection attacks:
# Enable the rewrite engine
RewriteEngine On
# continue with SQL functions
# this works only on HTTP GET, *not* POST body
RewriteCond %{THE_REQUEST} (?:limit|union|select|concat|1==1|like|drop|\#|--) [NC]
RewriteRule .? - [F,L]
# Use mod_security to access the POST body
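For reference, a minimal ModSecurity 2.x sketch that inspects the POST body for the same keywords might look like this. Note that these directives belong in the server or virtual host configuration, not in .htaccess, and that the rule id and keyword list here are illustrative assumptions:
SecRuleEngine On
SecRequestBodyAccess On
# Deny POST bodies containing SQL keywords (a rule id is mandatory in ModSecurity 2.7+):
SecRule REQUEST_BODY "(?i:union|select|concat|drop)" "id:100001,phase:2,deny,status:403"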
Block spam bots in a .htaccess file
# block spambots (unverified!)
RewriteEngine On
# (spaces in agent names are matched with a '.'; duplicate entries removed)
RewriteCond %{HTTP_USER_AGENT} (?:Alexibot|Art-Online|asterias|BackDoorbot|Black.Hole|BlackWidow|BlowFish|botALot|BuiltbotTough|Bullseye|BunnySlippers|Cegbfeieh|Cheesebot|CherryPicker|ChinaClaw|CopyRightCheck|cosmos|Crescent|Custo|DISCo|DittoSpyder|Download.Demon|eCatch|EirGrabber|EmailCollector|EmailSiphon|EmailWolf|EroCrawler|Express.WebPictures|ExtractorPro|EyeNetIE|FlashGet|Foobot|FrontPage|GetRight|GetWeb!|Go-Ahead-Got-It|Go!Zilla|GrabNet|Grafula|Harvest|hloader|HMView|httplib|HTTrack|humanlinks|Image.Stripper|Image.Sucker|Indy.Library|InfonaviRobot|InterGET|Internet.Ninja|Jennybot|JetCar|JOC.Web.Spider|Kenjin.Spider|Keyword.Density|larbin|LeechFTP|Lexibot|libWeb/clsHTTP|LinkextractorPro|LinkScan/8.1a.Unix|LinkWalker|lwp-trivial|Mass.Downloader|Mata.Hari|Microsoft.URL|MIDown.tool|MIIxpc|Mister.PiX|moget|Mozilla/3.Mozilla/2.01|Mozilla.*NEWT|Navroad|NearSite|NetAnts|NetMechanic|NetSpider|Net.Vampire|NetZIP|NICErsPRO|NPbot|Octopus|Offline.Explorer|Offline.Navigator|Openfind|PageGrabber|Papa.Foto|pavuk|pcBrowser|Program.Shareware.1|ProPowerbot/2.14|ProWebWalker|psbot/0.1|QueryN.Metasearch|ReGet|RepoMonkey|RMA|SiteSnagger|SlySearch|SmartDownload|Spankbot|spanner|Superbot|SuperHTTP|Surfbot|suzuran|Szukacz/1.4|tAkeOut|Teleport|Teleport.Pro|Telesoft|The.Intraformant|TheNomad|TightTwatbot|Titan|toCrawl/UrlDispatcher|True_Robot|turingos|Turnitinbot/1.5|URLy.Warning|VCI|VoidEYE|WebAuto|WebBandit|WebCopier|WebEMailExtrac.*|WebEnhancer|WebFetch|WebGo.IS|Web.Image.Collector|WebLeacher|WebmasterWorldForumbot|WebReaper|WebSauger|Website.eXtractor|Website.Quester|Webster.Pro|WebStripper|Web.Sucker|WebWhacker|WebZip|Wget|Widow|[Ww]eb[Bb]andit|WWW-Collector-E|WWWOFFLE|Xaldon.WebSpider|Xenu's|Zeus) [NC]
# Prohibit and block the request:
RewriteRule .? - [F,L]
Deny access to known PHP backdoors like l_backuptoster.php
# Deny access to l_backuptoster.php / l_backuptoster_backup.php
# (PHP backdoor)
RewriteEngine On
RewriteCond %{REQUEST_URI} (^/l_backuptoster\.php) [NC]
# or match any location
# RewriteCond %{REQUEST_URI} (l_backuptoster\.php) [NC]
# or expanded
# RewriteCond %{REQUEST_URI} l_backuptoster(_backup)?\.php [NC]
RewriteRule .? - [F,L]
Joomla Search Engine Friendly (SEF) .htaccess
# From the standard Joomla .htaccess file, as reference
#
# Block out any script trying to base64_encode data within the URL.
RewriteCond %{QUERY_STRING} base64_encode[^(]*\([^)]*\) [OR]
# Block out any script that includes a <script> tag in URL.
RewriteCond %{QUERY_STRING} (<|%3C)([^s]*s)+cript.*(>|%3E) [NC,OR]
# Block out any script trying to set a PHP GLOBALS variable via URL.
RewriteCond %{QUERY_STRING} GLOBALS(=|\[|\%[0-9A-Z]{0,2}) [OR]
# Block out any script trying to modify a _REQUEST variable via URL.
RewriteCond %{QUERY_STRING} _REQUEST(=|\[|\%[0-9A-Z]{0,2})
# Return 403 Forbidden header and show the content of the root homepage
RewriteRule .* index.php [F]
The flag F means "Forbidden" and L means "Last" (stop rewriting this request).
.htaccess RewriteMap as blacklist
Using a RewriteMap, .htaccess files are very well suited to block IP addresses of known abusers. You can easily create a web blacklist, as described in this post.
Here's how:
First, you have to create a text file called "blacklist.txt" and put all IP addresses you want to block in that file. Because a RewriteMap uses a key / value structure, you have to add a key/value line. For example:
203.0.113.15 -
The IP address 203.0.113.15 is the key, and "-" the value.
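A blacklist.txt with multiple entries simply repeats this pattern, one IP address per line (these addresses are placeholders from the reserved documentation ranges):
203.0.113.15 -
198.51.100.23 -
192.0.2.44 -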
Second, preferably place your blacklist.txt file somewhere outside the web root, and include the file as a RewriteMap. Note that RewriteMap takes no flags like [NC], and that on Apache the directive may only be declared in the server or virtual host configuration, from where the map can be referenced in your .htaccess file (under Helicon Ape on IIS it can live in the .htaccess file itself, as shown here):
RewriteMap blacklist txt:D:/path/to/your/blacklist.txt
# Or on Linux: RewriteMap blacklist txt:/path/to/your/blacklist.txt
RewriteCond %{REMOTE_ADDR} (.*)
RewriteCond ${blacklist:%1|NOT_FOUND} !NOT_FOUND
RewriteRule .? - [F,L]
Line by line explanation
- you declare a map named 'blacklist', backed by the file 'blacklist.txt'
- the REMOTE_ADDR (the visitor's IP address) is captured as the look-up key
- the value of that REMOTE_ADDR key is looked up in the blacklist.txt map
- if a value is found, the RewriteRule is executed and the request is blocked
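On Apache, a minimal sketch of declaring the map at the virtual host level (the path and domain name are assumptions for illustration) could look like this:
<VirtualHost *:80>
    ServerName www.example.com
    RewriteEngine On
    # Declare the map here; rules in .htaccess can then reference ${blacklist:...}
    RewriteMap blacklist txt:/etc/apache2/blacklist.txt
</VirtualHost>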
A ready-to-use PHP blacklist web application is found here:
- Filter web traffic with blacklists. Always verify with your host whether .htaccess files are supported.
Block 0-day exploits, Remote File Inclusion (RFI) exploits, and so on in .htaccess
Use with care! If your hosting provider offers support for .htaccess files and you are on top of new vulnerabilities in web software, you can make use of the rewriting capability of .htaccess, whether that is Apache mod_rewrite or Helicon Ape in IIS on Windows Server.
Use the rewrite engine, for instance, to block known - or even yet unknown 0-day - exploits in the software you use, by matching on either a QUERY_STRING, REQUEST_URI or REQUEST_FILENAME.
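A hypothetical sketch of what this can look like; the parameter names and file names below are made-up examples, not real vulnerabilities:
RewriteEngine On
# Block a query string parameter abused by an (imaginary) vulnerable script:
RewriteCond %{QUERY_STRING} (cmd|exec)= [NC,OR]
# Block requests to a known-vulnerable script by its URI:
RewriteCond %{REQUEST_URI} /vulnerable-script\.php [NC,OR]
# Block requests for leftover backup files by file name:
RewriteCond %{REQUEST_FILENAME} \.(bak|old)$ [NC]
RewriteRule .? - [F,L]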
In the above part, I showed you how to protect your website from some known exploits. Please keep in mind: you cannot scan, filter and block POST payloads with .htaccess. You need ModSecurity for that.
Now follows a practical tip to block Remote File Inclusion (RFI) and Cross-Site Scripting (XSS) attacks, simply by blocking all requests to external HTTP addresses and websites.
Use .htaccess rules to block requests to remote URLs & secure your website
Lots of vulnerabilities in scripts are exploited by requesting remote files and content. This is called Remote File Inclusion: the output of remote scripts is executed within the context of the vulnerable script and server. A notorious example is TimThumb, which is used in many WordPress themes and plugins.
If you know what to look for, you can easily block those requests to remote files with a .htaccess file, for example when a remote domain name is provided as a URL parameter. The following example looks at the ?src= query string parameter and blocks the request if it doesn't match our own domain name.
RewriteEngine On
# is a src= parameter present in the query string?
RewriteCond %{QUERY_STRING} (src=) [NC]
# optionally preceded by http:// or https://
RewriteCond %{QUERY_STRING} ((http(s)?)://)? [NC]
# and the query string does not contain our own domain name:
RewriteCond %{QUERY_STRING} !((.+\.)?(example\.com)) [NC]
RewriteRule .? - [F,L]
Replace example.com with your own domain name. You need to escape the dot (".") in the expression with a backslash (\).
The .htaccess rule explained
The above .htaccess rule evaluates three conditions against the query string and blocks the request with a Forbidden flag if all conditions are met:
- if src= is provided in the query string
- optionally with HTTP or HTTPS as protocol
- and it doesn't contain your own domain name example.com
- then refuse the request
For example, an HTTP request with query string parameter ?src= and value http://evil_hacker_site.com:
http://localhost/wp-content/themes/thema/timthumb.php?src=http://evil_hacker_site.com/exploit.php
is blocked with a 403 Forbidden HTTP status code.
RewriteCond matching in .htaccess
The above RewriteCond condition (or rule) matches at random positions in the query string. Keep that in mind. The following URL is matched too:
http://localhost/wp-content/themes/thema/timthumb.php?foo=http://&src=123&bar=example.com
Also, try to combine as much as possible in one rule:
RewriteEngine On
# one condition: src=, an optional protocol, and a negative lookahead
# that fails the match when our own domain name follows:
RewriteCond %{QUERY_STRING} src=(http(s)?://)?(?!((.+\.)?example\.com)) [NC]
RewriteRule .? - [F,L]
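As noted above, this still matches src= at random positions in the query string. A possible refinement - a sketch, so test it against your own site before relying on it - is to anchor src= to the start of the query string or to a preceding & separator:
RewriteEngine On
# src= must start the query string or follow an '&' separator:
RewriteCond %{QUERY_STRING} (^|&)src=(http(s)?://)?(?!((.+\.)?example\.com)) [NC]
RewriteRule .? - [F,L]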
Be careful though, web application security firm Acunetix warns against using .htaccess for security restrictions:
Many PHP web applications use .htaccess files to restrict access to specific files or directories that may contain sensitive information. For example, in order to restrict access to all files in a specific directory you can create a .htaccess file in that directory containing the string “deny from all”. In many cases it is wrong to impose security restrictions using .htaccess files.
Acunetix - htaccess files should not be used for security restrictions