WordPress Hacked? Total Security Lockdown

 

It is no huge secret that this WordPress blog has been hacked twice this year, but some consolation is that I am not alone.

Helpful resources

Alex recently launched a DVD course on WordPress security that is available for FREE + shipping
Stop – I know what you are thinking – "FREE + shipping" these days normally comes with lots of strings attached, such as hidden forced-continuity billing. Whilst Alex does cross-sell a few related products, the main offer is genuinely free.

Michael VanDeMar has a useful plugin to lock down your login process

SEO Egg Head offers a WordPress firewall

Donna has a useful script for monitoring your files

Of course you should also keep backups which you have total control over. This includes both the database and the files, and you shouldn't rely on claims that your webhost keeps a backup. With a VPS I find being able to "roll back" to a previous version useful, but backups supposedly made by admins on shared hosting plans are not a solution when you need to fix things in minutes.
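As a rough sketch of a backup you control yourself, something like the following could be run from cron. The paths and database name here are assumptions, not anything specific to this blog, so adjust them for your own host:

```shell
#!/bin/sh
# Minimal WordPress backup sketch: database dump + file archive.
# SITE_DIR, DB_NAME and BACKUP_DIR are placeholder defaults.
SITE_DIR="${SITE_DIR:-/var/www/blog}"
DB_NAME="${DB_NAME:-wordpress}"
BACKUP_DIR="${BACKUP_DIR:-$HOME/backups}"
STAMP=$(date +%Y%m%d)

mkdir -p "$BACKUP_DIR"

# Database: dump if mysqldump is available on this host
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump "$DB_NAME" > "$BACKUP_DIR/db-$STAMP.sql" 2>/dev/null \
        || echo "database dump failed (check credentials)"
else
    echo "mysqldump not found; skipping database dump"
fi

# Files: archive the whole site directory if it exists
if [ -d "$SITE_DIR" ]; then
    tar -czf "$BACKUP_DIR/files-$STAMP.tar.gz" \
        -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
fi

echo "backup written to $BACKUP_DIR"
```

Copy the resulting files off the server as well; a backup sitting next to the site it protects is no help if the whole account is compromised.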

Keep WordPress up to date, plugins up to date etc

Part of security is controlling what bots can crawl and index on your site, so a couple of guides are useful as well:

Getting URLs outta Google – the good, the popular, and the definitive way
Handling Google’s neat X-Robots-Tag – Sending REP header tags with PHP
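As an aside, the X-Robots-Tag header doesn't have to come from PHP; if mod_headers is enabled, a few lines of .htaccess can send it too. The file pattern below is only an illustration:

```apache
# Keep PDFs and text dumps out of the index without blocking the crawl
<FilesMatch "\.(pdf|txt)$">
  Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```

Unlike robots.txt, this lets bots crawl the URL but tells them not to index it.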

Nasty Bots & Users

A lot of security relies on identifying nasty bots, detecting rogue activity such as failed logins, preventing access to all but approved users with an additional layer of password protection, or only allowing access to a server from a specific IP address or range.

Also it is important to realise that different WordPress implementations require different levels of access control. With WordPress frequently being used for membership sites, you need to allow access to members, which reduces the number of security options available.
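As an example of the IP-based layer mentioned above, a .htaccess file dropped into /wp-admin/ can restrict the whole area to a single address. The IP below is a documentation-range placeholder, and this is Apache 2.2 syntax:

```apache
# /wp-admin/.htaccess -- deny everyone except one trusted IP
Order deny,allow
Deny from all
Allow from 203.0.113.7
```

Obviously this only works if you connect from a static IP or a VPN with a fixed endpoint.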

SEO Benefits

Lots of the pages you want to block from being crawled for security purposes also need to somehow be blocked or removed from indexation for SEO purposes, so tightening up security using the right methods will have natural SEO benefits.

Robots.txt isn't the best option because you end up with lots of blocked pages appearing in search results, potentially indexed instead of the pages you want in the index. As Sebastian explained, you have to let the bots in to crawl a URL before you can redirect them.
Not all bots can be identified, and not all bots obey robots.txt, though you can trap the naughty ones. If you are serious about your bot control you might also consider Fantomaster's Searchbot Database.
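For completeness, a typical robots.txt fragment looks like this; remember that a disallowed URL can still end up indexed (URL-only) if enough links point at it, which is exactly the problem described above:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-login.php
```

Well-behaved crawlers honour this; the naughty ones treat it as a map of what you consider sensitive, which is another argument for the server-side methods below.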

User Agent Access Control For Total Lockdown

Lots of security and SEO methods rely on identifying various bots and kicking them somewhere else with 301 redirects, or denying them access to areas they are not wanted.

Far better would be to allow access to only one specific user agent, and globally kick out anything that doesn't match: the user agent equivalent of restricting access to a single IP address.

But how could this be achieved?

Many SEOs would already be familiar with User Agent Switcher for Firefox. This allows you to wander around the web pretending to be someone else, or something else such as Googlebot.

Unfortunately User Agent Switcher has a nasty problem: you often forget you have it switched to something different, and only realise when a website starts misbehaving, refusing you entry, or redirecting you to funny places.

If you created a custom user agent for security purposes, it wouldn't be very secure if there was a chance you could broadcast it to lots of other webmasters by mistake. It is bad enough that the user agent is broadcast "in the clear" unless you use SSL connections.

Then I came across an article discussing how to fake your user agent specifically for iTunes but not other sites.

The Header Control Firefox plugin allows you to set your User Agent specific to a domain.

This would allow you to set a specific unique or relatively obscure user agent, and for it to only be used when accessing your own websites.

Even more useful, this can be set up in multiple locations and works with variable IPs.

Experimental

This is something I am still experimenting with. I haven't decided whether it is best to use .htaccess, PHP or a combination of both, but I am convinced the best option is to 301 redirect everything rather than deny access.
A combination, .htaccess handing off to PHP, might also let you do some enhanced logging.
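One way to combine .htaccess with PHP for enhanced logging is to rewrite every non-matching request to a logging script. In this sketch, log.php is a hypothetical script of your own that records the request details before denying or redirecting:

```apache
RewriteEngine On
# Anything without the secret user agent is rewritten to a logging
# script (log.php is hypothetical); the script itself is excluded
# to avoid an internal rewrite loop
RewriteCond %{HTTP_USER_AGENT} !^RareUserAgent
RewriteCond %{REQUEST_URI} !^/log\.php
RewriteRule .* /log.php [L]
```

The PHP side then has the full request available ($_SERVER) and can log far more detail than the standard access log before deciding what to serve.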

The user agent doesn't have to be unique; it could just be an obscure, out-of-date version of Firefox or Chrome.

Example .htaccess to deny access

RewriteEngine On
# Return 403 Forbidden for any request whose user agent
# does not start with the secret string
RewriteCond %{HTTP_USER_AGENT} !^RareUserAgent
RewriteRule .* - [F,L]

Example .htaccess to 301 redirect

RewriteEngine On
# 301 redirect any request whose user agent does not start
# with the secret string to a page you want to boost
RewriteCond %{HTTP_USER_AGENT} !^RareUserAgent
RewriteRule ^ http://WhereIWantPagerank.com/MyMoneyPage/ [R=301,L]

What I haven't included are rewrite conditions based on specific paths, as I haven't worked out exactly which paths I need to block whilst using specific WordPress membership plugins.
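Once the right paths are known, limiting the lockdown to the sensitive areas would only need an extra condition, along these lines. The paths below are guesses for illustration, not a tested list for any particular membership plugin:

```apache
RewriteEngine On
# Only lock down admin and login URLs; the public site is untouched
RewriteCond %{REQUEST_URI} ^/(wp-admin|wp-login\.php)
RewriteCond %{HTTP_USER_AGENT} !^RareUserAgent
RewriteRule .* - [F,L]
```

A membership site would need additional conditions so that logged-in members can still reach whatever front-end pages the plugin protects.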

Warning 1 – always have backups
Warning 2 – you can majorly mess up access to your website with .htaccess if you get it wrong and can't restore a working version

Disclaimer/License: GNU FDL – run with it, make it useful

 


Comments

  1. says

    While it's very common to hear

    Keep WordPress up to date, plugins up to date etc

    it's important to remember that WordPress's release cycle means security fixes are bundled in with feature updates, which can potentially do more harm than the security bug you are patching. It is quite usual for WordPress to have a release and then several smaller patch releases within a few days to address the bugs they didn't find.

    It is therefore worth holding back from instantly updating the live version of your site; instead, test the update on your development copy (which is populated from your regular backup ;) ) and hold off a couple of days to allow bugs to surface from early adopters and your own testing. (It's important for open source software in particular that if you find bugs you report them, otherwise they won't get fixed.)

    While security is important, the chances of you being hacked on a given day are small; it is better to test and wait a few days with a potential security problem than to jump onto an untested ship, which has the potential to be catastrophic.

    I wrote an article back in April on the suitability of WordPress for enterprise clients, and why SEO agencies working with such clients should be careful.
    http://www.timnash.co.uk/04/2008/wordpress-security/

      • says

        It's a huge drain: as a plugin author, whenever we make a code change we have to test in multiple environments, including the previous version of WordPress, the current version and the current beta.

        To be honest we should test in more environments, but even so, maintaining that testing cycle is time consuming.

        As a consumer it is common sense to maintain your own development server, set up like your production server, to test plugin and core updates on. The cost in time and servers is still negligible against major downtime in most enterprises.

  2. says

    This happened to me once this year: index.php was deleted by some nasty hackers who were able to sign up as admin, back in the days when I was very much a newbie, because I had retained the default "admin" username as-is instead of changing it. I've learned my lesson and hopefully it won't happen again.

  3. says

    For those who run their own servers (dedicated, virtual private servers, or cloud instance), it’s also good to secure the underlying operating system (e.g. filesystem, host firewall, etc.) and related applications like the web server and database server. Any security you put on WordPress will be useless if the system it’s running on is compromised.