An Evergreen Content Case Study

Posted by ChristopherFielden

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

Creating timeless content is something all SEOs should aspire to do. Why? When placed in front of the right audience, amazing content is highly likely to generate ongoing interest, engagement, links, and traffic, leading to increased sales/conversions and brand awareness. These results tend to make all but the most difficult client quite happy.

Image by Dominic Alves

In 2012, I decided to undertake an evergreen content experiment. I created a piece of content that I planned to update regularly over an extended period of time. I was in this for the long haul — I wanted to keep working on this content for at least a year. The aim was to see if putting ongoing effort into one page on a website would prove more efficient than spending time planning and creating multiple pieces of fresh content.

Common content performance patterns

Many creative content campaigns produce spikes of interest when they’re released and then dwindle in popularity. If you’re nodding your head in agreement, this might look familiar:

Creative campaign referral traffic spike, taken from Google Analytics

When shared, you see a brief spike in traffic, and then visits decline. This example is taken from the release of a well-received infographic that saw a lot of visits when it was shared on Reddit.

A spike isn’t always a bad thing. At the last count, this example generated over 35 decent quality links (ranging from DA 30 to DA 82) and thousands of social shares. This is a good result, but I wanted to try and create something that saw continued growth in traffic, engagement, and links over time rather than a spike.

Evergreen experiment

So I could share the results openly without contravening any client confidentiality agreements, I decided to conduct the test on my personal website. I write fiction, and I originally created my website to showcase my short stories. I launched the site in October 2011.

Image by Rose Craft

I’m not famous. No one knows who I am. No one found my writing, because no one was searching for it. Aside from friends and family, few people read my stories. Boo hoo.

In an attempt to gain an audience, I decided to try and make my website useful to the short story-writing community (people who write short stories also like to read them). I’d spent a LOT of time researching short story competitions to enter. I’d found a few decent resources, like Booktrust, that listed some writing competitions, but none of the lists or calendars were exhaustive or kept very up to date, and many of them didn’t list the full range of details I was interested in (closing dates, prize money, word count limits, genres, publishing opportunities etc).

So I decided to create an extensive short story competition list on my website.

Research

I was fairly certain, given the number of competition lists in print magazines and the number of writing websites I’d found, that there would be an audience for this type of content. To be sure, I did some keyword research.

There was an audience. Further research showed there were a large number of long-tail keyword opportunities.

So I created the page, initially listing details of approximately 50 writing contests. The list went live in April 2012.

Page content

The page format is fairly simple. I started out with two tables, one listing regular writing competitions (monthly, quarterly, triannual and biannual) and another listing annual contests. Over time, I’ve added more tables so the resource is as easy to use as possible.

At the top of the page I openly invite users to contact me to have writing competitions listed. I also invite users to let me know if any of my details are incorrect, out of date, or if they find any broken links.

Use of outbound links

Again, to make the resource easy for writers to use, I’ve linked to all the competitions I’ve listed. I’ve read all sorts of discussions about outbound links and whether it’s best for them to be followed or nofollowed, as well as discussions about how many links you should have on a page, alongside concerns about the quality of the sites you link to and whether that has any impact on SEO.

As there doesn’t seem to be a definitive right or wrong way to do this, I decided to ignore all these concerns and just link to the most useful page on the different competition websites for the user. The only exception is when I link to a competition website that updates its URLs each time it updates the competition details. In this instance I link to the homepage to avoid excessive administration and maintenance of the page.

All links are followed.

Page maintenance

Image by Abhisek Sarda

From the day the page went live, I decided that I was going to display the date the content was last updated prominently at the top of the page. I wanted users and search engines to be able to see that the page was cared for and updated regularly.

I’ve read many arguments against using dates. This is usually because time constraints mean webmasters can’t update content regularly and the date often has the opposite effect, showing how out of date the content has become. But as I knew I’d be updating the page regularly, this wasn’t a concern.

I update the page at least twice a month, sometimes as frequently as twice a week, depending on how much time I have available.

On average, one competition contacts me a week, asking to be added to the list.

I respond to the vast majority of comments, either privately via email or as a comment, depending on what seems most appropriate given the subject matter.

Technical notes

My website is pretty basic. From a technical standpoint, I have ensured that the menu structures and URLs make sense and that my authorship has been set up correctly. Aside from that, all I’ve done is generate content. I’ve purposely kept the number of pages on the site low, only adding new pages when I have to. At the time of writing, the site has 36 pages.

No linkbuilding

While undertaking this experiment I haven’t done any active link building at all. Any links the website has gained have been natural. Likewise, I haven’t undertaken any outreach. I have only engaged with writers and competition administrators that have approached me directly.

I did this to see how well the page could perform naturally, with internet users initially finding the content via organic search. Over time, this has led to natural interaction through comments, social sharing and links (and the unavoidable plethora of spam comments in my inbox). But I haven’t actively pushed the content. The results have come from natural content discovery and users outreaching to me.

Results

Traffic

This first graph shows the growth in traffic to the entire site from all mediums since launch in October 2011:

Traffic from all mediums to entire site from October 2011 to May 2013

Below is a breakdown of the figures from the different mediums:

The second graph shows the visits from all mediums to the short story competition page from its launch in 2012:

Traffic from all mediums to short story competition page from April 2012 to May 2013

Since its launch, the short story competition list has accounted for 67% of all the visits landing on my website (total entries to all pages are 77,374 — page entries to the competition page are 51,861). Full details of growth in visits to the page from all mediums can be seen below:

Visits have increased substantially since the competition list was launched. The dip we can see in April and May seems to be due to seasonality. The page still ranks well for a wide variety of long-tail phrases, and the New Year and autumn are seasonal peaks in writing-related searches — admittedly, this is a generalisation, but as the site only launched in 2011 I don’t have a great deal of data to work with.

If patterns follow those of last year, I’d expect to see a rise in traffic in September.

Number of search terms

10,728 search phrases have been used to find the page through organic search.

Most popular search terms used to find the short story competition page

Given that ‘(not provided)’ accounts for 30% of these searches, it’s safe to assume that the figure is actually substantially higher, so there is a lot of long-tail search involved here.

The page’s large word count contributes to this. At the time of writing, there were 11,632 words of copy on the page, of which user comments accounted for 3,463 across 66 comments, some of them replies from me.

Social shares

The total number of social shares to date is 127:

Details generated using Shared Count

I find that writers will often share the page on Facebook and Twitter, as will administrators of the competitions I list, if they run social profiles. Since the beginning of 2013, I have seen the share counts rising more rapidly, which I would expect given the large increases in traffic the page has seen when compared to last year.

Links

You can see details of the links that have been attracted below:

Data taken from Majestic SEO

Results from Moz’s Open Site Explorer

The volume of links isn’t huge. But this project is aimed at slow growth, and I haven’t actively asked anyone for a link. I want links to be entirely natural, only coming from those who think the content is worth linking to of their own volition. The only exception I can think of is me writing about the experiment.

As the resource becomes more widely recognised, I would expect the number of links to increase accordingly. Recently, I have received my first university (.ac.uk) link, and started to receive correspondence from university lecturers who are involved with creative writing courses, asking about writing opportunities for their students (which led to me adding the ‘Writing Competitions for Young Writers & Children’ table to the page). This bodes really well for the future, as relevant university website links are likely to help the site’s performance greatly. And this kind of natural link building should make my backlink profile Penguin-proof long into the future.

I guess the key point here is that it’s taken almost a year of developing this content to start gaining links of this quality. Now that a handful of lecturers have found the site and started using it, it’ll be very interesting to see how the link results fare over the next twelve months.

Hmm, I feel another blog post coming on in the not so distant future…

There are a couple of other points to bear in mind:

  1. I’ve done this work in my spare time, around work and other commitments. If you had the time to focus fully on projects of this nature you could probably generate these types of results far more quickly.
  2. The links generated have been entirely natural as I haven’t actively asked anyone for a link.
Point 2 proves that detailed, focused content can work in its own right. You don’t have to do outreach and link building to see some level of success.

Does this type of content help conversions?

Due to the growth in traffic to my website, I have increased my audience and engagement with my site. I’m beginning to be recognised as a thought leader (and a brand, I guess) in my niche area. Users have started to approach me with all manner of queries. I also receive frequent requests to proofread other writers’ work. If I had more time, this is a paid service I could consider offering in the future. So producing the content has revealed business areas I could expand into.

Ultimately, all the extra traffic has led to a rise in the number of people buying the book I sell through Amazon and Lulu. I now sell a few a week, compared to one every couple of months.

So, in answer to the title above, ‘Yes.’ I am getting what I wanted — a wider audience for my writing.

Number of referrals other sites receive

Below you can see the number of referral visits my page generates to other websites:

Referral traffic received from my competition page between January 2013 and May 2013

One of the writing competitions I list was kind enough to share this data with me. They were first listed on my site in January 2013.

A breakdown of figures can be seen below:

The highly relevant traffic I can offer writing websites makes being listed appealing to most competitions. From speaking with the administrators of the competition in the example above, I know that the traffic also converts well into competition entries, so they are very happy with the results related to me listing them.

This means that when I receive enquiries I can be confident in the value my list offers.

Summary

So far, this experiment has proved that investing time in creating content that is updated regularly can bring excellent results. In 2013, the page attracts between 6,000 and 9,000 visits a month, 22% of which return to the page time and again.

All you need to emulate this is some vision and common sense:

  • Find something your target audience wants
  • Give it to them
  • Keep the content fresh with regular updates and improvements
  • Listen to user suggestions and make changes accordingly
  • Listen to user suggestions about other resources they might find useful, and create them

That’s a content strategy that is likely to keep me busy for the next few months and generate excellent results.

Keep it simple

One of the more common mistakes I’ve seen SEOs make is developing content no one is interested in. You might end up creating something sexy based on an amazing concept, but will it actually gain you the result you or your client wants to see? Sometimes the more mundane ideas, like generating a useful list, can work far more effectively. It might not be sexy, it might not look awesome, but it is useful and can appeal to a community.

Keep it simple.

I believe you can learn more from those three words than you might expect. Looking forward to hearing your thoughts in the comments.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Domain Migrations: Surviving the "Perfect Storm" of Site Changes

Posted by Ruth_Burr

Last week, I held a Mozinar talking about the SEO steps involved in transitioning from SEOmoz.org to Moz.com, and sharing some of the results we got. We got some great questions on the Mozinar, and I wanted a chance to answer some more of them as well as expand on some points that didn’t fit into the Mozinar.


Throwing Best Practices to the Wind

As we spent more than a year planning the transition from SEOmoz to Moz, one thing I wanted to make sure everyone knew internally was that we were engaging in — well, maybe not worst practices, but we were pretty far away from best practices when it came to domain migration.

One thing most SEOs will tell you about domain migration is that you shouldn’t make a lot of big changes at once. For example, if you’re switching to a new domain, just switch domains; don’t try to change anything else at the same time. If you’re refreshing your design, just do that; don’t try to change your content or URL structure at the same time. And definitely, definitely don’t change anything else if you’re changing your top-level domain (TLD).

Screenshot from “Achieving an SEO-Friendly Domain Migration – The Infographic” by Aleyda Solis

Avoiding this many simultaneous changes to your website means that search engines have a much easier time finding, crawling, and ranking your new site, and that you’re much better positioned to diagnose problems as they arise.

Nevertheless, there we were: plotting a massive re-brand, site redesign, content overhaul, and domain change — complete with TLD switch — all at the same time. A perfect storm. It’s enough to make a person lose sleep (I know I did). At the same time, I’m glad we went through this, because it’s exactly the kind of thing some of you are going to end up dealing with as well. We needed to make all of these changes simultaneously in order to do what we wanted to do with the new product and re-brand, and that took precedence over SEO best practices. Instead of throwing up my hands and saying “well, we’re doomed,” I had to learn to do as much as I could with the situation at hand.

Doing the Long, Boring, Hard Work

The major portion of my work preparing for the domain migration was my big giant list of URLs:

Casey helped me pull a list of every URL on the site from our database, and I found a redirect target on Moz.com for each URL. I would recommend pulling your URL list from your own database or server logs if at all possible; it will give you a much more complete list of URLs than simply running a crawl using a program like Xenu or Screaming Frog.

When I talk to people about the migration, they typically blanch at the big giant list of URLs. Is it really necessary to look at every URL on the site?

Well, no, not totally. In our case, there were large sections of the site (like the blog and Q&A) that were staying largely the same — we could just redirect everything at seomoz.org/blog/* to moz.com/blog/* without needing further detail. For sites that are simply changing from one domain to another without a major redesign/restructure (which, again, you should really do if you can), it becomes even easier: If your site’s staying exactly the same, you can just redirect everything to the same folder location on your new domain.
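For the pattern-based cases, a section-level redirect can be sketched in Apache .htaccess along these lines (a sketch only: the domains are the ones from this post, and the exact rules depend on your server configuration):

```apache
# On the old domain: send every /blog/ URL to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?seomoz\.org$ [NC]
RewriteRule ^blog/(.*)$ http://moz.com/blog/$1 [R=301,L]
```

A site moving wholesale with no restructure could drop the `blog/` prefix and redirect every path this way.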

I’m so glad that I did go through every page on the site, though, since I was able to get rid of a lot of old orphan pages, and help make sure the new site taxonomy was more inclusive so we didn’t have new orphan pages going forward. A site migration is a great time to 301 old pages that have outlived their usefulness to newer, more useful resources.

Traffic and Ranking Loss

I can’t stress enough how important it is to manage expectations around traffic and ranking loss during a domain migration. In the Mozinar, I mentioned that some PageRank is lost through 301 redirects (thanks Ethan for sending along this video from Matt Cutts explaining that the amount of PageRank that dissipates through a 301 is currently identical to the amount that dissipates through a link). This is usually not a huge deal for your most popular, best-linked pages, but can be an issue for deep pages that rank for long-tail terms, especially if the external links pointing to those pages are old or there aren’t very many of them.

With the Moz migration, the site restructure meant that we changed the internal link juice flowing from page to page as well. In some cases that was beneficial, such as with our Learn section which gained importance as it moved from our footer to our (now-reduced) header. In other cases, however, it meant some pages losing internal link equity. Again, not a huge issue for the most important pages but definitely impactful on long-tail terms. Between those two factors, the chance that our traffic and rankings wouldn’t be affected was pretty slim — and they were.

Better User Engagement

The flip side to the traffic loss was that we saw a boost in engagement metrics. Cyrus ran a quick study on a subgroup of users who a) had arrived through non-branded organic search and b) were new visitors to the site, to mitigate as much as possible the influences of preconceived expectations and industry “buzz” surrounding the re-brand. Here’s what he found:

As you can see, nearly every section on the site saw a boost in pageviews and pages per visit, as well as a huge decrease in bounce rate. The only downside is that we did see a decrease in time on page, pretty much across the board. We have a few theories on that: It could be that the more people click around the site, the less qualified each page view becomes; or it could be that the redesign has, in many cases, made pages shorter and easier to read quickly. The fact that time on page has decreased while average visit duration and bounce rate have improved points to the lowered time on page not being an indicator of lower quality, so that’s good.

What About Changing Platforms?

I didn’t get much of a chance to discuss changing CMS/Platforms in the Mozinar, because we run the site on a custom back end and CMS. It’s a question we get a lot in Q&A, so I wanted to address it.

As with most domain migrations, it’s important to keep things as much “the same” as possible when migrating to a new platform or CMS. Ideally, your site would look pretty much the same to users before and after the change – you could start making improvements using your shiny new CMS after the migration takes place. One thing that’s especially important when changing platforms or CMS is to make sure the new back end isn’t appending extra things to your URLs. For example, you want to make sure your home page is still www.example.com and hasn’t switched to www.example.com/index or the like. Also be on the lookout for extensions such as .html or .aspx being appended to your old URLs by the new platform. That’s a really common cause of duplicate content on a new platform.
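If the new platform does start appending an extension, one cleanup approach (a sketch, assuming Apache with mod_rewrite; test carefully before deploying) is to 301 the duplicate versions back to the original URLs:

```apache
# 301 the duplicate ".html" versions back to the extensionless canonical URLs
RewriteEngine On
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
```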

Sitemaps

In the Mozinar, I mentioned that we had multiple sitemaps in Google Webmaster Tools, and got a question about why we do it that way. Since that’s a decision that was implemented before I came on, I wanted to make sure I had the whole answer before I responded, but it was as I suspected. We have separate sitemaps for our blog, Community profiles and YouMoz because those are three of the largest areas of our site. Since each sitemap can only contain 50,000 URLs, this multiple-map setup ensures we have plenty of room in each one for these prolific sections to keep growing. Kate Morris wrote a great post on using multiple sitemaps a couple years ago; you can read it here.
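Multiple sitemaps are tied together with a sitemap index file, which is what gets submitted in Webmaster Tools. A minimal sketch of the format (the filenames here are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://moz.com/blog-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://moz.com/profiles-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://moz.com/youmoz-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```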

Noise in the Signal

“This is great info, Ruth,” I can hear you saying, “but why did it take you a month to share it with us?” A lot of the reason has to do with noise in the signal.

In the days surrounding the launch, we had increased buzz from our PR efforts and excitement from our customers about the new site. We knew this would happen – and were happy about it! – and that this uptick wasn’t a good indicator of how the new site would perform in the long term.

I also wanted to wait until SEOmoz pages were no longer ranking (as I mentioned in the Mozinar, they’re still indexed but aren’t ranking for any of our target terms) and had been replaced with Moz.com URLs, to get a better sense of how our rankings were impacted before I shared the info. This kind of longer-term analysis is important in the wake of a migration; make sure you’re getting as accurate a picture as possible of your new metrics.

Thanks again to everyone who listened in on the Mozinar, and who sent your kind wishes and congratulations to the Moz team during this process. It was a huge effort by the whole company and we’re so happy to share it with you!



The Definitive Guide to WordPress Security

Posted by SamAntics

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

If you work in online marketing, the chances are good that you’ve worked on, are working on, or will at some point work on a WordPress site. If you work with wordpress.org in any capacity, this post is for you (much of this post doesn’t apply to *.wordpress.com hosted sites).

Script kiddies suck

In hacker lingo, a script kiddie is the lowliest form of hacker (using the term hacker loosely), and relies on common tools and scripts to find and take advantage of the weakest and most common security vulnerabilities: crappy passwords, use of public WiFi without a VPN, outdated plugins, low-security hosting, phishing attacks, and other things of this nature. Sadly, these issues alone grant access to a shocking number of sites.

Unless you’re in charge of a WordPress site for a major brand, the majority of the security issues you’re likely to face will be the result of script kiddies.

The good news is this: If you follow this guide, your site should be as close to invulnerable as you can reasonably get. (To be fair, nothing is truly invulnerable, but this will get you pretty close.) Abracadabra, vault-like security is yours.

NORAD Vault Door

Without further ado, let’s dive in. I personally take a four-tiered approach to WordPress security:

Hosting and server level security

When it comes to securing WordPress, it’s best to start from the ground up. If you host your website with a company that isn’t sufficiently security-conscious and any site on your server is hacked, there’s a chance that every other site on that same server could be vulnerable too.

One Does Not Simply Host Wordpress

After a ton of research, I’ve determined that the most secure option for hosting WordPress is WPEngine.com (and, conveniently, Moz has a PRO perk for them: 4 months of free hosting).

The effort they put into security is re-freaking-diculous (seriously). I’m in the process of moving all of my WordPress sites over to them as we speak. They aren’t cheap, but you get quite a lot for what you pay. They even have a partnership with Sucuri Security, so if your site ever gets hacked, they’ll fix it for free.

That said, they might not be a perfect fit for everyone. For example, there are quite a few plugins they don’t allow (many for performance issues, not security issues). There are alternatives to most plugins, though, so hopefully that isn’t a deal breaker.

If you HAVE to use another host for whatever reason, or need to host on your own servers, there are a few things to keep in mind (WP Engine does most, if not all, of this):

  • Run secure, stable versions of your web server and any software on that server.
  • Have a server-level firewall.
  • Keep your server under lock and key. Only your IT team should have access.
  • Never, ever access your server from an unsecure network.
  • If you need to FTP in, use SFTP via a reputable program (I like FileZilla).
  • Make sure your MySQL installation is as secure as possible.
  • Always create a unique database for each blog installation, and make sure your database table prefix DOES NOT begin with wp_.
  • Back up your database and other files as often as possible, especially right before you make a change (there are plenty of options for this, such as CodeGuard and VaultPress).
  • And, of course, make sure your passwords are both complex and not used elsewhere.

There’s more to this, but those are the biggies. If you want a lot more detail, go here and here.
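On the table-prefix point: the prefix is set in wp-config.php before you run the installer. A sketch (the prefix value itself is an arbitrary example):

```php
<?php
// In wp-config.php, before running the WordPress installer:
// any prefix other than the default 'wp_' makes automated SQL injection
// attempts that assume table names like wp_posts and wp_users fail.
$table_prefix = 'x7q3_'; // example value; use your own random string
```

Note that changing the prefix on an already-installed site requires renaming the existing tables as well, which is best left to a developer or a dedicated plugin.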

The next step in this process involves configuring some server rules. If you have access to the main server configuration file, it’s best to do these things at that level, but not everyone is going to have that access. For that reason, I’m going to cover how to do this via the .htaccess file by walking you through a real .htaccess file (Note: edit your .htaccess file AFTER you install WP. It’s server-centric though, so I’m covering it here).

BIG FAT WARNING: Be very, very careful when making changes to your .htaccess file. If you aren’t extremely comfortable with code, it’s best to let your developer do this. I’ve personally used all of this code, exactly as is, but I’ve seen bits work on some sites and break things on others (it totally depends on your server configuration, plugins installed, etc.). To be safe, get your developer to do this for you.

WordPress auto-creates a section in the .htaccess file. Don’t put anything inside of the WordPress section of the .htaccess, as it will be overwritten. Some things will need to go before the WordPress .htaccess section, and some things after, to avoid breaking things. If you don’t know what should go where, you probably shouldn’t be editing your .htaccess file.
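For reference, on a default install the auto-generated section is delimited by `# BEGIN WordPress` and `# END WordPress` markers; anything between those markers can be rewritten by WordPress at any time:

```apache
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
```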

Homer Simpson Consider Yourself Warned

OK, here goes…

This first bit of code helps to prevent errors on some Apache servers, and activates the rewrite engine (which many of these commands require to function):

## Include this at the start of your .htaccess file ##
Options +FollowSymlinks
RewriteEngine On

This next bit turns off the server signature. This is a “security by obscurity” trick, as the less info a hacker has about your system, the harder it is to get in. The more they know, the easier it is to go out and hunt for known exploits:

## Disable the Server Signature ##
ServerSignature Off

Sometimes spammers will append their own crappy query strings to the end of a URL, attempting to do all kinds of nasty things, and this next bit of code can negate it by 301 redirecting certain query strings back to the canonical URL.

Just edit the enter|separated|query|strings|here bit to include the query strings you’re having issues with, separated by pipes (a pipe is a separator in RegEx). This next bit of code also has uses beyond blocking spammers, and can sort out issues with ?replytocom and other common junk query strings:

## Remove Spammy Query Strings ##
<ifModule mod_rewrite.c>
RewriteCond %{QUERY_STRING} enter|separated|query|strings|here [NC]
RewriteRule ^(.*)$ http://www.%{HTTP_HOST}/$1? [R=301,L]
</ifModule>

While not hacker-specific (though it certainly could be), this next bit of code will prevent bots with no user agent from hitting your site. Just change out yourwebsite.com with your actual URL before placing this in your .htaccess:

## Protect from spam bots ##
<IfModule mod_rewrite.c>
RewriteCond %{REQUEST_METHOD} POST
RewriteCond %{REQUEST_URI} .wp-comments-post\.php*
RewriteCond %{HTTP_REFERER} !.yourwebsite.com.* [OR]
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule (.*) http://%{REMOTE_ADDR}/ [R=301,L]
</IfModule>

A common hacking tactic is a SQL injection, and this bit of code can block the vast majority of attempts:

## SQL Injection Block ##
<IfModule mod_rewrite.c>
RewriteBase /
RewriteCond %{REQUEST_METHOD} ^(HEAD|TRACE|DELETE|TRACK) [NC]
RewriteRule ^(.*)$ - [F,L]
RewriteCond %{QUERY_STRING} \.\.\/ [NC,OR]
RewriteCond %{QUERY_STRING} boot\.ini [NC,OR]
RewriteCond %{QUERY_STRING} tag\= [NC,OR]
RewriteCond %{QUERY_STRING} ftp\:  [NC,OR]
RewriteCond %{QUERY_STRING} http\:  [NC,OR]
RewriteCond %{QUERY_STRING} https\:  [NC,OR]
RewriteCond %{QUERY_STRING} (\|%3E) [NC,OR]
RewriteCond %{QUERY_STRING} mosConfig_[a-zA-Z_]{1,21}(=|%3D) [NC,OR]
RewriteCond %{QUERY_STRING} base64_encode.*\(.*\) [NC,OR]
RewriteCond %{QUERY_STRING} ^.*(\[|\]|\(|\)|<|>|ê|"|;|\?|\*|=$).* [NC,OR]
RewriteCond %{QUERY_STRING} ^.*("|'|<|>|\|{||).* [NC,OR]
RewriteCond %{QUERY_STRING} ^.*(%24&x).* [NC,OR]
RewriteCond %{QUERY_STRING} ^.*(%0|%A|%B|%C|%D|%E|%F|127\.0).* [NC,OR]
RewriteCond %{QUERY_STRING} ^.*(globals|encode|localhost|loopback).* [NC,OR]
RewriteCond %{QUERY_STRING} ^.*(request|select|insert|union|declare).* [NC]
RewriteCond %{HTTP_COOKIE} !^.*wordpress_logged_in_.*$
RewriteRule ^(.*)$ - [F,L]
</IfModule>

Now, there are plugins that can limit the number of login attempts from any one IP address, but that doesn’t prevent hackers from using large blocks of IPs to brute-force your site (a la public proxy lists). I’ve experienced this first hand numerous times, so the following bit of code has been a lifesaver as it only allows my login pages to be reached from IP addresses I specify, and blocks access to those pages from all other IPs.

Just adjust the allow from lines to reflect your actual IP addresses (you can get your IP addresses by going to Google from each place you connect to the internet and searching “What is my IP”). If needed, change the login filenames as well (wp-login.php is default, and login is not, but my site uses both because of a plugin I use).

Or, to make it easier on yourself, go to ProxyBonanza and pay $10/mo for one exclusive proxy IP of your own, and then allow that IP and use that IP whenever you want to access your sites. (ProxyBonanza has plugins for Firefox and Chrome, which make this step really easy.) Just swap out the fake IPs below with your actual IPs. If your IP changes, you can always go in and fix this via FTP later.

## Restrict WordPress Login Pages to Your Own IPs ##
<Files wp-login.php>
order deny,allow
deny from all
allow from 192.168.1.1
allow from 192.168.1.2
</Files>
<Files login>
order deny,allow
deny from all
allow from 192.168.1.1
allow from 192.168.1.2
</Files>

There are a number of files that nobody but you should ever be accessing, and this bit of code will block them from being accessed via a browser:

## Block Sensitive Files ##
Options All -Indexes
<files .htaccess>
Order allow,deny
Deny from all
</files>
<files readme.html>
Order allow,deny
Deny from all
</files>
<files license.txt>
Order allow,deny
Deny from all
</files>
<files install.php>
Order allow,deny
Deny from all
</files>
<files wp-config.php>
Order allow,deny
Deny from all
</files>
<files error_log>
Order allow,deny
Deny from all
</files>
<files fantastico_fileslist.txt>
Order allow,deny
Deny from all
</files>
<files fantversion.php>
Order allow,deny
Deny from all
</files>
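For maintainability, the eight single-file blocks above can be collapsed into one rule. Here’s an equivalent sketch using FilesMatch (same file list; test it against your own Apache setup before relying on it):

```
## Block Sensitive Files (consolidated) ##
Options All -Indexes
<FilesMatch "^(\.htaccess|readme\.html|license\.txt|install\.php|wp-config\.php|error_log|fantastico_fileslist\.txt|fantversion\.php)$">
Order allow,deny
Deny from all
</FilesMatch>
```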

If you find your site being hit repeatedly with attack attempts from certain IP addresses, you can manually block certain IPs with the following bit of code. Just edit the deny from bit to include the offending IP, with one IP per line as follows:

## Malicious IP Blocking ##
order allow,deny
deny from 1.1.1.1
deny from 2.2.2.2
allow from all
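To decide which IPs belong in that deny list, it helps to see who is hitting you hardest. Here’s a small sketch, assuming a standard combined-format Apache access log (the function name is my own, and the log path in the usage comment is a placeholder):

```shell
# top_ips: print the N most frequent client IPs in an access log.
# Assumes the first field of each line is the client IP (combined log format).
top_ips() {
  awk '{print $1}' "$1" | sort | uniq -c | sort -rn | head -n "${2:-10}"
}

# Usage (path is a placeholder for your actual log):
# top_ips /var/log/apache2/access.log 10
```

A high request count alone doesn’t prove abuse (it could be a busy proxy or a legitimate crawler), so eyeball the actual requests from an IP before blocking it.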

If you have people hitting you really often from the same IP or IP block, you can redirect that IP/IP block to a nice rickroll video (just change the IP below to reflect the one that’s hitting you). 🙂 I’ve done this on my sites for a few repeat offenders:

## Redirect Recurring Spammer IPs to a Rickroll Video ##
RewriteCond %{REMOTE_ADDR} ^192\.168\.1\.1$
RewriteRule .* http://www.youtube.com/watch?v=oHg5SJYRHA0 [R=302,L]

If you have certain websites that are hitting you with referral traffic you don’t want (it can happen for various reasons), you can block those referring domains with this code:

## Block Certain Referring Domains ##
RewriteCond %{HTTP_REFERER} digg\.com [NC]
RewriteRule .* - [F]

You can also use your .htaccess file to secure wp-includes (this can cause real issues, especially with Multisite, so I’ll have you go here for the specifics). You can also do some other pretty advanced things, like blocking certain countries and browser languages, if you so choose.

With all of that in place, your .htaccess file is just about as hardened as it can get. An .htaccess file can exist for each directory on a site, and is applied to everything in and under that directory. I’ve compiled this list from a number of different articles, with a few bits of my own sprinkled in. For further reading on these and other similar points, check out these five links.

The last step is to lock down your file permissions so that only those who should have access to certain files have that access. You can read how to change file permissions here (be careful with this one too, as it can break things, particularly plugins.) This is something you should test very carefully as you implement it, ideally in a sandbox or dev environment.
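As a rough illustration of the standard permission scheme (directories 755, files 644, wp-config.php 600), here’s a hedged sketch; the helper name is my own invention, some hosts or plugins need different settings, and you should absolutely trial this on a dev copy first:

```shell
# harden_perms: apply a conventional WordPress permission scheme.
# Directories: 755, regular files: 644, wp-config.php: 600.
# The argument is the path to your WordPress install root.
harden_perms() {
  find "$1" -type d -exec chmod 755 {} +
  find "$1" -type f -exec chmod 644 {} +
  if [ -f "$1/wp-config.php" ]; then chmod 600 "$1/wp-config.php"; fi
}

# Usage (path is a placeholder): harden_perms /var/www/html
```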

And that’s it for WordPress server-level security (not really — you could fill a book with this stuff — but this should be sufficient for your needs). Next up, WordPress itself!

Your WordPress installation

Once you have your hosting and server security sorted out, it’s time to get WordPress installed, along with the necessary security plugins. Even if you already have an existing WordPress site, don’t skip this section!

You’ll want to download the WordPress install files directly from wordpress.org, and go through the install process via secure FTP (SFTP). Many hosts offer a one-touch WP install, which is also fine. As you do this, make sure you pick secure passwords (outlined in the next section), and don’t use the same password for more than one site or service (use separate passwords for your database, FTP, WordPress admin, etc.).
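A password manager will generate those passwords for you; failing that, here’s a quick sketch from the command line (assumes OpenSSL is installed, which it is on most hosts):

```shell
# Print a random 32-character password (24 random bytes, ~192 bits of entropy).
openssl rand -base64 24
```

Run it once per credential, and store the results in a password manager rather than a text file.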

With WordPress installed, the next step will be to pick a theme — and not just any theme will do. As any black-hat SEO knows, themes and plugins have long been a great way to get links, albeit in a shady and unethical way (remember MozCon 2011, when Richard Baxter gave a live demonstration of pointing millions of links with anchor text of his choice from a set of WordPress sites running a theme/plugin he’d created? Yeah.)

Because a lot of potentially dangerous things can be hidden inside of themes, it’s a good idea to use or buy a secure, clean theme. The themes that ship with WordPress by default are pretty safe, but here are a few other options for clean themes: Option 1 and Option 2. To get a better feel for why this is so important, there’s a great video here.

If you already have a theme installed, you might want to run a security scan, or have a security-minded developer look through the theme code. Ditto for any plugins you might have.

After you’ve selected your theme, the next step is to start picking plugins. When it comes to plugins, you need to be just as careful as you were with picking a theme. Even popular plugins can contain vulnerabilities, and developers can sometimes be slow to fix them (or may even have introduced them deliberately). For that reason, I recommend using as few plugins as possible to get the job done. That said, from a security perspective, here are the plugins I highly recommend:

  • Better WP Security – This is sort of an all-in-one security option. It handles a variety of tactics covered in this post. Can overlap with other plugins, so be careful. Free.
  • Limit Login Attempts – Exactly what it says, and a phenomenal way to deter brute-force hacking attempts on a site. Free.
  • Akismet – Great way to filter out a lot of crap before it ever touches your site. If your site is easy to spam, it might also be easy to hack, so make it a hardened target on all fronts. Paid.
  • Sucuri Security – When you pay for this service, you get a plugin to install on your site that helps with the monitoring and hardening process. It has overlap with other plugins though, such as Limit Login Attempts and Better WP Security, so you don’t want to use all of them at once. Paid.
  • CodeGuard – Great backup service that lets you easily roll back if you ever do get hacked. Also, people don’t back things up nearly as often as they should, so doing it automatically is handy. Paid.
  • CloudFlare – CloudFlare is a CDN, but also so much more. It has some great security features built in, and comes in both free and paid versions.
  • Google Authenticator – Enables two-factor authentication on WordPress, which is awesome. I use two-factor wherever it’s offered, because it rocks. Free.
  • Stealth Login Page – You can’t crack what you can’t find. This plugin hides your login page without needing to edit .htaccess files. Free.
  • WordPress SEO by Yoast – Not only does this have great SEO benefits, but it allows you to easily edit your .htaccess file from within the WordPress admin, which is very handy. Free.

If you opt to use WP-Engine for your hosting, be aware that they are very strict on what plugins they do and don’t permit. I find this pretty annoying, and while I understand their reasons, I really like some of the plugins they don’t permit.

If you have unused themes or plugins installed, I’d recommend deleting them. Just having them installed on your site, even if they aren’t active, can potentially pose problems. You should also make sure that you keep WordPress, your plugins and your themes up-to-date. Updates often fix known security issues, and one of the first things a smart hacker looks for is out-of-date plugins and themes they can exploit.

As you build out your site, you should also pay very close attention to what is and isn’t reachable by crawlers, and how your site handles things like login info, passwords, lost passwords/password resets, security questions, etc. There’s an entire sub-set of hacking called Google hacking, dedicated to surfacing information Google has found and indexed that it probably shouldn’t have (great article here). Making effective use of your robots.txt file to block things that should be blocked is highly recommended.
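As a minimal sketch of that robots.txt advice (the paths shown are the WordPress defaults), keep one caveat in mind: robots.txt only asks well-behaved crawlers to stay out. It is not an access control, and it publicly lists every path it names, so never use it to “hide” anything truly sensitive:

```
# Example robots.txt (goes in your site root). This keeps compliant
# crawlers out of admin and include paths; it does NOT secure them.
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
```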

While site security is never finished, this will sort out the vast majority of problems you’re likely to encounter. Remember, nothing is unhackable, so the goal is simply to make your site way more trouble than it’s worth to the majority of hackers.

Personal security

As any half-decent hacker knows, the human element of security is usually the weakest link in the chain. The most security-conscious web admin or host can be foiled by a common password (Love, Sex, Secret, God, Hack the Planet!).

You Are The Weakest Link, Goodbye!

The human brain likes routines, patterns, and comfort zones; and hackers exploit that with glee! If you want a fascinating yet frightening read on this topic, check out Kevin Mitnick’s book The Art of Deception.

Here are my seven personal best practices for locking down the human element:

  1. Never access a WiFi hotspot through anything other than a secure VPN. I personally use Cloak as my VPN (iOS and Mac only at this point), but there are lots of options. You’d be shocked at what can be found with simple packet sniffing (Firesheep is a great example, and will probably make you quite uncomfortable). When you use a WiFi network, secured or unsecured, anyone else on that network can get access to your traffic (if all your traffic is encrypted, you’re MUCH safer, which is why you should use a secure VPN on any shared network, even if it’s a “secure” shared network). If you have WiFi at home or work, make the password a strong one, use WPA2, and set your router to NOT display the SSID (this is a “security by obscurity” tactic).

  2. Get a firewall. A good firewall is an excellent defensive tool. In a perfect world, I’d recommend having both a software and a hardware firewall, but that may not be feasible for everyone. At the very least, you need a software firewall (Comodo, ZoneAlarm, etc.). It can be a bit intrusive, depending on your settings, but it’s easy to customize and does a very good job. You should have a firewall on every desktop/laptop/server.

    Firewall
    See, there you go, a good firewall. Not much is getting past that…

  3. Get an antivirus program. Viruses and malware are a dime a dozen, and the chances are REALLY good that you’ve got at least one on your machine already. If a hacker has access to your computer, no amount of security anywhere else can protect your WordPress installation (not to mention your email, bank account, etc.) I’ve tried quite a few over the years, and I’m partial to Avast. It’s one of the least resource-intensive AV programs on the market (won’t bog down your machine), but it’s also extremely thorough (there’s a free version, but I pay for the full suite for a variety of reasons).

  4. Keep your hardware physically secure. If someone can get to your machine, it’s a cinch to hook up a keylogger. If you don’t password protect your machine, there are all kinds of other quick and dirty things they could do as well. If you use a desktop in particular, and it’s in a common area at work, periodically check your USB ports and all cords running into the machine for anything unusual. It’s uncommon, but it happens. Seriously, you should see the type of security Google has at its server farms!

  5. Use really good passwords, and don’t ever reuse passwords on multiple sites. Here’s where the lazy human element really comes into play. We’re not really good at remembering obscure passwords, so we tend to stick with things we’ll remember (asdf, 12345678, qwerty12345, etc.). This is bad, because common passwords make things REALLY easy for hackers, especially if you use the same password for multiple sites (don’t do that, ever).

    Operating system passwords are notoriously easy to crack with rainbow tables, so make sure your OS password is long (at least 15 characters) and complex (uppercase and lowercase letters, numbers and symbols, avoiding common substitutions like @ for A or 8 for B, etc.). Here’s a cool article that explains why complex passwords make things SO much harder for hackers.

    Thanks to some pretty serious security blunders over the years, it’s easy to find massive lists of passwords used on pretty major sites (RockYou is a great example, with 32 million passwords leaked). With a list like that, you can just pick a WordPress site and try random passwords at will until you get a hit. While far from efficient, script kiddies in particular love this brute-force approach.

    Homer Simpson D'oh!

I’ve found the easiest way to have virtually unbreakable passwords is to use a tool like LastPass, 1Password or Roboform. They allow you to generate a random, long, extremely complex password for each site, and then encrypt and store them all with one master password. There are desktop and mobile apps available (some of which even contain a secure browsing environment), so you can easily log in from your various devices, and all you have to remember is one password to access them all (for the love of all that is holy, at least make that one password complex).

    Don’t write down, print, or store your passwords in plain text on your computer. Just don’t.


  6. Protect your email accounts with two-factor authentication (and then protect your phone too). If a hacker can’t get into your site via the password, their next trick is usually trying to crack your email account so they can just do a reset. If your email provider offers two-factor authentication, USE IT.

    If you do this, make sure you lock your phone (use a real password, not the four-digit variety) and try really hard not to lose it, since that is now the key to your accounts (and, in a perfect world, don’t put that phone number up online, just to be safe; if a website ever needs a phone number, get a Google Voice number that you use just for that). You should probably also set your phone to wipe after a certain number of failed tries, and configure a remote wipe option as well, if possible.

    If your account provider asks you for security questions, use a mnemonic to come up with a totally separate answer (for example: for the question “What was your high school mascot?”, I might think, I really hated my CS teacher in high school, and then use that teacher’s name as the answer.) This will effectively neutralize attempts to mine your social profiles for data hackers can use to guess your security questions.

  7. Learn to recognize and avoid phishing attacks. Whether by email or website, phishing attacks are one of the most common causes of security breaches (you might have heard about the hacked AP Twitter account fiasco that caused a massive stock drop — yeah, that was due to a phishing attack).

    When it comes to avoiding these sorts of attacks, I live by three rules:

    If I have to log in to a site, I only navigate to that site through my password manager (this prevents me from accidentally falling for a misspelled-URL phishing attack, like if I were to type Facebool.com instead of Facebook.com).

    Never, ever click on a link in an email and then log in to whatever page pops up (see the previous rule). In fact, I don’t click on links in email anymore. I right-click, copy the link location, and then paste it into Google, just to be safe. If it doesn’t look right, or the results include spammy stuff, I stop there.

    Never, ever open an attachment from someone you don’t know and trust (and even if you know and trust them, drop it in a folder and run a virus check on it before opening it, or open it in a sandbox program first just to be safe). If someone who has you in their contact list gets their email hacked, the hackers start by blasting out emails to that person’s contact list to expand their phishing pond.

Last but not least, exercise constant diligence

When it comes to WordPress security, you can’t just set it and forget it.

Showtime Rotisserie Just Set It And Forget It!

If you put all of this in place, and then fail to monitor and update and change things as time goes by, you’ll be in just as bad of shape as if you’d never done any of this to begin with.

To make sure that all of your hard work doesn’t go to waste, I recommend a seven-step checklist to maintain constant vigilance for your WordPress sites:

  1. Keep WordPress updated. I’m in my sites daily, so I keep an eye on this daily. WordPress doesn’t update the core too terribly often, so I’d recommend checking this at least monthly to be safe. You might want to have your dev team do this, as updates sometimes break things.

  2. Keep your plugins updated. Plugins are one of the most vulnerable parts of WordPress, not only to external hackers, but to malicious or greedy programmers. While we already covered only using reputable plugins, also make sure you keep these plugins updated, just in case a vulnerability is being addressed in the update. Again, you might want to have your dev team do this, as updates can sometimes break things.

  3. Monitor your server log files. This might be overkill for most folks, unless you’ve spotted something suspicious. Your server logs will give you the details of everything that has hit your site, human or bot, and when and from what IP address. You can find some awesome stuff in here, so keep an eye on it from time to time. (AWStats is a good free tool for this.)

  4. Monitor WP access. You can use a plugin like Simple Login Log to monitor the details of logins to your site. DO THIS.

  5. Monitor for file changes. A plugin like CodeGuard will send you emails whenever your WordPress files are changed. This can be an early-warning system for a hack, and is worth the investment. It also allows you to roll back changes if needed.

  6. Change your password periodically. I’d recommend every 3-6 months, but once per year is probably sufficient if you’re using a sufficiently complex and unique password.

  7. Keep your firewall and antivirus software updated. New threats are discovered constantly, so it’s important to keep everything updated. Out-of-date security software is a vulnerability.
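For items 3 and 4 above, here’s a sketch of what server-level monitoring can look like, assuming a combined-format access log (a login-logging plugin gives you friendlier output, but this works in a pinch; the function name is my own):

```shell
# login_attempts: count POST requests to the WordPress login page per
# client IP. A sudden spike from one address usually means brute-forcing.
login_attempts() {
  grep 'POST /wp-login\.php' "$1" | awk '{print $1}' | sort | uniq -c | sort -rn
}

# Usage (path is a placeholder): login_attempts /var/log/apache2/access.log
```

Any IP that tops this list without a legitimate reason is a candidate for the deny list covered earlier.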

There are literally books’ worth of information on this (you’ve kind of just read one), so I’ll leave it at that. If you follow these guidelines and make the recommended changes, your WordPress site (and all your other accounts, for that matter) will be as secure as reasonably possible.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Continue reading →

#MozCon Speaker Interviews: Avinash & Annie

Posted by Lindsay

Today I’m excited to bring you a short interview with two of the top web analytics professionals in the industry, Avinash Kaushik and Annie Cushing. Not only are they experts at leveraging data effectively, they’re incredible conference speakers who are returning to MozCon this year!

Avinash Kaushik is the digital marketing evangelist at Google and co-founder of Market Motive. He is also the author of two best-selling books, Web Analytics 2.0 and Web Analytics: An Hour A Day, and he writes a popular web analytics blog, Occam’s Razor.

He’s an energetic speaker who delivers eye-opening insights about the power of data that you can put into action immediately. He gave an awesome presentation at MozCon 2011 which inspired a flood of tweets and ended with a standing ovation.

Annie Cushing is an SEO and analytics consultant. Her areas of expertise are analytics, technical SEO, and everything to do with data — collection, analysis, and beautification. She’s on a mission to rid the world of ugly data, one spreadsheet at a time.

If you don’t think analytics can be sexy, chances are Annie will change your mind. She shares practical, actionable information that revolves around one of her passions — making data sexy. At MozCon 2012, the audience at her amazing presentation left with tons of useful tips and tricks for creating Excel spreadsheets that are comprehensive, easy to understand, and compelling to decision-makers.

We are honored that Avinash and Annie are joining us again at MozCon, and we hope you will join us, too! Their talks will help you demystify analytics data the moment you get back to the office.

In his keynote, “Simplifying Complexity: Three Ideas For Higher ROI,” Avinash will apply Occam’s Razor to three use cases and share practical tips for dealing with complexity. Annie will show you how to separate the junk from the sound data when analyzing organic keyword data in her talk, “Breaking Up With Your Keyword-Based KPIs.”

Recently, Avinash and Annie were kind enough to answer a few questions about their upcoming MozCon presentations, must-know analytics information, and which technology would improve their lives.

Tell us about the presentation you have planned for MozCon.

Avinash: My plan is to share three stories that show how to amplify the awesomeness of any business by not focusing on doing just one thing well. That seems like such an odd thing to say, but I’m convinced that if we are to make incredible progress, we need to solve for multiplicity.

Three simple examples, from our everyday lives, that the audience will be able to go back and implement in their day-to-day efforts.

Annie: I’m going to talk about breaking up with your organic keyword data. Many marketers (if not most) who focus on organic search are using junk data that does not stand up to scientific criteria. I’m going to talk about what data is junk, how to differentiate junk from sound data, and some alternatives to junk data that withstand statistical scrutiny.

What is something that all marketers should know about web analytics, but many don’t seem to know?

Avinash: My 10/90 rule. For every $100 you need to invest in making smarter decisions on the web, you need to invest $10 in tools and consultants to implement the tools, and you need to invest $90 in big brains to analyze the data and recommend actions.

People have this insane belief that data talks. No. Data does not talk; people make data talk.

The question to ask, hence, is not how much data you have. The question is how many big brains you have.

Annie: How to report on conversions in a way that gives all of the marketing channels credit for their contribution to the end goal(s). If coaches ran football teams the way marketers report on conversions, only the players who score the touchdowns would get paid.

What uninvented technology would improve your life the most?

Avinash: I know this seems silly, but I think I have all the technology I need in my life. Sure, the batteries could last longer, and my computer could just type up what I’m thinking – why do I need to physically type in a cramped plane seat?

There is an impressive amount of technology we need to deploy to ease human suffering. Those solutions, big and small, from smarter malaria nets to more precise lasers to target cancer, are the ones I’m rooting for.

Annie: A centralized financial system in the cloud that would enable me to manage all of my financial needs from one place — from investing to paying bills to paying my business quarterly taxes — with robust projection and budgeting data visualizations at my fingertips.

Thank you for speaking with us, Avinash and Annie!

If you would like to read more about Avinash, check out this great interview he did a couple years ago, which covers some of his views on social media, SEO, and why he is always looking for exceptional things. You can also follow Avinash on Twitter @avinash.

Learn more about Annie and web analytics by checking out her info-packed blog, Annielytics, her great posts on Search Engine Land, and by following her on Twitter @AnnieCushing.

Even better, get actionable analytics advice by joining us at MozCon and experiencing their awesome presentations!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Continue reading →