Archives for “seo”

Is that Mind-Blowing Title Blowing Your Credibility? You Decide

Posted by Isla_McKetta


Image of Tantalus courtesy of Clayton Cusak

What if I told you I could teach you to write the perfect headline? One that is so irresistible every person who sees it will click on it. You’d sign up immediately and maybe even promise me your firstborn.

But what if I then told you not one single person out of all the millions who will click on that headline will convert? And that you might lose all your credibility in the process. Would all the traffic generated by that “perfect” headline be worth it?

Help us solve a dispute

It isn’t really that bad, but with all the emphasis lately on headline science and the curiosity gap, Trevor (your faithful editor) and I (a recovering copywriter) started talking about the importance of headlines and what their role should be in relation to the content. I’m for clickability (as long as there is strong content to back the headline) and, if he has to choose, Trevor is for credibility (with an equal emphasis on the quality of the eventual content).

[Image: credible vs. clickable headlines]

What’s the purpose of a headline?

Back in the good ol’ days, headlines were created to sell newspapers. Newsboys stood on street corners shouting the headlines in an attempt to hawk those newspapers. Headlines had to be enough of a tease to get readers interested but they had to be trustworthy enough to get a reader to buy again tomorrow. Competition for eyeballs was less fierce because a town only had so many newspapers, but paper cost money and editors were always happy to get a repeat customer.

Nowadays the competition for eyeballs feels even stiffer because it’s hard to get noticed in the vast sea of the internet. It’s easy to feel a little desperate. And it seems like the opportunity cost of turning away a customer is much lower than it was before. But aren’t we treating content as a product? Does the quality of that product matter?

The forbidden secrets of clickable headlines

There’s no arguing that headlines are important. In fact, at MozCon this year, Nathalie Nahai reminded us that many copywriters recommend an 80:20 ratio of energy spent on headline to copy. That might be taking things a bit far, but a bad (or even just boring) headline will tank your traffic. Here is some expert advice on writing headlines that convert: 

  • Nahai advises taking advantage of psychological trigger words like “weird,” “free,” “incredible,” and “secret” to create a sense of urgency in the reader. Can you possibly wait to read “Secret Ways Butter Can Save Your Life”?
  • Use question headlines like “Can You Increase Your Sales by 45% in Only 5 Minutes a Day?” that get a reader asking themselves, “I dunno, can I?” and clicking to read more.
  • Key into the curiosity gap with a headline like “What Mother Should Have Told You about Banking. (And How Not Knowing is Costing You Friends.)” Ridiculous claim? Maybe, but this kind of headline gets a reader hooked on narrative and they have to click through to see how the story comes together.
  • And if you’re looking for a formula for the best headlines ever, Nahai proposes the following:
    Number/Trigger word + Adjective + Keyword + Promise = Killer Headline.

Many readers still (consciously or not) consider headlines a promise. So remember, as you fill the headline with hyperbole and only write eleven of the twelve tips you set out to write, there is a reader on the other end hoping butter really is good for them.

The headline danger zone

This is where headline science can get ugly. Because a lot of “perfect” titles simply do not have the quality or depth of content to back them.

Those types of headlines remind me of the Greek myth of Tantalus. For sharing the secrets of the gods with the common folk, Tantalus was condemned to spend eternity surrounded by food and drink that were forever out of his reach. Now, content is hardly the secrets of the gods, but are we tantalizing our customers with teasing headlines that will never satisfy?

[Image: BuzzFeed headlines]

For me, reading headlines on BuzzFeed and Upworthy and their ilk is like talking to the guy at the party with all those super wild anecdotes. He’s entertaining, but I don’t believe a word he says, soon wish he would shut up, and can’t remember his name five seconds later. Maybe I don’t believe in clickability as much as I thought…

So I turn to credible news sources for credible headlines.

[Image: Washington Post headlines]

I’m having trouble deciding at this point if I’m more bothered by the headline at The Washington Post, the fact that they’re covering that topic at all, or that they didn’t really go for true clickbait with something like “You Won’t Believe the Bizarre Reasons Girls Scream at Boy Band Concerts.” But one (or all) of those things makes me very sad. 

Are we developing an immunity to clickbait headlines?

Even Upworthy is shifting its headline creation tactics a little. That doesn’t mean they’re abandoning clickbait; it means they’ve seen their audience tire of the same old tactics, so they’re looking for new and better ones to keep you engaged and clicking.

The importance of traffic

I think many of us would sell a little of our soul if it would increase our traffic, and of course those clickbaity curiosity gap headlines are designed to do that (and are mostly working, for now).

But we also want good traffic. The kind of people who are going to engage with our brand and build relationships with us over the long haul, right? Back to what we were discussing in the intro, we want the kind of traffic that’s likely to convert. Don’t we?

As much as I advocate for clickable headlines, the riskier the headline I write, the more closely I compare overall traffic (especially returning visitors) to click-throughs, time on page, and bounce rate to see if I’ve pushed it too far and am alienating our most loyal fans. Because new visitors are awesome, but loyal customers are priceless.

Headline science at Moz

At Moz, we’re trying to find the delicate balance between attracting all the customers and attracting the right customers. In my first week here, when Trevor and Cyrus were polling readers on which headline they’d prefer to read, I advocated for a more clickable version. See if you can pick out which is mine…

[Image: headline poll]

Yep, you guessed it. I suggested “Your Google Algorithm Cheat Sheet: Panda, Penguin, and Hummingbird” because it contained a trigger word and a keyword, plus it was punchy. I actually liked “A Layman’s Explanation of the Panda Algorithm, the Penguin Algorithm, and Hummingbird,” but I was pretty sure no one would click on it.

Last time I checked, that has more traffic than any other post for the month of June. I won’t say that’s all because of the headline—it’s a really strong and useful post—but I think the headline helped a lot.

But that’s just one data point. I’ve also been spicing up the subject lines on the Moz Top 10 newsletter to see what gets the most traffic.

[Image: most-read subject lines]

And the results here are more mixed. Titles I felt were much more clickbaity, like “Did Google Kill Spam?…” and “Are You Using Robots.txt the Right Way?…”, underperformed compared to the straight-up “Moz Top 10.”

Meanwhile, the most clickbaity of all, “Groupon Did What?…”, and the two about Google selling domains (which were accurate but suggested Google was selling its own domains, which worried me a bit) got the most opens overall.

Help us resolve the dispute

As you can tell, I have some unresolved feelings about this whole clickbait versus credibility thing. While Trevor and I have strong opinions, we also have a lot of questions that we hope you can help us with. Blow my mind with your headline logic in the comments by sharing your opinion on any of the following:

  • Do clickbait titles erode trust? If yes, do you ever worry about that affecting your bottom line?
  • Would you sacrifice credibility for clickability? Does it have to be a choice?
  • Is there such a thing as a formula for a perfect headline? What standards do you use when writing headlines?
  • Does a clickbait title affect how likely you are to read an article? What about sharing one? Do you ever feel duped by the content? Does that affect your behavior the next time?  
  • How much of your soul would you sell for more traffic?

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


How To Do a Content Audit – Step-by-Step

Posted by Everett

This is Inflow’s process for doing content audits. It may not be the “best” way to do them every time, but we’ve managed to keep it fairly agile in terms of how you choose to analyze, interpret and make recommendations on the data. The fundamental parts of the process remain about the same across numerous types of websites no matter what their business goals are: Collect all of the content URLs on the site and fetch the data you need about each URL. Then analyze the data and provide recommendations for each content URL. Theoretically it’s simple. In practice, however, it can be a daunting exercise if you don’t have a plan or process in place. By the end of this post we hope you’ll have a good start on both.


The many purposes of a content audit

A content audit can help in a variety of different ways, and the approach can be customized for any given scenario. I’ll write more about potential “scenarios” and how to approach them below. For now, here are some things a content audit can help you accomplish…

  1. Determine the most effective way to escape a Panda penalty
  2. Determine which pages need copywriting / editing
  3. Determine which pages need to be updated and made more current, and prioritize them
  4. Determine which pages should be consolidated due to overlapping topics
  5. Determine which pages should be pruned off the site, and what the approach to pruning should be
  6. Prioritize content based on a variety of metrics (e.g. visits, conversions, PA, Copyscape risk score…)
  7. Find content gap opportunities to drive content ideation and editorial calendars
  8. Determine which pages are ranking for which keywords
  9. Determine which pages “should” be ranking for which keywords
  10. Find the strongest pages on a domain and develop a strategy to leverage them
  11. Uncover content marketing opportunities
  12. Audit and create an inventory of content assets when buying/selling a website
  13. Understand the content assets of a new client (i.e. what you have to work with)
  14. And many more…

A content audit case study

8 Times the Leads

Inflow’s technical SEO specialist Rick Ramos performed an earlier version of our content audit last year for Phases Design Studio, who graciously permitted us to share their case study. After taking an inventory of all content URLs on the domain, Rick outlined a plan to noindex/follow and remove from their sitemap many of the older blog posts that were no longer relevant, and weren’t candidates for a content refresh. The site also had a series of campaign-based landing pages dating back to 2006. These pages typically had a life cycle of a few months, but were never removed from the site or Google’s index. Rick recommended that these pages be 301 redirected to a few evergreen landing pages that would be updated whenever a new campaign was launched—a tactic that works particularly well on seasonal pages for eCommerce sites (e.g. 2014 New Year’s Resolution Deals). Still more pages were candidates to be updated / refreshed, or improved in other ways.

The results

Shortly after the recommendations were implemented, the client called to ask if we knew why they were suddenly seeing eight times the number of leads they were used to seeing month over month.

[Image: Analytics traffic graph after the content audit]


Why we think it worked

There are several probable reasons why this approach worked for our client. Here are a few of them…

  1. The ratio of useful, relevant, unique content to thin, irrelevant, duplicate content was greatly improved.
  2. The PageRank from dozens of expired campaign landing pages was consolidated into a relatively few evergreen pages (via 301 redirects and consolidation of internal linking signals).
  3. Crawl budget is now being used more efficiently.

This improved the overall customer experience on the site, as well as organic search rankings for important topic areas that were consolidated.

Since then we have refined and improved the process, and have been performing these audits on a variety of sites with great success. It works particularly well for Panda recoveries on large-scale content websites, and for prioritizing which eCommerce product copy needs to be rewritten first.

A 50,000-foot overview of our process

Inflow’s content auditing process changes depending on the client’s goals, needs and budget. Generally speaking, however, here is how we approach it…

  1. Gather all available URLs on the site
    • Use Screaming Frog (or another crawl tool), CMS Exports, Google Analytics and Webmaster Tools
  2. Import URLs into a tool that gathers KPIs and other data for each URL
    • Use URL Profiler, a custom in-house tool, or other data-gathering resources (a sketch of steps 1-2 follows this list)
      • Things to gather: Moz metrics, Google Analytics KPIs, GWT data, Majestic SEO metrics, titles, descriptions, word counts, canonical tags…
  3. Analyze the content
    • Choose to keep as-is, improve, remove or consolidate.
      • Write detailed strategies for each.
  4. Perform keyword research
    • Optional: Provide relevancy scores, topic buckets and buying stage(s) for each keyword
    • Match keywords to pages that already rank within a keyword matrix
    • Match non-ranking keywords to the best page for guiding on-page changes
  5. Do content gap ideation
    • Use keywords that did not have an appropriate page match to fill in the Content Gap tab.
      • Optional: Incorporate buying cycles into content gap ideation
  6. Write the content strategy
    • Summarize the findings and present a strategy for optimizing existing pages, creating new pages to fill gaps, explain how many pages are being removed, redirected, etc…
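
To make steps 1 and 2 concrete, here is a minimal Python/pandas sketch of gathering and de-duplicating URLs from two of those sources. The file names and column headers are assumptions; adjust them to match your own exports:

    import pandas as pd

    # Screaming Frog's internal HTML export has an "Address" column; the GA
    # landing-page export has relative paths (both names assumed here).
    sf = pd.read_csv("screaming_frog_internal_html.csv", usecols=["Address"])
    sf = sf.rename(columns={"Address": "URL"})
    ga = pd.read_csv("ga_landing_pages.csv", usecols=["Landing Page"])
    ga = ga.rename(columns={"Landing Page": "URL"})
    ga["URL"] = "http://www.example.com" + ga["URL"]  # normalize GA paths to full URLs

    # Tag each URL with the source it came from, then de-dupe the combined list.
    sf["Source"] = "Screaming Frog"
    ga["Source"] = "Google Analytics"
    urls = pd.concat([sf, ga]).drop_duplicates(subset="URL", keep="first")
    urls.to_csv("all_urls.csv", index=False)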

Each piece of the process can be customized for the needs of a particular website. 

For example, when auditing a very large content site with lots of duplicate/thin/overlapping content issues we may skip the entire keyword research and content gap analysis part of the process and focus on pruning the site of these types of pages and improving the rest. Alternatively, a site without much content may need to focus on keyword research and content gaps. Other sites may be looking specifically for content assets that they can improve, repeat in new ways or leverage for newer content. One example of a very specific goal would be to identify interlinking opportunities from strong, older pages to promising, newer pages. For now it is sufficient to know that the framework can be changed as needed in a way that could dramatically affect where you spend your time in the process, or even which steps you may want to skip altogether.

Our documents

There are several major steps in the content auditing process that require various documents. While I’m not providing links to our internal SOP documentation (mainly because it’s still evolving), I will describe each document and provide screenshots and links to examples / templates so you can have a foundation around which to customize one for your own needs.

Content audit scenarios

We keep a list of recommendations for common scenarios to guide our approach to content audits. While every situation is unique in its own ways, we find this helps us get 90% of the way to the appropriate strategy for each client much faster. I discuss this in more detail later, but if you’d like to take a peek click here.

Content audit dashboard spreadsheet

We were originally working within Google Docs, but as we started pulling in data from more sources and performing more VLOOKUPs, the spreadsheet would load so slowly on big sites as to make it nearly impossible to complete an audit. For this reason we have recently moved the entire process over to Excel, though the template we’re providing is in Google Docs format. Below are some of the tabs you may want in this spreadsheet…

The “Content Audit” tab

This tab within the dashboard is where most of the work is done. Other tabs pull data from this one by VLookup. Whether the data is fetched by API and compiled by one tool (e.g. URL Profiler) or exported manually from many tools and compiled manually (by VLookup), the end result should be that you have all of the metrics needed for each URL in one place so you can begin sorting by various metrics to discern patterns, spot opportunities and make educated decisions on how to handle each piece of content, and the content strategy of the site as a whole.
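
If you prefer to do the compilation in code, the same job is a series of left joins. Here is a hedged pandas equivalent of those VLookups, assuming each export shares a “URL” column (rename as needed):

    import pandas as pd

    audit = pd.read_csv("all_urls.csv")       # master URL list from step 1
    moz = pd.read_csv("moz_metrics.csv")      # assumed per-URL Moz export
    ga = pd.read_csv("ga_entrances.csv")      # assumed per-URL Google Analytics export

    # Left joins keep every audit URL and attach metrics where available,
    # which is the same job VLOOKUP does in the spreadsheet version.
    for metrics in (moz, ga):
        audit = audit.merge(metrics, on="URL", how="left")

    audit.to_csv("content_audit_tab.csv", index=False)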

[Screenshot: Content Audit tab metrics]

You can customize the process to include whatever metrics you’d like to use. Here are the ones we’ve ended up with after some experimentation, as well as the source of the data:

  • Action (internal)
    • Leave As-Is
    • Improve
    • Consolidate
    • Remove
  • Strategy (internal)
    • A more detailed version of “action”. Example: Remove and 301 redirect to /another-page/.
  • Page Type (internal via URL patterns or CMS export)
    • This is an optional step for certain situations. Example: Article, Product, Category…
  • Source (original source of the URL, e.g. Google Analytics, Screaming Frog)
  • CopyScape Risk Score (Copyscape API)
  • Title Tag (Screaming Frog)
  • Title Length (Screaming Frog)
  • Meta Description (Screaming Frog)
  • Word Count (Screaming Frog)
  • GA Entrances (Google Analytics API)
  • GA Organic Entrances (Google Analytics API)
  • Moz Links (Moz API)
  • Moz Page Authority (Moz API)
  • MozRank (Moz API)
  • Moz External Equity Links (Moz API)
  • StumbleUpon (Social Count API)
  • Facebook Likes (Social Count API)
  • Facebook Shares (Social Count API)
  • Google Plus One (Social Count API)
  • Tweets (Social Count API)
  • Pinterest (Social Count API)

[Screenshot: Content Audit dashboard]

Our recommendations typically fall into one of four “Action” categories: “Leave As-Is”, “Remove”, “Improve”, or “Consolidate”. Further details (e.g. remove and 404, or remove and 301? If 301, to where?) are provided in a column called “Strategy”. Some URLs (the important ones) will have highly customized strategies, while others may be bulk processed, meaning thousands could share the same strategy (e.g. rewriting duplicate product description copy). The “Action” column is limited in choices so we can sort the data effectively (e.g. see all pages marked “Remove”) while the “Strategy” column can be more free-form and customized to the URL (e.g. consolidate /buy-blue-widgets/ content into /buying-blue-widgets/ and 301 redirect the former to the latter to avoid duplicating the same topic).

The “Keyword Research” tab

This tab includes keywords gathered from a variety of sources, including brainstorming for seed keywords, mining Google Webmaster Tools, PPC campaigns, the AdWords Keyword Planner and several other tools. Search Volume and Ad Competition (not shown in this screenshot) are pulled from Google’s Keyword Planner. The average ranking position comes from GWT, as does the top ranking page. The relevancy score is something we typically ask the client to do once we’ve cleaned out most of the obvious junk keywords.

[Screenshot: Keyword Research tab]

The “Keyword Matrix” tab

This tab includes URLs for important pages, and those that are ranking for – or are most qualified to rank for – important topics. It essentially matches up keywords with the best possible page to guide our copywriting and on-page optimization efforts.

Sometimes the KWM tab plays an important role in the process, like when the site is relatively new or unoptimized. Most of the time it takes a back-seat to other tabs in terms of strategic importance.

The “Content Gaps” tab

This is where we put content ideas for high-volume, highly relevant keywords for which we could not find an appropriate page. Often it involves keywords that represent stages in the buying cycle or awareness ladder that have been overlooked by the company. Sometimes it plays an important role, such as with new and/or small sites. Most of the time this also takes a back-seat to more important issues, like pruning.

The “Prune” tab

If it was marked for “Remove” or “Consolidate”, it should be on this tab. Whether a page is supposed to be removed and 301 redirected, canonicalized elsewhere, consolidated into another page, left up with a robots “noindex” meta tag, or allowed to 404/410 (or any number of other “strategies” you might come up with), these are the pages that will no longer exist once your recommendations have been implemented. I find this to be a very useful tab. For example, one could export this tab, send it to a developer (or a company like WP Curve), and have someone get started on most or all of the implementation. Our mantra for low-quality, under-performing content on sites that may have a Panda-related traffic drop is: improve it or remove it.
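
Because the “Action” column is restricted to a few values, building this tab is easy to automate. A minimal sketch, assuming the compiled audit lives in a CSV with “URL”, “Action” and “Strategy” columns:

    import pandas as pd

    audit = pd.read_csv("content_audit_tab.csv")

    # Everything marked Remove or Consolidate belongs on the Prune tab,
    # along with the per-URL Strategy for whoever implements the changes.
    prune = audit[audit["Action"].isin(["Remove", "Consolidate"])]
    prune[["URL", "Action", "Strategy"]].to_csv("prune_tab.csv", index=False)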

“Imported Data” tabs

In addition to the tabs above, we also have tabs that house exported data from the various sources, so we can perform VLOOKUPs based on the URL to populate the other tabs. These data tabs include:

  • GWT Top Queries
  • GWT Top Pages
  • CopyScape Scores (typically for up to 1,000 URLs)
  • Keyword Data

The more data that can be compiled by a tool like URL Profiler, the fewer data tabs you’ll need and the faster this entire process will go. Before we built the internal tool to automate parts of the process, we also had tabs for GA data, Moz data, and the initial Screaming Frog export.

Vlookup Master!

If you don’t know how to do a VLOOKUP, there are plenty of online tutorials for Excel and Google Docs spreadsheets. Here’s one I found useful for Excel. Alternatively, you could import all of the data into the tabs and ask someone more spreadsheet-savvy on your team to do the lookups. Our resident spreadsheet guru is Caesar Barba, and he has great hair. Below is an example of a simple VLOOKUP used to bring the “Action” over from the Content Audit tab for a URL in the Keyword Matrix tab…

=VLOOKUP(A2,'Content Audit'!A:C,3,FALSE)
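
For readers new to the function: A2 is the URL being looked up, 'Content Audit'!A:C is the range to search (with URLs in column A), 3 means “return the third column of that range” (here, the Action column), and FALSE forces an exact match. Wrapping the whole call in IFERROR(..., "") keeps the tab tidy for URLs that have no match in the Content Audit tab.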

Content Strategy

The Content Audit Dashboard is just what we need internally: A spreadsheet crammed with data that can be sliced and diced in so many useful ways that we can always go back to it for more insight and ideas. Some clients appreciate it as well, but most are going to find the greater benefit in our final content strategy, which includes a high-level overview of our recommendations from the audit.

[Screenshot: Content Strategy document from Inflow]

Recommended exports and data sources

There are many options for getting the data you need into one place so you can simultaneously see a broad view of the entire content situation, as well as detailed metrics for each URL. For URL gathering we use Screaming Frog and Google Analytics. For data we use Google Webmaster Tools (GWT), Google Analytics (GA), Social Count (SC), Copyscape (CS), Moz, CMS exports, and a few other data sources as needed.

However, we’ve been experimenting with URL Profiler instead of our internal tool to pull all of these data sources together much faster. URL Profiler costs a few hundred bucks and is very powerful. It’s also somewhat of a pain to set up the first time, so be prepared for several hours of wrangling API keys before getting all of the data you need.

No matter how you end up pulling it all together in the end, doing it yourself in Excel is always an option for the first few times.

A step-by-step example of our process

Below is the step-by-step process for an “average” client – whatever that means. Let’s say it is a medium-sized eCommerce client with about 800-900 pages indexed by Google, including category, product, blog posts and other pages. They don’t have an existing penalty that we know of, but could certainly be at risk of being affected by Panda due to some thin, overlapping, duplicate, outdated and irrelevant content on the site.

Step 1: Assess the situation and choose a scenario

Every situation is different, but we have found common similarities based on two primary factors: the size of the site and its content-based penalty risk. Below is a screenshot from our list of recommended strategies for common content auditing scenarios, which can be found here on GoInflow.com.

[Image: Inflow Content Audit Scenarios]

Each of the colored boxes drops down to reveal the strategy for that scenario in more detail.
Hat tip to Ian Lurie’s Marketing Stack for design inspiration.

The site described above would fall into the second box within the purple column (Focus: Content audit with an eye to improve and/or prune, followed by KWM for key pages). Here is the reasoning behind that…

The site is in danger of a penalty (though it does not appear to have one “yet”), so we follow the Panda mantra: improve it or remove it. The size of the site determines which of those two (improve or remove) gets the most attention. Smaller sites need less pruning (scalpel), while larger sites need much more (hatchet). Smaller sites often need some keyword research to determine if they are covering all of the topic areas for various stages in the customer’s buying cycle, while larger sites typically have the opposite problem —> too many pages covering overlapping topic areas with low-quality (thin, duplicate, irrelevant, outdated, poorly written, automated…) content. Such a site would not require the keyword research, and would therefore not get a keyword matrix or content gap analysis, as the focus would be primarily on pruning the site.

Our focus in this example will be to audit the content with an eye to improve and/or remove low-performing pages, followed by keyword research and a keyword matrix for the primary pages, including the home page, categories, blog home and key product pages, as well as certain other topical landing pages.

As it turns out, this hypothetical website has lots of manufacturer-supplied product descriptions. We’re going to need to prioritize which ones get rewritten first because the client does not have the cash flow to do them all at once. When budget and time are a concern, we typically shoot for the 80/20 rule: write great content for the top 20% of pages right away, and do the other 80% over the course of 6-12 months as time/budget permit.

Because this site doesn’t have an existing penalty, we will recommend that all pages stay indexed. If they had a penalty already, we would recommend they noindex,follow the bottom 80% of pages, gradually releasing them back into the index as they are rewritten. This may not be the way you choose to handle the same situation, which is fine, but the point is you can easily sort the pages by any number of metrics to determine a relative “priority”. The bigger the site and tighter the budget, the more important it is to prioritize what gets worked on first.
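
As one hedged illustration of that kind of prioritization, the compiled audit data makes it easy to score and rank pages in code. The column names below come from the metrics list earlier in this post, but the scoring blend is arbitrary; tune it to the client’s goals:

    import pandas as pd

    audit = pd.read_csv("content_audit_tab.csv")
    products = audit[audit["Page Type"] == "Product"].copy()

    # Normalize a few metrics to 0-1 and blend them into a rough priority
    # score: higher traffic, authority and duplication risk = rewrite sooner.
    for col in ["GA Entrances", "Moz Page Authority", "CopyScape Risk Score"]:
        products[col + " (norm)"] = products[col] / products[col].max()

    products["Priority"] = (products["GA Entrances (norm)"]
                            + products["Moz Page Authority (norm)"]
                            + products["CopyScape Risk Score (norm)"])

    # The top 20% of product pages get rewritten first.
    top_pages = products.nlargest(int(len(products) * 0.2), "Priority")
    top_pages.to_csv("rewrite_first.csv", index=False)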

Causes of Content-Related Penalties

For the purpose of a content audit we are only concerned with content-related penalties (as opposed to links and other off-page issues), which typically fall under three major categories: Quality, Duplication, and Relevancy. These can be further broken down into other issues, which include – but are not limited to:

  • Typical low quality content
    • Poor grammar, written primarily for search engines (includes keyword stuffing), unhelpful, inaccurate…
  • Completely irrelevant content
    • OK in small amounts, but often entire blogs are full of it.
    • A typical example would be a “linkbait” piece circa 2010.
  • Thin / Short content
    • Glossed over the topic, too few words, all image-based content…
  • Curated content with no added value
    • Comprised almost entirely of bits and pieces of content that exists elsewhere.
  • Misleading Optimization
    • Titles or keywords targeting queries that the content doesn’t answer or deserve to rank for
    • Generally not providing the information the visitor was expecting to find
  • Duplicate Content
    • Internally duplicated on other pages (e.g. categories, product variants, archives, technical issues…)
    • Externally duplicated (e.g. manufacturer product descriptions, product descriptions duplicated in feeds used for other channels like Amazon, shopping comparison sites and eBay, plagiarized content…)
  • Stub Pages (e.g. “No content is here yet, but if you sign in and leave some user-generated-content then we’ll have content here for the next guy”. By the way, want our newsletter? Click an AD!)
  • Indexable internal search results
  • Too many indexable blog tag or blog category pages
  • And so forth and so-on…

If you are unsure about the scale of the site’s content problems, feel free to do step 2 before deciding on a scenario…

Step 2: Scan the site

We use Screaming Frog for this step, but you can adapt this process to whatever crawler you want. This is how we configure the spider’s “Basic” tab…

[Screenshot: Screaming Frog “Basic” configuration tab]

And the advanced tab…

[Screenshot: Screaming Frog “Advanced” configuration tab]

Notice that “crawl all subdomains” is checked. This is optional, depending on what you’re auditing. We are respecting “meta robots noindex”, “rel = canonical” and robots.txt. Also notice that we are not crawling images, CSS, JS, flash, external links…. This type of stuff is what we look at in a Technical SEO Audit, but would needlessly complicate a “Content” Audit. What we’re looking for here are all of the indexable HTML pages that might lead a visitor to the site from the SERPs, though it may certainly lead to the discovery of technical issues.

Export the complete list of URLs and related data from Screaming Frog into a CSV file.

Step 3: Import the URLs and start the tool

We have our own internal “Content Auditing Tool”, which takes URLs and data from Screaming Frog and Google Analytics, de-dupes them, and pulls in data from Google Webmaster Tools, Moz, Social Count and Copyscape for each URL. The tool is a bit buggy at times, however, so I’ve been experimenting with URL Profiler, which can essentially accomplish the same goal with fewer steps and less upkeep. We need the “Agency” version, which is about $400 per year, plus tax. That’s not too bad, considering we’d already spent several thousand on our internal tool by the time Gareth Brown released URL Profiler publicly. :-/

Below is a screenshot of what you’ll see after downloading the tool. I’ve highlighted the boxes we currently check, though it depends on the tools/APIs to which you already subscribe and will differ by user. We’ve only just started playing with uClassify for the purpose of semi-automating our topic bucketing of pages, but I don’t have a process to share yet (feel free to comment with advice)…

Right-click on the URL List box and choose “Import From File”, then choose the ScreamingFrog export or any other list of URLs. There are also options to import from the clipboard or XML sitemap. Full documentation for URL Profiler can be found here. Below are two output screenshots to give you an idea of what you’re going to end up with…

The output changes depending on which boxes you check and what API access you have.

Step 4: Import the tool output into the dashboard

As described in the 50,000-foot overview above, we have a spreadsheet template with multiple tabs, one of which is the “Content Audit” tab. The tool output gets brought into the Content Audit tab of the dashboard. Our internal tool automatically adds columns for Action, Strategy, Page Type and Source (of the URL). You can also add these to the tab after importing the URL Profiler output. Page Type and URL Source are optional, but Action and Strategy are key elements of the process.

Our hypothetical client requires a Keyword Matrix. However, if your “scenario” does not involve keyword research (i.e. if it is a big site with content penalty risks) you can skip steps 5-7 and move straight to “Step 8 – Time to Analyze and Make Some Decisions”.

Step 5: Import GWT data

Match existing URLs from the content audit to keywords for which they already rank in Google Webmaster Tools

There may be a way to do this with URL Profiler. If so, I haven’t found it yet. Here is what we do to grab the landing page and associated keyword/query data from Google Webmaster Tools, which we then import into two tabs (GWT Top Queries and GWT Top Pages). These tabs are helpful when filling out the Keyword Matrix because they tell you which pages Google is already associating with each ranking keyword. This step can actually be skipped altogether for huge sites with major content problems because the “Focus” is going to be on pruning the site of low quality content, rather than doing any keyword research or content gap analysis.

Instructions for Importing Top Pages from GWT

    • Log into GWT from a Chrome browser
    • Go to Search Traffic —> Search Queries
    • Switch the view to “Top pages” (default is “Top queries”)
    • Change the date range to start as far back as possible (i.e. 3 months)
    • Expand the amount of rows to show to the maximum of 500 rows
      • This will put the s=500 parameter in the URL. Change s=500 to s=10000 or however many rows of data are available
      • See bottom of GWT page (e.g. 1-500 of ####).
    • In the Chrome menu go to View —> Developer —> Javascript Console
    • Copy and Paste the following script into the console window and press Enter.
    • This action should expand all of the drop-downs to show the keywords under each “page” URL and then open up a dialog window that will ask you to save a CSV file: (more info here and here).
    • The script is also available in a javascript bookmarklet on Lunametrics.com
  1. (function(){eval(function(p,a,c,k,e,r){e=function(c){return(c<a?'':e(parseInt(c/a)))+((c=c%a)>35?String.fromCharCode(c+29):c.toString(36))};if(!''.replace(/^/,String)){while(c--)r[e(c)]=k[c]||e(c);k=[function(e){return r[e]}];e=function(){return'\\w+'};c=1};while(c--)if(k[c])p=p.replace(new RegExp('\\b'+e(c)+'\\b','g'),k[c]);return p}('C=M;k=0;v=e.q(\'1g-1a-18 o-y\');z=16(m(){H(v[k]);k++;f(k>=v.c){15(z);A()}},C);m H(a){a.h(\'D\',\'#\');a.h(\'11\',\'\');a.F()}m A(){d=e.10(\'Z\').4[1].4;2=X B();u=B.W.R.Q(d);7=e.q(\'o-G-O\');p(i=0;i<7.c;i++){d=u.J(7[i]);2.K([d,7[i].4[0].4[0].j])}7=e.q(\'o-G-14\');p(i=0;i<7.c;i++){d=u.J(7[i]);2.K([d,7[i].4[0].4[0].j])}2.N(m(a,b){P a[0]-b[0]});p(i=2.c-1;i>0;i--){r=2[i][0]-2[i-1][0];f(r===1){2[i-1][1]=2[i][1];2[i][0]++}}5="S\\T\\U\\V\\n";9=e.q("o-y-Y");6=0;I:p(i=0;i<9.c;i++){f(2[6][0]===i){E=2[6][1];12{6++;f(6>=2.c){13 I}r=2[6][0]-2[6-1][0]}L(r===1);2[6][0]-=(6)}5+=E+"\\t";l=9[i].4[0].4.c;f(l>0)5+=9[i].4[0].4[0].j+"\\t";17 5+=9[i].4[0].w+"\\t";5+=9[i].4[1].4[0].w+"\\t";5+=9[i].4[3].4[0].w+"\\n";5=5.19(/"|\'/g,\'\')}x="1b:j/1c;1d=1e-8,"+1f(5);s=e.1h("a");s.h("D",x);s.h("1i","1j.1k");s.F()}',62,83,'||indices||children|thisCSV|count|pageTds||queries|||length|temp|document|if||setAttribute||text|||function||url|for|getElementsByClassName|test|link||tableEntries|pages|innerHTML|encodedUri|detail|currInterval|downloadReport|Array|timeout1|href|thisPage|click|expand|expandPageListing|buildCSV|indexOf|push|while|25|sort|open|return|call|slice|page|tkeyword|timpressions|tclicks|prototype|new|row|grid|getElementById|target|do|break|closed|clearInterval|setInterval|else|block|replace|inline|data|csv|charset|utf|encodeURI|goog|createElement|download|GWT_data|tsv'.split('|'),0,{}))})();

    • Ignore any dialog windows that pop up. You can check “Prevent this page from creating additional dialogs” to disable them.

      • Import the resulting download.csv file from GWT into the “GWT Top Pages” tab in the Content Auditing Dashboard.

      Instructions for Importing Top Queries from GWT

      1. Within GWT switch back to Top Queries.
      2. Adjust the date to go back as far as you can.
      3. Expand the amount of rows to show to the maximum of 500 rows
        1. This will put the s=500 parameter in the URL. Change s=500 to s=10000 or however many rows of data are available
          1. See bottom of GWT page (e.g. 1-500 of ####).
      4. Select “Download this table” as a CSV file
      5. Import the resulting TopSearchQueries.csv file from GWT into the “GWT Top Queries” tab in the Content Auditing Dashboard.

      Step 6: Perform keyword research

      This is another optional step, depending on the focus/objective of the audit. It is also highly customizable to your own KWR process. Use whatever methods you like for gathering the list of keywords (e.g. brainstorming, SEMRush, Google Trends, Uber Suggest, GWT, GA…). Ensure all “junk” and irrelevant keywords are removed from the list, and run the rest through a single tool that collects search volume and competition metrics. We use the Google Adwords Keyword Planner, which is outlined below.

      1. Go to www.google.com/sktool/ while logged into the Google account associated with AdWords.
      2. Select “Get search volume for a list of keywords or group them into ad groups”, paste in your list of keywords and click “Get search volume”.
        1. Note: At this point you should have already expanded the list as much as you need/want to, so you’re just gathering data and organizing it now.
        2. Note: The copy/paste method is limited to 1,000 keywords. You can get up to 3,000 by uploading your simple .txt file.
      3. Go to the “Keyword Ideas” tab on the next screen and Add All keywords to the plan.
      4. Go to the “Ad Group Ideas” tab and choose to Add All of the ad groups to the plan.
      5. Download the plan, as seen in the screenshot below.
      6. Import the data into the AdWords Data tab of the Content Auditing Dashboard

      Use the settings below when downloading the plan:

      Step 7: Tying the keyword data together

      Again, you don’t need to do this step if you’re working on a large site and the focus is on pruning out low-quality content. The GWT Queries and KWR steps provide the data needed to develop a “Keyword Matrix” (KWM), which isn’t necessary unless part of your focus is on-page optimization and copywriting of key pages. Sometimes you just need to get a client out of a penalty, or remove the danger of one. The KWM comes in handy for the important pages marked as “Improve” within the Content Audit tab, so the person writing the copy understands which keywords are important for that page. It’s SEO 101, and you can do it any way you like using whatever tools you like.

      Google AdWords has given you the keyword, search volume and competition. Google Webmaster Tools has given you the ranking page, average position, impressions, clicks and CTR for each keyword. Pull these together into a tab called “Keyword Research” using VLOOKUPs. You should end up with something like this:
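
      If you’d rather do that pull-together in code than with VLOOKUPs, here is a minimal pandas sketch; the file names and column headers are assumptions, so adjust them to your own exports:

      import pandas as pd

      adwords = pd.read_csv("adwords_plan.csv")    # assumed: Keyword, Search Volume, Competition
      gwt = pd.read_csv("gwt_top_queries.csv")     # assumed: Query, Avg Position, Impressions, Clicks, CTR

      # A left join keeps every researched keyword and attaches GWT ranking
      # data wherever Google already associates the query with the site.
      keyword_research = adwords.merge(
          gwt.rename(columns={"Query": "Keyword"}), on="Keyword", how="left")
      keyword_research.to_csv("keyword_research_tab.csv", index=False)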

      The purpose of these last few steps was to help with the KWM, an example of which is shown below:

      Step 8: Time to analyze and make some decisions!

      All of the data is right in front of you, and your path has been laid out using the Content Audit Scenarios tool. From here on, the actual step-by-step process becomes much more open to interpretation and your own experience and intuition. Therefore, do not consider the following a linear set of instructions meant to be carried out one after another. You may do some of them and not others, or do them a little differently. That is all fine, as long as you are working toward the goal of determining what to do, if anything, with each piece of content on the website. (A toy sketch of this kind of bulk triage follows the list below.)

      • Sort by Copyscape Risk Score
        • Which of these pages should be rewritten?
          • Rewrite key/important pages, such as categories, home page, top products
          • Rewrite pages with good Link and Social metrics
          • Rewrite pages with good traffic
          • After selecting “Improve” in the Action column, elaborate in the Strategy column:
            • “Improve these pages by writing unique, useful content to improve the Copyscape risk score.”
        • Which of these pages should be removed / pruned?
          • Remove guest posts that were published elsewhere
          • Remove anything the client plagiarized
          • Remove content that isn’t worth rewriting, such as:
            • No external links, no social shares, and very few or no entrances / visits
          • After selecting “Remove” from the Action column, elaborate in the Strategy column:
            • “Prune from site to remove duplicate content. This URL has no links or shares and very little traffic. We recommend allowing the URL to return a 404 or 410 response code. Remove all internal links, including from the sitemap.”
        • Which of these pages should be consolidated into others?
          • Presumably none, since the content is already externally duplicated
        • Which of these pages should be marked “Leave As-Is”
          • Important pages which have had their content stolen
            • In the Strategy column provide a link to the CopyScape report and instructions for filing a DMCA / Copyright complaint with Google.
      • Sort by Entrances or Visits (filtering out any that were already finished)
        • Which of these pages should be marked as “Improve”?
          • Pages with high visits / entrances but low conversion, time-on-site, pageviews per session…
          • Key pages that require improvement determined after a manual review of the page
        • Which of these pages should be marked as “Consolidate”?
          • When you have overlapping topics that don’t provide much unique value of their own, but could make a great resource when combined.
            • Mark the page in the set with the best metrics as “Improve” and in the Strategy column outline which pages are going to be consolidated into it. This is the canonical page.
            • Mark the pages that are to be consolidated into the canonical page as “Consolidate” and provide further instructions in the Strategy column, such as:
              • Use portions of this content to round out /canonicalpage/, then 301 redirect this page to /canonicalpage/. Update all internal links.
          • Campaign-based or seasonal pages that could be consolidated into a single “Evergreen” landing page (e.g. Best Sellers of 2012 and Best Sellers of 2013 —> Best Sellers).
        • Which of these pages should be marked as “Remove”?
          • Pages with poor link, traffic and social metrics related to low-quality content that isn’t worth updating
            • Typically these will be allowed to 404/410.
          • Irrelevant content
            • The strategy will depend on link equity and traffic as to whether it gets redirected or simply removed.
          • Out-of-Date content that isn’t worth updating or consolidating
            • The strategy will depend on link equity and traffic as to whether it gets redirected or simply removed.
        • Which of these pages should be marked as “Leave As-Is”?
          • Pages with good traffic, conversions, time on site, etc… that also have good content.
            • These may or may not have any decent external links
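
      As promised above, here is a toy pandas sketch of that kind of bulk triage. Every threshold is made up for the example, and the column names come from the metrics list earlier in this post; treat the output as a first pass that flags pages for human review, not a final verdict:

      import pandas as pd

      audit = pd.read_csv("content_audit_tab.csv")

      def triage(row):
          # Heavily duplicated pages: remove them if nothing links to or
          # visits them; otherwise they are worth rewriting.
          if row["CopyScape Risk Score"] > 75:
              if row["Moz External Equity Links"] == 0 and row["GA Entrances"] < 10:
                  return "Remove"
              return "Improve"
          # Strong performers can be left alone.
          if row["GA Entrances"] >= 100:
              return "Leave As-Is"
          # Everything else gets a manual decision.
          return ""

      audit["Action"] = audit.apply(triage, axis=1)
      audit.to_csv("content_audit_tab.csv", index=False)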

      Another Way of Thinking About It…

      For big sites, it is best to use a hatchet approach as much as possible, and finish up with a scalpel in the end. Otherwise you’ll spend way too much time on the project, which eats into the ROI.

      This is not a process that can be documented step-by-step. For the purpose of illustration, however, here are a few different examples of hatchet approaches and when to consider using them.

      • Parameter-based URLs that shouldn’t be indexed
        • Defer to the Technical Audit, if applicable. Otherwise, use your best judgement:
          • e.g. /?sort=color, &size=small
            • Assuming the Tech Audit didn’t suggest otherwise these pages could all be handled in one fell swoop. Below is an “example” action and an “example” strategy for such a page:
               • Action = Consolidate
              • Strategy = Rel canonical to the base page without the parameter
      • Internal search results
        • Defer to the Technical Audit if applicable. Otherwise, use your best judgement:
          • e.g. /search/keyword-phrase/
            • Assuming the Tech Audit didn’t suggest otherwise:
              • Action = Remove
              • Strategy = Apply a noindex meta tag. Once they are removed from the index, disallow /search/ in the robots.txt file.
      • Blog tag pages
        • Defer to the Technical Audit if applicable. Otherwise…:
          • e.g. /blog/tag/green-widgets/ , blog/tag/blue-widgets/ …
            • Assuming the Tech Audit didn’t suggest otherwise:
              • Action = Remove
              • Strategy = Apply a noindex meta tag. Once the pages are removed from the index, disallow /blog/tag/ in the robots.txt file.
      • eCommerce Product Pages with Manufacturer Descriptions
        • In cases where the “Page Type” is known (i.e. it’s in the URL or was provided in a CMS export) and Risk Score indicates duplication…
          • e.g. /product/product-name/
            • Assuming the Tech Audit didn’t suggest otherwise:
              • Action = Improve
              • Strategy = Rewrite to improve product description and avoid duplicate content
      • eCommerce Category Pages with No Static Content
        • In cases where the “Page Type” is known…
          • e.g. /category/category-name/ or category/cat1/cat2/
            • Assuming NONE of the category pages have content…
              • Action = Improve
              • Strategy = Write 2-3 sentences of unique, useful content that explains choices, next steps or benefits to the visitor looking to choose a product from the category.
      • Out-of-Date Blog Posts, Articles and Other Landing Pages
        • In cases where the Title tag includes a date or…
        • In cases where the URL indicates the publishing date….
          • Action = Improve
          • Strategy = Update the post to make it more current if applicable. Otherwise, change Action to “Remove” and customize the Strategy based on links and traffic (i.e. 301 or 404)

      Step 9: Content gap analysis and other value-adds

      Although most of these could be treated as optional items during the keyword research process, I prefer to save them until last because I never know how much time I’ll have after taking care of more pressing issues.

      Content gaps
      If you’ve gone through the trouble of identifying keywords and the pages already ranking for them, it isn’t much of a step further to figure out which keywords could lead to ideas about how to fill content gaps.
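
      A hedged sketch of that step, assuming the keyword tab built earlier includes a “Top Ranking Page” column from GWT and a “Search Volume” column from the Keyword Planner (the volume cutoff is arbitrary):

      import pandas as pd

      keywords = pd.read_csv("keyword_research_tab.csv")

      # Keywords with real demand but no ranking page are the raw material
      # for the Content Gaps tab.
      gaps = keywords[keywords["Top Ranking Page"].isna()
                      & (keywords["Search Volume"] >= 100)]
      gaps.to_csv("content_gaps_tab.csv", index=False)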

      At Inflow we like to use the “Awareness Ladder” developed by Ben Hunt, as featured in his book Convert!. You can learn more about it here.

      Content levels
      If time permits, or the situation dictates, we may also add a column to the Keyword Matrix or Content Audit that identifies the level of content the page would need in order to compete in its keyword space. We typically choose from Basic, Standard and Premium. This goes a long way in helping the client allocate copywriting resources where they’re needed the most (i.e. the best writers do the Premium content).

      Landing page or keyword topic buckets
      If time permits, or the situation dictates, we may provide topic bucketing for landing pages and/or keywords. More than once this has resulted in recommendations for adding to or changing existing taxonomy with great results. The most frequent example is in the “How To” or “Resources” space for any given niche.

      Keyword relevancy scores
      This is a good place to enlist the help of a client, especially in complicated niches with a lot of jargon. Sometimes the client can be working on this while the strategist is doing the content audit.

      Step 10: Writing up the content audit strategy document

      The Content Strategy, or whatever you decide to call it, should be delivered at the same time as the audit, and summarizes the findings, recommendations and next steps from the audit. It should start with an Executive Summary and then drill deeper into each section outlined therein.

      Here is a real example of an executive summary from one of Inflow’s Content Audit Strategies:

      As a result of our comprehensive content audit, we are recommending the following, which will be covered in more detail below:

      • Removal of 624 pages from Google’s index by deletion or consolidation:
        • 203 Pages were marked for Removal with a 404 error (no redirect needed)
        • 110 Pages were marked for Removal with a 301 redirect to another page
        • 311 Pages were marked for Consolidation of content into other pages
          • Followed by a redirect to the page into which they were consolidated
      • Rewriting or improving of 668 pages
        • 605 product pages to be rewritten due to use of manufacturer product descriptions (duplicate content), prioritized from first to last within the Content Audit.
        • 63 “Other” pages to be rewritten due to low-quality or duplicate content.
      • Keeping 26 pages as-is, with no rewriting or improvements needed unless the page exists in the Keyword Matrix, in which case on-page optimization best practices should be reviewed and applied.
      • On-Page optimization focus for 25 pages with keywords outlined in the Keyword Matrix tab.

      These changes reflect an immediate need to “improve or remove” content in order to avoid an obvious content-based penalty from Google (e.g. Panda) due to thin, low-quality and duplicate content, especially concerning Representative and Dealers pages with some added risk from Style pages.


      The Content Strategy should end with recommended next steps, including action items for the consultant and the client. Here is a real example from one of our documents:

      We recommend the following actions in order of their urgency and/or potential ROI for the site:
      1. Remove or consolidate all pages in the “Prune” tab of the Content Audit Dashboard
        1. Detailed instructions for each page can be found in the “Strategy” column of the Prune tab
      2. Begin a copywriting project to improve/rewrite content on Style pages to ensure unique, robust content and proper keyword targeting.
        1. Inflow can provide support for your own copywriters, or we can use our in-house copywriters, depending on budget and other considerations. As part of this process, these items can also be addressed:
          1. Improve/rewrite all pages in the Keyword Matrix to match assigned keywords.
            1. Include on-page optimization (e.g. Title, description, alt attributes, keyword use, etc.)
              1. See the “Strategy” column for more complete instructions for each page.
          2. Improve/rewrite all remaining pages from the “Content Audit” tab listed as “Improve”.

      Resources, links, and post-scripts…

      Example Content Auditing Dashboard
      Make a copy of this Google Docs spreadsheet, which is a basic version of how we format ours at Inflow.

      Content Audit Strategies for Common Scenarios
      This page/tool will help you determine where to start and what to focus on for the majority of situations you’ll encounter while doing comprehensive content audits.

      How to Conduct a Content Audit on Your Site by Neil Patel of QuickSprout
      Oh wait, I can’t in good conscience send everyone to a page that makes them navigate a gauntlet of pop-ups to see the content, and another one to leave. So never mind…

      How to Perform a Content Audit by Kristina Kledzik of Distilled
      This one focuses mostly on categorizing pages by buying cycle stage.

      Expanding the Horizons of eCommerce Content Strategy by Dan Kern of Inflow
      Dan wrote an epic post recently about content strategies for eCommerce businesses, which includes several good examples of content on different types of pages targeted toward various stages in the buying cycle.

      Distilled’s Epic Content Guide
      See the section on Content Inventory and Audit.

      The Content Inventory is Your Friend by Kristina Halvorson on BrainTraffic
      Praise for the life-changing powers of a good content inventory.

      How to Perform a Content Marketing Audit by Temple Stark on Vertical Measures
      Temple did a good job of spelling out the “how to” in terms of a high-level overview of his process to inventory content, assess its performance and make decisions on what to do next.

      Why Traditional Content Audits Aren’t Enough by Ahava Leibtag on Content Marketing Institute’s blog
      While not a step-by-step “How To” like this post, Ahava’s call for marketing analysts to approach these projects from both a quantitative (content inventory) and qualitative (content quality audit) perspective resonated with me the first time I read it, and is partly responsible for how I’ve approached the process outlined above.




      We Removed a Major Website from Google Search, for Science!

      Posted by Cyrus-Shepard

      The folks at Groupon surprised us earlier this summer when they reported the results of an experiment that showed that up to 60% of direct traffic is organic.

      In order to accomplish this, Groupon de-indexed their site, effectively removing themselves from Google search results. That’s crazy talk!

      Of course, we knew we had to try this ourselves.

      We rolled up our sleeves and chose to de-index Followerwonk, both for its consistent Google traffic and its good analytics setup—that way we could properly measure everything. We were also confident we could quickly bring the site back into Google’s results, which minimized the business risks.

      (We discussed de-indexing our main site moz.com, but… no soup for you!)

      We wanted to measure and test several things:

      1. How quickly will Google remove a site from its index?
      2. How much of our organic traffic is actually attributed as direct traffic?
      3. How quickly can you bring a site back into search results using the URL removal tool?

      Here’s what happened.

      How to completely remove a site from Google

      The fastest, simplest, and most direct method to completely remove an entire site from Google search results is to use the URL removal tool in Google Webmaster Tools.

      We also understood, via statements by Google engineers, that using this method gave us the biggest chance of bringing the site back, with little risk. Other methods of de-indexing, such as using meta robots NOINDEX, might have taken weeks and caused recovery to take months.

      CAUTION: Removing any URLs from a search index is potentially very dangerous, and should be taken very seriously. Do not try this at home; you will not pass go, and will not collect $200!

      After submitting the request, Followerwonk URLs started disappearing from Google search results within 2-3 hours.

      The information needs to propagate across different data centers around the globe, so the effect can be delayed in some areas. In fact, for the entire duration of the test, organic Google traffic continued to trickle in and never dropped to zero.

      The effect on direct vs. organic traffic

      In the Groupon experiment, they found that when they lost organic traffic, they actually lost a bunch of direct traffic as well. The Groupon conclusion was that a large amount of their direct traffic was actually organic—up to 60% on “long URLs”.

      At first glance, the overall amount of direct traffic to Followerwonk didn’t change significantly, even when organic traffic dropped.

      In fact, we could find no discrepancy in direct traffic outside the expected range.

      I ran this by our contacts at Groupon, who said this wasn’t totally unexpected. You see, in their experiment they saw the biggest drop in direct traffic on long URLs, defined as URLs long enough to include a subfolder, like https://followerwonk.com/bio/?q=content+marketer.

      For Followerwonk, the vast majority of traffic goes to the homepage and a handful of other URLs. This means we didn’t have a statistically significant sample size of long URLs to judge the effect. For the long URLs we were able to measure, the results were nebulous. 

      Conclusion: While we can’t confirm the Groupon results with our outcome, we can’t discount them either.

It’s quite likely that a portion of your organic traffic is attributed as direct. This is because different browsers, operating systems, and user privacy settings can block referral information from reaching your website.
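You can see the mechanics in a hypothetical server-side classifier (the function and the abbreviated search-engine list below are ours for illustration): any visit arriving without a referrer gets bucketed as direct, even if the visitor actually clicked a search result.

```typescript
// Hypothetical sketch of how an analytics package buckets a hit.
const SEARCH_ENGINES = ["google.", "bing.", "yahoo."]; // abbreviated list

function classifyHit(referrer: string | null): "organic" | "referral" | "direct" {
  // No referrer header means the visit counts as direct, even if the
  // user really clicked a search result from a browser, app, or
  // privacy setup that strips referral information.
  if (!referrer) return "direct";
  const host = new URL(referrer).hostname;
  return SEARCH_ENGINES.some((engine) => host.includes(engine)) ? "organic" : "referral";
}

console.log(classifyHit("https://www.google.com/search?q=followerwonk")); // organic
console.log(classifyHit(null)); // direct: this is the attribution leak
```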

      Bringing your site back from death

      After waiting 2 hours, we deleted the request. Within a few hours all traffic returned to normal. Whew!

      Does Google need to recrawl the pages?

      If the time period is short enough, and you used the URL removal tool, apparently not.

      In the case of Followerwonk, Google removed over 300,000 URLs from its search results, and made them all reappear in mere hours. This suggests that the domain wasn’t completely removed from Google’s index, but only “masked” from appearing for a short period of time.

      What about longer periods of de-indexation?

      In both the Groupon and Followerwonk experiments, the sites were only de-indexed for a short period of time, and bounced back quickly.

We wanted to find out what would happen if you de-indexed a site for a longer period, say two and a half days.

I couldn’t convince the team to remove any of our sites from Google search results for a few days, so I chose a smaller personal site that I often subject to merciless SEO experiments.

      In this case, I de-indexed the site and didn’t remove the request until three days later. Even with this longer period, all URLs returned within just a few hours of cancelling the URL removal request.

      In the chart below, we revoked the URL removal request on Friday the 25th. The next two days were Saturday and Sunday, both lower traffic days.

      Test #2: De-index a personal site for 3 days

      Likely, the URLs were still in Google’s index, so we didn’t have to wait for them to be recrawled. 

      Here’s another shot of organic traffic before and after the second experiment.

For longer removal periods, a few weeks for example, I speculate Google might drop these URLs semi-permanently from the index, and re-inclusion would take much longer.

      What we learned

1. While a portion of your organic traffic may be attributed as direct (due to browsers, privacy settings, etc.), in our case the effect on direct traffic was negligible.
      2. If you accidentally de-index your site using Google Webmaster Tools, in most cases you can quickly bring it back to life by deleting the request.
3. Re-inclusion happens quickly even after a site has been removed for over two days. Longer than this, the result is unknown, and you could have trouble getting all the pages of your site indexed again.

      Further reading

Moz community member Adina Toma wrote an excellent YouMoz post on the re-inclusion process using the same technique, with some great tips for other, more extreme situations.

      Big thanks to Peter Bray for volunteering Followerwonk for testing. You are a brave man!



      Beyond Search: Unifying PPC and SEO on the Display Network

      Posted by anthonycoraggio

      PPC and SEO go better together. By playing both sides of the coin, it’s possible to make more connections and achieve greater success in your online marketing than with either alone.

It's well known that the search query reports in AdWords can be a valuable source of keyword research data. Managing the interaction effects of sharing the SERPs and capturing reinforcing real estate on the page is of course important. Smart marketers will use paid search to test landing pages and drive traffic to support experiments on the site itself. Harmony between paid and organic search is a defining feature of well-executed search engine marketing.

      Unfortunately, that’s where the game all too often stops, leaving a world of possibilities for research and synergy waiting beyond the SERPs on the Google Display Network. Today I want to give you a couple techniques to kick your paid/organic collaboration back into gear and get more mileage from combining efforts across the disciplines.

      Using the display network

If you’re not familiar with it already, the GDN is essentially the other side of AdSense, offering the ability to run banner, rich media, and even video ads across the network from AdWords or DoubleClick. There are two overarching methods of targeting these ads: by context/content, and by using remarketing lists. Regardless of your chosen method, ads here are about as cheap as you can find (often under $1 per click), making them a prime tool for exploratory research and supporting actions.

      Contextual and content-based targeting offers some simple and intuitive ways to extend existing methods of PPC and SEO interaction. By selecting relevant topics, key phrases, or even particular sites, you can place ads in the wild to test the real world resonance of taglines and imagery with people consuming content relevant to yours.

You can also take a more coordinated approach during a content marketing campaign using the same type of targeting. Take a unique phrase from any placement you earn on a page running AdSense and enter it as a keyword target, and you can back up that article or blog post with a powerful piece of screen real estate and a call to action that is fully under your control. This approach mirrors the tactic of using paid search ads to better control organic results, and offers a direct route to conversion that usually would not otherwise exist in this environment.

      Research with remarketing

Remarketing on AdWords is a powerful tool to drive conversions, but it also produces some very interesting and frequently neglected data in the process: your reports will tell you which other sites and pages your targeted audience visits once your ads display there. You will, of course, be restricted here to sites running AdSense or DoubleClick inventory, but this still adds up to over 2 million potential pages!

      If your firm is already running remarketing, you’ll be able to draw some insights from your existing data, but if you have a specific audience in mind, you may want to create a new list anyway. While it is possible to create basic remarketing lists natively in AdWords, I recommend using Google Analytics to take advantage of the advanced segmentation capabilities of the platform. Before beginning, you’ll need to ensure that your AdWords account is linked and your tracking code is updated.
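For the tracking code update, the change to an analytics.js setup is a one-line plugin require. A minimal sketch follows; the property ID is a placeholder, and the declare line just keeps the snippet self-contained as TypeScript:

```typescript
// Minimal sketch: enable Display Advertising features (remarketing)
// in an existing analytics.js setup.
declare function ga(...args: unknown[]): void; // provided by analytics.js

ga("create", "UA-XXXXXX-1", "auto"); // placeholder property ID
ga("require", "displayfeatures");    // enables remarketing audience collection
ga("send", "pageview");
```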

      Creating your remarketing list

First, define exactly who the users you’re interested in are. You’re going to have to operationalize this definition based on the information available in GA/UA, so be concrete about it. We might, for example, want to look at users who have made multiple visits within the past two weeks to peruse our resources without completing any transactions. Where else are they bouncing off to instead of closing the deal with us?
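Before building the list, you can sanity-check how many users match a definition like this by expressing it as a dynamic segment against the Core Reporting API (v3). The query below is a hypothetical sketch: the view ID is a placeholder, the access token is assumed to exist already, and you should verify the segment syntax against your own property.

```typescript
declare const ACCESS_TOKEN: string; // OAuth 2.0 token, obtained elsewhere

// Hypothetical sketch: count users matching "more than one session in
// the last two weeks, zero transactions" via a v3 dynamic segment.
const params = new URLSearchParams({
  ids: "ga:12345678", // placeholder view (profile) ID
  "start-date": "14daysAgo",
  "end-date": "today",
  metrics: "ga:users",
  segment: "dynamic::ga:sessionCount>1;ga:transactions==0",
});

fetch(`https://www.googleapis.com/analytics/v3/data/ga?${params}`, {
  headers: { Authorization: `Bearer ${ACCESS_TOKEN}` },
})
  .then((res) => res.json())
  .then((data) => console.log("Matching users:", data.totalsForAllResults));
```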

      If you’ve never built a remarketing list before, pop into the creation interface in GA through Admin > Remarketing > Audiences. Hit the big red ‘+ Audience’ button to get started. You’re first presented with a selection of list types:

[Screenshot: the GA remarketing list type options]

The first three options are the simplest and least customizable, so they won’t be able to parse out our theoretical non-transactors, but they can be handy for this application nonetheless. The Smart List option is a relatively new and interesting one. Essentially, it creates a list based on Google’s best algorithmic guess at which of your users are most likely to convert upon return to your site. The ‘black box’ element of Smart Lists makes them less precise as a tool here, but it’s simple to test one and see what it turns up.

The next three are relatively self-explanatory: you can gather all users, all visitors to a given page, or all users who have completed a conversion goal. Where it gets truly interesting is when you create your own list using segments. All the might of GA opens up here, letting you apply criteria for demographics, technology/source, behavior, and even advanced conditions and sequences. Very handily, you can also import any existing segments you’ve created for other purposes.

Here, we simply translate the example from above into criteria that should fairly accurately pick out the individuals we’re interested in.

      Setting up and going live

      When you’ve put your list together, simply save it and hop back over to AdWords. Once it counts at least 100 users in its target audience, Google will let you show ads using it as targeting criteria. To set up the ad group, there are a few key considerations to bear in mind:

      1. You can further narrow your sample using AdWords’ other targeting options, which can be very handy. For example, want to know only what sites your users visit within a certain subject category? Plug in topic targeting. I won’t jump down the rabbit hole of possibilities here, but I encourage you to think creatively in using this capability.
2. You’ll of course need to fill the group with some actual ads for it to work. If you can’t get applicable banner ads made, you can create some simple text ads. We might be focusing on the research data to be had in this particular group, but remember that users will still see and potentially click these ads, so make sure you use relevant copy and direct them to an appropriate landing page.
3. To home in on unique and useful discoveries, consider setting some of the big generic inventory sources like YouTube as negative targets.
      4. Finally, set a reasonable CPC bid to ensure your ads show. $0.75 to $1.00 should be sufficient; if your ads aren’t turning up many impressions with a decent sized list, push the number up a bit.

      To check on the list size and status, you can find it in Shared Library > Audiences or back in GA. Once everything is in place, set your ads live and start pulling in some data!

      Getting the data

      You won’t get your numbers back overnight, but over time you will collect a list of the websites your remarketed ads show on: all the pages across the vast Google Display Network that your users visit. To find it, enter AdWords and select the ad group you set up. Click the “Display Network” and “Placements” tabs:

[Screenshot: the Display Network and Placements tabs in AdWords]

You’ll see a grid showing the domain-level placements your remarketing lists have shown on, with the opportunity to customize the columns of data included. You can sift through the data on a more granular level by clicking “see details”; this will provide you with page-level data for the listed domains. You’re likely to see a chunk of anonymized visits; there is a workaround to track down the pages in here, but be advised it will take a fair amount of extra effort.

[Screenshot: the placement-level details view]

      Tada! There you are—a lovely cross section of your target segment’s online activities. Bear in mind you can use this approach with contextual, topic, or interest targeting that produces automatic placements as well.

      Depending on your needs, there are of course myriad ways to make use of display advertising tools in sync with organic marketing. Have you come up with any creative methods or intriguing results? Let us know in the comments! 

