
Detecting Link Manipulation and Spam with Domain Authority

Posted by rjonesx.

Over 7 years ago, while still an employee at Virante, Inc. (now Hive Digital), I wrote a post on Moz outlining some simple methods for detecting backlink manipulation by comparing one’s backlink profile to an ideal model based on Wikipedia. At the time, I was limited in the research I could perform because I was a consumer of the API, lacking access to deeper metrics, measurements, and methodologies to identify anomalies in backlink profiles. We used these techniques in spotting backlink manipulation with tools like Remove’em and Penguin Risk, but they were always handicapped by the limitations of consumer facing APIs. Moreover, they didn’t scale. It is one thing to collect all the backlinks for a site, even a large site, and judge every individual link for source type, quality, anchor text, etc. Reports like these can be accessed from dozens of vendors if you are willing to wait a few hours for the report to complete. But how do you do this for 30 trillion links every single day?

Since the launch of Link Explorer and my residency here at Moz, I have had the luxury of far less filtered data, giving me a far deeper, clearer picture of the tools available to backlink index maintainers to identify and counter manipulation. While I in no way intend to say that all manipulation can be detected, I want to outline just some of the myriad surprising methodologies to detect spam.

The general methodology

You don’t need to be a data scientist or a math nerd to understand this simple practice for identifying link spam. While there certainly is a great deal of math used in the execution of measuring, testing, and building practical models, the general gist is plainly understandable.

The first step is to get a good random sample of links from the web, which you can read about here. But let’s assume you have already finished that step. Then, for any property of those random links (DA, anchor text, etc.), you figure out what is normal or expected. Finally, you look for outliers and see if those correspond with something important – like sites that are manipulating the link graph, or sites that are exceptionally good. Let’s start with an easy example, link decay.
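If you want to play along at home, here's a minimal Python sketch of that general idea. The file and column names are placeholders; the point is simply to measure a property across a random sample, establish what's normal, and flag the outliers:

```python
# Rough sketch of the general methodology: measure a property over a random
# sample, establish what "normal" looks like, then flag the outliers.
# The file and column names below are placeholders for illustration.
import pandas as pd

sample = pd.read_csv("random_link_sample.csv")   # one row per site or link
prop = sample["property_of_interest"]            # e.g. DA, decay rate, anchor-text ratio

sample["z_score"] = (prop - prop.mean()) / prop.std()

# Anything several standard deviations from the norm deserves a closer look.
outliers = sample[sample["z_score"].abs() > 3]
print(outliers.head())
```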

Link decay and link spam

Link decay is the natural occurrence of links either dropping off the web or changing URLs. For example, if you get links after you send out a press release, you would expect some of those links to eventually disappear as the pages are archived or removed for being old. And, if you were to get a link from a blog post, you might expect to have a homepage link on the blog until that post is pushed to the second or third page by new posts.

But what if you bought your links? What if you own a large number of domains and all the sites link to each other? What if you use a PBN? These links tend not to decay. Exercising control over your inbound links often means that you keep them from ever decaying. Thus, we can create a simple hypothesis:

Hypothesis: The link decay rate of sites manipulating the link graph will differ from sites with natural link profiles.

The methodology for testing this hypothesis is just as we discussed before. We first figure out what is natural. What does a random site’s link decay rate look like? Well, we simply get a bunch of sites and record how fast links are deleted (we visit a page and see a link is gone) vs. their total number of links. We then can look for anomalies.

In this case of anomaly hunting, I’m going to make it really easy. No statistics, no math, just a quick look at what pops up when we first sort by Lowest Decay Rate and then sort by Highest Domain Authority to see who is at the tail-end of the spectrum.
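For the curious, a rough sketch of that sort in Python might look like the following. It assumes you've exported a sample of sites with total and deleted link counts; the file and column names are made up for illustration:

```python
# Sketch of the decay-rate sort described above. Assumes a CSV with
# illustrative columns: domain, domain_authority, total_links, deleted_links.
import pandas as pd

sites = pd.read_csv("random_site_sample.csv")

# Decay rate = links we observed being deleted / total links observed.
sites["decay_rate"] = sites["deleted_links"] / sites["total_links"]

# Sort by lowest decay rate first, then by highest DA, and eyeball the tail.
suspects = sites.sort_values(["decay_rate", "domain_authority"],
                             ascending=[True, False])
print(suspects.head(20))  # strong DA, yet links that never seem to decay
```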

spreadsheet of sites with high deleted link ratios

Success! Every example we see of a good DA score but 0 link decay appears to be powered by a link network of some sort. This is the Aha! moment of data science that is so fun. What is particularly interesting is that we find spam on both ends of the distribution — that is to say, sites with 0 decay and sites with near 100% decay rates both tend to be spammy. The first type tends to be part of a link network, while the second tends to drop its backlinks on pages that others are spamming too, so those links quickly shuffle off to other pages.

Of course, now we do the hard work of building a model that actually takes this into account and accurately reduces Domain Authority relative to the severity of the link spam. But you might be asking…

These sites don’t rank in Google — why do they have decent DAs in the first place?

Well, this is a common problem with training sets. DA is trained on sites that rank in Google so that we can figure out who will rank above whom. However, historically, we haven’t (and no one in our industry, to my knowledge, has) taken into account random URLs that don’t rank at all. This is something we’re solving for in the new DA model set to launch in early March, so stay tuned, as this represents a major improvement in the way we calculate DA!

Spam Score distribution and link spam

One of the most exciting new additions to the upcoming Domain Authority 2.0 is the use of our Spam Score. Moz’s Spam Score is a link-blind (we don’t use links at all) metric that predicts the likelihood a domain will not be indexed in Google. The higher the score, the worse the site.

Now, we could just ignore any links from sites with Spam Scores over 70 and call it a day, but it turns out that common link manipulation schemes leave behind fascinating patterns. We can uncover them with the same simple methodology: use a random sample of URLs to find out what a normal backlink profile looks like, then check whether Spam Score is distributed among a site’s backlinks in an anomalous way. Let me show you just one.

It turns out that acting natural is really hard to do. Even the best attempts often fall short, as did this particularly pernicious link spam network. This network had haunted me for 2 years because it included a directory of the top million sites, so if you were one of those sites, you could see anywhere from 200 to 600 followed links show up in your backlink profile. I called it “The Globe” network. It was easy to look at the network and see what they were doing, but could we spot it automatically so that we could devalue other networks like it in the future? When we looked at the link profile of sites included in the network, the Spam Score distribution lit up like a Christmas tree.

spreadsheet with distribution of spam scores

Most sites get the majority of their backlinks from low Spam Score domains and get fewer and fewer as the Spam Score of the domains goes up. But this link network couldn’t hide, because we were able to detect the sites in its network as having quality issues using Spam Score. If we had relied only on ignoring the bad Spam Score links, we would never have discovered this issue. Instead, we found a great classifier for finding sites that are likely to be penalized by Google for bad link building practices.
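To make that concrete, here's a hedged sketch of how you might compare a site's backlink Spam Score distribution against a baseline built from a random sample. The file names, column names, and the 0.5 threshold are all illustrative assumptions, not our production model:

```python
# Compare a site's backlink Spam Score distribution to a baseline from a
# random sample. Inputs and the flag threshold are illustrative only.
import pandas as pd

BUCKETS = list(range(0, 110, 10))  # Spam Score buckets: 0-9, 10-19, ..., 90-100

def spam_score_distribution(df: pd.DataFrame) -> pd.Series:
    """Share of linking domains that fall into each Spam Score bucket."""
    bucketed = pd.cut(df["spam_score"], bins=BUCKETS, right=False)
    return bucketed.value_counts(normalize=True).sort_index()

baseline = spam_score_distribution(pd.read_csv("random_sample_backlinks.csv"))
site = spam_score_distribution(pd.read_csv("suspect_site_backlinks.csv"))

# A natural profile is top-heavy in the low buckets; large deviations stand out.
deviation = (site - baseline).abs().sum()
print(f"Total distribution deviation: {deviation:.2f}")
if deviation > 0.5:  # arbitrary threshold for illustration
    print("Spam Score distribution looks anomalous - worth a manual review.")
```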

DA distribution and link spam

We can find similar patterns among sites with the distribution of inbound Domain Authority. It’s common for businesses seeking to increase their rankings to set minimum quality standards on their outreach campaigns, often DA30 and above. An unfortunate outcome of this is that what remains are glaring examples of sites with manipulated link profiles.

Let me take a moment and be clear here. A manipulated link profile is not necessarily against Google’s guidelines. If you do targeted PR outreach, it is reasonable to expect that such a distribution might occur without any attempt to manipulate the graph. However, the real question is whether Google wants sites that perform such outreach to perform better. If not, this glaring example of link manipulation is pretty easy for Google to dampen, if not ignore altogether.

spreadsheet with distribution of domain authority

A normal link graph for a site that is not targeting high link equity domains will have the majority of its links coming from DA 0–10 sites, slightly fewer from DA 10–20, and so on and so forth until there are almost no links from DA 90+. This makes sense, as the web has far more low-DA sites than high. But all the sites above have abnormal link distributions, which makes it easy to detect and correct link value at scale.
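As a toy illustration of how you might spot that shape (or the lack of it), here's a short Python sketch that buckets a site's linking domains by DA band. The input file, column names, and the 0.5 threshold are assumptions for the example, not how DA itself is calculated:

```python
# Bucket a site's linking root domains by DA band and flag top-heavy profiles.
# File, columns, and the 0.5 threshold are illustrative assumptions.
import pandas as pd

links = pd.read_csv("site_linking_domains.csv")  # one row per linking root domain

bands = pd.cut(links["domain_authority"], bins=range(0, 101, 10), right=False)
print(bands.value_counts(normalize=True).sort_index())  # share of links per DA band

# On a natural profile, DA 0-10 dominates and each higher band shrinks.
high_da_share = links["domain_authority"].ge(30).mean()
if high_da_share > 0.5:
    print("Unusually high share of DA 30+ linking domains: a possible outreach pattern.")
```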

Now, I want to be clear: these are not necessarily examples of violating Google’s guidelines. However, they are manipulations of the link graph. It’s up to you to determine whether you believe Google takes the time to differentiate based on how the outreach that produced the abnormal link distribution was conducted.

What doesn’t work

For every type of link manipulation detection method we discover, we scrap dozens more. Some of these are actually quite surprising. Let me write about just one of the many.

The first surprising example was the ratio of nofollow to follow links. It seems pretty straightforward that comment, forum, and other types of spammers would end up accumulating lots of nofollowed links, thereby leaving a pattern that is easy to discern. Well, it turns out this is not true at all.

The ratio of nofollow to follow links turns out to be a poor indicator, as popular sites like facebook.com often have a higher ratio than even pure comment spammers. This is likely due to the use of widgets and beacons and the legitimate usage of popular sites like facebook.com in comments across the web. Of course, this isn’t always the case. There are some sites with 100% nofollow links and a high number of root linking domains. These anomalies, like “Comment Spammer 1,” can be detected quite easily, but as a general measurement the ratio does not serve as a good classifier for spam or ham.

So what’s next?

Moz is continually traversing the link graph looking for ways to improve Domain Authority, using everything from basic linear algebra to complex neural networks. The goal is simple: we want to make the best Domain Authority metric ever. We want a metric which users can trust in the long run to root out spam just like Google does (and help you determine when you or your competitors are pushing the limits), while at the same time maintaining or improving correlations with rankings. Of course, we have no expectation of rooting out all spam — no one can do that. But we can do a better job. Led by the incomparable Neil Martinsen-Burrell, our metric will stand alone in the industry as the canonical method for measuring the likelihood a site will rank in Google.


We’re launching Domain Authority 2.0 on March 5th! Check out our helpful resources here, or sign up for our webinar this Thursday, February 21st, for more info on how to communicate changes like this to clients and stakeholders.




4 Ways to Improve Your Data Hygiene – Whiteboard Friday

Posted by DiTomaso

We base so much of our livelihood on good data, but managing that data properly is a task in and of itself. In this week’s Whiteboard Friday, Dana DiTomaso shares why you need to keep your data clean and some of the top things to watch out for.


Video Transcription

Hi. My name is Dana DiTomaso. I am President and partner at Kick Point. We’re a digital marketing agency, based in the frozen north of Edmonton, Alberta. So today I’m going to be talking to you about data hygiene.

What I mean by that is the stuff we see every single time we start working with a new client: this stuff is always messed up. Sometimes it’s one of these four things. Sometimes it’s all four, or sometimes there are extra things. So I’m going to cover this stuff today in the hopes that perhaps the next time we get a profile from someone it is not quite as bad, or that, if you look at these things and see how bad it is, you’ll sit down and start cleaning this stuff up.

1. Filters

So what we’re going to start with first are filters. By filters, I’m talking about analytics here, specifically Google Analytics. When you go into the admin of Google Analytics, there’s a section called Filters. There’s a section on the left, which holds all the filters for everything in that account, and then there’s a filters section for each view. Filters help you exclude or include specific traffic based on a set of parameters.

Filter out office, home office, and agency traffic

So usually what we’ll find is one Analytics property for your website with one view, “All Website Data” (the default that Analytics gives you), but no filters, which means that you’re not excluding things like office traffic, your internal people visiting the website, or home office traffic. If you have a bunch of people who work from home, get their IP addresses and exclude them, because you don’t necessarily want your internal traffic mucking up things like conversions, especially if you’re doing stuff like checking your own forms.

You haven’t had a lead in a while and maybe you fill out the form to make sure it’s working. You don’t want that coming in as a conversion and then screwing up your data, especially if you’re a low-volume website. If you have a million hits a day, then maybe this isn’t a problem for you. But if you’re like the rest of us and don’t necessarily have that much traffic, something like this can be a big problem in terms of the volume of traffic you see. Then agency traffic as well.

So agencies, please make sure that you’re filtering out your own traffic. Again things like your web developer, some contractor you worked with briefly, really make sure you’re filtering out all that stuff because you don’t want that polluting your main profile.

Create a test and staging view

The other thing that I recommend is creating what we call a test and staging view. Usually in our Analytics profiles, we’ll have three different views. One we call master, and that’s the view that has all these filters applied to it.

So you’re only seeing the traffic that isn’t you. It’s the customers, people visiting your website, the real people, not your office people. Then the second view we call test and staging. So this is just your staging server, which is really nice. For example, if you have a different URL for your staging server, which you should, then you can just include that traffic. Then if you’re making enhancements to the site or you upgraded your WordPress instance and you want to make sure that your goals are still firing correctly, you can do all that and see that it’s working in the test and staging view without polluting your main view.

Test on a second property

That’s really helpful. Then the third thing is make sure to test on a second property. This is easy to do with Google Tag Manager. What we’ll have set up in most of our Google Tag Manager accounts is our usual analytics, and most of the stuff goes there. But then if we’re testing something new, like say the content consumption metric we started putting out this summer, we want to make sure we set up a second Analytics property and send the test, the new stuff that we’re trying, over to that second property, not just a view.

So you have two different Analytics properties. One is your main property. This is where all the regular stuff goes. Then you have a second property, which is where you test things out, and this is really helpful to make sure that you’re not going to screw something up accidentally when you’re trying out some crazy new thing like content consumption, which can totally happen and has definitely happened as we were testing the product. You don’t want to pollute your main data with something different that you’re trying out.

So send something to a second property. You do this for websites. You always have a staging and a live. So why wouldn’t you do this for your analytics, where you have a staging and a live? So definitely consider setting up a second property.

2. Time zones

The next thing that we have a lot of problems with are time zones. Here’s what happens.

Let’s say your website is a basic install of WordPress and you didn’t change the time zone in WordPress, so it’s set to UTC. That’s the default in WordPress unless you change it. So now you’ve got your data for your website saying it’s UTC. Then let’s say your marketing team is on the East Coast, so they’ve got all of their tools set to Eastern time. Then your sales team is on the West Coast, so all of their tools are set to Pacific time.

So you can end up with a situation where, let’s say, for example, you’ve got a website where you’re using a form plugin for WordPress. Then when someone submits a form, it’s recorded on your website, but then that data also gets pushed over to your sales CRM. So now your website is saying that this number of leads came in on this day, because it’s in UTC mode. Well, the day ended, or it hasn’t started yet, and now you’ve got Eastern time, which is when your analytics tools are recording the number of leads.

But then the third wrinkle is then you have Salesforce or HubSpot or whatever your CRM is now recording Pacific time. So that means that you’ve got this huge gap of who knows when this stuff happened, and your data will never line up. This is incredibly frustrating, especially if you’re trying to diagnose why, for example, I’m submitting a form, but I’m not seeing the lead, or if you’ve got other data hygiene issues, you can’t match up the data and that’s because you have different time zones.

So definitely check the time zones of every product you use: website, CRM, analytics, ads, all of it. If it has a time zone, pick one, stick with it. That’s your canonical time zone. It will save you so many headaches down the road, trust me.
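As a quick illustration of what "pick one and stick with it" looks like in practice, here's a small Python sketch that pulls timestamps from two tools set to different time zones and converts them to a single canonical zone before counting daily leads. The file names, column names, and zone choices are just examples:

```python
# Normalize timestamps from tools on different time zones to one canonical zone
# before comparing lead counts. Files, columns, and zones are example values.
import pandas as pd

CANONICAL_TZ = "America/New_York"  # pick one zone and stick with it

website = pd.read_csv("wordpress_form_entries.csv", parse_dates=["submitted_at"])
crm = pd.read_csv("crm_leads.csv", parse_dates=["created_at"])

# The website logs in UTC (the WordPress default); the CRM here is on Pacific time.
website["submitted_at"] = (website["submitted_at"]
                           .dt.tz_localize("UTC")
                           .dt.tz_convert(CANONICAL_TZ))
crm["created_at"] = (crm["created_at"]
                     .dt.tz_localize("America/Los_Angeles")
                     .dt.tz_convert(CANONICAL_TZ))

# Daily lead counts from both systems now line up on the same clock.
print(website.set_index("submitted_at").resample("D").size())
print(crm.set_index("created_at").resample("D").size())
```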

3. Attribution

The next thing is attribution. Attribution is a whole other lecture in and of itself, beyond what I’m talking about here today.

Different tools have different ways of showing attribution

But what I find frustrating about attribution is that every tool has its own little special way of doing it. Analytics is like the last non-direct click. That’s great. Ads says, well, maybe we’ll attribute it, maybe we won’t. If you went to the site a week ago, maybe we’ll call it a view-through conversion. Who knows what they’re going to call it? Then Facebook has a completely different attribution window.

You can use a tool, such as Supermetrics, to change the attribution window. But if you don’t understand what the default attribution window is in the first place, you’re just going to make things harder for yourself. Then there’s HubSpot, which says the very first touch is what matters, and so, of course, HubSpot will never agree with Analytics and so on. Every tool has its own little special sauce and how they do attribution. So pick a source of truth.

Pick your source of truth

The best thing to do is just say, “You know what? I trust this tool the most.” Then that is your source of truth. Do not try to get this source of truth to match up with that source of truth. You will go insane. You do have to make sure that at least things like your time zones are consistent, so that part is all set.

Be honest about limitations

But then after that, really it’s just making sure that you’re being honest about your limitations.

Know where things are inevitably going to fall down, and that’s okay, but at least you’ve got this source of truth that you can trust. That’s the most important thing with attribution. Make sure to spend the time and read how each tool handles attribution, so that when someone comes to you and says, “Well, I see that we got 300 visits from this ad campaign, but in Facebook it says we got 6,000. Why is that?” you have an answer. That might be a little bit of an extreme example, but I’ve seen weirder things with Facebook attribution versus Analytics attribution. I’ve even talked about stuff like Mixpanel and Kissmetrics. Every tool has its own little special way of recording attributions. It’s never the same as anyone else’s. We don’t have a standard in the industry of how this stuff works, so make sure you understand these pieces.

4. Interactions

Then the last thing is what I call interactions. The biggest thing that I find people do wrong here is in Google Tag Manager: it gives you a lot of rope, which you can hang yourself with if you’re not careful.

GTM interactive hits

One of the biggest things is what we call an interactive hit versus a non-interactive hit. So let’s say in Google Tag Manager you have a scroll depth trigger.

You want to see how far down the page people scroll. At 25%, 50%, 75%, and 100%, it will send off an alert and say this is how far down they scrolled on the page. Well, the thing is that you can also make that interactive. So if somebody scrolls down the page 25%, you can say, well, that’s an interactive hit, which means that person is no longer bounced, because it’s counting an interaction, which for your setup might be great.

Gaming bounce rate

But what I’ve seen are unscrupulous agencies who come in and say if the person scrolls 2% of the way down the page, now that’s an interactive hit. Suddenly the client’s bounce rate goes down from say 80% to 3%, and they think, “Wow, this agency is amazing.” They’re not amazing. They’re lying. This is where Google Tag Manager can really manipulate your bounce rate. So be careful when you’re using interactive hits.

Absolutely, maybe it’s totally fair that if someone is reading your content, they might just read that one page, hit the back button, and go back out. It’s totally fair to count something like scroll depth, or a certain piece of the content entering the user’s viewport, as interactive. But that doesn’t mean that everything should be interactive. So just dial it back on the interactions that you’re using, or at least make smart decisions about the interactions that you choose to use, because you can game your bounce rate with them.

Goal setup

Then goal setup as well, that’s a big problem. A lot of people, by default, have destination goals set up in Analytics because they don’t know how to set up event-based goals. By destination goal, I mean you filled out the form, you got to a thank you page, and you’re recording views of that thank you page as goals, which, yes, is one way to do it.

But the problem is that a lot of people, who aren’t super great at interneting, will bookmark that page or they’ll keep coming back to it again and again because maybe you put some really useful information on your thank you page, which is what you should do, except that means that people keep visiting it again and again without actually filling out the form. So now your conversion rate is all messed up because you’re basing it on destination, not on the actual action of the form being submitted.

So be careful on how you set up goals, because that can also really game the way you’re looking at your data.

Ad blockers

Ad blockers could account for anywhere from 2% to 10% of your audience, depending upon how technically sophisticated your visitors are. So you’ll end up in situations where you have a form fill but no corresponding visit to match with that form fill.

It just goes into an attribution black hole. But they did fill out the form, so at least you got their data, but you have no idea where they came from. Again, that’s going to be okay. So definitely think about the percentage of your visitors, based on you and your audience, who probably have an ad blocker installed and make sure you’re comfortable with that level of error in your data. That’s just the internet, and ad blockers are getting more and more popular.

Stuff like Apple is changing the way that they do tracking. So definitely make sure that you understand these pieces and you’re really thinking about that when you’re looking at your data. Again, these numbers may never 100% match up. That’s okay. You can’t measure everything. Sorry.

Bonus: Audit!

Then the last thing I really want you to think about — this is the bonus tip — audit regularly.

So at least once a year, go through all the different stuff that I’ve covered in this video and make sure that nothing has changed or updated, and that you don’t have some secret, exciting new tracking code that somebody added in and then forgot about because you were trying out a trial of this product and you tossed it on, and it’s been running for a year even though the trial expired nine months ago. So definitely make sure that you’re running the stuff that you should be running and doing an audit at least on a yearly basis.

If you’re busy and you have a lot of different visitors to your website, it’s a pretty high-volume property, maybe monthly or quarterly would be a better interval, but at least once a year go through and make sure that everything that’s there is supposed to be there, because that will save you headaches when you look at trying to compare year-over-year and realize that something horrible has been going on for the last nine months and all of your data is trash. We really don’t want to have that happen.

So I hope these tips are helpful. Get to know your data a little bit better. It will like you for it. Thanks.

Video transcription by Speechpad.com



A guide to setting up your very own search intent projects

Posted by TheMozTeam

This post was originally published on the STAT blog.


Whether you’re tracking thousands or millions of keywords, if you expect to extract deep insights and trends just by looking at your keywords from a high level, you’re not getting the full story.

Smart segmentation is key to making sense of your data. And you’re probably already applying this outside of STAT. So now, we’re going to show you how to do it in STAT to uncover boatloads of insights that will help you make super data-driven decisions.

To show you what we mean, let’s take a look at a few ways we can set up a search intent project to uncover the kinds of insights we shared in our whitepaper, Using search intent to connect with consumers.

Before we jump in, there are a few things you should have down pat:

1. Picking a search intent that works for you

Search intent is the motivating force behind search and it can be:

  • Informational: The searcher has identified a need and is looking for information on the best solution, e.g. [blender], [food processor]
  • Commercial: The searcher has zeroed in on a solution and wants to compare options, e.g. [blender reviews], [best blenders]
  • Transactional: The searcher has narrowed their hunt down to a few best options, and is on the precipice of purchase, e.g. [affordable blenders], [blender cost]
    • Local (sub-category of transactional): The searcher plans to do or buy something locally, e.g. [blenders in dallas]
    • Navigational (sub-category of transactional): The searcher wants to locate a specific website, e.g. [Blendtec]

We left navigational intent out of our study because it’s brand-specific and we didn’t want it to bias our data.

Our keyword set was a big list of retail products — from kitty pooper-scoopers to pricey speakers. We needed a straightforward way to imply search intent, so we added keyword modifiers to characterize each type of intent.

As always, different strokes for different folks: The modifiers you choose and the intent categories you look at may differ, but it’s important to map that all out before you get started.

2. Identifying the SERP features you really want

For our whitepaper research, we pretty much tracked every feature under the sun, but you certainly don’t have to.

You might already know which features you want to target, the ones you want to keep an eye on, or questions you want to answer. For example, are shopping boxes taking up enough space to warrant a PPC strategy?

In this blog post, we’re going to really focus in on our most beloved SERP feature: featured snippets (called “answers” in STAT). And we’ll be using a sample project where we’re tracking 25,692 keywords against Amazon.com.

3. Using STAT’s segmentation tools

Setting up projects in STAT means making use of the segmentation tools. Here’s a quick rundown of what we used:

  • Standard tag: Best used to group your keywords into static themes — search intent, brand, product type, or modifier.
  • Dynamic tag: Like a smart playlist, automatically returns keywords that match certain criteria, like a given search volume, rank, or SERP feature appearance.
  • Data view: Houses any number of tags and shows how those tags perform as a group.

Learn more about tags and data views in the STAT Knowledge Base.

Now, on to the main event…

1. Use top-level search intent to find SERP feature opportunities

To kick things off, we’ll identify the SERP features that appear at each level of search intent by creating tags.

Our first step is to filter our keywords and create standard tags for our search intent keywords (read more about filtering keywords). Second, we create dynamic tags to track the appearance of specific SERP features within each search intent group. And our final step, to keep everything organized, is to place our tags in tidy little data views, according to search intent.

Here’s a peek at what that looks like in STAT:

What can we uncover?

Our standard tags (the blue tags) show how many keywords are in each search intent bucket: 2,940 commercial keywords. And our dynamic tags (the sunny yellow stars) show how many of those keywords return a SERP feature: 547 commercial keywords with a snippet.

This means we can quickly spot how much opportunity exists for each SERP feature by simply glancing at the tags. Boom!

By quickly crunching some numbers, we can see that snippets appear on 5 percent of our informational SERPs (27 out of 521), 19 percent of our commercial SERPs (547 out of 2,940), and 12 percent of our transactional SERPs (253 out of 2,058).
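If you'd rather script that quick number-crunching than do it by hand, the arithmetic is just snippet-tagged keywords divided by total keywords per intent bucket (the counts below are the ones from our tags):

```python
# Featured-snippet coverage per intent bucket, using the tag counts above.
buckets = {
    "informational": (27, 521),
    "commercial": (547, 2940),
    "transactional": (253, 2058),
}
for intent, (with_snippet, total) in buckets.items():
    print(f"{intent}: {with_snippet / total:.0%} of SERPs return a featured snippet")
```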

From this, we might conclude that optimizing our commercial intent keywords for featured snippets is the way to go since they appear to present the biggest opportunity. To confirm, let’s click on the commercial intent featured snippet tag to view the tag dashboard…

Voilà! There are loads of opportunities to gain a featured snippet.

Though, we should note that most of our keywords rank below where Google typically pulls the answer from. So, what we can see right away is that we need to make some serious ranking gains in order to stand a chance at grabbing those snippets.


2. Find SERP feature opportunities with intent modifiers

Now, let’s take a look at which SERP features appear most often for our different keyword modifiers.

To do this, we group our keywords by modifier and create a standard tag for each group. Then, we set up dynamic tags for our desired SERP features. Again, to keep track of all the things, we contained the tags in handy data views, grouped by search intent.

What can we uncover?

Because we saw that featured snippets appear most often for our commercial intent keywords, it’s time to drill on down and figure out precisely which modifiers within our commercial bucket are driving this trend.

Glancing quickly at the numbers in the tag titles in the image above, we can see that “best,” “reviews,” and “top” are responsible for the majority of the keywords that return a featured snippet:

  • 212 out of 294 of our “best” keywords (72%)
  • 109 out of 294 of our “reviews” keywords (37%)
  • 170 out of 294 of our “top” keywords (59%)

This shows us where our efforts are best spent optimizing.

By clicking on the “best — featured snippets” tag, we’re magically transported into the dashboard. Here, we see that our average ranking could use some TLC.


There is a lot of opportunity to snag a snippet here, but we (actually, Amazon, who we’re tracking these keywords against) don’t seem to be capitalizing on that potential as much as we could. Let’s drill down further to see which snippets we already own.

We know we’ve got content that has won snippets, so we can use that as a guideline for the other keywords that we want to target.


3. See which pages are ranking best by search intent

In our blog post How Google dishes out content by search intent, we looked at what type of pages — category pages, product pages, reviews — appear most frequently at each stage of a searcher’s intent.

What we found was that Google loves category pages, which are the engine’s top choice for retail keywords across all levels of search intent. Product pages weren’t far behind.

By creating dynamic tags for URL markers, or portions of your URL that identify product pages versus category pages, and segmenting those by intent, you too can get all this glorious data. That’s exactly what we did for our retail keywords.
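If you're prepping that segmentation outside of STAT first, a quick script can tag ranking URLs by page type before you build the dynamic tags. The markers below (/dp/ for product pages, /b/ for category pages) are just example assumptions; use whatever patterns identify page types on the site you're tracking:

```python
# Tag ranking URLs as product vs. category pages using simple URL markers.
# The markers "/dp/" and "/b/" are example patterns, not a universal rule.
import pandas as pd

rankings = pd.read_csv("retail_rankings.csv")  # columns: keyword, ranking_url

def page_type(url: str) -> str:
    if "/dp/" in url:
        return "product"
    if "/b/" in url:
        return "category"
    return "other"

rankings["page_type"] = rankings["ranking_url"].apply(page_type)
print(rankings["page_type"].value_counts())
```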

What can we uncover?

Looking at the tags in the transactional page types data view, we can see that product pages are appearing far more frequently (526) than category pages (151).

When we glanced at the dashboard, we found that slightly more than half of the product pages were ranking on the first page (sah-weet!). That said, more than thirty percent appeared on page three and beyond. So despite the initial visual of “doing well”, there’s a lot of opportunity that Amazon could be capitalizing on.

We can also see this in the Daily Snapshot. In the image above, we compare category pages (left) to product pages (right), and we see that while there are fewer category pages ranking, their rank is significantly better. Amazon could take some of the lessons they’ve applied to their category pages to help their product pages out.

Wrapping it up

So what did we learn today?

  1. Smart segmentation starts with a well-crafted list of keywords, grouped into tags, and housed in data views.
  2. The more you segment, the more insights you’re gonna uncover.
  3. Rely on the dashboards in STAT to flag opportunities and tell you what’s good, yo!

Want to see it all in action? Get a tailored walkthrough of STAT, here.

Or get your mitts on even more intent-based insights in our full whitepaper: Using search intent to connect with consumers.

Read on, readers!




The Basics of Building an Intent-based Keyword List

Posted by TheMozTeam

This week, we’re taking a deep dive into search intent.

The STAT whitepaper looked at how SERP features respond to intent, and the bonus blog posts broke things down even further and examined how individual intent modifiers impact SERP features, the kind of content that Google serves at each stage of intent, and how you can set up your very own search intent projects. And look out for Seer’s very own Scott Taft’s upcoming post on how to use STAT and Power BI to create your very own search intent dashboard.

Search intent is the new demographics, so it only made sense to get up close and personal with it. Of course, in order to bag all those juicy search intent tidbits, we needed a great intent-based keyword list. Here’s how you can get your hands on one of those.

Gather your core keywords

First, before you can even think about intent, you need to have a solid foundation of core keywords in place. These are the products, features, and/or services that you’ll build your search intent funnel around.

But goodness knows that keyword list-building is more of an art than a science, and even the greatest writers (hi, Homer) needed to invoke the muses (hey, Calliope) for inspiration, so if staring at your website isn’t getting the creative juices flowing, you can look to a few different places for help.

Snag some good suggestions from keyword research tools

Lots of folks like to use the Google Keyword Planner to help them get started. Ubersuggest and Yoast’s Google Suggest Expander will also help add keywords to your arsenal. And Answer The Public gives you all of that, and beautifully visualized to boot.

Simply plunk in a keyword and watch the suggestions pour in. Just remember to be critical of these auto-generated lists, as odd choices sometimes slip into the mix. For example, apparently we should add [free phones] to our list of [rank tracking] keywords. Huh.

Spot inspiration on the SERPs

Two straight-from-the-SERP resources that we love for keyword research are the “People also ask” box and related searches. These queries are Google-vetted and plentiful, and also give you some insight into how the search engine giant links topics.

If you’re a STAT client, you can generate reports that will give you every question in a PAA box (before it gets infinite), as well as each of the eight related searches at the bottom of a SERP. Run the reports for a couple of days and you’ll get a quick sense of which questions and queries Google favours for your existing keyword set.

A quick note about language & location

When you’re in the UK, you push a pram, not a stroller; you don’t wear a sweater, you wear a jumper. This is all to say that if you’re in the business of global tracking, it’s important to keep different countries’ word choices in mind. Even if you’re not creating content with them, it’s good to see if you’re appearing for the terms your global searchers are using.

Add your intent modifiers

Now it’s time to tackle the intent bit of your keyword list. And this bit is going to require drawing some lines in the sand, because the modifiers that occupy each intent category can be highly subjective — does “best” imply transactional intent instead of commercial?

We’ve put together a loose guideline below, but the bottom line is that intent should be structured and classified in a way that makes sense to your business. And if you’re stuck for modifiers to marry to your core keywords, here’s a list of 50+ to help with the coupling.

Informational intent

The searcher has identified a need and is looking for the best solution. These keywords are the core keywords from your earlier hard work, plus every question you think your searchers might have if they’re unfamiliar with your product or services.

Your informational queries might look something like:

  • [product name]
  • what is [product name]
  • how does [product name] work
  • how do I use [product name]

Commercial intent

At this stage, the searcher has zeroed in on a solution and is looking into all the different options available to them. They’re doing comparative research and are interested in specific requirements and features.

For our research, we used best, compare, deals, new, online, refurbished, reviews, shop, top, and used.

Your commercial queries might look something like:

  • best [product name]
  • [product name] reviews
  • compare [product name]
  • what is the top [product name]
  • [colour/style/size] [product name]

Transactional intent (including local and navigational intent)

Transactional queries are the most likely to convert and generally include terms that revolve around price, brand, and location, which is why navigational and local intent are nestled within this stage of the intent funnel.

For our research, we used affordable, buy, cheap, cost, coupon, free shipping, and price.

Your transactional queries might look something like:

  • how much does [product name] cost
  • [product name] in [location]
  • order [product name] online
  • [product name] near me
  • affordable [brand name] [product name]

A tip if you want to speed things up

A super quick way to add modifiers to your keywords and save your typing fingers is by using a keyword mixer like this one. Just don’t forget that using computer programs for human-speak means you’ll have to give them the ol’ once-over to make sure they still make sense.
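If you'd rather roll your own mixer, a few lines of Python will do the coupling for you. The core keywords and modifier templates below are examples; swap in your own and prune anything that doesn't read like a real query:

```python
# A tiny keyword "mixer": pair core keywords with intent-modifier templates.
# Keywords and templates are examples; prune combinations that don't read naturally.
from itertools import product

core_keywords = ["blender", "food processor"]
modifiers = {
    "commercial": ["best {}", "{} reviews", "compare {}", "top {}"],
    "transactional": ["buy {}", "cheap {}", "{} cost", "{} near me"],
}

keyword_list = [
    (intent, template.format(keyword))
    for intent, templates in modifiers.items()
    for template, keyword in product(templates, core_keywords)
]

for intent, keyword in keyword_list:
    print(f"{intent}\t{keyword}")
```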

Audit your list

Now that you’ve reached for the stars and got yourself a huge list of keywords, it’s time to bring things back down to reality and see which ones you’ll actually want to keep around.

No two audits are going to look the same, but here are a few considerations you’ll want to keep in mind when whittling your keywords down to the best of the bunch.

  1. Relevance. Are your keywords represented on your site? Do they point to optimized pages?
  2. Search volume. Are you after highly searched terms or looking to build an audience? You can get the SV goods from the Google Keyword Planner.
  3. Opportunity. How many clicks and impressions are your keywords raking in? While not comprehensive (thanks, Not Provided), you can gather some of this info by digging into Google Search Console.
  4. Competition. What other websites are ranking for your keywords? Are you up against SERP monsters like Amazon? What about paid advertising like shopping boxes? How much SERP space are they taking up? Your friendly SERP analytics platform with share of voice capabilities (hi!) can help you understand your search landscape.
  5. Difficulty. How easy is your keyword going to be to win? Search volume can give you a rough idea — the higher the search volume, the stiffer the competition is likely to be — but for a different approach, Moz’s Keyword Explorer has a Difficulty score that takes Page Authority, Domain Authority, and projected click-through-rate into account.

By now, you should have a pretty solid plan of attack to create an intent-based keyword list of your very own to love, nurture, and cherish.

If, before you jump headlong into it, you’re curious what a good chunk of this is going to look like in practice, give this excellent article by Russ Jones a read, or drop us a line. We’re always keen to show folks why tracking keywords at scale is the best way to uncover intent-based insights.

Read on, readers!


This post was originally published on the STAT blog.



Do Businesses Really Use Google My Business Posts? A Case Study

Posted by Ben_Fisher

Google My Business (GMB) is one of the most powerful ways to improve a business’ local search engine optimization and online visibility. If you’re a local business, claiming your Google My Business profile is one of the first steps you should take to increase your company’s online presence.

As long as your local business meets Google’s guidelines, your Google My Business profile can help give your company FREE exposure on Google’s search engine. Not only can potential customers quickly see your business’ name, address and phone number, but they can also see photos of your business, read online reviews, find a description of your company, complete a transaction (like booking an appointment) and see other information that grabs a searcher’s attention — all without them even visiting your website. That’s pretty powerful stuff!

Google My Business helps with local rankings

Not only is your GMB Profile easily visible to potential customers when they search on Google, but Google My Business is also a key Google local ranking factor. In fact, according to local ranking factor industry research, Google My Business “signals” is the most important ranking factor for local pack rankings. Google My Business signals had a significant increase in ranking importance between 2017 and 2018 — rising from 19% to 25%.

Claiming your Google My Business profile is your first step to local optimization — but many people mistakenly think that just claiming your Google My Business profile is enough. However, optimizing your Google My Business profile and frequently logging into your Google My Business dashboard to make sure that no unwanted updates have been made to your profile is vital to improving your rankings and ensuring the integrity of your business profile’s accuracy.

Google My Business features that make your profile ROCK!

Google offers a variety of ways to optimize and enhance your Google My Business profile. You can add photos, videos, business hours, a description of your company, and frequently asked questions and answers; communicate with customers via messages; allow customers to book appointments; respond to online reviews; and more.

One of the most powerful ways to grab a searcher’s attention is by creating Google My Business Posts. GMB Posts are almost like mini-ads for your company, products, or services.

Google offers a variety of posts you can create to promote your business:

  • What’s New
  • Event
  • Offer
  • Product

Posts also allow you to include a call to action (CTA) so you can better control what the visitor does after they view your post — creating the ultimate marketing experience. Current CTAs are:

  • Book
  • Order Online
  • Buy
  • Learn More
  • Sign Up
  • Get Offer
  • Call Now

Posts use a combination of images, text and a CTA to creatively show your message to potential customers. A Post shows in your GMB profile when someone searches for your business’ name on Google or views your business’ Google My Business profile on Google Maps.

Once you create a Post, you can even share it on your social media channels to get extra exposure.

Despite the name, Google My Business Posts are not actual social media posts. Typically the first 100 characters of the post are what shows up on screen (the rest is cut off and must be clicked on to be seen), so make sure the most important words are at the beginning of your post. Don’t use hashtags — they’re meaningless. It’s best if you can create new posts every seven days or so.

Google My Business Posts are a great way to show off your business in a unique way at the exact time when a searcher is looking at your business online.

But there’s a long-standing question: Are businesses actually creating GMB Posts to get their message across to potential customers? Let’s find out…

The big question: Are businesses actively using Google My Business Posts?

There has been a lot of discussion in the SEO industry about Google My Business Posts and their value: Do they help with SEO rankings? How effective are they? Do posts garner engagement? Does where the Posts appear on your GMB profile matter? How often should you post? Should you even create Google My Business Posts at all? Lots of questions, right?

As industry experts look at all of these angles, what do average, everyday business owners actually do when it comes to GMB Posts? Are real businesses creating posts? I set out to find the answer to this question using real data. Here are the details.

Google My Business Post case study: Just the facts

When I set out to discover if businesses were actively using GMB Posts for their companies’ Google My Business profiles, I first wanted to make sure I looked at data in competitive industries and markets. So I looked at a total of 2,000 Google My Business profiles that comprised the top 20 results in the Local Finder. I searched for highly competitive keyword phrases in the top ten cities (based on population density, according to Wikipedia).

For this case study, I also chose to look at service type businesses.

Here are the results.

Cities:

New York, Los Angeles, Chicago, Philadelphia, Dallas, San Jose, San Francisco, Washington DC, Houston, and Boston.

Keywords:

real estate agent, mortgage, travel agency, insurance or insurance agents, dentist, plastic surgeon, personal injury lawyer, plumber, veterinarian or vet, and locksmith

Surprise! Out of the industries researched, Personal Injury Lawyers and Locksmiths posted the most often.

For the case study, I looked at the following:

  • How many businesses had an active Google My Business Post (i.e. have posted in the last seven days)
  • How many had previously made at least one post
  • How many have never created a post

Do businesses create Google My Business Posts?

Based on the businesses, cities, and keywords researched, I discovered that more than half of the businesses are actively creating Posts or have created Google My Business Posts in the past.

  • 17.5% of businesses had an active post in the last 7 days
  • 42.1% of businesses had previously made at least one post
  • 40.4% have never created a post

Highlight: A total of 59.60% of businesses have posted a Google My Business Post on their Google My Business profile.

NOTE: If you want to look at the raw numbers, you can check out the research document that outlines all the raw data. (Credit for the research spreadsheet template I used and inspiration to do this case study goes to SEO expert Phil Rozek.)

Do searchers engage with Google My Business Posts?

If a business takes the time to create Google My Business Posts, do searchers and potential customers actually take the time to look at your posts? And most importantly, do they take action and engage with your posts?

This chart represents nine random clients, their total post views over a 28-day period, and the corresponding total direct/branded impressions on their Google My Business profiles. When we look at the total number of direct/branded views alongside the number of views posts received, the number of views for posts appears to be higher. This means that a single user is more than likely viewing multiple posts.

This means that if you take the time to create a GMB Post and your marketing message is meaningful, you have a high chance of converting a potential searcher into a customer — or at least someone who is going to take the time to look at your marketing message. (How awesome is that?)

Do searchers click on Google My Business Posts?

So your GMB Posts show up in your Knowledge Panel when someone searches for your business on Google and Google Maps, but do searchers actually click on your post to read more?

When we compared the various industries’ post views to their total direct/branded search views, on average the post is clicked on almost 100% of the time!

Google My Business insights

When you log in to your Google My Business dashboard you can see firsthand how well your Posts are doing. Below is a side-by-side image of a business’ post views and their direct search impressions. By checking your GMB insights, you can find out how well your Google My Business posts are performing for your business!

GMB Posts are worth it

After looking at 2,000 GMB profiles, I discovered a lot of things. One thing is for sure: it’s hard to tell on a week-by-week basis how many companies are using GMB Posts, because posts “go dark” every seven business days (unless the Post is an event post with a start and end date).

Also, Google recently moved Posts from the top of the Google My Business profile towards the bottom, so they don’t stand out as much as they did just a few months ago. This may mean that there’s less incentive for businesses to create posts.

However, what this case study does show us is that businesses in a competitive location and industry should use Google My Business optimization strategies and features like Posts if they want to get an edge on their competition.

