
How to Set Up GTM Cookie Tracking (and Better Understand Content Engagement)

Posted by Joel.Mesherghi


The more you understand the behaviour of your users, the better you can market your product or service — which is why Google Tag Manager (GTM) is a marketer’s best friend. With built-in tag templates, such as scroll depth and click tracking, GTM is a powerful tool to measure the engagement and success of your content. 

If you’re only relying on tag templates in GTM, or the occasionally limiting out-of-the-box Google Analytics, then you could be missing out on insights that go beyond normal engagement metrics, which means you may be getting an incomplete story from your data.

This post will teach you how to get even more insight by setting up cookies in GTM. You’ll learn how to tag and track multiple page views in a single session, track a set number of pages based on specific on-page content elements, and understand how users engage with your content so you can make data-based decisions that better drive conversions.

Example use case

I recently worked with a client that wanted to better understand the behavior of users that landed on their blog content. The main barrier they faced was their URL structure. Their content didn’t live on logical URL structures — they placed their target keyword straight after the root. So, instead of example.com/blog/some-content, their URL structure looked like example.com/some-content.

You can use advanced segments in Google Analytics (GA) to track any number of metrics, but if you don’t have a logically defined URL, then tracking and measuring those metrics becomes a manual and time-consuming practice — especially when there’s a large number of pages to track.

Fortunately, leveraging a custom cookie code, which I provide below, helps you to cut through that time, requires little implementation effort, and can surface powerful insights:

  1. It can indicate that users are engaged with your content and your brand.
  2. The stored data could be used for content scoring — if a page regularly appears among the three page views that trigger the event, it may be more valuable than others, and you may want to target it with more upsell or cross-sell opportunities.
  3. The same scoring logic could apply to authors. If blogs written by certain authors have more page views in a session, then their writing style/topics could be more engaging and you may want to further leverage their content writing skills.
  4. You can build remarketing audience lists to target these seemingly engaged users to align with your business goals — people who are more engaged with your content could be more likely to convert.

So, let’s briefly discuss the anatomy of the custom code that you will need to add to set cookies before we walk through a step by step implementation guide.

Custom cookie code

A cookie, as we all know, is a small text file stored in your browser — it helps servers remember who you are, and its code comprises three elements (illustrated just after this list):

  • a name-value pair containing data
  • an expiry date after which it is no longer valid
  • the domain and path of the server it should be sent to.
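
For illustration, this is roughly what setting a cookie with those three elements looks like from JavaScript (the name, value, and date are placeholders, not part of the setup that follows):

// Illustrative only: a name-value pair, an expiry date, and a path
document.cookie = "BlogPagesVisited=1; expires=Fri, 01 Mar 2019 12:00:00 GMT; path=/";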

You can create a custom code to add to cookies to help you track and store numerous page views in a session across a set of pages.

The code below forms the foundation in setting up your cookies. It defines specific rules, such as the events required to trigger the cookie and the expiration of the cookie. I’ll provide the code, then break it up into two parts to explain each segment.

The code

<script>
function createCookie(name,value,hours) {
    if (hours) {
        var date = new Date();
        date.setTime(date.getTime()+(hours*60*60*1000));
        var expires = "; expires="+date.toGMTString();
    }
    else var expires = "";
    document.cookie = name+"="+value+expires+"; path=/";
}
if (document.querySelectorAll("CSS SELECTOR GOES HERE").length > 0) {
var y = {{NumberOfBlogPagesVisited}}
if (y == null) {
    createCookie('BlogPagesVisited',1,1);
}
  else if (y == 1) {
    createCookie('BlogPagesVisited',2,1);
  } 
  else if (y == 2) {
    var newCount = Number(y) + 1;
    createCookie('BlogPagesVisited',newCount,12);
  }
  
 if (newCount == 3) {
 dataLayer.push({
 'event': '3 Blog Pages'
 });
 }
}
</script>


Part 1

<script>
function createCookie(name,value,hours) {
    if (hours) {
        var date = new Date();
        date.setTime(date.getTime()+(hours*60*60*1000));
        var expires = "; expires="+date.toGMTString();
    }
    else var expires = "";
    document.cookie = name+"="+value+expires+"; path=/";
}

Explanation:

This function, as the name implies, will create a cookie if you specify a name, a value, and the time a cookie should be valid for. I’ve specified “hours,” but if you want to specify “days,” you’ll need to adjust the variables in the code. Take a peek at this great resource on setting up cookies.
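
If you do want a day-based expiry, a minimal variant of the same function (shown here only as an aside, not part of the GTM setup below) would just change the expiry arithmetic:

function createCookieDays(name,value,days) {
    var expires = "";
    if (days) {
        var date = new Date();
        // 24 hours * 60 minutes * 60 seconds * 1000 milliseconds in a day
        date.setTime(date.getTime()+(days*24*60*60*1000));
        expires = "; expires="+date.toGMTString();
    }
    document.cookie = name+"="+value+expires+"; path=/";
}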

    Part 2

    if (document.querySelectorAll("CSS SELECTOR GOES HERE").length > 0) {
    var y = {{NumberOfBlogPagesVisited}}
    if (y == null) {
    createCookie('BlogPagesVisited',1,1);
    }
    else if (y == 1) {
    createCookie('BlogPagesVisited',2,1);
    }
    else if (y == 2) {
    var newCount = Number(y) + 1;
    createCookie('BlogPagesVisited',newCount,12);
    }
      
    if (newCount == 3) {
    dataLayer.push({
    'event': '3 Blog Pages'
    });
    }
      
    </script>


    Explanation:

    The second part of this script will count the number of page views:

    • The “CSS SELECTOR GOES HERE” placeholder is where you add your CSS selector. This instructs the cookie to fire only if the CSS selector matches an element on the page. You can use DevTools to hover over an on-page element, like an author name, and copy the CSS selector.
    • “y” represents the cookie and “NumberOfBlogPagesVisited” is the name I’ve given to the variable. You can change the variable name as you see fit, but the variable name you set up in GTM must be consistent with the variable name in the code (we’ll go through this during the step-by-step guide).
    • The first argument passed to “createCookie” is the actual name of your cookie. I’ve called my cookie “BlogPagesVisited.” You can call your cookie whatever you want, but again, it’s imperative that the name you give your cookie in the code is consistent with the cookie name field when you go on to create your variable in GTM. Without consistency, the tag won’t fire correctly.
    • You can also change the hours at which the cookie expires. If a user accumulates three page views in a single session, the code specifies a 12-hour expiration. The reasoning behind this is that if someone comes back after a day or two and views another blog, we won’t consider that to be part of the same “session,” giving us clearer insight into the behaviour of users who trigger three page views in a session.
    • This is rather arbitrary, so you can adjust the cookie expiration length to suit your business goals and customers.

    Note: if you want the event to fire after more than three page views (for example, four page views), then the code would look like the following:

    var y = {{NumberOfBlogPagesVisited}}
    if (y == null) {
    createCookie('BlogPagesVisited',1,1);
    }
    else if (y == 1) {
    createCookie('BlogPagesVisited',2,1);
    }
    else if (y == 2) {
    createCookie('BlogPagesVisited',3,1);
    }
    else if (y == 3) {
    var newCount = Number(y) + 1;
    createCookie('BlogPagesVisited',newCount,12);
    }

    if (newCount == 4) {
    dataLayer.push({
    'event': '4 Blog Pages'
    });
    }


    Now that we have a basic understanding of the script, we can use GTM to implement everything.

    First, you’ll need to set up the following “Tags,” “Triggers,” and “Variables”:

    Tags

    Custom HTML tag: contains the cookie script

    Event tag: fires the event and sends the data to GA after a third page view in a session.

    Triggers

    Page View trigger: defines the conditions that will fire your Custom HTML Tag.

    Custom Event trigger: defines the conditions that will fire your event.

    Variable

    First Party Cookie variable: This will define a value that a trigger needs to evaluate whether or not your Custom HTML tag should fire.

    Now, let’s walk through the steps of setting this up in GTM.

    Step 1: Create a custom HTML tag

    First, we’ll need to create a Custom HTML Tag that will contain the cookie script. This time, I’ve added the CSS selector, below:

     #content > div.post.type-post.status-publish.format-standard.hentry > div.entry-meta > span > span.author.vcard > a

    This matches authors on Distilled’s blog pages, so you’ll want to add your own unique selector.
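
    Before pasting the script into GTM, you can quickly sanity-check your selector in the DevTools console on one of your blog pages. This optional check uses the Distilled selector above purely as an example:

    // Run in the console on a blog page; a count greater than 0 means the cookie script would fire there
    document.querySelectorAll("#content > div.post.type-post.status-publish.format-standard.hentry > div.entry-meta > span > span.author.vcard > a").length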

    Navigate to Tags > New > Custom HTML Tag > and paste the script into the custom HTML tag box.

    You’ll want to ensure your tag name is descriptive and intuitive. Google recommends the following tag naming convention: Tag Type – Detail – Location. This will allow you to easily identify and sort related tags from the overview tag interface. You can also create separate folders for different projects to keep things more organized.

    Following Google’s example, I’ve called my tag Custom HTML – 3 Page Views Cookie – Blog.

    Once you’ve created your tag, remember to click save.

    Step 2: Create a trigger

    Creating a trigger will define the conditions that will fire your custom HTML tag. If you want to learn more about triggers, you can read up on Simo Ahava’s trigger guide.

    Navigate to Triggers > New > PageView.

    Once you’ve clicked the trigger configuration box, you’ll want to select “Page View” as a trigger type. I’ve also named my trigger Page View – Cookie Trigger – Blog, as I’m going to set up the tag to fire when users land on blog content.

    Next, you’ll want to define the properties of your trigger.

    Since we’re relying on the CSS selector to trigger the cookie across the site, select “All Page Views”.

    Once you’ve defined your trigger, click save.

    Step 3: Create your variable

    Just like how a Custom HTML tag relies on a trigger to fire, a trigger relies on a variable. A variable defines a value that a trigger needs to evaluate whether or not a tag should fire. If you want to learn more about variables, I recommend reading up on Simo Ahava’s variable guide.

    Head over to Variables > User-Defined Variables > Select 1st Party Cookie. You’ll also notice that I’ve named this variable “NumberOfBlogPagesVisited” — you’ll want this variable name to match what is in your cookie code.

    Having selected “1st Party Cookie,” you’ll now need to input your cookie name. Remember: the cookie name needs to replicate the name you’ve given your cookie in the code. I named my cookie BlogPagesVisited, so I’ve replicated that in the Cookie Name field, as seen below.

    Step 4: Create your event tag

    When a user triggers a third page view, we’ll want to have it recorded and sent to GA. To do this, we need to set up an “Event” tag.

    First, navigate to Tags > New > Select Google Analytics – Universal Analytics:

    Once you’ve made your tag type “Google Analytics – Universal Analytics”, make sure the track type is “Event” and name your “Category” and “Action” accordingly. You can also fill in a label and value if you wish. I’ve also selected “True” in the “Non-interaction Hit” field, as I still want to track bounce rate metrics.

    Finally, you’ll want to select a GA Setting variable that will pass on stored cookie information to a GA property.

    Step 5: Create your trigger

    This trigger will reference your event.

    Navigate to Trigger > New > Custom Event

    Once you’ve selected Custom Event, you’ll want to ensure the “Event name” field matches the name you have given your event in the code. In my case, I called the event “3 Blog Pages”.

    Step 6: Audit your cookie in preview mode

    After you’ve selected preview mode, you should conduct an audit of your cookie to ensure everything is firing properly. To do this, navigate to the site where you’ve set up cookies.

    Within the debugging interface, head on over to Page View > Variables.

    Next, go to a URL that contains the CSS selector. In the case of the client, we used the CSS selector that referenced an on-page author. All their content pages used the same CSS selector for authors. Using the GTM preview tool, you’ll see that the “NumberOfBlogPagesVisited” variable has been executed.

    And the actual “BlogPagesVisited” cookie has fired at a value of “1” in Chrome DevTools. To see this, click Inspect > Application > Cookies.
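
    If you prefer the console to the Application tab, a small helper like the one below (illustrative only, not part of the GTM setup) will read the cookie’s current value:

    // Returns the value of a cookie by name, or null if it isn't set
    function readCookie(name) {
        var match = document.cookie.match(new RegExp('(^|; )' + name + '=([^;]*)'));
        return match ? match[2] : null;
    }
    readCookie('BlogPagesVisited'); // e.g. "1" after the first qualifying page view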

    If we skip past the second page view and execute our third page view on another blog page, you’ll see that both our GA event and our Custom HTML tag fired, as it’s our third page view.

    You’ll also see that the third page view set our cookie to a value of “3” in Chrome DevTools.

    Step 7: Set up your advanced segment

    Now that you’ve set up your cookie, you’ll want to pull the stored cookie data into GA, which will allow you to manipulate the data as you see fit.

    In GA, go to Behaviour > Events > Overview > Add Segment > New Segment > Sequences > Event Action > and then add the event name you specified in your event tag. I specified “3 Blog Page Views.”

    And there you have it! 

    Conclusion

    Now that you know how to set up a cookie in GTM, you can get heaps of additional insight into the engagement of your content.

    You also know how to play around with the code snippet, adjusting the number of page views required to fire the cookie event, as well as the expiration of the cookie at each stage, to suit your needs.

    I’d be interested to hear what other use cases you can think of for this cookie, or what other types of cookies you set up in GTM and what data you get from them.



    How to Use Domain Authority for SEO – Whiteboard Friday

    Posted by Cyrus-Shepard

    Domain Authority is an incredibly well-known metric throughout the SEO industry, but what exactly is the right way to use it? In this week’s edition of Whiteboard Friday, we’re delighted to welcome Cyrus Shepard as he explains both what’s new with the new Domain Authority 2.0 update, and how to best harness its power for your own SEO success. 

    Click on the whiteboard image above to open a high resolution version in a new tab!

    Video Transcription

    Howdy, SEO fans. Welcome to a very special edition of Whiteboard Friday. I’m Cyrus Shepard. I’m honored to be here today with Moz to talk about the new Domain Authority. I want to talk about how to use Domain Authority to do actual SEO.

    What is Domain Authority?

    Let’s start with a definition of what Domain Authority actually is because there’s a lot of confusion out there. A Domain Authority is a metric, from 1 to 100, which predicts how well a domain will rank in Google. Now let’s break that down a little bit and talk about some of the myths of Domain Authority. 

    Is Domain Authority a ranking factor? No, Domain Authority is not a ranking factor. Does Google use Domain Authority in its algorithm? No, Google does not use Domain Authority in its algorithm. Now Google may use some domain-like metrics based on links similar to Domain Authority, but they do not use Domain Authority itself. In fact, it’s best if you don’t bring it up with them. They don’t tend to like that very much.

    So if it’s not a ranking factor, if Google doesn’t use it, what does Domain Authority actually do? It does one thing very, very well. It predicts rankings. That’s what it was built to do. That’s what it was designed to do, and it does that job very, very well. And because of that, we can use it for SEO in a lot of different ways. So Domain Authority has been around since 2010, about 8 years now, and since then it’s become a very popular metric, used and abused in different ways.

    What’s New With Domain Authority 2.0?

    So what’s new about the new Domain Authority that makes it so great and less likely to be abused and gives it so many more uses? Before I go into this, a big shout-out to two of the guys who helped develop this — Russ Jones and Neil Martinsen-Burrell — and many other smart people at Moz. Some of our search scientists did a tremendous job of updating this metric for 2019.

    1. Bigger Link Index

    So the first thing is the new Domain Authority is based on a new, bigger link index, and that is Link Explorer, which was released last year. It contains 35 trillion links. There are different ways of judging index sizes, but that is one of the biggest, if not the biggest, link indexes publicly available that we know of.

    Thirty-five trillion links, to give you an idea of how big that is, if you were to count one link per second, you would be counting for 1.1 million years. That’s a lot of links, and that’s how many links are in the index that the new Domain Authority is based upon. Second of all, it uses a new machine learning model. Now part of Domain Authority looks at Google rankings and uses machine learning to try to fit the model in to predict how those rankings are stacked.

    2. New Machine Learning Model

    Now the new Domain Authority not only looks at what’s winning in Google search, but it’s also looking at what’s not ranking in Google search. The old model used to just look at the winners. This makes it much more accurate at determining where you might fall or where any domain or URL might fall within that prediction. 

    3. Spam Score Incorporation

    Next the new Domain Authority incorporates spam detection.

    Spam Score is a proprietary Moz metric that looks at a bunch of on-page factors, and those have been incorporated into the new metric, which makes it much more reliable. 

    4. Detects Link Manipulation

    It also, and this is very important, the new Domain Authority detects link manipulation. This is people that are buying and selling links, PBNs, things like that.

    It’s much better. In fact, Russ Jones, in a recent webinar, said that link buyers with the new Domain Authority will drop an average of 11 points. So the new Domain Authority is much better at rooting out this link manipulation, just like Google is attempting to do. So it much more closely resembles what Google is attempting.

    5. Daily Updates

    Lastly, the new Domain Authority is updated daily. This is a huge improvement. The old Domain Authority used to update approximately every month or so.* The new Domain Authority is constantly being updated, and our search scientists are constantly adding improvements as they come along.

    So it’s being updated much more frequently and improved much more frequently. So what does this mean? The new Domain Authority is the most accurate domain-level metric to predict Google search results that we know of. When you look at ranking factors that we know of, like title tags or even generally backlinks, they predict a certain amount of rankings. But Domain Authority blows those out of the water in its ranking potential.

    *Note: Our former link research tool, Open Site Explorer, updated on a monthly cadence, resulting in monthly updates to DA scores. With the launch of Link Explorer in April 2018, Domain Authority scores moved to a daily update cadence. This remains true with the new underlying algorithm, Domain Authority 2.0.

    How to Use Domain Authority for SEO

    So the question is how do we actually use this? We have this tremendous power with Domain Authority that can predict rankings to a certain degree. How do we use this for SEO? So I want to go over some general tips for success. 

    The first tip, never use Domain Authority in isolation. You always want to use it with other metrics and in context, because it can only tell you so much.

    It’s a powerful tool, but it’s limited. For example, when you’re looking at rankings on-page, you’re going to want to look at the keyword targeting. You’re going to want to look at the on-page content, the domain history, other things like that. So never use Domain Authority by itself. That’s a key tip. 

    Second, you want to keep in mind that the scale of Domain Authority is roughly logarithmic.

    It’s not linear. Now what does this mean? It’s fairly easy to move from a zero Domain Authority or a one Domain Authority to a ten Domain Authority. You can get a handful of links, and that works pretty well. But moving from like a 70 to an 80 is much, much harder. It gets harder as you get higher. So a DA 40 is not twice a DA 20.

    It’s actually much, much bigger because as you go higher and higher and higher, until you get to 100, it gets much harder. Sites like Google and Facebook, they’re near the 100 range, and everything else comes into it. It’s almost like a funnel. 

    Next, keep in mind that DA is a relative metric. When you’re using DA, you always want to compare between competitors or your past scores.

    Having a DA 50 doesn’t really tell you much unless you’re comparing it to other DA scores. So if you’re looking in Google and a site has a DA of 50, it doesn’t make much sense unless you put it in the context of “what do the other sites have?” Are they 40? Are they 60? In that regard, when you’re looking at your own DA, you can compare against past performance or competitors.

    So if I have a 50 this month and a 40 last month, that might tell me that my ability to rank in Google has increased in that time period. 

    1. Evaluate Potential Value of a Link

    So talking about SEO use cases, we have this. We understand how to use it. What are some practical ways to use Domain Authority? Well, a very popular one with the old DA as well is judging the potential value of a link.

    For instance, you have 1,000 outreach targets that you’re thinking about asking for a link, but you only have time for 100 because you want to spend your time wisely and it’s not worth it to ask all 1,000. So you might use DA as a filter to find the most valuable link targets. A DA 90 might be more valuable than a DA 5 or a 10.

    But again, you do not want to use it in isolation. You’d be looking at other metrics as well, such as Page Authority, relevance, and traffic. But still DA might be a valuable metric to add to that experience. 

    2. Judging Keyword Difficulty

    Judging keyword difficulty, judging when you look at SERPs and see what is my potential of ranking for this SERP with this particular keyword?

    If you look at a SERP and everybody has a DA 95, it’s going to be pretty hard to rank in that SERP. But if everybody has a lower DA, you might have a chance. But again, you’re going to want to look at other metrics, such as Page Authority, keyword volume, on-page targeting. You can use Moz’s Keyword Difficulty Score to run these calculations as well.

    3. Campaign Performance

    Very popular in the agency world is link campaign performance or campaign performance in general, and this kind of makes sense. If you’re building links for a client and you want to show progress, a common way of doing this is showing Domain Authority, meaning that we built these links for you and now your potential to rank is higher.

    It’s a good metric, but it’s not the only metric I would report. I would definitely report rankings for targeted keywords. I would report traffic and conversions, because ranking potential is one thing, but I’d actually like to show that those links actually did something. So I’d be more inclined to show the other things. But DA is perfectly fine to report for campaign performance as long as you show it in context.

    4. Purchasing Existing Domains

    A popular one on the marketplaces is buying existing domains. Sites like Flippa often show DA or some similar metric like that. Again, the new Domain Authority is going to be much better at rooting out link manipulation, so these scores might be a little more trustworthy in this sense. But again, never buy a domain just on Domain Authority alone.

    You’re going to want to look at a lot of factors, such as the content, the traffic, the domain history, things like that. But Domain Authority might be a good first-line filter for you. 

    How to Find Domain Authority Metrics

    So where can you find the new Domain Authority? It is available right now. You can go to Link Explorer. It’s available through the Moz API.

    The free MozBar, you can download the MozBar for free and turn on SERP overlay, and it will show you the DA of everything as you browse through Google. 

    It’s available in Moz Campaigns and also Keyword Explorer. I hope this gives you some ideas about how to use Domain Authority. Please share your ideas and thoughts in the comments below. If you like this video, please share.

    Thanks a lot, everybody. Have a great day.

    Video transcription by Speechpad.com



    A Comprehensive Analysis of the New Domain Authority

    Posted by rjonesx.

    Moz’s Domain Authority is requested over 1,000,000,000 times per year, it’s referenced millions of times on the web, and it has become a veritable household name among search engine optimizers for a variety of use cases, from determining the success of a link building campaign to qualifying domains for purchase. With the launch of Moz’s entirely new, improved, and much larger link index, we recognized the opportunity to revisit Domain Authority with the same rigor as we did keyword volume years ago (which ushered in the era of clickstream-modeled keyword data).

    What follows is a rigorous treatment of the new Domain Authority metric. What I will not do in this piece is rehash the debate over whether Domain Authority matters or what its proper use cases are. I have and will address those at length in a later post. Rather, I intend to spend the following paragraphs addressing the new Domain Authority metric from multiple directions.

    Correlations between DA and SERP rankings

    The most important component of Domain Authority is how well it correlates with search results. But first, let’s get the correlation-versus-causation objection out of the way: Domain Authority does not cause search rankings. It is not a ranking factor. Domain Authority predicts the likelihood that one domain will outrank another. That being said, its usefulness as a metric is tied in large part to this value. The stronger the correlation, the more valuable Domain Authority is for predicting rankings.

    Methodology

    Determining the “correlation” between a metric and SERP rankings has been accomplished in many different ways over the years. Should we compare against the “true first page,” top 10, top 20, top 50 or top 100? How many SERPs do we need to collect in order for our results to be statistically significant? It’s important that I outline the methodology for reproducibility and for any comments or concerns on the techniques used. For the purposes of this study, I chose to use the “true first page.” This means that the SERPs were collected using only the keyword with no additional parameters. I chose to use this particular data set for a number of reasons:

    • The true first page is what most users experience, thus the predictive power of Domain Authority will be focused on what users see.
    • By not using any special parameters, we’re likely to get Google’s typical results. 
    • By not extending beyond the true first page, we’re likely to avoid manually penalized sites (which can impact the correlations with links.)
    • We did NOT use the same training set or training set size as we did for this correlation study. That is to say, we trained on the top 10 but are reporting correlations on the true first page. This prevents us from the potential of having a result overly biased towards our model. 

    I randomly selected 16,000 keywords from the United States keyword corpus for Keyword Explorer. I then collected the true first page for all of these keywords (completely different from those used in the training set). I extracted the URLs, but I also chose to remove duplicate domains (i.e., if the same domain occurred one after another). For a length of time, Google used to cluster domains together in the SERPs under certain circumstances. It was easy to spot these clusters, as the second and later listings were indented. No such indentations are present any longer, but we can’t be certain that Google never groups domains. If they do group domains, it would throw off the correlation because it’s the grouping and not the traditional link-based algorithm doing the work.

    I collected the Domain Authority (Moz), Citation Flow and Trust Flow (Majestic), and Domain Rank (Ahrefs) for each domain and calculated the mean Spearman correlation coefficient for each SERP. I then averaged the coefficients for each metric.
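
    As a rough illustration of that per-SERP step (a sketch for intuition, not Moz’s actual pipeline; tied values are handled naively here), the calculation looks something like this:

    // Spearman's rho for one SERP: rank both lists, then take the Pearson correlation of the ranks
    function ranks(values) {
        var sorted = values.slice().sort(function (a, b) { return b - a; });
        return values.map(function (v) { return sorted.indexOf(v) + 1; }); // ties all share the first matching rank
    }
    function pearson(x, y) {
        var n = x.length;
        var mean = function (a) { return a.reduce(function (s, v) { return s + v; }, 0) / n; };
        var mx = mean(x), my = mean(y);
        var num = 0, dx = 0, dy = 0;
        for (var i = 0; i < n; i++) {
            num += (x[i] - mx) * (y[i] - my);
            dx += Math.pow(x[i] - mx, 2);
            dy += Math.pow(y[i] - my, 2);
        }
        return num / Math.sqrt(dx * dy);
    }
    function spearman(positions, metricValues) {
        return pearson(ranks(positions), ranks(metricValues));
    }
    // One coefficient per SERP, then an average per metric. Higher metric values sitting in
    // better (numerically smaller) positions yield negative coefficients, which is why the
    // sign is inverted in the graph below.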

    Outcome

    Moz’s new Domain Authority has the strongest correlations with SERPs of the competing strength-of-domain link-based metrics in the industry. The sign (-/+) has been inverted in the graph for readability, although the actual coefficients are negative (and should be).

    Moz’s Domain Authority scored a ~.12, or roughly 6% stronger than the next best competitor (Domain Rank by Ahrefs). Domain Authority performed 35% better than Citation Flow and 18% better than Trust Flow. This isn’t surprising, in that Domain Authority is trained to predict rankings while our competitors’ strength-of-domain metrics are not. It shouldn’t be taken as a negative that our competitors’ strength-of-domain metrics don’t correlate as strongly as Moz’s Domain Authority — rather, it’s simply exemplary of the intrinsic differences between the metrics. That being said, if you want a metric that best predicts rankings at the domain level, Domain Authority is that metric.

    Note: At first blush, Domain Authority’s improvements over the competition are, frankly, underwhelming. The truth is that we could quite easily increase the correlation further, but doing so would risk over-fitting and compromising a secondary goal of Domain Authority…

    Handling link manipulation

    Historically, Domain Authority has focused on only one single feature: maximizing the predictive capacity of the metric. All we wanted were the highest correlations. However, Domain Authority has become, for better or worse, synonymous with “domain value” in many sectors, such as among link buyers and domainers. Subsequently, as bizarre as it may sound, Domain Authority has itself been targeted for spam in order to bolster the score and sell at a higher price. While these crude link manipulation techniques didn’t work so well in Google, they were sufficient to increase Domain Authority. We decided to rein that in. 

    Data sets

    The first thing we did was compile a series of data sets that corresponded with industries we wished to impact, knowing that Domain Authority was regularly manipulated in these circles.

    • Random domains
    • Moz customers
    • Blog comment spam
    • Low-quality auction domains
    • Mid-quality auction domains
    • High-quality auction domains
    • Known link sellers
    • Known link buyers
    • Domainer network
    • Link network

    While it would be my preference to release all the data sets, I’ve chosen not to in order to not “out” any website in particular. Instead, I opted to provide these data sets to a number of search engine marketers for validation. The only data set not offered for outside validation was Moz customers, for obvious reasons.

    Methodology

    For each of the above data sets, I collected both the old and new Domain Authority scores. This was conducted all on February 28th in order to have parity for all tests. I then calculated the relative difference between the old DA and new DA within each group. Finally, I compared the various data set results against one another to confirm that the model addresses the various methods of inflating Domain Authority.
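
    For clarity, the “relative difference” here is just the percentage change in average Domain Authority within each group. The numbers below are made up purely to show the arithmetic:

    // e.g. a group whose average DA moved from 50 (old model) to 47 (new model)
    var relativeDifference = (47 - 50) / 50; // -0.06, i.e. a 6% drop for that group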

    Results

    In the above graph, blue represents the Old Average Domain Authority for that data set and orange represents the New Average Domain Authority for that same data set. One immediately noticeable feature is that every category drops. Even the random domains data set drops. This is a re-centering of the Domain Authority score and should cause no alarm to webmasters. There is, on average, a 6% reduction in Domain Authority for randomly selected domains from the web. Thus, if your Domain Authority drops a few points, you are well within the range of normal. Now, let’s look at the various data sets individually.

    

    Random domains: -6.1%

    Using the same methodology for finding random domains that we use for collecting comparative link statistics, I selected 1,000 domains and determined that there is, on average, a 6.1% drop in Domain Authority. It’s important that webmasters recognize this, as the shift is likely to affect most sites and is nothing to worry about.

    Moz customers: -7.4%

    Of immediate interest to Moz is how our own customers perform in relation to the random set of domains. On average, the Domain Authority of Moz customers lowered by 7.4%. This is very close to the random set of URLs and indicates that most Moz customers are likely not using techniques to manipulate DA to any large degree.  

    Link buyers: -15.9%

    Surprisingly, link buyers only lost 15.9% of their Domain Authority. In retrospect, this seems reasonable. First, we looked specifically at link buyers from blog networks, which aren’t as spammy as many other techniques. Second, most of the sites paying for links are also optimizing their site’s content, which means the sites do rank, sometimes quite well, in Google. Because Domain Authority trains against actual rankings, it’s reasonable to expect that the link buyers data set would not be impacted as highly as other techniques because the neural network learns that some link buying patterns actually work. 

    Comment spammers: -34%

    Here’s where the fun starts. The neural network behind Domain Authority was able to drop comment spammers’ average DA by 34%. I was particularly pleased with this one because of all the types of link manipulation addressed by Domain Authority, comment spam is, in my honest opinion, no better than vandalism. Hopefully this will have a positive impact on decreasing comment spam — every little bit counts. 

    Link sellers: -56%

    I was actually quite surprised, at first, that link sellers on average dropped 56% in Domain Authority. I knew that link sellers often participated in link schemes (normally interlinking their own blog networks to build up DA) so that they can charge higher prices. However, it didn’t occur to me that link sellers would be easier to pick out because they explicitly do not optimize their own sites beyond links. Subsequently, link sellers tend to have inflated, bogus link profiles and flimsy content, which means they tend to not rank in Google. If they don’t rank, then the neural network behind Domain Authority is likely to pick up on the trend. It will be interesting to see how the market responds to such a dramatic change in Domain Authority.

    High-quality auction domains: -61%

    One of the features that I’m most proud of in regards to Domain Authority is that it effectively addressed link manipulation in order of our intuition regarding quality. I created three different data sets out of one larger data set (auction domains), where I used certain qualifiers like price, TLD, and archive.org status to label each domain as high-quality, mid-quality, and low-quality. In theory, if the neural network does its job correctly, we should see the high-quality domains impacted the least and the low-quality domains impacted the most. This is the exact pattern which was rendered by the new model. High-quality auction domains dropped an average of 61% in Domain Authority. That seems really high for “high-quality” auction domains, but even a cursory glance at the backlink profiles of domains that are up for sale in the $10K+ range shows clear link manipulation. The domainer industry, especially the domainer-for-SEO industry, is rife with spam. 

    Link network: -79%

    There is one network on the web that troubles me more than any other. I won’t name it, but it’s particularly pernicious because the sites in this network all link to the top 1,000,000 sites on the web. If your site is in the top 1,000,000 on the web, you’ll likely see hundreds of root linking domains from this network no matter which link index you look at (Moz, Majestic, or Ahrefs). You can imagine my delight to see that it drops roughly 79% in Domain Authority, and rightfully so, as the vast majority of these sites have been banned by Google.

    Mid-quality auction domains: -95%

    Continuing with the pattern regarding the quality of auction domains, you can see that “mid-quality” auction domains dropped nearly 95% in Domain Authority. This is huge. Bear in mind that these drastic drops are not combined with losses in correlation with SERPs; rather, the neural network is learning to distinguish between backlink profiles far more effectively, separating the wheat from the chaff. 

    Domainer networks: -97%

    If you spend any time looking at dropped domains, you have probably come upon a domainer network where there are a series of sites enumerated and all linking to one another. For example, the first site might be sbt001.com, then sbt002.com, and so on and so forth for thousands of domains. While it’s obvious for humans to look at this and see a pattern, Domain Authority needed to learn that these techniques do not correlate with rankings. The new Domain Authority does just that, dropping the domainer networks we analyzed on average by 97%.

    Low-quality auction domains: -98%

    Finally, the worst offenders — low-quality auction domains — dropped 98% on average. Domain Authority just can’t be fooled in the way it has in the past. You have to acquire good links in the right proportions (in accordance with a natural model and sites that already rank) if you wish to have a strong Domain Authority score. 

    What does this mean?

    For most webmasters, this means very little. Your Domain Authority might drop a little bit, but so will your competitors’. For search engine optimizers, especially consultants and agencies, it means quite a bit. The inventories of known link sellers will probably diminish dramatically overnight. High DA links will become far more rare. The same is true of those trying to construct private blog networks (PBNs). Of course, Domain Authority doesn’t cause rankings so it won’t impact your current rank, but it should give consultants and agencies a much smarter metric for assessing quality.

    What are the best use cases for DA?

    • Compare changes in your Domain Authority with your competitors. If you drop significantly more, or increase significantly more, it could indicate that there are important differences in your link profile.
    • Compare changes in your Domain Authority over time. The new Domain Authority will update historically as well, so you can track your DA. If your DA is decreasing over time, especially relative to your competitors, you probably need to get started on outreach.
    • Assess link quality when looking to acquire dropped or auction domains. Those looking to acquire dropped or auction domains now have a much more powerful tool in their hands for assessing quality. Of course, DA should not be the primary metric for assessing the quality of a link or a domain, but it certainly should be in every webmaster’s toolkit.

    What should we expect going forward?

    We aren’t going to rest. An important philosophical shift has taken place at Moz with regards to Domain Authority. In the past, we believed it was best to keep Domain Authority static, rarely updating the model, in order to give users an apples-to-apples comparison. Over time, though, this meant that Domain Authority would become less relevant. Given the rapidity with which Google updates its results and algorithms, the new Domain Authority will be far more agile as we give it new features, retrain it more frequently, and respond to algorithmic changes from Google. We hope you like it.





    March 1st Google Update: The Mysterious Case of the 19-Result SERPs

    Posted by Dr-Pete

    Late last week (Feb 28 – Mar 1), we saw a sizable two-day spike in Google rankings flux, as measured by MozCast. Temperatures on Friday reached 108°F. The original temperature on Thursday was 105°F, but that was corrected down to 99°F (more on that later).

    Digging in on Friday (March 1st), we saw a number of metrics shift, but most notable was a spike in page-one Google SERPs with more than 10 organic results. Across the 10,000 keywords in MozCast, here’s what we observed at the high end:

    Counting “organic” results in 2019 is challenging — some elements, like expanded site-links (in the #1 position), Top Stories, and image results can occupy an organic position. In-depth Articles are particularly challenging (more on that in a moment), and the resulting math usually leaves us with page-one SERPs with counts from 4 to 12. Friday’s numbers were completely beyond anything we’ve seen historically, though, with organic counts up to 19 results.

    Dissecting the 19-result SERP

    Across 10K keywords, we saw 9 SERPs with 19 results. Below is one of the most straightforward (in terms of counting). There was a Featured Snippet in the #0 position, followed by 19 results that appear organic. This is a direct screenshot from a result for “pumpkin pie recipe” on Google.com/US:

    Pardon the long scroll, but I wanted you to get the full effect. There’s no clear marker here to suggest that part of this SERP is a non-organic feature or in some way different. You’ll notice, though, that we transition from more traditional recipe results (with thumbnails) to what appear to be a mix of magazine and newspaper articles. We’ve seen something like this before …

    Diving into the depths of in-depth

    You may not think much about In-depth Articles these days. That’s in large part because they’re almost completely hidden within regular, organic results. We know they still exist, though, because of deep source-code markers and a mismatch in page-one counts. Here, for example, are the last 6 results from today (March 4th) on a search for “sneakers”:

    Nestled in the more traditional, e-commerce results at the end of page one (like Macy’s), you can see articles from FiveThirtyEight, Wired, and The Verge. It’s hard to tell from the layout, but this is a 3-pack of In-depth Articles, which takes the place of a single organic position. So, this SERP appears to have 12 page-one results. Digging into the results on March 1st, we saw a similar pattern, but those 3-packs had expanded to as many as 10 articles.

    We retooled the parser to more flexibly detect In-depth Articles (allowing for packs with more than 3 results), and here’s what we saw for prevalence of In-depth Articles over the past two weeks:

    Just under 23% of MozCast SERPs on the morning of March 1st had something similar to In-depth Articles, an almost 4X increase from the day before. This number returned to normal (even slightly lower) the next day. It’s possible that our new definition is too broad, and these aren’t really traditional “In-depth” packs, but then we would expect the number to stay elevated. We also saw a large spike in SERP “real-estate” shares for major publications, like the New York Times, which typically dominate In-depth Articles. Something definitely happened around March 1st.

    By the new method (removing these results from organic consideration), the temperature for 2/28 dropped from 105°F to 99°F, as some of the unusual results were treated as In-depth Articles and removed from the weather report.

    Note that the MozCast temperatures are back-dated, since they represent the change over a 24-hour period. So, the prevalence of In-depth articles on the morning of March 1st is called “3/1” in the graph, but the day-over-day temperature recorded that morning is labeled “2/28” in the graph at the beginning of this post.

    Sorting out where to go from here

    Is this a sign of things to come? It’s really tough to say. On March 1st, I reached out to Twitter to see if people could replicate the 19-result SERPs and many people were able to, both on desktop and mobile:

    This did not appear to be a normal test (which we see roll out to something like 1% or less of searchers, typically). It’s possible this was a glitch on Google’s end, but Google doesn’t typically publicize temporary glitches, so it’s hard to tell.

    It appears that the 108°F was, in part, a reversal of these strange results. On the other hand, it’s odd that the reversal was larger than the original rankings flux. At the same time, we saw some other signals in play, such as a drop in image results on page one (about 10.5% day-over-day, which did not recover the next day). It’s possible that an algorithm update rolled out, but there was a glitch in that update.

    If you’re a traditional publisher or someone who generally benefits from In-depth Articles, I’d recommend keeping your eyes open. This could be a sign of future intent by Google, or it could simply be a mistake. For the rest of us, we’ll have to wait and see. Fortunately, these results appeared mostly at the end of page one, so top rankings were less impacted, but a 19-result page one would certainly shake up our assumptions about organic positioning and CTR.



    Rewriting the Beginner’s Guide to SEO, Chapter 7: Measuring, Prioritizing, & Executing SEO

    Posted by BritneyMuller

    It’s finally here, for your review and feedback: Chapter 7 of the new Beginner’s Guide to SEO, the last chapter. We cap off the guide with advice on how to measure, prioritize, and execute on your SEO. And if you missed them, check out the drafts of our outline, Chapter One, Chapter Two, Chapter Three, Chapter Four, Chapter Five, and Chapter Six for your reading pleasure. As always, let us know what you think of Chapter 7 in the comments!


    Set yourself up for success.

    They say if you can measure something, you can improve it.

    In SEO, it’s no different. Professional SEOs track everything from rankings and conversions to lost links and more to help prove the value of SEO. Measuring the impact of your work and ongoing refinement is critical to your SEO success, client retention, and perceived value.

    It also helps you pivot your priorities when something isn’t working.

    Start with the end in mind

    While it’s common to have multiple goals (both macro and micro), establishing one specific primary end goal is essential.

    The only way to know what a website’s primary end goal should be is to have a strong understanding of the website’s goals and/or client needs. Good client questions are not only helpful in strategically directing your efforts, but they also show that you care.

    Client question examples:

    1. Can you give us a brief history of your company?
    2. What is the monetary value of a newly qualified lead?
    3. What are your most profitable services/products (in order)?

    Keep the following tips in mind while establishing a website’s primary goal, additional goals, and benchmarks:

    Goal setting tips

    • Measurable: If you can’t measure it, you can’t improve it.
    • Be specific: Don’t let vague industry marketing jargon water down your goals.
    • Share your goals: Studies have shown that writing down and sharing your goals with others boosts your chances of achieving them.

    Measuring

    Now that you’ve set your primary goal, evaluate which additional metrics could help support your site in reaching its end goal. Measuring additional (applicable) benchmarks can help you keep a better pulse on current site health and progress.

    Engagement metrics

    How are people behaving once they reach your site? That’s the question that engagement metrics seek to answer. Some of the most popular metrics for measuring how people engage with your content include:

    Conversion rate – The number of conversions (for a single desired action/goal) divided by the number of unique visits. A conversion rate can be applied to anything, from an email signup to a purchase to account creation. Knowing your conversion rate can help you gauge the return on investment (ROI) your website traffic might deliver.
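
    For example, with made-up numbers, the arithmetic is simply:

    // 50 email signups (conversions) out of 2,000 unique visits
    var conversionRate = 50 / 2000; // 0.025, i.e. a 2.5% conversion rate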

    In Google Analytics, you can set up goals to measure how well your site accomplishes its objectives. If your objective for a page is a form fill, you can set that up as a goal. When site visitors accomplish the task, you’ll be able to see it in your reports.

    Time on page – How long did people spend on your page? If you have a 2,000-word blog post that visitors are only spending an average of 10 seconds on, the chances are slim that this content is being consumed (unless they’re a mega-speed reader). However, if a URL has a low time on page, that’s not necessarily bad either. Consider the intent of the page. For example, it’s normal for “Contact Us” pages to have a low average time on page.

    Pages per visit – Was the goal of your page to keep readers engaged and take them to a next step? If so, then pages per visit can be a valuable engagement metric. If the goal of your page is independent of other pages on your site (ex: visitor came, got what they needed, then left), then low pages per visit are okay.

    Bounce rate – “Bounced” sessions indicate that a searcher visited the page and left without browsing your site any further. Many people try to lower this metric because they believe it’s tied to website quality, but it actually tells us very little about a user’s experience. We’ve seen cases of bounce rate spiking for redesigned restaurant websites that are doing better than ever. Further investigation discovered that people were simply coming to find business hours, menus, or an address, then bouncing with the intention of visiting the restaurant in person. A better metric to gauge page/site quality is scroll depth.

    Scroll depth – This measures how far visitors scroll down individual webpages. Are visitors reaching your important content? If not, test different ways of providing the most important content higher up on your page, such as multimedia, contact forms, and so on. Also consider the quality of your content. Are you omitting needless words? Is it enticing for the visitor to continue down the page? Scroll depth tracking can be set up in your Google Analytics.

    Search traffic

    Ranking is a valuable SEO metric, but measuring your site’s organic performance can’t stop there. The goal of showing up in search is to be chosen by searchers as the answer to their query. If you’re ranking but not getting any traffic, you have a problem.

    But how do you even determine how much traffic your site is getting from search? One of the most precise ways to do this is with Google Analytics.

    Using Google Analytics to uncover traffic insights

    Google Analytics (GA) is bursting at the seams with data — so much so that it can be overwhelming if you don’t know where to look. This is not an exhaustive list, but rather a general guide to some of the traffic data you can glean from this free tool.

    Isolate organic traffic – GA allows you to view traffic to your site by channel. This will mitigate any scares caused by changes to another channel (ex: total traffic dropped because a paid campaign was halted, but organic traffic remained steady).

    Traffic to your site over time – GA allows you to view total sessions/users/pageviews to your site over a specified date range, as well as compare two separate ranges.

    How many visits a particular page has received – Site Content reports in GA are great for evaluating the performance of a particular page — for example, how many unique visitors it received within a given date range.

    Traffic from a specified campaign – You can use UTM (urchin tracking module) codes for better attribution. Designate the source, medium, and campaign, then append the codes to the end of your URLs. When people start clicking on your UTM-code links, that data will start to populate in GA’s “campaigns” report.
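
    As a quick illustration (the domain and campaign names here are hypothetical), a UTM-tagged URL might be assembled like this:

    // Append source, medium, and campaign to the landing page URL
    var url = "https://www.example.com/landing-page" +
        "?utm_source=newsletter" +
        "&utm_medium=email" +
        "&utm_campaign=spring_promo";
    // -> https://www.example.com/landing-page?utm_source=newsletter&utm_medium=email&utm_campaign=spring_promo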

    Click-through rate (CTR) – Your CTR from search results to a particular page (meaning the percent of people that clicked your page from search results) can provide insights on how well you’ve optimized your page title and meta description. You can find this data in Google Search Console, a free Google tool.

    In addition, Google Tag Manager is a free tool that allows you to manage and deploy tracking pixels to your website without having to modify the code. This makes it much easier to track specific triggers or activity on a website.

    Additional common SEO metrics

    • Domain Authority & Page Authority (DA/PA) – Moz’s proprietary authority metrics provide powerful insights at a glance and are best used as benchmarks relative to your competitors’ Domain Authority and Page Authority.
    • Keyword rankings – A website’s ranking position for desired keywords. This should also include SERP feature data, like featured snippets and People Also Ask boxes that you’re ranking for. Try to avoid vanity metrics, such as rankings for competitive keywords that are desirable but often too vague and don’t convert as well as longer-tail keywords.
    • Number of backlinks – Total number of links pointing to your website or the number of unique linking root domains (meaning one per unique website, as websites often link out to other websites multiple times). While these are both common link metrics, we encourage you to look more closely at the quality of backlinks and linking root domains your site has.

    How to track these metrics

    There are lots of different tools available for keeping track of your site’s position in SERPs, site crawl health, SERP features, and link metrics, such as Moz Pro and STAT.

    The Moz and STAT APIs (among other tools) can also be pulled into Google Sheets or other customizable dashboard platforms for clients and quick at-a-glance SEO check-ins. This also allows you to provide more refined views of only the metrics you care about.

    Dashboard tools like Data Studio, Tableau, and PowerBI can also help to create interactive data visualizations.

    Evaluating a site’s health with an SEO website audit

    By having an understanding of certain aspects of your website — its current position in search, how searchers are interacting with it, how it’s performing, the quality of its content, its overall structure, and so on — you’ll be able to better uncover SEO opportunities. Leveraging the search engines’ own tools can help surface those opportunities, as well as potential issues:

    • Google Search Console – If you haven’t already, sign up for a free Google Search Console (GSC) account and verify your website(s). GSC is full of actionable reports you can use to detect website errors, opportunities, and user engagement.
    • Bing Webmaster Tools – Bing Webmaster Tools has similar functionality to GSC. Among other things, it shows you how your site is performing in Bing and opportunities for improvement.
    • Lighthouse Audit – Google’s automated tool for measuring a website’s performance, accessibility, progressive web apps, and more. This data improves your understanding of how a website is performing. Gain specific speed and accessibility insights for a website here.
    • PageSpeed Insights – Provides website performance insights using Lighthouse and Chrome User Experience Report data from real user measurement (RUM) when available.
    • Structured Data Testing Tool – Validates that a website is using schema markup (structured data) properly.
    • Mobile-Friendly Test – Evaluates how easily a user can navigate your website on a mobile device.
    • Web.dev – Surfaces website improvement insights using Lighthouse and provides the ability to track progress over time.
    • Tools for web devs and SEOs – Google often provides new tools for web developers and SEOs alike, so keep an eye on any new releases here.

    While we don’t have room to cover every SEO audit check you should perform in this guide, we do offer an in-depth Technical SEO Site Audit course for more info. When auditing your site, keep the following in mind:

    Crawlability: Are your primary web pages crawlable by search engines, or are you accidentally blocking Googlebot or Bingbot via your robots.txt file? Does the website have an accurate sitemap.xml file in place to help direct crawlers to your primary pages?

    Indexed pages: Can your primary pages be found using Google? Doing a site:yoursite.com OR site:yoursite.com/specific-page check in Google can help answer this question. If you notice some are missing, check to make sure a meta robots=noindex tag isn’t excluding pages that should be indexed and found in search results.

    Check page titles & meta descriptions: Do your titles and meta descriptions do a good job of summarizing the content of each page? How are their CTRs in search results, according to Google Search Console? Are they written in a way that entices searchers to click your result over the other ranking URLs? Which pages could be improved? Site-wide crawls are essential for discovering on-page and technical SEO opportunities.

    Page speed: How does your website perform on mobile devices and in Lighthouse? Which images could be compressed to improve load time?

    Content quality: How well does the current content of the website meet the target market’s needs? Is the content 10X better than other ranking websites’ content? If not, what could you do better? Think about things like richer content, multimedia, PDFs, guides, audio content, and more.

    Pro tip: Website pruning!

    Removing thin, old, low-quality, or rarely visited pages from your site can help improve your website’s perceived quality. Performing a content audit will help you discover these pruning opportunities. Three primary ways to prune pages include:

    1. Delete the page (4XX): Use when a page adds no value (ex: traffic, links) and/or is outdated.
    2. Redirect (3XX): Redirect the URLs of pages you’re pruning when you want to preserve the value they add to your site, such as inbound links to that old URL.
    3. NoIndex: Use this when you want the page to remain on your site but be removed from the index.

    Keyword research and competitive website analysis (performing audits on your competitors’ websites) can also provide rich insights on opportunities for your own website.

    For example:

    • Which keywords are competitors ranking on page 1 for, but your website isn’t?
    • Which keywords is your website ranking on page 1 for that also have a featured snippet? You might be able to provide better content and take over that snippet.
    • Which websites link to more than one of your competitors, but not to your website?

    Discovering website content and performance opportunities will help devise a more data-driven SEO plan of attack! Keep an ongoing list in order to prioritize your tasks effectively.

    Prioritizing your SEO fixes

    In order to prioritize SEO fixes effectively, it’s essential to first have specific, agreed-upon goals established between you and your client.

    While there are a million different ways you could prioritize SEO, we suggest you rank them in terms of importance and urgency. Which fixes could provide the most ROI for a website and help support your agreed-upon goals?

    Stephen Covey, author of The 7 Habits of Highly Effective People, developed a handy time management grid that can ease the burden of prioritization:

    Source: Stephen Covey, The 7 Habits of Highly Effective People


    Putting out small, urgent SEO fires might feel most effective in the short term, but this often leads to neglecting non-urgent important fixes. The not urgent & important items are ultimately what often move the needle for a website’s SEO. Don’t put these off.

    SEO planning & execution

    “Without strategy, execution is aimless. Without execution, strategy is useless.”
    – Morris Chang

    Much of your success depends on effectively mapping out and scheduling your SEO tasks. You can use free tools like Google Sheets to plan out your SEO execution (we have a free template here), but you can use whatever method works best for you. Some people prefer to schedule out their SEO tasks in their Google Calendar, in a kanban or scrum board, or in a daily planner.

    Use what works for you and stick to it.

    Measuring your progress along the way via the metrics mentioned above will help you monitor your effectiveness and allow you to pivot your SEO efforts when something isn’t working. Say, for example, you changed a primary page’s title and meta description, only to notice that the CTR for that page decreased. Perhaps you changed it to something too vague or strayed too far from the on-page topic — it might be good to try a different approach. Keeping an eye on drops in rankings, CTRs, organic traffic, and conversions can help you manage hiccups like this early, before they become a bigger problem.

    Communication is essential for SEO client longevity

    Many SEO fixes are implemented without being noticeable to a client (or user). This is why it’s essential to employ good communication skills around your SEO plan, the time frame in which you’re working, and your benchmark metrics, as well as frequent check-ins and reports.


