
Diagnosing Why a Site’s Set of Pages May Be Ranking Poorly – Whiteboard Friday

Posted by randfish

Your rankings have dropped and you don’t know why. Maybe your traffic dropped as well, or maybe just a section of your site has lost rankings. It’s an important and often complex mystery to solve, and there are a number of boxes to check off while you investigate. In this Whiteboard Friday, Rand shares a detailed process to follow to diagnose what went wrong to cause your rankings drop, why it happened, and how to start the recovery process.

Diagnosing why a site's pages may be ranking poorly


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to talk about diagnosing a site and specifically a section of a site’s pages and why they might be performing poorly, why their traffic may have dropped, why rankings may have dropped, why both of them might have dropped. So we’ve got a fairly extensive process here, so let’s get started.

Step 1: Uncover the problem

First off, our first step is uncovering the problem, or finding whether there actually is a problem. This matters most on larger websites: if we’re talking about a site that’s 20 or 30 or even a couple hundred pages, this is not a big issue, but many websites that SEOs are working on these days are thousands, tens of thousands, or hundreds of thousands of pages. So what I like to urge folks to do is to

A. Treat different site sections as unique segments for investigation. You should look at them individually.

A lot of times subfolders or URL structures are really helpful here. So I might say, okay, MySite.com, I’m going to look exclusively at the /news section. Did that fall in rankings? Did it fall in traffic? Or was it /posts, where my blog posts and my content is? Or was it /cities? Let’s say I have a website that’s dealing with data about the population of cities. So I rank for lots of those types of queries, and it seems like I’m ranking for fewer of them, and it’s my cities pages that are poorly performing in comparison to where they were a few months ago or last year at this time.

B. Check traffic from search over time.

So I go to my Google Analytics or whatever analytics you’re using, and you might see something like, okay, I’m going to look exclusively at the /cities section. If you can structure your URLs in this fashion, use subfolders, this is a great way to do it. Then take a look and see, oh, hang on, that’s a big traffic drop. We fell off a cliff there for these particular pages.

This data can be hiding inside your analytics because it could be that the rest of your site is performing well. It’s going sort of up and to the right, and so you see this slow plateauing or a little bit of a decline, but it’s not nearly as sharp as it is if you look at the traffic specifically for a single subsection that might be performing poorly, like this /cities section.
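To make that subfolder check concrete, here’s a minimal sketch (Python, standard library only) of how you might bucket organic sessions by top-level section from a hypothetical analytics export. The URLs and session counts are made up for illustration; a real export would come from your analytics tool.

```python
from collections import defaultdict
from urllib.parse import urlparse

def traffic_by_section(rows):
    """Aggregate organic sessions by top-level subfolder.

    rows: iterable of (landing_page_url, sessions) pairs, e.g. from a
    hypothetical analytics export. Returns {section: total_sessions}.
    """
    totals = defaultdict(int)
    for url, sessions in rows:
        path = urlparse(url).path
        # Keep only the first path segment: "/cities/denver" -> "/cities/"
        segments = [s for s in path.split("/") if s]
        section = "/" + (segments[0] + "/" if segments else "")
        totals[section] += sessions
    return dict(totals)

# Hypothetical export rows
export = [
    ("https://mysite.com/cities/denver", 120),
    ("https://mysite.com/cities/milwaukee", 80),
    ("https://mysite.com/posts/2017-recap", 300),
    ("https://mysite.com/", 50),
]
print(traffic_by_section(export))
```

Comparing these per-section totals month over month is what surfaces the cliff that a whole-site traffic graph can hide.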

From there, I’m going to next urge you to use Google Trends. Why? Because I want you to look at some of your big keywords and topics in Google Trends to see if there has been a serious decline in search volume at the same time. If search demand is rising or staying stable over the period where you have lost traffic, it’s almost certainly something you’ve done, not something searchers are doing. But if you see that search demand has declined — for example, maybe you were ranking really well for population data from 2015, and it turns out people are now looking for population data for 2016 or ’17 or ’18 — maybe that is part of the problem: demand has fallen and your curve matches that.

C. Perform some diagnostic queries or use your rank tracking data if you have it on these types of things.

This is one of the reasons I like to rank track for even these types of queries that don’t get a lot of traffic.

1. Target keywords. In this case, it might be “Denver population growth,” maybe that’s one of your keywords. You would see, “Do I still rank for this? How well do I rank for this? Am I ranking more poorly than I used to?”

2. Check brand name plus target keyword. So, in this case, it would be my site plus the above here plus “Denver population growth,” so My Site or MySite.com Denver population growth. If you’re not ranking for that, that’s usually an indication of a more serious problem, potentially a penalty or some type of dampening that’s happening around your brand name or around your website.

3. Look for a 10 to 20-word text string from page content without quotes. It could be shorter, only six or seven words, or it could be longer, 25 words if you really need it. But essentially, I want to take a string of text that exists on the page and search for it, in order, in Google, not in quotes, and see how it performs. This might be several lines of text here.

4. Look for a 10 to 20-word text string with quotes. So those same lines of text, but searched in Google inside quotation marks. If I’m not ranking for the unquoted version but I am ranking for the quoted one, I might surmise this is probably not duplicate content. It’s probably something to do with my content quality or maybe my link profile, or Google has penalized or dampened me in some way.

5. site: urlstring/ So I would search for “site:MySite.com/cities/Denver.” I would see: Wait, has Google actually indexed my page? When did they index it? Oh, it’s been a month. I wonder why they haven’t come back. Maybe there’s some sort of crawl issue, robots.txt issue, meta robots issue, something. I’m preventing Google from potentially getting there. Or maybe they can’t get there at all, and this results in zero results. That means Google hasn’t even indexed the page. Now we have another type of problem.
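The five checks above can be scripted as a simple query builder so you run the same diagnostics for every page you investigate. Everything here is a placeholder — the brand, URL, keyword, and snippet are illustrative values you’d swap for your own:

```python
def diagnostic_queries(brand, url, keyword, snippet):
    """Build the five diagnostic searches described above.

    brand, url, keyword, and snippet are placeholders you would fill
    in for your own site and page.
    """
    return [
        keyword,                  # 1. target keyword
        f"{brand} {keyword}",     # 2. brand name + target keyword
        snippet,                  # 3. text string from the page, no quotes
        f'"{snippet}"',           # 4. same string, in quotes
        f"site:{url}",            # 5. indexation check
    ]

for q in diagnostic_queries(
        "MySite.com",
        "mysite.com/cities/denver",
        "Denver population growth",
        "Denver has grown rapidly since 2010"):
    print(q)
```

Paste each generated query into Google (or feed it to your rank tracker) and note which ones you rank for — the pattern of hits and misses is what points at duplicate content, quality, or indexation issues.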

D. Check your tools.

1. Google Search Console. I would start there, especially in the site issues section.

2. Check your rank tracker or whatever tool you’re using, whether that’s Moz or something else.

3. On-page and crawl monitoring. Hopefully you have something like that. It could be through Screaming Frog. Maybe you’ve run some crawls over time, or maybe you have a tracking system in place. Moz has a crawl system. OnPage.org has a really good one.

4. Site uptime. So I might check Pingdom or other things that alert me to, “Oh, wait a minute, my site was down for a few days last week. That obviously is why traffic has fallen,” those types of things.

Step 2: Offer hypothesis for falling rankings/traffic

Okay, you’ve done your diagnostics. Now it’s time to offer some hypotheses. Now that we understand which problem we might have, I want to understand what could be causing it. There are basically two situations you can have: either rankings have stayed stable or gone up while traffic has fallen, or rankings and traffic have both fallen.

A. If rankings are up, but traffic is down…

In those cases, these are the five things that are most typically to blame.

1. New SERP features. There’s a bunch of featured snippets that have entered the “population growth for cities” search results, and so now number one is not what number one used to be. If you don’t get that featured snippet, you’re losing out to one of your competitors.

2. Lower search demand. Like we talked about in Google Trends. I’m looking at search demand, and there are just not as many people searching as there used to be.

3. Brand or reputation issues. I’m ranking just fine, but people now for some reason hate me. People who are searching this sector think my brand is evil or bad or just not as helpful as it used to be. So I have issues, and people are not clicking on my results. They’re choosing someone else actively because of reputation issues.

4. Snippet problems. I’m ranking in the same place I used to be, but I’m no longer the sexiest, most click-drawing snippet in the search results, and other people are earning those clicks instead.

5. Shift in personalization or location biasing by Google. It used to be the case that everyone who searched for city name plus population growth got the same results, but now suddenly people are seeing different results based on maybe their device or things they’ve clicked in the past or where they’re located. Location is often a big cause for this.

So for many SEOs for many years, “SEO consultant” resulted in the same search results. Then Google introduced the Maps results and pushed down a lot of those folks, and now “SEO consultant” results in different ranked results in each city and each geography that you search in. So that can often be a cause for falling traffic even though rankings remain high.

B. If rankings and traffic are down…

If you’re seeing that rankings have fallen and traffic has fallen in conjunction, there’s a bunch of other things that are probably going on that are not necessarily these things. A few of these could be responsible still, like snippet problems could cause your rankings and your traffic to fall, or brand and reputation issues could cause your click-through rate to fall, which would cause you to get dampened. But oftentimes it’s things like this:

1. & 2. Duplicate content and low-quality or thin content. Google thinks that what you’re providing just isn’t good enough.

3. Change in searcher intent. People who were searching for population growth used to want what you had to offer, but now they want something different and other people in the SERP are providing that, but you are not, so Google is ranking you lower. Even though your content is still good, it’s just not serving the new searcher intent.

4. Loss to competitors. So maybe you have worse links than they do now or less relevance or you’re not solving the searcher’s query as well. Your user interface, your UX is not as good. Your keyword targeting isn’t as good as theirs. Your content quality and the unique value you provide isn’t as good as theirs. If you see that one or two competitors are consistently outranking you, you might diagnose that this is the problem.

5. Technical issues. So if I saw from over here that the crawl was the problem, I wasn’t getting indexed, or Google hasn’t updated my pages in a long time, I might look into accessibility things, maybe speed, maybe I’m having problems like letting Googlebot in, HTTPS problems, or indexable content, maybe Google can’t see the content on my page anymore because I made some change in the technology of how it’s displayed, or crawlability, internal link structure problems, robots.txt problems, meta robots tag issues, that kind of stuff.

Maybe at the server level, someone on the tech ops team of my website decided, “Oh, there’s this really problematic bot coming from Mountain View that’s costing us a bunch of bandwidth. Let’s block bots from Mountain View.” No, don’t do that. Bad. Those kinds of technical issues can happen.
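One quick way to rule out an accidental block like that is to parse your robots.txt with Python’s standard-library `urllib.robotparser` and ask whether Googlebot is allowed to fetch the affected section. The robots.txt below is a hypothetical example of exactly this kind of misconfiguration:

```python
from urllib.robotparser import RobotFileParser

# A robots.txt that (mistakenly) blocks Googlebot from /cities/ --
# the kind of accidental block described above.
robots_txt = """\
User-agent: Googlebot
Disallow: /cities/

User-agent: *
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(robots_txt)

# Googlebot is shut out of the section; everyone else can fetch it.
print(rp.can_fetch("Googlebot", "/cities/denver"))  # False
print(rp.can_fetch("*", "/cities/denver"))          # True
```

In practice you’d point `RobotFileParser.set_url()` at your live robots.txt and call `read()`, then test each URL pattern you care about; parsing the file contents directly, as above, works the same way.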

6. Spam and penalties. We’ll talk a little bit more about how to diagnose those in a second.

7. CTR, engagement, or pogo-sticking issues. There could be click-through rate issues or engagement issues, meaning pogo sticking, like people are coming to your site, but they are clicking back because they weren’t satisfied by your results, maybe because their expectations have changed or market issues have changed.

Step 3: Make fixes and observe results

All right. Next and last in this process, what we’re going to do is make some fixes and observe the results. Hopefully, we’ve been able to correctly diagnose and form some wise hypotheses about what’s going wrong, and now we’re going to try and resolve them.

A. On-page and technical issues should solve after a new crawl + index.

So on-page and technical issues, if we’re fixing those, should usually resolve pretty fast, especially on small sections of sites. As soon as Google has crawled and indexed the page, you should generally see performance improve. But this can take a few weeks if we’re talking about a large section of a site, many thousands of pages, because Google has to crawl and index all of them before it registers that things are fixed. And since the traffic is long tail, spread across many different pages, you’re not going to see an instant gain.

B. Link issues and spam penalty problems can take months to show results.

Look, if your link profile is not as good as your competitors’, growing it can take months or even years to fix. Penalty problems and spam problems, same thing; Google can sometimes take a long time. You’ve seen a lot of spam experts on Twitter saying, “Oh, well, all my clients who had issues over the last nine months suddenly are ranking better today,” because Google made some fix in their latest index rollout or their algorithm changed, and decided, okay, we’ll reward people for all the fixes that they’ve made. Sometimes that happens in batches that take months.

C. Fixing a small number of pages in a section that’s performing poorly might not show results very quickly.

For example, let’s say you go and you fix /cities/Milwaukee. You determine from your diagnostics that the problem is a content quality issue. So you go and you update these pages. They have new content. It serves the searchers much better, doing a much better job. You’ve tested it. People really love it. You fixed two cities, Milwaukee and Denver, to test it out. But you’ve left 5,000 other cities pages untouched.

Sometimes Google will sort of be like, “No, you know what? We still think your cities pages, as a whole, don’t do a good job solving this query. So even though these two that you’ve updated do a better job, we’re not necessarily going to rank them, because we sort of think of your site as this whole section and we grade it as a section or apply some grades as a section.” That is a real thing that we’ve observed happening in Google’s results.

Because of this, one of the things that I would urge you to do is if you’re seeing good results from the people you’re testing it with and you’re pretty confident, I would roll out the changes to a significant subset, 30%, 50%, 70% of the pages rather than doing only a tiny, tiny sample.

D. Sometimes when you encounter these issues, a remove and replace strategy works better than simply upgrading old URLs.

So if Google has decided /cities, your /cities section is just awful, has all sorts of problems, not performing well on a bunch of different vectors, you might take your /cities section and actually 301 redirect them to a new URL, /location, and put the new UI and the new content that better serves the searcher and fixes a lot of these issues into that location section, such that Google now goes, “Ah, we have something new to judge. Let’s see how these location pages on MySite.com perform versus the old cities pages.”
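A remove-and-replace migration like this boils down to a one-to-one redirect map from the old section to the new one, which each old URL 301s through. Here’s a small sketch; the paths reuse the hypothetical /cities and /location examples from above:

```python
def redirect_map(old_paths, old_prefix="/cities/", new_prefix="/location/"):
    """Sketch of a remove-and-replace migration: map each old /cities/
    path to its /location/ equivalent for a 301 redirect.

    The prefixes are the illustrative section names from the example,
    not a recommendation for what to call yours.
    """
    return {
        path: new_prefix + path[len(old_prefix):]
        for path in old_paths
        if path.startswith(old_prefix)
    }

mapping = redirect_map(["/cities/denver", "/cities/milwaukee"])
print(mapping)

# Each pair would then become a 301 at the server, e.g. in nginx:
# rewrite ^/cities/(.*)$ /location/$1 permanent;
```

Generating the map up front also gives you a checklist: after launch, crawl every old URL and confirm it returns a 301 to exactly the mapped target.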

So I know we’ve covered a ton today and there are a lot of diagnostic issues that we haven’t necessarily dug deep into, but I hope this can help you if you’re encountering rankings challenges with sections of your site or with your site as a whole. Certainly, I look forward to your comments and your feedback. If you have other tips for folks facing this, that would be great. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Rewriting the Beginner’s Guide to SEO, Chapter 1: SEO 101

Posted by BritneyMuller

Back in mid-November, we kicked off a campaign to rewrite our biggest piece of content: the Beginner’s Guide to SEO. You offered up a huge amount of helpful advice and insight with regards to our outline, and today we’re here to share our draft of the first chapter.

In many ways, the Beginner’s Guide to SEO belongs to each and every member of our community; it’s important that we get this right, for your sake. So without further ado, here’s the first chapter — let’s dive in!


Chapter 1: SEO 101

What is it, and why is it important?

Welcome! We’re excited that you’re here!

If you already have a solid understanding of SEO and why it’s important, you can skip to Chapter 2 (though we’d still recommend skimming the best practices from Google and Bing at the end of this chapter; they’re useful refreshers).

For everyone else, this chapter will help build your foundational SEO knowledge and confidence as you move forward.

What is SEO?

SEO stands for “search engine optimization.” It’s the practice of increasing both the quality and quantity of website traffic, as well as exposure to your brand, through non-paid (also known as “organic”) search engine results.

Despite the acronym, SEO is as much about people as it is about search engines themselves. It’s about understanding what people are searching for online, the answers they are seeking, the words they’re using, and the type of content they wish to consume. Leveraging this data will allow you to provide high-quality content that your visitors will truly value.

Here’s an example. Frankie & Jo’s (a Seattle-based vegan, gluten-free ice cream shop) has heard about SEO and wants help improving how and how often they show up in organic search results. In order to help them, you need to first understand their potential customers:

  • What types of ice cream, desserts, snacks, etc. are people searching for?
  • Who is searching for these terms?
  • When are people searching for ice cream, snacks, desserts, etc.?
    • Are there seasonality trends throughout the year?
  • How are people searching for ice cream?
    • What words do they use?
    • What questions do they ask?
    • Are more searches performed on mobile devices?
  • Why are people seeking ice cream?
    • Are individuals looking for health conscious ice cream specifically or just looking to satisfy a sweet tooth?
  • Where are potential customers located — locally, nationally, or internationally?

And finally — here’s the kicker — how can you help provide the best content about ice cream to cultivate a community and fulfill what all those people are searching for?

Search engine basics

Search engines are answer machines. They scour billions of pieces of content and evaluate thousands of factors to determine which content is most likely to answer your query.

Search engines do all of this by discovering and cataloguing all available content on the Internet (web pages, PDFs, images, videos, etc.) via a process known as “crawling and indexing.”

What are “organic” search engine results?

Organic search results are search results that aren’t paid for (i.e. not advertising). These are the results that you can influence through effective SEO. Traditionally, these were the familiar “10 blue links.”

Today, search engine results pages — often referred to as “SERPs” — are filled with both more advertising and more dynamic organic results formats (called “SERP features”) than we’ve ever seen before. Some examples of SERP features are featured snippets (or answer boxes), People Also Ask boxes, image carousels, etc. New SERP features continue to emerge, driven largely by what people are seeking.

For example, if you search for “Denver weather,” you’ll see a weather forecast for the city of Denver directly in the SERP instead of a link to a site that might have that forecast. And, if you search for “pizza Denver,” you’ll see a “local pack” result made up of Denver pizza places. Convenient, right?

It’s important to remember that search engines make money from advertising. Their goal is to better solve searchers’ queries (within SERPs), to keep searchers coming back, and to keep them on the SERPs longer.

Some SERP features on Google are organic and can be influenced by SEO. These include featured snippets (a promoted organic result that displays an answer inside a box) and related questions (a.k.a. “People Also Ask” boxes).

It’s worth noting that there are many other search features that, even though they aren’t paid advertising, can’t typically be influenced by SEO. These features often have data acquired from proprietary data sources, such as Wikipedia, WebMD, and IMDb.

Why SEO is important

While paid advertising, social media, and other online platforms can generate traffic to websites, the majority of online traffic is driven by search engines.

Organic search results cover more digital real estate, appear more credible to savvy searchers, and receive way more clicks than paid advertisements. For example, of all US searches, only ~2.8% of people click on paid advertisements.

In a nutshell: SEO has ~20X more traffic opportunity than PPC on both mobile and desktop.

SEO is also one of the only online marketing channels that, when set up correctly, can continue to pay dividends over time. If you provide a solid piece of content that deserves to rank for the right keywords, your traffic can snowball over time, whereas advertising needs continuous funding to send traffic to your site.

Search engines are getting smarter, but they still need our help.

Optimizing your site will help deliver better information to search engines so that your content can be properly indexed and displayed within search results.

Should I hire an SEO professional, consultant, or agency?

Depending on your bandwidth, willingness to learn, and the complexity of your website(s), you could perform some basic SEO yourself. Or, you might discover that you would prefer the help of an expert. Either way is okay!

If you end up looking for expert help, it’s important to know that many agencies and consultants “provide SEO services,” but can vary widely in quality. Knowing how to choose a good SEO company can save you a lot of time and money, as the wrong SEO techniques can actually harm your site more than they will help.

White hat vs black hat SEO

“White hat SEO” refers to SEO techniques, best practices, and strategies that abide by search engine rules; their primary focus is to provide more value to people.

“Black hat SEO” refers to techniques and strategies that attempt to spam/fool search engines. While black hat SEO can work, it puts websites at tremendous risk of being penalized and/or de-indexed (removed from search results) and has ethical implications.

Penalized websites have bankrupted businesses. It’s just another reason to be very careful when choosing an SEO expert or agency.

Search engines share similar goals with the SEO industry

Search engines want to help you succeed. They’re actually quite supportive of efforts by the SEO community. Digital marketing conferences, such as Unbounce, MNsearch, SearchLove, and Moz’s own MozCon, regularly attract engineers and representatives from major search engines.

Google assists webmasters and SEOs through their Webmaster Central Help Forum and by hosting live office hour hangouts. (Bing, unfortunately, shut down their Webmaster Forums in 2014.)

While webmaster guidelines vary from search engine to search engine, the underlying principles stay the same: Don’t try to trick search engines. Instead, provide your visitors with a great online experience.

Google webmaster guidelines

Basic principles:

  • Make pages primarily for users, not search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website to a Google employee. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
  • Think about what makes your website unique, valuable, or engaging.

Things to avoid:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content (i.e. copied from somewhere else)
  • Cloaking — the practice of showing search engine crawlers different content than visitors.
  • Hidden text and links
  • Doorway pages — pages created to rank well for specific searches to funnel traffic to your website.

Full Google Webmaster Guidelines version here.

Bing webmaster guidelines

Basic principles:

  • Provide clear, deep, engaging, and easy-to-find content on your site.
  • Keep page titles clear and relevant.
  • Links are regarded as a signal of popularity and Bing rewards links that have grown organically.
  • Social influence and social shares are positive signals and can have an impact on how you rank organically in the long run.
  • Page speed is important, along with a positive, useful user experience.
  • Use alt attributes to describe images, so that Bing can better understand the content.

Things to avoid:

  • Thin content, pages showing mostly ads or affiliate links, or that otherwise redirect visitors away to other sites will not rank well.
  • Abusive link tactics that aim to inflate the number and nature of inbound links such as buying links, participating in link schemes, can lead to de-indexing.
  • Dynamic parameters that dirty up your URLs and cause duplicate content issues; keep URL structures clean, concise, and keyword-inclusive.
  • Long URLs with non-letter characters; make your URLs descriptive, short, and keyword rich when possible.
  • Burying links in JavaScript/Flash/Silverlight; keep content out of these as well.
  • Duplicate content
  • Keyword stuffing
  • Cloaking — the practice of showing search engine crawlers different content than visitors.

Guidelines for representing your local business on Google

These guidelines govern what you should and shouldn’t do in creating and managing your Google My Business listing(s).

Basic principles:

  • Be sure you’re eligible for inclusion in the Google My Business index; you must have a physical address, even if it’s your home address, and you must serve customers face-to-face, either at your location (like a retail store) or at theirs (like a plumber).
  • Honestly and accurately represent all aspects of your local business data, including its name, address, phone number, website address, business categories, hours of operation, and other features.

Things to avoid:

  • Creation of Google My Business listings for entities that aren’t eligible
  • Misrepresentation of any of your core business information, including “stuffing” your business name with geographic or service keywords, or creating listings for fake addresses
  • Use of PO boxes or virtual offices instead of authentic street addresses
  • Abuse of the review portion of the Google My Business listing, via fake positive reviews of your business or fake negative ones of your competitors
  • Costly, novice mistakes stemming from failure to read the fine details of Google’s guidelines

Fulfilling user intent

Understanding and fulfilling user intent is critical. When a person searches for something, they have a desired outcome. Whether it’s an answer, concert tickets, or a cat photo, that desired content is their “user intent.”

If a person performs a search for “bands,” is their intent to find musical bands, wedding bands, band saws, or something else?

Your job as an SEO is to quickly provide users with the content they desire in the format in which they desire it.

Common user intent types:

Informational: Searching for information. Example: “How old is Issa Rae?”

Navigational: Searching for a specific website. Example: “HBOGO Insecure”

Transactional: Searching to buy something. Example: “where to buy ‘We got y’all’ Insecure t-shirt”

You can get a glimpse of user intent by Googling your desired keyword(s) and evaluating the current SERP. For example, if there’s a photo carousel, it’s very likely that people searching for that keyword are looking for photos.

Also evaluate what content your top-ranking competitors are providing that you currently aren’t. How can you provide 10X the value on your website?

Providing relevant, high-quality content on your website will help you rank higher in search results, and more importantly, it will establish credibility and trust with your online audience.

Before you do any of that, you have to first understand your website’s goals to execute a strategic SEO plan.

Know your website/client’s goals

Every website is different, so take the time to really understand a specific site’s business goals. This will not only help you determine which areas of SEO you should focus on, where to track conversions, and how to set benchmarks, but it will also help you create talking points for negotiating SEO projects with clients, bosses, etc.

What will your KPIs (Key Performance Indicators) be to measure the return on SEO investment? More simply, what is your barometer to measure the success of your organic search efforts? You’ll want to have it documented, even if it’s this simple:

For the website ________________________, my primary SEO KPI is _______________.

Here are a few common KPIs to get you started:

  • Sales
  • Downloads
  • Email signups
  • Contact form submissions
  • Phone calls

And if your business has a local component, you’ll want to define KPIs for your Google My Business listings, as well. These might include:

  • Clicks-to-call
  • Clicks-to-website
  • Clicks-for-driving-directions

Notice how “Traffic” and “Ranking” are not on the above lists? This is because, for most websites, ranking well for keywords and increasing traffic won’t matter if the new traffic doesn’t convert (to help you reach the site’s KPI goals).

You don’t want to send 1,000 people to your website a month and have only 3 people convert (to customers). You want to send 300 people to your site a month and have 40 people convert.
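The arithmetic behind that example, spelled out:

```python
def conversion_rate(visitors, conversions):
    """Fraction of visitors who convert."""
    return conversions / visitors

# 1,000 visitors with 3 conversions vs. 300 visitors with 40 conversions
print(f"{conversion_rate(1000, 3):.1%}")   # 0.3%
print(f"{conversion_rate(300, 40):.1%}")   # 13.3%
```

Less than a third of the traffic, but more than ten times the customers — which is why conversion-oriented KPIs, not raw traffic, belong on the list above.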

This guide will help you become more data-driven in your SEO efforts. Rather than haphazardly throwing arrows all over the place (and getting lucky every once in awhile), you’ll put more wood behind fewer arrows.

Grab a bow (and some coffee); let’s dive into Chapter 2 (Crawlers & Indexation).


We’re looking forward to hearing your thoughts on this draft of Chapter 1. What works? Anything you feel could be added or explained differently? Let us know your suggestions, questions, and thoughts in the comments.




A Look Back at a Great 2017: 5 Major Moz Product Investments and a Sneak Peek Into 2018

Posted by adamf

It’s hard to believe that 2017 is already past. We entered the year with big ambitions and we’ve made some great strides. As has become tradition, I’ve compiled a rundown of some of the most interesting updates that you may have seen (or missed) this past year. We’ve intentionally focused on significant product updates, but I’ve also shared a little about some newer programs that provide value for customers in different ways.

TL;DR, here are some of the larger and more interesting additions to Moz in 2017:

  1. Keywords by Site: Keyword Explorer adds site-based keyword research and competitive intelligence
  2. Site Crawl V2: Overhauled Site Crawl for better auditing and workflow
  3. Major investments in infrastructure: Better performance and resilience across the Moz toolset
  4. New instructor-led training programs: Targeted classes to level-up your SEO knowledge
  5. Customer Success: Custom walkthroughs to help you get the most out of Moz
  6. Bonus! MozPod: Moz’s new free podcast keeps you up to date on the latest industry topics and trends

Big updates

This year and last, we’ve placed a disproportionate focus on releasing large infrastructural improvements, new datasets, and foundational product updates. We feel these are crucial elements that serve the core needs of SEOs and will fuel frequent improvements and iterations for years to come.

To kick things off, I wanted to share some details about two big updates from 2017.


1) Keywords by Site: Leveling up keyword research and intelligence

Rank tracking provides useful benchmarks and insights for specific, targeted keywords, but you can’t track all of the keywords that are relevant to you. Sometimes you need a broader look at how visible your sites (and your competitors’ sites) are in Google results.

We built Keywords by Site to provide this powerful view into your Google presence. This brand-new dataset in Moz significantly extends Keyword Explorer and improves the quality of results in many other areas throughout Moz Pro. Our US corpus currently includes 40 million Google SERPs updated every two weeks, and allows you to do the following:

See how visible your site is in Google results

This view not only shows how authoritative a site is from a linking perspective, but also shows how prominent a site is in Google search results.

Compare your ranking prominence to your competitors

Compare up to three sites to get a feel for their relative scale of visibility and keyword ranking overlap. Click on any section in the Venn diagram to view the keywords that fall into that section.

Dig deep: Sort, filter, and find opportunities, then stash them in keyword lists

For example, let’s say you’re looking to determine which pages or content on your site might only require a little nudge to garner meaningful search visibility and traffic. Run a report for your site in Keyword Explorer and then use the filters to quickly home in on these opportunities:

Our focus on data quality

We’ve made a few decisions to help ensure the freshness and accuracy of our keyword corpus. These decisions add to the cost and work of maintaining this dataset, but we feel they make a discernible difference in quality.

  • We recollect all of our keyword data every 2 weeks. This means the rankings you see are fresher and closer to the live results on the day you’re researching.
  • We cycle up to 15 million of our keywords out on a monthly basis. This means that as new keywords or terms trend up in popularity, we add them to our corpus, replacing terms that are no longer getting much search volume.

A few improvements we’ve made since launch:

  • Keyword recommendations in your campaigns (tracked sites) are much improved and now backed by our keyword corpus.
  • These keyword suggestions are also included in your weekly insights, suggesting new keywords worth tracking and pages worth optimizing.
  • Coming very soon: We’re also on the cusp of launching keyword corpuses for the UK, Canada, and Australia. Stay tuned.

A few resources to help you get more from Keywords by Site:

Try out Keywords by Site!


2) Site Crawl V2: Big enhancements to site crawling and auditing

Another significant project we completed in 2017 was a complete rewrite of our aging Site Crawler. In short, our new crawler is faster, more reliable, can crawl more pages, and surfaces more issues. We’ve also made some enhancements to the workflow, to make regular crawls more customizable and easy to manage. Here are a few highlights:

Week-over-week crawl comparisons

Our new crawler keeps tabs on what happened in your previous crawl to show you which specific issues are no longer present, and which are brand new.

Ignore (to hide) individual issues or whole issue types

This feature was added in response to a bunch of customer requests. While Moz does its best to call out the issues and priorities that apply to most sites, not all sites or SEOs have the same needs. For example, if you regularly noindex a big portion of your site, you don’t need us to keep reminding you that you’ve applied noindex to a huge number of pages. If you don’t want them showing in your reports, just ignore individual issues or the entire issue type.

Another workflow improvement we added was the ability to mark an issue as fixed. This allows you to get it out of your way until the next crawl runs and verifies the fix.

All Pages view with improved sorting and filtering

If you’re prioritizing across a large number of pages or trying to track down an issue in a certain area of your site, you can now sort all pages crawled by Issue Count, Page Authority, or Crawl Depth. You can also filter to show, for instance, all pages in the /blog section of your site that are redirects and have a crawl issue.

Recrawl to verify fixes

Moz’s crawler monitors your site by crawling it every week. But if you’ve made some changes and want to verify them, you can now recrawl your site in between regular weekly crawls instead of waiting for the next crawl to start.

Seven new issues checked and tracked

These include such favorites as detecting Thin Content, Redirect Chains, and Slow Pages. While we were at it, we revamped duplicate page detection and improved the UI to help you better analyze clusters of duplicate content and figure out which page should be canonical.

A few resources to help you get more from Site Crawl:


3) Major investments in infrastructure for performance and resilience

You may not have directly noticed many of the updates we’ve made this year. We made some significant investments in Moz Pro and Moz Local to make them faster, more reliable, and allow us to build new features more quickly. But here are a few tangible manifestations of these efforts:

“Infinite” history on organic Moz Pro search traffic reports

Okay, infinite is a bit of a stretch, but we used to only show the last 12 months or weeks of data. Now we’ll show data from the very inception of a campaign, broken down by weeks or months. This is made possible by an updated architecture that makes full historical data easy to surface and present in the application. It also allows for custom access to selected date ranges.

Also worth noting is that the new visualization shows how many different pages were receiving organic search traffic in context with total organic search traffic. This can help you figure out whether a traffic increase was due to improved rankings across many pages, or just a spike in organic traffic for one or a few pages.

More timely and reliable access to Moz Local data at all scales

As Moz Local has brought on more and bigger customers with large numbers of locations, the team discovered a need to bolster systems for speed and reliability. A completely rebuilt scheduling system and improved core location data systems help ensure all of your data is collected and easy to access when you need it.

Improved local data distribution

Moz Local distributes your location data through myriad partners, each of which has its own formats and interfaces. The Local team updated and fine-tuned those third-party connections to improve the quality of the data and speed of distribution.


4) New instructor-led training programs: Never stop learning

Not all of our improvements this year have shown up in the product. Another investment we’ve made is in training. We’ve gotten a lot of requests for this over the years and are finally delivering. Brian Childs, our trainer extraordinaire, has built this program from the ground up. It includes:

  • Boot camps to build up core skills
  • Advanced Seminars to dig into more intensive topics
  • Custom Training for businesses that want a more tailored approach

We have even more ambitious plans for 2018, so if training interests you, check out all of our training offerings here.


5) Customer Success: Helping customers get the most out of Moz

Our customer success program took off this year and has one core purpose: to help customers get maximum value from Moz. Whether you’re a long-time customer looking to explore new features or you’re brand new to Moz and figuring out how to get started, our success team offers product webinars every week, as well as one-on-one product walkthroughs tailored to your needs, interests, and experience level.

The US members of our customer success team hone their skills at a local chocolate factory (Not pictured: our fantastic team members in the UK, Australia, and Dubai)

If you want to learn more about Moz Pro, check out a webinar or schedule a walkthrough.


Bonus! MozPod: Moz’s new free podcast made its debut

Okay, this really strays from product news, but another fun project that’s been gaining momentum is MozPod. This came about as a side passion project by our ever-ambitious head trainer. Lord knows that SEO and digital marketing are fast-moving and ever-changing; to help you keep up on hot topics and new developments, we’ve started MozPod. This podcast covers a range of topics, drawing from the brains of key folks in the industry. With topics ranging from structured data and app store optimization to machine learning and even blockchain, there’s always something interesting to learn about.

Join Brian every week for a new topic and guest:


What’s next?

We have a lot planned for 2018 — probably way too much. But one thing I can promise is that it won’t be a dull year. I prefer not to get too specific about projects that we’ve not yet started, but here are a few things already in the works:

  • A significant upgrade to our link data and toolset
  • On-demand Site Crawl
  • Added keyword research corpuses for the UK, Australia, and Canada
  • Expanded distribution channels for local to include Facebook, Waze, and Uber
  • More measurement and analytics features around local rankings, categories, & keywords
  • Verticalized solutions to address specific local search needs in the restaurant, hospitality, financial, legal, & medical sectors

On top of these and many other features we’re considering, we also plan to make it a lot easier for you to use our products. Right now, we know it can be a bit disjointed within and between products. We plan to change that.

We’ve also waited too long to solve for some specific needs of our agency customers. We’re prioritizing some key projects that’ll make their jobs easier and their relationships with Moz more valuable.


Thank you!

Before I go, I just want to thank you all for sharing your support, suggestions, and critical feedback. We strive to build the best SEO data and platform for our diverse and passionate customers. We could not succeed without you. If you’d like to be a part of making Moz a better platform, please let us know. We often reach out to customers and community members for feedback and insight, so if you’re the type who likes to participate in user research studies, customer interviews, beta tests, or surveys, please volunteer here.



How to Face 3 Fundamental Challenges Standing Between SEOs and Clients/Bosses

Posted by sergeystefoglo

Every other year, the good people at Moz conduct a survey with one goal in mind: understand what we (SEOs) want to read more of. If you haven’t seen the results from 2017, you can view them here.

The results contain many great questions, challenges, and roadblocks that SEOs face today. As I was reading the 2017 Moz Blog readership survey, a common thread stood out to me: there are disconnects on fundamental topics between SEOs and clients and/or bosses. Since I work at an agency, I’ll use “client” through the rest of this article; if you work in-house, replace that with “boss.”

Check out this list:

I can definitely relate to these challenges. I’ve been at Distilled for a few years now, and worked in other firms before — these challenges are real, and they’re tough. Through sharing my experience dealing with these challenges, I hope to help other consultants and SEOs to overcome them.

In particular, I want to discuss three points of disconnect that happen between SEOs and clients.

  1. My client doesn’t understand the value of SEO and it’s difficult to prove ROI.
  2. My client doesn’t understand how SEO works and I always have to justify my actions.
  3. My client and I disagree about whether link building is the right answer.

Keep in mind, these are purely my own experiences. This doesn’t mean these answers are the end-all-be-all. In fact, I would enjoy starting a conversation around these challenges with any of you, so please grab me at SearchLove (plug: our San Diego conference is selling out quickly and is my favorite) or MozCon to bounce off more ideas!

1. My client doesn’t understand the value of SEO and it’s difficult to prove ROI

The value of SEO lies in its influence on organic search, a channel that remains extremely valuable. In fact, SEO is more prominent in 2018 than it has ever been. To illustrate this, I borrowed some figures from Rand’s write-up on the state of organic search at the end of 2017.

  • Year over year, the period of January–October 2017 saw 13% more search volume than the same months in 2016.
  • That 13% represents 54 billion more queries, which is just about the total number of searches Google did, worldwide, in 2003.

Organic search brings in more qualified visitors (at a more consistent rate) than any other digital marketing channel. In other words, more people are searching for things than ever before, which results in more potential to grow organic traffic. How do we grow organic traffic? By making sure our sites are discoverable by Google and clearly answer user queries with good content.

Source: Search Engine Land

When I first started out in SEO, I used to think I was making all my clients all the moneys. “Yes, Bill, if you hire me and we do this SEO thing I will increase rankings and sessions, and you will make an extra x dollars!” I used to send estimates on ROI with every single project I pitched (even if it wasn’t asked of me).

After a few years in the industry, I began questioning the value of providing estimates on ROI. Specifically, I was having trouble determining if I was doing the right thing by providing a number that was, at best, an educated guess. It would stress me out, and I would feel like I was tied to that number. It also turns out that not worrying about things outside of our control helps keep stress levels down.

I’m at a point now where I’ve realized the purpose of providing an estimated ROI. Our job as consultants is to effect change. We need to get people to take action. If what it takes to get sign-off is to predict an uplift, that’s totally fine. In fact, it’s expected. Here’s how that conversation might look.

In terms of a formula for forecasting uplifts in SEO, Mike King said it best:

“Forecast modeling is questionable at best. It doesn’t get much better than this:”

  • Traffic = Search Volume x CTR
  • Number of Conversions = Conversion Rate x Traffic
  • Dollar Value = # of Conversions x Avg Conversion Value
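As a minimal sketch of how those formulas chain together (every input below is a made-up estimate, not data from the post, so the output is an educated guess at best):

```python
# Back-of-the-envelope SEO forecast: each line chains one of the formulas above.
def forecast(search_volume, ctr, conversion_rate, avg_conversion_value):
    traffic = search_volume * ctr                      # Traffic = Search Volume x CTR
    conversions = conversion_rate * traffic            # Conversions = Conversion Rate x Traffic
    dollar_value = conversions * avg_conversion_value  # Dollar Value = Conversions x Avg Value
    return traffic, conversions, dollar_value

# e.g. 10,000 monthly searches, 5% CTR, 2% conversion rate, $150 per conversion
traffic, conversions, value = forecast(10_000, 0.05, 0.02, 150)
print(f"~{traffic:.0f} visits, ~{conversions:.0f} conversions, ~${value:.0f}/month")
```

Note how quickly the uncertainty compounds: a small error in any one estimate multiplies through every line after it, which is exactly why "questionable at best" is the right caveat.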

TL;DR:

  • Don’t overthink this too much — if you do, you’ll get stuck in the weeds.
  • When requested, provide the prediction to get sign-off and quickly move on to action.
  • For more in-depth thoughts on this, read Will Critchlow’s recent post on forecast modeling.
  • Remember to think about seasonality, overall trends, and the fact that few brands exist in a vacuum. What are your competitors doing and how will that affect you?

2. My client doesn’t understand how SEO works and I always have to justify my actions

Does your client actually not understand how SEO works? Or, could it be that you don’t understand what they need from you? Perhaps you haven’t considered what they are struggling with at the moment?

I’ve been there — constantly needing to justify why you’re working on a project or why SEO should be a focus. It isn’t easy to be in this position. But more often than not, I’ve realized that what helps the most is to take a step back and ask some fundamental questions.

A great place to start would be asking:

  • What are the things my client is concerned about?
  • What is my client being graded on by their boss?
  • Is my client under pressure for some reason?

The answers to these questions should shine some clarity on the situation (the why or the motivation behind the constant questioning). Some of the reasons why could be:

  • You might know more about SEO than your client, but they know more about their company. This means they may see the bigger picture between investments, returns, activities, and the interplay between them all.
  • SEO might be 20% of what your client needs to think about — imagine a VP of marketing who needs to account for 5–10 different channels.
  • If you didn’t get sign-off or budget for a project, it doesn’t mean your request was without merit. It just means someone else made a pitch better aligned with their larger goals.

When you have some answers, ask yourself, “How can I make what I’m doing align to what they’re focused on?” This will ensure you are hitting the nail on the head and providing useful insight instead of more confusion.

That conversation might look like this:

TL;DR

  • This is a good problem to have — it means you have a chance to effect change.
  • Also, it means that your client is interested in your work!
  • It’s important to clarify the why before getting too deep into the weeds. Rarely will the why be “to learn SEO.”

3. My client and I disagree about whether link building is the right answer

The topic of whether links (and by extension, link building) are important is perhaps the most talked about topic in SEO. To put it simply, there are many different opinions and not one “go-to” answer. In 2017 alone there have been many conflicting posts/talks on the state of links.

The quick answer to the challenge we face as SEOs when it comes to links is: unless authority is holding you back, do something else.

That answer is a bit brief, though, and it doesn’t help much if your client is constantly bringing up links. In that case, I think there are a few points to consider.

  1. If you’re a small business, getting links is a legitimate challenge and can significantly impact your rankings. The problem is that it’s difficult to get links for a small business. Luckily, we have some experts in our field giving out ideas for this. Check out this, this, and this.
  2. If you’re an established brand (with authority), links should not be a priority. Often, links will get prioritized because they are easier to attain, measurable (kind of), and comfortable. Don’t fall into this trap! Go with the recommendation above: do other impactful work that you have control over first.
    1. Reasoning: Links tie success to a metric we have no control over — this gives us an excuse to not be accountable for success, which is bad.
    2. Reasoning: Links reduce an extremely complicated situation into a single variable — this gives us an excuse not to try and understand everything (which is also bad).
  3. It’s good to think about the topic of links and how it’s related to brand. Big brands get talked about (and linked to) more than small brands. Perhaps the focus should be “build your brand” instead of “gain some links”.
  4. If your client persists on the topic of links, it might be easier to paint a realistic picture for them. This conversation might look like this:

TL;DR

  • There are many opinions on the state of links in 2018: don’t get distracted by all the noise.
  • If you’re a small business, there are some great tactics for building links that don’t take a ton of time and are probably worth it.
  • If you’re an established brand with more authority, do other impactful work that’s in your control first.
  • If you are constantly getting asked about links from your client, paint a realistic picture.

Conclusion

If you’ve made it this far, I’m really interested in hearing how you deal with these issues within your company. Are there specific challenges you face within the topics of ROI, educating on SEO, getting sign-off, or link building? How can we start tackling these problems more as an industry?



What Does It Mean to "Write for SEO" in 2018? – Whiteboard Friday

Posted by randfish

“Those who do not learn from history are doomed to repeat its mistakes” — it’s a quote that’s actually quite applicable when it comes to writing for SEO. Much of the advice given to copywriters, journalists, editors, and other content creators for SEO writing is dangerously out of date, leaning on practices that were once tried and true but that could now get your site penalized.

In this edition of Whiteboard Friday, we hope you enjoy a brief history lesson on what should be avoided, what used to work and no longer does, and a brief 5-step process you should start using today for writing content that’ll get you to the front of the SERPs.

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about writing for SEO and what that means in 2018.

So writing for SEO has had a long history, and it meant something many years ago that it does not mean today. Unfortunately, I see a lot of bad advice, terrible advice out there for journalists and editors and authors of all kinds about what you need to do in terms of writing for SEO, meaning writing to get you to the top of search engines.

“Writing for SEO” in 2001

Now, let’s be clear, some of this stuff is mired in pure mythology. But some of it is mired in historical fact that just hasn’t been updated. So let’s talk about what writing for SEO used to be back in 2001, how it evolved in sort of the middle era of 2008, let’s say, and then what it means today in 2018.

So, back in the day, writing for SEO did mean things like…

I. Keyword stuffing

If you wanted to rank highly in early search engines, especially the late ’90s into the early 2000s, keyword stuffing was a real tactic that really did have effectiveness. So SEOs would cram keywords into all sorts of tags and locations.

II. They would use and reuse a bunch of different variants, slight keyword variants

So if I’m targeting the word blue watches, I would have blue watch, blue watches, blue watch accessory, blue watch accessories, blue watches accessory, blue watches accessories, ridiculous little variants on plurals because the search engines were not great at figuring out that all these things sort of had the same intent and meant the same thing. So raw, rough keyword matching, exact keyword matching was part of SEO.

III. Keyword use in every tag possible

If there was a tag, you’d cram keywords into it.

IV. Domain name and subdomain keyword use

So this is why you saw that brands would be outranked by, to use our example, blue-watch-accessories.bluewatchaccessories.info, that kind of silly stuff would be ranking. Some of it even maintained for a while.

V. SEO writing was writing for engines and then trying not to annoy or piss off users

So, a lot of the time, people would want to cloak. They’d want to show one set of content to the search engines and another set to searchers, to actual users, because they knew that if they showed this dense, keyword-stuffed content to users, they’d be turned off and they wouldn’t find it credible and they’d go somewhere else.

“Writing for SEO” in 2008

2008, we evolve on a bunch of these fronts, but not all of them and certainly not perfectly.

I. Keywords are still important in important locations

II. Exact matching still matters in a lot of places. So people were crafting unique pages even for keywords that shared the same intent.

Blue watches and blue timepieces might have two different pages. Blue watch and blue watches could even have two separate pages and do effectively well in 2008. 2018, that’s not the case anymore.

III. Domain names were definitely less powerful, subdomains more so, but still influential

They still had some play in the engines. You still saw a lot of debates back in ’08 about whether to create a keyword-rich domain.

IV. Since links in 2008 were overwhelmingly powerful compared to on-page signals, writing in order to earn links was incredibly prized

In fact, it still is, but we’ll talk about the evolution of that a little bit.

“Writing for SEO” in 2018

So now let’s jump another decade forward. We’re in 2018. This year, what does writing for SEO mean? Well, a bunch of things.

I. Solving the searcher’s query matters most — writing that doesn’t do this tends not to rank well (for long)

Because engines, Google in particular but Bing as well, have gotten so much better at essentially optimizing for solving the searcher’s task, helping searchers accomplish the thing that they wanted to accomplish, the writing that does the best job of solving the searcher’s task tends to be the most highly prized. Writing that doesn’t do that doesn’t tend to rank well, or doesn’t tend to rank for long. You can sometimes get to the top of the search results, but you will almost certainly be taken out by someone who does a great job of solving the searcher’s query.

II. Intent matching matters a lot more in 2018 than exact keyword matching.

Today, no credible SEO would tell you to create a page for blue watch and blue watches or blue watch accessories and blue watch accessory or even blue timepieces and blue watches, maybe if you’re targeting clocks too. In this case, it’s really about figuring out what is the searcher’s intent. If many keywords share the same intent, you know what? We’re going to go ahead and create a single page that serves that intent and all of the keywords or at least many of the keywords that that intent is represented by.

III. Only a few tags are still absolutely crucial to doing SEO correctly.

So SEO writing today, there are really only two that are not very fungible. Those are the title element and the body content. That’s not to say that you can’t rank without using the keyword in these two places, just that it would be inadvisable to do so. This is both because of search engines and also because of searchers. When you see the keyword that you search for in the title element of the page in the search results, you are more inclined to click on it than if you don’t see it. So it’s possible that some click-baity headline could outrank a keyword-rich headline. But the best SEO writers are mixing both of those. We have a Whiteboard Friday about headline writing on just that topic.

A few other ones, however, a few other tags are nice to have in 2018 still. Those include:

Headline tags (the H1, the H2),

URL field, so if you can make your URL include the words and phrases that people are searching for, that is mildly helpful. It’s both helpful for searchers who see the URL and would think, “Oh, okay, that is referring to the thing that I want,” as well as for people who copy and paste the URL and share it with each other, or people who link with the URL and, thus, the anchor text is carried across by that URL and those keywords in there.

The meta description, not used for rankings, but it is read by searchers. When they see a meta description that includes the words and phrases that they’ve queried, they are more likely to think this will be a relevant result and more likely to click it. More clicks, as long as the engagement is high, tends to mean better rankings.

The image alt attribute, which is helpful both for regular search results, but particularly helpful for Google Images, which, as you may know from watching Whiteboard Friday, Google Images gets a tremendous amount of search traffic even on its own.

IV. Employing words, phrases, and concepts that Google’s identified as sort of commonly associated with the query

This can provide a significant boost. We’ve seen some really interesting experimentation on this front, where folks will essentially take a piece of content and add in the missing words and phrases that other pages ranking highly in Google for that query commonly include.

In our example, I frequently use “New York neighborhoods,” and a page that’s missing words like Brooklyn, Harlem, Manhattan, Staten Island, that’s weird, right? Google is going to be much more likely to rank the page that includes these borough names than one that doesn’t for that particular query, because they’ve learned to associate that text with relevance for the query “New York neighborhoods.”

What I do want to make clear here is this does not mean LSI or some other particular tactic. LSI is an old-school, I think late ’80s, early ’90s computer tactic, software tactic for identifying words that are semantically connected to each other. There’s no reason you have to use this old-school junk methodology that became like pseudoscience in the SEO world and had a recent revival. But you should be using words and phrases that Google has related to a particular keyword. Related topics is a great thing to do. You can find some via the Moz Bar. We did a Whiteboard Friday on related topics, so you can check that out.
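As a simplified illustration of that idea (a crude co-occurrence check, not LSI and certainly not Google's actual method; the page texts below are made up), you could compare the vocabulary of top-ranking pages against your own draft to surface commonly associated terms you're missing:

```python
import re
from collections import Counter

def missing_related_terms(top_ranking_texts, my_text, min_pages=2):
    """Terms appearing on at least `min_pages` of the top-ranking pages
    but absent from my draft. Deliberately crude: a real version would
    at minimum filter stopwords and weight by frequency."""
    def words(text):
        return set(re.findall(r"[a-z']+", text.lower()))

    counts = Counter()
    for text in top_ranking_texts:
        counts.update(words(text))  # count each term once per page
    mine = words(my_text)
    return {w for w, c in counts.items() if c >= min_pages and w not in mine}

competitors = [
    "new york neighborhoods like brooklyn harlem and manhattan",
    "a guide to manhattan brooklyn and staten island neighborhoods",
]
draft = "the best new york neighborhoods to visit"
print(sorted(missing_related_terms(competitors, draft)))
# → ['and', 'brooklyn', 'manhattan']
```

Even this toy version flags the missing borough names, and the stray "and" in the output shows why real related-topics tools need stopword filtering and smarter scoring.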

V. The user experience of the writing and content matters more than ever, and that is due to engagement metrics

Essentially, Google is able to see that people who click on a particular result are less likely to click the back button and choose a different result or more likely to stay on that page or site and engage further with that content and solve their whole task. That is a good sign to Google, and they want to rank more of those.

A brief “SEO writing” process for 2018

So, pragmatically, what does this history and evolution mean? Well, I think we can craft a brief sort of SEO writing process for 2018 from this. This is what I recommend. If you can do nothing else, do these five steps when you are writing for SEO, and you will tend to have more success than most of your competition.

Step 1: Assemble all the keywords that a page is targeting

So there should be a list of them. They should all share the same intent. You get all those keywords listed out.

Step 2: You list what the searchers are actually trying to accomplish when they search those queries

So someone searched for blue watches. What do they want? Information about them, they want to see different models, they want to know who makes them, they want to buy them, they want to see what the costs are like, they want to see where they can get them online, probably all of those things. Those are the intents behind those queries.

Step 3: Create a visual layout

Here’s going to be our headline. Here’s our subheadline. We’re going to put this important key concept up at the top in a callout box. We’re going to have this crucial visual next up. This is how we’re going to address all of those searcher intents on the page visually with content, written or otherwise.

Step 4: Write first and then go add the keywords and the crucial, related terms, phrases, top concepts, topics that you want into the page

The ones that will hopefully help boost your SEO, rather than writing first with the keywords and topics in mind. You can have a little bit of that, but this would be what I suggest.

Step 5: Craft the hook, the hook that will make influential people and publications in this space likely to amplify, likely to link

Because, in 2018, links still do matter, still are an important part of SEO.

If you follow this and learn from this history, I think you’ll do a much better job, generally speaking, of writing for SEO than a lot of the common wisdom out there. All right, everyone. Look forward to your thoughts in the comments. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

