Archives for: seo

Your Daily SEO Fix: The Keyword Research Edition

Posted by FeliciaCrawford

Back in 2015, we had an epiphany. Every day, via every channel, you — our readers, subscribers, community members, and social followers — would ask us really good questions. (You’re an incredibly intelligent, friendly, inquisitive bunch, you know that? It’s humbling.) A lot of those questions were about how to accomplish your SEO goals, and it got us thinking.

Moz is an educational resource, it’s true, but we also offer a suite of tools (both free and paid) that can help you achieve those goals. Why not provide a space for those two things to converge? And thus, the idea of the Daily SEO Fix was born: quick 1–3 minute videos shared throughout the week that feature Mozzers describing how to solve problems using the tools we know best.

It’s two years later now, and both our tools and our industry have evolved. Time to revisit this idea, no?

Today’s series of Daily SEO Fixes features our keyword research tool, Keyword Explorer. Perhaps you’ve heard us mention it a couple of times — we sure like it, and we think it could help you, too. And you don’t have to be a subscriber to check this puppy out — anyone on the whole wide Internet can use it to research two queries a day for free. If you’re logged into your Moz community account, you get five free queries.

Open Keyword Explorer in a new tab!

Queue it up in another browser tab to follow along, if you’d like!*

*Keep in mind that some features, such as lists, are only available when you’re also a Moz Pro Medium subscriber or above. If you’re bursting with curiosity, you can always check out the 30-day free trial, which features everything you’d see in a paid subscription… but for free. 🙂


Fix #1: Nitty-gritty keyword research

Let’s get down to brass tacks: your keyword research. Janisha’s here to walk you through…

  • Researching your keyword;
  • Determining whether it strikes the right balance of volume, difficulty, and opportunity;
  • Quickly analyzing the SERPs for your query to see what factors could be affecting your ranking opportunity;
  • Finding keyword suggestions ripe with promise; and
  • Organizing your newly discovered keywords into lists.

Fix #2: Finding question keywords to boost your content & win featured snippets

When you answer the questions searchers are actually asking, you’ve got way more opportunity to rank, earn qualified traffic to your site, and even win yourself a featured snippet or two. Brittani shows you how to broaden your page content by speaking to your audience’s most burning questions.


Fix #3: Updating your keyword metrics on a whim

If you’re hot on the trail of a good ranking, you don’t have the time or patience to wait for your metrics to update on their own. Kristina shows you how to get that sweet, sweet, up-to-date data after you’ve organized a list of related keywords in Keyword Explorer.


Fix #4: Moving curated keyword lists to Moz Pro for long-term tracking

If you’re interested in tracking the overall SEO progress of a site and digging into the nuts and bolts of your keyword data, you’ll want to pay attention. Kristina’s back to explain how to import your curated Keyword Explorer lists into a Moz Pro campaign to track long-term rankings for a specific site.


That’s a wrap for Week 1!

There you have it — four ways to level up your keyword research and knock some to-dos off your list. We’ll be back next Thursday with more fixes from a new group of Mozzers; keep an eye on our social channels for a sneak peek, and maybe try a free spin of Moz Pro if you’d like to follow along.

Curious about what else you can do with Keyword Explorer? Here are some fab resources:

And if you’re fairly new to the game or looking for ways to grow your team members’ SEO knowledge, be sure to check out our classes on introductory SEO, keyword research, site audits, link building, reporting, and more.

See you next week, friends!


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


SEO Rankings Drop: A Step-by-Step Guide to Recovery

Posted by KristinaKledzik

A few weeks ago, rankings for pages on a key section of my site dropped an average of a full position in one day. I’ve been an SEO for 7 years now, but I still ran around like a chicken with my head cut off, panicked that I wouldn’t be able to figure out my mistake. There are so many things that could’ve gone wrong: Did I or my team unintentionally mess with internal link equity? Did we lose links? Did one of Google’s now-constant algorithm updates screw me over?

Since the drop happened to a group of pages, I made the assumption it had to do with our site or page structure (it didn’t). I wasted a good day focused on technical SEO. Once I realized my error, I decided to put together a guide to make sure that next time, I’ll do my research effectively. And you, my friends, will reap the rewards.

First, make sure there’s actually a rankings change

Okay, I have to start with this: before you go down this rabbit hole of rankings changes, make sure there was actually a rankings change. Your rankings tracker may not have localized properly, or have picked up on one of Google’s rankings experiments or personalization.

Find out:

  • Has organic traffic dropped to the affected page(s)?
    • We’re starting here because this is the most reliable data you have about your site. Google Search Console and rankings trackers are trying to look at what Google’s doing; your web analytics tool is just tracking user counts.
    • Compare organic traffic to the affected page(s) week-over-week both before and after the drop, making sure to compare similar days of the week.
    • Is the drop more significant than most week-over-week changes? (See the sketch after this checklist for one way to quantify that.)
    • Is the drop over a holiday weekend? Is there any reason search volume could’ve dropped?
  • Does Google Search Console show a similar rankings drop?
    • Use the Search Analytics section to see clicks, impressions, and average position for a given keyword, page, or combo.
    • Does GSC show a similar rankings drop to what you saw in your rankings tracker? (Make sure to run the report with the selected keyword(s).)
  • Does your rankings tracker show a sustained rankings drop?
    • I recommend tracking rankings daily for your important keywords, so you’ll know if the rankings drop is sustained within a few days.
    • If you’re looking for a tool recommendation, I’m loving Stat.
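
If you keep exports of your analytics data, a few lines of pandas make that week-over-week comparison quick to repeat. Here’s a minimal sketch, assuming a CSV export with “date” and “organic_sessions” columns (both names are hypothetical; rename them to match whatever your analytics tool actually exports):

```python
# Minimal sketch: week-over-week organic traffic check for an affected page.
# Assumes a CSV with "date" and "organic_sessions" columns -- adjust to your export.
import pandas as pd

df = pd.read_csv("organic_sessions_affected_page.csv", parse_dates=["date"])
df = df.sort_values("date").set_index("date")

# Shift by 7 days so Mondays compare to Mondays, Saturdays to Saturdays, etc.
df["sessions_prior_week"] = df["organic_sessions"].shift(7)
df["wow_change_pct"] = (
    (df["organic_sessions"] - df["sessions_prior_week"]) / df["sessions_prior_week"] * 100
)

# Flag days where the drop is bigger than typical week-over-week noise.
# (Two standard deviations below the average change is an arbitrary threshold.)
threshold = df["wow_change_pct"].mean() - 2 * df["wow_change_pct"].std()
print(df[df["wow_change_pct"] < threshold][["organic_sessions", "wow_change_pct"]])
```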

If you’ve just seen a drop in your rankings tool and your traffic and GSC clicks are still up, keep an eye on things and try not to panic. I’ve seen too many natural fluctuations to go to my boss as soon as I see an issue.

But if you’re seeing that there’s a rankings change, start going through this guide.

Figure out what went wrong

1. Did Google update their algorithm?

Google updates its algorithm at least once a day, and most of those changes roll out silently. The good news is, there are leagues of SEOs dedicated to documenting those changes.

  • Are there any SEO articles or blogs talking about a change around the date you saw the change? Check out:
  • Do you have any SEO friends who have seen a change? Pro tip: Make friends with SEOs who run sites similar to yours, or in your industry. I can’t tell you how helpful it’s been to talk frankly about tests I’d like to run with SEOs who’ve run similar tests.

If this is your issue…

The bad news here is that if Google’s updated their algorithm, you’re going to have to change your approach to SEO in one way or another.

Make sure you understand:

Your next move is to put together a strategy to either pull yourself out of this penalty, or at the very least to protect your site from the next one.

2. Did your site lose links?

Pull the lost links report from Ahrefs or Majestic. They’re the most reputable link indexes out there, and they’re updated daily. (A sketch after the checklist below shows one way to diff two exports.)

  • Has there been a noticeable site-wide link drop?
  • Has there been a noticeable link drop to the page or group of pages you’ve seen a rankings change for?
  • Has there been a noticeable link drop to pages on your site that link to the page or group of pages you’ve seen a rankings change for?
    • Run Screaming Frog on your site to find which pages link internally to the affected pages. Check internal link counts for pages one link away from affected pages.
  • Has there been a noticeable link drop to inbound links to the page or group of pages you’ve seen a rankings change for?
    • Use Ahrefs or Majestic to find the sites that link to your affected pages.
      • Have any of them suffered recent link drops?
      • Have they recently updated their site? Did that change their URLs, navigation structure, or on-page content?
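
To make the “did we lose links?” check concrete, here’s a minimal sketch that diffs two backlink exports (say, one pulled before the drop and one after). The file names and the “referring_page” / “target_url” columns are assumptions; match them to whatever your Ahrefs or Majestic export actually contains.

```python
# Minimal sketch: diff two backlink exports to find referring pages that disappeared.
# File names and column names are assumptions -- adjust to your export.
import pandas as pd

before = pd.read_csv("backlinks_before.csv")
after = pd.read_csv("backlinks_after.csv")

# Referring pages present in the old export but missing from the new one.
lost = before[~before["referring_page"].isin(after["referring_page"])]

# Group lost links by the page on your site they pointed at,
# so you can see whether the affected pages took the hit.
print(
    lost.groupby("target_url")["referring_page"]
    .count()
    .sort_values(ascending=False)
    .head(25)
)
```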

If this is your issue…

The key here is to figure out who you lost links from and why, so you can try to regain or replace them.

  • Can you get the links back?
    • Do you have a relationship with the site owner who provided the links? Reaching out may help.
    • Were the links removed during a site update? Maybe it was accidental. Reach out and see if you can convince them to replace them.
    • Were the links removed and replaced with links to a different source? Investigate the new source — how can you make your links more appealing than theirs? Update your content and reach out to the linking site owner.
  • Can you convince your internal team to invest in new links to quickly replace the old ones?
    • Show your manager(s) how much a drop in link count affected your rankings and ask for the resources it’ll take to replace them.
    • This will be tricky if you were the one to build the now-lost links in the first place, so if you did, make sure you’ve put together a strategy to build longer-term ones next time.

3. Did you change the affected page(s)?

If you or your team changed the affected pages recently, Google may not think that they’re as relevant to the target keyword as they used to be.

  • Did you change the URL?
    • DO NOT CHANGE URLS. URLs act as unique identifiers for Google; a new URL means a new page, even if the content is the same.
  • Has the target keyword been removed from the page title, H1, or H2s?
  • Is the keyword density for the target keyword lower than it used to be? (The sketch after this checklist shows a quick way to spot-check this.)
  • Can Google read all of the content on the page?
    • Look at Google’s cache by searching for cache:www.yourdomain.com/your-page to see what Google sees.
  • Can Google access your site? Check Google Search Console for server and crawl reports.
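
For the on-page checks above, a small script can save you from eyeballing dozens of pages. This is only a rough sketch (it needs the requests and beautifulsoup4 packages, and the URL and keyword are placeholders); it also only sees the raw HTML, so content injected by JavaScript won’t show up, which is itself a useful signal for the “can Google read it?” question.

```python
# Rough sketch: check whether the target keyword still appears in the title, H1s,
# and H2s of a page, plus a crude keyword density estimate. URL/keyword are placeholders.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/affected-page"
keyword = "target keyword"

html = requests.get(url, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

title = soup.title.get_text(strip=True) if soup.title else ""
h1s = [h.get_text(strip=True) for h in soup.find_all("h1")]
h2s = [h.get_text(strip=True) for h in soup.find_all("h2")]

print("Keyword in title:", keyword.lower() in title.lower())
print("Keyword in an H1:", any(keyword.lower() in h.lower() for h in h1s))
print("Keyword in an H2:", any(keyword.lower() in h.lower() for h in h2s))

# Crude density: occurrences of the phrase vs. total words in the visible text.
body_text = soup.get_text(" ", strip=True).lower()
words = body_text.split()
occurrences = body_text.count(keyword.lower())
density = 100.0 * occurrences * len(keyword.split()) / max(len(words), 1)
print("Approximate keyword density: %.2f%%" % density)
```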

If this is your issue…

Good news! You can probably revert your site and regain the traffic you’ve lost.

  • If you changed the URL, see if you can change it back. If not, make sure the old URL is 301 redirecting to the new URL (the sketch after this list shows a quick way to verify that).
  • If you changed the text on the page, try reverting it back to the old text. Wait until your rankings are back up, then try changing the text again, this time keeping keyword density in mind.
  • If Google can’t read all of the content on your page, THIS IS A BIG DEAL. Communicate that to your dev team. (I’ve found dev teams often undervalue the impact of SEO, but “Googlebot can’t read the page” is a pretty understandable, impactful problem.)
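
Here’s the quick redirect check mentioned above, as a minimal sketch. The two URLs are placeholders; the point is simply to confirm that the old URL answers with a 301 (not a 302 or a 200) and that its Location header points at the new URL.

```python
# Minimal sketch: confirm the old URL 301-redirects to the new URL. URLs are placeholders.
import requests

old_url = "https://www.example.com/old-page"
new_url = "https://www.example.com/new-page"

resp = requests.get(old_url, allow_redirects=False, timeout=10)
print("Status code:", resp.status_code)                      # you want 301 here
print("Location header:", resp.headers.get("Location"))
print("Points at the new URL:", resp.headers.get("Location") == new_url)
```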

4. Did you change internal links to the affected page(s)?

If you or your team added or removed internal links, that could change the way link equity flows through your site, changing Google’s perceived value of the pages on your site.

  • Did you or your team recently update site navigation anywhere? Some common locations to check:
    • Top navigation
    • Side navigation
    • Footer navigation
    • Suggested products
    • Suggested blog posts
  • Did you or your team recently update key pages on your site that link to target pages? Some pages to check:
    • Homepage
    • Top category pages
    • Linkbait blog posts or articles
  • Did you or your team recently update anchor text on links to target pages? Does it still include the target keyword?

If this is your issue…

Figure out how many internal links have been removed from pointing to your affected pages. If you have access to the old version of your site, run Screaming Frog (or a similar crawler) on the new and old versions of your site so you can compare inbound link counts (referred to as inlinks in SF). If you don’t have access to the old version of your site, take a couple of hours to compare navigation changes and mark down wherever the new layout may have hurt the affected pages.
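
If you’d rather not compare the two crawls by hand, here’s a minimal sketch that diffs inlink counts per URL between an old and a new crawl export. The file names and the “Destination” column are assumptions; check the headers in your own Screaming Frog (or other crawler) export and rename accordingly.

```python
# Minimal sketch: compare internal inlink counts per URL between two crawl exports.
# File names and the "Destination" column are assumptions -- match your export.
import pandas as pd

old_links = pd.read_csv("old_crawl_inlinks.csv")
new_links = pd.read_csv("new_crawl_inlinks.csv")

old_counts = old_links["Destination"].value_counts()
new_counts = new_links["Destination"].value_counts()

# Negative numbers = pages that lost internal links in the new version of the site.
diff = new_counts.sub(old_counts, fill_value=0).sort_values()
print(diff.head(20))
```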

How you fix the problem depends on how much impact you have on the site structure. It’s best to fix the issue in the navigational structure of the site, but many of us SEOs are overruled by the UX team when it comes to primary navigation. If that’s the case for you, think about systematic ways to add links where you can control the content. Some common options:

  • In the product description
  • In blog posts
  • In the footer (since, as UX will generally admit, few people use the footer)

Keep in mind that removing links and adding them back later, or from different places on the site, may not have the same effect as the original internal links. You’ll want to keep an eye on your rankings, and add more internal links than the affected pages lost, to make sure you regain your Google rankings.

5. Google’s user feedback says you should rank differently.

Google is using machine learning to determine rankings. That means they’re at least in part measuring the value of your pages based on their click-through rate from SERPs and how long visitors stay on your page before returning to Google.

  • Did you recently add a popup that is increasing bounce rate?
  • Is the page taking longer to load?
    • Check server response time. People are likely to give up if nothing happens for a few seconds. (The sketch after this list shows a rough way to measure it.)
    • Check full page load. Have you added something that takes forever to load and is causing visitors to give up quickly?
  • Have you changed your page titles? Is that lowering CTR? (I optimized page titles in late November, and that one change moved the average rank of 500 pages up from 12 to 9. One would assume things can go in reverse.)
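
For the load-time question, here’s a rough sketch that times the raw HTML response a few times in a row (the URL is a placeholder). It only approximates server response time; for full rendered-page load you’d want a browser-based tool such as Lighthouse or WebPageTest rather than a script like this.

```python
# Rough sketch: time the raw HTML response for a page a few times. URL is a placeholder.
import requests

url = "https://www.example.com/affected-page"

for attempt in range(3):
    resp = requests.get(url, timeout=30)
    print(
        "Attempt %d: status %d, %d bytes, %.2f seconds to response"
        % (attempt + 1, resp.status_code, len(resp.content), resp.elapsed.total_seconds())
    )
```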

If this is your issue…

  • If the issue is a new popup, do your best to convince your marketing team to test a different type of popup. Some options:
    • Scroll popups
    • Timed popups
    • Exit popups
    • Stable banners at the top or bottom of the page (with a big CLICK ME button!)
  • If your page is taking longer to load, you’ll need the dev team. Quantify the value lost from fewer organic conversions now that you’ve lost some rankings, and you’ll have a pretty strong case for dev time.
  • If you’ve changed your page titles, change them back, quick! Mark this test as a dud, and make sure you learn from it before you run your next test.

6. Your competition made a change.

You may have changed rank not because you did anything, but because your competition got stronger or weaker. Use your ranking tool to identify competitors that gained or lost the most from your rankings change. Use a tool like Versionista (paid, but worth it) or Wayback Machine (free, but spotty data) to find changes in your competitors’ sites.

  • Which competitors gained or lost the most as your site’s rankings changed?
  • Has that competition gained or lost inbound links? (Refer to #2 for detailed questions)
  • Has that competition changed their competing page? (Refer to #3 for detailed questions)
  • Has that competition changed their internal link structure? (Refer to #4 for detailed questions)
  • Has that competition started getting better click-through rates or dwell time to their pages from SERPs? (Refer to #5 for detailed questions)

If this is your issue…

You’re probably fuming, and your managers are probably fuming at you. But there’s a benefit to this: you can learn what works from your competitors. They did the research and tested a change, and it paid off for them. Now you know the value! Imitate your competitor, but try to do it better than them this time — otherwise you’ll always be playing catch-up.

Now you know what to do

You may still be panicking, but hopefully this post can guide you to some constructive solutions. I find that the best response to a drop in rankings is a good explanation and a plan.

And, to the Moz community of other brilliant SEOs: comment below if you see something I’ve missed!



The Moz 2016 Annual Report

Posted by SarahBird

I have a longstanding tradition of boring Moz readers with our exhaustive annual reports (2012, 2013, 2014, 2015).


If you’re avoiding sorting the recycling, going to the gym, or cleaning out your closet, I have got a *really* interesting post that needs your attention *right now*.

(Yeah. I know it’s March. But check this out, I had pneumonia in Jan/Feb so my life slid sideways for a while.)

Skip to your favorite parts:

Part 1: TL;DR

Part 2: Achievements unlocked

Part 3: Oh hai, elephant. Oh hai, room.

Part 4: More wood, fewer arrows

Part 5: Performance (metrics vomit)

Part 6: Inside Moz HQ

Part 7: Looking ahead


Part 1: TL;DR

We closed out 2016 with more customers and revenue than 2015. Our core SEO products are on a roll with frequent, impactful launches.

The year was not all butterflies and sunshine, though. Some of our initiatives failed to produce the results we needed. We made some tough calls (sunsetting some products and initiatives) and big changes (laying off a bunch of folks and reallocating resources). On a personal level, it was the most emotionally fraught time in my career.

Thank the gods, our hard work is paying off. Moz ended the year cashflow, EBITDA, and net income profitable (on a monthly basis), and with more can-do spirit than in years past. In fact, in the month of December we added a million dollars cash to the business.

We’re completely focused on our mission to simplify SEO for everyone through software, education, and community.


Part 2: Achievements unlocked

It blows my mind that we ended the year with over 36,000 customers from all over the world. We’ve got brands and agencies. We’ve got solopreneurs and Fortune 500s. We’ve got hundreds of thousands of people using the MozBar. A bunch of software companies integrate with our API. It’s humbling and awesome. We endeavor to be worthy of you!

[Chart: customers and community]

We were very busy last year. The pace and quality of development have never been better. The achievements captured below don’t even come close to listing everything. How many of these initiatives did you know about?


Part 3: Oh hai, elephant. Oh hai, room.

When a few really awful things happen, it can overshadow the great stuff you experience. That makes this a particularly hard annual report to write. 2016 was undoubtedly the most emotionally challenging year I’ve experienced at Moz.

It became clear that some of our strategic hypotheses were wrong. Pulling the plug on those projects and asking people I care deeply about to leave the company was heartbreaking. That’s what happened in August 2016.


As Tolstoy wrote, “Happy products are all alike; every unhappy product is unhappy in its own way.” The hard stuff happened. Rehashing what went wrong deserves a couple chapters in a book, not a couple lines in a blog post. It shook us up hard.

And *yet*, I am determined not to let the hard stuff take away from the amazing, wonderful things we accomplished and experienced in 2016. There was a lot of good there, too.

Smarter people than me have said that progress doesn’t happen in a straight line; it zigs and zags. I’m proud of Mozzers; they rise to challenges. They lean into change and find the opportunity in it. They turn their compassion and determination up to 11. When the going gets tough, the tough get going.


I’ve learned a lot about Moz and myself over the last year. I’m taking all those learnings with me into the next phase of Moz’s growth. Onwards.


Part 4: More wood, fewer arrows

At the start of 2016, our hypothesis was that our customers and community would purchase several inbound marketing tools from Moz, including SEO, local SEO, social analytics, and content marketing. The upside was market expansion. The downside was fewer resources to go around, and a much more complex brand and acquisition funnel.

By trimming our product lines, we could reallocate resources to initiatives showing more growth potential. We also simplified our mission, brand, and acquisition funnel.

It feels really good to be focusing on what we love: search. We want to be the best place to learn and do SEO.

Whenever someone wonders how to get found in search, we want them to go to Moz first. We aspire to be the best in the world at the core pillars of SEO: rankings, keywords, site audit and optimization, links, location data management.

SEO is dynamic and complex. By reducing our surface area, we can better achieve our goal of being the best. We’re putting more wood behind fewer arrows.



Part 5: Performance (metrics vomit)

Check out the infographic view of our data barf.

We ended the year at ~$42.6 million in gross revenue, amounting to ~12% annual growth. We had hoped for better at the start of the year. Moz Pro is still our economic engine, and Local drives new revenue and cashflow.

[Chart: 2016 revenue]

Gross profit margin increased a hair to 74%, despite Moz Local being a larger share of our overall business. Product-only gross profit margin is a smidge higher at 76%. Partner relationships generally drag down the profit margin on that product line.

Our Cost of Revenue (COR) went up in raw numbers from the previous year, but it didn’t increase as much as revenue.

[Chart: 2016 Cost of Revenue]

[Chart: 2016 COR breakdown]

Total Operating Expenses came to roughly $41 million. Excluding the cost of the restructure we initiated in August, the shape and scale of our major expenses have remained remarkably stable.

[Chart: 2016 major expenses]

We landed at -$5.5 million in EBITDA, which was disappointingly below our plan. We were on target for our budgeted expenses. As we fell behind our revenue goals, it became clear we’d need to right-size our expenses to match the revenue reality. Hence, we made painful cuts.

[Chart: 2016 EBITDA]

[Chart: 2016 cash burn]

I’m happy/relieved/overjoyed to report that we were EBITDA positive by September, cashflow positive by October, and net income positive by November. Words can’t express how completely terrible it would have been to go through what we all went through, and *not* have achieved our business goals.

My mind was blown when we actually added a million in cash in December. I couldn’t have dared to dream that… Ha ha! They won’t all be like that! It was the confluence of a bunch of stuff, but man, it felt good.



Part 6: Inside Moz HQ

Thanks to you, dear reader, we have a thriving and opinionated community of marketers. It’s a great privilege to host so many great exchanges of ideas. Education and community are integral to our mission. After all, we were a blog before we were a tech company. Traffic continues to climb and social keeps us busy. We love to hear from you!

[Chart: 2016 organic traffic]

[Chart: 2016 social channels]

We added a bunch of folks to the Moz Local, Moz.com, and Customer Success teams in the last half of the year. But our headcount is still lower than last year because we asked a lot of talented people to leave when we sunsetted a bunch of projects last August. We’re leaner, and gaining momentum.

[Chart: end-of-year headcount]

Moz is deeply committed to making tech a more inclusive industry. My vision is for Moz to be a place where people are constantly learning and doing their best work. We took a slight step back on our gender diversity gains in 2016. Ugh. We’re not doing much hiring in 2017, so it’s going to be challenging to make substantial progress. We made a slight improvement in the ratio of underrepresented minorities working at Moz, which is a positive boost.

[Chart: 2016 gender ratios]

The tech industry has earned its reputation of being unwelcoming and myopic.

Mozzers work hard to make Moz a place where anyone could thrive. Moz isn’t perfect; we’re human and we screw up sometimes. But we pick ourselves up, dust off, and try again. We continue our partnership with Ada Academy, and we’ve deepened our relationship with Year Up. One of my particular passions is partnering with programs that expose girls and young women to STEM careers, such as Ignite Worldwide, Techbridge, and BigSisters.

I’m so proud of our charitable match program. We match Mozzer donations 150% up to $3k. Over the years, we’ve given over half a million dollars to charity. In 2016, we gave over $111,028 to charities. The ‘G’ in TAGFEE stands for ‘generous,’ and this is one of the ways we show it.

[Chart: 2016 charitable donation match]

One of our most beloved employee benefits is paid, PAID vacation. We give every employee up to $3,000 to spend on his or her vacation. This year, we spent over half a million dollars exploring the world and sucking the marrow out of life.

[Chart: 2016 paid, paid vacation spending]


Part 7: Looking ahead

Dear reader, I don’t have to tell you that search has been critical for a long time.

This juggernaut of a channel is becoming *even more* important with the proliferation of search interfaces and devices. Mobile liberated search from the desktop by bringing it into the physical world. Now, watches, home devices, and automobiles are making search ubiquitous. In a world of ambient search, SEO becomes even more important.

SEO is more complicated and dynamic than years past because the number of human interfaces, response types, and ranking signals are increasing. We here at Moz are wild about the complexity. We sink our teeth into it. It drives our mission: Simplify SEO for everyone through software, education, and community.

We’re very excited about the feature and experience improvements coming ahead. Thank you, dear reader, for sharing your feedback, inspiring us, and cheering us on. We look forward to exploring the future of search together.



Infinite "People Also Ask" Boxes: Research & SEO Opportunities

Posted by BritneyMuller

A Glimpse Into Google’s Machine Learning?

You’ve likely seen the People Also Ask (Related Questions) boxes in SERPs. These accordion-like question and answer boxes are Google’s way of saying, “Hey, you beautiful searcher, you! These questions also relate to your search… maybe you’re interested in exploring these too? Kick off your shoes, stay a while!”

However, few people have come across infinite PAAs. These occur when you expand a PAA question box and 2 or 3 other related questions appear at the bottom of the box. These infinite PAA lists can continue into the hundreds, and I’ve been lucky enough to come across 75+ of these gems!

So, grab a coffee and buckle up! I’d like to take you on a journey of my infinite PAA research, discoveries, machine learning hypothesis, and how you can find PAA opportunities.

Why PAAs should matter to you

PAAs have seen a 1,723% growth in SERPs since 7/31/15 via Mozcast!

Compare that to featured snippets, which have seen only 328% growth over the same timeframe.

Research has also shown that a single PAA can show up in 21 unique SERPs! How ’bout dem apples?! PAA opportunities can take over some serious SERP real estate.

My infinite PAA obsession

These mini-FAQs within search results have fascinated me since Google started testing them in 2015. Then in November 2016, I discovered Google’s PAA dynamic testing:

You guys, I’ve discovered a SERP black hole! I’m on #200 suggested PAA for this SERP?! Has anyone else seen an infinite PAA SERP before? pic.twitter.com/YgZDVWdWJ9
— Britney Muller (@BritneyMuller) November 23, 2016

The above infinite PAA expanded into the hundreds! This became an obsession of mine as I began to notice them across multiple devices (for a variety of different searches) and coined them “PAA Black Holes.”

I began saving data from these infinite PAAs to see if I could find any patterns, explore how Google might be pulling this data, and dive deeper into how the questions/topics changed as a result of my expanding question boxes, etc.

After seeing a couple dozen infinite PAAs, I began to wonder if this was actually a test to implement in search, but several industry leaders assured me this was more likely a bug.

They were wrong.

Infinite People Also Ask boxes are live

Now integrated into U.S. SERPs (sorry, international friends, but get ready for this to potentially migrate your way), you can play with these on desktop & mobile:

If you’re in the US and like exploring topics, there’s a nifty feature for you to try with “People also ask” on Google. 🙂 pic.twitter.com/s2WtwyYvun
— Satyajeet Salgar (@salgar) February 10, 2017
I’m fascinated by Satyajeet’s use of “exploring topics”.

Why does Google want people to spend more time on individual SERPs (instead of looking at several)? Could they charge more for advertisements on SERPs with these sticky, expansive PAAs? Might they eventually start putting ads in PAAs? These are the questions that follow me around like a shadow.

To get a better idea of the rise of PAAs, here’s a timeline of my exploratory PAA research:

PAA timeline

April 17, 2015 – Google starts testing PAAs

July 29, 2015 – Dr. Pete gets Google to confirm preferred “Related Questions” name

Aug 15, 2015 – Google tests PAA Carousels on desktop

Dec 30, 2015 – Related Questions (PAAs) grow +500% in 5 months

Mar 11, 2016 – See another big uptick in Related Questions (PAAs) in Mozcast

Nov 11, 2016 – Robin Rozhon notices PAA Black Hole

Nov 23, 2016 – Brit notices PAA Black Hole

Nov 29, 2016 – STAT Analytics publishes a research study on PAAs

Dec 12, 2016 – Realized new PAA results would change based on expanded PAA

Dec 14, 2016 – Further proof PAAs dynamically load based on what you click

Dec 19, 2016 – Still seeing PAA Black Holes

Dec 22, 2016 – Discovered a single PAA result (not a 3-pack)

Jan 11, 2017 – Made a machine learning (TensorFlow) discovery and hypothesis!

Jan 22, 2017 – Discovered a PAA Black Hole on a phone

Jan 25, 2017 – Discovered a PAA Black Hole that maxed out at 9

Feb 10, 2017 – PAA Black Holes go live!

Feb 14, 2017 – Britney Muller is still oblivious to PAA Black Holes going live and continues to hypothesize how they are being populated via entity graph-based ML.


3 big infinite PAA discoveries:

#1 – Google caters to browsing patterns in real time

It took me a while to grasp that I can manipulate the newly populated question boxes based on what I choose to expand.

Below, I encourage more Vans-related PAAs by clicking “Can I put my vans in the washing machine?” Then, I encourage more “mildew”-related ones simply by clicking a “How do you get the mildew smell out of clothes” PAA above:

[GIF: expanding Vans- and mildew-related PAAs]

Another example of this is when I clicked “organic SEO” at the very top of a 100+ PAA Black Hole (the gif would make you dizzy, so I took a screenshot instead). It altered my results from “how to clean leather” to “what is seo” and “what do you mean by organic search”:

[Screenshot: PAA suggestions shifting from “how to clean leather” to “what is seo”]


#2 – There are dynamic dead ends

When I reach an exhaustive point in my PAA expansions (typically ~300+), Google will re-serve the first two PAAs, as if to say: “We aren’t sure what else to provide, are you interested in these again?”

Here is an example of that happening: I go from “mitosis”-related PAAs (~300 PAAs deep) to a repeat of the first two PAAs: “What is Alexa ranking based on?” and “What is the use of backlinks?”:

[GIF: PAA dead end repeating the first two questions]

This reminds me of a story told by Google machine learning engineers: whenever an early ML model couldn’t identify a photograph, it would fall back on a default “I don’t know” answer: “Men talking on cell phone.” It could have been a picture of an elephant dancing; if the ML model wasn’t sure what it was, it would say “Men talking on cell phone.”

My gut tells me that Google reverts to the strongest edges to your original query (the first two PAAs) when it runs out of PAAs above a certain relational threshold.

If you push past that limit again, it will then repeat the third and fourth PAAs, and so on.


#3 – Expand & retract one question to explore the most closely related questions

This not only provides you with the most relevant PAAs to the query you’re expanding and retracting, but if it’s in your wheelhouse, you can quickly discover other very relevant PAA opportunities.

Here I keep expanding and retracting “What is the definition of SEO?”:

[GIF: repeatedly expanding and retracting “What is the definition of SEO?”]

Notice how “SEO” or “search engine optimization” is in every subsequent PAA!? This is no coincidence and has a lot to do with the entity graph.

First, let’s better understand machine learning and why an entity-based, semi-supervised model is so relevant to search. I’ll then draw out what I think is happening with the above results (like a 5-year-old), and go over ways you can capture these opportunities! Woohoo!


Training data’s role in machine learning

Mixups are commonplace in machine learning, mostly due to a lack of quality training data.

[Diagram: machine learning training process]

Well-labeled training data is typically the most important ingredient in training an accurate ML model.

Fairly recently, the voice search team at Google came across an overwhelming amount of EU voice data that was being interpreted as “kdkdkdkd.” Since “kdkdkdkd” obviously wasn’t represented in their training data (who says “kdkdkdkd”?!), the engineers had no idea what could be prompting that noise. Confused, they finally figured out that it was the trains and subways making that noise!

This is a silly example, but by adding the “kdkdkdkd” = trains/subways training data, Google is now able to account for these pesky “kdkdkdkd” inclusions.


Relational data to the rescue

Because we don’t always have enough training data to properly train an ML model, we look to relational data for help.

Example: If I showed you the following picture, you could gather a few things from it, right? Maybe that it appears to be a female walking down a street, and that perhaps it’s fall by her hat, scarf, and the leaves on the ground. But it’s hard to determine a whole lot else, right?

[Photo: a traveler in a hat and scarf walking down a leaf-covered street]

What about now? Here are two other photos from the above photo’s timeline:

[Photos: two more photos from the same timeline]

Aha! She appears to be a U.S. traveler visiting London (with her Canon T3i camera). Now we have some regional, demographic, and product understanding. It’s not a whole lot of extra information, but it provides much more context for the original cryptic photo, right?

Perhaps, if Google had integrated geo-relational data with their voice machine learning, they could have more quickly identified that these noises were occurring at the same geolocations. This is just an example; Google engineers are WAY smarter than I am and have surely thought of much better solutions.


Google leverages entity graphs similarly for search

Google leverages relational data (in a very similar way to the above example) to form better understandings of digital objects and help provide the most relevant search results.

A kind of scary example of this is Google’s Expander: a large-scale ML platform built to “exploit relationships between data objects.”


Machine learning is typically “supervised” (training data is provided, which is more common) or “unsupervised” (no training data). Expander, however, is “semi-supervised,” meaning that it’s bridging the gap between provided and not-provided data. ← SEO pun intended!

Expander leverages a large, graph-based system to infer relationships between datasets. Ever wonder why you start getting ads about a product you started emailing your friend about?

Expander is bridging the gap between platforms to better understand online data and is only going to get better.


Relational entity graphs for search

Here is a slide from a Google I/O 2016 talk that showcases a relational word graph for search results:

[Slide: relational word graph for search queries]

Slide from Breakthroughs in Machine Learning Google I/O 2016 video.

Solid edges represent stronger relationships between nodes than the dotted lines. The above example shows there is a strong relationship between “What are the traditions of halloween” and “halloween tradition,” which makes sense. People searching for either of those would each be satisfied by quality content about “halloween traditions.”

Edge strength can also be determined by distributional similarity, lexical similarity, similarity based on word embeddings, etc.
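
To make “edge strength” a little more concrete, here’s a toy sketch that scores the similarity between questions using cosine similarity over embedding vectors. The tiny hand-made vectors are purely illustrative (real systems would use learned embeddings over far more dimensions), but the ranking they produce is the same idea: the strongest edges connect the most closely related questions.

```python
# Toy sketch: rank "edge strength" between questions with cosine similarity
# over embedding vectors. The 3-dimensional vectors below are made up for illustration.
import numpy as np

embeddings = {
    "what are the traditions of halloween": np.array([0.90, 0.10, 0.05]),
    "halloween tradition":                  np.array([0.85, 0.15, 0.05]),
    "how to carve a pumpkin":               np.array([0.40, 0.70, 0.10]),
    "what is seo":                          np.array([0.05, 0.10, 0.95]),
}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

query = "what are the traditions of halloween"
for question, vec in embeddings.items():
    if question != query:
        print("%.3f  %s" % (cosine(embeddings[query], vec), question))
```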


Infinite PAA machine learning hypothesis:

Google is providing additional PAAs based on the strongest relational edges to the expanded query.

You can see this occur again and again in infinite PAA datasets. When a word that’s lexically similar to two different topics overlaps the suggested PAAs, the topic changes because of it:

[Screenshot: PAA topic shifting from “SMO” to ankle braces]

The above topic change occurred through a series of small relational suggestions. A PAA above this screenshot was “What is SMO stands for?” (not a typo, just a neural network doing its best, people!), which led to “What is the meaning of SMO?”, then to “What is a smo brace?” (for ankles).

This immediately made me think of the relational word graph and what I envision Google is doing:

I hope my parents hang this on their fridge.

My hypothesis is that the machine learning model computes that because I’m interested in “SMO,” I might also be interested in ankle brace “SMO.”

There are ways for SEOs and digital marketers to leverage topical relevance and capture PAA opportunities.


4 ways to optimize for machine learning & expand your topical reach for PAAs:

Topical connections can always be made within your content, and by adding high-quality, topically related content, you can strengthen your content’s edges (and expand your SERP real estate). Here are some quick and easy ways to discover related topics:

#1: Quickly discover Related Topics via MozBar

MozBar is a free SEO browser add-on that allows you to do quick SEO analysis of web pages and SERPs. The On-Page Content Suggestions feature is a quick and simple way to find other topics related to your page.

Step 1: Activate MozBar on the page whose keyword reach you’re trying to expand, and click Page Optimization:

[Screenshot: MozBar Page Optimization on the Beginner’s Guide to SEO]

Step 2: Enter the keyword you’re trying to expand your reach with:

[Screenshot]

Step 3: Click On-Page Content Suggestions for your full list of related keyword topics.

[Screenshot: On-Page Content Suggestions]

Step 4: Evaluate which related keywords can be incorporated naturally into your current on-page content. In this case, it would be beneficial to incorporate “seo tutorial,” “seo tools,” and “seo strategy” into the Beginner’s Guide to SEO.

[Screenshot: related keyword suggestions in MozBar]

Step 5: Some, like “seo services” and “search engine ranking,” may seem like an awkward addition to the page, but they’re still relevant to the products/services that you offer. Try adding these topics to a better-fit page, creating a new page, or putting together a strong FAQ with other topically related questions.


#2: Wikipedia page + SEOBook Keyword Density Checker*

Let’s say you’re trying to expand your topical keywords in an industry you’re not very familiar with, like “roof repair.” You can use this free hack to pull in frequent and related topics.

Step 1: Find and copy the roof Wikipedia page URL.

Step 2: Paste the URL into SEOBook’s Keyword Density Checker:

[Screenshot: SEOBook Keyword Density Checker]

Step 3: Hit submit and view the most commonly used words on the Wikipedia page:

[Screenshot: most common words on the Wikipedia page]

Step 4: You can dive even deeper (and often find more topically related terms) by clicking on the “Links” tab to evaluate the anchor text of on-page Wikipedia links. If a subtopic is important enough, it will likely have its own Wikipedia page to link to:

[Screenshot: Links tab in the Keyword Density Checker]

Step 5: Use any appropriate keyword discoveries to create stronger topic-based content ideas. (If you’d rather script this hack, see the sketch at the end of this section.)

*This tactic was mentioned in Experts On The Wire episode on keyword research tools.
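
If you’d rather script the hack above, here’s a minimal sketch that pulls a Wikipedia page and counts its most frequent words and internal link anchor texts. It needs the requests and beautifulsoup4 packages; the URL and the tiny stopword list are just examples, and the results deserve the same human filtering as the manual version.

```python
# Minimal sketch: frequent words and internal-link anchors from a Wikipedia page.
# URL and stopword list are examples only.
import re
from collections import Counter

import requests
from bs4 import BeautifulSoup

url = "https://en.wikipedia.org/wiki/Roof"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
content = soup.find(id="mw-content-text") or soup  # fall back to the whole page

stopwords = {"the", "and", "for", "with", "that", "are", "from", "this", "was", "which"}
words = [
    w for w in re.findall(r"[a-z]+", content.get_text(" ").lower())
    if len(w) > 2 and w not in stopwords
]
print("Top words:", Counter(words).most_common(20))

# Anchor text of internal links often surfaces the important subtopics.
anchors = [
    a.get_text(strip=True).lower()
    for a in content.find_all("a")
    if a.get("href", "").startswith("/wiki/") and a.get_text(strip=True)
]
print("Top link anchors:", Counter(anchors).most_common(20))
```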


#3: Answer the Public

Answer the Public is a great free resource to discover questions around a particular topic. Just remember to change your country if you’re not seeking results from the UK (the default).

Step 1: Enter in your keyword/topic and select your country:

[Screenshot: Answer the Public search box]

Step 2: Explore the visualization of questions people are asking about your keyword:

[Screenshot: visualization of questions people are asking]

Doesn’t this person look like they’re admiring themselves in a mirror (or taking a selfie)? A magnifying glass doesn’t work from that distance, people!

Note: Not all questions will be relevant to your research, like “why roof of mouth hurts” and “why roof of mouth itches.”

[Screenshot: irrelevant “roof of mouth” questions]

Step 3: Scroll back up to the top to export the data to CSV by clicking the big yellow button (top right corner):

[Screenshot: export-to-CSV button]

The magnifying glass looks much larger here… perhaps it would work at that distance?

Step 4: Clean up the data (see the sketch at the end of this section) and upload the queries to your favorite keyword research tool (Moz Keyword Explorer, SEMrush, Google Keyword Planner, etc.) to discover search volume and SERP feature data, like featured snippets, reviews, related questions (PAA boxes), etc.

Note: Google’s Keyword Planner does not provide SERP feature data and only offers vague, bucketed search volume ranges.
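
Here’s the clean-up step from Step 4 as a minimal sketch. The file name and the “question” column are assumptions (rename them to match your actual Answer the Public export); it lowercases, trims, dedupes, and drops the obviously off-topic rows before you upload the list to a keyword tool.

```python
# Minimal sketch: tidy an exported question list before uploading it to a keyword tool.
# File name and "question" column are assumptions -- rename to match your export.
import pandas as pd

df = pd.read_csv("answer_the_public_export.csv")

questions = (
    df["question"]
    .astype(str)
    .str.strip()
    .str.lower()
    .drop_duplicates()
)

# Drop obviously off-topic queries, e.g. "roof of mouth" questions for a roofing site.
questions = questions[~questions.str.contains("roof of mouth", regex=False)]

questions.rename("keyword").to_csv("cleaned_questions.csv", index=False)
```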


#4: Keyword research “only questions”

Moz Keyword Explorer provides an “only questions” filter to uncover potential PAA opportunities.

Step 1: Enter your keyword into KWE:

[Screenshot: entering the keyword in Keyword Explorer]

Step 2: Click Keyword Suggestions:

[Screenshot: Keyword Suggestions]

Step 3: Filter by “are questions”:

[Screenshot: “are questions” filter]

Pro tip: Find grouped question keyword opportunities by grouping keywords by “low lexical similarity” and ordering them from highest search volume to lowest:

[Screenshot: grouping keywords by low lexical similarity]

Step 4: Select keywords and add to a new or previous list:

[Screenshot: adding keywords to a list]

Step 5: Once in a list, KWE will tell you how many “related questions” (People Also Ask box) opportunities are in your list. In this case, we have 18:

[Screenshot: related questions count for the keyword list]

Step 6: Export your keyword list to a Campaign in Moz Pro:

[Screenshot: exporting the list to a Moz Pro Campaign]

Step 7: Filter SERP Features by “Related Questions” to view PAA box opportunities:

[Screenshot: filtering SERP Features by Related Questions]

Step 8: Explore current PAA box opportunities and evaluate where you currently rank for “Related Questions” keywords. If you’re on page 1, you have a better chance of stealing a PAA box.

Also evaluate what other SERP features are present on these SERPs. Here, Dr. Pete tells me that I might be able to get a reviews rich snippet for “gutter installation”. Thanks, Dr. Pete!

[Screenshot: SERP features for “gutter installation”]

Hopefully, this research can help energize you to do topical research of your own to grab some relevant PAAs! PAAs aren’t going away anytime soon and I’m so excited for us to learn more about them.

Please share your PAA experiences, questions, or comments below.



Aren’t 301s, 302s, and Canonicals All Basically the Same? – Whiteboard Friday

Posted by Dr-Pete

They say history repeats itself. In the case of the great 301 vs 302 vs rel=canonical debate, it repeats itself about every three months. In today’s Whiteboard Friday, Dr. Pete explains how bots and humans experience pages differently depending on which solution you use, why it matters, and how each choice may be treated by Google.

[Whiteboard image: Aren't 301s, 302s, and canonicals all basically the same?]

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, Moz fans, it’s Dr. Pete, your friendly neighborhood marketing scientist here at Moz, and I want to talk today about an issue that comes up probably about every three months since the beginning of SEO history. It’s a question that looks something like this: Aren’t 301s, 302s, and canonicals all basically the same?

So if you’re busy and you need the short answer, it’s, “No, they’re not.” But you may want the more nuanced approach. This popped up again about a week [month] ago, because John Mueller on the Webmaster Team at Google had posted about redirection for secure sites, and in it someone had said, “Oh, wait, 302s don’t pass PageRank.”

John said, “No. That’s a myth. It’s incorrect that 302s don’t pass PR,” which is a very short answer to a very long, technical question. So SEOs, of course, jumped on that, and it turned into, “301s and 302s are the same, cats are dogs, cakes are pie, up is down.” We all did our freakout that happens four times a year.

So I want to get into why this is a difficult question, why these things are important, why they are different, and why they’re different not just from a technical SEO perspective, but from the intent and why that matters.

I’ve talked to John a little bit. I’m not going to put words in his mouth, but I think 95% of this will be approved, and if you want to ask him, that’s okay afterwards too.

Why is this such a difficult question?

So let’s talk a little bit about the classic 301 and 302. A 301 redirect situation is what we call a permanent redirect. What we’re trying to accomplish is something like this. We have an old URL, URL A, and let’s say, for example, a couple of years ago Moz moved our entire site from seomoz.org to moz.com. That was a permanent change, and so we wanted to tell Google, and all bots and browsers, two things:

  1. First, send people to the new URL, and
  2. Second, pass all the signals (equity, PR, ranking signals, authority, whatever you want to call them) to the new page as well.

So people and bots should both end up on this new page.

A classic 302 situation is something like a one-day sale. So what we’re saying is for some reason we have this main page with the product. We can’t put the sale information on that page. We need a new URL. Maybe it’s our CMS, maybe it’s a political thing, doesn’t matter. So we want to do a 302, a temporary redirect that says, “Hey, you know what? All the signals, all the ranking signals, the PR, for Google’s sake keep the old page. That’s the main one. But send people to this other page just for a couple of days, and then we’re going to take that away.”

So these do two different things. One of these tells the bots, “Hey, this is the new home,” and the other one tells it, “Hey, stick around here. This is going to come back, but we want people to see the new thing.”

So I think sometimes Google interprets our meaning and can change things around, and we get frustrated because we go, “Why are they doing that? Why don’t they just listen to our signals?”

Why are these differentiations important?

The problem is this. In the real world, we end up with things like this, we have page W that 301s to page T that 302s to page F and page F rel=canonicals back to page W, and Google reads this and says, “W, T, F.” What do we do?

We sent bad signals. We’ve done something that just doesn’t make sense, and Google is forced to interpret us, and that’s a very difficult thing. We do a lot of strange things. We’ll set up 302s because that’s what’s in our CMS, that’s what’s easy in an Apache rewrite file. We forget to change it to a 301. Our devs don’t know the difference, and so we end up with a lot of ambiguous situations, a lot of mixed signals, and Google is trying to help us. Sometimes they don’t help us very well, but they just run into these problems a lot.

In this case, the bots have no idea where to go. The people are going to end up on that last page, but the bots are going to have to choose, and they’re probably going to choose badly because our intent isn’t clear.
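
If you want to see exactly what signals a URL is sending, here’s a minimal sketch that walks a redirect chain hop by hop and reports the status code at each step plus any rel=canonical on the final page, which makes a mixed-up “W, T, F” setup easy to spot. It needs the requests and beautifulsoup4 packages, and the starting URL is a placeholder.

```python
# Minimal sketch: trace a redirect chain and report the final page's rel=canonical.
# Starting URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/page-w"

for hop in range(10):  # safety cap on chain length
    resp = requests.get(url, allow_redirects=False, timeout=10)
    print(resp.status_code, url)
    if resp.status_code in (301, 302, 303, 307, 308):
        url = requests.compat.urljoin(url, resp.headers.get("Location", ""))
        continue
    canonical = BeautifulSoup(resp.text, "html.parser").find("link", rel="canonical")
    if canonical:
        print("rel=canonical ->", canonical.get("href"))
    break
```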

How are 301s, 302s, and rel=canonical different?

So there are a couple situations I want to cover, because I think they’re fairly common and I want to show that this is complex. Google can interpret, but there are some reasons and there’s some rhyme or reason.

1. Long-term 302s may be treated as 301s.

So the first one is that long-term 302s are probably going to be treated as 301s. They don’t make any sense. If you set up a 302 and you leave it for six months, Google is going to look at that and say, “You know what? I think you meant this to be permanent and you made a mistake. We’re going to pass ranking signals, and we’re going to send people to page B.” I think that generally makes sense.

Some types of 302s just don’t make sense at all. So if you’re migrating from non-secure to secure, from HTTP to HTTPS and you set up a 302, that’s a signal that doesn’t quite make sense. Why would you temporarily migrate? This is probably a permanent choice, and so in that case, and this is actually what John was addressing in this post originally, in that case Google is probably going to look at that and say, “You know what? I think you meant 301s here,” and they’re going to pass signals to the secure version. We know they prefer that anyway, so they’re going to make that choice for you.

If you’re confused about where the signals are going, then look at the page that’s ranking, because in most cases the page that Google chooses to rank is the one that’s getting the ranking signals. It’s the one that’s getting the PR and the authority.

So if you have a case like this, a 302, and you leave it up permanently and you start to see that Page B is the one that’s being indexed and ranking, then Page B is probably the one that’s getting the ranking signals. So Google has interpreted this as a 301. If you leave a 302 up for six months and you see that Google is still taking people to Page A, then Page A is probably where the ranking signals are going.

So that can give you an indicator of what their decision is. It’s a little hard to reverse that. But if you’ve left a 302 in place for six months, then I think you have to ask yourself, “What was my intent? What am I trying to accomplish here?”

Part of the problem with this is that when we ask this question, “Aren’t 302s, 301s, canonicals all basically the same?” what we’re really implying is, “Aren’t they the same for SEO?” I think this is a legitimate but very dangerous question, because, yes, we need to know how the signals are passed and, yes, Google may pass ranking signals through any of these things. But for people they’re very different, and this is important.

2. Rel=canonical is for bots, not people.


So I want to talk about rel=canonical briefly because rel=canonical is a bit different. We have Page A and Page B again, and we’re going to canonical from Page A to Page B. What we’re basically saying with this is, “Look, I want you, the bots, to consider Page B to be the main page. You know, for some reason I have to have these near duplicates. I have to have these other copies. But this is the main one. This is what I want to rank. But I want people to stay on Page A.”

So this is entirely different from a 301 where I want people and bots to go to Page B. That’s different from a 302, where I’m going to try to keep the bots where they are, but send people over here.

So take it from a user perspective. In Q&A, people ask me all the time, “Well, I’ve heard that rel=canonical passes ranking signals. Which should I choose? Should I choose that or a 301? What’s better for SEO?”

That’s true. We do think it generally passes ranking signals, but “what’s better for SEO” is a bad question, because these are completely different user experiences: either you’re going to want people to stay on Page A or you’re going to want people to go to Page B.

Why this matters, both for bots and for people

So I just want you to keep in mind, when you look at these three things, it’s true that 302s can pass PR. But if you’re in a situation where you want a permanent redirect, you want people to go to Page B, you want bots to go to Page B, you want Page B to rank, use the right signal. Don’t confuse Google. They may make bad choices. Some of your 302s may be treated as 301s. It doesn’t make them the same, and a rel=canonical is a very, very different situation that essentially leaves people behind and sends bots ahead.

So keep in mind what your use case actually is, keep in mind what your goals are, and don’t get over-focused on the ranking signals themselves or the SEO uses, because all of these three things have different purposes.

So I hope that makes sense. If you have any questions or comments or you’ve seen anything weird actually happen on Google, please let us know and I’ll be happy to address that. And until then, we’ll see you next week.

Video transcription by Speechpad.com

