The State of Searcher Behavior Revealed Through 23 Remarkable Statistics

Posted by randfish

One of the marketing world’s greatest frustrations has long been the lack of data from Google and other search engines about the behavior of users on their platforms. Occasionally, Google will divulge a nugget of bland, hard-to-interpret information about how they process more than X billion queries, or how many videos were uploaded to YouTube, or how many people have found travel information on Google in the last year. But these numbers aren’t specific enough, well-sourced enough, or detailed enough to be truly useful for all the applications we have.

Marketers need to know things like: How many searches happen each month across various platforms? Is Google losing market share to Amazon? Are people really starting more searches on YouTube than Bing? Is Google Images more or less popular than Google News? What percent of queries are phrased as questions? How many words are in the average query? Is it more or less on mobile?

These kinds of specifics help us know where to put our efforts, how to sell our managers, teams, and clients on SEO investments, and, when we have this data over time, we can truly understand how this industry that shapes our livelihoods is changing. Until now, this data has been somewhere between hard and impossible to estimate. But, thanks to clickstream data providers like Jumpshot (which helps power Moz’s Keyword Explorer and many of our keyword-based metrics in Pro), we can get around Google’s secrecy and see the data for ourselves!

Over the last 6 months, Russ Jones and I have been working with Jumpshot’s Randy Antin, who’s been absolutely amazing — answering our questions late at night, digging in with his team to get the numbers, and patiently waiting while Russ runs fancy T-Distributions on large datasets to make sure our estimates are as accurate as possible. If you need clickstream data of any kind, I can’t recommend them enough.

If you’re wondering, “Wait… I think I know what clickstream data is, but you should probably tell me, Rand, just so I know that you know,” OK. 🙂 Clickstream monitoring means Jumpshot (and other companies like them — SimilarWeb, Clickstre.am, etc.) have software on the device that records all the pages visited in a browser session. They anonymize and aggregate this data (don’t worry, your searches and visits are not tied to you or to your device), then make parts of it available for research or use in products or through APIs. They’re not crawling Google or any other sites, but rather seeing the precise behavior of devices as people use them to surf or search the Internet.

Clickstream data is awesomely powerful, but when it comes to estimating searcher behavior, we need scale. Thankfully, Jumpshot can deliver here, too. Their US panel of Internet users is in the millions (they don’t disclose the exact size, but it’s between 2 and 10 million), so we can trust these numbers to reliably paint a representative picture. That said, there may still be biases in the data — it could be that certain demographics of Internet users are more or less likely to be in Jumpshot’s panel, their mobile data is limited to Android (no iOS), and we know that some alternative kinds of searches aren’t captured by their methodology**. Still, there’s amazing stuff here, and it’s vastly more than we’ve been able to get any other way, so let’s dive in.

23 Search Behavior Stats

Methodology: All of the data was collected from Jumpshot’s multi-million user panel in October 2016. T-distribution scaling was applied to validate the estimates of overall searches across platforms. All other data is expressed as percentages. Jumpshot’s panel includes mobile and desktop devices in similar proportions, though no devices are iOS, so users on Macs, iPhones, and iPads are not included.

#1: How many searches are *really* performed on Google.com each month?

On the devices and types of queries Jumpshot can analyze, searchers averaged 3.4 searches per day. Using the T-Distribution scaling analysis on various sample set sizes of Jumpshot’s data, Russ estimated that the most likely reality is that 40–60 billion searches happen on Google.com in the US each month.

Here’s more detail from Russ himself:

“…All of the graphs are non-linear in shape, which indicates that as the samples get bigger we are approaching correct numbers but not in a simple % relationship… I have given 3 variations based on the estimated number of searches you think happen in the US annually. I have seen wildly different estimates from 20 billion to 100 billion, so I gave a couple of options. My gut is to go with the 40 billion numbers, especially since once we reach the 100MM line for 40 and 60B, there is little to no increase for 1 billion keywords, which would indicate we have reached a point where each new keyword is searched just 1 time.”
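For intuition, here's a minimal sketch of that scaling step in Python. This is emphatically not Russ's actual analysis: the panel size, the searches-per-day distribution, and the 240M US searcher population are invented placeholders. Only the shape of the calculation (estimate a panel mean, wrap it in a t-based interval, scale up) is the point.

```python
# A sketch (not Russ's actual code) of extrapolating a clickstream panel
# to a national monthly search estimate, with a t-based confidence interval.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical panel: searches/day for 10,000 panelists, mean ~3.4.
searches_per_day = rng.gamma(shape=1.7, scale=2.0, size=10_000)

n = searches_per_day.size
mean = searches_per_day.mean()
sem = stats.sem(searches_per_day)

# 95% confidence interval for the panel mean, using the t-distribution.
lo, hi = stats.t.interval(0.95, n - 1, loc=mean, scale=sem)

# Scale to a hypothetical population of 240M US searchers over a 31-day month.
US_SEARCHERS, DAYS = 240e6, 31
print(f"Estimated monthly US searches: "
      f"{lo * US_SEARCHERS * DAYS / 1e9:.0f} to "
      f"{hi * US_SEARCHERS * DAYS / 1e9:.0f} billion")
```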

How does that compare to numbers Google’s given? Well, in May of 2016, Google told Search Engine Land they “processed at least 2 trillion searches per year.” Using our Jumpshot-based estimates, and assuming October of 2016 was a reasonably average month for search demand, we’d get to 480–720 billion annual searches. That’s less than half of what Google claims, but Google’s number is WORLDWIDE! Jumpshot’s data here is only for the US. This suggests that, as Danny Sullivan pointed out in the SELand article, Google could well be handling much, much more than 2 trillion annual searches.

Note that we believe our 40–60 billion/month number is actually too low. Why? Voice searches, searches in the Google app and Google Home, higher search use on iOS (all four of which Jumpshot can’t measure), October could be a lower-than-average month, some kinds of search partnerships, and automated searches that aren’t coming from human beings on their devices could all mean our numbers are undercounting Google’s actual US search traffic. In the future, we’ll be able to measure interesting things like growth or shrinkage of search demand as we compare October 2016 vs other months.

#2: How long is the average Google search session?

From the time of the initial query to the loading of the search results page and the selection of any results, plus any back button clicks to those SERPs and selection of new results, the all-in average was just under 1 minute. If that seems long, remember that some search sessions may be upwards of an hour (like when I research all the best ryokans in Japan before planning a trip — I probably clicked 7 pages deep into the SERPs and opened 30 or more individual pages). Those long sessions are dragging up that average.

#3: What percent of users perform one or more searches on a given day?

This one blew my mind! Of the millions of active, US web users Jumpshot monitored in October 2016, only 15% performed at least one search in a given day. 45% performed at least one query in a week, and 68% performed one or more queries that month. To me, that says there’s still a massive amount of search growth opportunity for Google. If they can make people more addicted to and more reliant on search, as well as shape the flow of information and the needs of people toward search engines, they are likely to have a lot more room to expand searches/searcher.

#4: What percent of Google searches result in a click?

Google is answering a lot of queries themselves. From searches like “Seattle Weather,” to more complicated ones like “books by Kurt Vonnegut” or “how to remove raspberry stains?”, Google is trying to save you that click — and it looks like they’re succeeding.

66% of distinct search queries resulted in one or more clicks on Google’s results. That means 34% of searches get no clicks at all. If we look at all search queries (not just distinct ones), those numbers shift to a straight 60%/40% split. I wouldn’t be surprised to find that over time, we get closer and closer to Google solving half of search queries without a click. BTW — this is the all-in average, but I’ve broken down clicks vs. no-clicks on mobile vs. desktop in #19 below.
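To make the distinct-vs.-all distinction concrete, here's a toy sketch. The schema and the reading of "distinct" (each unique query string counted once, "clicked" if any instance drew a click) are my assumptions, not Jumpshot's definitions. Head queries that Google answers on the SERP drag the all-searches rate below the distinct-queries rate:

```python
# Toy illustration of "all searches" vs. "distinct queries" click rates.
# The schema and the reading of "distinct" are assumptions, not Jumpshot's.
import pandas as pd

events = pd.DataFrame({
    "query":   ["seattle weather"] * 4 + ["facebook"] * 2 + ["vonnegut first editions"],
    "clicked": [False, False, False, False, True, True, True],
})

all_rate = events["clicked"].mean()  # every search counted; popular queries weigh more
# Count each unique query once; call it "clicked" if any instance got a click.
distinct_rate = events.groupby("query")["clicked"].max().mean()

print(f"All searches with a click:     {all_rate:.0%}")      # 43%
print(f"Distinct queries with a click: {distinct_rate:.0%}")  # 67%
```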

#5: What percent of clicks on Google search results go to AdWords/paid listings?

It’s less than I thought, but perhaps not surprising given how aggressive Google’s had to be with ad subtlety over the last few years. Of distinct search queries in Google, only 3.4% resulted in a click on an AdWords (paid) ad. If we expand that to all search queries, the number drops to 2.6%. Google’s making a massive amount of money on a small fraction of the searches that come into their engine. No wonder they need to get creative (or, perhaps more accurately, sneaky) with hiding the ad indicator in the SERPs.

#6: What percent of clicks on Google search results go to Maps/local listings?

This is not measuring searches and clicks that start directly from maps.google.com or from the Google Maps app on a mobile device. We’re talking here only about Google.com searches that result in a click on Google Maps. That number is 0.9% of Google search clicks, just under 1 in 100. We know from MozCast that local packs show up in ~15% of queries (though that may be biased by MozCast’s keyword corpus).

#7: What percent of clicks on Google search results go to links in the Knowledge Graph?

Knowledge panels are hugely popular in Google’s results — they show up in ~38% of MozCast’s dataset. But they’re not nearly as popular for search click activity, earning only ~0.5% of clicks.

I’m not totally surprised by that. Knowledge panels are, IMO, more about providing quick answers and details to searchers than they are about drawing the click themselves. If you see Knowledge Panels in your SERPs, don’t panic too much that they’re taking away your CTR opportunity. This made me realize that Keyword Explorer is probably overestimating the degree to which Knowledge Panels remove organic CTR (e.g. Alice Springs, which has only a Knowledge Panel next to 10 blue links, has a CTR opportunity of 64).

#8: What percent of clicks on Google search results go to image blocks?

Images are one of the big shockers of this report overall (more on that later). While MozCast has image blocks in ~11% of Google results, Jumpshot’s data shows images earn 3% of all Google search clicks.

I think this happens because people are naturally drawn to images and because Google uses click data to specifically show images that earn the most engagement. If you’re wondering why your perfectly optimized image isn’t ranking as well in Google Images as you hoped, we’ve got strong suspicions and some case studies suggesting it might be because your visual doesn’t draw the eye and the click the way others do.

If Google only shows compelling images and only shows the image block in search results when they know there’s high demand for images (i.e. people search the web, then click the “image” tab at the top), then little wonder images earn strong clicks in Google’s results.

#9: What percent of clicks on Google search results go to News/Top Stories results?

Gah! We don’t know for now. This one was frustrating and couldn’t be gathered due to Google’s untimely switch from “News Results” to “Top Stories,” which happened partway through the data collection period. We hope to have this in the summer, when we’ll be collecting and comparing results again.

#10: What percent of clicks on Google search results go to Twitter block results?

I was expecting this one to be relatively small, and it is, though it slightly exceeded my expectations. MozCast has tweet blocks showing in ~7% of SERPs, and Jumpshot shows those tweets earning ~0.23% of all clicks.

My guess is that the tweets do very well for a small set of search queries, and tend to be shown less (or shown lower in the results) over time if they don’t draw the click. As an example, search results for my name show the tweet block between organic position #1 and #2 (either my tweets are exciting or the rest of my results aren’t). Compare that to David Mihm, who tweeted only rarely for a long while and has only recently been more active — his tweets sit between positions #4 and #5. Or contrast with Dr. Pete, whose tweets are above the #1 spot!

#11: What percent of clicks on Google search results go to YouTube?

Technically, there are rare occasions when a video from another provider (usually Vimeo) can appear in Google’s SERPs directly. But more than 99% of videos in Google come from YouTube (which violates anti-competitive laws IMO, but since Google pays off so many elected representatives, it’s likely not an issue for them). Thus, we chose to study only YouTube rather than all video results.

MozCast shows videos in 6.3% of results, just below tweets. In Jumpshot’s data, YouTube’s engagement massively over-performed its raw visibility, drawing 1.8% of all search clicks. Clearly, for those searches with video intent behind them, YouTube is delivering well.

#12: What percent of clicks on Google search results go to personalized Gmail/Google Mail results?

I had no guess at all on this one, and it’s rarely discussed in the SEO world because it’s relatively difficult to influence and fairly obscure. We don’t have tracking data via MozCast because these only show in personalized results for folks logged in to their Gmail accounts when searching, and Google chooses to only show them for certain kinds of queries.

Jumpshot, however, thanks to clickstream tracking, can see that 0.16% of search clicks go to Gmail or Google Mail following a query, only a little under the number of clicks to tweets.

#13: What percent of clicks on Google search results go to Google Shopping results?

The Google Shopping ads have become pretty compelling — the visuals are solid, the advertisers are clearly spending lots of effort on CTR optimization, and the results, not surprisingly, reflect this.

MozCast has Shopping results in 9% of queries, while clickstream data shows those results earning 0.55% of all search clicks.

#14: What percent of Google searches result in a click on a Google property?

Google has earned a reputation over the last few years of taking an immense amount of search traffic for itself. From YouTube to Google Maps to Gmail to Google Books to the Google App Store on mobile, and even Google+, there’s a strong case to be made that Google’s eating into opportunity for 3rd parties with bets of their own that don’t have to play by the rules.

Honestly, I’d have estimated this in the 20–30 percent range, so it surprised me to see that, from Jumpshot’s data, all Google properties earned only 11.8% of clicks from distinct searches (only 8.4% across all searches). That’s still significant, of course, and certainly bigger than it was 5 years ago, but given that we know Google’s search volume has more than doubled in the last 5 years, we have to be intellectually honest and say that there’s vastly more opportunity in the crowded-with-Google’s-own-properties results today than there was in the cleaner-but-lower-demand SERPs of 5 years ago.

#15: What percent of all searches happen on any major search property in the US?

I asked Jumpshot to compare 10 distinct web properties, add up all the searches they receive, and share the percent distribution. The results are FASCINATING!

Here they are in order:

  1. Google.com 59.30%
  2. Google Images 26.79%
  3. YouTube.com 3.71%
  4. Yahoo! 2.47%
  5. Bing 2.25%
  6. Google Maps 2.09%
  7. Amazon.com 1.85%
  8. Facebook.com 0.69%
  9. DuckDuckGo 0.56%
  10. Google News 0.28%

I’ve also created a pie chart to help illustrate the breakdown:

Distribution of US Searches October 2016

If the Google Images data shocks you, you’re not alone. I was blown away by the popularity of image search. Part of me wonders if Halloween could be responsible. We should know more when we re-collect and re-analyze this data for the summer.

Image search wasn’t the only surprise, though. Bing and Yahoo! combine for not even 1/10th of Google.com’s search volume. DuckDuckGo, despite their tiny footprint compared to Facebook, have almost as many searches as the social media giant. Amazon has almost as many searches as Bing. And YouTube.com’s searches are nearly twice the size of Bing’s (on web browsers only — remember that Jumpshot won’t capture searches in the YouTube app on mobile, tablet, or TV devices).

For the future, I also want to look at data for Google Shopping, MSN, Pinterest, Twitter, LinkedIn, Gmail, Yandex, Baidu, and Reddit. My suspicion is that none of those have as many searches as those above, but I’d love to be surprised.

BTW — if you’re questioning this data compared to Comscore or Nielsen, I’d just point out that Jumpshot’s panel is vastly larger, and their methodology is much cleaner and more accurate, too (at least, IMO). They don’t do things like group site searches on Microsoft-owned properties into Bing’s search share or try to statistically sample and merge methodologies, and whereas Comscore has a *global* panel of 2 million, Jumpshot’s *US-only* panel of devices is considerably larger.

#16: What’s the distribution of search demand across keywords?

Let’s go back to looking only at keyword searches on Google. Based on October’s searches, the top 1MM queries account for about 25% of all searches, the top 10MM for about 45%, and the top 1BB for close to 90%. Jumpshot’s kindly illustrated this for us:

The long tail is still very long indeed, with a huge amount of search volume taking place in keywords outside the top 10 million most-searched-for queries. In fact, almost 25% of all search volume happens outside the top 100 million keywords!
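As a rough illustration of how a head-vs.-tail curve like this is computed, here's a sketch using a simulated Zipf-like volume distribution. The exponent and keyword count are invented; the real curve comes from Jumpshot's data.

```python
# Simulated long-tail curve: what share of all searches do the top-N
# keywords capture? The Zipf exponent (0.85) is a made-up placeholder.
import numpy as np

N_KEYWORDS = 1_000_000
ranks = np.arange(1, N_KEYWORDS + 1)
volumes = 1.0 / ranks ** 0.85

cumulative_share = np.cumsum(volumes) / volumes.sum()
for top_n in (1_000, 10_000, 100_000):
    print(f"Top {top_n:>7,} keywords: {cumulative_share[top_n - 1]:.0%} of searches")
```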

I illustrated this last summer with data from Russ’ analysis based on Clickstre.am data, and it matches up fairly well (though not exactly; Jumpshot’s panel is far larger).

#17: How many words does the average desktop vs. mobile searcher use in their queries?

According to Jumpshot, a typical searcher uses about 3 words in their search query. Desktop queries are slightly longer on average, because desktop has a somewhat higher share of queries of 6 words or more (16% on desktop vs. 14% on mobile).

I was actually surprised to see how close desktop and mobile are. Clearly, there’s not as much separation in query formation as some folks in our space have estimated (myself included).

#18: What percent of queries are phrased as questions?

For this data, Jumpshot used any queries that started with the typical “Who,” “What,” “Where,” “When,” “Why,” and “How,” as well as “Am” (e.g. Am I registered to vote?) and “Is” (e.g. Is it going to rain tomorrow?). The data showed that ~8% of search queries are phrased as questions.
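That classification rule is simple enough to sketch directly. This mirrors the first-word test described above; the three-query sample is purely illustrative.

```python
# First-word question test, per the word list above.
QUESTION_STARTERS = {"who", "what", "where", "when", "why", "how", "am", "is"}

def is_question(query: str) -> bool:
    """True if the query starts with one of the typical question words."""
    words = query.lower().split()
    return bool(words) and words[0] in QUESTION_STARTERS

sample = ["am i registered to vote", "seattle weather",
          "how to remove raspberry stains"]
share = sum(map(is_question, sample)) / len(sample)
print(f"{share:.0%} of this tiny sample are questions")  # 67%
```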

#19: What is the difference in paid vs. organic CTR on mobile compared to desktop?

This is one of those data points I’ve been longing for over many years. We’ve always suspected CTR on mobile is lower than on desktop, and now it’s confirmed.

For mobile devices, 40.9% of Google searches result in an organic click, 2% in a paid click, and 57.1% in no click at all. For desktop devices, 62.2% of Google searches result in an organic click, 2.8% in a paid click, and 35% in no click. That’s a pretty big delta, and one that illustrates how much more opportunity there still is in SEO vs. PPC. SEO has ~20X more traffic opportunity than PPC on both mobile and desktop. If you’ve been arguing that mobile has killed SEO or that SERP features have killed SEO or, really, that anything at all has killed SEO, you should probably change that tune.
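The ~20X figure falls straight out of those percentages:

```python
# Organic-to-paid click ratios implied by the Jumpshot numbers above.
mobile_organic, mobile_paid = 40.9, 2.0
desktop_organic, desktop_paid = 62.2, 2.8

print(f"Mobile:  {mobile_organic / mobile_paid:.1f}x organic vs. paid")    # ~20x
print(f"Desktop: {desktop_organic / desktop_paid:.1f}x organic vs. paid")  # ~22x
```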

#20: What percent of queries on Google result in the searcher changing their search terms without clicking any results?

You search. You don’t find what you’re seeking. So, you change your search terms, or maybe you click on one of Google’s “Searches related to…” at the bottom of the page.

I’ve long wondered how often this pattern occurs, and what percent of search queries lead not to an answer, but to another search altogether. The answer is shockingly big: a full 18% of searches lead to a change in the search query!

No wonder Google has made related searches and “people also ask” such a big part of the search results in recent years.

#21: What percent of Google queries lead to more than one click on the results?

Some of us use ctrl+click to open up multiple tabs when searching. Others click one result, then click back and click another. Taken together, all the behaviors that produce more than one click following a single search query in a session account for 21% of searches.

#22: What percent of Google queries result in pogo-sticking (i.e. the searcher clicks a result, then bounces back to the search results page and chooses a different result)?

As SEOs, we know pogo-sticking is a bad thing for our sites, and that Google is likely using this data to reward pages that don’t get many pogo-stickers and nudge down those who do. Altogether, Jumpshot’s October data saw 8% of searches that followed this pattern of search > click > back to search > click a different result.
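For a sense of how such a pattern might be flagged in a clickstream, here's a hedged sketch. The event representation is invented, and Jumpshot's real detection logic is surely more involved.

```python
# Hedged sketch: detect search > click > back to SERP > different click.
# The (kind, value) event stream is an invented representation.
def is_pogo_stick(events):
    """True if a searcher returned to the SERP and chose a different result."""
    last_click = None
    returned_to_serp = False
    for kind, value in events:
        if kind == "serp":
            returned_to_serp = last_click is not None  # back-button to results
        elif kind == "click":
            if returned_to_serp and value != last_click:
                return True
            last_click = value
    return False

session = [("serp", "best ryokan"), ("click", "siteA.example"),
           ("serp", "best ryokan"), ("click", "siteB.example")]
print(is_pogo_stick(session))  # True
```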

Over time, if Google succeeds at its mission of satisfying more searchers, we’d expect this number to go down. We’ll watch for that the next time we collect results and see what happens.

#23: What percent of clicks on non-Google properties in the search results go to a domain in the top 100?

Many of us in the search and web marketing world have been worried about whether search and SEO are becoming “winner-take-all” markets. Thus, we asked Jumpshot to look at the distribution of clicks to the 100 domains that received the most Google search traffic (excluding Google itself) vs. those outside the top 100.

The results are somewhat relieving: 12.6% of all Google clicks go to the top 100 search-traffic-receiving domains. The other 87.4% are to sites in the chunky middle and long tail of the search-traffic curve.


Phew! That’s an immense load of powerful data, and over time, as we measure and report on this with our Jumpshot partners, we’re looking forward to sharing trends and additional numbers, too.

If you’ve got a question about searcher behavior or search/click patterns, please feel free to leave it in the comments. I’ll work with Russ and Randy to prioritize those requests and make the data available. It’s my goal to have updated numbers to share at this year’s MozCon in July.


** The following questions and responses from Jumpshot can illustrate some of the data and methodology’s limitations:

Rand: What search sources, if any, might be missed by Jumpshot’s methodology?
Jumpshot: We only looked at Google.com, except for the one question that asked specifically about Amazon, YouTube, DuckDuckGo, etc.

Rand: Do you, for example, capture searches performed in all Google apps (maps, search app, Google phone native queries that go to the web, etc)?
Jumpshot: Nothing in-app, but anything that opens a mobile browser — yes.

Rand: Do you capture all voice searches?
Jumpshot: If it triggers a web browser either on desktop or on mobile, then yes.

Rand: Is Google Home included?
Jumpshot: No.

Rand: Are searches on incognito windows included?
Jumpshot: Yes, they should be. Since the plug-in is at the device level, we track any URL regardless.

Rand: Would searches in certain types of browsers (desktop or mobile) not get counted?
Jumpshot: From a browser perspective, no. But remember we have no iOS data so any browser being used on that platform will not be recorded.



Google Algorithmic Penalties Still Happen, Post-Penguin 4.0

Posted by MichaelC-15022

When Penguin 4.0 launched in September 2016, the story from Gary Illyes of Google was that Penguin now just devalued spammy links, rather than penalizing a site by adjusting the site’s ranking downward, AKA a penalty.

Apparently for Penguin there is now “less need” for a disavow, according to a Facebook discussion between Gary Illyes and Barry Schwartz of Search Engine Land back in September. He suggested that webmasters can help Google find spammy sites by disavowing links they know are bad. He also mentioned that manual actions still happen — and so I think we can safely infer that the disavow file is still useful in manual penalty recovery.

But algorithmic penalties DO still exist. A client of mine, who’d in the past built a lot of really spammy links to one of their sites, had me take a look at their backlinks about 10 days ago and build a disavow file. There was no manual penalty indicated in Search Console, but they didn’t rank at all for terms they were targeting — and they had a plenty strong backlink profile even after ignoring the spammy links.

I submitted the disavow file on March 2nd, 2017. Here’s the picture of what happened to their traffic:

4 days after the disavow file submission, their traffic went from just a couple hundred visits/day from Google search to nearly 3,000.

Penguin might no longer be handing out penalties, but clearly there are still algorithmic penalties handed out by Google. And clearly, the disavow file still works on these algorithmic penalties.

Perhaps we just need to give them another animal name. (Personally, I like the Okapi… goes along with the black-and-white animal theme, and, like Google algorithmic penalties, hardly anyone knows they still exist.)

Image courtesy Chester Zoo on Flickr.

I look forward to animated comments from other SEOs and webmasters who might have been suspecting the same thing!



Rankings Correlation Study: Domain Authority vs. Branded Search Volume

Posted by Tom.Capper

A little over two weeks ago I had the pleasure of speaking at SearchLove San Diego. My presentation, Does Google Still Need Links, looked at the available evidence on how and to what extent Google is using links as a ranking factor in 2017, including the piece of research that I’m sharing here today.

One of the main points of my presentation was to argue that while links still do represent a useful source of information for Google’s ranking algorithm, Google now has many other sources, most of which they would never have dreamed of back when PageRank was conceived as a proxy for the popularity and authority of websites nearly 20 years ago.

Branded search volume is one such source of information, and one of the sources that is most accessible for us mere mortals, so I decided to take a deeper look at how it compares with a link-based metric. It also gives us some interesting insight into the KPIs we should be pursuing in our off-site marketing efforts — because brand awareness and link building are often conflicting goals.

For clarity, by branded search volume, I mean the monthly regional search volume for the brand of a ranking site. For example, for the page https://www.walmart.com/cp/Gift-Cards/96894, this would be the US monthly search volume for the term “walmart” (as given by Google Keyword Planner). I’ve written more about how I put together this dataset and dealt with edge cases below.

When picking my link-based metric for comparison, domain authority seemed a natural choice — it’s domain-level, which ought to be fair given that generally that’s the level of precision with which we can measure branded search volume, and it came out top in Moz’s study of domain-level link-based factors.

A note on correlation studies

Before I go any further, here’s a word of warning on correlation studies, including this one: They can easily miss the forest for the trees.

For example, the fact that domain authority (or branded search volume, or anything else) is positively correlated with rankings could indicate that any or all of the following are likely:

  • Links cause sites to rank well
  • Ranking well causes sites to get links
  • Some third factor (e.g. reputation or age of site) causes sites to get both links and rankings

That’s not to say that correlation studies are useless — but we should use them to inform our understanding and prompt further investigation, not as the last word on what is and isn’t a ranking factor.

Methodology

(Or skip straight to the results!)

The Moz study referenced above took the 800 sample keywords provided for each of the 22 top-level categories in Google Keyword Planner, then looked at the top 50 results for each of these. After de-duplication, this results in 16,521 queries. Moz looked at only web results (no images, answer boxes, etc.), ignored queries with fewer than 25 results in total, and, as far as I can tell, used desktop rankings.

I’ve taken a slightly different approach. I reached out to STAT to request a sample of ~5,000 non-branded keywords for the US market. Like Moz, I stripped out non-web results, but unlike Moz, I also stripped out anything with a baserank worse than 10 (baserank being STAT’s way of presenting the ranking of a search result when non-web results are excluded). You can see the STAT export here.

Moz used Mean Spearman correlations, which is a process that involves ranking variables for each keyword, then taking the average correlation across all keywords. I’ve also chosen this method, and I’ll explain why using the example below:

| Keyword | SERP Ranking Position | Ranking Site | Branded Search Volume of Ranking Site | Per Keyword Rank of Branded Search Volume |
|---|---|---|---|---|
| Keyword A | 1 | example1.com | 100,000 | 1 |
| Keyword A | 2 | example2.com | 10,000 | 2 |
| Keyword A | 3 | example3.com | 1,000 | 3 |
| Keyword A | 4 | example4.com | 100 | 4 |
| Keyword A | 5 | example5.com | 10 | 5 |

For Keyword A, we have wildly varying branded search volumes in the top 5 search results. This means that search volume and rankings could never be particularly well-correlated, even though the results are perfectly sorted in order of search volume.

Moz’s approach avoids this problem by comparing the ranking position (the 2nd column in the table) with the column on the far right of the table — how each site ranks for the given variable.

In this case, correlating ranking directly with search volume would yield a correlation of -0.75 (negative, because higher volumes sit at better, i.e. lower-numbered, positions). Correlating with ranked search volume instead yields a perfect correlation of 1.

This process is then repeated for every keyword in the sample (I counted desktop and mobile versions of the same keyword as two keywords), then the average correlation is taken.
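Here's a compact sketch of that mean Spearman procedure, using the Keyword A table as toy input. Note that scipy's spearmanr does the per-keyword ranking internally; the sign flip is my convention so that "more volume at better positions" reads as a positive correlation.

```python
# Mean Spearman sketch; toy SERPs stand in for the STAT export.
import numpy as np
from scipy.stats import spearmanr

serps = {
    "keyword a": ([1, 2, 3, 4, 5], [100_000, 10_000, 1_000, 100, 10]),
    "keyword b": ([1, 2, 3], [500, 80_000, 2_000]),
}

correlations = []
for positions, brand_volume in serps.values():
    rho, _ = spearmanr(positions, brand_volume)  # ranks both variables itself
    correlations.append(-rho)  # flip sign: position 1 is best, so high volume
                               # at low position numbers should count positive
print("Mean Spearman correlation:", np.mean(correlations))  # 0.25 here
```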

Defining branded search volume

Initially, I thought that pulling branded search volume for every site in the sample would be as simple as looking up the search volume for their domain minus its subdomain and TLD (e.g. “walmart” for https://www.walmart.com/cp/Gift-Cards/96894). However, this proved surprisingly deficient. Take these examples:

  • www.cruise.co.uk
  • ecotalker.wordpress.com
  • www.sf.k12.sd.us

Are the brands for these sites “cruise,” “wordpress,” and “sd,” respectively? Clearly not. To figure out what the branded search term was, I started by taking each potential candidate from the URL, e.g., for ecotalker.wordpress.com:

  • Ecotalker
  • Ecotalker wordpress
  • Wordpress.com
  • Wordpress

I then worked out what the highest search volume term was for which the subdomain in question ranked first — which in this case is a tie between “Ecotalker” and “Ecotalker wordpress,” both of which show up as having zero volume.

I’m leaning fairly heavily on Google’s synonym matching in search volume lookup here to catch any edge-edge-cases — for example, I’m confident that “ecotalker.wordpress” would show up with the same search volume as “ecotalker wordpress.”
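A naive version of that candidate-generation step might look like the sketch below. Real code would need a public-suffix list to handle hosts like www.sf.k12.sd.us correctly; this handles only the simple ecotalker.wordpress.com shape.

```python
# Naive brand-candidate generation, per the ecotalker.wordpress.com example.
# A real implementation should parse hosts with a public-suffix list.
def brand_candidates(hostname: str) -> list[str]:
    parts = hostname.lower().removeprefix("www.").split(".")
    if len(parts) > 2:  # subdomain present, e.g. ecotalker.wordpress.com
        sub, domain = parts[0], parts[1]
        return [sub, f"{sub} {domain}", f"{domain}.com", domain]
    return [parts[0]]   # e.g. walmart.com -> "walmart"

print(brand_candidates("ecotalker.wordpress.com"))
# ['ecotalker', 'ecotalker wordpress', 'wordpress.com', 'wordpress']
```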

You can see the resulting dataset of subdomains with their DA and branded search volume here.

(Once again, I’ve used STAT to pull the search volumes in bulk.)

The results: Brand awareness > links

Here’s the main story: branded search volume is better correlated with rankings than domain authority is.

However, there are a few other points of interest here. Firstly, neither of these variables has a particularly strong correlation with rankings — a perfect correlation would be 1, and I’m finding a correlation between domain authority and rankings of 0.071, and a correlation between branded search volume and rankings of 0.1. This is very low by the standards of the Moz study, which found a correlation of 0.26 between domain authority and rankings using the same statistical methods.

I think the biggest difference that accounts for this is Moz’s use of 50 web results per query, compared to my use of 10. If true, this would imply that domain authority has much more to do with what it takes to get you onto the front page than it has to do with ranking in the top few results once you’re there.

Another potential difference is in the types of keywords in the two samples. Moz’s study has a fairly even breakdown of keywords between the 0–10k, 10k–20k, 20k–50k, and 50k+ buckets:

On the other hand, my keywords were more skewed towards the low end:

However, this doesn’t seem to be the cause of my lower correlation numbers. Take a look at the correlations for high-volume keywords (10k+) only in my dataset:

Although the matchup between the two metrics gets a lot closer here, the overall correlations are still nowhere near as high as Moz’s, leading me to attribute that difference more to their use of 50 ranking positions than to the keywords themselves.

It’s worth noting that my sample size of high-volume queries is only 980.

Regression analysis

Another way of looking at the relationship between two variables is to ask how much of the variation in one is explained by the other. For example, the average rank of a page in our sample is 5.5. If we have a specific page that ranks at position 7, and a model that predicts it will rank at 6, we have explained 33% of its variation from the average rank (for that particular page).

Using the data above, I constructed a number of models to predict the rankings of pages in my sample, then charted the proportion of variance explained by those models below (you can read more about this metric, normally called the R-squared, here).

Some explanations:

  • Branded Search Volume of the ranking site – as discussed above
  • Log(Branded Search Volume) – Taking the log of the branded search volume for a fairer comparison with domain authority, where, for example, a DA 40 site is much more than twice as well linked to as a DA 20 site.
  • Ranked Branded Search Volume – How this site’s branded search volume compares to that of other sites ranking for the same keyword, as discussed above

Firstly, it’s worth noting that despite the very low R-squareds, all of the variables listed above were highly statistically significant — in the worst case, within one ten-millionth of a percent of being 100% significant. (In the best case, a vigintillionth of a vigintillionth of a vigintillionth of a nonillionth of a percent away.)

However, the really interesting thing here is that including ranked domain authority and ranked branded search volume in the same model explains barely any more variation than just ranked branded search volume on its own.

To be clear: Nearly all of the variation in rankings that we can explain with reference to domain authority we could just as well explain with reference to branded search volume. On the other hand, the reverse is not true.
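
To illustrate the shape of that comparison, here’s a hedged sketch using scikit-learn on synthetic stand-in data (the feature names and data are mine; the real dataset is linked below):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)
n = 5_000

# Synthetic stand-in: branded-volume rank loosely predicts position,
# and DA rank is mostly a noisy echo of branded-volume rank.
position = rng.integers(1, 11, n).astype(float)
ranked_volume = position + rng.normal(0, 3, n)
ranked_da = ranked_volume + rng.normal(0, 2, n)
df = pd.DataFrame({"position": position,
                   "ranked_volume": ranked_volume,
                   "ranked_da": ranked_da})

def r_squared(features: list[str]) -> float:
    X, y = df[features], df["position"]
    return LinearRegression().fit(X, y).score(X, y)  # in-sample R^2

print(r_squared(["ranked_da"]))                    # lowest
print(r_squared(["ranked_volume"]))                # higher
print(r_squared(["ranked_da", "ranked_volume"]))   # barely above the last
```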

If you’d like to look into this data some more, the full set is here.

Nice data. Why should I care?

There are two main takeaways here:

  1. If you care about your domain authority because it’s correlated with rankings, then you should care at least as much about your branded search volume.
  2. The correlation between links and rankings might sometimes be a bit of a red herring — it could be that links are themselves merely correlated with some third factor that better explains rankings.

There are also a bunch of softer takeaways here, particularly around how weak (albeit highly statistically significant) both sets of correlations were. This places even more emphasis on relevancy and intent, which presumably make up the rest of the picture.

If you’re trying to produce content to build links, or if you find yourself reading a post or watching a presentation around this or any other link building techniques in the near future, there are some interesting questions here to add to those posed by Tomas Vaitulevicius back in November. In particular, if you’re producing content to gain links and brand awareness, it might not be very good at either, so you need to figure out what’s right for you and how to measure it.

I’m not saying in any of this that “links are dead,” or anything of the sort — more that we ought to be a bit more critical about how, why, and when they’re important. In particular, I think that they might be of increasingly little importance on the first page of results for competitive terms, but I’d be interested in your thoughts in the comments below.

I’d also love to see others conduct similar analysis. As with any research, cross-checking and replication studies are an important step in the process.

Either way, I’ll be writing more around this topic in the near future, so watch this space!


Better Alternatives to "Expert Roundup"-Style Content – Whiteboard Friday

Posted by randfish

You may be tempted to publish that newest round of answers you’ve gotten from industry experts, but hold off — there’s a better way. In today’s Whiteboard Friday, Rand explains why expert roundups just aren’t the best use of your time and effort, and how to pivot your strategy to create similar content that’ll make the juice worth the squeeze.

Alternatives to expert roundup style content


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to look at some better alternatives to the expert roundup-style content that’s become extremely popular on the web. There are a few reasons why it’s popular. So let’s talk about why SEOs and content marketers do so many expert roundups, why this became a popular content format.

Why do SEOs and content marketers even use “expert roundups?”

Okay. It turns out that if you’ve got a piece of content that’s like “75 Experts Share Their Favorite Constitutional Law Cases” (maybe you interviewed a bunch of constitutional law scholars and put the article together), there are a bunch of nice things that you actually do get from this, which is why people use this format, right?

You kind of get automatic outreach, because if you talk to these people, you’ve had a connection with them. You’ve built a little bit of a relationship. There’s now something of an incentive to share for these folks and the potential for a link. All of those are elements that people, or rather marketers, are looking for from their content.

The nice thing is you’ve got this large cadre of individuals who have contributed, and they create the content, which means you don’t have to, saving you a bunch of time and energy. They become your amplifier, so you can kind of sit back and relax when it comes time to broadcast it out there. You just tell them it’s ready, and they go and push it. They also lend your content credibility, so even if you don’t have any credibility with your brand or your website, they deliver it for you.

There are a few big problems with this kind of content.

Those are all really nice things. Don’t get me wrong. I understand why. But there are some big, big problems with expert roundup-style content.

1. Like many easy-to-replicate tactics, expert roundups become WAY overdone.

First one, like many easy-to-replicate tactics, expert roundups have gotten spammed to heck. They became way, way overdone. I get emails like this: “Dear Fishkin, I roundup. You write. Do this. Then share. Okay. Bye, Spammy McSpams-A-Lot.”

Look, Mr. McSpams-A-Lot, I appreciate how often you think of me. I love that every day there are a couple of offers like this in my inbox. I try to contribute to fewer than one every two or three weeks, and only the ones that look super credible and really interesting. But jeez, can you imagine, if you’re truly an expert who can lend credibility and create lots of amplification, how overwhelmed you must be with these kinds of requests? People are probably getting very tired of reading them, too, especially in certain market segments where they’ve become way too overdone.

2. It’s hard for searchers to get valuable, useful info via this format — and search engines don’t like it, either.

But even if it’s the case that you can get all these experts to contribute and it’s not overdone in your market space, there are two other big problems. One, the content format is awful, awful for trying to get valuable and useful information. It rarely actually satisfies either searchers or engines.

If you search for constitutional law cases and you see “75 Experts Share Their Favorite Constitutional Law Cases,” you might click. But my god, have you gone through those types of content? Have you tried to read a lot of those roundups? They are usually awful, just terrible.

You might get a nugget here or there, but there’s a bunch of contributions that are multiple paragraphs long and try to include links back to wherever the expert is trying to get their links going. There’s a bunch of them that are short and meaningless. Many of them overlap.

It’s annoying. It’s bad. It’s not well-curated. It’s not well-put together. There are exceptions. Sometimes people put real effort into them and they get good, but most of the time these are real bad things, and you rarely see them in the search results.

BuzzSumo did a great analysis of the content that gets shares, links, and rankings. Guess what didn’t make the list: expert roundups.

3. Roundups don’t earn as many links, and the traffic spike from tweets is temporary.

Number three. The links that the creators want from these roundups, that they’re hoping they’re going to get, mostly don’t materialize. What usually happens is you get a short traffic spike and some additional engagement, mostly on Twitter, sometimes a little bit on Facebook or LinkedIn, but it’s almost all social activity, and it’s a very brief spike.

5 formats to try instead

So what are some better alternatives? What are some things we can do? Well, I’ve got five for you.

1. Surveys

First off, if you’re going to create content around a roundup, why not follow almost exactly the same process, but rather than asking a single question or a set of questions for people to reply to, ask them to fill out a short survey with a few data points? Then you can create awesome graphs and visuals, which have much stronger link-earning potential. It’s the same outreach effort, but for much more compelling content that often does a better job of ranking and is more newsworthy and link-worthy. I really, really like surveys, and I think they can work tremendously well if you put them together right.

2. Aggregations of public data

Second, let’s say you go, “Oh, Rand, that would be great, but I want to survey people about this thing, and they won’t give me the information that I’m looking for.” Never fear. You can aggregate public data.

So a lot of these pieces of information that may be interesting to your audience, that you could use to create cool visuals, the graphs and charts and all that kind of thing and trend lines, are actually available on the web. All you need to do is cite those sources, pull in that data, build it yourself, and then you can outreach to the people who are behind these companies or these organizations or these individuals, and then say, “Hey, I made this based on public data. Can you correct any errors?” Now you’ve got the outreach, which can lead to the incentive to share and to build a link. Very cool.
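
As a quick illustration (mine, not from the video), aggregating a public table in pandas and charting a trend line might look something like this; the data here is a placeholder for whatever sources you’d actually cite:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder rows standing in for figures pulled from public sources;
# in practice you'd read a published CSV or table and cite the original.
df = pd.DataFrame({
    "year":   [2013, 2014, 2014, 2015, 2015, 2016],
    "metric": [1.1, 1.3, 1.4, 1.6, 1.5, 2.0],
})

trend = df.groupby("year")["metric"].mean()  # one aggregate point per year
ax = trend.plot(marker="o", title="Example trend built from public data")
ax.set_xlabel("year")
ax.set_ylabel("metric (aggregated)")
plt.savefig("trend.png")  # the kind of visual that earns links
```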

3. Experiments and case studies

So this is taking a much smaller group, saying, “I’m only going to work with this one person or these couple of people, or I’m going to do it myself. Here’s what Seattle’s most influential law firm found when they challenged 10 state laws.” Well, there you go. Now I’ve got an interesting, wholly formed case study. I only had to work with one expert, but chances are good that lots and lots of people will be interested in this. It’s also excellent for newsworthiness. It often can get lots of press coverage in whatever industry you’re in.

4. Seeking out controversial counter-opinions on a topic

Fourth, if you’re going to do a roundup-style piece and collect multiple opinions, see if you can find a few points, or a single subject, around which multiple experts hold different opinions. That could be just two people, or four or five, or seven or eight, but you’re basically trying to create this controversy.

You’re saying like, “Here are these people on this side of this issue. Here are these people on this side of this issue, Wil Reynolds versus Rand Fishkin on link building.” I think we did a presentation like that in Minneapolis last year or a couple years ago. It was super fun. Wil and I got up on stage, and we sort of debated with each other. There were no losers in that debate. It was great.

This leverages the emotional response of conflict that you’re seeking. It creates far more engaging content, and there’s more incentive for the parties who participate to link and share, because they’re showing off their opinion and trying to make counterpoints. You can get a lot of good things from it.

5. Not just text!

Number five. If you’ve decided, “You know what? None of these formats or any others work. I really, really want to do a roundup. I think it can work for me,” okay. But do me a favor and try something that is not just text.

Muzli is a newsletter I subscribe to in the design world that does lots of roundup-style content, but the roundups are all visuals: UI interactions, GIFs, animations, and illustrations. I actually really love those. They get great engagement, and they rank quite well, by the way. Many of the pieces they link to in the newsletter do well.

You can do this with visuals. You can do it with data. You could do it with revenue numbers. You could do it with tools. You could do it with products, whatever it is.

I would suggest thinking a little more broadly than, “Dear Fishkin, I roundup. You write.” I think that there’s a lot more opportunity outside of the pure expert roundup space, and I hope you’ll share your creative ideas with us and the successes you’ve seen.

We look forward to seeing you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

