The 2013 Local Search Ecosystems (and a GetListed Upgrade)

Posted by David-Mihm

Well, it’s been nearly a year since I published the last version(s) of this graphic. That’s a long time in a space that evolves as quickly as Local Search, but frankly, 2013 hasn’t seen quite the turmoil of 2012, in which Google+ Local, Apple Maps, and Facebook Nearby were all released within seven months of each other.

We’ll be adding all of these graphics to the GetListed.org Learning Center in the next few weeks, with full references and screenshots showing attribution. But while I had a bit of time before the fall conference season—I’ll be speaking more about these at Local University Advanced in just a few weeks—I thought I’d consolidate my thoughts and get them into a blog post.

My thoughts on the U.S. ecosystem

The Big Three are now the Big Four

Since I first started researching the local search space back in 2006-2007, Infogroup, Localeze, and Acxiom have been the undisputed primary data suppliers in the U.S.

Although multiple independent sources reported this summer that Yelp no longer actively ingests data from Acxiom, Acxiom is one of only two suppliers mentioned on Google Maps’ legal notices page, and they’ve fed data to Apple Maps since it launched.

It’s always been difficult for me to recommend an answer to the question, “Which data aggregator would you pay to manage data with?” My standard answer has always been “all three.” But if you are looking to prioritize your local marketing spend, I hope the graphics below showing each provider’s publicly verifiable network assist with that.

Factual is a relatively new player on the scene—they were barely on my radar less than two years ago. And yet today, if you visit their homepage, you see a who’s who of local search portals, including Yelp, Bing, and TripAdvisor. It’s clear they’re a force to be reckoned with, especially globally (more on that below).

Aside: the GetListed upgrade

As a result of Acxiom’s resurgence and Factual’s emergence, for the last several months we’ve been working to add both to the roster of data platforms we display on GetListed. I’m excited to announce their release today. Big thanks to Adrian, Frank, and Josh for making those additions happen this summer.

Foursquare as a data provider?

The fragmentation of the location-based app market is only going to increase, and like Factual, Foursquare has set its sights on becoming “the location layer for the Internet.” Its API has been quite reliable for GetListed, at least, and it surely counts a healthy percentage of web developers among its 40-odd million users, whom it’s now enlisting in a quest to provide extremely fine-grained venue data.

If Foursquare can expand its typical venue categories beyond food, drink, and entertainment, it could become even more of a key player despite a declining rate of user growth. I still wouldn’t be surprised to see Foursquare purchased by the end of the year, but the list of companies who both need and could afford it is shrinking considerably as its dataset continues to get better.

The traditional IYPs have it tough

From a citation-strength standpoint, few traditional directories are competing favorably with Yelp across a broad array of categories. Citysearch, Superpages, Yahoo, and YP.com are still very strong players, but with Citysearch laying off a substantial percentage of its staff recently and Superpages’ merger with Dex, it’s pretty clear that a lot of consolidation and reconfiguration is happening among the major players.

It also seems that vertical and geo-focused directories, and even unstructured local citations, are playing a larger role than ever in competitive search categories. With so many traditional local search sites offering free listings to business owners, citations from traditional providers now appear to be “table stakes” in Local SEO…but the sites that offer those listings are continuing to have a hard time monetizing them.

What’s Apple up to?

It’s been almost exactly a year since Apple’s less-than-impressive release of Maps. The good folks in Cupertino went silent for a good long while before making a couple of key summer acquisitions: Locationary and HopStop. For our little world, Locationary is the more relevant purchase. Grant Ritchie and his team essentially built their own version of Map Maker (see below)—an efficient system of ingesting data from multiple sources and making sense of it.

I don’t see the Locationary acquisition affecting any of Apple’s existing data relationships imminently, but I expect we’ll start to see a much faster pace of innovation with their mapping platform in the coming year. And the quality of data will get considerably better as Apple beefs up its Ground Truth and engineering forces.

The continued importance of Google Map Maker

One of the least-heralded but most important stories of the last year has been Google’s unification of its backend location database. There are now effectively four (and possibly more) public front-ends to this database: “Report a Problem” submissions, Places management, Google+ Page management, and the Map Maker interface itself.

There’s still no substitute for querying Map Maker directly if you’re having persistent issues with incorrect business categorization, PIN placement, or duplicate listings, and Map Maker’s release in many, many more countries—including longtime holdout Italy—makes it a relevant and useful tool for SEOs almost no matter where your clients are.

Internationally speaking

One of the least-obvious facts for newcomers to local search is that, other than Google’s central position, every country’s ecosystem is different. Factual is one of the very few companies with a reliable global dataset, and the search giant relies on a completely different set of providers in each country where Maps operates. Typically these are established yellow pages players, such as YPG in Canada, Telelistas in Brazil, and Sensis in Australia.

Secondary and tertiary relationships can be considerably harder to tease out, but the graphics below represent my best effort to reconstruct these markets. I received a considerable amount of help on both Germany and Australia from Nyagoslav Zhekov of NGS Marketing, who may have more experience building citations in international markets than anyone in the world.

Thoughts on Canada:

In my introduction to the international section, I already mentioned the primacy of YPG in supplying data to Google, and in few markets around the world is there a single provider as dominant in its country as YPG. The number of prominent local search sites under the YPG umbrella is impressive, and may be a reason its digital revenues are responsible for a comparatively large share of its overall earnings.

Canada’s also unusual in that an arm of the Canadian Government, Industry Canada, offers such an easily crawlable database of business information to the public. Whether Google has a formal relationship with Industry Canada or not, it’s clear that this data makes it into Google’s index. Thanks to Jen Salamandick of Kick Point for her empirical confirmation of this relationship.

Thoughts on the UK:

The UK features the most complex ecosystem of any country in the world. At first glance, Google should have a dominant provider in BT, but my experience during a two-month sabbatical in the UK in May 2011 indicated that The Local Data Company, Market Location, and 118 Information were all more influential sources for data that would eventually wind up at Google. TouchLocal’s acquisition of Scoot in 2009 makes that duo a significant citation source as well. Qype and Yelp are both extremely well-crawled, and there are a number of geographically focused directories, especially in Greater London, that Google is surely looking at.

Similar to Canada, there are two governmental entities—Companies House and the Royal Mail—whose datasets, I’m sure, provide the backbone for a number of location indexes.

All this means a lot of work for UK SEOs trying to clean up or establish citation profiles for their clients.

Thoughts on Germany:

In preparing for my SMX Munich presentation earlier this year, the primary providers in Germany clearly seemed to be the Deutsche Telekom–GelbeSeiten–Das Örtliche trifecta. German SEOs should not overlook Infobel, however, which is owned by Kapitol S.A., a company mentioned on the Google Maps legal notices page.

There are myriad secondary local search engines in Germany, and in my research, their strength depended on the industry I was investigating. Qype was essentially the only dominant consumer portal horizontally, but Varta Guides and Restaurant-Kritik were exceptionally strong in travel and cuisine. If I were a German SEO, I’d pay special attention to my client’s phone contract records and their listings on the associated GelbeSeiten, Das Telefonbuch, and Das Örtliche, update Qype, and then go straight for industry-specific directories before circling back to the secondary search engines. That’s quite a different workflow from what I’d recommend here in the States.

Thoughts on Brazil:

The Brazilian market strikes me as one of the biggest global opportunities in local search. It’s a huge country with a lot of urban population centers, a relatively well-educated population, and a high percentage of smartphone ownership. And from an SEO standpoint, it appears to be about four to five years behind the United States.

Certainly the complexity of the Local ecosystem is nowhere near that of more established markets. Telelistas and Apontador are the clear market leaders, and Yelp’s purchase of Qype looks like a smart investment in this market.

Conclusion

As I said in the introduction, we’ll be establishing a permanent archive for these ecosystems in the GetListed Learning Center in the next several weeks, but in the meantime, I look forward to hearing your questions and feedback in the comments below!

A final thanks to Gregory T’Kint of James Hargreaves Plumbing, Tom Lynch of Location3, and Russ Offord of Orion Group for their correspondence regarding these ecosystems in the past year.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Forget Google’s Games – Make Social a Primary Traffic Source

Posted by simonpenson

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

For most business owners, social is a toy: the marketing equivalent of that friend we all have outside our professional lives whom you just don’t mention to your work colleagues, even though they’re really the life and soul of the party.

But things are changing. A recent Forrester survey unlocked some startling stats about how we are all discovering things online.

Discovery is Google’s heartland, the very thing it has built its empire upon, and yet it seems a seismic shift is occurring. According to the research, almost 50% of those in the 18-to-23 bracket used social as their primary discovery engine in the last year.

Those stats should make Google think very carefully indeed about its future strategy and how to keep its users, and advertisers, happy.

For a marketer such information should be impetus to think very seriously about strategy, and where to invest that budget in the next few years as we are finally given another way to access audiences at scale online.

Tipping Point

As powerful as that survey is, however, it is not enough on its own to get you to invest hard-fought marketing dollars in social. For me, though, there is an even more powerful argument brewing.

As we know, the ability to intelligently target consumers has always been the ultimate goal for marketers. To do it well you need to be able to collect, and then slice and dice, information about the people using your platform.

Google’s been pretty good at this to date, as the propensity to buy has always been high from search queries. Social, however, takes that data to a whole other level.

It is this, combined with the fact that, in attempting to monetize the social space themselves, the likes of Facebook and Twitter have opened the door for all of us to do the same with our own social audiences, that really does suggest a true tipping point has now been reached.

Is Google walking a dangerous line?

The problem for social until now has always been Google’s dominance, specifically its unrelenting focus on making search the only place you need to look to find your audience.

Until recently, few would have argued with that mantra. But things are changing.

The level of flux in organic SERPs and shrinking margins in paid are making many people look again, not just at their strategy generally but at the trust they have put in the brand for so long. I talk to business owners weekly who say they have ‘had enough’ of having their eggs in one basket and want a ‘safer’, more diverse, strategy.

Combine that with the fact that social now offers both audience size and access to the right people within it and the scales begin to tip significantly. Let’s look at that picture in more detail now.

Social’s key trump card

Until very recently social has been perceived as very much a ‘creative’ game: one for progressing conversations and engaging with people, but not for making money directly. And while this is still very much a big part of the tactical piece, there is now a layer of science sitting above it, which is critical to the success of any strategy.

That layer is all about the collection and interpretation of the right data to inform the entire marketing strategy.

As search marketers we have always known the power of data in informing strategies that convert into sales. Digital marketing is, after all, about not having to guess any more as the data is there to inform the strategy.

Social data takes that insight to a whole other level. Richer and more connected than whatever search can throw at us, it tells us such things as:

  1. The age breakdown of our audience.
  2. How often they interact with our content. (this post digs deeply into this)
  3. Precisely when they want that content and in what form.
  4. What other passions or interests they have.
  5. Deeper demographic data.
  6. Plus much more.

All of the info above can be obtained to a certain depth within Facebook’s Insights interface. A guide to how that works can be found here.

Google is concerned by this and the subsequent ability to target advertising into that space. It’s one of the key motivators behind the creation of Google+ alongside the obvious use the data has as a tool to power its personalisation and semantic plans.

Access is improving

Add better and more robust access to the platforms and we suddenly start to see why social is becoming so attractive. Facebook in particular has started to change mindsets around commercialization of social too and that opens the doors to all marketers to follow suit.

And with APIs opening up and becoming more robust, analytics improving and self-serve ad systems launching, we now have the keys to access the audience.

Where should you invest?

The question now is where you should invest and begin to execute a strategy that returns positive ROI.

The simple answer is to view social not as a ‘community management’ project, but as a science, designed to attract precisely the right people with the right content and to engage with them long enough that they convert. Consistently.

Facebook, Facebook, Facebook

Choosing the platform to center your strategy on is tough, but through testing and experience it has become clear, to me at least, that for 90% of businesses Facebook should be the commercial hub of your activity.

We have spent in excess of £500,000 on social advertising over the past 18 months, and it’s that activity that has taught us the value of Facebook from an ROI perspective. It also lacks any real competitor as a central, all-encompassing social audience aggregator.

Google+ has no real part to play as a pure social platform. There is no doubting the potential there for search benefits, but outside of the tech industries it is very much a wasteland at present.

Twitter is useful as a distribution channel and, if you lean on its ad functionality, it has a place as a broadcast medium, but past that its strength really lies in being used as a customer service tool.

Pinterest is another worth considering, especially if you work in a creative industry, but again its one-dimensional content USP and lack of access makes it a limited option, for now.

Others like LinkedIn are working hard to improve the way they surface content and curate but they still have a long way to go. Sponsored content is certainly a step in the right direction.

And that leaves the king of them all. Facebook. A platform with the three key characteristics required for success:

  1. Access to a large audience (1 billion+ and counting).
  2. Access to data (to understand and measure marketing efforts).
  3. The ability to target and refine strategy based on user interest.

The question then is: “How do I go about making a platform, where I regularly see people sharing pictures of their favourite cat, work for my business?” That’s what we will dig into now.

Three phase strategy

Structure and process is key to making any marketing strategy work and social is no different.

There are three key stages to any social plan and they are:

  • Audience Growth
  • Engagement
  • Monetization

Without any one of the three you are doomed to failure. What’s even more important is that this is no linear process: you need to work consistently on all three, cycling through them one by one to ensure growth and improvement across all of them.

Growth

The first stage is the most important as without enough of the RIGHT people (and we’ll come to that) you simply won’t be able to monetize to a level where the project can be seen as a success.

That does NOT mean aggressively acquiring fans from anywhere. We have had clients come to us with 1 million ‘Likes’, complaining that they can’t monetize. The answer to why always sat very squarely with the type of people they had attracted and a content strategy that simply was not aligned to the right people.

You are looking for a relatively small audience in reality, but one that engages regularly with your content.

But how do you find those people? The answer is actually simpler than you think.

Digging into the data

Those that have access to the Facebook API are a lucky bunch. The data available out the back door is rich enough to make any Google engineer’s eyes water.

While search engines spit out fairly two-dimensional, quantitative data around search behavior, social gives so much more. It tells us about the people: their passions, relationships, loves and hates.

For any marketer that is rocket fuel.

And the great news is that you can not only look at what your audiences are interested in, but also what other brands’ audiences, and even general interest ‘sets’ are into (such as ‘digital marketing’ as a whole). That means you can spy on your competitors, which makes the data even more powerful. And if you have pumped your account full of cheap ‘fans’ and want a cleaner view you have the ability to simply look at similar brand audiences for the answers.

But you haven’t got API access, right? Sure, but there is still hope, and it comes in the form of Facebook’s Power Editor.

While rarely publicized, this little gem of a tool allows the user to dig into and segment data based on pages or groups of ‘Likes’, meaning that you can understand, in granular detail, more about your audience’s interests or those of your competitors: more ‘stuff’ about what they care about as people.

It’s accessible to all: to get it, just follow this simple step-by-step guide to install it on your account. It does require the Chrome browser at this stage, but that will change in time. It’s a free addition to your account that doesn’t require you to spend money to use it, and ‘free tools’ are always a friend to marketers!

We use Power Editor to segment by setting up a series of adverts with different targeting – similar to A/B testing. This allows us to choose different targeting for each segment, and in turn it gives us the estimated reach for each interest set. Capturing and correlating this data allows us to draw great insights in terms of audience interests.

By finding out how many fans of a Page ‘Like’ certain interest sets, such as football-related pages, you can quickly work out generalist interest sets, and from that even correlate against the average Facebook audience to discover whether the brand or Page has a high percentage of football fans, for example.
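
As a rough sketch, the over-index calculation described above is just a ratio of shares: the proportion of a Page’s fans who ‘Like’ an interest set, divided by the proportion of the general Facebook audience who do. All of the names and figures below are hypothetical, for illustration only.

```javascript
// Over-index score: how much more (or less) a Page's audience likes an
// interest compared with the general Facebook audience. A score above 1.0
// means the interest over-indexes for that audience.
function overIndex(fansWithInterest, fansTotal, generalWithInterest, generalTotal) {
  const fanShare = fansWithInterest / fansTotal;
  const generalShare = generalWithInterest / generalTotal;
  return fanShare / generalShare;
}

// Hypothetical example: 30% of a Page's 100k fans like sci-fi pages,
// versus 10% of the general audience -> an over-index of roughly 3x.
const score = overIndex(30000, 100000, 100000000, 1000000000);
```

Sorting interests by this score is what surfaces the kind of unexpected affinities (sci-fi, cycling) that raw ‘Like’ counts hide.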

To help explain how such data can be used let’s examine the Moz and general digital marketing audience. For this purpose the digital marketing audience is defined as the people who ‘Like’ Digital marketing Pages on Facebook.

Below you can see clearly that the digital marketing audience correlates nicely with the overall Facebook audience (the dark blue line is the general audience and the light blue line is digital marketing). No great surprises so far.

But where it begins to get really interesting is when we start looking deeper; at what other interests the digital marketer has.

Again we can see here the general FB audience in dark blue and how interest sets vary against the digital marketing audience.

We can clearly see the digital segment over-indexes insanely around business, gaming, sci-fi, mobile devices and, interestingly, cycling, while it is clear that celebrities, pop music and fashion are really not that exciting for us (does that suggest we’re uncool?).

Diving deeper still we can extrapolate specific topics of most interest and we end up with something that looks a little like this:

As you can see we love Mashable and Steve Jobs (no surprise there) but the Wall Street Journal, Game of Thrones and Walking Dead may not be quite so obvious. Having this kind of info at hand gives you the ability to really target paid, owned and earned activity precisely where it will have most effect.

Using the insight

All the data in the world is irrelevant though if you have no way of using it in your day-to-day activity. So how does knowing this help?

In simple terms knowing who you are writing for or advertising to means that you can tailor your ‘content’ specifically at them, improving engagement and click through.

Paid media

In paid it means that you can be MUCH smarter with your spend and it opens up a whole other world to your targeting.

Forget looking to target people that just like ‘SEO’ or ‘digital marketing’ and look instead for what other interests they have. Run campaigns that capture them in ‘other’ places where they are likely to be; where their interests over-index against the average person.

If I were lucky enough to be a Moz marketer, for instance, I would absolutely look to target some social campaigns around the sci-fi audience. We know there is a high correlation between that market and digital and you’ll also pay less per click for the privilege – reaching the ‘same’ people for less and therefore improving the potential ROI of any campaign.

By targeting sci-fi fans you get the opportunity to reach those same ‘digital marketers’ in a less competitive space and those people that are not into the subject matter are immaterial anyway as they will simply ‘ignore’ the advertising, which is not a problem when you are paying Cost per Click, of course.

Content strategy

For content too this offers incredible levels of insight. Historically I had always been one of the very worst offenders when it came to believing that my creative content ideas were the best. That came from spending a decade in print, working ‘blind’ in terms of audience insight. My ideas were the best ideas going on in my own head.

The reality though, is that with data like this available you no longer have to guess, or rely on your own twisted understanding of what your reader may like.

I ensure that the data is integrated into the initial and ongoing brainstorming process each and every time to keep ideas tied to interests we know are likely to be engaged with and consumed. You can see that ideation process below and where data fits into it:

Engagement

Growth is one thing. Creating enough engaging content consistently is entirely another, however, and while you cannot engage without an audience, without engagement you have little to no chance of monetizing or organically growing your reach.

And to do that, on Facebook at least, you must bow down to the majesty of EdgeRank.

EdgeRank

The majority of you will be more than aware of Facebook’s algorithm, but for those who aren’t, it is the ‘thing’ responsible for what you see and don’t see within your News Feed. And while Facebook says internally that it no longer uses ‘EdgeRank’ and that the algorithm governing feeds is now more complex, the three key pillars still very much exist.

I’m not going to go into the complexities of that right here. This site does a great job of that should you require more background.

The basis of it is that the more you interact with a post, or a person, the more likely you are to see more posts from them in the future. And visibility means prizes, as we know only too well from search.
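
The commonly cited description of EdgeRank scores each interaction (an ‘edge’) as the product of affinity, edge weight, and time decay, summed per post. Facebook has never published its actual weights or decay curve, so everything numeric in this sketch is purely illustrative:

```javascript
// Sketch of the commonly described EdgeRank model: each edge contributes
// affinity * weight * decay, and a post's visibility score is the sum over
// its edges. The decay function and all values here are illustrative only.
function edgeRank(edges, now) {
  return edges.reduce((sum, edge) => {
    // Recency decay: newer interactions count for more.
    const decay = 1 / (1 + (now - edge.createdAt));
    return sum + edge.affinity * edge.weight * decay;
  }, 0);
}

// Hypothetical edges: a recent comment (heavily weighted) contributes far
// more than an older like, for the same user affinity.
const postScore = edgeRank(
  [
    { affinity: 0.8, weight: 5, createdAt: 9 }, // recent comment
    { affinity: 0.8, weight: 1, createdAt: 0 }, // older like
  ],
  10
);
```

The practical takeaway is the same as in the text: content that provokes weighty, recent interactions from people who already engage with you earns more future visibility.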

So, how can you better create content that resonates, aside from utilizing the data already discussed?

Use of the following content ‘tips’ can certainly help in my experience:

Top tips

  • Images – Almost all social networks are geared up to push visual content. It makes them more interesting and it is proven that images provoke more powerful, emotional responses than text.
  • Competitions – But we’re not just talking ‘free iPad’ here. They only work well when the prize is closely tied to the insight (so a more thoughtful prize based on their ‘Likes’) and these further tips also help.
  • Exclusive offers – Being able to make your Page feel ‘exclusive’ by creating bespoke offers is good because people share for two key reasons: 1. To show off. 2. To help a friend, and you benefit from both.
  • Curation – You do not need unlimited creation resource, as good curation is very powerful too. Play the newspaper editor role and filter the ‘trash’ so your audience doesn’t have to. They’ll thank you for it.
  • Listening – Not a content ‘type’, but being plugged into what is being talked about has long been a key social topic. Your reason for doing it though is NOT to sell, but to help. Get as close to your audience as possible.
  • Timing – The beauty of social data is real time feedback. You can see what works! To test on Google+ I like Timing+, but for Facebook, the focus of this piece and our strategy, Pageplanner is a great, low cost option.
  • HIPPO – I’m passionate about this one, and not because I like big grey animals, but because HIPPO stands for ‘Highest Paid Person’s Opinion’. Or more importantly their involvement in the Page. If they are visible you win trust from your customers and the hearts and minds of your business in taking social seriously. Get them to write a weekly post or host a webinar or chat.
  • Webinars – A great way to combine a winning content type, in video, with thought leadership. Webinars allow you to put across brand values personally through social.
  • Geo-location of content – Few think about segmenting content strategy by geography, but on a Page with a lot of followers it can be a killer strategy. Refining posts based on where the reader is will do wonders for engagement. Again, tools like Page Planner can make this really simple.

Validate effectiveness

All of the above work to greater or lesser degrees in different markets. The beauty of social though is that you can very quickly learn what works for your audience thanks to real time engagement insight. Facebook’s own reporting tool gives a view on this and we have created our own version, which also allows you to add in other Pages, so you can keep an eye on competitor strategies within the same view, as you can see below:

Monetization

For business, it is the value of what ‘comes out the other end’ that matters. You can have all the fancy, soft metrics in the world, but without the ‘Ker-ching!’ moment the value is lost on most.

The great news is that the commercialization of Facebook has opened the door to all marketers and made it more acceptable to start looking at ways to monetize.

Editorial vs. Ads

And that brings us back to an age-old battle: the one between editorial and advertising/commercial ‘content’ served to a content-driven audience.

It is a battle that has been fought for decades at newspapers, magazines, TV and radio stations as media companies attempt to maximize revenues without sacrificing audience.

And we are now going to have to get used to it in digital, past simply juggling how many ad spots we have on our site. Commercialism within content goes much deeper than that.

So, how do you get it right in social? The great news is that the real time engagement data is available, as explained earlier, and getting it ‘right’ is simply a case of playing with the relationship between editorial posts and more commercial ones.

Below you can see a screenshot of a social client we work with and you can see more clearly the difference between the two post types.

On the right you have a ‘commercial’ post, linking back through to a pre-order on the website and this sits comfortably with the ‘editorial’ piece on the left.

Try adding a commercial post every fourth post to begin with and then work from there, monitoring engagement rates and fan counts for signs of drop-off. As soon as that happens, reduce the frequency and stick to that ratio.

Vertical Pages

For those without an ‘off page’ monetization opportunity there is also a sneaky little model you can try for yourself.

I’ve been playing with a small handful of Pages myself, building content strategies and investing in some fan acquisition activity to build up relevant niche audiences around such things as parenting and finance.

Once those pages are established and you have an audience of around 5,000+ people you can follow that same ad/editorial model replacing the commercial link-to-site with a simple affiliate link. That way you can begin monetizing via the affiliate route.

Measurement

Of course, no monetization project is complete without the measurement piece, and the good news here is that our ability to measure social’s impact on the bottom line has also improved drastically, in line with the channel’s own path towards commercialization.

Google Analytics and other analytic packages now help us understand clearly not just the last click, but much more of the funnel so we can truly measure social’s part in any conversion. As the channel is now being used increasingly as a discovery channel, knowing that it may have played a part at the initial interaction stage can make your social numbers more reflective of its true value.

Softer metrics

And then there are the ‘softer’ metrics that should have monetary values assigned to them. ‘Likes’, comments, shares and impressions can and should be tracked in GA easily now thanks to the _trackSocial method. This feeds more info on that engagement through to your analytics reports so you can better understand the value interaction brings.
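
In classic ga.js, a social interaction is reported by queuing a `_trackSocial` command on the async `_gaq` array. A minimal sketch (the network and action labels are your own choices, and `reportFacebookLike` is a hypothetical helper you would wire to Facebook’s edge.create event):

```javascript
// Classic Google Analytics (ga.js) async command queue. On a real page the
// GA snippet defines and drains this queue; here it is just an array.
var _gaq = _gaq || [];

// Hypothetical helper: report a Facebook Like of a given URL to GA's
// social reports. Arguments are: network, social action, optional target.
function reportFacebookLike(targetUrl) {
  _gaq.push(['_trackSocial', 'facebook', 'like', targetUrl]);
}

reportFacebookLike('http://example.com/blog/post');
```

With interactions flowing in this way, ‘Likes’ and shares appear alongside your other GA reports instead of living only inside Facebook Insights.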

Paid and organic

You can also separate out paid and organic social campaigns easily enough in the same way you would within search by making use of the Google URL builder. This allows you to create bespoke URLs for specific campaigns, allowing you to measure everything from fan acquisition campaigns through to individual content projects with ease.
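
Under the hood, the URL builder simply appends `utm_*` query parameters to your landing-page URL. A minimal sketch of the same idea (all parameter values here are illustrative):

```javascript
// Build a campaign-tagged URL the way the Google URL builder does: append
// utm_* parameters so paid and organic social traffic can be separated in
// analytics. All example values below are hypothetical.
function buildCampaignUrl(baseUrl, { source, medium, campaign }) {
  const params = new URLSearchParams({
    utm_source: source,     // e.g. 'facebook'
    utm_medium: medium,     // e.g. 'paid-social' vs. 'social'
    utm_campaign: campaign, // e.g. 'fan-acquisition'
  });
  return baseUrl + '?' + params.toString();
}

const paidUrl = buildCampaignUrl('http://example.com/landing', {
  source: 'facebook',
  medium: 'paid-social',
  campaign: 'fan-acquisition',
});
```

Using a distinct `utm_medium` for paid posts is what lets you split fan-acquisition ads from organic content projects in your reports.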

Takeaways

It's clear, then, that the combination of changes in how audiences discover new things and social's increasing maturity as a channel that 'accepts' commercial content means a tipping point is close.

Combine that with Google's current obsession with change, and the channel is becoming a serious option for those looking to vary their traffic sources. And with all the tools now in place and a mass of data available to inform our decision-making, perhaps it is time to invest?

Top 5 Takeaways

  1. Find a way of digging into your audience’s social data and leverage that information to understand them based on their interests. The more you know about them the more effective your marketing will be.
  2. Ensure that a thoroughly thought-out content strategy sits at the heart of those marketing efforts and powers your social channels. That strategy should include ideas created from the insight above.
  3. Test content types regularly based on engagement rates to refine your strategy. That way you are not guessing what your audience wants to see.
  4. Set up a thorough measurement strategy from the very start. That way you can truly understand the value that social is bringing to your business at every point within the buying funnel.
  5. And above all: Take social seriously. It’s growing fast and with access improving it really can become a primary traffic and revenue source for your business!
And if you want to refer back to anything in this post we’ve created this eBook on the topic for you. You can download it for free by clicking on the link.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!


Using Google Keyword Planner (and Other Tools Instead) for Keyword Volume

Posted by Ruth_Burr

While far from a perfect tool (seriously skewed toward “commercial intent,” not always inclusive of trend data, difficult to drill down into local terms), the Google Keyword Tool was one of the best keyword research tools available. The keyword volume numbers were more trustworthy than other keyword tools, simply because they came right from the source—who better to know what kind of search volume keywords get than Google itself?

With Google’s recent announcement that their free Keyword Tool has gone away, replaced with their integrated PPC tool the Keyword Planner, a cry has gone up from SEOs: “What do we do now?”

Google Keyword Planner pros and cons

With the advent of the Keyword Planner, Google is making a strong statement that they’ll continue to focus on supporting PPC advertisers rather than organic search marketers. To that end, the Keyword Planner is heavily focused on PPC ads; you even have to sign up for an AdWords account to use it (although you don’t have to enter any payment information, and would only end up paying for the tool if you created and launched an ad). That said, the tool definitely retains some SEO utility.

Pros of Google Keyword Planner:

  • Users can now view keyword volume on a hyper-local basis; I was able to view search volume not only for the Oklahoma City area, but even drill down into Norman, the smaller OKC-area town where I live. This is great for businesses doing local and hyper-local SEO to get a better idea of the volume and competition in their geographic area.

  • The tool divides keywords up into suggested ad groups; this is designed to be a PPC-focused feature, but does provide some insight into which keywords Google deems to be semantically/topically related.
  • The “multiply keyword lists” feature allows you to search on combinations of words from two different lists. This allows you to combine your terms with modifiers such as location or color and compare search volume without having to concatenate in Excel.
  • Users can filter out keywords below a certain search volume, so you don’t even have to look at them.
  • Since you have to be logged in to use the tool, users aren’t limited to 100 words like we were with the logged-out version of the old tool.
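
The "multiply keyword lists" feature above is essentially a cross product of two lists, and the idea is easy to reproduce outside the tool; the example terms and modifiers below are made up:

```javascript
// Cross-multiply a core term list with a modifier list, as the
// Keyword Planner's "multiply keyword lists" feature does.
function multiplyKeywords(terms, modifiers) {
  return terms.flatMap(function (term) {
    return modifiers.map(function (mod) {
      return mod + ' ' + term;
    });
  });
}

var combos = multiplyKeywords(['running shoes', 'trail shoes'],
                              ['blue', 'red', 'cheap']);
// combos holds all 6 combinations, e.g. 'blue running shoes'
```

Feed the resulting list back into whichever volume tool you're using, rather than concatenating the combinations by hand in Excel.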

Cons of Google Keyword Planner:

  • The ability to select Broad, Phrase or Exact match has been removed—only Exact match data is now available.
  • “Average monthly searches” is calculated over 12 months, meaning the Keyword Planner isn’t a good place to research trending topics. Use Google Trends for that.
  • The option to only search for words closely related to your term has been removed. However, Google has said they will probably add it back in.
  • Device targeting is gone—no more segmenting volume for desktop vs. mobile searches. This means volume numbers are, in general, higher for the Keyword Planner than they were for Keyword Tool since those two buckets have been combined.
  • “Local” vs “Global” search volume is no longer automatically displayed. Instead, Global (which Google is now calling “all locations”) is the default and users must drill down into specific locales for local search volume. To me, the added functionality around location targeting makes this a mixed blessing, but users will probably miss the easy comparison of seeing Local and Global side-by-side.

Alternative tools for keyword volume

Of course, for some of us, this latest example of data hoarding on Google’s part is the last straw. Here are some other places you can look for keyword volume. Since the Google Keyword Tool was free, I kept these options to tools that are free or have a free option (which is why I didn’t include the Moz Keyword Difficulty and SERP Analysis tool, even though I love it, since it’s only available to paid Moz subscribers).

Google Webmaster Tools impression data

Anyone with a Google Webmaster Tools verified site can view how often their site has shown up for certain keywords.

Pros:

  • This data still comes from Google itself.

Cons:

  • Because it only shows how many impressions your site got from a keyword, GWT Impression data can’t be used to research terms you’re not already ranking for.
  • There are disputes about the accuracy of the data—the consensus among SEO pros is that it’s less reliable than the Keyword Tool data was.

Bing Keyword Tool

The Bing-provided alternative to the Google Keyword Tool goes a long way toward making up for that tool's departure. It's what we use in our Keyword Difficulty and SERP Analysis tool.

Pros:

  • Users can narrow searches by date range, to more accurately track recent search data.
  • Recent keyword volume trend data displays alongside other metrics.
  • A “strict” filter acts like the old “closely related” filter in Google’s Keyword Tool.
  • The tool is in Beta, so it’s likely we’ll continue to see more features and improvements as the Bing team keeps working on it.

Cons:

  • Because this data comes from Bing, which has fewer users, all search volume numbers will skew lower than they would in Google.
  • Geographic drilldown is only available at the country level.
  • Users must be signed in to a Bing Webmaster Tools account with a verified site in order to use the tool (but you should be checking Bing Webmaster Tools anyway, it’s free and there’s a lot of good stuff in there).

WordTracker

Good old WordTracker. This was the first tool I ever used for keyword research and it’s still plugging along.

Pros:

  • Their proprietary Keyword Effectiveness Index gives a gauge of how competitive each keyword is for the amount of search volume it generates.
  • WordTracker partners with SEMRush to provide paid users with paid search data as well.
  • Users can filter results by match type: “keywords in any order”, “exact keyword inside a search term” and “exact keyword only” as well as “related terms.”

Cons:

  • The full tool requires a paid subscription (starting at $69/month) to use—however, there’s also a free version that offers less functionality: Global searches only, no SEMRush data, and only 50 results per search.
  • Users must create an account with a valid email address to use the free tool.
  • Depending on which version of the tool you’re using, WordTracker data comes from one of two sources: a “major search engine advertising network,” or from metacrawlers such as DogPile, which search multiple search engines at one time. Since only a small portion of searchers are using metacrawlers, the sample of searches may be skewed based on the demographic of people who use them.

SEMRush

Full disclosure: I blog occasionally for SEMRush and am part of their customer feedback team, which means they have generously provided me with free access to their PRO tool.

Pros:

  • The free SEMRush keyword research tool provides PPC and SEO information in one view, which can be useful for marketers running hybrid PPC/SEO programs.
  • SEMRush surfaces both the root domain and the specific URL that rank for your keyword term in the first 20 slots.
  • Related and phrase match terms, along with volume, are also served up in an individual keyword’s report.
  • Keyword volume data comes from the Google Keyword API, making it one of the more trustworthy sources of keyword volume data.

Cons:

  • Users must create a login with a valid email address to use the tool—but it’s free.
  • SERP information doesn’t take into account local, video, carousel or other non-text result types.
  • Geographic drilldown is only available at the country level.
  • Despite the related and phrase match keyword info, this tool is more effective at researching individual keywords, once you already have them, than it is at generating lots of new keyword ideas—so keep that in mind.

Don’t Hit Enter

I’d be remiss if I didn’t include one of my favorite keyword brainstorming tools, first introduced by Wil Reynolds at MozCon last year: Just start typing one of your core terms into Google, don’t hit enter, and see which keywords are suggested. Then “start the next word” by typing different letters to get further suggestions.
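
If you want to work the alphabet systematically rather than by hand, the seed queries for this technique are easy to generate; fetching the actual suggestions would require Google's unofficial suggest endpoint, so this sketch only builds the list of prefixes to type:

```javascript
// Build 'core term' + each letter a-z: the seed queries for the
// "don't hit enter" technique.
function suggestSeeds(term) {
  var seeds = [];
  for (var i = 0; i < 26; i++) {
    seeds.push(term + ' ' + String.fromCharCode(97 + i)); // 97 = 'a'
  }
  return seeds;
}

var seeds = suggestSeeds('keyword research');
// 26 seeds: 'keyword research a' through 'keyword research z'
```

Type each seed into the Google search box (without hitting enter) and note which completions appear; the suggestions reflect what real searchers are typing.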

Pros:

  • Discover the results that Google is most likely to drive users to (since many users will click a suggestion close to their original query if one comes up).

Cons:

  • No “related terms” data—everything that comes up will start with that first word.
  • No keyword volume data. You’ll have to use one of the other tools listed above for that!
  • Your suggestions may be skewed based on your location and search history.

The Future of Keyword Volume

I don't really think any one tool is going to cut it in this day and age—I'd always recommend using more than one tool for something like keyword volume research, especially since the numbers can vary so much depending on where the data comes from. The best (safest) way to use keyword data from any tool, including Google, is at a directional level: If Keyword A has 10 times as many searches as Keyword B in Bing, and 5 times as many searches as Keyword B in WordTracker, Keyword A will most likely also be more popular in Google. This kind of directional approach is much more likely to be successful than treating the numbers from any one tool as gospel.
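
That directional logic is simple to mechanize: check which keyword wins within each tool, and only trust a verdict that all the tools agree on. The volume numbers below are invented for illustration:

```javascript
// Compare two keywords across several tools and report which one is more
// popular, but only if every tool agrees (a directional signal, not gospel).
function directionalWinner(volumesByTool) {
  var verdicts = Object.keys(volumesByTool).map(function (tool) {
    var v = volumesByTool[tool];
    return v.a > v.b ? 'a' : 'b';
  });
  var allAgree = verdicts.every(function (w) { return w === verdicts[0]; });
  return allAgree ? verdicts[0] : null; // null = tools disagree, trust neither
}

// Hypothetical volumes for Keyword A vs. Keyword B in two tools.
var winner = directionalWinner({
  bing: { a: 1000, b: 100 },        // A ~10x B
  wordtracker: { a: 500, b: 100 }   // A ~5x B
});
// winner is 'a': both tools point the same way
```

When the tools disagree, the honest answer is "we don't know," which is exactly when you should avoid betting a strategy on one tool's numbers.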

There are a few other things to consider in your keyword volume research. For one, increased personalization in search results means that even if you rank very well for a keyword most of the time, you may not show up every time that term is searched; there’s no way for keyword volume tools to predict how often you’ll be personalized in or out of people’s SERPs. Also, keep in mind that certain terms may be important to target even if they’re lower in volume, whether because they’re important to your brand or because they convert so highly that the lower traffic numbers don’t matter.

I’ll probably be using Google’s Keyword Planner in conjunction with one or two of these other tools, plus Moz tools, for my keyword research going forward. How about you? Any awesome free tools I’ve missed? Feel free to let me know in the comments!



Don’t Take Your Brand Too Seriously

Posted by Rob Toledo

Everyone likes humor; we all know this.

But humor can seem risky when it comes to branding—it has certainly backfired on numerous occasions when a company takes things perhaps a bit too far (or sometimes when it is just misunderstood).

On the other hand, playing it too safe is also a great way to remain somewhere in the middle. Almost everyone likes the middle. Nobody loses their job in the middle. Customers come and go at a steady rate in the middle. Nobody boycotts the middle.

To quote the greatest show of all time, “Ain’t nobody got nothing to say about a 40-degree day.”

From HBO: source

A lot of brands talk about wanting to take risks. They might even discuss some radical ideas in the safety of their own conference rooms. But most of the time we end up with “safe” when it’s time to execute on a strategy.

Does any of this sound familiar?

“Let’s tweet more!”

“How about we make a hilarious infographic!”

“Let’s put one of those meme things on our blog!”

“Our competitor just did that one awesome thing, let’s do the exact same thing!”

Don’t DO something, BE something

One of my favorite books of all time, Hey Whipple, Squeeze This, discusses the topic of “being something” as opposed to just “doing something.”

“When a client says ‘we want to seem cooler’ the answer isn’t an ad that says ‘we’re cool’—the answer is to BE cool.”

It’s important to make any attempt at a strategy—especially when it involves humor—a full effort where you’re not simply doing something for the occasional chuckle. You are going to have to fight a much more difficult (but fully worthwhile) battle of changing the overall perception of your brand.

I asked Joel Klettke, resident internet funnyman and owner of Business Casual Copywriting, for his thoughts on the topic:

Do you think every brand should partake in a strategy involving humor?

I think every brand is capable, but not every brand should try. I think the downfall of humor in advertising or online is when a business starts becoming a sideshow and the brand is lost in the mix. A lot of brands get too focused on laughs: Entertaining an audience is great, but you’re still trying to sell things.
I also think that there are some products or services where humor needs to be considered extremely carefully – things like child welfare, etc.

Can you list some examples of brands that overstepped the boundaries on using humor?

Yup, the Hyundai suicide commercials were terrible attempts at humor. Summer’s Eve had a series of commercials [NSFW-ish] that were a terrible choice.

What are some of your favorite examples of brands using humor as a strategy well?

OK great, humor is good; but where do we draw the line?

Well, that’s a tricky one as the line gets a bit fuzzy depending on a lot of variables. Every brand is going to have varying persona research, and you should know your customers better than anyone, so you’ll likely have to find that line on your own. Good customer research will be the key here.

For an extreme example, I think this KMart “Ship My Pants” campaign is the stuff of legends. It’s hilarious and teeters right on the line of offensive, all while remaining relevant to the brand (free shipping at KMart). They wanted to grow their online presence and drive traffic to their site with this campaign, which this ad certainly did as it got massive amounts of attention. It’s been hailed widely as a success, earning 19.5 million YouTube views, but they did earn themselves a small boycott from some folks who were offended, which has mostly fizzled.

This raises an important point: Take as much risk as you want, but try not to offend people in some key areas. I can ignore something I find slightly annoying pretty easily, but if it strikes a chord that offends one of my core principles, that’s when I’m going to get on my social media soapbox and start ranting.

Some things that are guaranteed to offend:

  • Racism
  • Sexism
  • Stereotyping
  • Religious focus
  • Political focus
  • Being a bully (don’t pick on the little guy, even in retaliation)
  • Making the wrong assumptions (research, research, research!)

So, how do you get started?

Make sure your humor is somehow relevant to your brand

“Oh, I see. All I have to do is show something interesting and funny for the first 25 seconds of the ad and then cut to the product?” – Luke Sullivan

While running a campaign where you just tell jokes and make funny videos might get a lot of attention, at the end of the day, making “cool stuff” is not a content strategy.

Find the ridiculous parts of your brand and “go there”

Vintage VW ad: source

“You know those really funny ideas you get that make you laugh and say, ‘Wouldn’t it be great if we could really do that?’ Those are often the very best ideas, and it is only your superego/parent/internalized client saying you can’t do it. You’ve stumbled on a mischievous idea. Something you shouldn’t do. That’s a good sign you’re onto something you SHOULD do. Revisit it.” – Luke Sullivan

Here are some of my most recent favorite examples of brands poking fun at themselves:

Bigstock Photo

Making fun of your core product can be risky. VW used this strategy during its early advertising efforts and it paid massive dividends. Bigstock recently took that approach and fully embraced the concept of “awkward” in their photo collection.

It’s no secret that there are plenty of these awkward stock photos out there. But were you aware of the assortment of awkward “steak” photos available? Puns might be considered the lowest form of humor on the joke food chain, but be honest: You like them, no matter how deeply buried that linguistic love might be.

Air New Zealand

Taking the bland and boring parts of your business and attempting to make them exciting takes quite a bit of creativity, but it’s a powerful angle.

Nobody has paid attention to an airline safety presentation since 1974. Air New Zealand aimed to change that (and bring themselves plenty of brand recognition in the process) by making a mockery of the otherwise mind-numbing instructional sessions. Featuring Bear Grylls, The Lord of the Rings, and naked employees, these videos quickly grabbed the attention of all those aboard the aircraft as well as everyone online.

The Seattle Police Department

Criminal justice is hardly ever intentionally humorous, but the Seattle Police Department made it part of their rebranding strategy. The department has been in hot water for the past several years with both local citizens and the federal government, so they brought in local journalist Jonah Spangenthal-Lee to attempt a rebrand. To say he has been knocking it out of the park would be a major understatement.

Some recent highlights include distributing Doritos to Hempfest attendees, releasing the funniest blog post about marijuana legalization of all time (seriously, read that one), posting pictures of their mounted patrol horses at the dentist, and just generally being ridiculously responsive on social media, even to trolls.

Source: Seattle Police Department Twitter page

What happened here? The city population started to view the department differently. Public perception quickly shifted in a positive direction, and before our very eyes, our police department had personality. There were real people who worked behind the badges. It was a huge risk to take on a humorous strategy—especially as a government agency—but it has quickly earned positive national attention with very little push-back.

Funny isn’t everything; it has to be based on something smart

“Should you do something humorous, don’t mistake a good joke for a good idea. Funny is fine. But set out to be interesting first. You must have an idea [of where to go next].” – Luke Sullivan

I’m repeating myself a bit here, but it’s always important to make sure that this humor is based on a solid overall strategy–that it is well researched and planned. Always think: Who is your ideal customer, and what do they find funny?

Lastly, keep in mind the Internet has a short memory

I know a lot of people worry about taking risks in fear of potential backlash, but ask yourself: Can you truly name more than a handful of brands that got a bunch of bad press in 2012 for a risky campaign? I understand that it can seem as if the world is ending when your brand takes a few days of heat for having taken a risk. But truthfully, in this day and age, unless you say something completely tasteless, I can assure you that a slight misstep here and there will come and go faster than you can brainstorm your next ideas. Just apologize and move on. Most importantly, quit being so afraid of taking chances in your next strategy.

What about you? Got any favorite creative campaigns that you felt have really worked? How about anything your own brand is doing?

Let me know in the comments below, or feel free to reach out on Twitter!
@stentontoledo



Weighting the Clusters of Ranking Factors in Google’s Algorithm – Whiteboard Friday

Posted by randfish

One thing we collect for our biennial ranking factors survey is the opinions of a group of SEO experts (128 of them this year!) about the relative weights of the categories of ranking factors. In other words, how important each of those categories is for SEO relative to the others.

In today’s Whiteboard Friday, Rand explains some key takeaways from the results of that particular survey question. In addition, the pie chart below shows what the categories are and just where each of them ended up.

Whiteboard Friday – Weighting the Clusters of Ranking Factors in Google’s Algorithm

For reference, here’s a still of this week’s whiteboard and a fancy version of the chart from this week’s video!

Weighting of Thematic Clusters of Ranking Factors in Google


Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to talk a little bit about the ranking factors survey that we did this year and specifically some of the results from that.

One of my favorite questions is one that we ask in our ranking factors survey, which happens every two years and goes out to a number of SEO experts. This year, 128 SEO experts responded, folks who were hand-chosen by us as being very, very knowledgeable in the field. We asked them: based on these thematic clusters of ranking elements, things like domain-level link authority versus page-level keyword-agnostic features, weight them for us. You know, give a percentage that you would assign if you were giving an overall assessment of the importance of this factor in Google’s ranking algorithm.

So this is opinion data. This is not fact. This is not actually what Google’s using. This is merely the aggregated collective opinions of a lot of smart people who study this field pretty well. This week what I want to do is run through what these elements are, the scores that people gave them, and then some takeaways, and I even have an exercise for you all at home or at the office as the case may be.

So interestingly, the largest portion that was given credit by the SEOs who answered this question was domain-level link authority. This is sort of the classic thing we think of in the Moz scoring system as domain authority, DA. They said 20.94%, which is fairly substantive. It was the largest one.

Just underneath that, page-level link features, meaning external links, how many, how high-quality, where are they coming from, those kinds of things for ranking a specific page.

Then they went to page-level keyword and content features. This isn’t just raw keyword usage, keyword in the title tag, how many times you repeat on the page; this is also content features like if they think Google is using topic modeling algorithms or semantic analysis models, those types of things. That would also fit into here. That was given about 15%, 14.94%.

At 9.8%, then they all kind of get pretty small. Everything between here and here is between 5% and 10%. A bunch of features in there, like page-level keyword agnostic features. So this might be like how much content is in there, to what degree Google might be analyzing the quality of the content, are there images on the page, stuff like this. “How fast does the page load” could go in there.

Domain level brand features. Does this domain or the brand name associated with the website get mentioned a lot on the Internet? Does the domain itself get, for example, mentioned around the Web, lots of people writing about it and saying, “Moz.com, blah, blah, blah.”

User usage and traffic or query data. This one’s particularly fascinating, got an 8.06%, which is smaller but still sizeable. The interesting thing about this is I think this is something that’s been on the rise. In years past, it had always been under 5%. So it’s growing. This is things like: Are there lots of people visiting your website? Are people searching for your domain name, for your pages, for your brand name? How are people using the site? Do you have a high bounce rate or a lot of engagement on the site? All that kind of stuff.

Social metrics, Twitter, Facebook, Google+, etc., domain-level keyword usage, meaning things like if I’m trying to rank for blue shoes, do I have blue shoes in the domain name, like blueshoes.com or blue-shoes.com. This is one that’s been declining.

Then domain-level keyword agnostic features. This would be things like:
What’s the length of the domain name registration, or how long is the domain name? What’s the domain name extension? Other features like that, that aren’t related to the keywords, but are related to the domain.

So, from this picture I think there’s really some interesting takeaways, and I wanted to walk through a few of those that I’ve seen. Hopefully, it’s actually helpful to understand the thematic clusters themselves.

Number one: What we’re seeing year after year after year is complexity increasing. This picture has never gotten simpler any two years in a row that we’ve done this study. It’s never that one factor, you know, used to be smaller and now it’s kind of dominant and it’s just one thing. Years ago, I bet if we were to run this survey in 2001, it’d be like PageRank is Pac-Man, and everything else is the little tiny chunk of Pac-Man’s mouth.

Number two: Links are still a big deal. Look here, right? I mean what we’re essentially seeing in this portion here is domain-level link authority and page-level link features, all of them. You could sort of think of this as maybe page authority being a proxy for this and domain authority being a proxy for this. That’s still a good 40% of how SEOs are perceiving Google’s algorithm. So links being a big important portion, but not the overwhelming portion.

It has almost always been the case in years past that the link features, when combined, were 50%. So we’re seeing that they’re a big deal both in the page and domain level, just not as big or as overwhelming as they used to be, and I think this is reflected in people’s attitudes towards link acquisition, which is, “Hey, that’s still a really important practice. That’s still something I’m looking forward to and trying to accomplish.”

Number three: Brand-related and brand-driven metrics are on the rise. Take a look. Domain level brand features and user usage or traffic query data, this is comprising a percentage that actually in sum exceeds page-level keyword content and features. This is really kind of the branding world happening right here. So if you’re not building a brand on the Web, that could be seriously hurting your SEO, maybe to the same degree that not doing on-page optimization is. Actually, that would be a conclusion that I personally would agree with as well.

Number four: Social is still perceived to have a minor impact despite some metrics to the contrary. So, social you can see up here at 7.24%, which is reasonably small. It’s the third-smallest factor that was on there. And yet, when we look at how do social metrics correlate with things that rank highly versus things that rank poorly, we’re seeing very high numbers, numbers that in many cases exceed or equal the link metrics that we look at. So here at Moz we kind of look at those and we go, “Well, obviously correlation does not imply causation.” It could be the case that there are other things Google’s measuring that just happen to perform well and happen to correlate quite nicely with social metrics, like +1s and shares and tweets and those kinds of things.

But certainly it’s surprising to us to see such a high correlation and such a low perception. My guess is, if I had to take a guess, what I’d say is that SEOs have a very hard time connecting these directly. Essentially, you go and you see a page that’s ranking number nine, and you think, “Hey, let me try to get a bunch of tweets and shares and +1s, and I’m going to acquire those in some fashion. Still ranking number nine. I don’t think social does all that much.” Versus, you go out and get links, and you can see the page kind of rising in the search results. You get good links from good places, from authoritative sites and many of them. Boom, boom, boom, boom. “I look like I’m rising; links are it.”

I think what might be being missed there is that the content of the page, the quality of the page and the quality of the domain and the brand and the amplification that it can achieve from social is an integral part. I don’t know exactly how Google’s measuring that, and I’m not going to speculate on what they are or aren’t doing. The only thing they’ve told us specifically is that we are not exclusively using just +1s precisely to increase rankings unless it’s personalized results, in which case maybe we are. To me, that kind of hyper specificity says there’s a bigger secret story hiding behind the more complex things that they are not saying they aren’t doing.

Number five, the last one: Keyword-based domain names, which I know have been kind of a darling of the SEO world (or historically a darling of the SEO world) and particularly of the affiliate marketing worlds for a long time, continue to shrink. You can see that in the correlation data. You can see it in the performance data. You can see it in the MozCast data set, which monitors sort of what appears in Google and doesn’t.

Our experience reinforces that. So remember Moz switched from the domain name SEOmoz, which had the keyword SEO right in there, to the Moz domain name not very long ago, and we did see kind of a rankings dive for a little while. Now almost all of those numbers are right back up where they were. So I think that’s (a) a successful domain shift, and I give huge credit to folks like Ruth Burr and Cyrus Shepard who worked so hard and so long on making that happen, Casey Henry too. But I think there’s also a story to be told there that having SEO in the domain name might not have been the source of as many rankings for SEO-related terms as we may have perceived it to be. I think that’s fascinating as well.

My recommendation, my suggestion to all of you, if you get the chance, try this. Go grab your SEO team or your SEO colleagues, buddies, friends in the field. Sit down in a room with a whiteboard or with some pen and paper. Don’t take a laptop in. Don’t use your phones. List out these features and go do this yourself. Go try making these percentages for what you think the algorithm actually looks like, what your team thinks the algorithm looks like, and then compare. What is it that’s the difference between kind of the aggregate of these numbers and the perception that you have personally or you have as a team?

I think that can be a wonderful exercise. It can really open up a great dialogue about why these things are happening. I think it’s some fun homework if you get a chance over the next week.

Until then, see you next week. Take care.

Video transcription by Speechpad.com

