New Moz-BuiltWith Study Examines Big Website Tech and Google Rankings

Posted by Cyrus-Shepard

BuiltWith knows about your website.

Go ahead. Try it out.

BuiltWith also knows about your competitors’ websites. They’ve cataloged over 5,000 different website technologies on over 190 million sites. Want to know how many sites use your competitor’s analytics software? Or who accepts Bitcoin? Or how many sites run WordPress?

Like BuiltWith, Moz also has a lot of data. Every two years, we run a Search Engine Ranking Factors study where we examine over 180,000 websites in order to better understand how they rank in Google’s search results.

We thought, “Wouldn’t it be fun to combine the two data sets?”

That’s exactly what our data science team, led by Dr. Matt Peters, did. We wanted to find out what technologies websites were using, and also see if those technologies correlated with Google rankings.

How we conducted the study

BuiltWith supplied Moz with tech info on 180,000 domains that were previously analyzed for the Search Engine Ranking Factors study. Dr. Peters then calculated the correlations for over 50 website technologies.

The ranking data for the domains was gathered last summer—you can read more about it here—and the BuiltWith data is updated once per quarter. We made the assumption that basic web technologies, like hosting platforms and web servers, don’t change often.

It’s very important to note that the website technologies we studied are not believed to be actual ranking factors in Google’s algorithm. There are huge causation/correlation issues at hand. Google likely doesn’t care much about what framework or content management system you use, but because SEOs often believe one technology to be superior to another, we thought it best to take a look.
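To make the methodology a little more concrete, here’s a minimal sketch of the kind of calculation involved. This is an illustration only, not Dr. Peters’ actual pipeline: each technology becomes a yes/no flag per domain, which is then rank-correlated against a ranking metric. The input file and column names below are placeholders I made up.

```python
# Minimal sketch of a technology-vs-rankings correlation check.
# NOT the actual Moz/BuiltWith pipeline; "domains.csv" and its columns
# are hypothetical placeholders.
import pandas as pd
from scipy.stats import spearmanr

# One row per domain: a ranking metric plus binary "uses this tech?" flags.
df = pd.read_csv("domains.csv")  # columns: domain, ranking_score, uses_wordpress, uses_cloudflare, ...

tech_columns = [c for c in df.columns if c.startswith("uses_")]

results = {}
for tech in tech_columns:
    # Spearman rank correlation between the binary flag and the ranking metric.
    rho, p_value = spearmanr(df[tech], df["ranking_score"])
    results[tech] = (rho, p_value)

for tech, (rho, p_value) in sorted(results.items(), key=lambda kv: kv[1][0], reverse=True):
    print(f"{tech:<20} rho={rho:+.4f}  p={p_value:.3g}")
```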

Web hosting platform performance

One of the cool things about BuiltWith is not only can you see what technology a website uses, but you can view trends across the entire Internet.

One of the most important questions a webmaster has to answer is who to use as a hosting provider. Here’s BuiltWith’s breakdown of the hosting providers for the top 1,000,000 websites:

Holy GoDaddy! That’s a testament to the power of marketing.

Webmasters often credit good hosting as a key to their success. We wanted to find out if certain web hosts were correlated with higher Google rankings.

Interestingly, the data showed very little correlation between web hosting providers and higher rankings. The results, in fact, were close enough to zero to be considered null.

Web Hosting Correlation
Rackspace 0.024958629
Amazon 0.043836395
Softlayer -0.02036524
GoDaddy -0.045295217
Liquid Web -0.000872457
CloudFlare Hosting -0.036254475

Statistically, Dr. Peters assures me, these correlations are so small they don’t carry much weight.

The lesson here is that web hosting, at least among the major providers, does not appear to be correlated with higher or lower rankings one way or the other. To put this another way, simply hosting your site on GoDaddy should neither help nor hurt you in the larger SEO scheme of things.

That said, there are a lot of bad hosts out there as well. Uptime, cost, customer service and other factors are all important considerations.

CMS battle – WordPress vs. Joomla vs. Drupal

Looking at the most popular content management systems for the top million websites, it’s easy to spot the absolute dominance of WordPress.

Nearly a quarter of the top million sites run WordPress.

You may be surprised to see that Tumblr accounts for only 6,400 sites in the top million. If you expand the data to look at all known sites in BuiltWith’s index, the number grows to over 900,000. That’s still a small fraction of the 158 million blogs Tumblr claims, compared to only 73 million claimed by WordPress.

This seems to be a matter of quality over quantity. Tumblr has many more blogs, but it appears fewer of them gain significant traffic or visibility.

Does any of this correlate to Google rankings? We sampled five of the most popular CMSs and again found very little correlation.

CMS Correlation
WordPress -0.009457206
Drupal 0.019447922
Joomla! 0.032998891
vBulletin -0.024481161
ExpressionEngine 0.027008018

Again, these numbers are statistically insignificant. It would appear that the content management system you use is not nearly as important as how you use it.

While configuring these systems for SEO varies in difficulty, plugins and best practices can be applied to all.
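If you’re curious how this kind of technology detection works at its simplest, here’s a naive sketch that guesses a CMS from a couple of well-known HTML footprints (the generator meta tag, WordPress’s wp-content paths). BuiltWith’s detection across 5,000+ technologies is obviously far more thorough; this is just a toy illustration.

```python
# Naive CMS fingerprinting -- a toy illustration, nothing like BuiltWith's
# detection. Signatures are common footprints, but neither exhaustive nor foolproof.
import re
import requests

SIGNATURES = {
    "WordPress": [r"/wp-content/", r'name="generator" content="WordPress'],
    "Drupal": [r'name="generator" content="Drupal', r"/sites/default/files/"],
    "Joomla!": [r'name="generator" content="Joomla'],
}

def guess_cms(url):
    """Return the first CMS whose footprint appears in the page HTML."""
    html = requests.get(url, timeout=10).text
    for cms, patterns in SIGNATURES.items():
        if any(re.search(p, html, re.IGNORECASE) for p in patterns):
            return cms
    return "unknown"

print(guess_cms("https://example.com"))
```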

Popular social widgets – Twitter vs. Facebook

To be honest, the following chart surprised me. I’m a huge advocate of Google+, but I never thought more websites would display the Google +1 button than Twitter’s Tweet button.

That’s not to say people actually hit the Google+ button as much. With folks tweeting over 58 million tweets per day, it’s fair to guess that far more people are hitting relatively few Twitter buttons, although Google+ may be catching up.

Sadly, our correlation data on social widgets is highly suspect. That’s because the BuiltWith data is aggregated at the domain level, and social widgets are a page-level feature.

Even though we found a very slight positive correlation between social share widgets and higher rankings, we can’t conclusively say there is a relationship.

More important is to realize the significant correlations that exist between Google rankings and actual social shares. While we don’t know how or even if Google uses social metrics in its algorithm (Matt Cutts specifically says they don’t use +1s) we do know that social shares are significantly associated with higher rankings.

Again, correlation is not causation, but it makes sense that adding social share widgets to your best content can encourage sharing, which in turn helps with increased visibility, mentions, and links, all of which can lead to higher search engine rankings.

Ecommerce technology – show us the platform

Mirror, mirror on the wall, who is the biggest ecommerce platform of them all?

Magento wins this one, but the distribution is more even than other technologies we’ve looked at.

When we looked at the correlation data, again we found very little relationship between the ecommerce platform a website used and how it performed in Google search results.

Here’s how each ecommerce platform performed in our study.

Ecommerce Correlation
Magento -0.005569493
Yahoo Store -0.008279856
Volusion -0.016793737
Miva Merchant -0.027214854
osCommerce -0.012115017
WooCommerce -0.033716129
BigCommerce SSL -0.044259375
Magento Enterprise 0.001235127
VirtueMart -0.049429445
Demandware 0.021544097

Although huge differences exist among ecommerce platforms, and some are easier to configure for SEO than others, it would appear that the platform you choose is not a huge factor in your eventual search performance.

Content delivery networks – fast, fast, faster

One of the major pushes marketers have made in the past 12 months has been to improve page speed and loading times. The touted benefits include improved customer satisfaction, higher conversion rates, and possible SEO gains.

The race to improve page speed has led to huge adoption of content delivery networks.

In our Ranking Factors Survey, the response time of a web page showed a -0.10 correlation with rankings. While this can’t be considered a significant correlation, it offered a hint that faster pages may perform better in search results—a result we’ve heard anecdotally, at least on the outliers of webpage speed performance.
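If you want a rough, repeatable look at your own response times (nothing like the crawler-level measurement behind that correlation), a few timed requests will at least surface obvious outliers. A minimal sketch, with placeholder URLs:

```python
# Rough response-time check -- a sketch, not the measurement used in the
# Ranking Factors study. The URLs below are placeholders.
import statistics
import requests

URLS = [
    "https://example.com/",
    "https://example.com/blog/",
]

for url in URLS:
    timings = []
    for _ in range(3):  # a few samples to smooth out network noise
        response = requests.get(url, timeout=15)
        # .elapsed measures time until the response headers arrive,
        # not a full page render -- a rough proxy for server response time.
        timings.append(response.elapsed.total_seconds())
    print(f"{url}  median={statistics.median(timings):.3f}s  max={max(timings):.3f}s")
```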

We might expect websites using CDNs to gain the upper hand in ranking, but the evidence doesn’t yet support this theory. Again, these values are basically null.

CDN Correlation
AJAX Libraries API 0.031412968
Akamai 0.046785574
GStatic Google Static Content 0.017903898
Facebook CDN 0.0005199
CloudFront 0.046000385
CloudFlare -0.036867599

While using a CDN is an important step in speeding up your site, it is only one of many optimizations you should make when improving webpage performance.

SSL certificates, web servers, and frameworks: Do they stack up?

We ran ranking correlations on several more data points that BuiltWith supplied us. We wanted to find out whether things like your website framework (PHP, ASP.NET), your web server (Apache, IIS), or the presence of an SSL certificate were correlated with higher or lower rankings.

While we found a few outliers around Varnish software and Symantec VeriSign SSL certificates, overall the data suggests no strong relationships between these technologies and Google rankings.

Framework Correlation
PHP 0.032731241
ASP.NET 0.042271235
Shockwave Flash Embed 0.046545556
Adobe Dreamweaver 0.007224319
Frontpage Extensions -0.02056009

SSL Certificate Correlation
GoDaddy SSL 0.006470096
GeoTrust SSL -0.007319401
Comodo SSL -0.003843119
RapidSSL -0.00941283
Symantec VeriSign 0.089825587

Web Server Correlation
Apache 0.029671122
IIS 0.040990108
nginx 0.069745949
Varnish 0.085090249
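For what it’s worth, some of these server and CDN technologies can be spotted from the outside by looking at HTTP response headers. The sketch below checks a few common (but by no means guaranteed) signatures; real detection, as BuiltWith does it, relies on many more signals.

```python
# Sketch: infer web server / CDN hints from HTTP response headers.
# Signatures vary by provider and configuration -- treat these as hints only.
import requests

def tech_hints(url):
    headers = requests.get(url, timeout=10).headers  # case-insensitive dict
    hints = []

    server = headers.get("Server")
    if server:
        hints.append(f"Server header: {server}")  # e.g., Apache, nginx, Microsoft-IIS

    if "CF-RAY" in headers:
        hints.append("Cloudflare (CF-RAY header)")
    if "X-Amz-Cf-Id" in headers:
        hints.append("Amazon CloudFront (X-Amz-Cf-Id header)")
    if "X-Varnish" in headers:
        hints.append("Varnish cache (X-Varnish header)")

    return hints or ["no obvious signatures"]

print(tech_hints("https://example.com"))
```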

What we can learn

We had high hopes for finding “silver bullets” among website technologies that could launch us all to higher rankings.

The reality turns out to be much more complex.

While technologies like great hosting, CDNs, and social widgets can help set up an environment for improving SEO, they don’t do the work for us. Even our own Moz Analytics, with all its SEO-specific software, can’t help improve your website visibility unless you actually put the work in.

Are there any website technologies you’d like us to study next time around? Let us know in the comments below!



Hummingbird’s Unsung Impact on Local Search

Posted by David-Mihm

Though I no longer actively consult for clients, there seems to have been a significant qualitative shift in local results since Google’s release of Hummingbird that I haven’t seen reported on search engine blogs and media outlets. The columns I have seen have generally espoused advice on taking advantage of what Hummingbird was designed to do, rather than looking at the outcome of the update.

From where I sit, the outcome has been a slightly lower overall quality in Google’s local results, possibly due in part to a “purer” ranking algorithm in local packs. While these kinds of egregious results reported soon after Hummingbird’s release have mostly disappeared, it’s the secondary Hummingbird flutter, which may have coincided with the November 14th “update,” that seems to have caused the most noticeable changes.

I’ll be working with Dr. Pete to put together more quantitative local components of Mozcast in the coming months, but for the time being, I’ll just have to describe what I’m seeing today with a fairly simplistic analysis.

To do the analysis, I performed manual searches for five keywords, both geo-modified and generic, in five diverse markets around the country. I selected these keywords based on terms that I knew Google considered to have “local intent” across as broad a range of industries as I could think of. After performing the searches, I took note of the top position and number of occurrences of four types of sites, as well as position and number of results in each “pack.”

  • Keywords: personal injury lawyer, assisted living facility, wedding photographer, electrician, pet store
  • Markets: Chicago, Portland, Tampa, Burlington, Flagstaff
  • Result type taxonomy: national directory (e.g., Yelp); regional directory (e.g., ArizonaGolf.com); local business website (e.g., AcmeElectric.com); barnacle webpage (e.g., facebook.com/acmeelectric); national brand (e.g., Petsmart.com)
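For anyone who wants to repeat this at home, the bookkeeping is simple once the observations are written down. Here’s a hypothetical sketch of the tallying, assuming a hand-recorded CSV whose layout and column names are my own invention, not the actual notes behind this post.

```python
# Sketch: summarize hand-recorded SERP observations by result type.
# "serp_observations.csv" and its columns are hypothetical placeholders.
import pandas as pd

obs = pd.read_csv("serp_observations.csv")
# expected columns: keyword, market, result_type, position

# How often each result type appeared, and how high it tended to rank.
summary = (
    obs.groupby("result_type")["position"]
    .agg(occurrences="count", best_position="min", mean_position="mean")
    .round(1)
    .sort_values("occurrences", ascending=False)
)
print(summary)

# Share of keyword/market queries with a national directory in the top 3.
per_query = obs.groupby(["keyword", "market"]).apply(
    lambda g: ((g["result_type"] == "national directory") & (g["position"] <= 3)).any()
)
print(f"National directory in top 3 organic results: {per_query.mean():.0%} of queries")
```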

I also performed an even smaller analysis using three keywords that returned carousel results (thanks to SIM Partners for this sample list of keywords): “golf course,” “restaurant,” and “dance club.”

Again, a very simple analysis that is by no means intended to be a statistically significant study. I fully realize that these results may be skewed by my Portland IP address (even though I geo-located each time I searched for each market), data center, time of day, etc.

I’ll share with you some interim takeaways that I found interesting, though, as I work on a more complete version with Dr. Pete over the winter.

1. Search results in search results have made a comeback in a big way

If anything, Hummingbird or the November 14th update seem to have accelerated the trend that started with the Venice update: more and more localized organic results for generic (un-geo-modified) keywords.

But the winners of this update haven’t necessarily been small businesses. Google is now returning specific metro-level pages from national directories like Yelp, TripAdvisor, Findlaw, and others for these generic keywords.

This trend is even more pronounced for keywords that do include geo-modifiers, as the example below for “pet store portland” demonstrates.

Results like the one above call into question Google’s longstanding practice of minimizing the frequency with which these pages occur in its search results. While the Yelp example above is one of the more blatant instances that I came across, plenty of directories (including WeddingWire, below) are benefitting from similar algorithmic behavior. In many cases the pages that are ranking are content-thin directory pages—the kind of content whose visibility Panda, and to some extent Penguin, were supposed to minimize.

Overall, national directories were the most frequently-occurring type of organic result for the phrases I looked at—a performance amplified when considering geo-modified keywords alone.

National brands are underrepresented as a result type because of the ‘personal injury lawyer,’ ‘electrician,’ and ‘wedding photographer’ keyword choices. For the keywords where relevant national brands do exist (‘assisted living facility’ and ‘pet store’), those brands performed quite well.

2. Well-optimized regional-vertical directories accompanied by content still perform well

While a number of thriving directories were wiped out by the initial Panda update, here’s an area where the Penguin and Hummingbird updates have been effective. There are plenty of examples of high-quality regionally focused content rewarded with a first-page position—in some cases above the fold. I don’t remember seeing as many of these kinds of sites over the last 18 months as I do now.

Especially if keywords these sites are targeting return carousels instead of packs, there’s still plenty of opportunity to rank: in my limited sample, an average of 2.3 first-page results below carousels were for regional directory-style sites.

3. There’s little-to-no blending going on in local search anymore

While Mike Blumenthal and Darren Shaw have theorized that the organic algorithm still carries weight in terms of ranking Place results, visually, authorship has been separated from place in post-Hummingbird SERPs.

Numerous “lucky” small businesses (read: well-optimized small businesses) earned both organic and map results across all industries and geographies I looked at.

4. When it comes to packs, position 4 is the new 1

The overwhelming majority of packs seem to be displaying in position 4 these days, especially for “generic” local intent searches. Geo-modified searches seem slightly more likely to show packs in position #1, which makes sense since the local intent is explicitly stronger for those searches.

Together with point #3 in this post, this is yet another factor that is helping national and regional directories compete in local results where they couldn’t before—additional spots appear to have opened up above the fold, with authorship-enabled small business sites typically shown below rather than above or inside the pack. 82% of the searches in my little mini-experiment returned a national directory in the top three organic results.

5. The number of pack results seems now more dependent on industry than geography

This is REALLY hypothetical, but prior to this summer, the number of Place-related results on a page (whether blended or in packs) seemed to depend largely on the quality of Google’s structured local business data in a given geographic area. The more Place-related signals Google had about businesses in a given region, and the more confidence Google had in those signals, the more local results they’d show on a page. In smaller metro areas for example, it was commonplace to find 2- and 3-packs across a wide range of industries.

At least from this admittedly small sample size, Google increasingly seems to show a consistent number of pack results by industry, regardless of the size of the market.

Keyword | # in Pack | Reason for Variance
assisted living facility | 6.9 | 6-pack in Burlington
electrician | 6.9 | 6-pack in Portland
personal injury lawyer | 6.4 | Authoritative OneBox / Bug in Chicago
pet store | 3.0 |
wedding photographer | 7.0 |

This change may have more to do with the advent of the carousel than with Hummingbird, however. Since the ranking of carousel results doesn’t reliably differ from that of (former) packs, it stands to reason that visual display of all local results might now be controlled by a single back-end mechanism.

6. Small businesses are still missing a big opportunity with basic geographic keyword optimization

This is more of an observational bullet point than the others. While there were plenty of localized organic results featuring small business websites, these tended to rank lower than well-optimized national directories (like Yelp, Angie’s List, Yellowpages.com, and others) for small-market geo-modified phrases (such as “electrician burlington”).

For non-competitive phrases like this, even a simple website with no incoming links of note can rank on the first page (#7) just by including “Burlington, VT” in its homepage Title Tag. With just a little TLC—maybe a link to a contact page that says “contact our Burlington electricians”—sites like this one might be able to displace those national directories in positions 1-2-3.

7. The Barnacle SEO strategy is underutilized in a lot of industries

Look at the number of times Facebook and Yelp show up in last year’s citation study I co-authored with Whitespark’s Darren Shaw. Clearly these are major “fixed objects” to which small businesses should be attaching their exoskeletons.

Yet 74% of searches I conducted as part of this experiment returned no Barnacle results.

This result for “pet store chicago” is one of the few barnacles that I came across—and it’s a darn good result! Not only is Liz (unintentionally?) leveraging the power of the Yelp domain, but she gets five schema’d stars right on the main Google SERP—which has to increase her clickthrough rate relative to her neighbors.

Interestingly, the club industry is one outlier where small businesses are making the most of power profiles. This might have been my favorite result—the surprisingly competitive “dance club flagstaff” where Jax is absolutely crushing it on Facebook despite no presence in the carousel.

What does all this mean?

I have to admit, I don’t really know the answer to this question yet. Why would Google downgrade the visibility of its Place-related results just as the quality of its Places backend has finally come up to par in the last year? Why favor search-results-in-local-search-results, something Google has actively and successfully fought to keep out of other types of searches for ages? Why minimize the impact of authorship profiles just as they are starting to gain widespread adoption by small business owners and webmasters?

One possible reason might be in preparation for more card-style layouts on mobile phones and wearable technology. But why force these (I believe slightly inferior) results on users of desktop computers, and so far in advance of when cards will be the norm?

At any rate, here are five takeaways from my qualitative review of local results in the last couple of months.

  1. Reports of directories’ demise have been greatly exaggerated. For whatever reason (?), Google seems to be giving directories a renewed lease on life. With packs overwhelmingly in the fourth position, they can now compete for above-the-fold visibility in positions 1-2-3, especially in smaller and mid-size metro areas.
  2. Less-successful horizontal directories (non-Yelps and TripAdvisors, e.g.) should consider the economics of their situation. Their ship has largely sailed in larger metro areas like Chicago and Portland. But they still have the opportunity to dominate smaller markets. I realize you probably can’t charge a personal injury lawyer in Burlington what you charge his colleague in downtown Chicago. But, in terms of the lifetime value of who will actually get business from your advertising packages, the happy Burlington attorney probably exceeds the furious one from Chicago (if she is even able to stay in business through the end of her contract with you).
  3. The Barnacle opportunity is huge, for independent and national businesses alike. With Google’s new weighting towards directories in organic results and the unblending of packs, barnacle listings present an opportunity for savvy businesses to earn three first-page positions for the same keyword—one pack listing, one web listing, and one (or more) barnacle listing.
  4. National brands who haven’t taken my advice to put in a decent store locator yet should surely do so now. Well-structured regional pages, and easily-crawled store-level pages, can get great visibility pretty easily. (If you’re a MozCon attendee or have purchased access, you can learn more about this advice in my MozCon 2013 presentation.)
  5. Andrew Shotland already said it in the last section of his Search Engine Land column, but regionally-focused sites—whether directories or businesses—should absolutely invest in great content. With Penguin and Hummingbird combined, thin-content websites of all sizes are having a harder time ranking relative to slightly thicker content directories.

Well, that’s my take on what’s happening in local search these days…is the Moz community seeing the same things? Do you think the quality of local results has improved or declined since Hummingbird? Have you perceived a shift since November 14th? I’d be particularly interested to hear comments from SEOs in non-U.S. markets, as I don’t get the chance to dive into those results nearly as often as I’d like.


11 Marketing Survival Lessons Learned from Accidentally Enraging an Island City-State

Posted by DannyDover

My initial response to the massive traffic increase was not exactly professional.

“HOLY FREAKING CRAP BALLS!” I blurted out. I searched the room for a fellow nerd to share my e-thusiasm with, but only found a room full of strangers eating sandwiches.

Over the course of the next few days, the post received more than 600,000 unique visitors. If you segment the traffic to include only visits from Singapore, the number of unique visitors is equivalent to 10% of the entire population of the country (although admittedly this metric is a bit inflated due to people reading the post on multiple devices).

Some context

I support myself financially as a storytelling consultant. On a day-to-day level this means I work on marketing strategy, creative writing, and web development. Admittedly it is a weird mix, but I enjoy the lifestyle.

I am currently living in Vietnam, but recently spent two months living in Singapore.

Like I do with all of my travels, I penned a blog post about my experience living in Singapore and hit publish. You can read the entire post here, but the quick summary is:

  • Singapore has accomplished a lot in a short amount of time.
  • I am deeply concerned that the societal and cultural costs of these accomplishments are harming the populace (I cited concerning data points related to stress).
  • I have limited time and resources, and will not be returning to Singapore.

My blog is fairly well read, so I was surprised that this post started out as one of my least-read posts. After a few days the post was, for most intents and purposes, just another link in the archive.

Last Wednesday, I grabbed my normal Vietnamese breakfast (a local sandwich called a Bánh mì and a coconut milk-based smoothie) and went into my co-working office to start on my to-do list for the day.

I have been trying to convert bad online habits into good ones, so when I found myself craving a peek at Facebook, I clicked on my Google Analytics shortcut instead. It opened up my real-time report, and I practically dropped my meal.

Marketing lessons learned

The next few days were the craziest marketing adventure that I have ever had. The following are the key lessons I learned from this experience:

1. Honesty is power

I think the key reason that this post resonated with people was that it was uncommonly honest. (This is a trait I picked up from Rand when I worked at Moz. It isn’t a marketing trait, it is a life trait.) This post was published on my personal blog where I don’t have any ads or up-sells. I write posts there solely because I enjoy writing. In this case, I thought I had some interesting insights about Singapore and wanted to share my honest thoughts. The power in this was that when people read it, they too wanted to share my thoughts (along with their own!) with their online friends.

2. Be conscious of the clickstream

In the post I cited some suicide statistics that were quite alarming. As the thousands of comments about the post came in (mostly via Facebook), I continually received the criticism that my data was incorrect. I triple-checked my sources (they checked out) and tried to reply to as many of the false claims of bad data as possible. It wasn’t until two days later that I realized that people Googling the statistics were taken straight to a Wikipedia article that listed outdated data. After I updated the Wikipedia article to include the most recent data, the data criticism comments immediately stopped. I could have saved myself a giant headache if I had just viewed the situation from the readers’ perspective and found the misinformation on Wikipedia earlier.

3. Be a first-responder

As the comments came in, I was alerted (rudely and repeatedly) that I had erroneously cited a date as 2011 rather than 2001. My first thought was just to subtly update the number but was worried this might start a backlash. For this reason, I called Jessica Dover. Jessica has worked on social media strategy for many of the world’s most well known celebrities and has solved more social media problems than I have followers. (Disclaimer: She also happens to be my sister, but I honestly think that has hindered her more than helped her :-p. Her success is hard-earned and her own.) Without hesitation, she told me exactly what to do.

  1. Publicly thank the readers for all of their feedback.
  2. Acknowledge that you are listening to them.
  3. Acknowledge the error and then actually fix it.

This strategy worked wonders. I fixed my mistake, and the number of comments on the blog post quadrupled (after the audience was reassured that I was listening and responding). Huge win!

If you don’t have your own social media mentor like Jessica, Moz’s Q&A can be a great source of information.

4. Patch the holes in your net

At the onset, I was receiving a lot of traffic but none of it was converting (my conversion events were email captures and social follows). When I couldn’t fix this myself, I called another member of my marketing SWAT team, Joe Chura. Joe runs an agency called Launch Digital Marketing. I think they are the most underrated team in the industry. In no time, they had a plan. Following their advice I installed two WordPress plugins:

  • MailChimp for WordPress Lite: There are lots of plugins that add MailChimp to a WordPress site, but this is the only one I know of that adds an opt-in checkbox below your comment reply box. If your readers are already entering their e-mail address in order to leave a comment, they might as well be asked if they want to sign up for your newsletter. For the checkbox label, I used the text “I want to be kept up-to-date on Life Listed and receive free resources!”
  • Flare: This is my favorite social media sharing plugin (there are countless other options). This version is technically no longer under active development (they are building a new version to replace it), so I had disabled it on my site. Launch convinced me to re-add it.

After I added these plugins, it doubled the size of my mailing list and started what eventually became a viral spread of the blog post on Twitter. These were huge wins. (Hat tip to Dan Andrews for being at the forefront of that Twitter storm.)

Again, if you don’t have your own marketing SWAT team, Moz’s Q&A can be a great resource.

5. If you have to think about server optimization, it is too late

Throughout the entire process my server never went down. I credit this to two things:

First, props to WP Engine (my host) for being seamless. They handled the spike without any hiccups or annoying interruptions. I will likely have to pay an overage fee, but that is a MUCH better option than having a site outage.

Second, I credit preparation. I have long been using a tool called http://gtmetrix.com/ to diagnose speed problems on my site. (Hat tip to Jon over at Raven for introducing this tool to me.) I love this tool because it combines Google’s PageSpeed and Yahoo’s YSlow into one convenient, easy-to-understand interface. Luckily, I had implemented all of the recommended fixes well before this traffic spike. I am kind of a speed optimization nerd. :-p

6. Take comfort in the negativity slope

When I first posted the blog post, no one cared. When it started to gain some traction, I was immediately told how stupid it and I were. As it gained momentum, the number of naysayers increased. It wasn’t until the post reached full velocity that the supporters started to outnumber the naysayers. This has been a trend that I have observed with all of my successful content. I now take comfort in knowing that it is going to get worse until it suddenly gets better. Negativity online is a slope, and luckily it does have a peak.

7. Facebook’s walled garden is much worse than it was before

Facebook once offered a tool called Facebook Insights for Domains. This tool allowed you to get valuable information on any traffic that was referred to your verified domain from Facebook. Unfortunately, Facebook has killed it off. When my post went viral on Facebook, I had no visibility other than that the traffic was coming from Facebook and Facebook mobile. I had no idea what pages or groups the applicable conversations were happening on, and thus had no way to respond to conversations happening behind the wall. This was a huge frustration throughout the whole process.

8. A rising tide…

When people came to my website to read the Singapore post, many of them checked out my other posts as well (this is to be expected). In response to this, I published a post that I thought would also be applicable to the new readers. Due to the increased visibility, this post (on useful money philosophies) subsequently went mildly viral. This in turn drove even more conversions.

9. Be aware of parallel universes

Stories exist in parallel universes:

  • What the storyteller experiences
  • The story the storyteller shares
  • The story as the audience members understand it

These are all very different stories!

Many of the comments, compliments, and criticisms that I received about the Singapore post had absolutely nothing to do with the words written in my article. For many, it was their personal experiences, not my blog post, that drove their responses. At first, this was a major frustration point for me. It wasn’t until I mapped out the perspectives in the above list that I calmed down and started to appreciate the storytelling experience.

10. Listen first, then wait, then react

When the responses came in, I was vastly outnumbered (it was literally 500,000 to 1)! The only way I was able to deal with that amount of volume was to listen, learn from an expert (see lesson 3), collect data, process that data, and then react. I let the first several dozen comments come in before I started to respond. I think this was critical in me being able to follow and supplement the large-scale discussion.

11. Titles are 60% of the battle

The click-worthiness of the blog post title was a major contributing factor to its success, second only to its honesty. Admittedly, it was an attention-grabbing title, but at the same time it was true: I actually will never be returning to Singapore. I didn’t perform any keyword research or A/B tests when picking the blog post title. Instead, I just picked something that I figured I would want to click. The best titles are always that simple.

When I look back on this marketing adventure, I feel thankful. The world, not just Singapore, is in an amazing state of change right now. I am glad that my little voice was able to contribute a little bit to the global discussion.

If you would like to hear about other marketing adventures, feel free to connect with me on Google+.

