Archives for: seo

How to Improve Your Conversion Rates with a Faster Website

Posted by Zoompf

This post was originally in YouMoz, and was promoted to the main blog because it provides great value and interest to our community. The author’s views are entirely his or her own and may not reflect the views of Moz, Inc.

Back in August the team at Zoompf published a joint research study with Moz analyzing How Website Speed Actually Impacts Search Ranking. In this research, a surprise result showed no clear correlation between page load time and search ranking. This confounded us, since we expected to see at least some small measure of correlation, especially after Google announced in 2010 that site speed would have a partial impact on search ranking. We did, however, observe a correlation between “Time to First Byte” and search ranking, and we delved into more detail in our follow-up post.

Readers of both articles noted that while page load time may not appear to directly impact search ranking, it still has an obvious impact on user experience and will likely play a growing role in search ranking in the future. In other words, page load time should still be treated as a priority for the success of your site.

But how big of a priority is it, really? Of course it depends: The slower your site is now, the further your user experience lags behind your competitors’. Additionally, the more traffic your site receives, the more benefit you’ll see from performance optimization (we’ll dig into that more below).

The good news is that, unlike the impact on search ranking, there is a wide body of independent research showing clear causation between improved site performance and increased conversion rates, user engagement, and customer satisfaction. It also just makes sense—we’ve all visited slow websites, and we’ve all bailed out when the page takes too long to load. On mobile we’re even less patient.

What may be surprising, though, is just how big an impact slow performance can have on your conversions. Let’s look at that first.

The research

Back in 2006, Amazon presented one of the first studies linking a clear causation between page load time and online customer revenue, summarized in Greg Linden’s presentation Make Data Useful. Through A/B testing, Greg showed that every 100-millisecond delay in page rendering time resulted in a 1% loss of sales for Amazon.

In more recent research, Intuit presented findings at Velocity 2013 from an effort to reduce page load time from 15 seconds to 2 seconds. They observed a stair-step increase in conversions for every second shaved off their page load time, with the gains diminishing as the site got faster. Specifically:

  • +3% conversions for every second reduced from 15 seconds to 7 seconds
  • +2% conversions for every second reduced from 7 seconds to 5 seconds
  • +1% conversions for every second reduced from 4 seconds to 2 seconds

In other words, there was tremendous value in the initial optimization, and diminishing value as they got faster.

In another recent report, Kyle Rush from the 2011 Obama for America campaign site showed through A/B testing that a 3-second reduction in page load time (from 5 seconds to 2 seconds) improved onsite donations by 14%, resulting in an increase of over $34 million in election contributions.

In fact, there’s a wide body of research supporting clear economic benefits of improving your site performance, and clearly the slower your site is, the more you have to gain. Additionally, the higher your traffic, the larger the impact each millisecond will yield.

How fast should I be?

Whenever we talk with people about web performance, they always want to know, “How fast should I be?” Unfortunately, this is a hard question to answer, since the target depends on your business goals. Those in the performance industry (of which, full disclosure, Zoompf is a member) may push you to hit two seconds or less, citing research such as that from Forrester showing that 47% of users expect pages to load in two seconds or less.

We prefer a more pragmatic approach: You should optimize to the point where the ROI continues to make sense. The higher your traffic, the more monetary difference each millisecond gained will make. If you’re Amazon.com, a 200-ms improvement could mean millions of dollars. If you’re just launching a new site, getting down to 4-6 seconds may be good enough. It’s really a judgment call based on your current traffic levels, where your competition sits, your budget, and your strategic priorities.

The first step, though, is to measure where you stand. Fortunately, there’s a great free tool supported by Google at WebPageTest.org that can measure your page load time from various locations around the world. If you receive a lot of international traffic, don’t just select a location close to home—see how fast your site is loading from Sydney, London, Virginia, etc. The individual results may vary quite a bit! WebPageTest has a lot of bells and whistles, so check out this beginner’s guide to learn more.
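
If you want to track these measurements over time or from several locations at once, WebPageTest also exposes a REST API you can script against. The following is a minimal sketch, not a definitive integration: it assumes Node.js 18+ (for built-in fetch), a free WebPageTest API key, and location names that are purely illustrative placeholders.

  // Minimal sketch: kick off WebPageTest runs from several locations via its REST API.
  // Assumes Node.js 18+ (built-in fetch); the API key and location names are placeholders.
  const API_KEY = 'YOUR_WPT_API_KEY';
  const locations = ['Dulles:Chrome', 'Sydney:Chrome', 'London:Chrome'];

  async function runTest(url, location) {
    const endpoint = 'https://www.webpagetest.org/runtest.php' +
      '?url=' + encodeURIComponent(url) +
      '&location=' + encodeURIComponent(location) +
      '&f=json&k=' + API_KEY;
    const response = await fetch(endpoint);
    const result = await response.json();
    console.log(location, result);   // logs the raw JSON envelope, which includes links to the test results
  }

  for (const loc of locations) {
    runTest('http://mysite.com/', loc);
  }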

Where do I start?

Improving the performance of your site can seem daunting, so it’s important you start with the low-hanging fruit. Steve Souders, the Head Performance Engineer at Google, has famously stated:

“80-90% of the end-user response time is spent on the front-end. Start there.”

This has come to be called the Performance Golden Rule. In layman’s terms, this means that while optimizing your web server and database infrastructure is important, you will get a higher return on your time investment by first optimizing the front-end components loaded by your users’ browsers. This means all the images, CSS, JavaScript, Flash and other resources linked as dependencies from your base HTML page.

You can see the Performance Golden Rule well illustrated in a typical waterfall chart returned by tools like WebPageTest. Note how the request for the original base page accounts for only a small fraction of the overall time. Generating that base page is where all the back-end server work is done. However, all the other resources included by that page (images, CSS, etc.) are what take the large majority of the time to load:

[Waterfall chart: the base HTML request is only a small fraction of the total page load time]

So how can you speed up your front-end performance and reap the rewards of a better user experience? There are literally hundreds of ways. In the sections below, we will focus on the high-level best practices that generally yield the most benefit for the least amount of effort.

Step 1: Reduce the size of your page

Bloated content takes a long time to download. By reducing the size of your page, you not only improve your speed, you also reduce the network bandwidth you use, which your hosting provider charges you for.

An easy optimization is enabling HTTP compression, which can often reduce the size of your text resources (HTML, CSS, and JavaScript) by 50% or more. WhatsMyIP.org has a great free tool to test whether compression is turned on for your site. When using it, don’t just test the URL of your home page; also test links to your JavaScript and CSS files. Often we find compression is turned on for HTML files, but not for JavaScript and CSS, so configuring your server to compress those as well can yield a considerable performance boost. Keep in mind, though, you do NOT want your images to be compressed by the server, as they are already compressed; the extra server processing time will only slow things down. You can learn more in this detailed guide on what content you should compress on your website.

If you find your server is not using compression, talk to your server admin or hosting provider to turn it on. It’s often a simple configuration setting; for example, see the mod_deflate module for Apache, the IIS 7 configuration docs, or this article on enabling compression on WordPress sites.
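
If your site happens to run on Node.js behind Express rather than Apache or IIS, the same idea applies. Here is a minimal sketch using the widely used compression middleware; the Express setup and port are assumptions for illustration, and both packages need to be installed via npm:

  // Minimal sketch: enable gzip/deflate compression for text responses in Express.
  // Assumes: npm install express compression
  var express = require('express');
  var compression = require('compression');

  var app = express();
  app.use(compression());            // compresses compressible responses (HTML, CSS, JS)
  app.use(express.static('public')); // images are already compressed, so the middleware skips them by default

  app.listen(3000);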

In addition, images often account for 80% or more of your total page download size, so it’s very important to optimize them as well. Follow these best practices to cut your image sizes down, in some cases by 50% or more (a small scripted example follows this list):

  • Don’t use PNG images for photos. JPEG images compress photographs to significantly smaller sizes with great image quality. For example, on Windows 8 launch day, the Microsoft homepage used a 1 megabyte PNG photograph when a visually comparable JPEG would have been 140k! Think of all the wasted bandwidth on that one image alone!
  • Don’t overuse PNGs for transparency. Transparency is a great effect (and one JPEG doesn’t support), but if you don’t need it, you don’t need the extra file size of a PNG, especially for photographic images. PNGs work better for logos and images with sharp contrast, like text.
  • Correctly set your JPEG image quality. Using a quality setting of 50-75% can significantly reduce the size of your image without noticeable impact on image quality. Of course, each result should be individually evaluated. In most cases your image sizes should all be less than 100k, and preferably smaller.
  • Strip out extraneous metadata from your images. Image editors leave a lot of “junk” in your image files, including thumbnails, comments, unused palette entries and more. While these are useful to the designer, they don’t need to be downloaded by your users. Instead, have your designer make a backup copy for their own use, and then run the website image versions through a free optimizer like Yahoo’s Smush.It or open source tools like pngcrush and jpegtran.
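
If you would rather script these optimizations into your build process than run each image through a GUI tool, here is a minimal sketch using the sharp library for Node.js; the choice of library and the file names are assumptions for illustration only. It converts a photographic PNG into a quality-75 JPEG and, because sharp drops metadata unless told otherwise, strips the extraneous “junk” at the same time:

  // Minimal sketch: convert a photographic PNG to a smaller JPEG and strip metadata.
  // Assumes: npm install sharp; 'photo.png' is a hypothetical input file.
  var sharp = require('sharp');

  sharp('photo.png')
    .jpeg({ quality: 75 })   // 50-75% quality is usually indistinguishable for photographs
    .toFile('photo.jpg')     // EXIF, thumbnails, and comments are dropped unless .withMetadata() is called
    .then(function (info) {
      console.log('Wrote photo.jpg: ' + info.size + ' bytes');
    });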

Lastly, another good way to reduce your page size is to minify your JavaScript and CSS. “Minification” is a process that strips the extra comments and whitespace out of your code and shortens the names of functions and variables. This is best seen by example:

Example: Original JavaScript

 /* ALERT PLUGIN DEFINITION
  * ======================= */
  var old = $.fn.alert
  $.fn.alert = function (option) {
    return this.each(function () {
      var $this = $(this)
        , data = $this.data('alert')
      if (!data) $this.data('alert', (data = new Alert(this)))
      if (typeof option == 'string') data[option].call($this)
    })
  }
  $.fn.alert.Constructor = Alert

Minified Version (from YUI Compressor):

var old=$.fn.alert;$.fn.alert=function(a){return this.each(function(){var c=$(this),b=c.data("alert");if(!b){c.data("alert",(b=new Alert(this)))}if(typeof a=="string"){b[a].call(c)}})};

Your minified pages will still render the same, and minification can often reduce file sizes by 10-20% or more. As you can see, it also has the added benefit of obfuscating your code, making it harder for competitors to copy and modify your hard-earned work for their own purposes. JSCompress is a basic, easy online tool for JavaScript, or you can try more powerful tools like JSMin or Yahoo’s YUI Compressor (which also works for CSS). There’s also a useful online version of the YUI Compressor, which we recommend.
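
If you’d rather make minification an automated build step than paste code into an online tool, here is a minimal sketch using the uglify-js package for Node.js; this is just one option among many, the file names are illustrative, and it assumes uglify-js 3.x installed via npm:

  // Minimal sketch: minify a JavaScript file as part of a build or deploy script.
  // Assumes: npm install uglify-js
  var fs = require('fs');
  var UglifyJS = require('uglify-js');

  var source = fs.readFileSync('alert-plugin.js', 'utf8');
  var result = UglifyJS.minify(source);   // returns { code, error }

  if (result.error) throw result.error;
  fs.writeFileSync('alert-plugin.min.js', result.code);
  console.log('Minified ' + source.length + ' bytes down to ' + result.code.length);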

Step 2: Reduce the number of browser requests

The more resources your browser has to request to render your page, the longer it will take to load. A great strategy for reducing your page load time is to simply cut down the number of requests your page has to make. This means fewer images, fewer JavaScript files, fewer analytics beacons, etc. There’s a reason Google’s homepage is so spartan: the clean interface has very few dependencies and thus loads extremely fast.

While “less is more” should be the goal, we realize this is not always possible, so here are some additional strategies you can employ:

  • Allow browser caching. If your page dependencies don’t change often, there’s no reason the browser should download them again and again. Talk to your server admin to make sure caching is turned on for your images, JS, and CSS (a configuration sketch follows below). A quick test is to plug the URL of one of your images into redbot.org and look for an Expires or Cache-Control: max-age header in the result. For example, this image off the eBay home page will be cached by your browser for 28,180,559 seconds (just under a year).

[Screenshot: REDbot results showing the caching headers on an eBay image]

Cache-Control is the newer way of doing things, but you’ll often also see Expires used to support older browsers. If you see both, Cache-Control will “win” for newer browsers.

While browser-side caching will not speed up the initial page load of your site, it will make a HUGE difference on repeat views, often knocking 70% or more off the load time. You can see this clearly when looking at the “Repeat View” metrics in a WebPageTest test, for example:

[Screenshot: WebPageTest results comparing First View and Repeat View load times]
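
To make the caching point above concrete: the exact setting depends on your server, but as one illustration, here is a minimal sketch for an Express static file server (assuming the express package from npm and an illustrative directory layout) that sends a one-year Cache-Control max-age for images, JS, and CSS:

  // Minimal sketch: serve static assets with a long Cache-Control max-age.
  // Assumes: npm install express; the 'public' directory holds your images, JS, and CSS.
  var express = require('express');
  var app = express();

  app.use('/static', express.static('public', {
    maxAge: '365d'   // sent to the browser as Cache-Control: public, max-age=31536000
  }));

  app.listen(3000);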

  • Combine related CSS and JS files. While numerous individual CSS and JS files are easier for your developers to maintain, a smaller number of files loads much faster in the browser. If your files change infrequently, a one-time concatenation of files is an easy win (see the sketch after this list). If they change frequently, consider adding a step to your deploy process that automatically concatenates related groups of functionality prior to deployment. There are pros and cons to each approach, but there’s some great info in this StackOverflow thread.
  • Combine small images into CSS sprites. If your site has lots of small images (buttons, icons, etc.), you can realize significant performance gains by combining them all into a single image file called a “sprite.” Sprites are more challenging to implement, but can make a real difference for visually rich sites. See the CSS Image Sprites article on w3schools for more information, and check out the free tool SpriteMe.
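
As a concrete example of the concatenation step mentioned above, here is a minimal Node.js sketch (the file names are purely illustrative) that joins a group of related scripts into a single bundle before deployment:

  // Minimal sketch: concatenate related JS files into one bundle as a deploy step.
  // Assumes Node.js; the file list below is hypothetical and would come from your own project.
  var fs = require('fs');

  var files = ['js/jquery.plugins.js', 'js/carousel.js', 'js/analytics-wrappers.js'];
  var bundle = files
    .map(function (name) { return fs.readFileSync(name, 'utf8'); })
    .join('\n;\n');   // the extra semicolon guards against files that omit a trailing one

  fs.writeFileSync('js/site.bundle.js', bundle);
  console.log('Wrote js/site.bundle.js from ' + files.length + ' files');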

Step 3: Reduce the distance to your site

If your website is hosted in Virginia, but your users are visiting from Australia, it’s going to take them a long time to download your images, JavaScript and CSS. This can be a big problem if your site is content-heavy and you get a lot of traffic from users far away. Fortunately, there’s an easy answer: Sign up for a Content Delivery Network (CDN). There are many excellent ones out there now, including Akamai, Amazon CloudFront, CloudFlare and more.

CDNs work basically like this: you change the URL of your images, JS, and CSS from something like this:

http://mysite.com/myimage.png

to something like this (as per the instructions given to you from your CDN provider):

http://d34vewdf5sdfsdfs.cloudfront.net/myimage.png

This instructs the browser to request your image from the CDN network instead. The CDN provider returns the image to the browser if it already has it, or pulls it from your site and stores it for reuse if it doesn’t. The magic of CDNs is that they then copy that same image (or JavaScript or CSS file) to dozens, hundreds, or even thousands of “edge nodes” around the world and route each browser request to the closest available location. So if you’re in Melbourne and request an image hosted in Virginia, you may instead get a copy from Sydney. Just like magic.
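
In practice you rarely rewrite these URLs by hand. Most sites centralize the prefix in a small helper so that switching CDN hosts, or turning the CDN off in development, is a one-line change. Here is a minimal sketch; the helper name and the CDN hostname are made up for illustration:

  // Minimal sketch: prefix static asset paths with a CDN host in production only.
  // The hostname below is a placeholder; your CDN provider gives you the real one.
  var CDN_HOST = 'http://d34vewdf5sdfsdfs.cloudfront.net';

  function assetUrl(path) {
    if (process.env.NODE_ENV === 'production') {
      return CDN_HOST + path;   // e.g. http://.../myimage.png, served from the nearest edge node
    }
    return path;                // in development, keep serving /myimage.png from your own site
  }

  // Usage in a template: <img src="<%= assetUrl('/myimage.png') %>">
  console.log(assetUrl('/myimage.png'));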

To illustrate, picture the difference between a single centralized server and the same content duplicated on servers around the world.

In closing

While front-end performance does not currently appear to have a direct impact on search ranking, it has a clear impact on user engagement and conversions into paying customers. Since page load time also has a direct impact on user experience, it is very likely to have a future impact on search ranking.

While there are many ways to optimize your site, we suggest keeping three core principles in mind:

  1. Reduce the size of your page
  2. Reduce the number of browser requests
  3. Reduce the distance to your site

Within each of these, there are different strategies that apply based on the makeup of your site. We at Zoompf offer several free tools to help you determine which areas will make the biggest impact, including a free scan that analyzes your website for over 400 common causes of slow front-end performance. You can find them here: http://zoompf.com/free.

Happy hunting!



Help Us Improve the Moz Blog

Posted by Trevor-Klein

The last time we asked for your feedback on the Moz Blog was in April 2010, and quite a lot has changed since then. Panda and Penguin drove an industry-wide shift toward higher-quality content and links, and SEOs have come to realize that they can’t just be SEOs anymore. The percentage of web traffic coming from mobile devices has risen 600%, Google stopped providing data about keywords, and search engines started looking at far more than just your query to give you helpful results.

To keep up with those changes, SEOmoz became Moz, expanding the scope of its tools and content to reflect the increasingly diverse and interrelated areas of inbound marketing. Our audience has expanded accordingly, and in that vein, we want to be sure the blog is keeping pace.

Our mission is to help you all become better marketers, and to do that, we need to know more about you. What challenges do you all face? What are your pain points? Your day-to-day frustrations? If you could learn more about one or two (or three) topics, what would those be?

If you’ll help us out by taking this five-minute survey, we can make sure we’re offering the most useful and valuable content we possibly can. When we’re done looking through the responses, we’ll follow up with a post about what we learned.




Why Bad Linkbait Needs to Die: How Linkable Assets Deliver 10x More Value

Posted by Cyrus-Shepard

I hate bad linkbait, and it floods my inbox. Bad linkbait wastes our time, money, and our audience’s attention.

On the other hand, I love creating linkable assets. I also love searching the web for linkable assets and sharing them with others. Before we go any further, let’s define what we mean by linkbait, bad linkbait and linkable assets.

Linkbait: Website feature, usually content, meant to attract links for the purposes of SEO.

Bad Linkbait: Content that attracts links without adding additional value. One of the hallmark characteristics of bad linkbait is that it often rehashes the work of others, without creating anything new.

Linkable Assets: Content or features characterized by a high degree of practical utility or emotional engagement. Linkable assets often attract links over time due to the high value they offer.

The SEO problem with bad linkbait

Bad linkbait is not only less effective, but it often has very real SEO consequences down the line in terms of types of links earned and the relevance of the content. In extreme examples, we’ve seen instances of poorly executed linkbait leading to Penguin-style Google penalties.

While there is no single type of bad linkbait, the following characteristics are often defining hallmarks:

  1. Temporary spike in linking followed by a quick drop-off
  2. Meant to be scalable and easy
  3. Off-topic or marginally relevant content
  4. Visitors not likely to return
  5. Rehashed “Top 10” Lists
  6. Infographics without the “info”
  7. Controversy for the sake of controversy
  8. Commercial anchor text controlled by creator

The reason bad linkbait sucks so much energy is that you get almost no return on investment for the effort you put into it.

An example seen all the time is an infographic that is only marginally related to the subject matter of the website, such as those that Rand discussed in last week’s Whiteboard Friday. Imagine a plumbing company that makes an infographic called “10 Most Horrific Water Deaths Ever.”

  • The SEO company convinced them that the keyword “water” is related to plumbing, and this will help them to rank if they can get the infographic distributed widely enough. Maybe it will, but not nearly as much as if they created something truly new that was actually related to their core business.
  • The links they earn spike when they are actively pouring money and effort into sharing, but stop almost immediately after that.
  • The plumbing website has no other content about “horrific water deaths,” so the topic is only marginally related.
  • The links all have the same anchor text due to the widget used to embed the infographic. Google’s Penguin algorithm picks this up and penalizes them for “water” related keywords.
  • After 2 weeks, traffic trickles to almost nothing. The SEO company moves on to the next infographic.

Is there an easy solution? Take the same amount of time and money spent to create 2-3 pieces of mediocre linkbait, and spend that energy creating a truly remarkable linkable asset.

How linkable assets deliver 10x the value

The great thing about linkable assets is that, when successful, they take on a life of their own and the SEO benefit can grow to 10 or even 100 times what was originally anticipated.

Good linkable assets earn repeat visits and traffic over time. Links aren’t pushed, but earned in unexpected places with natural and topically relevant anchor text. Plus, when you publish valuable content actually related to your core subject matter, you help establish yourself as an authority on that topic and become more likely to appear in search results for topically relevant queries.

Because good linkable assets often earn a greater variety of links spread over time through value instead of aggressive link promotion, they are less likely to ever earn a Google penalty.

Examples of linkable assets include this worldwide guide to etiquette, this online salary calculator or even Moz’s Google Algorithm Change History.

Questions used to help identify linkable assets:

  1. Does it create something new?
  2. Does it make something easier?
  3. Is it likely to be used again and again?
  4. Does it reveal new insight or knowledge?
  5. Does it create something beautiful?
  6. Does it evoke a strong emotional response?
  7. Does it provide practical value?

Can linkable assets also be linkbait?

The most successful linkable assets possess the better qualities of fine linkbait. In fact, for SEO benefit, it’s essential that your linkable asset invoke a strong emotional response or be perceived as having high practical value.

This is the “sweet spot” in the middle that combines the best marketing value of linkbait with the added value of linkable assets.

Linkable assets: exemplary examples

Visual assets

Rand mentioned a good number in his recent Whiteboard Friday Why Visual Assets > Infographics, so I wanted to list a few more that offer high practical value and succeed in earning natural, highly-topical links.

Can an infographic act as a linkable asset? Yes, when it meets the requirements defined above.

This excellent Radiation Dose Chart infographic created by xkcd not only inspires awe but has been linked to thousands of times due to people wanting to share its practical utility.

Which Local Review Sites Should You Try to Get Review On? by LocalVisibilitySystem.org displays a ton of knowledge in a succinct and successful format.

Moz’s Web Developers SEO Cheat Sheet provides a visual asset we’re quite proud of.

For pure visual appeal, this Cheetah infographic by Jacob Neal is one of my all-time favorites. It stretches the boundaries of visual design and I found myself reading every word as a result.

Tools

ShareTally – Similar in function to SharedCount, ShareTally gives you a free and quick overview of important social metrics for any URL. This is one to bookmark.

Creative assets

Robby Leonardi’s Interactive Game Resume feels like playing a game and has led Robby to win multiple design awards.

Data sharing

Everyone has data if you look hard enough. Done at scale, the results can be truly outstanding.

The (not provided) Global Report aggregates data from over 5000 websites to display near real-time reporting of Google’s (not provided) keywords worldwide.

Studies

One of our favorite email providers, MailChimp, recently studied email subject line open rates. This graphic explores the effect of including a recipient’s first and last name in the subject line across various industries.

Moz’s own Search Engine Ranking Factors is consistently one of the most popular studies we publish.

Videos

Look no further than Wistia’s learning center for best practices on producing videos for your business. Check out this one they made on advanced video SEO with the guys from Distilled.

Endless possibilities for linkable assets

You can turn any unique knowledge into a linkable asset without shooting a video or adding fancy graphics. Think of folks like Seth Godin or Patrick McKenzie who regularly share their valuable thoughts with the world.

The key is to deliver the content in both a valuable and emotionally engaging way. If you are a talented writer, this is probably your best avenue. If not, then thinking outside the blog post box may be required.

What are your favorite examples of exemplary linkable assets? Let us know in the comments below.



Stop Thinking Keywords, Think Topics

Posted by katemorris

We have hired a few new people at Distilled—we’re always growing—but as I was explaining the Keyword Planner to our new hires I realized that we are all thinking the wrong way for the future of online marketing.

One of my colleagues, Tom Anthony, has a very scientific way of explaining it: The new query according to Google. He comes to the same conclusion I did: “We need to stop looking at keywords and start looking at queries.” In short, we need to focus on what the user is looking for rather than on all of the specific ways they can phrase it.

I am not going to try to convince you of this. We are here. This is the world we live in, so rather than adapting the old way of thinking to the new search order (NSO?), it’s time to change our thinking.

What does this really mean to us as practitioners on a day-to-day basis?

We have to stop using the term “keyword” as much as we can. It will never go away, don’t get me wrong, but our focus has to change. This means speaking differently, reporting differently, and changing the conversation with our clients about their goals.

You are going to get asked for a keyword research report or a keyword ranking report soon. We as search professionals have provided them in the past, so it’s normal for your boss or clients to expect a certain type of data or report. However, with the changes over the last few years, it’s time to modify what we report to align better with the data we can get and the data that is best for our goals.

Start by defining your goals

We’ve said this time and time again: You have to define what you want as a business before you can really get to doing your job in the best way. Your company goals could be:

  • to be a thought leader in your space
  • to grow the business
  • to launch a new product
  • to increase your company’s share of voice in the market

These goals should be set by the company collectively, not just by you. Your own goals are based on them: they should be measurable and should impact the company’s goals. Let’s say the company wants to grow its revenue by 50% next year; website performance can help with that through conversions, new visitors, and more overall traffic. Therefore your goals might look something like this:

  • Increase overall website traffic by 25%
  • Increase new visitor percentage from 25% to 40%
  • Increase conversion rate from the website from 45% to 70%

Notice that keyword ranking and traffic based on keywords are not in here. It’s doubtful they ever have been part of your defined goals, but knowing your goals and the company’s goals helps change the conversation.

Now, what do you want to accomplish?

Time to start the hard conversations. You should be reporting on your goals from above and what actions you are taking to affect those numbers. At some point your boss or client will ask for a keyword or ranking report. When they do, ask what they want to accomplish with that information. It’ll give you more insight into what they are looking for and how best to report that to them.

Most likely it’s so they know what your efforts are focused on, and that’s understandable.

  1. Start by explaining your goals and how they impact the bigger company goals.

  2. Then, explain the changes to the information that is sent to analytics, and that reporting on the keyword level is next to impossible.

  3. Finally, talk about how you want to stay dedicated to things that can be measured, and provide results to the company’s bottom line.

But… we have to RANK!

If they then say that the keyword is the most important thing for you to report on, ask why again. The answer is usually because that’s how you tell if your site is ranking for a term, or if your “SEO is working.”

Rankings happen for many reasons; the keyword or query is just the initiator of the process. You optimize a page to be the strongest it can be after you’ve made it the best page for a specific need or topic. There are multiple variations of keywords for any one topic, and therefore your focus should be on the page and the topic, not just one or two of potentially hundreds of keywords.

The two major factors in ranking that you can have an effect on are related to the target page. Having relevant content and strengthening the page are what you should be focused on as a search marketer. Look at the highest correlated factors to ranking from the 2013 Ranking Factors Survey. All of the top factors are page-related.

Your next question should be: If I stop thinking about keywords, how do I know what content to develop to rank?

That depends on the user. We as a profession have really lost sight of talking to the users of our websites. Think about any number of keyword research presentations from the past few years (I’ve given and seen a number of them) and you’ll notice that many of them pointed out how misleading the Google Keyword Tool’s numbers can be and recommended finding inspiration elsewhere, mainly wherever your target market hangs out.

If you want to know what content to write to “rank” for terms, ask the people who are searching for that topic what they are looking for and write that. This changes how we do research but I think for the better.

Changing reporting

I am going to leave you with how I have started reporting on page-level changes and how “SEO is doing.” You should again be reporting on the metrics you defined in your goals, but you’ll need to replace keyword-specific reports. I’m referring to reports like the number of keywords sending traffic (RIP; that was a favorite of mine), branded vs. non-branded keyword traffic, and ranking reports.

Step 1: Define all search landing pages

Technically, this should be all pages on your site except those that are noindexed, but most of us have a good idea of which pages get traffic from search engines. If you are a larger e-commerce site with thousands or millions of pages, you can group them into categories or by page type. Whatever works for you.

Step 2: Prioritize the top landing pages

Remember the terms you always had to report ranking on? What were the pages that needed to rank? Identify those and make a prioritized list just like you would have with keywords.

Step 3: Pull monthly traffic over the last year for those pages

You can automate this of course, but if you have a small number of pages it can be done by hand as well. Traffic is what you want to know about, and you want it to be going up. If traffic goes down to that page, that is your sign that something changed, either the SERPs or demand for that content. Just like if rankings went down, you’d investigate why after seeing that drop.
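
How you pull these numbers depends on your analytics package. As a minimal sketch, suppose you’ve exported a CSV of landing page, month, and visits (the file name and column layout here are invented purely for illustration); a few lines of Node.js can roll the traffic up per page:

  // Minimal sketch: total up monthly search-landing-page traffic from an exported CSV.
  // Assumes Node.js and a hypothetical file 'landing-pages.csv' with lines like:
  //   /blog/some-post,2013-09,1240
  var fs = require('fs');

  var totals = {};
  fs.readFileSync('landing-pages.csv', 'utf8')
    .trim()
    .split('\n')
    .forEach(function (line) {
      var parts = line.split(',');            // [page, month, visits]
      var page = parts[0];
      var visits = parseInt(parts[2], 10);
      totals[page] = (totals[page] || 0) + visits;
    });

  Object.keys(totals).forEach(function (page) {
    console.log(page + '\t' + totals[page] + ' visits over the period');
  });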

Step 4: Pull related data per page based on your goals

For the goals we defined above, I’d also report on the percentage of new visitors and conversions. You could report on bounce rate or time on page as well. Below is something that I recently sent to a client (modified to be able to share with a wider audience, of course).

I then investigated the pages that lost traffic and they are on my list to watch next month. This is just how I decided to do it for this client and I am interested to hear how you are having to change your reporting to deal with the changes in our world.

Please share your thoughts below, and have a great week!



Why Visual Assets > Infographics – Whiteboard Friday

Posted by randfish

The marketing industry seems to have a love-hate relationship with infographics. When they’re really done well, they can be effective ways of conveying a lot of complex information in a way that’s easier to digest. The problem is that relatively few of today’s infographics are really done well, and many are simply created for shallow SEO benefit.

In today’s Whiteboard Friday, Rand talks about the differences between infographics and visual assets, and why the latter are far more effective in our efforts.

For reference, here’s a still of this week’s whiteboard:

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I’m going to take a stance. It’s a little bit of a strong and contrarian stance. I’m going to say that I really, really dislike most infographics. In fact, not even most. The vast majority of infographics I strongly dislike. And that said, I really like visual assets. Today I’m going to try and explain the difference to you and show you why I’m a huge believer in one and such a disrespecter of the other.

So the typical infographic and the thing that frustrates me about it so much is that it’s really designed primarily to get embeds, to get links, potentially to get some traffic and build some branding. But it’s actually not optimized for a lot of these things. In fact, because the medium has both become so overused and because the execution on many of them is such poor quality, I find that they often hurt more than they help. Because of that, I’m not a fan.

So here is your typical infographic. How obsessed are Facebook users with celebrities? Oh my gosh, look, 35% have liked a celebrity’s page, and look, more and more people have liked more and more celebrity pages over time. Here’s a picture of some people, and here are words in some graphic format that’s really hard to read and unnecessary illustrations on the side just to ornament this thing up.

Then they hope that someone is going to pick it up and embed it on their news site, and occasionally this stuff does work. In fact, for a few years now it has worked. The challenge is it keeps going down and down and down. It’s reaching a point of diminishing returns, and I think that’s because audiences are really tired of the infographic format or are getting very tired, especially more sophisticated and savvy audiences, which for a lot of B2B and even many B2C marketers, let’s face it, we are reaching those areas.

Also, these things can be tremendously burdensome to try and put on a web page. They’re hard to read a lot of the time. So it makes it challenging even when someone does embed it. Google has said specifically that they’re looking at algorithmic ways that they can work around infographics that get embedded that people didn’t really mean to or intend to link back, and they are merely doing a link to the infographic because of the embed itself.

This kind of stuff, eh, I’m just not about that. I don’t think that most of us in the inbound marketing field should be about that, despite the potentially positive impact that something very similar can have.

So these are visual assets. There are many different kinds of visual assets. In fact, I would say infographics, traditional infographics are just one type of visual asset and possibly not the best one. In fact, probably not the best one in my opinion.

Photos, just a collection of pictures from relevant and interesting people, events, places, even concepts that are illustrated, these get picked up. They get shared around the web. They’re useful for social media networks. But they’re also useful to have in a photo library that people might take and use for all kinds of different reasons.

Charts and graphs that illustrate or explain the numbers behind a story or a phenomenon, these can be incredibly useful, and they get picked up and used all the time by sources that want to quote the numbers and even by sources that originated the numbers that are looking for visual ways to represent them. This is a phenomenal way to build value through visual assets.

Visual representations, I do stuff like this all the time. Think of the SEO Pyramid. It starts at the base with accessibility, and then we talk about keywords and links, social, user and usage signals, and all that kind of stuff. I’ve done some visuals like that on Whiteboard Friday, things like the ranking factors by distribution through the pie chart explaining those different things.

I’ve done stuff like the T-shaped web marketer, talking about going deep in a particular niche, but having a lot of cross domain expertise. These are not high-quality graphics. They’re made by me. I use Flash 6 to make these things, because I learned Flash way back in my days as a web designer. I’m lazy and have not learned to get good at Illustrator or Photoshop in particular. Yet, they get picked up and sent all over the place, and you can see visual assets doing the same thing in all sorts of niches.

Comics, illustrations, or storyboards that tell a narrative visually, incredibly popular and get picked up all the time. Screenshots: even just a simple screenshot with some annotation and explanation, examples of what to do, how to use it, how to interpret that information, layering on top some data, these types of visual assets have huge cachet and value.

You get a lot more opportunity from these kinds of visual assets, in my opinion and experience, for links, for referencing, being referenced by media outlets, by industry resources, by third parties, by people in your professional or personal sphere. You have more of an opportunity for embeds because they’re much simpler to embed, and they can be useful in so many more places than an infographic, which really needs to take up an entire post about it if it’s going to get referenced at all.

They give a lot more value to people. They’re simple to consume, to understand, and they’re useful and usable in ways that infographics often are not. And, a lot of the time they’re far simpler to execute. It doesn’t take a graphic designer to produce a ton of these different types of resources. It often doesn’t cost very much, if anything at all, to make them, and that means you can produce a far greater quantity of visual assets than you could of infographics and have potential there to get links, to get references, to build your brand in really authentic ways.

So I’m sure there will be some vigorous debate and discussion in the comments, and I look forward to it. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

