Looking Beyond Keywords: How to Drive Conversion with Visual Search & Search by Camera

Posted by Jes.Scholz

Let’s play a game. I’ll show you an image. You type in the keyword to find the exact product featured in the image online. Ready?

Google her sunglasses…

What did you type? Brown sunglasses? Brown sunglasses with heavy frame? Retro-look brown sunglasses with heavy frame? It doesn’t matter how long-tail you go, it will be difficult to find that exact pair, if not impossible. And you’re not alone.

For 74% of consumers, traditional text-based keyword searches are inefficient at helping them find the right products online.

But much of our current search behavior is based on the false premise that we can describe things in words. In many situations, we can’t.

And this shows in the data. Sometimes we forget that Google Images accounts for 22.6% of all searches — searches where traditional methods of searching were not the best fit.

Image credit: Sparktoro

But I know what you’re thinking. Image SEO drives few, if any, sessions, let alone conversions. Why should I invest my limited resources into visual marketing?

Because humans are visual creatures. And now, so too are mobile phones — with big screens, multiple cameras, and strong depth perception.

Developments in computer vision have led to a visual marketing renaissance. Just look at visual search leader Pinterest, who reported that 55% of their users shop on the platform. How well do those users convert? Heap Analytics data shows that on shopping cart sizes under $199, image-based Pinterest Ads have an 8.5% conversion rate. To put that in context, that’s behind Google’s 12.3% but ahead of Facebook’s 7.2%.

Not only can visual search drive significant conversions online; image recognition is also driving digitalization and monetization in the real world.

The rise of visual search in Google

Traditionally, image search functioned like this: Google took a text-based query and tried to find the best visual match based on metadata, markups, and surrounding copy.

But for several years now, the image itself has also been able to act as the search query. Google can search for images with images. This is called visual search.

Google has been quietly adding advanced image recognition capabilities to mobile Google Images over the last few years, with a focus on the fashion industry as a test case for commercial opportunities (although the functionality can be applied to automotive, travel, food, and many other industries). Plotting the updates, you can see clear stepping-stone technologies building on the theme of visual search.

  • Related images (April 2013): Click on a result to view visually similar images. The first foray into visual search.
  • Collections (November 2015): Allows users to save images directly from Google’s mobile image search into folders. Google’s answer to a Pinterest board.
  • Product images in web results (October 2016): Product images begin to display next to website links in mobile search.
  • Product details on images (December 2016): Click on an image result to display product price, availability, ratings, and other key information directly in the image search results.
  • Similar items (April 2017): Google can identify products, even within lifestyle images, and showcases similar items you can buy online.
  • Style ideas (April 2017): The flip side to similar items. When browsing fashion product images on mobile, Google shows you outfit montages and inspirational lifestyle photos to highlight how the product can be worn in real life.
  • Image badges (August 2017): Labels on the image indicate what other details are available, encouraging more users to click; for example, badges such as “recipe” or a timestamp for pages featuring videos. But the most significant badge is “product,” shown when the item is available for purchase online.
  • Image captions (March 2018): Display the title tag and domain underneath the image.

Combine these and you can see powerful functionality. Google is making a play to turn Google Images into a shoppable product discovery engine — trying to take a bite out of social discovery platforms and give consumers yet another reason to browse on Google, rather than your e-commerce website.

Image credit: Google

What’s more, Google is subtly leveraging the power of keyword search to introduce users to these new features. According to the May 1st MozCast, 18% of text-based Google searches have image blocks, which drive users into Google Images.

This fundamental change in Google Image search comes with a big SEO opportunity for early adopters. Not only for transactional queries, but higher up the funnel with informational queries as well.

Image: “kate middleton style” search results

Let’s say you sell designer fashion. You could not only rank #1 with your blog post on the informational query “kate middleton style,” including an image in your article result to enhance the clickability of your SERP listing, but also rank again on page 1 within the image pack, and then have your products featured in Similar Items — all of which drives more high-quality users to your site.

And the good news? This is super simple to implement.

How to drive organic sessions with visual search

The new visual search capabilities are all algorithmically selected based on a combination of schema and image recognition. Google told TechCrunch:

“The images that appear in both the style ideas and similar items grids are also algorithmically ranked, and will prioritize those that focus on a particular product type or that appear as a complete look and are from authoritative sites.”

This means that, on top of continuing to establish Domain Authority site-wide, you need images that are original, high resolution, and clearly focused on a single theme. But most importantly, you need images with perfectly implemented structured markup to rank in Google Images.

To rank your images, follow these four simple steps:

1. Implement schema markup

To be eligible for similar items, you need product markup on the host page that meets the minimum metadata requirements of:

  • Name
  • Image
  • Price
  • Currency
  • Availability

But the more quality detail, the better, as it will make your results more clickable.
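
For illustration, here’s a minimal JSON-LD sketch of that Product markup on a hypothetical product page (the product name, URLs, and values are placeholders to adapt to your own catalog):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org/",
  "@type": "Product",
  "name": "Retro Heavy-Frame Sunglasses",
  "image": "https://www.example.com/images/retro-sunglasses.jpg",
  "offers": {
    "@type": "Offer",
    "price": "89.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

Optional properties such as brand, description, and aggregateRating are where that extra quality detail lives.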

2. Check your implementation

Validate your implementation by running a few URLs through Google’s Structured Data Testing Tool. But remember, just being valid is sometimes not enough. Be sure to look into the individual field results to ensure the data is correctly populated and user-friendly.

3. Get indexed

Be aware, it can take up to one week for your site’s images to be crawled. This will be helped along by submitting an image XML sitemap in Google Search Console.
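
As a rough sketch (with example.com as a placeholder domain), an image XML sitemap entry looks something like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <!-- One <url> entry per page, listing the images that appear on it -->
  <url>
    <loc>https://www.example.com/products/retro-sunglasses</loc>
    <image:image>
      <image:loc>https://www.example.com/images/retro-sunglasses.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```

List every image you want crawled for each page, then submit the file in Search Console alongside your regular sitemap.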

4. Look to Google Images on mobile

Check your implementation by doing a site:yourdomain.cctld query on mobile in Google Images.

If you see no image result badges, you likely have an implementation issue. Go back to step 2. If you see badges, click a couple to ensure they show your ideal markup in the details.

Once you confirm all is well, then you can begin to search for your targeted keywords to see how and where you rank.

Like all schema markup, how items display in search results is at Google’s discretion and not guaranteed. However, quality markup will increase the chance of your images showing up.

It’s not always about Google

Visual search is not limited to Google. And no, I’m not talking about just Bing. Visual search is also creating opportunities to be found and to drive conversions on social networks such as Pinterest. Both platforms allow you to select objects within images to narrow down your visual search query.

Image credit: MarTech Today

On top of this, we also have shoppable visual content on the rise, bridging the gap between browsing and buying. At present, though, this is more often driven by data feeds and tagging than by computer vision. For example:

  • Brahmin offers shoppable catalogs
  • Topshop features user-generated shoppable galleries
  • Net-a-Porter’s online magazine features shoppable articles
  • Ted Baker runs campaigns with shoppable videos
  • Instagram & Pinterest both monetize with shoppable social media posts

Such formats reduce the number of steps users need to take from content to conversion. And more importantly for SEOs, they exclude the need for keyword search.

I see a pair of sunglasses on Instagram. I don’t need to Google the name, then click on the product page and then convert. I use the image as my search query, and I convert. One click. No keywords.

…But what if I see those sunglasses offline?

Digitize the world with camera-based search

The current paradigm for SEOs is that we wait for a keyword search to occur, and then compete. Not only for organic rankings, but also for attention versus paid ads and other rich features.

With computer vision, you can cut the keyword search out of the customer journey. By entering the funnel before the keyword search occurs, you can effectively exclude your competitors.

Who cares if your competitor has the #1 organic spot on Google, or has more budget for AdWords, or stronger core value proposition messaging, if consumers never see it?

Consumers can skip straight from desire to conversion by taking a photo with their smartphone.

Brands taking search by camera mainstream

Search by camera is well known thanks to Pinterest Lens. Built into the Pinterest app, Lens lets you simply point your phone’s camera at a product discovered offline to get online recommendations of similar items.

If you point Lens at a pair of red sneakers, it will find you visually similar sneakers, as well as ideas on how to style them.

Image credit: Pinterest

But camera search is not limited to e-commerce or fashion applications.

Say you take a photo of strawberries. Pinterest understands you’re not looking for more pictures of strawberries, but for inspiration, so you’ll see recipe ideas.

The problem? For you, or your consumers, Pinterest is unlikely to be a day-to-day app. To be competitive against keyword search, search by camera needs to become part of your daily habit.

Samsung understands this, integrating search by camera into its digital personal assistant, Bixby, with functionality backed by powerful partnerships.

  • Pinterest Lens powers its image search
  • Amazon powers its product search
  • Google translates text
  • Foursquare helps to find places nearby

Bixby failed to take the market by storm, and so is unlikely to be your go-to digital personal assistant. Yet with the popularity of search by camera, it’s no surprise that Google has recently launched its own version of Lens in Google Assistant.

Search engines, social networks, and e-commerce giants are all investing in search by camera…

…because of impressive impacts on KPIs. BloomReach reported that visits to e-commerce websites arriving via search by camera resulted in:

  • 48% more product views
  • 75% greater likelihood to return
  • 51% higher time on site
  • 9% higher average order value

Camera search has become mainstream. So what’s your next step?

How to leverage computer vision for your brand

As a marketer, your job is to find the right use case for your brand, that perfect point where either visual search or search by camera can reduce friction in conversion flows.

Many case studies are centered around snap-to-shop. See an item you like in a friend’s home, at the office, or walking past you on the street? Computer vision takes you directly from picture to purchase.

But the applications of image recognition are only limited by your vision. Think bigger.

Branded billboards, magazine ads, product packaging, even your brick-and-mortar storefront displays all become directly actionable. Digitalization with snap-to-act via a camera phone is QR codes on steroids, offering far more opportunities.

If you run a marketplace website, you can use computer vision to classify products: Say a user wants to list a pair of shoes for sale. They simply snap a photo of the item. With that photo, you can automatically populate the fields for brand, color, category, subcategory, materials, etc., reducing the number of form fields to what is unique about this item, such as the price.

A travel company can offer snap-for-info on historical attractions, a museum on artworks, a healthy living app on calories in your lunch.

What about local SEO? Not only could computer vision show the rating or menu of your restaurant before the user walks inside, but you could put up a bus stop ad calling for hungry travelers to take a photo. The image triggers Google Maps, showing public transport directions to your restaurant. You can take the customer journey, quite literally. Tell them where to get off the bus.

And building such functionality is relatively easy, because you don’t need to reinvent the wheel. There are many image recognition APIs to help you leverage pre-trained image classifiers, or with which you can train your own:

  • Google Cloud Vision
  • Amazon Rekognition
  • IBM Watson
  • Salesforce Einstein
  • Slyce
  • Clarifai

Let’s make this actionable. You now know computer vision can greatly improve your user experience, conversion rate, and sessions. To leverage this, you need to:

  1. Make your brand visually interactive through image recognition features
  2. Understand how consumers visually search for your products
  3. Optimize your content so it’s geared towards visual technology

Visual search is permeating online and camera search is becoming commonplace offline. Now is the time to outshine your competitors. Now is the time to understand the foundations of visual marketing. Both of these technologies are stepping stones that will lead the way to an augmented reality future.

