Tuesday, June 30, 2015

Help Us Improve the Moz Blog: 2015 Reader Survey

Posted by Trevor-Klein

In late 2013, we asked you all about your experience with the Moz Blog. It was the first time we'd collected direct feedback from our readers in more than three years—an eternity in the marketing industry. With the pace of change in our line of work (not to mention your schedules and reading habits), we didn't want to wait that long again, so we're taking this opportunity to ask you how well we're keeping up.

Our mission is to help you all become better marketers, and to do that, we need to know more about you. What challenges do you all face? What are your pain points? Your day-to-day frustrations? If you could learn more about one or two (or three) topics, what would those be?

If you'll help us out by taking this five-minute survey, we can make sure we're offering the most useful and valuable content we possibly can. When we're done looking through the responses, we'll follow up with a post about what we learned.

Thanks, everyone; we're excited to see what you have to say!

Can't see the survey? Click here to take it in a new tab.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Monday, June 29, 2015

Advanced Local SEO Competition Analysis

Posted by Casey_Meraz

Competition in local search is fierce. While it's typical to do some surface-level research on your competitors before entering a market, you can go much further down the SEO rabbit hole. In this article we will look at how you can find more competitors, pull their data, and use it to beat them in the search game.

Since there are plenty of resources out there on best practices, this guide will assume that you have already followed the best practices for your own listing and are looking for the little things that might make a big difference in putting you over your competition. So if you haven't already read how to perform the Ultimate Local SEO Audit or how to Find and Build Citations, you should probably start there.

Disclaimer: While it's important to mention that correlation does not mean causation, we can learn a lot by seeing what the competition has done.

Some of the benefits of conducting competitive research are:

  • You can really dive into your customers' market and understand it better.
  • You can figure out who your real customers are and better target them.
  • You can get an understanding of what your competitors have done that has been successful without re-inventing the wheel.

Once you isolate trends that seem to make a positive difference, you can create a hypothesis and test it. This allows you to constantly be testing, finding out what works, and growing those positive elements while eliminating the things that don't produce results. Instead of making final decisions based on emotion, base them on conversion data.

A good competition analysis will give you a strong insight into the market and allow you to test, succeed, or fail fast. The idea behind this process is to really get a strong snapshot of your competition at a glance to isolate factors you may be missing in your company's online presence.

Disclaimer 2: It's good to use competitors' ideas if they work, but don't make that your only strategy.

Before we get started

Below I will cover a process I commonly use for competition analysis. I have also created this Google Docs spreadsheet for you to follow along with and use for yourself. To make your own copy, simply go to File > Make a Copy. (Please don't ask me to add you as an owner. :)

Let's get started

1. Find out who your real competitors are

Whether you work internally or were hired as an outside resource to help with your client's SEO campaign, you probably have some idea of who the competition is in your space. Some companies may have good offline marketing but poor online marketing. If you're looking to be the best, it's a good idea to do your own research and see who you're up against.

In my experience it's always good to find and verify 5-10 online competitors in your space from a variety of sources. You can use tools for this or take the manual approach. Keep in mind that you have to screen the data tools give you with your own eye for accuracy.

How do you find your "real" competitors?

We're going to look at some tools you can use to find competitors here in a second, but keep in mind you want to record everything you find.

Make sure to capture the basic information for each competitor, including their company name, location, and website. This information will be useful later. Record it in the "competitor research" tab of the spreadsheet.

Method 1: Standard Google searches for competitors

This is pointing out the obvious, but if you have a set of keywords you want to rank for, you can look for trends and see who is already ranking where you want to be. Don't limit this to just one or two keywords; instead, get a broader list of the competitors out there.

To do this, simply come up with a list of several keywords you want to rank for and search for them in your geographic area. Make sure your Geographic preference is set correctly so you get accurate data.

  1. Collect a list of keywords
  2. Search Google to see which companies are ranking in the local pack
  3. Record a list of the companies' names and website URLs in the spreadsheet under the competitor research tab.

To start we're just going to collect the data and enter it into the spreadsheet. We will revisit this data shortly.

Outside of the basics, I always find it's good to see who else is out there. Since organic and local rankings are more closely tied together than ever, it's a good idea to use third-party tools to get some insight as to what else your website could be considered related to.

This can help provide hidden opportunities outside of the normal competition you likely look at most frequently.

Method 2: Use SEMRUSH.com

SEMrush is a pretty neat competitive analysis tool. While it is a paid program, you do get a few free lookups a day. It's limited, but it will show you 10 competitors based on keyword ranking data, and it's useful for recording paid competition as well.

To use the tool, visit www.SEMrush.com, enter your website in the provided search box, and hit search. Once the page loads, simply scroll down to the area that says "main competitors." If you click the "view full report" option, you'll be taken to a page with 10 competitor URLs.

Put these URLs into the spreadsheet so we can track them later.

Method 3: Use SPYFU.com

This is a cool tool that will show your top 5 competitors in paid and organic search. Just like SEMrush, it's a paid tool that's easy to use. On the home page, you will see a box where you can enter your URL. Once you hit search, a list of 5 websites will populate for free.

Enter these competitors into your spreadsheet for tracking.

Method 4: Use Crunchbase.com

This website is a goldmine of data if you're trying to learn about a startup. In addition to the basic information we're looking for, you can also find out things like how much money they've raised, staff members, past employee history, and so much more.

Crunchbase also works pretty similarly to the prior tools: you just enter your website URL and hit the search button. Once the page loads, you can scroll down to the competitors section for some data.

While Crunchbase is cool, it's not too useful for smaller companies, as it doesn't seem to have much data outside of the startup world.

Method 5: Check out Compete.com

This tool seems to have limited data for smaller websites but it's worth a shot. It can also be a little bit more high-level than I prefer, but you should still check it out.

To use the tool visit www.compete.com and enter the URL you want to examine in the box provided then hit search.

Click the "Find more sites like" box to get list of three related sites. Enter these in the provided spreadsheet.

Method 6: Use SimilarWeb.com

SimilarWeb provides a cool tool with a bunch of data to check out websites. After entering your information, you can scroll down to the similar sites section which will show websites it believes to be related.

The good news about SimilarWeb is that it seems to have data no matter how big or small your site is.


2. After you know who they are, mine their data

Now that we have a list of competitors, we can really do a deep dive to see who is ranking and what factors might be contributing to their success. To start, make sure to pick your top competitors from the spreadsheet and then look for and record the information below about each business on the Competitor Analysis tab.

You will want to pull this information from their Google My Business page.

If you know the company's name, it's pretty easy to find them just by searching the brand. You can add the geographic location if it's a multi-location business.

For example if I was searching for a Wendy's in Parker, Colorado, I could simply search this: "Wendy's Parker, CO" and it will pull up the location(s).

Make sure to take and record the following information from their local listings. Get the data from their Google My Business (Google + Page) and record it in the spreadsheet!

  1. Business name - Copy and paste the whole business name. Sometimes businesses keyword stuff a name or have a geographic modifier. It's important to account for this.
  2. Address - The full address of the business location. Although we can't do anything about its physical location, we will search using this information shortly.
  3. City, state, zip code - The city, state, and zip listed on the Google My Business listing.
  4. Phone number - Take the listing's primary number.
  5. Phone number 2 - Take the listing's secondary number, such as an 800 number.
  6. Landing page URL - The one connected to their Google My Business listing.
    PRO TIP: The URL will display as the root domain, but click the link to see if it takes you to an internal landing page. This is essential!
  7. Number of categories - Does your listing have more or fewer categories than theirs?
  8. Categories in Google My Business
    You can find the categories by clicking on the main category of the listing. It will pop out a list of all of the categories the business is listed under. If you only see one after doing this, open the page's HTML via View Source and use Ctrl+F to search for "gcid" (without the quotes). Looking through the HTML will show you the categories they're listed under; a small script like the sketch just after this list can speed this up.
  9. Does the profile appear to be 100% complete?
  10. How many reviews do they have?
  11. Is their business name visible in Google Street View? Obviously there is not much we can do about this, but it's interesting especially considering some patents Bill Slawski was recently talking about.

Record this information on the spreadsheet.
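If you'd rather not dig through View Source by hand for every competitor, a short script can pull the category codes for you. Below is a minimal Python sketch; it assumes the raw page HTML exposes categories as "gcid:" tokens (adjust the pattern to whatever you actually see in the source), and the URL is hypothetical.

import re
import requests

def find_gcid_categories(page_url):
    # Fetch a Google My Business page and pull out its gcid category codes,
    # which appear in the raw HTML as tokens like "gcid:dental_clinic"
    html = requests.get(page_url, timeout=10).text
    return sorted(set(re.findall(r"gcid:[a-z_]+", html)))

# Hypothetical URL -- point this at the competitor's actual Google+ page
print(find_gcid_categories("https://plus.google.com/+SomeCompetitor/about"))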

What can we do with this data?

Since you've already optimized your own listing for best practices, we want to see whether any particular trends seem to be working better in a certain area. We can then create a hypothesis and test it to see if any gains or losses are made. While we can't isolate individual factors, we can get some insight into what's working as we test changes.

In my experience, examining trends is much easier when the data is side by side. You can easily pick out data that stands out from the rest.

3. Have a close(r) look at their landing pages

You already know the ins and outs of your landing page. Now let's look at each competitor's landing page individually. Let's look at the factors that carry the most weight and see if anything sticks out.

Record the following information into the spreadsheet and compare your company side by side with the successful ones. (If you'd rather script these checks, there's a rough sketch after this list.)

  • Page title of landing page
  • City present? - Is the city present in the landing page meta title?
  • State present? - Is the state present in the landing page meta title?
  • Major KW in title? - Is there a major keyword in the landing page meta title?
  • Content length on landing page - Possibly minor, but worth examining. Copy/paste into MS Word.
  • H1 present? - Is the H1 tag present?
  • City in H1? - Does the H1 contain the city name?
  • State in H1? - Does the H1 have the state or abbreviation in the heading?
  • Keyword in H1? - Do they use a keyword in the H1?
  • Local business schema present? - Are they using schema? Find out using the Google structured data testing tool.
  • Embedded map present? - Are they embedding a Google map?
  • GPS coordinates present? - Are they using GPS coordinates via schema or text?
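Here's the rough Python sketch mentioned above, using requests and BeautifulSoup. The URL, city, state, and keyword values are placeholders, and the schema and map checks are crude string matches, so treat it as a starting point rather than a definitive audit.

import requests
from bs4 import BeautifulSoup

def audit_landing_page(url, city, state, keyword):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = (soup.title.string or "") if soup.title else ""
    h1 = soup.h1.get_text(" ", strip=True) if soup.h1 else ""
    html = str(soup).lower()
    return {
        "city_in_title": city.lower() in title.lower(),
        "state_in_title": state.lower() in title.lower(),
        "kw_in_title": keyword.lower() in title.lower(),
        "h1_present": soup.h1 is not None,
        "city_in_h1": city.lower() in h1.lower(),
        "state_in_h1": state.lower() in h1.lower(),
        "kw_in_h1": keyword.lower() in h1.lower(),
        # Crude string checks; verify with Google's structured data testing tool
        "schema_present": "schema.org/localbusiness" in html,
        "map_embedded": "google.com/maps" in html,
        "word_count": len(soup.get_text(" ", strip=True).split()),
    }

print(audit_landing_page("http://example.com/denver-office", "Denver", "CO", "dentist"))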


4. Off-site: See what Google thinks is authoritative

Recently, I was having a conversation with a client who was super-excited about the efforts his staff was making. He proudly proclaimed that his office was building 10 new citations a day and added over 500 within the past couple of months!

His excitement freaked me out. As I suspected, when I asked to see his list, I saw a bunch of low-quality directory sites that were passing little or no value. One way I could tell they were not really helping (besides the fact that some were NSFW websites) was that the citations or listings were not even indexed in Google.

I think it's a reasonable assumption that you should test to see what Google knows about your business. Whatever Google returns for your brand, it's serving those results because it considers them the most relevant or authoritative.

So how can we see what Google sees?

It's actually pretty simple: just do a Google search. One of the ways I evaluate whether a citation website is authoritative enough is to take the competition's NAP and Google it. While you've probably done this many times before for citation earning, you can prioritize your efforts based on what recurs across top-ranked competitor websites.

As you can see in the example below where I did a quick search for a competitor's dental office (by pasting his NAP in the search bar), I see that Google is associating this particular brand with websites like:

  1. The company's main website
  2. Whitepages
  3. Amazon Local (New)
  4. Rateadentist.com
  5. DentalNeighbor.com

Pro Tip: Amazon Local is relatively new, but you can see that it's going to carry a citation benefit in local search. If your clients are willing, you should sign up for this.

Don't want to copy and paste the NAP in a variety of formats? Use Andrew Shotland's NAP Hunter to get your competitor's variants. This tool will open multiple tabs in your browser and search for combinations of your competitor's NAP listings. It makes the job easy, and it's kind of fun.
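For the curious, here's roughly what a NAP-permutation search does under the hood: a small Python sketch that builds the query combinations and prints a Google search URL for each. The business name, phone formats, and address formats are invented for illustration.

from urllib.parse import quote_plus

name = '"Example Dental Office"'
phones = ['"(303) 555-0100"', '"303-555-0100"', '"3035550100"']
addresses = ['"123 Main St"', '"123 Main Street"']

# Pair the name with each phone and address variant, then phones with addresses
queries = [name + " " + phone for phone in phones]
queries += [name + " " + address for address in addresses]
queries += [phone + " " + address for phone in phones for address in addresses]

for q in queries:
    print("https://www.google.com/search?q=" + quote_plus(q))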

5. Check important citations

With citations, I'm generally in the camp of quality over quantity. That being said, if you're just getting the same citations that everyone else has, that doesn't really set you apart, does it? I like to tell clients that the top citation sources are a must, but it's good to seek out new opportunities and monitor what your competition does so you can keep up and stay ahead of the game.

You need to check the top citations and see where you're listed vs. your competition. Tools like Whitespark's local citation finder make it much easier to get a quick snapshot.

If you're looking to see which citations you should find and check, use these two resources below:

Just like in the example in the section above, you can find powerful hidden gems and also new website opportunities that arise from time to time.

Just because you did it once doesn't mean you should leave it alone

A common mistake I see is businesses thinking it's OK to just turn things off when they get to the top. That's a bad idea. If you're serious about online marketing, you know that someone is always out to get you. So in addition to tracking your brand mentions through Fresh Web Explorer, you also need to be tracking your competition at least once a month! The good news is that you can do this easily with Fresh Web Explorer from Moz.

So what should you set up in Fresh Web Explorer?

  • Your competitor's brand name - Monitor their mentions and see what type of marketing they're doing!
  • Your competitor's NAP - Easily find new citations they're going after
  • City+Industry+Keywords - Maybe there are some hidden gems outside of your competition you could go after!

Plus, track anything else you can think of related to your brand. This will make the ongoing efforts a bit easier.

6. Figure out which citations have dofollow links

Did you know some citation sources have dofollow links, which means they pass link juice to your website? While these by themselves likely won't pass a lot of juice, that's an extra incentive to be proactive about recording and promoting these listings.

When reviewing my competition's citations and links, I use a simple Chrome plugin called NoFollow, which highlights nofollow links on pages. It makes it super easy to see what's a followed vs. a nofollowed link.

But what's the benefit of this? Let's say I have a link on a city website that's both a followed link and a citation. If it's an authoritative page that speaks highly of my business, it makes sense for me to link to it from time to time. By earning links on websites other than your own and linking from them to these high-quality citation pages, you pass link juice through to your page. It's a pretty simple way of increasing the authority of your local landing pages.
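If you'd rather script this check than eyeball it with the plugin, here's a minimal Python sketch that sorts a page's outbound links into followed vs. nofollowed. The URL is hypothetical.

import requests
from bs4 import BeautifulSoup

def classify_links(url):
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    followed, nofollowed = [], []
    for a in soup.find_all("a", href=True):
        # BeautifulSoup returns rel as a list of values (or None)
        rels = [r.lower() for r in (a.get("rel") or [])]
        (nofollowed if "nofollow" in rels else followed).append(a["href"])
    return followed, nofollowed

followed, nofollowed = classify_links("http://example-citation-site.com/listing")
print(len(followed), "followed /", len(nofollowed), "nofollowed")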

7. Links, links, links

Since the Pigeon update almost a year ago, links have made a bigger impact in local search. You have to be earning links, and specifically high-quality links, to your website and especially to your Google My Business landing page.


If the factors show you're on the same playing field as your competition except in domain authority or page authority, you know your primary focus needs to be links.

Now here is where the research gets interesting. Remember the data sources we pulled from earlier, like Compete, SpyFu, etc.? We're now going to get a bigger picture of the link profile because we did that extra work. Instead of only looking at the links our competition in the pack has, we've branched out for more ideas, which will potentially pay off big in the long run.

What to do now

Now we want to take every domain we collected when we started and run Open Site Explorer on each and every one. Once we have these lists of links, we can sort them out and go after the high-quality ones that we don't already have.


Typically, when I'm doing this research I will export everything into Excel or Google Docs, combine it into one spreadsheet, and then sort from highest authority to least authority. This way you can prioritize your road map and focus on the bigger fish first. (A short script, sketched below, can do the combining and sorting for you.)

Keep in mind that citations usually have links and some links have citations. If they have a lot of authority you should make sure you add both.
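Here's that combine-and-sort step as a small pandas sketch. It assumes you've saved one Open Site Explorer CSV export per competitor into a folder, and the column names ("URL", "Domain Authority") are assumptions; match them to whatever your exports actually contain.

import glob
import pandas as pd

frames = [pd.read_csv(path) for path in glob.glob("ose_exports/*.csv")]
links = pd.concat(frames, ignore_index=True)

# De-duplicate, then rank prospects from highest authority to least
links = links.drop_duplicates(subset=["URL"])
links = links.sort_values("Domain Authority", ascending=False)

links.to_csv("link_prospects.csv", index=False)
print(links.head(20))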

8. But what about user behavior?

If you feel like you've gone above and beyond your competition and yet you're not seeing the gains you want, there is more you have to look at. As an SEO, it's easy to get stuck in a paradigm of looking only at the technical or link side of things. But what about user behavior?


It's no secret that user behavior matters, and even some recent tests are showing promising data. If your users visit your site and then click back to the search results, it indicates that they didn't find what they were looking for. Through our own experiments, we have seen listings in the SERPs jump a few positions in hours just based on user behavior.

So what does this mean for you?

You need to make sure your pages answer users' queries as they land on your page, preferably above the fold. For example, if I'm looking for a haircut place and I land on your page, I might want to know the hours, pricing, or directions to your store. Making that information prominent is essential.

Make sure that if you're going to make these changes, you test them. Come up with a hypothesis, test it, and come to a conclusion or another test based on the data. If you want to know more about your users, find out as much about them as humanly possible. Some services you can use for that are:

1. Inspectlet - Record user sessions and watch how they navigate your website. This awesome tool literally allows you to watch recorded user sessions. Check out their site.

2. LinkedIn Tracking Script - Although I admit it's a bit creepy, did you know that you can see the actual visitors to your website if they're logged into LinkedIn while browsing it? You sure can. To do this, complete the following steps:

1. Sign up for a LinkedIn Premium Account
2. Enter this code into the body of your website pages:

<img src="https://www.linkedin.com/profile/view?authToken=zRgB&authType=name&id=XXXXX" />


3. Replace the XXXXX with your profile's account number. You can get this by logging into your profile page and taking the number that appears after viewid?= in the URL.
4. Wait for the visitors to start showing up under "who's viewed your profile"

3. Google Analytics - Watch user behavior and gain insights as to what visitors were doing on your website.

Reviews

Speaking of user behavior, is your listing the only one without reviews? Does it have fewer or less favorable reviews? All of these are negative signals for user experience. Do your competitors have more positive reviews? If so, you need to work on getting more.


Meta descriptions

While this post was mainly geared towards local SEO as in Google My Business rankings, you have to consider that there are a lot of localized search queries that do not generate pack results. In these cases they're just standard organic listings.

If you've been deterred from writing these because Google sometimes picks its own meta descriptions, or because they carry no direct ranking benefit, you need to check yourself before you wreck yourself. Seriously. Customers decide which listing to click on based on this information. If you're not optimizing these for user intent on the corresponding page, then you're just being lazy. Spend the time, increase CTR, and increase your rankings if you're serving great content.

Conclusion

The key to success here is realizing that this is a marathon, not a sprint. If you examine the competition in the top areas mentioned above and create a plan to overcome them, you will win long term. This, of course, assumes you're staying above board and not doing anything shady.

While there were many more things I could add to this article, I believe that if you put your focus on what's mentioned here you'll have the greatest success. Since I didn't talk too much about geo-tagged media in this article, I also included some other items to check in the spreadsheet under the competitor analysis tab.

Remember to actively monitor what those around you are doing and develop a proactive plan to be successful for your clients.

What's the most creative thing you have seen a competitor do successfully in local search? I would love to hear about it in the comments below.



Tuesday, June 23, 2015

Your Daily SEO Fix: Week 5

Posted by Trevor-Klein

We've arrived, folks! This is the last installment of our short (< 2-minute) video tutorials that help you all get the most out of Moz's tools. If you haven't been following along, these are each designed to solve a use case that we regularly hear about from Moz community members.

Here's a quick recap of the previous round-ups in case you missed them:

  • Week 1: Reclaim links using Open Site Explorer, build links using Fresh Web Explorer, and find the best time to tweet using Followerwonk.
  • Week 2: Analyze SERPs using new MozBar features, boost your rankings through on-page optimization, check your anchor text using Open Site Explorer, do keyword research with OSE and the keyword difficulty tool, and discover keyword opportunities in Moz Analytics.
  • Week 3: Compare link metrics in Open Site Explorer, find tweet topics with Followerwonk, create custom reports in Moz Analytics, use Spam Score to identify high-risk links, and get link building opportunities delivered to your inbox.
  • Week 4: Use Fresh Web Explorer to build links, analyze rank progress for a given keyword, use the MozBar to analyze your competitors' site markup, use the Top Pages report to find content ideas, and find on-site errors with Crawl Test.

We've got five new fixes for you in this edition:

  • How to Use the Full SERP Report
  • How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer
  • How to Build Your Link Profile with Link Intersect
  • How to Find Local Citations Using the MozBar
  • Bloopers: How to Screw Up While Filming a Daily SEO Fix

Hope you enjoy them!


Fix 1: How to Use the Full SERP Report

Moz's Full SERP Report is a detailed report that shows the top ten ranking URLs for a specific keyword and presents the potential ranking signals in an easy-to-view format. In this Daily SEO Fix, Meredith breaks down the report so you can see all the sections and how each is used.


Fix 2: How to Find Fresh Links and Manage Your Brand Online Using Open Site Explorer

The Just-Discovered Links report in Open Site Explorer helps you discover recently created links within an hour of them being published. In this fix, Nick shows you how to use the report to view who is linking to you, how they're doing it, and what they are saying, so you can capitalize on link opportunities while they're still fresh and join the conversation about your brand.


Fix 3: How to Build Your Link Profile with Link Intersect

The quantity and (more importantly) quality of backlinks to your website make up your link profile, one of the most important elements in SEO and an incredibly important factor in search engine rankings. In this Daily SEO Fix, Tori shows you how to use Moz's Link Intersect tool to analyze the competition's backlinks. Plus, learn how to find opportunities to build links and strengthen your own link profile.


Fix 4: How to Find Local Citations Using the MozBar

Citations are mentions of your business name and address on webpages other than your own, such as an online yellow pages directory or a local business association page. They are a key component in search engine ranking algorithms, so building consistent and accurate citations for your local business(es) is a key local SEO tactic. In today's Daily SEO Fix, Tori shows you how to use the MozBar to find local citations around the web.


Bloopers: How to Screw Up While Filming a Daily SEO Fix

We had a lot of fun filming this series, and there were plenty of laughs along the way. Like these ones. =)


Looking for more?

We've got more videos in the previous four weeks' round-ups!

Your Daily SEO Fix: Week 1

Your Daily SEO Fix: Week 2

Your Daily SEO Fix: Week 3

Your Daily SEO Fix: Week 4


Don't have a Pro subscription? No problem. Everything we cover in these Daily SEO Fix videos is available with a free 30-day trial.



The Alleged $7.5 Billion Fraud in Online Advertising

Posted by SamuelScott

"This is the biggest advertising story of the decade, and it's being buried."

So wrote Ad Contrarian Bob Hoffman, the retired CEO and chairman of Hoffman/Lewis Advertising, in June 2013 on a $7.5 billion scandal that has been developing under the digital radar in the advertising world for the past few years. The three main allegations, according to those who are making them:

  1. Half or more of the paid online display advertisements that ad networks, media buyers, and ad agencies have knowingly been selling to clients over the years have never appeared in front of live human beings.
  2. Agencies have been receiving kickbacks and indirect payments from ad networks under the guise of "volume discounts" for serving as the middlemen between the networks and the clients who were knowingly sold the fraudulent ad impressions.
  3. Ad networks knowingly sell bot traffic to publishers and publishers knowingly buy the bot traffic because the resulting ad impressions earn both of them money—at the expense of the clients who are paying for the impressions.

These charges have not seen much discussion within the online marketing community. But the allegations have the potential to affect everyone involved in online advertising—ad agencies, in-house departments, agency and in-house digital marketers, online publishers, media buyers, and ad networks. An entire industry—billions of dollars and thousands of jobs—is at stake.

And it all starts with a single impression.

The impression that you make


Online advertising is based on an "impression"—without the impression, an advertisement cannot be viewed or clicked or provoke any other engagement. The Interactive Advertising Bureau, which was founded in 1996 and "recommends standards and practices and fields critical research on interactive advertising," defines "impression" in this manner:

a measurement of responses from an ad delivery system to an ad request from the user's browser

In other words, an "impression" occurs whenever one machine (an ad network) answers a request from another machine (a browser). (For reference, you can see my definition and example of a "request" in a prior Moz essay on log analytics and technical SEO.) Just in case it's not obvious: human beings and human eyeballs have nothing to do with it. If your advertising data states that a display ad campaign had 500,000 impressions, that means the ad network served a browser 500,000 times—and nothing more. Digital marketers may tell their bosses and clients that "impression" is jargon for one person seeing an advertisement one time, but that statement is not accurate.

The impression that you don't make


Just because a server answers a browser request for an advertisement does not mean that the person using the browser will see it. According to Reid Tatoris at MediaPost, there are three things that get in the way:

  • Broken Ads—This is a server not loading an ad or loading the wrong one by mistake. Tatoris writes that these mistakes occur roughly 15% of the time.
  • Bot Traffic—Whenever hackers write automated programs to visit websites and post spam or create fake accounts, each visit is a pageview that results in an ad impression. According to a December 2013 report in The Atlantic, 60% of Internet traffic consists of bots.
  • Alleged Fraud—In Tatoris' words, "People will hide ads behind other ads, spoof their domain to trick ad networks into serving higher-paying ads on their site, and purposefully send bots to a site to drive up impressions." Noam Schwartz described in TechCrunch two additional methods of alleged fraud: compressing ads into tiny one-by-one pixels that are impossible to see, and using malware to send people to websites they never planned to visit and thereby generate ad impressions. AdWeek found in October 2013 that 25% of online ad impressions are allegedly fraudulent.

Tatoris crunches all the numbers:

We start with the notion that only 15% of impressions ever have the possibility to be seen by a real person. Then, factor in that 54% of ads are not viewable (and we already discussed how flawed that metric is), and you're left with only 8% of impressions that have the opportunity to be seen by a real person. Let me clarify: That does not mean that 8% of impressions are seen. That means only 8% have the chance to be seen. That's an unbelievable amount of waste in an industry where metrics are a major selling point.

Essentially: If you have an online display ad budget of $100,000, then only $8,000 of that ad spend has the chance to put advertisements in front of human eyeballs. (And that's not even taking into account the poor clickthrough rates of display ads when people do see them.)

If you are paying $0.10 per impression, then the $10,000 that you will pay for 100,000 impressions will result in only 8,000 human views—meaning that the effective CPI will actually be $1.25.
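That arithmetic is worth running against your own numbers. Here it is as a quick Python sanity check (the 8% viewability share is Tatoris' estimate from above):

budget = 10000.00         # display ad spend
cpi = 0.10                # cost per served impression
viewable_share = 0.08     # share with a chance to be seen, per Tatoris

impressions = budget / cpi                   # 100,000 served impressions
human_views = impressions * viewable_share   # 8,000 possible human views
effective_cpi = budget / human_views         # $1.25 per possibly-seen impression

print("Effective CPI: $%.2f" % effective_cpi)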

How bot traffic affects online ads


In a Digiday interview with Jack Marshall, a reformed fake-web-traffic buyer explains how the scheme allegedly operates. Here are just three excerpts:

How and why were you buying non-human traffic?
We were spending anywhere from $10,000 to $35,000 a day on traffic. My conversations with [these ad networks] were similar: They would let me decide how much I was willing to pay for traffic, and when I told them $0.002 or below, they made it clear they had little control over the quality of traffic they would send at that price. Quality didn't really matter to us, though. As a website running an arbitrage model, all that mattered was profit, and for every $0.002 visit we were buying, we were making between $0.0025 and $0.004 selling display ads through networks and exchanges. The biggest determinate of which traffic partner we were spending the most money with was pageviews per visit. Since we were paying a fixed cost per visit, more pageviews equaled more ad impressions. Almost none of these companies were based in the U.S. While our contacts were in the US and had American names and accents, most of the time we found ourselves sending payment to a non-US bank.

In other words, the publisher would allegedly pay an ad network $0.0020 for a visit from a bot, and the resulting ad impression would garner $0.0025 to $0.0040 in revenue—a 25% to 100% return on the traffic cost for doing nothing! It's no wonder that so many websites around the world may allegedly be involved in this practice.
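The publisher-side math, sketched in Python with the rates quoted in the interview (simplified to one ad impression per purchased visit):

cost_per_visit = 0.002

for revenue_per_visit in (0.0025, 0.004):
    profit = revenue_per_visit - cost_per_visit
    print("buy at $%.4f, sell at $%.4f: %.0f%% return" % (
        cost_per_visit, revenue_per_visit, 100 * profit / cost_per_visit))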

Do you think publishers know when they're buying fake traffic?
Publishers know. They might say "we had no idea" and blame it on their traffic acquisition vendor, but that's bullshit, and they know it. If you're buying visits for less than a penny, there's no way you don't understand what's going on. Any publisher that's smart enough understand an arbitrage opportunity is smart enough to understand that if it was a legitimate strategy that the opportunity would eventually disappear as more buyers crowded in. What we were doing was 100 percent intentional. Some articles revolving around bot traffic paint publishers as rubes who were duped into buying bad traffic by shady bot owners. Rather, I believe publishers are willing to do anything to make their economics work.

Do networks, exchanges and other ad tech companies do anything to stop this from happening?
We worked with a major supply-side platform partner that was just wink wink, nudge nudge about it. They asked us to explain why almost all of our traffic came from one operating system and the majority had all the same user-agent string. There was nothing I could really say to answer that question. It was their way of letting us know that they understood what was going on. It wasn't just our account rep, either. It was people at the highest levels in the company. Part of me wished they'd said "You are in violation of our TOS and you have to stop running our tags." I would have been happy with that. But they didn't; they were willing to take the money.

If these stories are true, then ad networks do not care that the impressions come from bot traffic, and publishers do not care that they are getting bot traffic, because they are both making money. Who gets hurt? The companies advertising their products and services.

The worst part of it all


It's not only that online display ads are alleged to be amazingly useless and that many publishers and ad networks are allegedly involved in sleazy deals. A March 2015 investigative report in Ad Age found the following:

Kickback payments tied to U.S. media-agency deals are real and on the rise, according to Ad Age interviews with more than a dozen current and former media-agency executives, marketers' auditors, media sellers and ad-tech vendors who said they'd either participated in such arrangements or had seen evidence of them. The murky practice—sometimes disguised as (undisclosed) "rebates" or bills for bogus services—is being motivated by shrinking agency fees and fueled by an increasingly convoluted and global digital marketplace. "It's really ugly and crooked," said one ad-tech executive who described receiving such requests.

Some arrangements go like this: A large media shop, poised to spend $1 million with that ad-tech executive's firm to buy digital ads last year, asked for $200,000 to be routed back to the agency's corporate sibling in Europe. The $200,000 would pay for a presentation or presentations by the sibling's consultants. But these types of presentations aren't worth a fraction of the price tag, according to numerous executives dealing with the same issue, who spoke on condition of anonymity for fear of losing business.

Essentially, here is what is allegedly happening:

  • Clients give money to agencies to purchase online display advertising
  • The agencies give the money to the ad networks
  • The ad networks give a portion of the money back to the agencies
  • The clients' display ads are only 8% viewable
  • The 92% non-viewable impressions still earn money for publishers and ad networks

I think we can see who the loser is—everyone is making money except for the clients.

During the same month as the Ad Age report, former Mediacom CEO Jon Mandel reportedly told the Association of National Advertisers Media Leadership Conference that widespread "media agency rebates and kickbacks" were the reason that he left the agency business.

Heads in the digital sand


I have yet to hear about this issue being addressed in any talk, panel, or session at a digital marketing, martech, or adtech conference. Prior to today, I have seen only one article each in two major publications in the online marketing industry. (Mozzers, please correct me if I am mistaken and have missed something major on this topic.)

Why is no one talking about this?

No marketing agency wants clients to know that 92% of its display advertising spend is wasted. No advertising manager wants the CMO to know that only 8% of the company's ads are reaching people while it pays full price. No CMO wants the CEO to know that 92% of the entire ad budget is being flushed down the digital toilet.

I myself would probably not have been permitted to write this article when I held various agency positions in the past, because I managed clients' online advertising, and some of the agencies' PR and digital marketing clients were advertising networks themselves.

(Today, I am the director of marcom for Logz.io, a log analytics startup, and I have the luxury of being accountable only for the results of my in-house work—and I do not plan to use online advertising anytime soon. Still, I was a journalist in my first career years ago, and I wanted to write this report because I think everyone in my beloved industry should know about this explosive issue.)

Hoffman, the retired ad agency CEO who I quoted at the beginning, puts it better than I can:

How does an agency answer a client who asks, "You mean more than half the money you were supposed to be custodian of was embezzled from me and you knew nothing about it?" How does an ad network answer, "You mean all those clicks and eyeballs you promised me never existed, and you knew nothing about it?" How does a CMO answer his management when they ask, "You mean these people screwed us out of hundreds of thousands (millions?) of dollars in banner ads and you had no idea what you were buying?"

Everyone is in jeopardy and everyone is in "protect" mode. Everyone wants to maintain deniability. Nobody wants to know too much. If display advertising were to suffer the disgrace it deserves, imagine the fallout. Imagine the damage to Facebook, which at last report gets over 80% of its revenue from display. Imagine the damage to online publishers whose bogus, inflated numbers probably constitute their margin of profit.

If the comScore findings are correct and projectable, it means that of the 14 billion dollars spent on display advertising last year in America, 7.5 billion was worthless and constituted some degree of fraud or misrepresentation.

But clients, CMOs, and CEOs are going to read one of these articles one day and start asking uncomfortable questions. I would suggest that Mozzers—as well as all digital marketers and advertisers—start thinking about responses now.

Responses to the scandal


Google, to its credit, has disclosed that 56% of its digital ad impressions are never actually seen—of course, the report was also released with the announcement of a new ad-viewability product.

Ginny Marvin summarizes at Marketing Land:

Google's viewability measurement tool, Active View, is integrated into both the Google Display Network and DoubleClick. Advertisers can monitor viewability rates and buy ads on a viewable impression basis rather than by served impressions.

Google also announced an update to DoubleClick Verification last week, which includes viewability monitoring, ad blocking, a content ratings system and spam filtering capabilities.

The goals of the Media Rating Council (MRC), an industry organization founded in the United States in the 1960s following congressional hearings into the media industry, are:

  • To secure for the media industry and related users audience measurement services that are valid, reliable and effective
  • To evolve and determine minimum disclosure and ethical criteria for media audience measurement services
  • To provide and administer an audit system designed to inform users as to whether such audience measurements are conducted in conformance with the criteria and procedures developed

The MRC has certified "viewable impressions" as a legitimate metric (as opposed to "served impressions"). The Interactive Advertising Bureau (IAB), mentioned earlier, issued guidelines in December that online advertising networks should aim for at least 70% viewability.

Facebook, for its part, announced in February 2015:

We are working with the MRC and a consortium of advertisers and agencies to develop more robust standards for viewable impressions. Our goal is to work with the MRC, our partners, and industry leaders around the world to help apply further standards for feed-based websites like Facebook, mobile media and new ad formats.

The American Association of Advertising Agencies, Association of National Advertisers, and IAB announced last year that they would create a new organization, the Trustworthy Accountability Group, to fight problems in the online advertising market and do the following:

  • Eliminate fraudulent traffic
  • Combat malware
  • Fight Internet piracy
  • Promote greater transparency

TAG now consists of representatives from Mondelez International, JCPenney, Omnicom, Motorola, Google, Facebook, AOL, and Brightroll.

Canada's latest anti-spam legislation aims to fight Internet malware and bots—but a big stumbling block is that most of the problem comes from outside the country.

Will these corporate and organizational responses be enough? For the following reasons and more, it's impossible to know:

  • Industry guidelines depend on voluntary compliance. Industry recommendations do not have the force of law—any business that thinks it can still make a lot of money by ignoring the guidelines will likely continue to do so.
  • Possible penalties for past behavior. Regardless of what reforms may occur in the future, should those who knowingly engaged in such alleged fraud and deception in the past be held criminally or civilly liable? (I'm not a lawyer, so I cannot comment on that.)
  • IAB's 70% viewability goal. Should advertisers accept this metric as simply the nature of the medium? One estimate of the total display ad market amounted to $14 billion. If the 70% viewability goal can even be reached, should and will advertisers accept that $4.2 billion of their collective ad spend will still be lost before their advertisements are even viewed by human beings?

I have no answer—only time, I suppose, will tell.

But others are coming up with their own answers—those large corporations that are spending billions of dollars a year on online display advertising. As Lara O'Reilly wrote in May 2015 at Business Insider, $25 billion in ad spend is now under review in what Adweek is calling "Mediapalooza 2015." O'Reilly gives one possible reason:

Media reviews let brands reassess their ad spending, often by offering those contracts out in a competitive bidding process. The companies include General Mills, Procter & Gamble, Volkswagen, Visa, Sony, Coca-Cola, Citi, 21st Century Fox ... the list goes on. Some of these — P&G, Sony, and 21st Century Fox — spend more than $1 billion on advertising each year...

It could be that marketers are finally getting fed up with the apparent lack of transparency about where their budgets are actually being spent and why.

What marketers can do


Regardless of what the future will hold, here are my recommendations on how digital advertisers can respond:

  • Stop doing cost-per-impression (CPI or CPM) campaigns. Traditional digital advertising strategy recommends that people use CPM campaigns for brand awareness, cost-per-click (CPC) campaigns for traffic, and cost-per-action (CPA) campaigns for sales and conversions. In light of this scandal, I see no good reason to do CPM at all anymore.
  • Revise advertising KPIs and metrics in human terms. Earlier in this article, I calculated the following change to a hypothetical CPI value: "If you are paying $0.10 per impression, then the $10,000 that you will pay for 100,000 impressions will result in only 8,000 human views—meaning that the effective CPI rate will actually be $1.25." In addition, half of all clicks in CPC campaigns might also be bots. As a result, a $2 CPC result may actually be $4 when reaching human beings is taken into account. Ad campaign analysts may want to take alleged bot and fraudulent activity into account when calculating ROI and whether display advertising is worthwhile.
  • Demand full disclosure. Clients should ask agencies and media buyers if they are getting paid directly or indirectly by ad networks. Agencies and media buyers should ask ad networks how they are combating bot activity and any fraudulent behavior. Ad networks should not turn a digital blind eye to publishers who intentionally use bots to profit off of advertisers. If anyone gives vague answers or otherwise disparages such questions, that is a red flag. Advertisers should demand and receive full, verifiable information in light of what has allegedly been occurring.
  • Block certain countries from campaigns. According to a report in Ad Week, China, Venezuela, Ukraine, and Singapore have "suspicious traffic" rates of between 86% and 92%. (The rate in the United States is 43%.)
  • Use ad-fraud detection platforms. Companies such as Forensiq, SimilarWeb, Spider.io (which was bought by Google), Telemetry, and White Ops compare visit patterns with industry-benchmark behavior, check for malicious software and proxy unmasking, verify devices, and detect any manipulation.
  • Run manual campaigns as much as possible. The only way to reduce wasted impressions significantly is to research and implement digital ad campaigns manually rather than use programmatic ad buying. Digital advertisers should research the websites on which they want to run advertisements to see if they are legitimate—potentially even running ads on only the largest, well-known sites, but doing so continuously. In short, it might be best to focus your ad campaigns on quality viewers rather than trying to maximize the quantity of viewers by also including lesser-known sites.

Beyond the current responses of the ad industry and my present recommendations for marketers, I do not know what will happen. My goal here is simply to explain to digital marketers what has allegedly been occurring. What the future will hold—well, that's up to us marketers and advertisers.




Saturday, June 20, 2015

How to Estimate the Total Volume and Value of Keywords in a Given Market or Niche - Whiteboard Friday

Posted by randfish

To get a sense for the potential value of keywords in a certain niche, we need to do more than just look at the number of searches those keywords get each month. In today's Whiteboard Friday, Rand explains what else we should be looking at, and how we can use other data to prioritize some groups over others.


For reference, here's a still of this week's whiteboard. Click on it to open a high resolution image in a new tab!

Video transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week I want to chat about how you can estimate the total volume and value of a large set of keywords in a market or a niche.

Look, we're going to try and simplify this and reduce it to something that is actually manageable, because you can go way, way deep down a well. You could spend a year trying to figure out whether Market A or Market B is better to enter or better to chase keywords in, better to create content in. But I want to try and make it a little simple without reducing it to something that is of no value whatsoever, which unfortunately can be how some marketers have looked at this in the past.

Asian noodle keywords

So let's try this thought exercise. Let's say I'm a recipe site or a food site and I'm thinking I want to get into the Asian noodles scene. There's a lot of awesome Asian noodles out there. I, in fact, had Chow fun for lunch from Trove on Capitol Hill. When you come to MozCon, you have to try them. It's awesome.

So maybe I'm looking at Chow fun and sort of all the keyword sets around those, that Chinese noodle world. Maybe I'm looking at pad Thai, a very popular Thai noodle, particularly in the U.S., and maybe Vietnamese rice noodles or bun. I'm trying to figure out which of these is the one that I should target. Should I start creating a lot of pad Thai recipes, a lot of Chow fun recipes? Should I go research one or the other of these? Am I going to chase the mid and long tail keywords?

I'm about to invest a large amount of effort and really build up a brand around this. Which one of these should I do?

Side note: this is getting more and more important as Google moves to these topic modeling and site-specific topic authority models. So if Google starts to identify my site as being an authority on Chow fun, I can expect to rank for all sorts of awesome stuff around it, versus if I just kind of dive in and out and have one-offs of Chow fun and 50 different other kinds of noodles. So this gets really important.

The wrong way to look at AdWords data

A massively oversimplified approach, one that a lot of people have taken in the past, is to look broadly at AdWords groups, the ones that AdWords selects for you, or at individual keywords, and say, "Oh, okay. Well, Chow fun gets 22,000 searches a month, pad Thai gets 165,000, and rice noodles, which is the most popular version of that query -- it could also be called Vietnamese noodles or bun noodles or something like that -- gets 27,000. So there you go: one, two, three."

This is dead wrong. It's totally oversimplified. It's not taking into account all the things we have to do to really understand the market.

First off, this isn't going to include all the variations, the mid and long tail keywords. So potentially there might be a ton of variations of rice noodles that actually add up to as much or more than pad Thai. Same thing with Chow fun. In fact, when I looked, it looked like there's a ton of Chow fun modifications and different kinds of things that go in there. The Pad Thai list is a little short. It's like chicken, vegetable, shrimp, and beef. Pretty simplistic.

There's also no analysis of the competition going on here. Pad Thai, yeah, it's popular, but it also has 50 recipe sites all bidding for it, tons of online grocers bidding for it, tons of recipe books bidding on it. Meanwhile, it could be that Chow fun has almost no competition whatsoever. You're really not considering that when you look in here.

Finally, and this can be important too, these numbers can be off by up to 200% in either direction. So if you were to actually bid on Chow fun, you might see that you get somewhere in the 22,000 impressions per month, assuming your ad consistently shows up on page one, but you could see as little as 11,000. I've seen as much as 44,000 -- huge variations and swings in either direction, and the estimates aren't always consistent. You want them to be, but they're not always.

A better process

So because of that, we have to go deeper. These problems mean that we have to expend a little more energy. Not a ton. It doesn't have to be massive, but probably a week or two of work at least to try and figure this out. But it's so important I think it's worth it every time.

1) Keyword research dive

First off, we're going to conduct a broad keyword research dive into each one of these. Not as much as we would do if we knew, hey, Chow fun is the one we're going to target. We're going to go deep. We're going to find every possible keyword. We're going to do kind of what I call a broad dive, not a deep dive into each market. So I might want to go, hey, I'm going to look at the AdWords suggestions and tally those up. I'm going to look at search suggest and related searches for some of the queries that I get from AdWords, some of the top ones anyway, and I'm going to do a brief competitive analysis. Maybe I'll put the domains that I'm seeing most frequently around these specific topics into SEMrush or another tool like that -- SpyFu, Key Compete or whatever your preference might be -- and see what other terms and phrases they might be ranking on.

So now I've got a reasonable set. It probably didn't take me more than a few hours to put that together, if that. If I've got an efficient process for this already, maybe even less.

2) Bid on sample keyword sets

Now comes the tricky part. I want you to take a small sample set, and we've done this a few times. "Random" might not be the right word -- it's a small, considered set of keywords -- and bid on them through AdWords. When I say "considered," what I mean is a few from the long tail, a few from the chunky middle, and a few from the head of the demand curve that are getting lots and lots of searches. Now I want to point each of those to some new, high-quality pages on your site as a test.

So I might make maybe one, two, or three different landing pages for each of these different sets. One of them might be around noodles. One might be around recipes. One might be around history or uses in cuisine or whatever it is.

Then I am going to know from that exercise three critically important things. I'm going to know accuracy of AdWords volume estimates, which is awesome. Now I know whether these numbers mean anything or not, how far off they were or not. I could probably run for between 10 and 15 days and get a really good sense for the accuracy of AdWords. If you're feeling like being very comprehensive, run for a full month, especially if you have the budget, because you can learn even more over time, and you'll rule out any inconsistencies due to a particular spike, like maybe The New York Times recipe section features Chow fun that week and suddenly there's a huge spike or whatever it is.

You can also learn relative price competition and click-through rate. This is awesome. This means I know it costs a lot more per visitor when I'm trying to compete on pad Thai. There are two really good things to know there. When a click costs more money, that usually means there are more advertisers willing to pay for that traffic.

If you're primarily on the organic side and you believe you can compete with the folks in the organic rankings, a very high bid price on AdWords is actually a good thing, because it means the traffic you'd be capturing for free is that much more valuable.

If you're on the other side of that, where you think, "Hey, look, we're not going to compete organically right now. We just don't have the domain authority to do it. It's going to take us a while," then a high price is a bad thing. You want that cheaper traffic so you can start to build up that brand through paid as you're growing the organic side. So it really depends on who you are and what situation you're in.

Then finally you can figure out some things around click-through rate as well, which is great to know. So you can build some true model estimates and then go into your board meeting or your client pitch or whatever it is and say, "Hey, here are the numbers."

Lastly, you're going to learn the difficulty of content creation -- how hard it was for you to create these kinds of things. Like, "Wow, when we write about chow fun, it's just easy. It just rolls off. Pad Thai, we have a really hard time creating unique value, because everything has been done in that world. We're just not as passionate about those noodles as we are about chow fun." Cool. Great, you know that.

Also, assuming your test includes this, which it doesn't always have to, you can look at engagement rate, browse rate, time on site, all those kinds of things, and you can look at search conversion as well. So let's say you have some action to complete on the page -- subscribe to our email newsletter, sign up to get updates when we send them out about this recipe, or create an account so you can sign in and save this recipe -- all that kind of stuff, or a direct ecommerce conversion. You can learn that through your bidding test.

3) Analyze groups based on relevant factors

Awesome. That's great. Now we really, really know something. Based on that, we can do a true analysis, an accurate analysis of the different groups based on:

  • Relative value
  • Difficulty
  • Opportunity
  • Growth rate
  • ROI

Growth rate might be an interpreted thing, but you can look at Google Trends to figure out over time whether a broad group of terms is getting more or less popular. You could use something like Mention.net or Fresh Web Explorer from Moz to look at mentions as well.
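If it helps to formalize the comparison, here's one illustrative way to roll those factors into a single score per keyword group -- a sketch in Python with invented scores and weights, not a standard formula:

# Invented 1-10 scores per keyword group, based on your bidding-test results.
groups = {
    "chow fun": {"value": 8, "difficulty": 3, "opportunity": 7,
                 "growth": 8, "roi": 9},
    "pad thai": {"value": 9, "difficulty": 8, "opportunity": 4,
                 "growth": 5, "roi": 4},
}

# Weights are a judgment call; difficulty counts against a group.
weights = {"value": 0.25, "difficulty": -0.20, "opportunity": 0.20,
           "growth": 0.15, "roi": 0.20}

for name, scores in groups.items():
    total = sum(weights[k] * scores[k] for k in weights)
    print(f"{name}: {total:.2f}")
# With these invented inputs, chow fun scores 5.80 and pad Thai 3.00,
# echoing the kind of call the speaker describes making below.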

Now, you can be happy here. I might have chosen chow fun because I looked and said, "Hey, you know what, it did not have the most volume overall, but it did have the lightest competition and the highest return on investment. We were great at creating the content. We were able to engage our visitors there. It had lots of mid- and long-tail terms. We think it's poised for big growth with the growth of Chinese noodles overall, and the fact that the American food scene hasn't really discovered chow fun the way it has Vietnamese noodles and pad Thai. So that is where we're placing our bet."

Great. Now you have a real analysis. You have numbers behind it. You have estimates you can make. This process, although a little heavy, is going to get you so much further than this kind of simplistic thinking.

All right, everyone, I look forward to hearing from you about how you've done analyses like these in the past, and we'll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Thursday, June 18, 2015

Study: 300 Google Sitelinks Search Boxes - Triggers and Trip-Ups Analyzed

Posted by Royh

The sitelinks search box (schema.org/SearchAction) is one of the most popular markups out there. According to SimilarTech, more than 650,000 sites have now implemented this markup, making it one of the most widely adopted of all schema types.

That said, we don't really know why Google sometimes shows the search box for branded queries on sites that have implemented the markup, and sometimes doesn't. While we don't know what Google's criteria are behind the search box algorithm, our data clearly shows a correlation between a website's traffic and the appearance of the search box.

[Image: example of a sitelinks search box in Google results]

source: "Sitelinks Search Box" on Google's developers site

What determines if Google displays your search box?

Using a SimilarTech "Websites using SearchAction Schema Entity" report, we compiled a list of websites implementing the above schema. We chose over 300 websites to sample, with varying traffic volumes. Then we researched each site and checked if Google was displaying a sitelinks search box when searching for the URL.

If we found a search box wasn't displayed, we looked at the website in question to see if there were technical issues (based on Google's setup instructions). Finally, we analyzed the results and produced the most common scenarios that would prevent Google from showing the sitelinks search box for a website.

Reasons why the sitelinks search box may not show (and what to do about it)

This list is ordered by frequency, from the most common to the least common reason that the Google sitelinks searchbox isn't displayed:

Reason No. 1: Traffic to the website is too low

[Chart: share of sites with a sitelinks searchbox, by monthly desktop visits]

As you can see in the chart, amongst the sites with SearchAction schema markup, there's a definite correlation between website traffic and the likelihood that the searchbox will appear in Google search results. Just a few sites (2.5%) with fewer than 100K monthly desktop visits had the searchbox displayed. By contrast, nearly three-quarters of the sites with more than 50M monthly desktop visits had the sitelinks searchbox.

All the websites we tested had implemented the SearchAction schema markup.

Here's what it means:

  • Monthly desktop visits – the number of average monthly desktop visits to the website according to SimilarWeb's analytics.
  • With "site:" search box – the number of websites that have the "site:" search box for their website:

[Screenshot: the "site:" search box]

  • With the custom search box – the number of websites that have the custom search box for their website:

[Screenshot: the custom search box]

The biggest difference between the custom search box and the "site:" search box: searches inside the custom search box redirect you to the search results page on the website itself, while searches in the "site:" search box lead you to a second search within Google.

Reason No. 2: Markup is not implemented on the site

This is fairly obvious, but it needs to be reiterated: The searchbox can only appear if the markup is implemented. There are two available schema formats you can use to implement the markup.

1. Using JSON-LD:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "WebSite",
  "url": "https://www.example.com/",
  "potentialAction": {
    "@type": "SearchAction",
    "target": "https://query.example.com/search?q={search_term_string}",
    "query-input": "required name=search_term_string"
  }
}
</script>

2. Using Microdata:

<div itemscope itemtype="http://schema.org/WebSite">
  <meta itemprop="url" content="https://www.example.com/"/>
  <form itemprop="potentialAction" itemscope itemtype="http://schema.org/SearchAction">
    <meta itemprop="target" content="https://query.example.com/search?q={search_term_string}"/>
    <input itemprop="query-input" type="text" name="search_term_string" required/>
    <input type="submit"/>
  </form>
</div>

Google recommends implementing the JSON-LD format; if you'd like to do that, you can find the instructions here.

Reason No. 3: The URL attribute is wrong

This occurs when the "URL" attribute's value doesn't match the canonical URL of the domain's homepage, or there are problems with the canonical tags of the main domain.

The most common problem is a mismatch between the URL value in the markup and the domain itself.

Here are some examples:

  • http:// instead of https:// or the opposite
  • With WWW or without

This can be tested by using Google's structured data testing tool and checking for problems with the URL value.
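If you'd like to automate this check across many pages, here's a minimal sketch of the idea in Python -- a hypothetical helper, not an official tool, with a regex standing in for a proper HTML parser; Google's testing tool remains the authoritative check:

import json
import re

def check_website_url(html, expected_canonical):
    """Rough check that the WebSite markup's "url" matches the canonical URL."""
    # Pull JSON-LD blocks out with a regex (a real implementation would
    # parse the HTML properly).
    blocks = re.findall(
        r'<script type="application/ld\+json">(.*?)</script>', html, re.S)
    for block in blocks:
        try:
            data = json.loads(block)
        except ValueError:
            continue  # skip malformed JSON-LD
        if isinstance(data, dict) and data.get("@type") == "WebSite":
            markup_url = data.get("url", "").rstrip("/")
            canonical = expected_canonical.rstrip("/")
            if markup_url == canonical:
                print("OK: markup URL matches the canonical.")
            else:
                print(f"Mismatch: markup says {markup_url!r}, "
                      f"canonical is {canonical!r}")

# An http:// value in the markup against an https:// canonical gets flagged:
sample = ('<script type="application/ld+json">{"@context": "http://schema.org",'
          ' "@type": "WebSite", "url": "http://www.example.com/"}</script>')
check_website_url(sample, "https://www.example.com/")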

Reason No. 4: Issues with the search results page

The "target" attribute in the markup should point to the search results page URL on the website, including a placeholder for the query input parameter name, wrapped by curly braces.
"target": "https://query.example.com/search?q={search_term_string}"
  • "target" attribute is not defined in the markup or defined incorrectly.
  • No search results page exists (404) or it's returning a server error (500)
  • The results page never yields results or the content is irrelevant to the search query input (this can be due to a technical problem in the engine of the search results page)
  • The field of the target in the markup isn't defined well.

Reason No. 5: The query input doesn't match

The value of the "query-input" name attribute doesn't match the string inside the curly braces in the "target" property. You need to make sure the value of "name" matches the placeholder, otherwise the searchbox won't work.

"query-input": "required name=search_term_string"

Reason No. 6: Using nositelinkssearchbox to disable the markup

<meta name="google" content="nositelinkssearchbox" />

Use this tag and, you guessed it, Google won't show the searchbox. But unless you're actively trying to disable the searchbox markup, this is likely one of the least common scenarios.

Now that we've covered all the reasons the sitelinks searchbox may not appear, here's what it means in a nutshell:

Beyond markup: Best practices for winning the box

First, there's a very strong correlation with site traffic. This is perhaps the main factor determining whether or not Google will show the search box, even if all technical issues are addressed and the schema is implemented correctly. Again, of the websites we sampled with more than 50M monthly desktop visits, 74% had a sitelinks searchbox. When we checked websites with just 25-100K monthly visits, however, only 1.4% had the searchbox working for their site.

Secondly, as you can see from the various reasons listed above, there's a slew of technical kinks that may result in Google not displaying the searchbox. Some of these have to do with improperly implemented schema. If you suspect a technical issue is to blame, go through all of the tech-related scenarios listed above to track down the bug, then use our troubleshooting tips to fix it.

As you can see, there are several factors that affect the searchbox appearance in Google's search results. But if you play your cards right and do your due diligence, getting those valuable searchboxes to appear is easier than you think.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!

Tuesday, June 16, 2015

Dissecting and Surviving Google's Local Snack Pack Results

Posted by MiriamEllis

Google's Snack Packs (a.k.a. Local Stacks) haven't gotten the best reception in the local business community. Many people feel these results serve Google itself more than the local businesses they feature. For this reason, it may be more important than ever to make sure your local search marketing is both accurate and thorough. In this post, we'll dive into why.

Let's take one of these results apart and discover how best to respond to their limitations.

I was never a fan of Google's Carousel. The vivid, image-oriented, horizontal display moved me out of my comfort zone after a decade or so of easy familiarity with the minimalist blue/grey/green palette of more traditional packs. A few orange stars here, a red teardrop marker there—like most Google users, I had been conditioned to see and understand these elements of pack results at a glance. The carousel felt like a shocking departure from the simplicity I consider to be a chief hallmark of Google's historic style. I wasn't sorry to bid it farewell … until I got my first look at the Snack Packs that have now become the standard results for hospitality and entertainment searches.

Bearing in mind that "Google's goal is to provide users with the most relevant results and a great user experience," please join me in dissecting the parts of a single Snack Pack entry to see if you think it's living up to Google's stated purpose.

[Image: the 10 elements of a Google Snack Pack result]

Key to the Snack Pack

Here we're going to see what each of the elements of this Snack Pack result for a search for "Tex Mex Restaurants" in Dallas signifies and directs us to. But before we do so, let's quickly note what isn't immediately accessible in this type of result:

  1. The phone number of the business
  2. The full address of the business
  3. A link to the website of the business

In other words, if we want to call the business right now, or understand exactly where it's located, or visit the website to see a menu or get the feel of the place, we're out of luck on our first try. Instead of instant gratification, we're going to have to start clicking around on the elements of the typical snack pack to see if Google will give us what we want. Here's what happens when we interact with these 10 elements:

Element 1

The business name is clear enough. We click on it, perhaps assuming that we'll be going to the website of the business, as we would in an organic result, or at least to a Google+ Local page which years of Google use will have acquainted us with. Instead we end up on a secondary interface that isn't a website and isn't a Google+ Local page. It's more like a large knowledge graph hanging in space above some organic results:

[Screenshot: element 1 -- the secondary "in-space" interface]

Personally, I find the hanging-in-space presentation of this secondary interface a bit odd, but at least we can now see the full address, phone number and a globe icon linking to the website. Likely, we've now found what we need, but I'm left asking why we had to click to get to this information. Traditional packs gave us instant, direct access to NAP+W—the core name, address phone number and website elements of any citation. By contrast, Snack Packs may make us feel that Google is holding out on us, making us click further into their own product before they'll deliver.

Elements 2 and 3

Clicking on the stars and reviews also takes us to the in-space interface. Fair enough. We probably didn't expect to see all of the Google reviews on our first try, but I do have to wonder why we don't reach them in one click on these elements. Instead, we have to go from the second interface to a third, by clicking on the "View All Google Reviews" link. It looks like this, and is again disconnected, sitting on a greyed-out background:

[Screenshot: elements 2 and 3 -- the Google reviews interface]

I have to ask, why doesn't the reviews link in the Snack Pack take us directly to all of the reviews right away? Presumably, I want to see all of the reviews if I'm clicking that link—not just three of them.

Elements 4 and 5

The price gauge and the word "Mexican" take us to the in-space interface. Fine enough. I confess, I'm not sure where that word "Mexican" comes from, and as a student of regional American cuisines, I'll state for the record that Tex-Mex food is not Mexican food. I was curious enough about this to go hunt up the Google+ Local page for Mia's Tex-Mex Restaurant.

You can't get to it from the Snack Pack, as we've seen, and Google has been making direct links to Google+ Local pages harder and harder to find, so here's a shameless plug for the Moz Check Listing tool. Look up the name and zip in Check Listing to get right to the Google+ Local page. No fussing with Google Maps, branded searches, etc.

Then, once you're there, take a tip from Darren Shaw and click on the category on the Google+ Local page to see what appears to be a full list of the categories a business has selected:

[Screenshot: elements 4 and 5 -- the category list on the Google+ Local page]

Okay, so now we've seen that the business did select "Mexican Restaurant" as a category, and perhaps if we visited, we'd find that they're serving traditional Chiles en Nogada alongside the Tex-Mex standard queso dip. If the word "Mexican" in the Snack Pack display is coming from the categories, it has been abbreviated and given precedence over the primary, exact match "Tex-Mex Restaurant" category. I know the word isn't coming from Zagat, where this restaurant is categorized as "Tex-Mex". I'm not 100% sure about the origin of this word being given such prominence in the Snack Pack, but I guess we can let it go at that.

Elements 6 and 7

I really do have a bone to pick with element 6—the partial street address. What good does this do anyone? Not only are we lacking a street number to tell us exactly where the restaurant is, but the fact that there is no city shown erodes our confidence that we are, indeed, being shown a result in Dallas. We'll have to click through to the in-space interface if we want Google to deliver the goods for us on this one.

Element 7, the sentiment snippet "longtime spot with famous brisket tacos" also takes us to the second interface. It's not a direct quote of the business description which reads, "Bustling, casual, longtime eatery (since 1981) popular for its brisket tacos & other Tex-Mex fare." It also doesn't seem to originate directly from a user review, and I don't see it described this way on Zagat. So, it appears to be a custom hybrid of sentiments Google and Zagat have created. I'm fine with this, but should it be more important to see random sentiment than a phone number in the Snack Pack? Which element do you feel is more deserving of pack real estate?

Elements 8 and 9

This is where I feel the average Google user may become somewhat confused, if they don't understand that Google acquired Zagat in 2011. Clicking on the prominent Zagat logo or the wording "Zagat—Dallas' best Tex-Mex restaurants", one might expect to go to Zagat. But, you guessed it, we're going ridin' on a freeway right back to the in-space interface, and we're not even taken to the portion of it that shows the Zagat data. We have to scroll down to get to this:

[Screenshot: elements 8 and 9 -- the Zagat section of the secondary interface]

So, now we're kind of intrigued. What does it mean that Zagat is voting this restaurant to be one of the best? We click that link, again likely assuming that we're going to Zagat. Instead, we get yet another interface. It looks like this, and Mia's Tex-Mex isn't even the first thing we see on it. It's down at numero cuatro in some sort of Google list that appears to be branded with Zagat's name:

[Screenshot: Google's Zagat-branded "best Tex-Mex restaurants" list]

Just for fun, let's click on Mia's and see where we go. Que cosa? We're back on the in-space interface yet again, and maybe feeling a bit like we're going in circles.

There are actually pages on Zagat for these things. Here's their page for the best Tex-Mex restaurants in Dallas, which I've noticed has a completely different ordering of the results. It's interesting that, instead of Google's Snack Pack or the secondary interface taking us directly to this page, we remain firmly locked within Google's own interfaces.

Element 10

As with most of the other elements, clicking the image takes us to the secondary interface (which appears to be different from the Google+ Local image gallery interface), and that then clicks through to a page like this one, which also feels a bit disconnected to me.

Unfortunately for this business, their primary image isn't doing their listing any favors, but I don't really have a problem with having to click a couple of times to get to an image gallery.

In sum, the initial interface of the Snack Pack may feel to users a bit like stubbing one's toe on a blunt object of questionable usefulness. I know that's the approximate sensation I have when I encounter this display.

Google may have turned off the Carousel for restaurants, but human users are still getting quite a merry-go-round ride trying to use and interpret the Snack Pack that has replaced it. As they bounce from one Google-owned interface to another instead of being given immediate NAP+W or taken directly to owner-managed websites or Google+ Local pages, or even directly to platforms like Zagat, users are given few signals about what connects all of these disparate elements together. To me, the experience is piecemeal, lacking in cohesive glue, and feels like a step backward from the clearer UX of the traditional local packs that Google has so long promoted. What do you think? In your opinion, does this search results display live up to Google's goals of usability and quality?

Snack Pack survival for local business owners and SEOs

However I may feel about Snack Packs, this I know: when I want to play with Google, it's always got to be by their rules. So how can businesses like hotels, restaurants, bakeries, venues, bars, clubs, amusement parks, caterers and their marketers survive Snack Pack treatment?

The answer is clear:

Given that your customers will be interacting within a series of Google interfaces, it is now more important than ever that your Google-related marketing be as flawless as possible.

Using that secondary in-space interface as our springboard, this means that you have to get all of the following correct:

In your Google My Business dashboard

  1. Business name
  2. Address
  3. Phone number
  4. Website
  5. Description
  6. Hours of operation
  7. Categories
  8. Images

Beyond your Google My Business dashboard

  1. You must be earning positive, Google-based reviews and keeping an eagle eye on any patterns of negative reviews that arise so that you can quickly remedy internal service problems and respond appropriately.
  2. If you're marketing a food service business, you should upload your menu to sites like UrbanSpoon and GrubHub. These are the sites from which I've seen Google pulling menus, but there could be other platforms I haven't noticed.
  3. Food-oriented businesses must also tackle the Zagat environment. Here are Google's detailed guidelines covering how to get Zagat rated, what's allowed and what isn't, editing listings, uploading menus, and lots more.
  4. Remember that Google draws data not just from places like Zagat but from all over the web. This means that your website, your structured citations, and unstructured mentions of your business must accurately, and hopefully positively, represent your business.
  5. There is no replacement for good service at your place of business, and excellent service may earn you additional perks like being added to "Best Of" lists by Google's Zagat, which then make it into the interfaces Google controls.
  6. Be prepared for change. If we've learned one thing in the local SEO industry, it's that Google makes both small and large changes on an on-going basis. We all went for a ride on the Carousel in 2013 and, with the exception of a few search categories, hopped off in 2014. Now we're gnawing on Snack Packs. Tomorrow, who knows? What has historically stood business owners in good stead amidst all of these search evolutions is adherence to guidelines and data accuracy on Google's products and around the web.

Your key takeaway: Be alert to developments but don't be dismayed—if you're getting your marketing right, chances are good that you'll survive any foreseeable local display change. That's good news for local business owners and their marketers alike!

Header images by Scott Bauer (United States Department of Agriculture) [Public domain] and Ricraider (Own work) [CC BY-SA 3.0], both via Wikimedia Commons.


Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don't have time to hunt down but want to read!