Moz the Monster: Anatomy of an (Averted) Brand Crisis

Posted by Dr-Pete

On the morning of Friday, November 10, we woke up to the news that John Lewis had launched an ad campaign called “Moz the Monster”. If you’re from the UK, John Lewis needs no introduction, but for our American audience, they’re a high-end retail chain that’s gained a reputation for a decade of amazing Christmas ads.

It’s estimated that John Lewis spent upwards of £7m on this campaign (roughly $9.4M). It quickly became clear that they had organized a multi-channel effort, including a #mozthemonster Twitter campaign.

From a consumer perspective, Moz was just a lovable blue monster. From the perspective of a company that has spent years building a brand, John Lewis was potentially going to rewrite what “Moz” meant to the broader world. From a search perspective, we were facing a rare possibility of competing for our own brand on Google results if this campaign went viral (and John Lewis has a solid history of viral campaigns).

Step #1: Don’t panic

At the speed of social media, it can be hard to stop and take a breath, but you have to remember that that speed cuts both ways. If you’re too quick to respond and make a mistake, that mistake travels at the same speed and can turn into a self-fulfilling prophecy, creating exactly the disaster you feared.

The first step is to get multiple perspectives quickly. I took to Slack in the morning (I’m two hours ahead of the Seattle team) to find out who was awake. Two of our UK team (Jo and Eli) were quick to respond, which had the added benefit of getting us the local perspective.

Collectively, we decided that, in the spirit of our TAGFEE philosophy, a friendly monster deserved a friendly response. Even if we chose to look at it purely from a pragmatic, tactical standpoint, John Lewis wasn’t a competitor, and going in metaphorical guns-blazing against a furry blue monster and the little boy he befriended could’ve been step one toward a reputation nightmare.

Step #2: Respond (carefully)

In some cases, you may choose not to respond, but in this case we felt that friendly engagement was our best approach. Since the Seattle team was finishing their first cup of coffee, I decided to test the waters with a tweet from my personal account:

I’ve got a smaller audience than the main Moz account, and a personal tweet sent while the West Coast was still getting in gear meant less exposure. The initial response was positive, and we even got a little bit of useful feedback, such as suggestions to monitor UK Google SERPs (see “Step #3”).

Our community team (thanks, Tyler!) quickly followed up with an official tweet:

While we didn’t get direct engagement from John Lewis, the general community response was positive. Roger Mozbot and Moz the Monster could live in peace, at least for now.

Step #3: Measure

There was a longer-term fear – would engagement with the Moz the Monster campaign alter Google SERPs for Moz-related keywords? Google has become an incredibly dynamic engine, and the meaning of any given phrase can rewrite itself based on how searchers engage with that phrase. I decided to track “moz” itself across both the US and UK.

In that first day of the official campaign launch, searches for “moz” were already showing news (“Top Stories”) results in the US and UK, with the text-only version in the US:

…and the richer Top Stories carousel in the UK:

The Guardian article that announced the campaign launch was also ranking organically, near the bottom of page one. So, even on day one, we were seeing some brand encroachment and knew we had to keep track of the situation on a daily basis.

Just two days later (November 12), Moz the Monster had captured four page-one organic results for “moz” in the UK (at the bottom of the page):

While it still wasn’t time to panic, John Lewis’ campaign was clearly having an impact on Google SERPs.
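Tracking a SERP day by day doesn’t need fancy tooling. Below is a minimal, hypothetical sketch (the dates and SERP contents are illustrative, not our actual tracking data) of recording page-one organic results by hand and computing the brand’s share of slots:

```python
# Hypothetical sketch (not Moz's actual tooling): record the page-one
# organic domains by hand each day, then compute what share of slots
# the brand still holds. Dates and domains below are illustrative.

def brand_share(serp, brand_domain):
    """Fraction of page-one organic results owned by brand_domain."""
    return sum(1 for domain in serp if domain == brand_domain) / len(serp)

# Hand-recorded page-one organic domains for "moz" (UK), illustrative only
daily_serps = {
    "2017-11-10": ["moz.com"] * 7 + ["theguardian.com"],
    "2017-11-12": ["moz.com"] * 4 + ["johnlewis.com", "youtube.com",
                                     "theguardian.com", "twitter.com"],
}

for day, serp in daily_serps.items():
    print(f"{day}: moz.com holds {brand_share(serp, 'moz.com'):.0%} of page one")
```

Even a spreadsheet version of this gives you a daily trend line to react to, instead of a vague sense that “things look worse today.”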

Step #4: Surprises

On November 13, it looked like the SERPs might be returning to normal. The Moz Blog had regained the Top Stories block in both US and UK results:

We weren’t in the clear yet, though. A couple of days later, a plagiarism scandal broke, and it was dominating the UK news for “moz” by November 18:

This story also migrated into organic SERPs after The Guardian published an op-ed piece. Fortunately for John Lewis, the follow-up story didn’t last very long. It’s an important reminder, though, that you can’t take your eyes off of the ball just because it seems to be rolling in the right direction.

Step #5: Results

It’s one thing to see changes in the SERPs, but how was all of this impacting search trends and our actual traffic? Here’s the data from Google Trends for a 4-week period around the Moz the Monster launch (2 weeks on either side):

The top graph is US trends data, and the bottom graph is UK. The large spike in the middle of the UK graph is November 10, where you can see that interest in the search “moz” increased dramatically. However, this spike fell off fairly quickly and US interest was relatively unaffected.

Let’s look at the same time period for Google Search Console impression and click data. First, the US data (isolated to just the keyword “moz”):

There was almost no change in impressions or clicks in the US market. Now, the UK data:

Here, the launch spike in impressions is very clear, and closely mirrors the Google Trends data. However, clicks to Moz.com were, like the US market, unaffected. Hindsight is 20/20, and we were trying to make decisions on the fly, but the short-term shift in Google SERPs had very little impact on clicks to our site. People looking for Moz the Monster and people looking for Moz the search marketing tool are, not shockingly, two very different groups.

Ultimately, the impact of this campaign was short-lived, but it is interesting to see how quickly a SERP can rewrite itself based on the changing world, especially with an injection of ad dollars. At one point (in UK results), Moz the Monster had replaced Moz.com in over half (5 of 8) page-one organic spots and Top Stories – an impressive and somewhat alarming feat.

By December 2, Moz the Monster had completely disappeared from US and UK SERPs for the phrase “moz”. New, short-term signals can rewrite search results, but when those signals fade, results often return to normal. So, remember not to panic and track real, bottom-line results.

Your crisis plan

So, how can we generalize this to other brand crises? What happens when someone else’s campaign treads on your brand’s hard-fought territory? Let’s restate our 5-step process:

(1) Remember not to panic

The very word “crisis” almost demands panic, but remember that you can make any problem worse. I realize that’s not very comforting, but unless your office is actually on fire, there’s time to stop and assess the situation. Get multiple perspectives and make sure you’re not overreacting.

(2) Be cautiously proactive

Unless there’s a very good reason not to (such as a legal reason), it’s almost always best to be proactive and respond to the situation on your own terms. At least acknowledge the situation, preferably with a touch of humor. These brand intrusions are, by their nature, high profile, and if you pretend it’s not happening, you’ll just look clueless.

(3) Track the impact

As soon as possible, start collecting data. These situations move quickly, and search rankings can change overnight in 2017. Find out what impact the event is really having as quickly as possible, even if you have to track some of it by hand. Don’t wait for the perfect metrics or tracking tools.

(4) Don’t get complacent

Search results are volatile and social media is fickle – don’t assume that a lull or short-term change means you can stop and rest. Keep tracking, at least for a few days and preferably for a couple of weeks (depending on the severity of the crisis).

(5) Measure bottom-line results

As the days go by, you’ll be able to more clearly see the impact. Track as deeply as you can – long-term rankings, traffic, even sales/conversions where necessary. This is the data that tells you if the short-term impact in (3) is really doing damage or is just superficial.

The real John Lewis

Finally, I’d like to give a shout-out to someone who has felt a much longer-term impact of John Lewis’ successful holiday campaigns. Twitter user and computer science teacher @johnlewis has weathered his own brand crisis year after year with grace and humor:

So, a hat-tip to John Lewis, and, on behalf of Moz, a very happy holidays to Moz the Monster!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Keyword Research Beats Nate Silver’s 2016 Presidential Election Prediction

Posted by BritneyMuller

100% of statisticians would say this is a terrible method for predicting elections. However, in the case of 2016’s presidential election, analyzing the geographic search volume of a few telling keywords “predicted” the outcome more accurately than Nate Silver himself.

The 2016 US Presidential Election was a nail-biter, and many of us followed along with the famed statistician’s predictions in real time on FiveThirtyEight.com. Silver’s predictions, though more accurate than many, were still disrupted by the election results.

In an effort to better understand our country (and current political chaos), I dove into keyword research state-by-state searching for insights. Keywords can be powerful indicators of intent, thought, and behavior. What keyword searches might indicate a personal political opinion? Might there be a common denominator search among people with the same political beliefs?

It’s generally agreed that Fox News leans to the right and CNN leans to the left. And if we’ve learned anything this past year, it’s that the news you consume can have a strong impact on what you believe, in addition to the confirmation bias already present in seeking out particular sources of information.

My crazy idea: What if Republican states showed more “fox news” searches than “cnn”? What if those searches revealed a bias and an intent that exit polling seemed to obscure?

The limitations to this research were pretty obvious. Watching Fox News or CNN doesn’t necessarily correlate with voter behavior, but could it be a better indicator than the polls? My research says yes. I researched other media outlets as well, but the top two ideologically opposed news sources – in any of the 50 states – were consistently Fox News and CNN.

Using Google Keyword Planner (connected to a high-spend AdWords account to view the most accurate, non-bucketed data), I evaluated each state’s search volume for “fox news” and “cnn.”

Eight states showed the exact same search volumes for both. Excluding those from my initial test, my results accurately predicted 42/42 of the 2016 presidential state outcomes including North Carolina and Wisconsin (which Silver mis-predicted). Interestingly, “cnn” even mirrored Hillary Clinton, similarly winning the popular vote (25,633,333 vs. 23,675,000 average monthly search volume for the United States).

In contrast, Nate Silver accurately predicted 45/50 states using a statistical methodology based on polling results.

This gets even more interesting:

The eight states showing the same average monthly search volume for both “cnn” and “fox news” are Arizona, Florida, Michigan, Nevada, New Mexico, Ohio, Pennsylvania, and Texas.

However, I was able to dive deeper via GrepWords API (a keyword research tool that actually powers Keyword Explorer’s data), to discover that Arizona, Nevada, New Mexico, Pennsylvania, and Ohio each have slightly different “cnn” vs “fox news” search averages over the previous 12-month period. Those new search volume averages are:

State          “fox news” avg monthly search volume   “cnn” avg monthly search volume   KWR Prediction   2016 Vote
Arizona        566,333                                518,583                           Trump            Trump
Nevada         213,833                                214,583                           Hillary          Hillary
New Mexico     138,833                                142,916                           Hillary          Hillary
Ohio           845,833                                781,083                           Trump            Trump
Pennsylvania   1,030,500                              1,063,583                         Hillary          Trump

Four out of five isn’t bad! This brought my new prediction up to 46/47.
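The decision rule behind these predictions is simple enough to sketch. The volumes are the averages quoted above; the code itself is just an illustration of the method, not a tool from the post:

```python
# Sketch of the decision rule described above: whichever outlet's average
# monthly search volume is higher "predicts" the state's winner.
# Volumes are the five state averages quoted in the post.

STATE_VOLUMES = {
    # state: ("fox news" volume, "cnn" volume)
    "Arizona":      (566_333, 518_583),
    "Nevada":       (213_833, 214_583),
    "New Mexico":   (138_833, 142_916),
    "Ohio":         (845_833, 781_083),
    "Pennsylvania": (1_030_500, 1_063_583),
}

ACTUAL_2016 = {
    "Arizona": "Trump", "Nevada": "Hillary", "New Mexico": "Hillary",
    "Ohio": "Trump", "Pennsylvania": "Trump",
}

def predict(fox_volume, cnn_volume):
    """More "fox news" searches -> Trump; more "cnn" searches -> Hillary."""
    return "Trump" if fox_volume > cnn_volume else "Hillary"

correct = 0
for state, (fox, cnn) in STATE_VOLUMES.items():
    guess = predict(fox, cnn)
    correct += guess == ACTUAL_2016[state]
    print(f"{state}: predicted {guess}, actual {ACTUAL_2016[state]}")

print(f"{correct}/{len(STATE_VOLUMES)} correct")
```

Running it reproduces the four-out-of-five result, with Pennsylvania as the lone miss.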

Silver and I each got Pennsylvania wrong. The GrepWords API shows the average monthly search volume for “cnn” was ~33,083 searches higher than “fox news” (to put that in perspective, that’s ~0.26% of the state’s population). That razor-thin keyword gap is mirrored in Trump’s narrow 48.2% win against Clinton’s 47.5%.

Nate Silver and I have very different day jobs, and he wouldn’t make many of these hasty generalizations. Any prediction method can be right a couple times. However, it got me thinking about the power of keyword research: how it can reveal searcher intent, predict behavior, and sometimes even defy the logic of things like statistics.

It’s also easy to predict the past. What happens when we apply this model to today’s Senate race?

Can we apply this theory to Alabama’s special election in the US Senate?

After completing the above research on a whim, I realized that we’re on the cusp of yet another hotly contested, extremely close election: the upcoming Alabama senate race, between controversy-laden Republican Roy Moore and Democratic challenger Doug Jones, fighting for a Senate seat that hasn’t been held by a Democrat since 1992.

I researched each Alabama county – 67 in total – for good measure. There are obviously a ton of variables at play. However, my theory correctly “predicted” the 2016 presidential vote in 52 of the 67 counties (77.6%).

Even when giving the Democratic nominee extra weight in the very low-search-volume counties (19 counties showed a search volume difference of less than 500), my numbers lean pretty far to the right (48/67 Republican counties):

It should be noted that my theory incorrectly guessed two of the five largest Alabama counties, Montgomery and Jefferson, which both voted Democrat in 2016.

Greene and Macon Counties should both vote Democrat; their very slight “cnn” over “fox news” search volume is confirmed by their previous presidential election results.

I realize state elections are not won by county but by popular vote, and the state of Alabama searches for “fox news” 204,000 more times a month than “cnn” (to put that in perspective, that’s roughly 4.27% of Alabama’s population).

All things aside and regardless of outcome, this was an interesting exploration into how keyword research can offer us a glimpse into popular opinion, future behavior, and search intent. What do you think? Any other predictions we could make to test this theory? What other keywords or factors would you look at? Let us know in the comments.

What Do Google’s New, Longer Snippets Mean for SEO? – Whiteboard Friday

Posted by randfish

Snippets and meta descriptions have brand-new character limits, and it’s a big change for Google and SEOs alike. Learn about what’s new, when it changed, and what it all means for SEO in this edition of Whiteboard Friday.

What do Google's new, longer snippets mean for SEO?

Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re chatting about Google’s big change to the snippet length.

This is the display length of the snippet for any given result in the search results that Google provides. This is on both mobile and desktop. It sort of impacts the meta description, which is how many snippets are written. They’re taken from the meta description tag of the web page. Google essentially said just last week, “Hey, we have officially increased the length, the recommended length, and the display length of what we will show in the text snippet of standard organic results.”

So I’m illustrating that for you here. I did a search for “net neutrality bill,” something that’s on the minds of a lot of Americans right now. You can see here that this article from The Hill, which is a recent article – it was two days ago – has a much longer text snippet than what we would normally expect to find. In fact, I went ahead and counted this one and then showed it here.

So basically, the old 165-character limit is what you would have seen prior to the middle of November on most every search result. Occasionally Google would have a longer one for very specific kinds of search results, but according to data from SISTRIX, which put out a great report that I’ll link to here, more than 90% of search snippets were 165 characters or less prior to the middle of November. Then Google added basically a few more lines.

So now, on mobile and desktop, instead of an average of two or three lines, we’re talking three, four, five, sometimes even six lines of text. So this snippet here is 266 characters that Google is displaying. The next result, from Save the Internet, is 273 characters. Again, this might be because Google sort of realized, “Hey, we almost got all of this in here. Let’s just carry it through to the end rather than showing the ellipsis.” But you can see that 165 characters would cut off right here. This one actually does a good job of displaying things.

So imagine a searcher is querying for something in your field and they’re just looking for a basic understanding of what it is. So they’ve never heard of net neutrality. They’re not sure what it is. So they can read here, “Net neutrality is the basic principle that prohibits internet service providers like AT&T, Comcast, and Verizon from speeding up, slowing down, or blocking any . . .” And that’s where it would cut off. Or that’s where it would have cut off in November.

Now, if I got a snippet like that, I need to visit the site. I’ve got to click through in order to learn more. That doesn’t tell me enough to give me the data to go through. Now, Google has tackled this before with things, like a featured snippet, that sit at the top of the search results, that are a more expansive short answer. But in this case, I can get the rest of it because now, as of mid-November, Google has lengthened this. So now I can get, “Any content, applications, or websites you want to use. Net neutrality is the way that the Internet has always worked.”

Now, you might quibble and say this is not a full, thorough understanding of what net neutrality is, and I agree. But for a lot of searchers, this is good enough. They don’t need to click any more. This extension from 165 to 275 or 273, in this case, has really done the trick.

What changed?

So this can have a bunch of changes to SEO too. So the change that happened here is that Google updated basically two things. One, they updated the snippet length, and two, they updated their guidelines around it.

So Google’s had historic guidelines that said, well, you want to keep your meta description tag between about 160 and 180 characters. I think that was the number. They’ve updated that to where they say there’s no official meta description recommended length. But on Twitter, Danny Sullivan said that he would probably not make that greater than 320 characters. In fact, we and other data providers, that collect a lot of search results, didn’t find many that extended beyond 300. So I think that’s a reasonable thing.

When?

When did this happen? It was starting at about mid-November. November 22nd is when SISTRIX’s dataset starts to notice the increase, and it was over 50%. Now it’s sitting at about 51% of search results that have these longer snippets in at least 1 of the top 10 as of December 2nd.

Here’s the amazing thing, though – 51% of search results have at least one. Many of those, because they’re still pulling old meta descriptions or meta descriptions that SEO has optimized for the 165-character limit, are still very short. So if you’re the person in your search results, especially it’s holiday time right now, lots of ecommerce action, if you’re the person to go update your important pages right now, you might be able to get more real estate in the search results than any of your competitors in the SERPs because they’re not updating theirs.

How will this affect SEO?

So how is this going to really change SEO? Well, three things:

A. It changes how marketers should write and optimize the meta description.

We’re going to be writing a little bit differently because we have more space. We’re going to be trying to entice people to click, but we’re going to be very conscientious that we want to try and answer a lot of this in the search result itself, because if we can, there’s a good chance that Google will rank us higher, even if we’re actually sort of sacrificing clicks by helping the searcher get the answer they need in the search result.

B. It may impact click-through rate.

We’ll be looking at Jumpshot data over the next few months and year ahead. We think that there are two likely ways they could do it. Probably negatively, meaning fewer clicks on less complex queries. But conversely, possible it will get more clicks on some more complex queries, because people are more enticed by the longer description. Fingers crossed, that’s kind of what you want to do as a marketer.

C. It may lead to lower click-through rate further down in the search results.

If you think about the fact that, as of a month ago, two results now take up the real estate that three used to, well, maybe people won’t scroll as far down. Maybe the ones that are higher up will in fact draw more of the clicks, and thus being further down on page one will have less value than it used to.

What should SEOs do?

What are things that you should do right now? Number one, make a priority list – you should probably already have this – of your most important landing pages by search traffic, the ones that receive the most search traffic on your website, organic search. Then I would go and reoptimize those meta descriptions for the longer limits.

Now, you can judge as you will. My advice would be go to the SERPs that are sending you the most traffic, that you’re ranking for the most. Go check out the limits. They’re probably between about 250 and 300, and you can optimize somewhere in there.

The second thing I would do is if you have internal processes or your CMS has rules around how long you can make a meta description tag, you’re going to have to update those probably from the old limit of somewhere in the 160 to 180 range to the new 230 to 320 range. It doesn’t look like many are smaller than 230 now, at least limit-wise, and it doesn’t look like anything is particularly longer than 320. So somewhere in there is where you’re going to want to stay.
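A quick way to act on both of these steps is a simple audit of your existing meta descriptions against the old and new limits. This is just a sketch (the page data is invented, and the 230/320 bounds are the rough limits discussed above, not official numbers):

```python
# Audit sketch: flag meta descriptions still written for the old
# ~165-character display limit, and any longer than the ~320-character
# ceiling discussed above. Page data is illustrative.

OLD_LIMIT = 165
NEW_CEILING = 320

def audit(descriptions):
    """Bucket pages: too short for the new display, too long, or fine."""
    report = {"expand": [], "trim": [], "ok": []}
    for url, text in descriptions.items():
        n = len(text)
        if n <= OLD_LIMIT:
            report["expand"].append((url, n))   # room to say more
        elif n > NEW_CEILING:
            report["trim"].append((url, n))     # likely to be truncated
        else:
            report["ok"].append((url, n))
    return report

pages = {
    "/guide": "Net neutrality explained in plain language." + " More detail." * 5,
    "/tools": "x" * 290,
    "/blog":  "y" * 400,
}

report = audit(pages)
for bucket, items in report.items():
    for url, n in items:
        print(f"{bucket:>6}: {url} ({n} chars)")
```

Run it against your priority list first: the “expand” bucket is where you stand to gain SERP real estate your competitors aren’t using yet.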

Good luck with your new meta descriptions and with your new snippet optimization. We’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com

Don’t Be Fooled by Data: 4 Data Analysis Pitfalls & How to Avoid Them

Posted by Tom.Capper

Digital marketing is a proudly data-driven field. Yet, as SEOs especially, we often have such incomplete or questionable data to work with, that we end up jumping to the wrong conclusions in our attempts to substantiate our arguments or quantify our issues and opportunities.

In this post, I’m going to outline 4 data analysis pitfalls that are endemic in our industry, and how to avoid them.

1. Jumping to conclusions

Earlier this year, I conducted a ranking factor study around brand awareness, and I posted this caveat:

“…the fact that Domain Authority (or branded search volume, or anything else) is positively correlated with rankings could indicate that any or all of the following is likely:

  • Links cause sites to rank well
  • Ranking well causes sites to get links
  • Some third factor (e.g. reputation or age of site) causes sites to get both links and rankings”

    ~ Me

However, I want to go into this in a bit more depth and give you a framework for analyzing these yourself, because it still comes up a lot. Take, for example, this recent study by Stone Temple, which you may have seen in the Moz Top 10 or Rand’s tweets, or this excellent article discussing SEMRush’s recent direct traffic findings. To be absolutely clear, I’m not criticizing either of the studies, but I do want to draw attention to how we might interpret them.

Firstly, we do tend to suffer a little confirmation bias – we’re all too eager to call out the cliché “correlation vs. causation” distinction when we see successful sites that are keyword-stuffed, but all too approving when we see studies doing the same with something we think is or was effective, like links.

Secondly, we fail to critically analyze the potential mechanisms. The options aren’t just causation or coincidence.

Before you jump to a conclusion based on a correlation, you’re obliged to consider various possibilities:

  • Complete coincidence
  • Reverse causation
  • Joint causation
  • Linearity
  • Broad applicability

If those don’t make any sense, then that’s fair enough – they’re jargon. Let’s go through an example:

Before I warn you not to eat cheese because you may die in your bedsheets, I’m obliged to check that it isn’t any of the following:

  • Complete coincidence – Is it possible that so many datasets were compared, that some were bound to be similar? Why, that’s exactly what Tyler Vigen did! Yes, this is possible.
  • Reverse causation – Is it possible that we have this the wrong way around? For example, perhaps your relatives, in mourning for your bedsheet-related death, eat cheese in large quantities to comfort themselves? This seems pretty unlikely, so let’s give it a pass. No, this is very unlikely.
  • Joint causation – Is it possible that some third factor is behind both of these? Maybe increasing affluence makes you healthier (so you don’t die of things like malnutrition), and also causes you to eat more cheese? This seems very plausible. Yes, this is possible.
  • Linearity – Are we comparing two linear trends? A linear trend is a steady rate of growth or decline. Any two statistics which are both roughly linear over time will be very well correlated. In the graph above, both our statistics are trending linearly upwards. If the graph was drawn with different scales, they might look completely unrelated, like this, but because they both have a steady rate, they’d still be very well correlated. Yes, this looks likely.
  • Broad applicability – Is it possible that this relationship only exists in certain niche scenarios, or, at least, not in my niche scenario? Perhaps, for example, cheese does this to some people, and that’s been enough to create this correlation, because there are so few bedsheet-tangling fatalities otherwise? Yes, this seems possible.

So we have 4 “Yes” answers and one “No” answer from those 5 checks.

If your example doesn’t get 5 “No” answers from those 5 checks, it’s a fail, and you don’t get to say that the study has established either a ranking factor or a fatal side effect of cheese consumption.
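The linearity check in particular is worth seeing in numbers. In this sketch (the data is invented), two unrelated series that both trend steadily upward correlate almost perfectly, while their period-to-period changes don’t correlate at all:

```python
# Invented data: two unrelated series that both trend steadily upward.
# Their levels correlate almost perfectly; their year-to-year changes don't.

from math import sqrt
from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def diffs(series):
    """Period-to-period changes."""
    return [b - a for a, b in zip(series, series[1:])]

years = range(10)
cheese_kg = [14.0 + 0.3 * y + (0.05 if y % 2 else -0.05) for y in years]
deaths = [300 + 12 * y + (-3 if y % 3 == 0 else 3) for y in years]

print(f"levels r  = {pearson(cheese_kg, deaths):.3f}")   # near 1.0
print(f"changes r = {pearson(diffs(cheese_kg), diffs(deaths)):.3f}")
```

Correlating changes rather than levels is a cheap way to strip out a shared linear trend before you get excited about a relationship.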

A similar process should apply to case studies, which are another form of correlation – the correlation between you making a change, and something good (or bad!) happening. For example, ask:

  • Have I ruled out other factors (e.g. external demand, seasonality, competitors making mistakes)?
  • Did I increase traffic by doing the thing I tried to do, or did I accidentally improve some other factor at the same time?
  • Did this work because of the unique circumstance of the particular client/project?

This is particularly challenging for SEOs, because we rarely have data of this quality, but I’d suggest an additional pair of questions to help you navigate this minefield:

  • If I were Google, would I do this?
  • If I were Google, could I do this?

Direct traffic as a ranking factor passes the “could” test, but only barely – Google could use data from Chrome, Android, or ISPs, but it’d be sketchy. It doesn’t really pass the “would” test, though – it’d be far easier for Google to use branded search traffic, which would answer the same questions you might try to answer by comparing direct traffic levels (e.g. how popular is this website?).

2. Missing the context

If I told you that my traffic was up 20% week on week today, what would you say? Congratulations?

What if it was up 20% this time last year?

What if I told you it had been up 20% year on year, up until recently?

It’s funny how a little context can completely change this. This is another problem with case studies and their evil inverted twin, traffic drop analyses.

If we really want to understand whether to be surprised at something, positively or negatively, we need to compare it to our expectations, and then figure out what deviation from our expectations is “normal.” If this is starting to sound like statistics, that’s because it is statistics – indeed, I wrote about a statistical approach to measuring change way back in 2015.

If you want to be lazy, though, a good rule of thumb is to zoom out, and add in those previous years. And if someone shows you data that is suspiciously zoomed in, you might want to take it with a pinch of salt.
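One lightweight version of “compare to expectations” is to ask whether this week’s change is unusual given the spread of changes seen in previous years. A sketch, with invented numbers:

```python
# Sketch: is this week's change surprising, given the variation we've seen
# in past years for the same calendar week? All numbers are invented.

from statistics import mean, stdev

def is_surprising(past_changes, current_change, threshold=2.0):
    """Flag current_change if it sits more than `threshold` standard
    deviations from the mean of historical changes for this period."""
    mu, sigma = mean(past_changes), stdev(past_changes)
    z = (current_change - mu) / sigma
    return abs(z) > threshold, z

# Week-on-week % changes for this calendar week in previous years:
history = [0.18, 0.22, 0.19, 0.21]   # this week is *always* up ~20%

surprising, z = is_surprising(history, 0.20)
print(f"+20% this week: surprising={surprising} (z={z:.1f})")

surprising, z = is_surprising(history, 0.02)
print(f"+2% this week: surprising={surprising} (z={z:.1f})")
```

Note the inversion: against this history, being up 20% is exactly what we expected, while being merely flat would be the alarming result.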

3. Trusting our tools

Would you make a multi-million dollar business decision based on a number that your competitor could manipulate at will? Well, chances are you do, and the number can be found in Google Analytics. I’ve covered this extensively in other places, but there are some major problems with most analytics platforms around:

  • How easy they are to manipulate externally
  • How arbitrarily they group hits into sessions
  • How vulnerable they are to ad blockers
  • How they perform under sampling, and how obvious they make this

For example, did you know that the Google Analytics API v3 can heavily sample data whilst telling you that the data is unsampled, above a certain amount of traffic (~500,000 within date range)? Neither did I, until we ran into it whilst building Distilled ODN.

Similar problems exist with many “Search Analytics” tools. My colleague Sam Nemzer has written a bunch about this – did you know that most rank tracking platforms report completely different rankings? Or how about the fact that the keywords grouped by Google (and thus tools like SEMRush and STAT, too) are not equivalent, and don’t necessarily have the volumes quoted?

It’s important to understand the strengths and weaknesses of tools that we use, so that we can at least know when they’re directionally accurate (as in, their insights guide you in the right direction), even if not perfectly accurate. All I can really recommend here is that skilling up in SEO (or any other digital channel) necessarily means understanding the mechanics behind your measurement platforms – which is why all new starts at Distilled end up learning how to do analytics audits.

One of the most common solutions to the root problem is combining multiple data sources, but…

4. Combining data sources

There are numerous platforms out there that will “defeat (not provided)” by bringing together data from two or more of:

  • Analytics
  • Search Console
  • AdWords
  • Rank tracking

The problems here are that, firstly, these platforms do not have equivalent definitions, and secondly, ironically, (not provided) tends to break them.

Let’s deal with definitions first, with an example – let’s look at how a single landing page’s organic traffic is reported across platforms:

  • In Search Console, these are reported as clicks, and can be vulnerable to heavy, invisible sampling when multiple dimensions (e.g. keyword and page) or filters are combined.
  • In Google Analytics, these are reported as sessions, using last non-direct click attribution, meaning that your organic traffic includes a bunch of direct sessions, sessions that timed out and resumed, etc. That’s without getting into dark traffic, ad blockers, etc.
  • In AdWords, most reporting uses last AdWords click, and conversions may be defined differently. In addition, keyword volumes are bundled, as referenced above.
  • Rank tracking is location specific, and inconsistent, as referenced above.

Fine, though – it may not be precise, but you can at least get to some directionally useful data given these limitations. However, about that “(not provided)”…
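As a sketch of what “directionally useful” can look like, with entirely invented numbers: join the two sources on landing page and compare ratios rather than absolute values, flagging pages whose ratio diverges from their siblings’:

```python
# Sketch: joining Search Console clicks with GA organic sessions per
# landing page. The two metrics are defined differently, so we compare
# ratios across pages rather than trusting either absolute number.
# All figures below are invented for illustration.

search_console_clicks = {"/shoes": 1100, "/boots": 640, "/sandals": 90}
ga_organic_sessions = {"/shoes": 1010, "/boots": 820, "/sandals": 85}

def click_to_session_ratio(clicks, sessions):
    return {
        page: round(sessions[page] / clicks[page], 2)
        for page in clicks.keys() & sessions.keys()
    }

ratios = click_to_session_ratio(search_console_clicks, ga_organic_sessions)
# A page whose ratio sits far from the others (here, /boots) is worth
# investigating: redirects, dark traffic, or sampling may be at play.
print(ratios)
```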

Most of your landing pages get traffic from more than one keyword. It’s very likely that some of these keywords convert better than others, particularly if they are branded, meaning that even the most thorough click-through rate model isn’t going to help you. So how do you know which keywords are valuable?

The best answer is to generalize from AdWords data for those keywords, but it’s very unlikely that you have analytics data for all those combinations of keyword and landing page. Essentially, the tools that report on this make the very bold assumption that a given page converts identically for all keywords. Some are more transparent about this than others.
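To make that bold assumption concrete, here’s a minimal sketch (with invented numbers) of the implicit model such tools use: the page’s blended conversion rate is applied uniformly to every keyword’s clicks:

```python
# Sketch: the implicit "every keyword converts identically" model, made
# explicit. Numbers are invented; "acme shoes" stands in for a branded
# term that almost certainly converts better than this model assumes.

page_conversions = 40
keyword_clicks = {"running shoes": 800, "acme shoes": 150, "buy shoes": 50}

total_clicks = sum(keyword_clicks.values())
page_cvr = page_conversions / total_clicks  # one blended rate for the page

estimated = {kw: round(clicks * page_cvr, 1) for kw, clicks in keyword_clicks.items()}
print(estimated)  # every keyword gets the same blended rate, branded or not
```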

Again, this isn’t to say that those tools aren’t valuable – they just need to be understood carefully. The only way you could reliably fill in the blanks created by “(not provided)” would be to spend a ton on paid search to get decent volume, conversion rate, and bounce rate estimates for all your keywords, and even then, you’ve not fixed the inconsistent definitions issue.

Bonus peeve: Average rank

I still see this way too often. Three questions:

  1. Do you care more about losing rankings for ten very low-volume queries (10 searches a month or less) than for one high-volume query (millions plus)? If the answer isn’t “yes, I absolutely care more about the ten low-volume queries”, then this metric isn’t for you, and you should consider a visibility metric based on click-through rate estimates.
  2. When you start ranking at 100 for a keyword you didn’t rank for before, does this make you unhappy? If the answer isn’t “yes, I hate ranking for new keywords,” then this metric isn’t for you – because that will lower your average rank. You could of course treat all non-ranking keywords as position 100, as some tools allow, but is a drop of 2 average rank positions really the best way to express that 1/50 of your landing pages have been de-indexed? Again, use a visibility metric, please.
  3. Do you like comparing your performance with your competitors? If the answer isn’t “no, of course not,” then this metric isn’t for you – your competitors may have more or fewer branded keywords or long-tail rankings, and these will skew the comparison. Again, use a visibility metric.
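For what a volume- and CTR-weighted visibility metric might look like, here’s a minimal sketch; the CTR curve is illustrative, not an industry standard:

```python
# Sketch: a visibility score that weights each keyword's monthly search
# volume by an estimated click-through rate at its ranking position.
# The CTR curve below is illustrative only.

CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def visibility(rankings):
    """rankings: iterable of (monthly_volume, position) pairs.
    Positions off page one contribute zero, so a brand-new ranking at
    position 95 can never drag the score down."""
    return sum(volume * CTR_BY_POSITION.get(position, 0.0)
               for volume, position in rankings)

before = [(1_000_000, 3), (10, 1), (10, 1)]            # big term at #3
after = [(1_000_000, 8), (10, 1), (10, 1), (500, 95)]  # big term slips
print(visibility(before) > visibility(after))  # True: the slip dominates
```

Unlike average rank, the score moves with what actually matters: estimated clicks.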

Conclusion

Hopefully, you’ve found this useful. To summarize the main takeaways:

  • Critically analyse correlations & case studies by seeing if you can explain them as coincidences, as reverse causation, as joint causation, through reference to a third mutually relevant factor, or through niche applicability.
  • Don’t look at changes in traffic without looking at the context – what would you have forecasted for this period, and with what margin of error?
  • Remember that the tools we use have limitations, and do your research on how that impacts the numbers they show. “How has this number been produced?” is an important component in “What does this number mean?”
  • If you end up combining data from multiple tools, remember to work out the relationship between them – treat this information as directional rather than precise.

Let me know what data analysis fallacies bug you, in the comments below.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

Our Readership: Results of the 2017 Moz Blog Reader Survey

Posted by Trevor-Klein

This blog is for all of you. In a notoriously opaque and confusing industry that’s prone to frequent changes, we see immense benefit in helping all of you stay on top of the game. To that end, every couple of years we ask for a report card of sorts, hoping to get a sense not only for how your jobs have changed, but also for how we can improve.

About a month ago, we asked you all to take a reader survey, and nearly 600 of you generously gave your time. The results, summarized in this post, were immensely helpful, and were a reminder of how lucky we are to have such a thoughtful community of readers.

I’ve offered as much data as I can, and when possible, I’ve also trended responses against the same questions from our 2015 and 2013 surveys, so you can get a sense for how things have changed. There’s a lot here, so buckle up. =)


Who our readers are

To put all of this great feedback into context, it helps to know a bit about who the people in our audience actually are. Sure, we can glean a bit of information from our site analytics, and can make some educated guesses, but neither of those can answer the questions we’re most curious about. What’s your day-to-day work like, and how much SEO does it really involve? Would you consider yourself more of an SEO beginner, or more of an SEO wizard? And, most importantly, what challenges are you facing in your work these days? The answers give us a fuller understanding of where the rest of your feedback comes from.

What is your job title?

Readers of the Moz Blog have a multitude of backgrounds, from CEOs of agencies to in-the-weeds SEOs of all skill levels. One of the most common themes we see, though, is a skew toward the more general marketing industry. I know that word clouds have their faults, but it’s still a relatively interesting way to gauge how often things appear in a list like this, so here’s what we’ve got this year:

Of note, similar to our results in 2015, the word “marketing” is the most common result, followed by the word “SEO” and the word “manager.”

Here’s a look at the top 20 terms used in this year’s results, along with the percentage of responses containing each term. You’ll also see those same percentages from the 2015 and 2013 surveys to give you an idea of what’s changed — the darker the bar, the more recent the survey:

The thing that surprises me the most about this list is how little it’s changed in the four-plus years since we first asked the question (a theme you’ll see recur in the rest of these results). In fact, the top 20 terms this year are nearly identical to the top 20 terms four years ago, with only a few things sliding up or down a few spots.

What percentage of your day-to-day work involves SEO?

We hear a lot about people wearing multiple hats for their companies. One person who took this survey noted that even at a 9,000-person company, they were the only one who worked on SEO, and it was only about 80% of their job. That idea is backed up by this data, which shows an incredibly broad range of responses. More than 10% of respondents barely touch SEO, and not even 14% say they’re full-time:

One interesting thing to note is the sharp decline in the number of people who say that SEO isn’t a part of their day-to-day at all. That shift is likely a result of our shift back toward SEO, away from related areas like social media and content marketing. I think we had attracted a significant number of community managers and content specialists who didn’t work in SEO, and we’re now seeing the pendulum swing the other direction.

On a scale of 1-5, how advanced would you say your SEO knowledge is?

The similarity between this year’s graph for this question and those from 2015 and 2013 is simply astonishing:

There’s been a slight drop in folks who say they’re at an expert level, and a slight increase in folks who have some background, but are relative beginners. But only slight. The interesting thing is, our blog traffic has increased significantly over these four years, so the newer members of our audience bear a striking resemblance to those of you who’ve been around for quite some time. In a sense, that’s reassuring — it paints a clear picture for us as we continue refining our content.

Do you work in-house, or at an agency/consultancy?

Here’s another window into just how little our audience has changed in the last couple of years:

A slight majority of our readers still work in-house for their own companies, and about a third still work on SEO for their company’s clients.

Interestingly, though, respondents who work for clients deal with many of the same issues as those who work in-house — especially in trying to convey the value of their work in SEO. They’re just trying to send that message to external clients instead of internal stakeholders. More details on that come from our next question:

What are some of the biggest challenges you face in your work today?

I’m consistently amazed by the time and thought that so many of you put into answering this question, and rest assured, your feedback will be presented to several teams around Moz, both on the marketing and the product sides. For this question, I organized each and every response into recurring themes, tallying each time those themes were mentioned. Here are all the themes that were mentioned 10 or more times:

Challenge – # of mentions

  • My clients / colleagues / bosses don’t understand the value of SEO – 59
  • The industry and tactics are constantly changing; algo updates – 45
  • Time constraints – 44
  • Link building – 35
  • My clients / colleagues / bosses don’t understand how SEO works – 29
  • Content (strategy / creation / marketing) – 25
  • Resource constraints – 23
  • It’s difficult to prove ROI – 18
  • Budget constraints – 17
  • It’s a difficult industry in which to learn tools and techniques – 16
  • I regularly need to educate my colleagues / employees – 16
  • It’s difficult to prioritize my work – 16
  • My clients either don’t have or won’t offer sufficient budget / effort – 15
  • Effective reporting – 15
  • Bureaucracy, red tape, other company problems – 11
  • It’s difficult to compete with other companies – 11
  • I’m required to wear multiple hats – 11

More than anything else, it’s patently obvious that one of the greatest difficulties faced by any SEO is explaining it to other people in a way that demonstrates its value while setting appropriate expectations for results. Whether it’s your clients, your boss, or your peers that you’re trying to convince, it isn’t an easy case to make, especially when it’s so difficult to show what kind of return a company can see from an investment in SEO.

We also saw tons of frustrated responses about how the industry is constantly changing, and it takes too much of your already-constrained time just to stay on top of those changes.

In terms of tactics, link building easily tops the list of challenges. That makes sense, as it’s the piece of SEO that relies most heavily on the cooperation of other human beings (and humans are often tricky beings to figure out). =)

Content marketing — both the creation/copywriting side as well as the strategy side — is still a challenge for many folks in the industry, though fewer people mentioned it this year than in 2015, so I think we’re all starting to get used to how those skills overlap with the more traditional aspects of SEO.


How our readers read

With all that context in mind, we started to dig into your preferences in terms of formats, frequency, and subject matter on the blog.

How often do you read posts on the Moz Blog?

This is the one set of responses that caused a bit of concern. We’ve seen a steady decrease in the number of people who say they read every day, a slight decrease in the number of people who say they read multiple times each week, and a dramatic increase in the number of people who say they read once a week.

The 2015 decrease came after an expansion in the scope of subjects we covered on the blog — as we branched away from just SEO, we published more posts about social media, email, and other aspects of digital marketing. We knew that not all of those subjects were relevant for everyone, so we expected a dip in frequency of readership.

This year, though, we’ve attempted to refocus on SEO, and might have expected a bit of a rebound. That didn’t happen:

There are two other factors at play, here. For one thing, we no longer publish a post every single weekday. After our publishing volume experiment in 2015, we realized it was safe (even beneficial) to emphasize quality over quantity, so if we don’t feel like a post turned out the way we hoped, we don’t publish it until we’ve had a chance to improve it. That means we’re down to about four posts per week. We’ve also made a concerted effort to publish more posts about local SEO, as that’s relevant to our software and an increasingly important part of the work of folks in our industry.

It could also be a question of time — we’ve already covered how little time everyone in our industry has, and with that problem continuing, there may just be less time to read blog posts.

If anyone has any additional insight into why they read less often than they once did, please let us know in the comments below!

On which types of devices do you prefer to read blog posts?

We were surprised by the responses to this answer in 2013, and they’ve only gotten more extreme:

Nearly everyone prefers to read blog posts on a full computer. Only about 15% of folks add their phones into the equation, and the number of people in all the other buckets is extremely small. In 2013, our blog didn’t have a responsive design, and was quite difficult to read on mobile devices. We thought that might have had something to do with people’s responses — maybe they were just used to reading our blog on larger screens. The trend in 2015 and this year, though, proves that’s not the case. People just prefer reading posts on their computers, plain and simple.

Which other site(s), if any, do you regularly visit for information or education on SEO?

This was a new question for this year. We have our own favorite sites, of course, but we had no idea how the majority of folks would respond to this question. As it turns out, there was quite a broad range of responses listing sites that take very different approaches:

Site – # of responses

  • Search Engine Land – 184
  • Search Engine Journal – 89
  • Search Engine Roundtable – 74
  • SEMrush – 51
  • Ahrefs – 50
  • Search Engine Watch – 41
  • Quick Sprout / Neil Patel – 35
  • HubSpot – 33
  • Backlinko – 31
  • Google Blogs – 29
  • The SEM Post – 21
  • Kissmetrics – 17
  • Yoast – 16
  • Distilled – 13
  • SEO by the Sea – 13

I suppose it’s no surprise that the most prolific sites sit at the top. They’ve always got something new, even if the stories don’t often go into much depth. We’ve tended to steer our own posts toward longer-form, in-depth pieces, and I think it’s safe to say (based on these responses and some to questions below) that it’d be beneficial for us to include some shorter stories, too. In other words, depth shouldn’t necessarily be a requisite for a post to be published on the Moz Blog. We may start experimenting with a more “short and sweet” approach to some posts.


What our readers think of the blog

Here’s where we get into more specific feedback about the Moz Blog, including whether it’s relevant, how easy it is for you to consume, and more.

What percentage of the posts on the Moz Blog would you say are relevant to you and your work?

Overall, I’m pretty happy with the results here, as SEO is a broad enough industry (and we’ve got a broad enough audience) that there’s simply no way we’re going to hit the sweet spot for everyone with every post. But those numbers toward the bottom of the chart are low enough that I feel confident we’re doing pretty well in terms of topic relevance.

Do you feel the Moz Blog posts are generally too basic, too advanced, or about right?

Responses to this question have made me smile every time I see them. This is clearly one thing we’re getting about as right as we could expect to. We’re even seeing a slight balancing of the “too basic” and “too advanced” columns over time, which is great:

We also asked the people who told us that posts were “too basic” or “too advanced” to what extent they felt that way, using a scale from 1-5 (1 being “just a little bit too basic/advanced” and 5 being “way too basic/advanced”). The responses tell us that the people who feel posts are too advanced feel more strongly about that opinion than the people who feel posts are too basic:

This makes some sense, I think. If you’re just starting out in SEO, which many of our readers are, some of the posts on this blog are likely to go straight over your head. That could be frustrating. If you’re an SEO expert, though, you probably aren’t frustrated by posts you see as too basic for you — you just skip past them and move on with your day.

This does make me think, though, that we might benefit from offering a dedicated section of the site for folks who are just starting out — more than just the Beginner’s Guide. That’s actually something that was specifically requested by one respondent this year.

In general, what do you think about the length of Moz Blog posts?

While it definitely seems like we’re doing pretty well in this regard, I’d also say we’ve got some room to tighten things up a bit, especially in light of the lack of time so many of you mentioned:

There were quite a few comments specifically asking for “short and sweet” posts from time to time — offering up useful tips or news in a format that didn’t expound on details because it didn’t have to. I think sprinkling some of those types of posts in with the longer-form posts we have so often would be beneficial.

Do you ever comment on Moz Blog posts?

This was another new question this year. Even though so many sites are removing comment sections from their blogs, we’ve always believed in their value. Sometimes the discussions we see in comments end up being the most helpful part of the posts, and we value our community too much to keep that from happening. So, we were happy to see that a full quarter of respondents have participated in comments:

We also asked for a bit of info about why you either do or don’t comment on posts. The top reasons why you do were pretty predictable — to ask a clarifying question related to the post, or to offer up your own perspective on the topic at hand. The #3 reason was interesting — 18 people mentioned that they like to comment in order to thank the author for their hard work. This is a great sentiment, and as someone who’s published several posts on this blog, I can say for a fact that it does feel pretty great. At the same time, those comments are really only written for one person — the author — and are a bit problematic from our perspective, because they add noise around the more substantial conversations, which are what we like to see most.

I think the solution is going to lie in a new UI element that allows readers to note their appreciation to the authors without leaving one of the oft-maligned “Great post!” comments. There’s got to be a happy medium there, and I think it’s worth our finding it.

The reasons people gave for not commenting were even more interesting. A bunch of people mentioned the need to log in (sorry, folks — if we didn’t require that, we’d spend half our day removing spam!). The most common response, though, involved a lack of confidence. Whether it was worded along the lines of “I’m an introvert” or along the lines of “I just don’t have a lot of expertise,” there were quite a few people who worried about how their comments would be received.

I want to take this chance to encourage those of you who feel that way to take the step, and ask questions about points you find confusing. At the very least, I can guarantee you aren’t the only ones, and others like you will appreciate your initiative. One of the best ways to develop your expertise is to get comfortable asking questions. We all work in a really confusing industry, and the Moz Blog is all about providing a place to help each other out.

What, if anything, would you like to see different about the Moz Blog?

As usual, the responses to this question were chock full of great suggestions, and again, we so appreciate the amount of time you all spent providing really thoughtful feedback.

One pattern I saw was requests for more empirical data — hard evidence that things should be done a certain way, whether through case studies or other formats. Another pattern was requests for step-by-step walkthroughs. That makes a lot of sense for an industry of folks who are strapped for time: Make things as clear-cut as possible, and where we can, offer a linear path you can walk down instead of asking you to holistically understand the subject matter, then figure that out on your own. (That’s actually something we’re hoping to do with our entire Learning Center: Make it easier to figure out where to start, and where to continue after that, instead of putting everything into buckets and asking you all to figure it out.)

Whiteboard Friday remains a perennial favorite, and we were surprised to see more requests for more posts about our own tools than we had requests for fewer posts about our own tools. (We’ve been wary of that in the past, as we wanted to make sure we never crossed from “helpful” into “salesy,” something we’ll still focus on even if we do add another tool-based post here and there.)

We expected a bit of feedback about the format of the emails — we’re absolutely working on that! — but didn’t expect to see so many folks requesting that we bring back YouMoz. That’s something that’s been in the backs of our minds, and while it may not take the same form it did before, we do plan on finding new ways to encourage the community to contribute content, and hope to have something up and running early in 2018.

Request – # of responses

  • More case studies – 26
  • More Whiteboard Friday (or other videos) – 25
  • More long-form step-by-step training/guides – 18
  • Clearer steps to follow in posts; how-tos – 11
  • Bring back UGC / YouMoz – 9
  • More from Rand – 9
  • Improve formatting of the emails – 9
  • Higher-level, less-technical posts – 8
  • More authors – 7
  • More news (algorithm updates, e.g.) – 7
  • Shorter posts, “quick wins” – 7
  • Quizzes, polls, or other engagement opportunities – 6
  • Broader range of topics (engagement, CRO, etc.) – 6
  • More about Moz tools – 5
  • More data-driven, less opinion-based – 5


What our readers want to see

This section is a bit more future-facing, where some of what we asked before had to do with how things have been in the past.

Which of the following topics would you like to learn more about?

There were very, very few surprises in this list. Lots of interest in on-page SEO and link building, as well as other core tactical areas of SEO. Content, branding, and social media all took dips — that makes sense, given that we don’t usually post about those things anymore, and we’ve no doubt lost some audience members who were more interested in them as a result. Interestingly, mobile took a sizable dip, too. I’d be really curious to know why people think that is. My best guess is that with Google’s mobile-first indexing and responsive designs having become so commonplace, there isn’t as much of a need to think about mobile differently than there was a couple of years ago. Also of note: When we did this survey in 2015, Google had recently rolled out its “Mobile-Friendly Update,” not-so-affectionately referred to by many in the industry as Mobilegeddon. So… it was on our minds. =)

Which of the following types of posts would you most like to see on the Moz Blog?

This is a great echo and validation of what we took away from the more general question about what you’d like to see different about the Blog: More tactical posts and step-by-step walkthroughs. Posts that cut to the chase and offer a clear direction forward, as opposed to some of the types at the bottom of this list, which offer more opinions and cerebral explorations:


What happens next?

Now we go to work. =)

We’ll spend some time fully digesting this info, and coming up with new goals for 2018 aimed at making improvements inspired by your feedback. We’ll keep you all apprised as we start moving forward.

If you have any additional insight that strikes you in taking a look at these results, please do share it in the comments below — we’d love to have those discussions.

For now, we’ve got some initial takeaways that we’re already planning to take action on.

Primary takeaways

There are some relatively obvious things we can take away from these results that we’re already working on:

  • People in all businesses are finding it quite difficult to communicate the value of SEO to their clients, bosses, and colleagues. That’s something we can help with, and we’ll be developing materials in the near future to try and alleviate some of that particular frustration.
  • There’s a real desire for more succinct, actionable, step-by-step walkthroughs on the Blog. We can pretty easily explore formats for posts that are off our “beaten path,” and will attempt to make things easier to consume through improvements to both the content itself and its delivery. I think there’s some room for more “short and sweet” mixed in with our longer norm.
  • The bulk of our audience does more than just SEO, despite a full 25% of them having it in their job titles, and the challenges you mentioned include a bunch of areas that are related to, but outside the traditional world of SEO. Since you all are clearly working on those sorts of things, we should work to highlight and facilitate the relationship between the SEO work and the non-SEO marketing work you do.
  • In looking through some of the other sites you all visit for information on SEO, and knowing the kinds of posts they typically publish, it’s clear we’ve got an opportunity to publish more news. We’ve always dreamed of being more of a one-stop shop for SEO content, and that’s good validation that we may want to head down that path.

Again, thank you all so much for the time and effort you spent filling out this survey. Hopefully you’ll notice some changes in the near (and not-so-near) future that make it clear we’re really listening.

If you’ve got anything to add to these results — insights, further explanations, questions for clarification, rebuttals of points, etc. — please leave them in the comments below. We’re looking forward to continuing the conversation. =)



How Local SEO Fits In With What You’re Already Doing

Posted by MiriamEllis


You own, work for, or market a business, but you don’t think of yourself as a Local SEO.

That’s okay. The forces of history have, in fact, conspired in some weird ways to make local search seem like an island unto itself. Out there, beyond the horizon, there may be technicians puzzling out NAP, citations, owner responses, duplicate listings, store locator widgets and the like, but it doesn’t seem like they’re talking about your job at all.

And that’s the problem.

If I could offer you a seat in my kayak, I’d paddle us over to that misty isle, and we’d go ashore. After we’d walked around a bit, talking to the locals, it would hit you that the language barrier you’d once perceived is a mere illusion, as is the distance between you.

By sunset – whoa! Look around again. This is no island. You and the Local SEOs are all mainlanders, reaching towards identical goals of customer acquisition, service, and retention via an exceedingly enriched and enriching skill set. You can use it all.

Before I paddle off into the darkness, under the rising stars, I’d like to leave you a chart that plots out how Local SEO fits in with everything you’ve been doing all along.

The roots of the divide

Why is Local SEO often treated as separate from the rest of marketing? We can narrow this down to three contributing factors:

1) Early separation of the local and organic algos

Google’s early-days local product was governed by an algorithm that was much more distinct from their organic algorithm than it is today. It was once extremely common, for example, for businesses without websites to rank well locally. This didn’t do much to form clear bridges between the offline, organic, and local marketing worlds. But then came Google’s Pigeon Update in 2014, which signaled Google’s stated intention of deeply tying the two algorithms together.

This should ultimately impact the way industry publications, SaaS companies, and agencies present local as an extension of organic SEO, but we’re not quite there yet. I continue to encounter examples of large companies which are doing an amazing job with their website strategies, their e-commerce solutions and their paid outreach, but which are only now taking their first steps into local listings management for their hundreds of physical locations. It’s not that they’re late to the party – it’s just that they’ve only recently begun to realize what a large party their customers are having with their brands’ location data layers on the web.

2) Inheriting the paid vs. organic dichotomy

Local SEO has experienced the same lack-of-adoption/awareness as organic SEO. Agencies have long fought the uphill battle against a lopsided dependence on paid advertising. This phenomenon is highlighted by historic stats like these showing brands investing some $10 million in PPC vs. $1 million in SEO, despite studies like this one which show PPC earning less than 10% of clicks in search.

My take on this is that the transition from traditional offline paid advertising to its online analog was initially easier for many brands to get their heads around. And there have been ongoing challenges in proving direct ROI from SEO in the simple terms a PPC campaign can provide. To this day, we’re still seeing statistics showing that only 17% of small businesses invest in SEO. In many ways, the SEO conundrum has simply been inherited by every Local SEO.

3) A lot to take in and on

Look at the service menu of any full-service digital marketing agency and you’ll see just how far it’s had to stretch over the past couple of decades to encompass an ever-expanding range of publicity opportunities:

  • Technical website audits
  • On-site optimization
  • Linkbuilding
  • Keyword research
  • Content dev and promotion
  • Brand building
  • Social media marketing
  • PPC management
  • UX audits
  • Conversion optimization
  • Etc.

Is it any wonder that agencies feel spread a bit too thin when considering how to support yet further needs and disciplines? How do you find the bandwidth, and the experts, to be able to offer:

  • Ongoing citation management
  • Local on-site SEO
  • Local landing page dev
  • Store locator SEO
  • Review management
  • Local brand building
  • Local link building
  • And abstruse forms of local Schema implementation…

And while many agencies have met the challenge by forming smart, strategic partnerships with providers specializing in Local SEO solutions, the agency is still then tasked with understanding how Local fits in with everything else they’re doing, and then explaining this to clients. At the multi-location and enterprise level, even amongst the best-known brands, high-level staffers may have no idea what it is the folks in the in-house Local SEO department are actually doing, or why their work matters.
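On the “abstruse forms of local Schema implementation” point: for a single location, the basics are less intimidating than they sound. A common approach is a Schema.org LocalBusiness block in JSON-LD, embedded in the location’s landing page. Here’s a minimal sketch in Python that builds such a payload (the store name, address, and URL are hypothetical placeholders, and a real implementation would follow Google’s structured data guidelines for the full property list):

```python
import json

def local_business_jsonld(name, street, city, region, postal, phone, url):
    """Build a minimal Schema.org LocalBusiness JSON-LD payload.

    The returned string can be embedded in a page inside a
    <script type="application/ld+json"> tag on the location's
    landing page.
    """
    data = {
        "@context": "https://schema.org",
        # SportingGoodsStore is one of Schema.org's LocalBusiness subtypes
        "@type": "SportingGoodsStore",
        "name": name,
        "url": url,
        "telephone": phone,
        "address": {
            "@type": "PostalAddress",
            "streetAddress": street,
            "addressLocality": city,
            "addressRegion": region,
            "postalCode": postal,
        },
    }
    return json.dumps(data, indent=2)

# Hypothetical example location:
markup = local_business_jsonld(
    "Example Sporting Goods", "123 Main St", "Springfield",
    "IL", "62701", "+1-555-555-0100", "https://example.com/springfield",
)
print(markup)
```

Multiply this by 400 locations – each needing its own accurate address, phone number, and hours – and it becomes clear why multi-location Schema work is usually templated rather than hand-written.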

To tie it all together … that’s what we need to do here. With a shared vision of how all practitioners are working on consumer-centric outreach, we can really get somewhere. Let’s plot this out, together:

Sharing is caring

“We see our customers as invited guests to a party, and we are the hosts. It’s our job every day to make every important aspect of the customer experience a little bit better.”

– Jeff Bezos, Amazon

Let’s imagine a sporting goods brand, established in 1979, that’s grown to 400 locations across the US while also becoming well-known for its e-commerce presence. Whether aspects of marketing are being outsourced or it’s all in-house, here is how 3 shared consumer-centric goals unify all parties.

[Chart: shared consumer-centric goals across offline, organic SEO, and Local SEO teams]

As we can see from the above chart, there is definitely an overlap of techniques, particularly between SEOs and Local SEOs. Yet overall, it’s not the language or tactics, but the end game and end goals that unify all parties. Viewed properly, consumers are what make all marketing a true team effort.

Before I buy that kayak…

On my commute, I hear a radio ad promoting a holiday sale at some sporting goods store, but which brand was it?

Then I turn to the Internet to research kayak brands, and I find your website’s nicely researched, written, and optimized article comparing the best models in 2017. It’s ranking #2 organically. Those Sun Dolphins look pretty good, according to your massive comparison chart.

I think about it for a couple of days and go looking again, and I see your AdWords spot advertising your 30% off sale. This is the third time I’ve encountered your brand.

On my day off, I’m doing a local search for your brand, which has impressed me so far. I’m ready to look at these kayaks in person. Thanks to the fact that you properly managed your recent move across town by updating all of your major citations, I’m finding an accurate address on your Google My Business listing. Your reviews are mighty favorable, too. They keep mentioning how knowledgeable the staff is at your location nearest me.

And that turns out to be true. At first, I’m disappointed that I don’t see any Sun Dolphins on your shelves – your website comparison chart spoke well of them. As a sales associate approaches me, I notice in-store signage above his head, featuring a text/phone hotline for complaints. I don’t really have a complaint… not yet… but it’s good to know you care.

“I’m so sorry. We just sold out of Sun Dolphins this morning. But we can have one delivered to you within 3 days. We have in-store pickup, too,” the salesperson says. “Or, maybe you’d be interested in another model with comparable features. Let me show you.”

Turns out, your staffer isn’t just helpful – his training has made him so well-versed in your product line that he’s able to match my needs to a perfect kayak for me. I end up buying an Intex on the spot.

The cashier double-checks with me that I’ve found everything satisfactory and lets me know your brand takes feedback very seriously. She says my review would be valued, and my receipt invites me to read your reviews on Google, Yelp, and Facebook… and offers a special deal for signing up for your email newsletter.

My subsequent 5-star review signals to all departments of your company that a company-wide goal was met. Over the next year, my glowing review also influences 20 of my local neighbors to choose you over a competitor.

After my first wet, cold, and exciting kayaking trip, I realize I need to invest in a better waterproof jacket for next time. Your email newsletter hits my inbox at just the right time, announcing your Fourth of July sale. I’m about to become a repeat customer… worth up to 10x the value of my first purchase.

“No matter how brilliant your mind or strategy, if you’re playing a solo game, you’ll always lose out to a team.”

– Reid Hoffman, Co-Founder of LinkedIn

There’s a kind of magic in this adventurous mix of marketing wins. Subtract anything from the picture, and you may miss out on the customer. It’s been said that great teams beat with a single heart. The secret lies in seeing every marketing discipline and practitioner as part of your team, doing what your brand has been doing all along: working with dedication to acquire, serve and retain consumers. Whether achievement comes via citation management, conversion optimization, or a write-up in the New York Times, the end goal is identical.

It’s also long been said that the race is to the swift. Media mogul Rupert Murdoch appears to agree, stating that, in today’s world, it’s not big that beats small – it’s fast that beats slow. How quickly your brand is able to integrate all forms of on-and-offline marketing into its core strategy, leaving no team as an island, may well be what writes your future.
