Google’s Mayday “Scare” – Stats Show it’s Time to Relax

In my last post, I wrote about how any good SEO has always needed to ADAPT to the constant changes in Google’s ranking algorithm.  Well, it’s been just about a month since Google’s so-called “Mayday” update put SEOs and site owners alike in a tizzy, and my advice hasn’t changed!  To recap briefly, the Mayday update refers to an algorithmic change made by Google (and confirmed by Matt Cutts) between April 28th and May 3rd that has primarily affected long-tail traffic to sites across the web.  Industry experts such as Vanessa Fox and the folks at SEOmoz have shared tons of great insight, so rather than rehash their analyses I wanted to contribute some empirical data regarding the update, and what I discovered might surprise you.

I’ve read many accounts of the effects this update has had on sites, many of them negative in nature.  About two weeks ago, SEORoundtable conducted a poll in which roughly 42% of respondents reported that search visits had declined since the update, 41% saw no change, and 13% saw their traffic increase.  This was a revealing study, but it didn’t quench my thirst for down-and-dirty comparative traffic metrics.  I wanted an answer to the question that most SEOs were asking but no one was answering: how much long-tail traffic did we really lose?

I took a look at the search traffic for 25 of SEER’s clients (brands removed to protect anonymity), comparing the month of April (we’ll call this pre-Mayday) to the month of May (appropriately, post-Mayday).  It is important to point out that the pre-Mayday data includes the last day of March (3/31) as well as the entire month of April, so that each period covers an equal number of days (31).  Clients were chosen from a variety of niches, including (but not limited to) eCommerce, finance, education, tourism, and B2B, in an attempt to remove any bias according to industry.  Finally, actual traffic numbers were removed from the example data, as the true value comes from the comparative percentages.  Now, onward to the results!
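For anyone who wants to run the same comparison against their own analytics export, here is a minimal sketch of the period math, assuming a simple date-to-visits mapping (the data structure and numbers below are hypothetical, not SEER’s):

```python
from datetime import date, timedelta

# Hypothetical input: a dict mapping each day to its organic search visits,
# as you might export from your analytics package.
daily_visits = {date(2010, 3, 31) + timedelta(days=i): 1000 + i for i in range(62)}

def visits_in_period(daily, start, end):
    """Sum visits for all days in the inclusive range [start, end]."""
    return sum(v for d, v in daily.items() if start <= d <= end)

# Pre-Mayday runs 3/31 through 4/30 (31 days) to match May's 31 days exactly.
pre = visits_in_period(daily_visits, date(2010, 3, 31), date(2010, 4, 30))
post = visits_in_period(daily_visits, date(2010, 5, 1), date(2010, 5, 31))

pct_change = (post - pre) / pre * 100
print(f"Pre: {pre}  Post: {post}  Change: {pct_change:+.2f}%")
```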

How Much of an Effect Did the Mayday Update Really Cause?

First, let’s take a look at non-branded search traffic, since this is the traffic most likely to come from your “virgin searcher,” a person with no prior familiarity with your brand.  This is the purest target audience for SEO, as you’re trying to place a site before an audience that has no allegiances or loyalties to any particular brand.  Traffic numbers were compared and the percentage differences calculated.  That data is available below:

As you can see, despite a small handful of clients seeing a significant decrease in non-branded search traffic, the split is a pretty even 50-50.  In fact, if we assume a deviation of +/- 5% to be “No Change,” only 16 of these companies saw any change at all, whether positive or negative.
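As a concrete illustration of that bucketing rule, here is a short sketch applying the ±5% “No Change” band (the client names and percentages are made up):

```python
def classify(pct_change, threshold=5.0):
    """Bucket a percent change: anything within +/-threshold counts as no change."""
    if pct_change > threshold:
        return "Increase"
    if pct_change < -threshold:
        return "Decrease"
    return "No Change"

# Hypothetical per-client changes in non-branded search visits (%).
changes = {"ClientA": -2.3, "ClientB": -18.0, "ClientC": 7.1, "ClientD": 0.4}

for client, change in sorted(changes.items()):
    print(f"{client}: {change:+.1f}% -> {classify(change)}")
```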

Next, and more importantly, we will look at the number of referring keywords to each site pre- and post-Mayday.  Now, according to the general perception around the web and to Matt Cutts, the majority of this data should reflect a decrease in the overall number of referring keywords.  However, our data supports a different conclusion:

As you can see, only about half of our clients saw any change in the number of referring keywords.  More specifically, 20% of clients saw a decrease, 52% saw no change, and 28% actually saw an increase in the number of referring keywords.

So What?

Why is this relevant?  Why should you care?  Well, according to most qualitative sources, the majority of sites saw a loss of traffic as a result of the Mayday update.  Quantitatively, however, this doesn’t appear to be the case.  In fact, looking at the total non-branded traffic driven to all 25 sites from search pre- and post-Mayday, there was a +0.68% difference in visits.  0.68%!!  Furthermore, the percentage difference between the number of referring keywords pre- and post-Mayday?  +0.44%.  Now, I know that some critics will argue that any one outlier in the raw data could throw off an aggregate like this.  Who’s to say that the decreases seen by Client13 or Client2 aren’t simply washed out by a positive month for Client16?  Well, when you take the average percentage change across all clients (so that each site counts equally, regardless of size), there was still only a -1.28% difference in visits and a -0.66% difference in referring keywords.  Pretty underwhelming if you ask me, especially considering the prevailing perception that most sites were significantly damaged by the update.  This shows us that, although certain sites definitely saw a decrease in their traffic and referring keywords, the drop was not a trend reflected across the entire web, as most qualitative studies may have suggested.
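To make the distinction between those two summaries concrete, here is a toy example with hypothetical numbers (not our client data) showing how the pooled total and the per-client average can diverge when one small site takes a big hit:

```python
# Hypothetical (pre-Mayday, post-Mayday) visit counts for three sites.
clients = [
    (120_000, 121_500),  # large site, slight gain
    (8_000, 5_600),      # small site, a -30% outlier
    (15_000, 16_000),    # mid-size site, modest gain
]

pre_total = sum(pre for pre, _ in clients)
post_total = sum(post for _, post in clients)

# Pooled change: large sites dominate, so the small outlier barely registers.
pooled_pct = (post_total - pre_total) / pre_total * 100

# Average of per-client changes: every site counts equally,
# so the outlier drags the mean down.
per_client_pct = [(post - pre) / pre * 100 for pre, post in clients]
avg_pct = sum(per_client_pct) / len(per_client_pct)

print(f"Pooled change: {pooled_pct:+.2f}%")   # ~ +0.07%
print(f"Average change: {avg_pct:+.2f}%")     # ~ -7.36%
```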

Finally, I’m sure some people are also interested in how incorporating branded traffic into this analysis affects the overall results.  On the whole, the breakdown of referring keywords was consistent with our previous analysis, but many of our sites received less traffic once branded queries were included.

Total Organic Visits (Including Branded Queries)

Total Number of Organic Keywords (Including Branded Queries)

As for total organic traffic (including branded queries) across all 25 sites, we noticed a 0.31% decrease pre- versus post-Mayday, and for total organic referring keywords we saw a 0.28% increase after the update.  Taking the average of the percentage changes across all 25 sites, we saw an average difference of -5.13% in search traffic and -2.45% in the number of referring keywords.

Conclusions

Before we close, I’d like to address some concerns readers may have regarding this analysis.  Yes, these metrics reflect SEER clients and only SEER clients; it’s wholly possible that our data is not indicative of the web as a whole.  However, this is exactly why I tried to remove any industry bias by analyzing sites from a wide variety of niches.  Next, the analysis looked at the raw number of referring keywords rather than only long-tail terms.  This was done so that holistic conclusions could be drawn from the referring keyword report without having to arbitrarily decide what counts as long-tail.  Matt Cutts and the good folks at Google only told us that the update affected long-tail traffic, not what is considered long-tail.  By viewing the total number of keywords across the board, I hoped to normalize the data and neutralize this variable as best we could.

Finally, our data also reflects ongoing SEO efforts by the SEER team.  However, I would argue that this is not a variable that should be excluded, even if it were possible to do so.  All companies looking to turn a profit are constantly optimizing, whether actively or passively, for improved visibility on the web.  This could be through SEO, external marketing, site redesign, product updates, and so on.  These numbers reflect ongoing marketing efforts that will always continue even if Google never changes the algorithm again, and such efforts are therefore an integral part of any analysis.

Remember that Google will never change its algorithm to become less relevant, so it’s a constant battle of future-proofing your site for the inevitable.  Whether Google wants to become more relevant for head terms or the long tail, singular or plural, branded or non-branded, you will never go wrong with creating quality content, valuable links, and streamlined architecture.  Keep that in mind the next time (and there most certainly will be a next time) Google decides to roll out an update.
