Google SEO Update

The Rise of ‘Not Provided’ and Hummingbird.


Over time, the SEO industry has evolved from a space obsessed with rankings above any other metric to one that treats business metrics, such as visits, conversions, revenue from organic search (brand and generic), and market share, as the core KPIs.

However, almost two years ago, in the fall of 2011, Google launched encrypted search, and we all thought that would be the end of keyword-level organic search data. It wasn't, as only a small percentage of searches were encrypted. That time has now arrived, however: earlier this month Google moved everything to encrypted search.

Over the last month we have seen levels of 'not provided' keyword data rise to between 80% and 95% across our European client base. Markets where Google is less dominant, such as the US, have less of a problem, as they still have keyword data from the likes of Bing and Yahoo. Whilst on the face of it this seems better, 'not provided' levels there will still be in the 50–100% bracket.

What does this actually mean?

The key impacts of Google encrypting all searches will be that you can no longer:

• Split brand and generic organic keyword data by the following metrics:

     – Visits
     – Conversions
     – Revenue

• Monitor individual organic keyword data

     – The above metrics, plus:
     • Conversion rate
     • Bounce rate

• Use the multi-channel funnels in Google Analytics to view keyword paths

• Take a data driven approach to PPC vs. SEO testing (more on this later)

• Compare keyword usage variance across devices

In addition:


• Hitwise can’t access Google keyword data
     – This could render Hitwise largely irrelevant. However, they have confirmed that they are using panel data to replace ISP keyword data

• Tools like DFA that have never shown data for 'not provided' keywords (when Dart Natural Search ranking is in place to de-duplicate paid media data on the ad server) will suffer:
     – Client data will be inaccurate through lack of conversion de-duplication
     – DFA does have an alpha-release solution in place to track the volume of 'not provided' referrals so that it can offer de-duplication. However, it is under very limited availability

Whilst keyword data has all but evaporated for organic search, there are still many useful metrics in site analytics and in SEO management platforms such as Searchmetrics and BrightEdge:

• Overall organic search performance
     – This can, however, be largely influenced by brand and ATL advertising campaigns
     – Market share (overall and specific keyword groups)
     – Visibility (overall and specific keyword groups)

• Page and content level information
     – What is the most visited content via organic search
     – What content drives the most conversions
     – What content drives the most social shares
     – What content drives the most social links
     – What content drives the most valuable links
     – What is the increase or decrease in organic visits and conversions at the page level

The end of the data-driven SEM strategy?

For us, this is without a doubt the biggest impact. We have always tried to make the best use of our clients' budgets across search. This has meant very detailed testing strategies across paid and organic search, the results of which have dictated where, how, and when paid search budget is deployed to drive the maximum overall performance for clients without paying for visits and sales that they don't need to. This testing covered brand and generic search, and in many cases it resulted in a reduction in paid advertising spend, especially on brand and brand + product keywords, or a redeployment of budget from brand to generic.



However, an effective and accurate testing strategy requires:

• A single source of measurement (site analytics)
• Keyword level data across both channels for visits, conversions, and revenue
With Google's switch to encrypting all searches, this data is lost. However, we can still get keyword data from Google AdWords if you're bidding on your generic keywords. But what about the Google Webmaster Tools data that Google recently integrated into AdWords? We all know that data carries a high level of inaccuracy and is treated with caution across the industry. It is also just click based, so there is no revenue data to calculate an incremental CPA of paid search over your organic baseline. However, given Google's latest move to encrypted search, this data, accuracy issues and all, may become a lot more valuable. Additionally, you could look at your historical analytics data to get an idea of which keywords drove the majority of traffic and conversions to your site.
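Mining that historical data is straightforward. The sketch below is a minimal, hypothetical example: it assumes you have exported pre-encryption keyword rows from your site analytics to CSV with `keyword`, `visits`, `conversions`, and `revenue` columns (the column names and figures are illustrative, not from any real export), then aggregates and ranks them.

```python
import csv
import io
from collections import defaultdict

# Hypothetical pre-encryption keyword export from site analytics.
# Column names and figures are illustrative only.
SAMPLE_EXPORT = """keyword,visits,conversions,revenue
running shoes,1200,36,2520.00
buy running shoes,800,40,3100.00
trail shoes,450,9,640.00
running shoes,300,6,410.00
"""

def top_keywords(export_csv, n=10):
    """Aggregate historical keyword rows and rank them by total visits."""
    totals = defaultdict(lambda: {"visits": 0, "conversions": 0, "revenue": 0.0})
    for row in csv.DictReader(io.StringIO(export_csv)):
        t = totals[row["keyword"]]
        t["visits"] += int(row["visits"])
        t["conversions"] += int(row["conversions"])
        t["revenue"] += float(row["revenue"])
    # Sort descending by visits and keep the top n keywords
    return sorted(totals.items(), key=lambda kv: kv[1]["visits"], reverse=True)[:n]

for keyword, stats in top_keywords(SAMPLE_EXPORT, n=3):
    print(keyword, stats["visits"], stats["conversions"])
```

In practice you would point this at a historical export covering a stable trading period, and use the ranked output as a proxy baseline for which generic keywords are worth bidding on now that live organic keyword data is gone.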


Rankings still don’t cut it


Whilst the natural assumption is that the industry will regress to an obsession with rankings data, we don’t see this happening; at least not for us. Why?
Quite simply, there is now no such thing as a static ranking; ultimately, everyone sees something slightly different. There are two levels of personalisation, for the logged-in and the logged-out user, meaning that search history and visited sites influence every search result. Mobile search has been growing rapidly and continues to do so, increasing the use of geolocation to tailor search results to where the searcher is (this also applies to desktop searches, but not to the same extent).
Rankings will absolutely remain a leading indicator, and they will be a key part of many other metrics, but given the vast variation in rankings across the visitor base, as a metric they are far too inaccurate to hang your hat on as the core KPI.

Google Webmaster Tools and other data sources

Whilst vastly inaccurate, the data in Google Webmaster Tools will become more valuable than ever before. Publishers will still be able to access the top 2,000 keywords driving traffic to their site over a 90-day window (and potentially a year, based on a recent Google announcement). When combined with share-of-voice or market-share analysis from the likes of BrightEdge, Hitwise, and Searchmetrics, the combined data set can be quite powerful.
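The combination itself is just a join on keyword. The sketch below is a minimal, hypothetical illustration: the keyword names, click counts, and share-of-voice figures are invented, and the two inputs stand in for a Webmaster Tools top-query export and a share-of-voice export from a tool such as BrightEdge.

```python
# Hypothetical inputs: Webmaster Tools top-query clicks (90-day window)
# and a share-of-voice export from a market-intelligence tool.
# All keywords and figures below are illustrative only.
gwt_clicks = {"running shoes": 5400, "trail shoes": 1900, "buy running shoes": 2600}
share_of_voice = {"running shoes": 0.18, "trail shoes": 0.07, "race trainers": 0.02}

def combine(clicks, sov):
    """Join the two data sets on keyword, keeping keywords present in either."""
    keywords = set(clicks) | set(sov)
    return {
        kw: {"clicks": clicks.get(kw, 0), "share_of_voice": sov.get(kw, 0.0)}
        for kw in keywords
    }

combined = combine(gwt_clicks, share_of_voice)

# Keywords with market visibility but no recorded clicks can flag
# content or ranking gaps worth investigating.
gaps = [kw for kw, v in combined.items() if v["share_of_voice"] > 0 and v["clicks"] == 0]
```

The useful output is less the merged table itself than the mismatches it surfaces: keywords where the market sees you but your own click data is silent, or vice versa.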

Google wins, brands lose, and privacy doesn't change


Let's cut to the chase. If this were about privacy, Google would encrypt all search data, not just organic. Across paid and organic search, the users are the same and the keywords they search for are the same, and yet one set is encrypted and the other is not. We can't recall Google explaining how this helps privacy. Can you? We thought not. Privacy doesn't change.
With no organic keyword data, the overarching SEM strategy that requires paid search to prove incremental value becomes almost impossible. Marketers therefore lose the data that might have led to a reduction in paid search spend. Google wins.
For SEM strategies, you no longer have the data to decide whether it is better to transfer brand budget to generics to drive incremental sales, rather than paying for visitors who were always going to end up on your site. Brands lose.

The fact that you can no longer segment brand and generic organic data in site analytics is a big loss for brands. Brands lose.

With paid search continuing to have full keyword-level data, it certainly looks like a more attractive option for marketers. Google wins.

We’ll leave you to make your own mind up as to whether this is about privacy or about making paid search advertising more attractive.

What next?


SEO has always been a fast-moving space that is constantly changing, and here too we will all need to adapt, evolve, and move on. With this recent update, brands need to spend time on page- and content-centric SEO instead of looking at organic performance at the keyword level. There is a host of tools outside of site analytics that allow us to monitor various useful metrics and, more importantly, to monitor performance against the market, which is becoming ever more important for our clients. After all, it's great to be 20% up year over year, but if you have lost market share, it means your competitors have grown more. Whilst performance measurement will be hampered in certain areas, we feel it could improve in others.
For more information on how recent changes to Google policy may affect your business, please reach out to our authors, who will be happy to engage in more specific dialogue with you.

Hummingbird: Google's new algorithm


Hummingbird is a new algorithm rather than an update to the existing algorithm. It launched in September but was live for around a month before the announcement. In terms of initial impact, we think this says a lot: the new algorithm rolled out without anyone really noticing. It is geared for the future.
Hummingbird has been designed to cope with changing search behaviour and the evolution of the devices that we use to search on. So rather than looking at words and phrases individually, Google is now looking at all the words and phrases in the search query and their relationship to each other, in addition to the meaning of each word in context. Google specifically called out 'conversational search', i.e. longer-tail queries and voice search. This means we should see:
• Longer-tail search queries returning more accurate results
• Increased integration of the Knowledge Graph as Google better understands the user search query
Search Engine Land crafted a good example of this playing out for a particular search:
“What’s the closest place to buy the iPhone 5s to my home?” A traditional search engine might focus on finding matches for words — finding a page that says “buy” and “iPhone 5s,” for example.
Hummingbird should better focus on the meaning behind the words. It may better understand the actual location of your home, if you’ve shared that with Google. It might understand that “place” means you want a brick-and-mortar store. It might get that “iPhone 5s” is a particular type of electronic device carried by certain stores. Knowing all these meanings may help Google go beyond just finding pages with matching words.
Given Google recently switched entirely to encrypted search (meaning we can't see the referring keywords), it will be really tough to measure the impact of this going forward. However, looking at data trends for rich content pages will provide some useful insight here.
As this update has been out for a month and we haven't seen any big changes, for now we think we can say there has been minimal impact. But this new algorithm is designed for future trends, so the biggest impacts are likely to be further down the road as search behaviour continues to evolve with new consumer technology. As such, we need to think carefully about the content on our sites; if your site is lacking on the content side, you need to think about providing rich, helpful content built around the needs of your target audience.