The one constant in search is that Google has always looked for the best page to serve in response to a searcher's query. Its goal is user loyalty, so that it can serve ads to more people, more often. Search engine optimization has always aimed at creating pages which are excellent answers for a given keyword phrase, and the shortcuts we used to take (maybe ten years ago) of simply repeating the search phrase several times were irritating and confusing to the searcher.
Machine learning is changing the world of SEO because, by using hundreds or thousands of processors, the One Algorithm To Rule Them All is now trying to understand user intent and provide the page which is best for that query. This is a subtle but very powerful change: previously, the index graded pages on how well they matched the questions people were asking, i.e., search phrases.
Past algorithm updates like Penguin and Panda are nothing compared to Google's most recent switch: using a machine-learning artificial intelligence system to find better, more relevant matches to searchers' queries. RankBrain, as Google has dubbed its machine learning tool, promises to provide more on-target results, but it also shifts the standards for well-optimized web pages. RankBrain was first launched in 2015 for testing and fine-tuning, but wasn't fully implemented until 2016.
In natural language, there may be hundreds of different ways of phrasing a question, which is very different from looking for a search phrase used as an important element on a given page. As a result, this new methodology is able to look for "things, not strings," as Google puts it. Instead of matching a string of letters (a search phrase), machine learning can study user actions across the roughly 3 billion searches Google processes each day, and serve up results which don't even contain the precise search phrase used.
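To see the difference between "strings" and "things," consider a toy sketch of semantic matching. The hand-made word vectors and scores below are purely illustrative assumptions, not Google's actual system or data; real systems use embeddings learned from billions of queries. The point is that a page can score well against a query even when the exact search phrase never appears on it:

```python
import math

# Hypothetical 3-dimensional word vectors, invented for illustration.
# A real semantic search system would use trained embeddings.
VECTORS = {
    "car":     [0.90, 0.10, 0.00],
    "auto":    [0.85, 0.15, 0.05],
    "vehicle": [0.80, 0.20, 0.10],
    "recipe":  [0.00, 0.10, 0.90],
    "cooking": [0.05, 0.15, 0.85],
}

def embed(phrase):
    """Represent a phrase as the average of its known word vectors."""
    vecs = [VECTORS[w] for w in phrase.lower().split() if w in VECTORS]
    if not vecs:
        return [0.0, 0.0, 0.0]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def cosine(a, b):
    """Cosine similarity: 1.0 means same meaning, 0.0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def semantic_match(query, page_text):
    """Score a page against a query by meaning, not by exact strings."""
    return cosine(embed(query), embed(page_text))

# The page never contains the literal query "car", yet it scores high:
print(semantic_match("car", "auto vehicle"))    # high similarity
print(semantic_match("car", "cooking recipe"))  # low similarity
```

A plain string search for "car" would score both pages at zero; the vector comparison recognizes that "auto vehicle" is about the same *thing* as "car."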
What RankBrain Means for SEO—and Why Your Content is Obsolete
SEO best practices have dictated including target keywords in a site's top-level pages, then pulling visitors ever deeper into the site with repetitions and variations on those terms, often within long-tail phrases. The "long tail" of search queries consists of longer searches, perhaps five to eight words, with much lower search volume but often higher relevance. Google would see this silo of related pages, and (poof!) the site ranked well on several of these related phrases, and maybe even on the high-volume core term on the top-level page.
The combination of AI and the huge jump in voice searches has changed that. The new tools look for more relevant content answering more complicated questions throughout a page or site, including phrases that flow naturally in spoken language. The AI is there to understand user intent, so in indexing pages it looks for depth and complexity, rather than single pages with only one or two keywords and one very simple topic.
This new "semantic search" capability will spot the pattern of simple-to-complex search phrases repeated across a site, and actually downgrade the site's rankings. In a recent Searchmetrics study of rankings in the financial industry, sites with fewer instances of keywords ranked higher than sites using traditional SEO techniques. Simple keyword repetition now raises RankBrain alerts as an "unnatural" pattern, which means the pages you wrote last year that earned such great rankings may now show up 7-10 places lower in the SERPs!
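The kind of "unnatural" pattern described above is easy to quantify. Here is a minimal sketch of a keyword-density check; the function names and the 10% threshold are our own illustrative assumptions, not Google's or Searchmetrics' actual criteria:

```python
import re

def keyword_density(text, phrase):
    """Fraction of the words in `text` taken up by repetitions of `phrase`.
    The approach is illustrative; real ranking signals are far richer."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return (hits * n) / len(words) if words else 0.0

def looks_stuffed(text, phrase, threshold=0.10):
    """Flag copy where the target phrase dominates (threshold is a guess)."""
    return keyword_density(text, phrase) > threshold

stuffed = "best loans best loans apply for best loans today best loans"
natural = "compare lenders, check rates, and find the loan that fits your budget"
print(looks_stuffed(stuffed, "best loans"))  # True
print(looks_stuffed(natural, "best loans"))  # False
```

The stuffed example devotes most of its words to the target phrase, exactly the repetition pattern that used to rank well and now reads as spam; the natural sentence answers the same intent without tripping the check.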
How to Incorporate RankBrain into Your SEO Strategy: Have Original, Informative Content
Other factors beyond content are also likely to play a major role in successful website performance, including the now-familiar "Mobile First" initiative, increasing emphasis on site speed as real-time machine reviews look for near-instant response, and the prioritizing of visual elements such as images and videos. Interestingly, because of the rising prominence of mobile search, which rarely results in links to the found site, another long-standing SEO staple, backlinks from external sites, is diminishing in importance. Meanwhile, pages using Google's AMP (Accelerated Mobile Pages) are showing up at the top of mobile search results.
Historically, the more the SEO community has known about ranking factors, the more "black hat" techniques have been used to promote undeserving sites, which in turn frustrates users. Over the past year, Google has been "revealing" ranking factors which benefit the user: fast-loading pages, device-independent responsive design, and now RankBrain, a content evaluation system so smart and so unique it cannot be abused (Alex Barysevich, Search Engine Journal, 5/3/16).
While we have always stressed the need for quality content, which works for SEO as well as for motivating prospects, RankBrain underscores the need for deep, rich, original content. Our research findings and site strategy documents identify the top 50-80 search terms where our B2B clients have the best shot at winning top rankings, based on the competitive environment. Previously, when we focused site optimization on search terms, we would concentrate on 30-40 terms with decent search volumes and build a site optimization strategy around them. Now, as we write content, our eyes are no longer fixed on the prize of high-volume search terms. That frees us to write more complex and inclusive content, drawing on any of the search terms in the master list while always being sure to answer the myriad questions prospects have about a given topic. The result: more semantic richness, more keywords the client will rank highly for, and improved user-engagement metrics (number of pages viewed, time on site), leading to more conversions as prospects recognize the client as a subject matter expert and thought leader.