We don’t ordinarily think of Google when we think about competition in the digital marketing world, since it seems to reliably dominate most areas in which it does business. A recent segment on corporate monopolies on John Oliver’s “Last Week Tonight” hilariously played on Bing’s also-ran status with a graphic that stated, “Bing. The best place to Google something.”
For the most part, however, digital marketing has been a fairly competitive landscape, though there have been exceptions. Established brands frequently dominated top SERP positions on the strength of long-standing trust, fresh domains had to wait their turn in line, and black-hat SEO let webmasters game the system and win high rankings for thin content. A decade ago, SEO agencies and webmasters could apply simple heuristics and buzzworthy keywords to rank content regardless of its actual quality or relevance to user intent.
The Hummingbird update and subsequent rollout of RankBrain changed all of these notions entirely.
They should also be changing SEOs’ ideas of how to achieve success. Though many SEO experts understand how important RankBrain is, or at least how important it will become, they still employ the conventional strategies we made a living off of a decade ago.
In this column, I’ll explain why you should remodel the way you look at search engine optimization. And I’ll also offer some advice on machine learning applications and SEO strategies you can employ to compete in the cutthroat SEO landscape.
How machine learning revolutionized search
Machine learning is a subset of artificial intelligence that allows computers to learn without explicit human instruction: the system works in iterations, grouping items with similar properties and inferring values from those shared properties.
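To make that “grouping similar properties” idea concrete, here is a toy Python sketch of one such iterative technique, k-means clustering. It is purely illustrative and has nothing to do with Google’s actual systems; the data points are invented.

```python
# A toy illustration of iterative "learning by grouping": simple 1-D k-means,
# written from scratch. All data here is made up.

def assign_to_centroids(points, centroids):
    """Group each point with the centroid it is closest to."""
    clusters = {i: [] for i in range(len(centroids))}
    for p in points:
        distances = [(p - c) ** 2 for c in centroids]
        clusters[distances.index(min(distances))].append(p)
    return clusters

def update_centroids(clusters):
    """Re-estimate each group's value from its members' shared property."""
    return [sum(members) / len(members) for members in clusters.values() if members]

points = [1.0, 1.2, 0.9, 8.0, 8.3, 7.9]  # two natural groups
centroids = [0.0, 5.0]                    # rough initial guesses
for _ in range(5):                        # iterate: group, then re-estimate
    centroids = update_centroids(assign_to_centroids(points, centroids))
print(centroids)  # converges near [1.03, 8.07]
```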
Google’s RankBrain, which the company says is its third most important ranking factor, is applied to determine the context of new search queries that it has not received before. RankBrain distinguishes the context of unlearned searches by pulling semantically similar keywords/phrases and comparing them with similar past searches to deliver the most relevant results.
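RankBrain’s internals aren’t public, but the general matching idea can be sketched in a few lines: represent queries as vectors and route a never-before-seen query to the most similar past query. The embeddings below are invented for illustration.

```python
# Sketch of semantic query matching via cosine similarity. Not Google's code;
# the "embeddings" are hand-made toy vectors.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical dimensions might encode "travel", "price", "food".
past_queries = {
    "cheap flights hawaii": [0.9, 0.8, 0.0],
    "best poke bowls oahu": [0.2, 0.1, 0.9],
}
new_query_vector = [0.85, 0.9, 0.05]  # "low cost airfare to honolulu"

best = max(past_queries, key=lambda q: cosine(past_queries[q], new_query_vector))
print(best)  # "cheap flights hawaii": serve results that satisfied that intent
```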
Google employs machine learning technology to find patterns and make sense of relevant data when it analyzes user engagement with web pages in its SERP listings. With this data, Google’s algorithm evaluates user intent. From Google’s perspective, this helps filter results more effectively and rewards users with a better experience.
Currently, conventional signals are still applied to rank the best results. With each subsequent, relevant search, machine learning can analyze which web pages are receiving the best user signals and surface the results that best meet user intent. It’s important to note that machine learning isn’t instantaneous; it produces gradual ranking changes as data accumulates from its SERPs.
This has two broad implications for keyword research and ranking:
- Keyword rankings are no longer subject to dramatic shifts.
- Google’s algorithm is more dynamic; different algorithms are employed for each unique search.
In more competitive niches, content quality and increased user engagement will slowly take precedence over conventional signals, leveling the SERP playing field. In low-volume searches, conventional ranking signals will still be applied as the de facto standard until enough data is available to determine user intent.
This has also brought semantic search to the fore for SEO experts. Semantic search allows content to rank for multiple keywords and draw increased traffic by meeting the intent of various related search queries. The clearest examples of semantic search’s impact are the related searches field at the bottom of Google SERPs and the “People Also Ask” box below the featured snippet field.
As Google becomes more capable of understanding human intent and language, technical SEO and keyword usage will take a back seat to user signals. And since different algorithms are applied to unique searches, links will play a smaller role as the arbiters of content quality, giving smaller domains a better fighting chance to compete organically against industry titans.
If searcher intent determines which algorithm will be pulled for SERP listings, how do we optimize and even track this? The answer involves using both conventional strategies and our own machine learning technology.
Give the people what they want
Here are a few methods SEOs should be using to keep current with the evolving environment:
1. Improve user experience
Searchmetrics’ 2016 report on ranking factors illustrated just how important user signals were to organic ranking. The company found that user signals were second only to content relevance in terms of importance.
One of the best ways a search engine can determine user intent is by analyzing user signals, which it gathers through its Chrome browser, direct URLs, SERPs and so on. But Google’s most valued user signal remains click-through rate (CTR).
To ensure your web pages deliver good user signals, you must create a solid UX foundation. This means providing thematic continuity across your web pages, creating high-quality and relevant landing pages, using engaging images, offering interactive content, delivering fast page speed and developing an organized internal linking structure.
Metatags and rich snippets can also influence your click-through rate, so optimize for both. If a high-ranking result suffers from a persistently low CTR, Google will lower its rank.
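As a practical starting point, here is a minimal sketch of how you might flag pages whose CTR lags a rough positional benchmark. The benchmark figures and sample rows are assumptions for illustration; in practice, you would export this data from Google Search Console.

```python
# Flag queries whose CTR underperforms a rough expected CTR for their average
# position. Benchmarks and rows below are invented, not Google's numbers.

EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}  # assumed benchmarks

pages = [
    # (url, avg_position, impressions, clicks): invented sample rows
    ("/pricing", 2, 4_000, 220),
    ("/blog/guide", 3, 9_000, 950),
]

for url, pos, impressions, clicks in pages:
    ctr = clicks / impressions
    benchmark = EXPECTED_CTR.get(pos, 0.02)
    if ctr < benchmark:
        # candidates for rewriting the title tag and meta description
        print(f"{url}: CTR {ctr:.1%} vs ~{benchmark:.0%} expected at position {pos}")
```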
Other considerations to keep in mind include:
- employing 301 redirects for missing pages and rel=canonical tags for duplicate content.
- optimizing structured data and alternative tags to help search engines index content.
- resolving any broken links that could affect crawl structure (a minimal link-checking sketch follows this list).
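For the broken-link item above, a bare-bones check can be scripted in a few lines. This sketch assumes the third-party requests and beautifulsoup4 packages, and it checks a single page; a real audit tool would crawl the whole site, respect robots.txt and throttle its requests.

```python
# Minimal broken-link check for one page. Requires: pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://example.com/"  # placeholder URL
html = requests.get(page, timeout=10).text

for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
    url = urljoin(page, a["href"])  # resolve relative links
    try:
        status = requests.head(url, timeout=10, allow_redirects=True).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print(f"broken: {url} ({status})")  # fix these or 301-redirect them
```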
Even though Google’s AI and RankBrain are incredibly advanced, Google still needs your help to crawl web pages and index them. It doesn’t hurt that these factors also improve your website’s navigation and user experience.
2. Embrace thematic continuity
Despite all of these advancements in search, I still commonly encounter clients who operate their websites with thin content and no keyword focus. My team begins client campaigns with research on keywords, competitors and some technical aspects.
Recently, though, we began building more seamless hierarchical structures that leverage semantically linked keywords and topic clusters to promote an awesome UX. Rather than simply creating content around a narrow set of keywords, we concentrated on ranking our clients’ most important pages.
HubSpot refers to this exciting new practice as “topic clusters.” Topic clusters focus on pillar pages that represent your most important topics. These will be broad, overarching pages that rank high in your information hierarchy and attempt to discuss and answer the most important questions related to your main topic.
Subtopics are then discussed in greater detail on lower-hierarchy pages that contain internal links back to the pillar page. This strategy helps communicate your most important pages through a sophisticated interlinking structure, promotes seamless navigation and helps position your pillar page to rank for multiple keyword phrases.
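The cluster model is simple enough to audit programmatically. Here is an illustrative sketch (the page URLs are invented) that verifies every cluster page links back to its pillar:

```python
# Toy data model of a topic cluster: every subtopic page should link back
# to its pillar page. All URLs below are invented examples.

site = {
    "/seo-guide": {  # pillar page: broad, overarching topic
        "cluster": ["/seo-guide/keyword-research",
                    "/seo-guide/link-building",
                    "/seo-guide/technical-audits"],
    },
}

# links_on_page would come from your crawler; hard-coded here for illustration
links_on_page = {
    "/seo-guide/keyword-research": ["/seo-guide", "/seo-guide/link-building"],
    "/seo-guide/link-building": ["/seo-guide"],
    "/seo-guide/technical-audits": ["/tools"],  # missing link to pillar!
}

for pillar, data in site.items():
    for page in data["cluster"]:
        if pillar not in links_on_page.get(page, []):
            print(f"{page} does not link back to pillar {pillar}")
```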
These evergreen pieces are also supplemented by a consistent blogging strategy that discusses trending topics related to the website’s theme. Each piece of content produced is actionable and focuses on driving conversion or desired actions.
When modeling each piece of content, it’s important to ask yourself this question: What are the problems this piece of content is seeking to address, and how will it solve them? As more questions pop up, write content addressing these issues. Now you’ve created a website that satisfies user intent from almost every possible perspective. This helps you rank for a lot of keywords.
You can also employ machine learning technology to improve the workflow of your content marketing campaign. Applications such as the Hemingway App and Grammarly can suggest improvements to sentence structure, authorial voice and word usage.
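For a taste of the kind of readability check such tools build on, here is a rough Flesch reading-ease calculator. The formula is the standard published one; the vowel-group syllable counter is a crude approximation.

```python
# Rough Flesch reading-ease score. Higher is easier to read; dense marketing
# copy scores low. The syllable counter is a naive approximation.
import re

def flesch_reading_ease(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(max(1, len(re.findall(r"[aeiouy]+", w.lower()))) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

draft = "Leveraging synergistic methodologies facilitates optimization."
print(f"{flesch_reading_ease(draft):.0f}")  # very low score: flag for a rewrite
```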
3. Employ natural language
Perhaps the best way to optimize for an artificially intelligent search world is to optimize for voice search rather than text search. This means optimizing your website for mobile and your content for featured snippets, given that answers to questions asked of a personal assistant device are pulled from the featured snippet field on a Google SERP.
In addition to following the strategies outlined thus far, this involves crafting cogent page copy that seeks to answer as many questions as possible and provide actionable solutions.
Research has also shown that people searching by voice, rather than text, are more likely to use search phrases from four to nine words in length. This means optimizing for long-tail keyword phrases and writing page copy that better reflects natural language. For example, a text search for flights to Hawaii may be “cheap flights Hawaii,” while a voice search may be, “What are the cheapest flights to Hawaii?”
With the rise of machine learning, optimized content that appeals to natural language could satisfy user intent for both broad match searches over text and long-tail voice searches.
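One rough heuristic (mine, not Google’s) for separating voice-style queries from text-style ones in your own keyword data:

```python
# Crude heuristic: natural-language (voice-style) queries tend to be longer
# and to start with a question word. Thresholds here are assumptions.

QUESTION_WORDS = {"who", "what", "where", "when", "why", "how", "which"}

def looks_like_voice_query(query):
    words = query.lower().split()
    return len(words) >= 4 and words[0] in QUESTION_WORDS

print(looks_like_voice_query("cheap flights hawaii"))                     # False
print(looks_like_voice_query("what are the cheapest flights to hawaii"))  # True
```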
Consider how chatbot assistants incorporate Natural Language Understanding (NLU) to more readily understand linguistic syntax and meanings. With advancements in NLU applications, search engines will eventually be able to entirely assess the meaning and quality of content the same way a human does.
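You can poke at these NLU building blocks yourself with the open-source spaCy library. This sketch only parses syntax and named entities; production NLU systems go much further.

```python
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("What are the cheapest flights to Hawaii?")

for token in doc:
    print(token.text, token.pos_, token.dep_)  # part of speech + syntactic role
print([(ent.text, ent.label_) for ent in doc.ents])  # e.g. [('Hawaii', 'GPE')]
```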
4. Personalize the buyer’s journey
With reportedly more data being created this year than in the previous 5,000 years, businesses will need to leverage machine learning technology to interpret vast amounts of user data at unprecedented speed.
One way this is already being executed is by mining conversational text data from chatbots. As we move from a world of graphical interfaces to one of conversational interfaces, chatbots are being used to map inputs and data from customer journeys to help companies improve their user experience.
This technology is still in its infancy, but we can also apply machine learning technology and data mining to personalize touch points along the buyer’s journey. Customer journey mapping can be used to build out buyer personas and personalize marketing touch points to maximize conversions and sales.
Using customer journey mapping, businesses can personalize touch points to deliver content or advertisements when intent is highest. Real-time responses can be instituted to respond to customer service calls immediately, deliver calls to action to high-scoring leads and segment advertisement campaigns based on real-time data.
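Stripped to its essentials, that kind of trigger is just a scoring rule. Here is a toy sketch with invented signal weights and an assumed threshold:

```python
# Toy real-time personalization rule: score a lead from journey events and
# trigger a touch point past a threshold. Weights and cutoff are invented.

WEIGHTS = {"visited_pricing": 30, "opened_email": 10, "used_chatbot": 20,
           "downloaded_guide": 25}
THRESHOLD = 50  # assumed cutoff for "high intent"

def lead_score(events):
    return sum(WEIGHTS.get(e, 0) for e in events)

journey = ["opened_email", "downloaded_guide", "visited_pricing"]
if lead_score(journey) >= THRESHOLD:  # 65 here, so the CTA fires
    print("high-intent lead: deliver the call to action now")
```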
Predictive analytics can also be applied to estimate campaign performance from real-time data, which saves significant time on A/B testing and improves campaign efficiency.
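Even a simple regression illustrates the idea. This sketch uses scikit-learn to fit past campaign spend against conversions and estimate a planned campaign’s results; all the numbers are fabricated, and a real model would use many more features.

```python
# Minimal predictive-analytics sketch. Requires: pip install scikit-learn
from sklearn.linear_model import LinearRegression

spend = [[500], [1_000], [2_000], [4_000]]  # past campaign budgets ($), invented
conversions = [12, 25, 48, 95]              # observed conversions, invented

model = LinearRegression().fit(spend, conversions)
print(round(model.predict([[3_000]])[0]))   # estimate before spending a cent
```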
Fortunately, machine learning technology can be used by anyone. Given the sheer speed and scale of machine learning applications, relying on conventional SEO strategies to rank organically may eventually put you at an incredible competitive disadvantage.
The future is already passing
Don’t worry, automation won’t totally displace humans any time soon. Machine learning technologies can help augment marketing campaigns, but the creative and its execution still rely on human expertise. We will, however, probably soon reach a point where clients actively seek out digital marketing firms with expertise in customer journey mapping and AI-enabled applications.
In my opinion, these technologies have the potential to greatly improve the competition for SERPs and will also allow digital marketers to deliver a stronger product.