Monday, October 26, 2015

How To Leverage The Google Search Analytics API


In the past 18 months, Google has made many changes regarding what data it provides to you about your websites and your visitors.

With the change to SSL and (not provided) in analytics, many organic search analysts turned to Google Webmaster Tools (now Google Search Console), which still provides a sampled version of the more prominent keywords visitors use to connect to your website and what pages they land on.

However, unlike Google Analytics data, which is retained for as long as the website carries a Google Analytics tag, Google Search Console data is only available for 90 days before it disappears. Additionally, there has been no official Google Webmaster Tools API for query and landing page data (although Google did offer an alternative method of circumventing that obstacle using Python).

Now, there’s good news: The Webmaster Tools Search Analytics API is available, and with the proper setup you can get a lot of great information about your website’s performance, Google’s algorithmic testing and consumer behavior.

Not everyone in search is a developer or has the technical skills to call APIs, but there is a growing need for anyone in digital to at least understand what APIs are and when it may be in your best interest to use an API versus the standard user interface. Let’s explore the APIs and the types of insights they deliver.

Basic Search Analytics API Exploration

You can access the API through Google’s APIs Explorer.

You first need to authenticate through OAuth 2.0 via the icon shown below by clicking the On/Off slider and hitting “Authorize.” This allows the API Explorer to connect to your accounts.

[Image: Authorizing the APIs Explorer via the OAuth 2.0 On/Off slider]

Then, build a query on what information you would like to see from your accounts. (The information icons are very helpful if you get stuck.) The page should look something like this:

[Image: A completed query form in the APIs Explorer]

The completed fields build a POST API call, as displayed below:

[Image: The POST request built from the completed fields]

If you haven’t made any syntactical errors, you will be served all of the results you specified:

[Image: A sample response from the searchanalytics.query call]
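If you would rather make the same call from code than from the Explorer, the sketch below shows one way to do it in Python with the google-api-python-client and google-auth-oauthlib packages. This is a minimal sketch, not the article’s own script: the client_secrets.json file, the site URL and the date range are placeholders you would replace with your own values.

    from google_auth_oauthlib.flow import InstalledAppFlow
    from googleapiclient.discovery import build

    # Read-only scope for Search Console (formerly Webmaster Tools) data.
    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

    # Run the OAuth 2.0 consent flow in a browser, the programmatic
    # equivalent of the Explorer's On/Off slider and "Authorize" button.
    flow = InstalledAppFlow.from_client_secrets_file("client_secrets.json", SCOPES)
    creds = flow.run_local_server(port=0)

    service = build("webmasters", "v3", credentials=creds)

    # The request body mirrors the Explorer's form fields.
    body = {
        "startDate": "2015-07-01",
        "endDate": "2015-09-30",
        "dimensions": ["date", "query", "page", "device"],
        "rowLimit": 5000,
    }
    response = service.searchanalytics().query(
        siteUrl="https://www.example.com/", body=body
    ).execute()

    for row in response.get("rows", []):
        print(row["keys"], row["clicks"], row["impressions"], row["position"])

Each returned row carries its dimension values in "keys" (in the order you requested the dimensions), plus clicks, impressions, CTR and average position.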

Note: There seems to be a 4,000-row return limit per day depending on how you query, so for those looking to collect as much data as possible, you should query by day and by device, and loop through each combination to collect all of your data. Keep in mind that 90 days for a single website segmented by day and device will give you hundreds of thousands to millions of results, depending on the popularity of your website.
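To make that day-and-device looping concrete, here is a rough sketch building on the snippet above. The device list and date range are examples; the startRow paging shown is one way to pull more than a single page of results, and the caps you actually hit may differ from the 4,000-row figure observed above.

    from datetime import date, timedelta

    SITE = "https://www.example.com/"          # placeholder site URL
    DEVICES = ["DESKTOP", "MOBILE", "TABLET"]
    ROW_LIMIT = 5000                           # documented per-request maximum

    def fetch_day(day, device):
        """Pull every (query, page) row for one day on one device."""
        rows, start = [], 0
        while True:
            body = {
                "startDate": day.isoformat(),
                "endDate": day.isoformat(),
                "dimensions": ["query", "page"],
                "dimensionFilterGroups": [{"filters": [{
                    "dimension": "device",
                    "operator": "equals",
                    "expression": device,
                }]}],
                "rowLimit": ROW_LIMIT,
                "startRow": start,   # page through results in 5,000-row chunks
            }
            resp = service.searchanalytics().query(siteUrl=SITE, body=body).execute()
            page = resp.get("rows", [])
            rows.extend(page)
            if len(page) < ROW_LIMIT:
                return rows
            start += ROW_LIMIT

    day = date.today() - timedelta(days=90)    # Search Console's retention window
    while day < date.today():
        for device in DEVICES:
            rows = fetch_day(day, device)
            # hand `rows` off to storage here (see the storage sketch below)
        day += timedelta(days=1)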

How Is This Different From The Webmaster Tools UI, And What Do I Do With It?

First and most important, the Webmaster Tools UI doesn’t allow you to get keyword information that is linked to a particular page on a single sheet, while the API does.

Second, depending on what you’re looking for, there is simply too much information to manually download if you want to review your site trends at a granular level.

To glean your insights, you will need three things:

  1. Storage/Database. You can create a free one using MSSQL or MySQL on your desktop computer, but I recommend that every company consider investing in a central DB to collect and store information. (A minimal storage sketch follows this list.)
  2. Collection. A method to schedule and call the API. I suggest employing or contracting a developer if you’re not a technical person. Creating a simple database and scheduling the calls for a single API should not take more than a day for a competent developer.
  3. Visualization. A tool to visualize and analyze the data. Excel is nice until you have collected 120 days of data, which will easily put you over 1.04 million rows (the maximum Excel can handle). Programs like Tableau or Spotfire are ideal for analyzing the trends within high volumes of information. (I ended up putting 1.2 billion rows into Spotfire!)
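As a minimal illustration of the storage piece, the sketch below writes the rows returned by the earlier fetch_day() function into a table. It uses Python’s built-in sqlite3 purely as a stand-in for the MySQL or MSSQL database suggested above; the table name and layout are my own.

    import sqlite3

    conn = sqlite3.connect("search_analytics.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS search_analytics (
            day TEXT, device TEXT, query TEXT, page TEXT,
            clicks INTEGER, impressions INTEGER, ctr REAL, position REAL
        )
    """)

    def store_day(day, device, rows):
        """Insert one day/device batch of API rows into the table."""
        conn.executemany(
            "INSERT INTO search_analytics VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
            [(day.isoformat(), device, r["keys"][0], r["keys"][1],
              r["clicks"], r["impressions"], r["ctr"], r["position"])
             for r in rows],
        )
        conn.commit()

A call to store_day() slots into the device loop of the collection sketch above, one batch per day per device.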

With these elements in place, you are ready to dive in and find some gems that would otherwise go unnoticed in an ever-growing sea of data.

Finding Insights From The Search Analytics API

In the example below, some basic scripting was done to bucket queries into four primary categories, while filtering to only show queries appearing in position one that received no clicks:

[Image: Impressions by query category for position-one queries with no clicks]

“Navigational” refers to queries where the user was attempting to visit the site directly and either typed the URL into the search bar or slightly misspelled it (e.g., “ww.website.com” or “brand name”).

The chart above tells us one of two things is happening:

  1. We have paid search campaigns running that are absorbing those clicks.
  2. Competitors are advertising on our brand, and they are absorbing the clicks.

Ideally, Webmaster Tools should be linked to Google Analytics and Google AdWords, but when seeing something like this, it’s probably a good time to check out the paid search landscape.
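For reference, the kind of bucketing and filtering behind the chart above can be sketched in a few lines of pandas. The article only names the “Navigational” bucket, so the other category rules and the BRAND_TERMS list below are placeholders to adapt to your own site.

    import sqlite3
    import pandas as pd

    # BRAND_TERMS is a placeholder; the article does not name its other buckets.
    BRAND_TERMS = ["brand name", "website.com"]

    def bucket(query):
        q = query.lower()
        if any(term in q for term in BRAND_TERMS):
            return "Navigational"
        if q.startswith(("how", "what", "why", "where")):
            return "Informational"
        if any(word in q for word in ("buy", "price", "cheap", "deal")):
            return "Transactional"
        return "Other"

    df = pd.read_sql("SELECT * FROM search_analytics",
                     sqlite3.connect("search_analytics.db"))
    df["category"] = df["query"].map(bucket)

    # Queries shown in position one that received no clicks, per the chart above.
    no_clicks = df[(df["position"].round() == 1) & (df["clicks"] == 0)]
    print(no_clicks.groupby("category")["impressions"].sum())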

The trend of a particular query’s position can also be very helpful to look at. Doing so allows you to catch a glimpse of how your site is performing and whether Google is testing its organic listings:

[Image: Average position trend for a single query over four months]

The chart above shows the average position for a specific query over four months. As you can see, it holds a constant position of six for three months.

At the blue arrows, no changes were made to the site or page (including additional links), but it received eight days of top positions on all devices, returned to position six, then went through three more days of test positions. Our engagement did not see much improvement (except on tablet) in the first window:

[Image: CTR by device during the test windows]

With even worse results in the second window, our listing position was then tested in the opposite direction, indicated by the red line.
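If you want to surface test windows like these programmatically rather than by eyeballing charts, one simple approach (mine, not the article’s) is to flag days where a query’s average position departs from its trailing baseline. A rough sketch, reusing the df loaded earlier; the query string and the two-position threshold are placeholders worth tuning:

    # Daily average position for one query; "some query" is a placeholder.
    daily = (df[df["query"] == "some query"]
             .groupby("day")["position"].mean()
             .sort_index())

    # Flag days at least two positions away from the trailing 28-day median.
    baseline = daily.rolling(28, min_periods=7).median()
    test_days = daily[(daily - baseline).abs() >= 2]
    print(test_days)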

Another interesting level of depth to examine is all of the different pages that are being rotated in and out of Google’s listing for a specific query:

[Image: Landing pages rotated in and out of the listing for a single query over time]

Each color represents a different landing page for a single unbranded query. In general, the landing page shown in yellow stays in position one; however, there are very specific time frames (July 9–July 19) where the olive green landing page holds position one.

The sheer number of pages being served for the same query clearly points to two things that search marketers should consider:

  1. Although it’s best practice to focus on optimizing one page per keyword, Google has other thoughts about it. It will serve a variety of pages to see what provides the greatest experience to the consumer relative to the entire results page, not just your own listing.
  2. Because of data regulations and what is provided by Google and other search engines, we can’t be sure how much of what Google serves is based strictly on the user; however, the days of maintaining a universally consistent list position have long since passed.
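To find queries like this in your own data, a quick pandas pass (again building on the df loaded earlier) can count how many distinct pages Google has served per query, then break one query’s impressions out by page and day; “some query” remains a placeholder:

    # How many distinct landing pages has Google served for each query?
    rotation = df.groupby("query")["page"].nunique().sort_values(ascending=False)
    print(rotation.head(20))

    # For one query, break impressions out by page and day to see the rotation.
    one = df[df["query"] == "some query"]      # placeholder query
    share = one.pivot_table(index="day", columns="page",
                            values="impressions", aggfunc="sum")
    print(share.fillna(0))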

In summary, as Google becomes more sophisticated, so does the skill set required to understand how a website is performing in organic search — and, more important, why it’s performing that way.

At this point in time, Google Search runs on an ensemble of algorithms and machine learning techniques that make it very challenging even for Google engineers to explain why particular results appear for a given query.

Fortunately for us, Google enables us to explore that information through its Search Analytics API; it just takes more advanced tools and curious individuals to investigate it.

