Wednesday, February 28, 2018

SearchCap: Google expands featured snippets, voice search ranking study & Rand Fishkin moves on

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.


Study: 11 voice search ranking factors analyzed

Backlinko has done an extensive analysis of “voice search ranking factors” and identified 11 variables tied to appearing in Google Home results. The company examined 10,000 results delivered over the smart speaker.

What Backlinko found was consistent with what many others have been saying, but there were also a few surprises. For example, the study discounts, to some degree, the impact of both Schema and page authority.

Here’s a partial, paraphrased list of the ranking factors:

  • PageSpeed is a significant factor; voice search results typically come from faster-loading pages.
  • Google relies heavily on highly authoritative domains for results, but individual page authority matters much less.
  • Content that ranks well on the desktop tends to rank in voice search, though this may be correlation rather than causation.
  • Schema may not be a factor: only 36 percent of voice search results came from pages using Schema.
  • Roughly 41 percent of voice search results came from Featured Snippets.
  • Voice search results average about 29 words; however, Google tends to source those answers from long-form content.
  • HTTPS is critical.

Google has made page speed an explicit mobile ranking factor. Backlinko found that the page-load time for a voice result was almost 2X faster than traditional webpages. Not a surprise. What may be a surprise are the findings around Schema.

The company found that Schema was used on slightly more than a third of pages delivered over Google Home, somewhat more than in general results. Accordingly it discounted Schema as a voice search ranking factor:

Although voice search result pages tend to use Schema slightly more often than your average web page, the difference is not significant. Also, 63.6% of voice search results don’t use Schema at all.  Therefore, it’s unlikely that Schema has a direct impact on voice search rankings.

Below are Backlinko’s findings around Schema distribution in voice search results.

There are plenty of reasons to use Schema generally so this finding shouldn’t be seen as an argument against it. And many will question the validity of this finding. It may also be that Schema pages don’t appear more because they aren’t more prevalent and there are other important variables.

Among them, links matter for voice results as well. Domain authority was high but page authority was relatively low by comparison:

We discovered that the average Domain Rating of a voice search result was 76.8 . . . we found that the link authority of voice search result pages were significantly lower. In fact, the mean Page Rating of a voice search result was only 21.1.

Backlinko speculated that the voice algorithm was relying upon domain authority (over page authority) because that provided a higher level of confidence in the accuracy of results.

Long-form content was also correlated with voice results. "Google voice search results predominately come from pages with a high word count," the study asserts. In addition, "FAQ pages tend to perform particularly well in voice search." Keywords were somewhat less important: "only 1.71% of voice search results use the exact keyword in their title tag."

The company advises, “[D]on’t worry about creating individual pages that are each optimized around individual keywords. Instead, write in-depth content that can answer several different voice search queries on a single page.”

Finally, content that ranks well on the desktop appears to also rank well in voice results. This is logical. Nearly 75 percent of voice results on Google Home “came from a page ranking in the top 3 for that keyword.”

SEOs should review the post and do their own evaluations of the findings and recommendations.

While it’s not clear whether smart speakers will siphon off some query volume or merely be additive to the overall pie, at least two studies have shown that owners of Alexa and Google Home devices are spending somewhat less time with their smartphones.

Regardless, virtual assistants are emerging as an important consumer discovery tool and marketers need to take these platforms seriously and adapt accordingly. In Google’s case, the Assistant (which powers Google Home) is now available on 400 million devices.


Unit economics: The foundation of a good SEM campaign

A strong understanding of a business unit's economics is absolutely critical to any search engine marketing (SEM) campaign managed against non-brand key performance indicators (KPIs).

One would think any reasonably large and successful business would have a good handle on its unit economics, and that this knowledge would be shared down the chain of command to the mid and lower levels of the marketing team.

But time and time again, I have found this critical foundation is missing, miscommunicated, insufficient or so outdated as to be worse than worthless. "Worthless" in this case would mean the bad data does no actual harm. "Worse than worthless" means the wrong KPI goals are used, resulting in media waste and — more importantly — missed revenue and profit opportunity.

In other words, the company’s health is at actual risk because the marketing team and the business team aren’t on the same page.  In some cases, members of the marketing team may be working at cross-purposes, using incompatible KPIs and metrics.

Silos standing in the way of marketing AI?

I had the privilege of attending a lunch recently with members of the Direct Marketing Club of New York, the Interactive Advertising Bureau (IAB), and other marketers, including MediaMath CEO Joe Zawadzki.

Although we attended the lunch to discuss the power of artificial intelligence (AI) and machine learning to solve marketing problems, the consensus was that many brands aren't ready to empower decision-making with AI.

Their organizations were often so siloed that the inertia of departments and organization charts was a big factor slowing the adoption of AI and machine learning for optimizing marketing spend. A common theme across the table was that the winners and disrupters in many industries are the ones without any legacy departmental structure: Tesla, Dollar Shave Club, Casper, Purple and even Amazon.

How does this all relate to pay-per-click (PPC) search engine marketing campaigns? The influence of legacy structures affects your ability to grow as a marketer and business person, and it directly affects your ability to communicate rational questions up the chain of command (while following protocol), even if those communications might decrease your own departmental budget.

Rational questions for marketers to ask

Let’s use some common KPIs as examples of how things are often done now in SEM campaigns and how one might apply smarter business unit economics by asking some rational questions in the following scenarios:

PPC account (typically retail) managed by return on advertising spend (ROAS) with last click attribution

Rational questions to ask include:

  • Do we have any data that would predict which customers order more frequently?
  • Do we have any data on which variables in the PPC campaign tend to attract “new to file” customers vs. returning customers?
  • Do some categories of products have a significantly higher margin where a different ROAS KPI should be applied (so that the same dollar of spending generates more profit)?
  • Do certain products and their associated keywords result in higher lifetime customer value (LTV)? (The same question applies to geography, time of day, mobile vs desktop, etc.)
  • Should we deploy simple tests to understand the impact and attribution of other forms of paid and earned media (display, social, video/audio, etc.)? Any paid media that can be geo-targeted lends itself to such a test of incrementality. These kinds of tests often work better than attribution models that lack data points, and they are particularly important for search because another marketing touch-point often stimulates search behavior by the consumer. Therefore one can look not only at sales data but also at changes in brand search volume. For example, if you double display spend in Albany, Denver and Orlando, and both brand searches and sales rise in those cities relative to comparable markets, you've isolated the incremental effect (a minimal sketch of that analysis follows this list). Do the same with any paid media you want to test.
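For the display test described in that last bullet, a small amount of analysis is enough to read the results. Below is a minimal sketch, assuming you have weekly brand-search and sales figures for the test cities (where display spend was doubled) and for comparable control cities; every city name and number is a hypothetical placeholder.

```python
# Minimal geo-lift sketch for the display-spend test described above.
# All city names and figures are hypothetical placeholders.

# Weekly averages before and during the test: (brand searches, sales units).
test_cities = {            # display spend doubled in these markets
    "Albany":  {"pre": (1200, 310),  "during": (1490, 382)},
    "Denver":  {"pre": (5400, 1405), "during": (6480, 1660)},
    "Orlando": {"pre": (3100, 820),  "during": (3650, 958)},
}
control_cities = {         # spend left unchanged in these markets
    "Omaha":   {"pre": (1150, 298),  "during": (1185, 305)},
    "Tucson":  {"pre": (2900, 760),  "during": (2987, 779)},
}

def avg_change(cities, idx):
    """Average relative change (during vs. pre) for the metric at position idx."""
    changes = [
        (c["during"][idx] - c["pre"][idx]) / c["pre"][idx]
        for c in cities.values()
    ]
    return sum(changes) / len(changes)

for metric, idx in (("brand searches", 0), ("sales", 1)):
    test_lift = avg_change(test_cities, idx)
    control_lift = avg_change(control_cities, idx)
    # Incremental lift = movement in test markets beyond the background trend.
    print(f"{metric}: incremental lift of roughly {test_lift - control_lift:+.1%}")
```

If the test markets move well beyond the control markets on both brand searches and sales, the extra display spend is doing incremental work; if they merely track the controls, the attribution model was giving display too much credit.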

PPC account managed based on last click attribution around cost per action (CPA) or lead gen

Not all businesses make the sale online, so marketing teams run campaigns to generate leads and are given a cost-per-lead or CPA target to hit. Rational questions to ask include:

  • Are all leads we see equal? This issue goes beyond the "Glengarry Glen Ross" level of good leads and bad leads, encompassing the lead-to-sale conversion rate, the immediate value of the sale and lifetime customer value. Part of that LTV discussion could revolve around churn rates (think cell phone plans and subscriptions of all types, including product and services subscriptions).
  • Rather than the average cost per lead or CPA, can we be more nuanced in our optimization?

Fixed budget PPC account

Rational questions to ask include:

  • What's the marginal contribution to the business originating from extra search, social or display media? Despite nearly all online media being auctioned off in near real-time, many larger organizations like to fix their spend by channel or category. This often isn't the right strategy. To have a conversation about where the media dollars should be allocated, estimate the marginal cost of the next KPI unit in search, social and display (a minimal sketch of that calculation follows this list). Yes, you'll need to address all the intricacies of the interaction effects between other media and search, but PPC search is very inelastic (small changes in bidding often don't result in position and volume changes), whereas in display, depending on whether it's retargeted or otherwise targeted, the ability to "buy your KPI" may be easier.
  • Should we just add budget or take from another channel when we find opportunities? If you could buy dollar bills for 90 cents (including the effort to do the transaction), would you cap your budget? Of course not. Neither should any business. Sometimes, when the KPI includes an LTV factor, there may be cash flow constraints that also carry a time value of money (it may take you a year to get your one dollar back), but otherwise, the more "ninety-cent dollar bills," the better!
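To make the "ninety-cent dollar" logic concrete, here is a minimal sketch of the marginal-cost estimate mentioned above: given spend and conversions observed at a few successive bid levels for one channel, it prices each incremental conversion so you can see where the channel crosses break-even. The observations and the per-conversion value are hypothetical assumptions, not benchmarks.

```python
# Hypothetical weekly observations for one channel at increasing bid levels:
# (spend, conversions). Replace with your own tested data points.
observations = [
    (10_000, 220),
    (12_500, 265),
    (15_500, 300),
    (19_000, 328),
]

value_per_conversion = 95.0  # assumed average profit contribution per conversion

prev_spend, prev_conv = observations[0]
for spend, conv in observations[1:]:
    # Marginal CPA: extra spend required for each extra conversion at this level.
    marginal_cpa = (spend - prev_spend) / (conv - prev_conv)
    verdict = "still worth buying" if marginal_cpa < value_per_conversion else "past break-even"
    print(f"${prev_spend:,} -> ${spend:,}: marginal CPA ${marginal_cpa:,.0f} ({verdict})")
    prev_spend, prev_conv = spend, conv
```

Run the same arithmetic for search, social and display, and the budget conversation shifts from last quarter's allocation to which channel is still selling ninety-cent dollars.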

There’s probably another column that could be written on the nuances of applying unit economics to marketing, but this should give you a solid start toward a good SEM campaign.


Multifaceted featured snippets begin rolling out in Google search results

Google has been rolling out many new search features over the past few months related to images, featured snippets, and the knowledge graph. Today the search giant released another feature called “multifaceted featured snippets.”

Multifaceted featured snippets will be surfaced for queries that are broad enough to allow for more than one interpretation of what was submitted. In these instances, the SERP returned will include more than one featured snippet: the original query is rewritten as the questions the algorithm assumes the user may have intended, and the results displayed in the multifaceted snippet reflect those new questions.

From the announcement:

There are several types of nuanced queries where showing more comprehensive results could be helpful. We’re starting first with “multi-intent” queries, which are queries that have several potential intentions or purposes associated. The query “tooth pain after a filling,” for example, could be interpreted as “why does my tooth still hurt after a filling?” or “how long should a tooth hurt after a filling?”

For example:

Google Multifaceted Featured Snippet

Multifaceted Featured Snippets vs. Multi-Perspective Answers

Back in December, Bing began rolling out AI-powered multi-perspective answers as part of its “Intelligent Search” set of new features, which includes Intelligent Answers, Intelligent Image Search and Conversational Search. Multi-perspective answers are just one of the “Intelligent Answers” features that has been live since the rollout. These results surface two (or more) authoritative sources on a topic, and will typically include differing perspectives/answers to the query.

Bing leverages its deep recurrent neural network models to determine similarity and sentiment among authoritative sources, and extracts the multiple viewpoints related to a topic — providing the most relevant set of multi-perspective answers (covered in more detail here).

Bing – Multi-perspective Intelligent Answers

Google’s multifaceted featured snippets may appear not too dissimilar from Bing’s multi-perspective answers, in that they also provide multiple rich results for a single query, but they are instead based on the presumed multiple intentions of a query (resulting in both multiple queries and results) vs. multiple viewpoints resulting from a single query. With these types of broad queries, many interpretations of what the user is actually asking can exist.

Multifaceted snippets aim to provide a more comprehensive and actionable set of results for these multi-intent query scenarios. They differ from multi-perspective intelligent answers in that they presume the user may be asking a different question altogether, and they surface responses for each of the queries the algorithm assumes the user may have actually intended, as the screenshot below demonstrates:

Multifaceted Featured Snippet

Google plans to expand multifaceted featured snippets throughout 2018 to include other nuanced query types — beyond those that could have multiple intentions — and lists guidance-seeking queries as one example.

From the post:

“For example, guidance-seeking queries like “is it worth fixing my foundation?” have several components that could be important, such as cost, duration, methods and financing. We’ll continue to experiment with multifaceted featured snippets over this year to expand coverage.”

With both Google and Bing having fully adopted deep learning methods and using artificial neural networks to drive search advancements, we can expect to see a steady stream of changes in search results enhancements and improved information discovery.

As always, Google encourages users to submit feedback on these new search features as they encounter them in the SERPs. Read Google's full announcement here.


Rand Fishkin leaves Moz, announces a new start-up

Rand Fishkin, the co-founder of Moz, announced his new company after officially stepping away from his day-to-day role at Moz yesterday. He is starting SparkToro, a technology platform in the influencer and audience intelligence marketing space.

Fishkin started the well-known SEO platform over a decade ago with his mother, Gillian. In fact, he dropped out of college about 17 years ago to build the company now known as Moz. Last July we learned he was leaving Moz after stepping down as CEO back in 2014.

Fishkin described his departure from Moz as a four out of 10 on a scale of zero to 10 where zero is “fired and escorted out of the building by security” and where 10 is “left entirely of his own accord on wonderful terms.” He wrote:

That makes today a hard one, cognitively and emotionally. I have a lot of sadness, a heap of regrets, and a smattering of resentment too. But I am, deeply, deeply thankful to all the people who supported me and Moz over the last two decades. The experience of building a company like this, of helping to change and mature an industry, of learning so much about entrepreneurship, marketing, and myself has been an honor and a privilege.

Fishkin said he still owns about 24 percent of Moz's outstanding shares, remains chairperson of Moz's board of directors and is still the company's single largest shareholder. Because of his stake and the people there, Fishkin says he wants to see Moz succeed and continue to do well.

You can read more about Fishkin’s thoughts on leaving and his future plans on his new blog.


Tuesday, February 27, 2018

SearchCap: Google right to be forgotten, Google Word Coach game & German ruling

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.


How Amazon dominates the competitive search landscape

This report from Adthena reveals the extent to which Amazon has been capturing text ad click share in major retail categories.

Analysis of the consumer electronics and department store retail categories in the US and UK suggests that Amazon's paid search ad spend is behind the e-commerce giant's continued market growth, with the scale and impact of its paid search investments eclipsing those of its closest rivals.

Featured in this report:

  • What is Amazon’s share of ad spend in US/UK markets?
  • How much click share does this win them?
  • What are the three key factors which contribute to Amazon’s dominant search performance?
  • What are the tactics and strategies retailers can use to get more click share?

Visit Digital Marketing Depot to download “How Amazon Dominates the Competitive Search Landscape.”


Can you predict what the future holds for your inbound links?

Almost five years ago I wrote an article about predicting a site’s future and using your expectation to decide whether you should pursue links on that site today. Much has changed in the search engine optimization (SEO) landscape since then so I decided to expand and update my original article.

Sometimes, what’s old is old

It’s interesting to run into sites we’ve worked with in the past and compare their previous and current metrics. Lots of things pop up like:

  • Old links are still live, but the host page is now full of new links that weren't there before.
  • Pages that once ranked well no longer do so.
  • Links that weren't in an article originally have since been added.
  • And sometimes everything is the same, if not better!

A look into the past

It’s easy to determine what a site looked like in the past and compare it to the current site by using Archive.org.
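The Wayback Machine also exposes a public availability endpoint, so pulling up the closest archived snapshot of a prospect page can be scripted before you review it by hand. A minimal sketch, using a placeholder domain:

```python
import requests

def closest_snapshot(url, timestamp="20150101"):
    """Return the Wayback Machine snapshot of `url` closest to `timestamp`, or None."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url, "timestamp": timestamp},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("archived_snapshots", {}).get("closest")

# Placeholder domain: compare what a prospect site looked like a few years ago.
snap = closest_snapshot("example.com", "20130601")
if snap:
    print(f"Archived copy from {snap['timestamp']}: {snap['url']}")
else:
    print("No archived snapshot found.")
```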

You may notice a lot of changes, such as good and bad redesigns, deleted links and entire articles removed. Occasionally you may even find whole sites deindexed in Google.

Due diligence

When starting a link campaign, it is important to go through a number of steps or perform “due diligence” using checklists and guidelines you’ve established.

It may be impossible to check every page but try to do as much as possible so nothing is overlooked. Here are some issues to check for:

  • Is the site indexed in Google?
  • Are there any spammy hacks on the site that haven’t been fixed?
  • Is there contact info on the site?
  • Does the site rank for its brand and major keywords?
  • If you’re placing a link in an existing piece of content, does that page rank for its title?
  • Is the site free from links and ads for gambling, payday loans, drugs, and porn?
  • Have you checked to make sure the content is original and not scraped or duplicated?
  • And always, always…does it look like your link would be a natural fit and get clicked on here?

There's more to check depending on the industry and the individual website, but notice it's pretty uncomplicated, common-sense stuff; a couple of these checks can even be scripted, as sketched below.
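As a first pass before the manual review, the sketch below is an assumption-laden example rather than a complete audit: it fetches a prospect's homepage over HTTPS and flags a missing contact link or a few obviously risky terms. The domain and the term list are placeholders.

```python
import requests

RISKY_TERMS = ("casino", "payday loan", "viagra")  # illustrative spam markers, not exhaustive

def quick_checks(domain):
    """Run a few scriptable due-diligence checks against a prospect site (sketch only)."""
    results = {}
    try:
        resp = requests.get(f"https://{domain}", timeout=10)
        html = resp.text.lower()
        results["loads over https"] = resp.ok
        results["mentions a contact page"] = "contact" in html
        results["risky terms on homepage"] = [t for t in RISKY_TERMS if t in html]
    except requests.RequestException as exc:
        results["error"] = str(exc)
    return results

print(quick_checks("example.com"))  # placeholder domain
```

Everything else on the checklist (rankings, indexation, whether your link would look natural) still needs human eyes.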

So how in the world can you predict what’s going to happen after you finish working on the site?

How do you know the webmaster won’t fill the site up with spam, sell the domain, let it expire or sell the site to a private blog network?

There tend to be signs, both good and bad.  Let’s start with the bad signs.

Bad signs

Here are a few red flags to look for when negotiating for link placement:

  • The webmaster gives you a list of 50 other “great” sites he has.  While some people just own a lot of sites, it is doubtful the other 50 will be as good as the one you sought out.  Look carefully.
  • The webmaster asks if you mind if he gives your information to “friends” who own similar websites. Watch for heavy interlinking with the friend sites — they may possibly even be owned by the same person who’s just using aliases.
  • Traffic on a site has dipped dramatically in the past, even if it's good now. If the dip was five years ago and everything has been good since, it should be OK; but if you see lots of dips, especially in the past few years, that may be a sign a new drop will happen soon.
  • They openly advertise that they sell text links.  Big red flag here; you do not want to work with a site that is basically asking for a Google penalty.

Good signs

Now let's look at a couple of factors that distinguish sites where links live for years and everything still looks great.

  • Traffic is fairly steady (or continues to increase) through the years with no major dips.
  • Articles are well-written, guest or sponsored posts are identified as such and don’t appear to be full of someone else’s links.

Notice the good list is shorter than the bad list. That’s because you never know what will happen. Is everyone going to eventually get hit in some way since the algorithm changes constantly? Maybe.

Disavow madness

Don’t forget some people disavow like crazy, and they don’t just disavow single webpages — they disavow entire domains, because it’s easier.

I know of site owners who want to disavow upwards of 75 percent of their links when they don't even have a penalty and haven't been negatively impacted by an algorithmic change!

Honestly, when it comes to links, anything can happen. You never know when a site will be penalized, and it’s possible for them to get caught in a wide net and not deserve it. I’ve seen unfair penalties many times and seen sites suddenly drop in rankings and never get back to where they once were, even if they did nothing wrong.

You can’t predict what will happen in link building or SEO. You can make some very educated guesses but change is the only thing you can really guarantee.


An easy way to see if Google thinks your webpages are keyword relevant

We all want to rank well, but there are times when it seems nearly impossible to do so.

There can be many causes for rankings shortfalls, and as I pointed out recently, sometimes it seems Google is just not interested in ranking businesses like yours for a target query.

That can be frustrating for anyone, which is why my previous article suggested a way to determine whether your target keyword phrase is a good fit for the terms you want to rank for.

Sometimes it's better to pursue keyword phrases you know you have a better chance of ranking for than the ones you most want to rank for.

In today’s post, I’m going to talk about factors Google may use to determine if a site is reasonably relevant for the keywords it targets.

Creating a webpage is not enough

Just because you create webpages targeting a certain keyword phrase or in a specific topic area does not mean you will rank for those terms.  In short, we don’t know if Google is “buying” it.

Let’s set the context here:

  1. You want to rank for a specific search phrase but currently do not.
  2. Google is ranking your competitors' content in the top ten of the search engine results pages (SERPs), but not yours.
  3. You want to know if Google thinks your webpage is a potential (relevant) fit.

Now that we have an outline, let’s dig in.

Ranking analysis

One of the best ways to see the ranking potential for a webpage is to see what it already ranks for.

This sounds simple and obvious, but I'm going to take it a bit deeper than simply looking at the top keywords for your site.

It’s worth digging a bit deeper to see what insights we can get, not only by looking at what we rank for, but what the competition ranks for, and the makeup of the words in those phrases.

The first step is to pull the phrases you currently rank for.

Getting Search Term Data from SEMRush

Once you have the basic ranking data, the next step is to manipulate the data to find additional keyword sets.

To do this, we're going to focus on the phrases our site ranks for (in the top ten) and then count all the individual words included in those keyword phrases.
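Here is a minimal sketch of that word count, assuming you've exported keyword and position data from your rank-tracking tool to CSV files; the file names, column names and example words are placeholders.

```python
import csv
from collections import Counter

def word_counts(csv_path, max_position=10):
    """Count individual words across every keyword phrase ranking in the top ten."""
    counts = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):  # assumes columns named "Keyword" and "Position"
            if int(row["Position"]) <= max_position:
                counts.update(row["Keyword"].lower().split())
    return counts

# Placeholder exports; run the same analysis on each competitor's file.
ours = word_counts("our_rankings.csv")
competitor = word_counts("competitor_rankings.csv")

for word in ("manufacturer's", "blue", "widgets"):
    print(f"{word}: us {ours[word]} vs. competitor {competitor[word]}")
```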

Let’s say you want to rank for “manufacturer’s blue widgets”. The end result should look something like this:

Example Search Query Data for a Site

Now, repeat the process for your competitors ranking for the target phrase “manufacturer’s blue widgets”. When you’re done, it should look something like this:

Search Query Data That Shows That You Have a Long Way to Go

At this point, we have a lot of information about what Google does (and doesn't) think your site is relevant for.

Do you show up for a number of related phrases? If yes, then ranking for your target keyword is probably within reach, even if you don't currently show up in the top 50 for it. If you're ranking for related words, Google at least sees you as relevant.

Scenario #1: we need help

But there's a problem with the data shown above. Notice how the site on the left is ranking only for keywords that include the company's brand name.

This is not good since we are trying to rank for the phrase “manufacturer’s blue widgets” and that term is nowhere to be found.  This tells us Google doesn’t equate this website with our target phrase.  Ouch!

If you don't see your target keywords showing in the search results, that may be a sign that ranking is going to be tough. You either have major content problems related to the target keyword or a general authority problem related to the topic area.

Scenario #2: much better

Let’s take a look at a scenario where our prospects are better:

Search Query Data That Shows That Your Prospects are Good

The data shows we’re in much better shape, and our prospects are reasonably good.

Tuning and tweaking

If you find you rank for a number of related terms, your next steps are pretty straightforward: improve the content of the target page and deepen the supporting content on other pages of your site.

You may also need to do some public relations (PR) or related link building that supports your stance as a relevant resource.  The scope of this effort is along the lines of tuning and tweaking, not a complete overhaul.

When to do an overhaul

If you didn’t rank for much in the way of related terms, the steps you need to take are basically the same as tuning and tweaking but with more intensity.  You need an overhaul because Google just doesn’t look at your pages as being relevant to the topic/phrase.

This means the program is going to cost significantly more to execute, and it will take longer to show results. If you don’t have the budget or patience for that, you may need to consider a different keyword target.

Some cost perspective

Let’s put this in perspective and see what each scenario may cost you.

In the first scenario where you didn’t rank for any relevant terms, the work involved to rank may cost $250K, and you might have to campaign for ten to twelve months to see the results you’re looking for.

In the second scenario, you already rank for relevant terms, so your cost to keep ranking or to rank higher might be $50K, and it might take three to four months to achieve results.

The analysis is actually fairly simple, but the benefits are high. When Google relates keywords to your webpages, your search engine optimization (SEO) costs should be lower.

Focusing on what you already rank for allows you to budget more accurately for your campaigns, and to focus on tactics to help you achieve your goals.


German court: Google has no ‘duty to inspect’ websites for illegal content before displaying

A German court has ruled that Google is not required to pre-screen websites for defamation before displaying them in search results. This ruling comes from the German Federal Court of Justice, the country’s highest court.

The plaintiffs in the case had sought to force Google to filter out websites that displayed allegedly defamatory content about them in an IT-related online discussion forum. They also sought to collect damages from Google for the presentation of those sites in search results, arguing that Google had a duty to screen and not to display the defamatory material to others online.

A ruling in favor of the litigants would have put a huge burden on Google to essentially review all website content in Germany for any potential violations before displaying it in search results. The German court, however, recognized the practical impossibility of this and held that a duty to take action is triggered only if Google is notified “of a clearly recognizable violation of individuals’ rights.”

The court said, "Instituting a general duty to inspect the content would seriously call into question the business model of search engines, which is approved by lawmakers and wanted by society," according to a Reuters translation. "Without the help of such search engines it would be impossible for individuals to get meaningful use out of the internet due to the unmanageable flood of data it contains."

The case apparently arose under the Right to Be Forgotten (RTBF). I haven't seen the underlying factual and legal discussion, so I can't comment on the full implications of the decision. However, it would appear to clearly affirm the broader proposition that Google can't be held liable for illegal content in search results without first being notified of its disputed or potentially illegal nature.

Yesterday Google published a report on three years’ worth of RTBF requests and who’s making them.


Law and reputation firms generate 21% of Right to Be Forgotten delistings, says Google

Google says that there are “tens of thousands” of Right to Be Forgotten (RTBF) requests filed each month in Europe. In a new blog post, the company explains that it’s updating its “Transparency Report,” which details RTBF requests, to include new categories of information.

In addition to reporting aggregate data on requests, their countries of origin and percentages granted, Google says it will now reveal:

  • The type of individual/entity making the request: private vs. non-private (government entity, corporations, NGOs)
  • What sort of content is associated with the request: personal information, professional information, criminal activity
  • Whether the site on which the link appears is a directory site, news site, social media or other.
  • Delisting rate by content category

Google is simultaneously releasing a report that provides more depth and detail on the nature of delisting requests, summarizing three years of data since RTBF first came into being in May 2014. The high-level findings are provided in an infographic in the blog post.

In the report, Google says there are “two dominant intents for RTBF delisting requests.” Roughly a third (33 percent) of requests are related to personal information on social media and directory sites. Another 20 percent relate to news and government websites that contain “a requester’s legal history.” The rest are diverse and span a range of content types and objectives.

Top hostnames for requested delistings

Source: Google “Three years of the Right to be Forgotten” (2018)

Google said that more than half (51 percent) of requests come from three countries: France, Germany and the UK. Overall, only 43 percent of requests are granted.

Private individuals make up most of those requesting delisting — Google says 85 percent in the report and 89 percent on the infographic. Non-private persons (entities, government officials) make up the rest. Minors account for 5 percent of the private individual requester group.

The report marks the first time Google has provided significant detail on its RTBF evaluation process. Interestingly, the company said there's no automation involved. Below are the major criteria and considerations, which I've paraphrased for length:

  1. The validity of the request . . . and the requester’s connection to an EU/EEA country.
  2. The identity of the requester . . . to assess whether the requester is a minor, politician, professional, or public figure.
  3. The content referenced by the URL: is it of interest to the public; how “sensitive” is the content and did the person requesting delisting consent to it being made public?
  4. The source: Google strongly implies that government and news sites are weighted more heavily (against delisting) compared with “a blog or forum.”

Breakdown of those requesting delisting

Source: Google “Three years of the Right to be Forgotten” (2018)

Google also reiterated and clarified the scope of delisting, if granted:

Delistings occur on result pages for queries containing a requester's name on (1) Google's European country search services; and (2) all country search services, including google.com, for queries performed from geolocations that match the requester's country. However, in 2015 the French data privacy regulator CNIL ordered Google to extend the scope of delisted URLs globally, not just within Europe. Google appealed this decision, and the matter is now under consideration by the Court of Justice of the European Union.

One of the more interesting disclosures in the report is that there is a category of high-volume RTBF requesters. Google reports that the top 1000 requesters “generated 14.6 percent of requests and 20.8 percent of delistings. These mostly included law firms and reputation management agencies, as well as some requesters with a sizable online presence.”

Categories of information associated with delisting requests

Source: Google “Three years of the Right to be Forgotten” (2018)

The full report is available for download here.


Monday, February 26, 2018

Google Word Coach, a fun word game in the search results

Google has added a new feature named Google Word Coach to the Google dictionary and translate boxes within web search for non-English searchers.

When you do a search that triggers a dictionary or translate box, Google may show you this Word Coach that helps you “expand English-language vocabulary in a fun and engaging way,” a Google spokesperson told us.

Susanta Sahoo shared a picture of the feature with us on Twitter.

This launched in the Google search results a couple of weeks ago.

Here is a statement from a Google spokesperson on this feature:

Google Word Coach is a game designed to help expand English-language vocabulary in a fun and engaging way. It appears under our dictionary and translate boxes or when someone searches for “Google Word Coach.” It launched this month in non-English speaking countries and also in India. It may come to other countries and languages in the future.


SearchCap: Google Shopping EU changes, page speed scorecard & Search Console bug

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.


Keeping your edge sharp with AdWords


These are days of plenty for advertisers, with Google rolling out a new AdWords interface, AMP for ads and landing pages and Purchases on Google, updating the AdWords Keyword Planner, and more.

But factor in the increasing prevalence of machine learning and other forms of artificial intelligence that are now fundamental components of AdWords, and keeping up with the abundance of goodies can be both time-consuming and occasionally overwhelming.

Never fear – if you're looking to keep sharp with cutting-edge strategies and tactics for getting the most out of your AdWords campaigns, check out Brad Geddes' Advanced AdWords workshop, running the day before SMX West on March 12 in San Jose. This one-day, hands-on workshop provides a deep immersion into everything you need to know to take advantage of all that AdWords has to offer today.

Want to know more about what to expect from the workshop? Click over to Five questions for AdWords expert Brad Geddes where he explains why this full-day master class is a must-do experience for all serious paid search practitioners.


Just 2 weeks until SMX West! Register now & save $100

We’re just two weeks away from Search Engine Land’s SMX® West, the premier educational experience for search marketers obsessed with SEO, SEM, and online retail. Now is the time. Secure your pass and join your search marketing community in sunny San Jose!

What’s in it for you?

We’ll equip you with actionable tactics you can implement immediately to drive your search marketing results. Three days of dynamic and authoritative sessions, inspirational keynotes, hands-on clinics, vendor demos, and exceptional networking are sure to satisfy your SEO and SEM cravings.

Choose your ideal pass

Our goal is to provide you the best conference experience imaginable – but we know that “best” can mean different things to different people. Choose from a variety of pass options to suit your specific goals and budget:

  • Option 1: All Access Pass – The Ultimate SMX Experience
    Enjoy full access to all that SMX offers: 50+ sessions, delicious breakfasts, lunches, and snacks, free WiFi, the works. See the full agenda here. You’ll also connect with the search marketing community through our exclusive networking events, including the legendary SMX After Dark bash at the Tech Museum of Innovation. Your All Access pass also includes the entire SMX Expo Hall, Solution, Classroom, and Theater presentations featuring Bruce Clay, Inc. and Learn with Google. Can’t get away from the office for the full show? One Day All Access Passes are also available!
  • Option 2: All Access + Workshop Pass – Expand Your SMX Experience
    Amplify your time by adding a full-day deep dive workshop to your agenda! Unlock all of the core conference benefits, plus your choice of an outstanding pre-conference workshop, hosted Monday, March 12: Advanced SEO Training, Advanced AdWords Training, Master Social Media Advertising, Maximizing Mobile Potential, Hardcore Technical SEO Tactics & Techniques, or Advanced Conversion Rate Optimization & A/B Testing. Interested in only attending a workshop? Register for the Workshop Pass!
  • Option 3: Boot Camp Pass – Perfect For Beginners
    Join us for the SMX Boot Camp, a one day tour through the fundamentals of keyword research, copywriting, SEO-friendly web design, paid search, and link building. This track is perfect for those new to the industry, or anyone looking to brush up on their basics. Register for a one-day Boot Camp Pass or attend as part of your All Access pass!
  • Option 4: FREE Expo+ Pass – Streamlined & Affordable
    Yes, seriously, it’s free as long as you register before March 13! On-site prices are $49. What are you waiting for? Your Expo+ pass lets you access the Learn With Google classroom sessions as well as the Solutions track and SMX Theater presentations. Visit the Expo Hall for live demos of services and solutions, attend the keynote with Microsoft and Bing, and enjoy cocktails and snacks at the Expo Hall Reception. See the Expo+ Agenda here.

The choice to attend SMX West is easy. It’s selecting the perfect pass option that’s the trick! However you choose to join us, register now and prepare to be blown away by quality content, career-defining networking opportunities, and thoughtful conveniences that make SMX a rewarding experience. We guarantee it.

T-minus two weeks! Hope to see you in San Jose!


In response to EU antitrust ruling, Google Shopping now shows ads from competing shopping engines

A couple of weeks ago, we reported that links to third-party comparison shopping engines (CSEs) had been spotted in Google Shopping results in the UK. Now ads for products promoted by competing CSEs are showing up in the Google Shopping carousel in the UK (and possibly other EU countries).

The inclusion of ads from competing CSEs is part of Google’s response to an antitrust ruling and massive fine issued by the European Commission last year. The example below, shared by a webmaster who goes by Gabs, shows an ad by ShoppingFM in the second ad slot in the Shopping carousel. The other product listing ads are by Google’s own shopping engine.

Google is appealing the European Commission’s antitrust ruling, but in the meantime, the company has established Google Shopping as a distinct business in the EU. In a new structure, that business unit then bids against other CSEs in the ad auction to give the competing engines “equal treatment” as mandated in the ruling. In the US and other non-EU markets, individual advertisers compete within the Google Shopping auction.

A merchant’s visibility on Google Shopping in EU markets will now depend on how well it performs for Google Shopping and any other CSEs on which it purchases ads. This is in addition to its bid and the CSE’s evaluation of its expected performance for a given query. Because Google is bidding against other players, it will likely bid only as high as is profitable for it as a marketer. The ads were supposed to start showing last fall, but have been slow to roll out.

Unlike the previous sighting of a link shown at the end of the ad carousel, the screenshot above reflects the format Google had put forward last September (shown below) and has apparently now settled upon.


6 smart e-commerce lessons to boost local business

E-commerce and local search might seem to be mutually exclusive functions. Local search is typically associated with store locations and driving offline purchases, while e-commerce usually involves online transactions.

But recently the crowded e-commerce space has led online-only stores to encroach on the turf of local stores and services.

Amazon opened a bookstore in Seattle and acquired Whole Foods. Warby Parker, the eyewear company, opened its first store in 2013 and now has 61 nationwide. Bonobos, Blue Nile and others are likewise opening retail stores.

The reason for the bleed-over is that online brands have realized the cumulative benefit storefronts bring to both marketing and sales, boosting online sales while also adding sales from stores.

That's a lesson that also applies in reverse to those whose primary business is a brick-and-mortar storefront. Many of the components of e-commerce are increasingly relevant as consumers use more devices and more media to research purchases they plan to make offline. Omnichannel customers now expect to start their search for goods and services in one place and continue it seamlessly in another, whether on a different device, media outlet or store location.

Here are six lessons from e-commerce you should use with your online content to provide a lift to your offline business.

1. Your website is your storefront, too

Sometimes it is hard for brick-and-mortar stores to dedicate the same investment to their website as they do their store.

E-commerce sites don't have this conflict. Yet based on consumer behavior statistics, local businesses must realize that they likely get more visits to their website than to their store, and that most in-store customers will visit their website first.

A 2016 Google and Purchased Digital Diary survey found that 58% of consumers visited a retailer’s website or app before making an in-store purchase.

Local Search Association’s (LSA, my employer) own survey of 8,000 consumers likewise found that a company website is the most used channel by consumers who are ready to purchase. Twenty-seven percent turn to a company website, even more than search engines at 24%.

On the flip side, consumers are turned off by bad online experiences.

A survey conducted by Vistaprint in 2016 shows:

45% of consumers are unlikely to buy from a business with a poorly designed website. This could be anything from a poor mobile experience to broken links, a confusing user experience or something similar. More compelling still, 34% were unlikely to buy from a business if they didn’t have a website.

2016 Vistaprint survey of 2000 adults

Realize that your website is an extension of your store and invest in a good online experience for your customers.

2. Make sure online information is accurate and up to date

You wouldn’t keep a Christmas sale sign up after the New Year in your store. Don’t do that online either.

While most customers are not likely to react as strongly as the Harvard professor who was charged $4 more than the outdated menu prices he saw online (he threatened legal action and notified authorities), inaccurate information can still impact business.

There are plenty of reactions short of his that are still not desirable.

My wife recently purchased a vacuum accessory, only to be notified later that it was back-ordered and then, a second time, that it was indefinitely out of stock. I can personally attest to the veracity of the phrase "unhappy wife, unhappy life" and do not recommend you tick her off with inaccurate online info.

On the flip side, there are a lot of good reasons to keep online information just as timely and accurate as the information you have in the store.

I was shopping for necklace charms for my daughters and looked at James Avery, an artisan jeweler based in Texas. Their online inventory by store was very helpful as I looked at some vintage letter charms. I was able to locate a store that didn’t just carry the charms, but the exact letters I needed.

They were also careful to let me know when they didn't have information I needed. James Avery partners with and sells its jewelry through Dillard's department stores; however, it appears the company cannot track inventory at those locations the way it can in its own retail stores. Online, they make it clear that inventory searches are not available for the Dillard's locations.

Informing the customer when information is not available is much better than frustrating them with vague answers and wasting their time.

3. Don’t hide the ball from online shoppers

The shell game played on the streets of New York City (NYC) might be entertaining to watch, but it is not effective marketing.

A study by Market Track found that 80% of those surveyed compared prices online before buying offline. Business owners might use that as justification for not displaying prices online out of fear that they will be undercut by a competitor.

The problem with that mindset is that price is a core decision-making factor but not the only one. If a shopper is looking for the price and you fail to provide it, you are likely taken out of consideration. That was my personal experience.

I’m in the market for a new mattress. As I’m comparing products, I’m looking at comfort, features and quality. Even though I’ve visited several stores, I can’t recall all the details of each mattress I’m considering. One store doesn’t list their prices online and, even though I’m not trying to figure out which mattress is the cheapest, I do need price to assess overall value.  So that store is out.

4. Provide a full shopping experience online

It’s not just pricing information that isn’t always disclosed.

Some feel that providing only basic information online will drive store traffic where customers can get the “full experience”. I’m certainly not underestimating the value of in-person customer visits, but today’s consumer wants more, not less, information online.

LSA research found that 63% of consumers research a product or service online 50% of the time before making a purchase at a store. But many consumers research a product or service online even more frequently. Forty-six percent said they do online research 75% of the time before in-store purchases.

As an example, online mattress companies might appear to be at a huge disadvantage to local stores, but they are competing effectively by providing rich amounts of information to compensate for the inability to test the mattresses in person.

Effective use of video, individualized for each mattress and highlighting descriptions, features and comparison data, comes close to mimicking a conversation with a sales rep. Charts, comfort ratings and reviews also help decision-making.

Brick-and-mortar stores must match these online experiences just to stay even before they can realize their in-store product advantage.

5. In-store product interaction still maintains an advantage

One thing that online stores cannot duplicate is the ability to touch, feel, smell and get a truly interactive experience with a product. They may compensate by offering extended return policies and lenient trial programs, but each of these requires a greater commitment than a store visit.

LSA recently reported on a study that found the ability to interact with products was the top reason for most consumers to shop in-store. The only group for which this wasn’t true was Gen Z, 72% of whom said their top reason was to avoid shipping costs.

Keep in mind this group has the lowest disposable income and likely spends mostly on small item purchases.

With more product research being done online, in-store visits to interact with products are likely made at the tail end of the purchase funnel. So these store visitors should be considered high-conversion leads.

There are ways to combat the concern about showrooming. That consumer is still ready to buy: give them reasons to buy in-store by highlighting price-match policies, easier returns without shipping costs and the immediate gratification of buying it now.

6. Don’t forget to pay attention to third-party listings

Brick-and-mortar stores extend their storefronts using third-party sites such as Google Shopping and Amazon. Because these are interactive sales channels driven by APIs, it's easy to understand the importance of keeping information such as price and inventory up to date.

But listings such as Google knowledge graphs and Yelp profiles are often relegated to the “set it and forget it” mode. After all, isn’t all of that information static?

That’s a mistake that many make. Google and Purchased Digital Diary’s 2016 survey found that “37% of consumers visited a non-retailer website or app before an in-store purchase and 31% used an online map” where consumers commonly view the store profile.

While not a substitute for a local business website, a third party does operate as a proxy in many cases. So while the information on those listings can be relatively static, it would be more effective if there were some updates.

For example, add holiday hours or special events such as President’s Day sales.

Update listings with new lines of products or highlight specific menu items. Upload new photos showing store renovations or redesigns. And conspicuously post any variation from your store’s status quo such as being closed for a private party. This is all information that customers searching for your business want and need. And failing to check these third-party listings can be costly.

My colleague Greg Sterling recently wrote about his experience with furniture shopping. It seems a Google map listing indicated that a particular Macy’s store had an in-store furniture gallery.

Armed with the sales catalog and intentions of buying $1,000+ of furniture, Greg headed to the store. It turns out they did not sell furniture at that location. Greg didn't have time to visit another store before the sale expired, and for Macy's, it was an opportunity lost.

Closing thoughts

Local business owners spend much of their time on site, so it is natural to focus on the customer's experience as primarily in-store. But today's consumer often arrives at the store more informed and closer to making a purchase decision than ever before. Much window shopping is now done on a screen.

In today’s cross-device, multi-media, omnichannel market, it pays to think like an online store and use some e-commerce strategies to provide a lift to your local storefront’s business.
