Monday, October 31, 2016

SearchCap: SEMPO survey, HTTPS & Halloween


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Industry

Local & Maps

Link Building

SEO

SEM / Paid Search

Search Marketing

The post SearchCap: SEMPO survey, HTTPS & Halloween appeared first on Search Engine Land.

Choosing the right content marketing software for your business

Managing the volume of marketing content that needs to be created, distributed, analyzed, and managed has become complicated, time-consuming, and costly for many organizations. A crowded field of content marketing tools has emerged to help brand marketers automate their content marketing strategies and tactics.

Marketing Land’s all new “Content Marketing Tools: A Marketer’s Guide” examines the market for content marketing tools and the considerations involved in implementing this software into your business.

If you are thinking about implementing a content marketing tool, searching for the best content marketing resources, or simply want to learn more about content marketing, you need to read this report.

Visit Digital Marketing Depot to download this Marketing Land guide.


Take the State of Search Survey from SEMPO & Search Engine Land


SEMPO, in partnership with us at Search Engine Land, is asking the search industry to participate by completing the annual State of Search Survey. The survey is available by clicking here.

Those who take the survey will be given access to a free copy of the in-depth report that is written up based on the survey results, a value of $400. You will also be eligible to win a free pass to the 2017 SEMPO Member Forum, courtesy of SEMPO, or a FREE all-access pass to SMX West on March 21st-23rd, 2017, courtesy of Third Door Media.

You do not need to be a SEMPO member to complete the survey and/or be eligible for the above.

The survey asks questions about:

  • Search engine optimization (SEO / organic search)
  • Paid search (pay-per-click advertising or paid search)
  • Social media
  • Email
  • Mobile
  • Display
  • Integration, Emerging Trends and more

Again, please complete the survey over here.


Meet a Landy Award winner: Quick on its feet, Point It wins Best B2C Enterprise SEM Initiative


Katy Tonkin (left) and Maddie Cary of Point It accept the Landy for Best B2C Enterprise SEM Initiative.

The mission they chose to accept: to build and activate paid search campaigns to promote the surprise product launch of the Surface Pro 4 and Surface Book on the US Microsoft Store website — in less than 24 hours.

Mission accomplished. For its quick execution of tailored, targeted campaigns that exceeded expectations, Seattle-based Point It Digital Marketing took home the Landy award for Best B2C Enterprise SEM Initiative. This was Point It’s second consecutive Landy win.

Not only was time not on their side; once the campaigns were launched, Point It also faced stiff competition from other authorized sellers and retailers carrying the new Surface products.

Point It focused their campaign structure and keyword strategy on reaching lower-funnel prospects who knew about the new products and were searching on brand keywords that signaled purchase intent. Negative keywords were added to funnel target prospects to the right products.

Ad copy was crafted with product specific description copy tailored to keywords in tightly themed ad groups. Callout extensions highlighted specific product details and sitelinks provided easy navigation to product selection in the respective campaigns.

The team also initiated RLSA campaigns using BlueKai to pass audience segments into Google for targeting that proved to be highly successful. Overall, the campaigns exceeded their targets.

“This award win is a reflection of the way our paid search team executes every day,” said Maddie Cary, Director of Paid Search at Point It. “We try to think ahead and set up account management processes that allow us to scale or move with agility and efficiency. So when our client had a surprise product launch that needed to get turned around ASAP, we didn’t panic. We brainstormed, formalized, and delivered within 24 hours on a paid search plan we were proud of that capitalized on a critical window of revenue opportunity for the client.”

“This is one of the most heroic SEM stories ever told,” said Landy judge Matt Van Wagner, president and founder of Find Me Faster. “Point It was uniquely qualified to take on a new product launch, but to build, test and deploy within 24 hours is the equivalent of NASA’s first landing on the moon!”


Why all search ads seem the same (and what you can do about it)

Men in Uniform

Let’s face it: anyone with an AdWords login, bank account and keyboard can create ads for search. It can be a Wild West out there, which means that many ads ultimately fail. They fail because they don’t capture the attention of searchers, because they don’t include the best information, and frankly, because they look like every other ad out there.

Your paid search ad strategy goes way beyond the 140 text characters allotted to you. It starts with that, sure, but the entire architecture of your ad from the text to the extensions should all support a strategic message about your brand, its products or services.

So in this post, we’ll look at some of the steps you can take before you type that first word of text, so you can construct informative, eye-catching ads that truly support a company’s goals and stand out from the crowd.

Get into the mind of the business and consumer

You can’t very well create impactful ads without first understanding the business and consumer needs inside and out. And there are several ways you can conduct research to get a 360-degree view of the company. Let’s look at those now.

Interviews and questionnaires

Create a questionnaire you can send to employees from various departments — like customer service representatives, sales teams or product teams — or talk to them directly. These folks are on the front lines every day and should have some interesting insight.

Sample prompts and questions include things like:

  • Describe your target audience.
  • Do you have a secondary market you’re looking to tap into as well?
  • What’s most important to your target audience when they purchase Product or Service X?
  • What are your customer pain points, and how do you solve them?
  • How often does your target audience need or buy your product?
  • What are three to five key selling points for your company and product or service?
  • Do you experience seasonal slow or peak times?
  • What does the company promotional or event calendar look like currently?

Customer reviews and testimonials

What a company’s customers have to say (the good and the bad) can do a lot for the ad strategy. Read as many of these as you can to see if you can spot any trends that you can work into the ads.

You may also want to talk to key folks in the organization about any negative trends in reviews. Oftentimes, internal teams are not aware of what the customers are saying, and a conversation like that can be helpful so they can tweak their strategy.

And remember that when it comes time to create the ad, you also have things available to you in AdWords like review extensions for third-party reviews and seller ratings that can help highlight those praises.

Study the competition

Understand how the company is the same and different from its competitors. And watch out for the we-don’t-have-any-competitors response. If you run into that, simply search in Google using the top keywords you plan to target to get a better picture of who you’re up against.

But be aware: Sometimes the ads that show up for the keywords aren’t really your competitors. For example, if Target shows up for a specialty dance shoe, use your discernment in assessing if Target really is a competitor to a specialty dance shoe company.

In this sense, an exercise like searching for keywords can really get you up to speed on the competitive landscape.

Reviewing competitor ads can also be worthwhile, as long as you don’t let their messaging influence the ads you want to create too much (remember, you’re trying to get away from what every other ad is doing).

However, it can help you spot missed opportunities for your own ads — places where you can one-up the competition. And sometimes, you can learn from them, too — so go in with an open mind.

Then, having candid conversations with the company about the competition’s advertising (what they like or don’t like) is also important in the strategy phase.

Understand your other marketing efforts

It’s good to understand the full scope of the company’s marketing efforts in other channels because they often inform and influence one another. So get plugged into the strategy by talking to other teams and vendors and looking at product guides, subscribing to the company’s mailing list and so on.

You can learn a lot about the tone and the messaging of the brand by how it communicates, and you can then incorporate that into the advertising.

Plus, when you know what the other marketing teams are doing, you’re more likely to be able to work with them on the things that impact both your channels (for example, website speed) and react more quickly in any given situation (for example, a PR crisis).

Like any other marketing or sales effort, you have to put in the research to understand both the business needs and the audience desires. With those two areas researched well, you can begin to create killer ads that stand out from the crowd.


When going HTTPS, don’t forget about local citations!


Migrating your site to HTTPS is all the rage these days. Google is on the record as saying that using “https” in your URLs can give a site a ranking boost.

That said, going HTTPS has its share of SEO challenges. Here are but a few of the HTTPS horror stories we have witnessed over the past year:

  • Sites go HTTPS and don’t redirect or canonicalize the HTTP URLs to their HTTPS versions.
  • Sites go HTTPS without telling the SEO team, who freak out when they check into Google Search Console and see branded traffic has started to tank (Hint: check the HTTPS profile in Google Search Console that no one set up because you forgot to tell the SEO team).
  • Sites go HTTPS without making the site truly secure. For example, if you are serving your CSS file from an HTTP URL, you will need to update the CSS URLs to HTTPS. If you don’t do this, your browser may start to show a mixed-content warning.
  • Even worse, Google may start showing insecure site warnings next to your URLs in search results — a nice way to depress CTR, if that’s what you’re into…
  • Sites go HTTPS, get some links to HTTPS URLs, and then revert back to HTTP for whatever reason. Now, whenever someone clicks on one of those HTTPS links, they are going to get the browser’s “your connection is not private” warning.
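The mixed-content problem in the list above can be caught mechanically before launch. Below is a minimal sketch using Python’s standard-library HTML parser to flag `http://` resources referenced from a page that will be served over HTTPS; the sample page and URLs are hypothetical:

```python
from html.parser import HTMLParser

class MixedContentFinder(HTMLParser):
    """Collect http:// resource URLs referenced from a page served over HTTPS."""
    RESOURCE_ATTRS = {"src", "href"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in self.RESOURCE_ATTRS and value and value.startswith("http://"):
                # Stylesheets, scripts and images trigger mixed-content warnings;
                # plain <a href> navigation links do not, so skip anchors.
                if tag != "a":
                    self.insecure.append((tag, value))

page = """
<html><head>
<link rel="stylesheet" href="http://example.com/style.css">
<script src="https://example.com/app.js"></script>
</head><body>
<img src="http://example.com/logo.png">
<a href="http://example.com/about">About</a>
</body></html>
"""

finder = MixedContentFinder()
finder.feed(page)
for tag, url in finder.insecure:
    print(tag, url)   # flags the stylesheet and the image, not the anchor
```

A real audit would crawl every template and rendered page rather than a single HTML string, but the check itself is this simple.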

Things can get complicated when you’re trying to keep track of all of the technical best practices, particularly if you’re working on migrating a huge, complicated site with multiple teams and vendors, which is often the case with multi-location brands.

One of the bigger complications we often come across is how to handle your local citations — the listings for your locations on various local search services such as Google My Business, Yelp, YP.com, and the main business listing data aggregators such as Acxiom, Factual, InfoGroup and Neustar Localeze (or whichever services provide listings in your country).

Now I see you scratching your head, thinking, “I thought this HTTPS stuff was just about my website. What does it have to do with a business listing on another site?” In short: plenty.

Over the past couple of years, we have conducted several studies on the impact of cleaning up your local citations, and in our experience, one of the best things you can do is remove redirects from your citation links, particularly your Google My Business listings.

Often, we see brands go HTTPS and forget that their citations still all link to HTTP URLs. This may seem fine, as the HTTP links redirect to HTTPS — but in one fell swoop, you have redirected all of your local citations, which now may be negatively impacting your Local Pack rankings.

Let’s say you have a business with 1,000 locations. Each location likely has 150 to 300 citations. So on the low end, this is 150,000 links for this site all going through a 301 redirect (at best). According to this Moz post about an accidental redirect test Wayfair.com conducted, they saw a 15 percent reduction in traffic, on average, after doing 301 redirects. In our thousand-location situation, that means we could be losing 15 percent of the traffic to each of these location pages. That’s a lot of traffic to lose.
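As a rough sketch, the arithmetic above can be laid out explicitly. The per-location visit figure below is a hypothetical assumption for illustration, not a number from the article or the Wayfair data:

```python
# Back-of-the-envelope version of the estimate above.
locations = 1_000
citations_per_location = 150          # low end of the 150-300 range
redirect_traffic_loss = 0.15          # ~15% average loss per the Moz/Wayfair test

redirected_links = locations * citations_per_location
print(redirected_links)               # citation links passing through 301s

# If each location page got, say, 200 visits/month via citations (hypothetical):
visits_per_location = 200
monthly_visits_lost = locations * visits_per_location * redirect_traffic_loss
print(int(monthly_visits_lost))       # visits/month at risk
```

Even with conservative inputs, the aggregate number is large enough to justify updating the citations themselves rather than leaning on redirects.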

And if you have decided not to migrate your image URLs to HTTPS (for some reason, image URLs are often the neglected stepchildren of redesigns), now any image URL that you have added to your GMB profile is likely broken.

We just worked on a case where the brand had created a new HTTPS logo URL, so every other site that had been serving the logo from the HTTP URL was now serving a broken image, including every Google My Business page. #OOPS

So maybe when you put your “we’re going HTTPS!” plan together, make sure you have someone on hand to deal with your local citations. It might make you feel a bit more secure…

For further reading on going HTTPS, I strongly recommend Fili Wiese’s “All You Need To Know For Moving To HTTPS.” It’s the best thing I have read on the subject anywhere.

make-citations-great-again

PS: Don’t get too freaked out about going HTTPS. Over the past six months or so, we have seen some sites make some truly epic HTTPS migration screw-ups with little Google downside. It may be the case that since Google has promoted HTTPS so much, they have made the algorithm a bit more forgiving to avoid too many #HTTPSUCKS tweets. Your mileage may vary.

PPS: You’re the SEO person. You have made a career out of studying how to take advantage of Google’s algorithms while asking for resources from people who often don’t understand what it is that you do all day. So don’t blow it by being the one who champions migrating to HTTPS. Let the CIO do it.


Sunday, October 30, 2016

Halloween Google Doodle treats searchers to Magic Cat Academy game a day early


Google’s Halloween doodle arrived a day early this morning – giving trick-or-treaters extra time to beat Google’s Magic Cat Academy game that now resides on the homepage.

Players are instructed to draw simple shapes on the screen to scare away ghosts creeping toward Momo, the cat-magician casting spells at the center of the game.

According to the Google Doodle Blog, the development of the game involved four different teams – art, engineering, production, and an “extra help” group that produced the music. Players navigate through five different levels of Magic Cat Academy – all set in a school environment – racing against time to swipe away ghosts headed toward Momo.

Google says it started with numerous ideas of elaborate symbols to draw, but in the end, decided a “short game against the clock” was a better option.

“Plans like the ‘Eiffel Tower spell’ were abandoned, and similarly, gag spells didn’t make the cut,” says Google. “Regardless, we loved the process of dreaming up the possibilities.” The Doodle team shared the following early mock-up of the game:
halloween-doodle-2016

So far, I’ve only made it to level three – but I’m not much of a gamer, so my results are not the best measure of the game’s difficulty. At the end of the game, players are given the option to share it via social channels or email, and the search icon leads to results for a “Halloween” query.

Google says the game was inspired by an actual cat named Momo who belongs to Doodler Juliana Chen. You can read more about early versions of the game and see a picture of the actual cat behind Magic Cat Academy at: Google’s Halloween 2016 Doodle.


Friday, October 28, 2016

SearchCap: AdWords Partners bug, AMP upsets publishers & SEO power


Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Industry

Local & Maps

Searching

SEO

SEM / Paid Search

Search Marketing


Report: AMP causing monetization frustration among some news publishers


According to a report in the Wall Street Journal, Google’s Accelerated Mobile Pages (AMP) are meeting with mixed reviews from publishers. The core issue for publishers is that AMP pages don’t generate the same amount of revenue and don’t give publishers as much control over ads.

The article asserts:

For some publishers [preference for AMP in search results] is a problem, since their AMP pages do not currently generate advertising revenue at the same rate as their full mobile sites. Multiple publishers said an AMP pageview currently generates around half as much revenue as a pageview on their full mobile websites.

That’s largely because of limitations related to the types of ad units AMP pages will allow and the ad technology providers that are currently integrated with the platform, those publishers say.

AMP ads are standardized and don’t allow certain kinds of interstitials or takeovers, which enable publishers to charge more or offer more customization. However, many of those higher-profile or customized units are objectionable to users.

Google disputes the notion that AMP won’t monetize for publishers. Properly implemented, Google says, AMP pages can generate revenue comparable to publishers’ existing mobile sites. CNN and the Washington Post are cited in the article as example publishers whose AMP pages generate revenue at roughly the same levels as their mobile sites.

The article also states that some publishers don’t want to go on the record discussing AMP challenges because they fear retaliation from Google. That conspiracy mindset isn’t helpful, however. Google needs to hear directly from publishers about their concerns and issues. Many of these publishers also believe, apparently, that AMP adoption will be forced on them as a ranking factor.

While Google has said that AMP usage isn’t a ranking factor, page speed will become one soon. Google has said AMP pages are 4x faster and use 10x less data compared to non-AMP pages and that, on average, AMP pages load in less than one second. As a practical matter, Google probably can never turn AMP directly into a ranking factor because it would get spanked by antitrust authorities (at least in Europe) were it to do so.

As mentioned, a few publishers told the WSJ that they were generally happy with AMP performance and monetization. They added that an increasing percentage of their mobile page views are coming from AMP pages.

AMP is Google’s attempt to make the mobile web a more user-friendly place and increase user willingness to visit mobile websites (vs. apps). That in turn will benefit mobile search usage. So while there’s a larger “altruistic” goal of speeding and cleaning up the mobile web, there’s also a very self-interested aspect to AMP that ties directly to mobile search revenue.


The SEO power of portfolio entries, case studies & testimonials

The SEO Power of Portfolio Entries for small businesses

SEO and content marketing can be tough for small businesses. Creating content that answers the frequently asked questions in your industry may not be too difficult, but getting it found in search engines is not so easy if you are a small local player. Even if you could rank a piece of content nationally, would it turn into business? Could you handle the influx of leads if it did?

The digital marketing channels and tactics you use are a strategic decision — and in many cases, traditional content marketing is not the best choice for small local businesses. This is a different story for SaaS (software as a service) companies and the like, which can easily scale users and deliver their product on a national or international basis. But for the small local guys, traditional content marketing can lead to a lot of head-scratching and wasted effort.

The SEO power of portfolios

This is not to say that content marketing is completely useless for small and local businesses — rather, that there is a strategic decision to make regarding the kind of content you create and how you promote it. And often, the key to smart local content marketing efforts is simply in the work that you do for your customers.

This is the content that really demonstrates what you do, where you do it and who you do it for, which is the information that really matters. Of course, “portfolio” is kind of a catch-all term — case studies, reviews and testimonials are just as much fair game for small business content marketing efforts, and often a single piece of content may contain more than one of these elements.

Consider:

  • portfolio entries;
  • case studies;
  • reviews; and
  • testimonials.

This kind of content has two main benefits:

1. Topical scope

Creating portfolio content provides very specific examples of your work. In the case of a painting and decorating company, it could be a certain kind of property in a very specific location: painting and decorating a Victorian house in Boldmere, Sutton Coldfield.

This can zoom into a hyper-specific activity or location or zoom out to be more general. This broad or specific approach can apply to both the job and the location in which you operate, creating the opportunity for smart local content, which is so, so important for local businesses.

Take the following examples of portfolio pieces (Note: I am in Birmingham, UK, so examples reflect my own location and areas): 

  1. Renovation of skirting boards in a Victorian house in Boldmere, Sutton Coldfield, Birmingham
  2. Repair of ceilings in 1970s semi-detached house in Walmley, Sutton Coldfield, Birmingham
  3. Complete rewiring and electrical refit of a five-bedroom, three-story Victorian house in Boldmere, Sutton Coldfield, Birmingham
  4. New heating system, radiators and pipework in five-bedroom, three-story Victorian house in Boldmere, Sutton Coldfield, Birmingham
  5. Rebuilding chimney on 1970s semi-detached house in Mere Green, Four Oaks, Birmingham
  6. Repointing of chimney on Edwardian property in Four Oaks, Sutton Coldfield, Birmingham
  7. Roof repairs to eliminate damp issues on terraced house in Wylde Green, Sutton Coldfield, Birmingham

These content pieces improve the scope of search terms you can rank for by detailing very specific jobs within your overall business category and focusing on other key details.

In the examples above, we have looked at the specific jobs in various trades, micro and metro areas, and specific types of property. All of these details would likely be missed in your traditional service pages — or poorly implemented in an effort to create catch-all service and location pages that are, more often than not, not really up to scratch.

There is a good chance your pages would now rank for search terms like:

  • builder repointing chimney mere green;
  • roof repair terraced house wylde green;
  • rebuild chimney semi detached house four oaks;
  • plumber new heating system boldmere; and
  • … many more just like this.

Sure, these are going to be low-volume terms, but they are highly specific. And with localized search removing the need to enter your location, and an ever-smarter and mobile- (and voice-) driven search landscape, consumers are searching in more detail than before.

2. Credibility

Getting folks through the digital front door is great, but you must then convince them to take action — and portfolio content again comes up trumps here.

Too much SEO thinking is done in a silo without enough consideration of the real users who will land on your pages. All too often, we see small businesses creating overstretched and over-optimized location pages that have keywords crammed in to help them rank but provide a poor landing page experience.

Creating portfolio entries, case studies, testimonials and even reviews (which, hopefully, you are requesting from customers rather than writing yourself) opens you up to increased search engine traffic from real local users and provides the information these customers need to make an informed decision to do business with you.

Most local businesses are offering the same exact service as their competition, and this undifferentiated marketplace creates a difficult environment for prospects to choose Company A over Company B. Smart marketers and small businesses out there will see this obstacle as an opportunity to stand out amongst their peers with carefully crafted portfolio entries that illustrate a strong reputation — thus making it clear that they are the best choice for these weary internet browsers.

Powering up your portfolios

As ever, the best way to illustrate what I am getting at here is with examples, and the following are in part drawn from my own recent struggles to identify various contractors to help with the renovation of our new (very old) house.

I am pretty handy on a PC and the internet but utterly hopeless when it comes to the practical skills required to renovate a house. As such, I have spent a considerable amount of time on the internet trying to locate a range of local tradesmen, including electricians, plumbers, central heating specialists, plasterers, painters and decorators.

All in all, it was a nightmare to manage from behind the keyboard. Indeed, the process was so difficult that in the end, three of the four contractors I ended up working with came by referral; only one was someone I found via the internet.

This tells me that there is a huge opportunity for traditional contractors to optimize their digital presence and win more work. After all, I am about as search engine and internet savvy a user as you are going to get — so if I failed at this task, what must your average consumer make of the wasteland of small business websites?

What we commonly see, particularly around the traditional trades, are business directory sites and portals aimed squarely at users trying to find a local tradesman. These all tend to provide a raft of reviews and are highly visible, yet I found it very hard to distinguish one business from another.

In fact, a search for “plumber in birmingham,” which is a trade and the metro location where I live, returns 10 results, and eight are a directory or portal of some sort. This makes some of Google’s recent comments regarding directories somewhat curious — both Gary Illyes and John Mueller of Google seemed to imply that securing directory placements was an outdated practice or “very often not the right way to build links.”

But certainly in the UK, directories — and in particular, vertical directories — are still hugely visible in many local business categories. The following image shows that five of the top six listings for “plasterer in birmingham” are directory listings, but I really want to see an actual website for these companies to aid in decision-making.

many results are still directories

When it comes down to it, what I wanted to see was that the various contractors had tackled similar jobs (experience) and had done a good job (credibility). What was out there did not fill me with confidence and enable me to do that.

Structuring your portfolio entries

The specifics here will vary for each business, but I would be looking at the following kind of loose structure as a starting point.

  1. The problem. What was the issue? Where was the pain?
  2. The solution. How did you help? What measures were taken?
  3. Testimonials and reviews. Can you get the actual client to add some feedback to this page?

This does not have to be a huge piece of content — you can simply outline the problem (damp in first floor bedroom) and detail the solution (repoint and cap chimney) along with all the other important details (type of property, location and so on). In most cases, you will want to add images, so it always makes sense to take photos of the job as you progress. You can then largely tell the story via the photos you take and keep the actual text concise and to the point.

Of course, you will need to get permission from the customer, but if you do a great job and gently stroke their ego and tell them just how happy you are with the project, then this will generally help you secure permission — and of course, this can lead to asking for that client testimonial.

TL;DR

SEO does not exist in a silo, and the lines between smart SEO, content marketing and demonstrating your credibility are forever blurred. Often, the same content can tackle these three important goals for small and local businesses.

More often than not, the best content marketing option for small and local businesses is the creation of portfolio content. When executed well, it will widen the search terms you can be found for and attract more local search engine users, while simultaneously demonstrating your credibility in completing jobs for your local customers — an SEO and marketing win-win.


Micro-moments and beyond: Understanding and optimizing for consumer intent


Google introduced the concept of micro-moments over a year ago, and since then, the company has consistently published supporting information as it relates to specific industries and user behavior across content platforms.

If you’re unfamiliar with micro-moments, they’re essentially a way of framing a user’s path to purchase or to conversion, with specific focus on mobile and the needs or questions users search on Google along the way. The concept of micro-moments is easily digestible and provides a great way of conducting and organizing keyword research, something search marketing practitioners and decision-makers alike can certainly appreciate.

At our agency, ZOG Digital, we’ve been developing ways to comprehensively identify micro-moment opportunities for clients while mapping and optimizing to the consumer’s conversion path. The following is a high-level look at our approach and a few of the resources we use.

1. Identifying micro-moments: The consumer journey

Before you can identify micro-moment opportunities, you must understand the structure of the user path and adapt it to your particular business or vertical. For instance, we categorize micro-moments for hospitality clients into Dreaming, Exploring, Planning and Booking; these buckets support each step in the consumer journey to bookings, and keyword opportunities can logically be categorized within them.

identifying

Google uses a fairly ubiquitous micro-moment structure of “I want to know,” “I want to go,” “I want to do” and “I want to buy.” Unlike the categorization structure I noted above, Google’s classification maps micro-moments to different types of consumer journeys with additional research to support best practices for search content.

Either of these examples can work, as long as consumer intent can be appropriately segmented. Keywords are the backbone of this phase and enable future content to be planned, developed and published by each opportunity category.

2. Organizing micro-moments: Defining parameters and collecting data

With keyword categorization structure understood, the next step is to map out the keyword modifiers that users will use in their path to conversion. Our philosophy is to use all available modifiers, with an understanding that not all will apply to each client. This approach allows us to cast the widest net and effectively understand the micro-moment opportunity size.

Here are some example modifiers grouped under questions and prepositions:

Questions: (Keyword) + Where, Which, Who, Why, What, How and Are

Example hospitality-related searches using questions could be “Things to Do in San Francisco” or “Where to Stay in Miami.”

sfo

Prepositions: (Keyword) + With, Without, Versus, Near, Like and For.

Example retail-based searches using prepositions could be “Tablet vs. Laptop” or “Ceiling Fan with Lights.”

tablet_laptop

At ZOG Digital, we predefine all keyword modifiers so we can map across keyword lists at scale. However, if you’re looking to define micro-moments across a small set of keywords, we recommend Answer the Public and Keyword.io as great starting points. Answer the Public predefines questions and prepositions automatically, while Keyword.io allows you to segment keyword results by questions once they’ve been retrieved.
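To show how predefined modifier mapping scales beyond manual tools, here is a minimal sketch of the combination step. The seed keyword, list contents and function name are illustrative, not ZOG Digital’s actual tooling; the resulting list would then be fed into a search volume tool for opportunity sizing.

```python
# Illustrative sketch: expand seed keywords with the question and
# preposition modifiers listed above so the combined list can be
# checked for search volume at scale.
QUESTIONS = ["where", "which", "who", "why", "what", "how", "are"]
PREPOSITIONS = ["with", "without", "versus", "near", "like", "for"]

def expand_keywords(seeds):
    """Pair every seed with every modifier, e.g. 'hotels in miami' -> 'where hotels in miami'."""
    combos = []
    for seed in seeds:
        for modifier in QUESTIONS + PREPOSITIONS:
            combos.append(f"{modifier} {seed}")
    return combos

keywords = expand_keywords(["hotels in miami"])
```

With 7 question and 6 preposition modifiers, each seed yields 13 candidate queries; real lists would be deduplicated and pruned against volume data.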

It’s important to note that collecting micro-moment data doesn’t stop at the keyword level. To effectively understand opportunity size and prioritize tactics, consumer intent and demand need to be identified and grouped within the aforementioned consumer journey stages. This research process provides a segue into our next step, which is building a plan for ROI.

3. Forecasting and prioritizing for ROI

The next step to moving forward with micro-moment opportunity analysis and planning is to forecast potential and prioritize for ROI. My agency developed our own tool, the Keyword Revenue Forecasting Tool, to automate this process with historical client performance data, but a basic one can be created through Excel and a few simple formulas.

First, you’ll need to determine a click-through rate by keyword position. There are numerous data sources for this — we like Advanced Web Ranking, as they regularly update their CTR data. The best option, if you have enough data, is to use Search Console and filter out branded keywords. This will then most closely resemble the CTR you can expect from each keyword position.

keyword-revenue-forecast-1

Second, you need to forecast how your rankings can improve over time. This is a bit tricky without substantial historical data, so the next best option is to look at where similar websites rank for the keywords you’re targeting. Check the domain and PageRank of the websites that rank in the top positions for each keyword. If you are within range comparatively, chances are you can compete, assuming you’re conducting comprehensive on- and off-page optimization.

The improvement over time is tricky here — if you have performed SEO in the past for the site, you should be conservative and make assumptions based on performance you have observed historically.

keyword-revenue-forecast-2

Finally, you can now calculate potential return based on the metrics you have available:

(keyword position CTR) x (keyword search volume) x (organic conversion rate) x (organic average order value)

When possible, we like to make these calculations at a categorical level, applying unique conversion rate and average order value (AOV) data to get the most accurate results.
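The formula above can be sketched in a few lines. The CTR curve below is invented purely for illustration; in practice you would substitute your own Search Console or Advanced Web Ranking data, plus category-level conversion rates and AOV where available.

```python
# Sketch of the return formula from the article:
# (position CTR) x (search volume) x (conversion rate) x (average order value).
# The CTR-by-position values are made up for illustration only.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def forecast_monthly_revenue(position, search_volume, conversion_rate, aov):
    # Fall back to a nominal CTR for positions beyond the curve.
    ctr = CTR_BY_POSITION.get(position, 0.01)
    return ctr * search_volume * conversion_rate * aov

# e.g. ranking #2 on a 10,000-search/month keyword at 2% conversion, $80 AOV
revenue = forecast_monthly_revenue(2, 10_000, 0.02, 80.0)
```

Running this per keyword (or per category) and summing gives the opportunity size used for prioritization.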

4. Content analysis and selection

After assessing the value of keywords and micro-moments, one final step needs to occur before defining content topics and types. It’s important to examine and dissect the search results and content that currently exists for each keyword. Because Google takes into account the context of each search term and displays the most relevant results, the types of results revealed will give you an idea of the intent behind the query.

For example, a search term with modifiers like “best” or “top” may imply the user is seeking an article, blog post or list, while a search term that includes modifiers like “discount” or “buy” may suggest the user is looking for a product page.
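That heuristic can be expressed as a simple lookup. This is a hedged sketch with illustrative, deliberately non-exhaustive modifier lists, not a substitute for actually inspecting the SERPs:

```python
# Sketch: infer the likely content type to build for a query from its
# modifiers, mirroring "best/top -> article, discount/buy -> product page".
INFORMATIONAL = {"best", "top", "how", "what", "guide"}
TRANSACTIONAL = {"discount", "buy", "price", "coupon", "cheap"}

def likely_content_type(query):
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "product page"
    if words & INFORMATIONAL:
        return "article"
    return "unknown"
```

A classification like this is only a first pass for bucketing large keyword lists; the actual results Google returns remain the ground truth for intent.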

Inspecting content types indexed in search results can inform future content that will succeed at each stage of the consumer journey. Particularly, deciphering content trends for each phase will inform the long-term content strategy for brands and agencies to begin building together.

With micro-moments inspired by Google, savvy marketers can see the consumer journey through a new lens and gain further insights from keyword categorization. Google has recently published an article, “Micro-Moments: 5 Questions to Ask Your Agency,” that concisely summarizes many of the aforementioned steps and recommendations. We highly encourage reviewing it when assessing agency partners and internal teams alike.

The post Micro-moments and beyond: Understanding and optimizing for consumer intent appeared first on Search Engine Land.

Due to a bug, AdWords Partners are told their Search Ads specializations have expired

google-penalty-jail-ss-1920

Many agencies with Google Partner status received an email from Google late yesterday telling them, “You’ve lost your Search Ads specialization” and will need to retake their exams.

A bug has apparently triggered the emails after a change was made yesterday and affected Partners in many markets. Steve Seeley tweeted a screenshot of the email yesterday and cautioned others to “not flip out if you got emails like this last night or this morning. It is a bug. Our entire status is blank.”

When Partners log in to their Partner account to see their performance, the Company Specialization scores and spend fields are empty.

Italy-based digital marketer Gianpaolo Lorusso tells us it’s likely a global bug and is definitely affecting the US, Italy, Germany, Austria, Switzerland and Spain.

We’ve reached out to Google for comment and will update when we hear back.

The post Due to a bug, AdWords Partners are told their Search Ads specializations have expired appeared first on Search Engine Land.

Search in Pics: Google bumper car, pop up stores & Halloween decorations

In this week’s Search In Pictures, here are the latest images culled from the web, showing what people eat at the search engine companies, how they play, who they meet, where they speak, what toys they have and more.

Google Partners pillow:

google-partners-pillow
Source: Instagram

Google Halloween:

google-halloween
Source: Instagram

google-halloween2
Source: Instagram

Really big Android:

big-android
Source: Google+

Google pop up store:

google-pop-up-store
Source: Instagram

Google bumper car?

google-bumper-car
Source: Instagram

The post Search in Pics: Google bumper car, pop up stores & Halloween decorations appeared first on Search Engine Land.

Thursday, October 27, 2016

It’s not too late to maximize your Google Shopping in time for the holidays

webcast_500035351-ss-1920

Whether you’re a beginner or a paid search pro, it’s not too late to improve your Google Shopping campaigns in time for the holidays. Join us for this November 10 webcast and learn how to optimize for search success with Google Shopping ads. Our expert panel will discuss:

  • how to create “quick wins” in your Shopping account;
  • how to combine paid search with Shopping campaigns for better results;
  • more effective use of product pricing in your Shopping strategy; and more.

Register today for “It’s Not Too Late! Maximize Your Google Shopping in Time for the Holidays,” produced by Digital Marketing Depot and sponsored by Crealytics.

The post It’s not too late to maximize your Google Shopping in time for the holidays appeared first on Search Engine Land.

SearchCap: Voice search, Bing Ads & scary SEO

searchcap-header-v2-scap

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

  • It’s scary how many ways SEO can go wrong
    Oct 27, 2016 by Patrick Stox

    In search engine optimization, sometimes even small errors can have a large and costly impact. Columnist Patrick Stox shares his SEO horror stories so that you can be spared this fate.

  • A Penguin’s Tale: Responding to the latest update
    Oct 27, 2016 by Dave Davies

    What should SEOs do to make the best of the new Penguin update? Perhaps not much. Columnist Dave Davies notes that while Penguin 4.0 was indeed significant, things ultimately haven’t changed that much.

  • Now you can share budgets in Bing Ads, too
    Oct 27, 2016 by Ginny Marvin

    Assign one budget across a set of campaigns to save time monitoring and adjusting allocations.

  • Comparing Google Assistant on Pixel to Apple Siri on iPhone 7
    Oct 27, 2016 by Barry Schwartz

    Watch YouTube creator Marques Brownlee have Google and Apple battle it out over their smartphone voice assistants.

  • 5 marketing predictions for the next 5 years
    Oct 26, 2016 by Digital Marketing Depot

    The marketing world continues to evolve rapidly, often without warning. So it’s vital to constantly evaluate the industry and marketplace in order to stay current and competitive when changes happen, or risk being left behind. This guide from Emarsys highlights five of the biggest trends marketers and retailers should expect to see, and how they […]

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

Industry

Local & Maps

Link Building

Searching

SEO

SEM / Paid Search

Search Marketing

The post SearchCap: Voice search, Bing Ads & scary SEO appeared first on Search Engine Land.

It’s scary how many ways SEO can go wrong

halloween1-pumpkins-ss-1920

We’ve all had those moments of absolute terror where we just want to crawl into the fetal position, cry and pretend the problem doesn’t exist. Unfortunately, as SEOs, we can’t stay this way for long. Instead, we have to suck it up and quickly resolve whatever went terribly wrong.

There are moments you know you messed up, and there are times a problem can linger for far too long without your knowledge. Either way, the situation is scary — and you have to work hard and fast to fix whatever happened.

Things Google tells you not to do

There are many things Google warns about in their Webmaster Guidelines:

  • Automatically generated content
  • Participating in link schemes
  • Creating pages with little or no original content
  • Cloaking
  • Sneaky redirects
  • Hidden text or links
  • Doorway pages
  • Scraped content
  • Participating in affiliate programs without adding sufficient value
  • Loading pages with irrelevant keywords
  • Creating pages with malicious behavior, such as phishing or installing viruses, trojans or other badware
  • Abusing rich snippets markup
  • Sending automated queries to Google

Unfortunately, people can convince themselves that many of these things are okay. They think spinning text to avoid a duplicate content penalty that doesn’t exist is the best option. They hear that “links are good,” and suddenly they’re trying to trade links with others. They see review stars and fake them with markup so that they, too, stand out in the SERPs.

None of the above are good ideas, but that won’t stop people from trying to get away with something or simply misunderstanding what others have said.

Crawl and indexation issues

User-agent: *
Disallow: /

That’s all it takes — two simple lines in the robots.txt file to completely block crawlers from your website. Usually, it’s a mistake from a dev environment, but when you see it, you’ll feel the horror in the pit of your stomach. Along with this, if your website was already indexed, you’ll typically see in the SERPs:

A description for this result is not available because of this site's robots.txt
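A quick way to QA for this mistake is to parse the file and flag a blanket disallow. This sketch handles only the one pattern discussed here, not the full robots.txt grammar, and in practice you would run it against the live file on every deploy:

```python
# Sketch: flag a robots.txt that blocks all crawlers site-wide, i.e. a
# "User-agent: *" group containing "Disallow: /". String parsing only;
# wildcard rules, Allow lines, etc. are out of scope for this check.
def blocks_everything(robots_txt):
    current_agent = None
    for raw in robots_txt.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            current_agent = value
        elif field == "disallow" and current_agent == "*" and value == "/":
            return True
    return False
```

Wiring this into a monitoring job turns the "pit of your stomach" moment into an alert instead.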

Then there’s the noindex meta tag, which can prevent a page you specify from being indexed. Unfortunately, many times this can be enabled for your entire website with a simple tick of a button. It’s an easy enough mistake to make and painful to overlook.
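For reference, the tag in question is a single line in a page’s head, which is why one site-wide checkbox in many CMSes can deindex everything:

```html
<!-- Asks compliant crawlers not to index the page it appears on -->
<meta name="robots" content="noindex">
```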

Even more fun is a UTF-8 BOM. Glenn Gabe had a great article on this where he explained it as such:

BOM stands for byte order mark and it’s used to indicate the byte order for a text stream. It’s an invisible character that’s located at the start of a file (and it’s essentially meaningless from an SEO perspective). Some programs will add the BOM to a text file, which … can remain invisible to the person creating the text file. And the BOM can cause serious problems when Google tries to read the file. …

[W]hen your robots.txt file contains the UTF-8 BOM, Google can choke on the file. And that means the first line (often user-agent), will be ignored. And when there’s no user-agent, all the other lines will return as errors (all of your directives). And when they are seen as errors, Google will ignore them. And if you’re trying to disallow key areas of your site, then that could end up as a huge SEO problem.
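Because the BOM is invisible in most editors, checking the raw bytes is the reliable test. A minimal sketch (the helper names are mine, not from Gabe’s article); run it on the file contents as fetched, before decoding:

```python
# Sketch: detect and strip the UTF-8 byte order mark (EF BB BF) at the
# start of a robots.txt payload, which can cause Google to ignore the
# first line (often User-agent) and then error on every directive.
UTF8_BOM = b"\xef\xbb\xbf"

def has_utf8_bom(raw_bytes):
    return raw_bytes.startswith(UTF8_BOM)

def strip_utf8_bom(raw_bytes):
    return raw_bytes[len(UTF8_BOM):] if has_utf8_bom(raw_bytes) else raw_bytes
```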

Also of note: Just because a large portion of your traffic comes from the same IP addresses doesn’t mean it’s a bad thing. A friend of mine found this out the hard way after he ended up blocking some of the IP addresses Googlebot uses while being convinced those IPs were up to no good.

Another horrific situation I’ve run into was when someone had the bright idea to block crawlers to get pages out of the index after a subdomain migration. This is never a good idea, as crawlers need to be able to access the old versions and follow the redirects to the new versions. It was made worse by the fact that the robots.txt file was actually shared between both subdomains, and crawlers couldn’t see either the old or the new pages because of this block.

Manual penalties

google-penalty-blue-ss-1920

Just hearing the word “penalty” is scary. It means you or someone associated with the website did something wrong — very wrong! Google maintains a list of common manual actions:

  • Hacked site
  • User-generated spam
  • Spammy freehosts
  • Spammy structured markup
  • Unnatural links to your site
  • Thin content with little or no added value
  • Cloaking and/or sneaky redirects
  • Cloaking: First Click Free violation
  • Unnatural links from your site
  • Pure spam
  • Cloaked images
  • Hidden text and/or keyword stuffing

Many of these penalties are well-deserved, where someone tried to take a shortcut to benefit themselves. With Penguin now operating in real time, I expect a wave of manual penalties very soon.

A recent scary situation was a new one to me. A company had decided to rebrand and migrate to a new website, but it turned out the new website had a pure spam penalty.

Unfortunately, because Google Search Console wasn’t set up in advance of the move, the penalty was only discovered after the migration had happened.

Oops, I broke the website!

One character is all it takes to break a website. One bad piece of code, one bad setting in the configuration, one bad redirect or plugin.

I know I’ve broken many websites over the years, which is why it’s important to have a backup before you make any changes. Or better yet, set up a staging environment for testing and deployment.

Rebuilding a website

With any new website, there are many ways for things to go horribly wrong. I’m always scared when someone tells me they just got a new website, especially when they tell me after it’s already launched. I get this feeling in the pit of my stomach that something terrible just happened, and usually I’m right.

The most common issue is redirects not being done at all, or developers arguing that redirects aren’t necessary or that too many redirects will slow down the website. Another common mistake I see is killing off good content; sometimes these are city pages or pages about services, and sometimes an entire domain’s worth of information gets redirected to a single page.

Issues can range from very old issues that still exist — like putting all text in images — to more modern problems like “We just rebuilt our website in Angular” when there was no reason for them to ever use Angular.

Overwrote the file

This scares me the most with overwritten disavow files, especially when a copy is not made and the default action happens to overwrite, or with an .htaccess file where redirects can easily be lost. I’ve even had shared hosts overwrite .htaccess files, and of course, no notification of the changes is ever sent.

I don’t even know

In my years, I’ve seen some really random and terrible things happen.

I’ve seen people lose their domain because it expired or because they unknowingly signed a contract that said they didn’t own the domain. I’ve seen second and even third websites created by other marketing companies.

There are times when canonical tags are used incorrectly or just changed randomly. I’ve seen all pages canonicalized to the home page or pages with a canonical set to a different website.

I’ve seen simple instructions that sounded like a good idea, like “make all links relative path,” end up in disaster when they made canonical URLs relative along with alternate versions of the website, such as with m. and hreflang alternate tags.

SEO is scary

It’s amazing how one little thing or one bad decision can be so costly and scary. Remember to follow the rules, plan, execute and QA your work to prevent nightmares. Share your own tales of horror with me on Twitter @patrickstox.

The post It’s scary how many ways SEO can go wrong appeared first on Search Engine Land.

A Penguin’s Tale: Responding to the latest update

google-penguin-2016d-ss-1920

For the last four-plus years now, we’ve heard a lot about Penguin. Initially announced in April 2012, we were told that this algorithm update, designed to combat web spam, would impact three percent of queries.

More recently, we’ve witnessed frustration on the part of penalized website owners at having to wait over a year for an update, after Google specifically noted one was coming “soon” in October of 2015.

In all the years of discussion around Penguin, however, I don’t believe any update has been more fraught with confusing statements and misinformation than Penguin 4.0, the most recent update. The biggest culprit here is Google itself, which has not been consistent in its messaging.

And this is the subject of this article: the peeling away of some of the recent misstated or just misunderstood aspects of this update, and more importantly, what it means for website owners and their SEOs.

So, let’s begin.

What is Penguin?

Note: We’re going to keep this section short and sweet — if you want something more in-depth, you should begin by reading Danny Sullivan’s article on the initial release of Penguin, “Google Launches ‘Penguin Update’ Targeting Webspam In Search Results.” You can also browse Search Engine Land’s Penguin Update section for all the articles written here on the topic.

The Penguin algorithm update was first announced on April 24, 2012, and the official explanation was that the algorithm targeted web spam in general. However, since the biggest losses were incurred by those engaged in manipulative link schemes, the algorithm itself was viewed as being designed to punish sites with bad link profiles.

I’ll leave it at that, with the assumption that I shouldn’t bore you with additional details on what the algorithm was designed to do. Let’s move now to the confusion.

Where’s the confusion?

Until Penguin 4.0 rolled out on September 23, 2016, there really wasn’t a lot of confusion around the algorithm. The entire SEO community — and even many outside it — knew that the Penguin update demoted sites with bad links, and it wasn’t until it was next updated that an affected site could expect some semblance of recovery.

The path was clear: a site would get hit with a penalty, the website owner would send out requests to have offending links removed, those that couldn’t be removed would be added to a disavow list and submitted, and then one would simply wait.

However, things got more complicated with this most recent update — not because the algorithm itself got any more difficult to understand, but rather because the folks at Google did.

In essence, there were only a couple of major changes with this update:

  1. Penguin now runs in real time. Webmasters impacted by Penguin will no longer have to wait for the next update to see the results of their improvement efforts — now, changes will be evident much more quickly, generally not long after a page is recrawled and reindexed.
  2. Penguin 4.0 is “more granular,” meaning that it can now impact individual pages or sections of a site in addition to entire domains; previously, it would act as a site-wide penalty, impacting rankings for an entire site.

It would seem at first glance that there isn’t a lot of room for confusion here. However, when the folks at Google started adding details and giving advice, that ended up causing a bit of confusion. So let’s look at those to get a better understanding of what we’re expected to do.

Disavow files

Rumor had it, based on statements by Google’s Gary Illyes, that a disavow file is no longer necessary to deal with Penguin-related ranking issues.

This is due to a change in how Penguin 4.0 deals with bad links: it now devalues the links themselves rather than demoting the site they point to.

Now, that seems pretty clear. If you read Illyes’ statements in the article linked above, there are a few takeaways:

  1. Spam is devalued, rather than sites being demoted.
  2. There’s less need to use a disavow file for Penguin-related ranking penalties.
  3. Using the disavow file for Penguin-related issues can help Google help you, but it is more specifically useful for sites under manual review.

Here’s the problem, though — just a day earlier, the following tweets had been exchanged:

So now we have a “yes, you should use it for Penguin” and a “no, you don’t need it for Penguin.” But wait, it gets more fun. On October 4, 2016, Google Webmaster Trends Analyst John Mueller stated the following in an Office Hours Hangout:

[I]f these are problematic links that are affected [by Penguin], and you use a disavow file, then that’s a good way for us to pick that up and recognize, “Okay, this link is something you don’t want to have associated with this site.” So when we recrawl that page that is linking to you, we can drop that link from our link graph.

With regards to devaluing these low quality links instead of punishing you, in general we try to figure out what kind of spammy tactics are happening here and we say, “Well, we’ll just try to ignore this part with regards to your website.”

So… clear as mud?

The disavow takeaway

The takeaway here is that the more things change, the more they stay the same. There is no change. If you’ve used unethical link-building strategies in the past and are considering submitting a disavow file — good, you should do that. If you haven’t used such strategies, then you shouldn’t need to; if Google finds bad links to your site, they’ll simply devalue them.

Of course, it was once also claimed that negative SEO doesn’t work, meaning a disavow wasn’t necessary for bad links you didn’t build. This was obviously not the case, and negative SEO did work (and may well still), so you should be continuing to monitor your links for bad ones and adding them to your disavow file periodically. After all, if bad links couldn’t negatively impact your site, there would be no need for a disavow at all.

And so, the more things change, the more they stay the same. Keep doing what you’ve been doing.

The source site?

In a recent podcast over on Marketing Land, Gary Illyes explains that under Penguin, it’s not the target site of the link that matters, it’s the source. This doesn’t just include links themselves, but other signals a page sends to indicate that it’s likely spam.

So, what we’ve just been told is that the value of a link comes from the site/page it’s on, not where it’s pointing. In other words, when judging your inbound links, be sure to look at the source page and domain of those links.

The more things change, the more they stay the same.

Your links are labeled

In the same podcast on Penguin, it came to light that Google places links on a page into categories, including things like:

  • footer;
  • Penguin-impacted; and
  • disavowed.

It was suggested that there are other categories, but they weren’t named. So, what does this really mean?

It means what we all pretty well knew was going on for about a decade. We now have a term to use to describe it (“labels”) rather than simply understanding that a page is divided into sections, and the sections that are the most visible and more likely to be engaged with hold the highest value (with regard to both content and links).

Additionally, we already knew that links that were disavowed were flagged as such.

There is one new side

The only really new piece of information here is that either Google has replaced a previous link weighting system (which was based on something like visibility) with a labeling system, or they have added to it. Essentially, it appears that where previously, content as a whole may have been categorized and links included in that categorization, now a link is given one or possibly multiple labels.

Link labels

So, this is a new system and a new piece of information, which brings us to…

The link labeling takeaway

Knowing whether the link is being labeled or simply judged by its position on the page — and whether it’s been disavowed or not — isn’t particularly actionable. It’s academically interesting, to be sure, and I’m certain it took Google engineers many days or months to get it figured out (maybe that’s what they’ve been working on since last October). But from an SEO’s perspective, we have to ask ourselves, “What really changed?”

Nothing. You will still be working to develop highly visible links, placed contextually where possible and on related sites. If this strays far from what you were doing, you likely weren’t doing your link building correctly to begin with. I repeat: the more things change, the more they stay the same.

But not Penguin penalties, right? Or… ?

It turns out that Penguin penalties are treated very differently in 4.0 from the way they were previously. In a discussion with Google’s Gary Illyes, he revealed that there is no sandbox for a site penalized by Penguin.

To put that in context, here’s a fuller glimpse of the conversation:

So essentially, if you get hit with a Penguin penalty, there is no trust delay in recovery — once you fix the problem and your site is recrawled, you’d bounce back.

That said, there’s something ominous about Illyes’ final tweet above. So Penguin does not require or impose a sandbox or trust-based delay… but that’s not to say there aren’t other functions in Google’s algorithm that do.

So, what are we to conclude? Avoid penalties — and while not Penguin-related, there may or may not be delays in recovering from one. Sound familiar? That’s because (surely you can say it with me by now)…

The more things change, the more they stay the same

While this was a major update with a couple of significant changes, what it ultimately means is that our SEO process hasn’t really changed at all. Our links will get picked up faster (both the good and the bad), and penalties will likely be doled out and rolled back much more reliably; however, the links we need to build and how they’re being weighted remain pretty much the same (if not identical). The use of the disavow file is unchanged, and you should still (in my opinion) watch for negative SEO.

The biggest variable here comes in their statement that Penguin is not impacted by machine learning:

I have no doubt that this is currently true. However, now that Penguin is part of the core algorithm — and as machine learning takes on a greater role in how search engines rank pages — it’s likely that it will eventually begin to control some aspects of what are traditionally Penguin algorithms.

But when that day comes, the machines will be looking for relevancy and maximized user experience and link quality signals. So the more you continue to stay focused on what you should be doing… the more it’ll stay the same.

The post A Penguin’s Tale: Responding to the latest update appeared first on Search Engine Land.

Now you can share budgets in Bing Ads, too

money-funding-invest-earnings-ss-1920

Bing Ads has been busy. In the past week, it has rolled out Expanded Text Ads and a more comprehensive campaign setup workflow that provides performance estimates along the way. On Thursday, Bing Ads announced the global release of Shared Budgets.

As in AdWords, Shared Budgets let you assign one budget to multiple campaigns. Why would you want to do this? Well, say you have a fixed budget of $1,000 a week to spend on Bing Ads campaigns. After allocating budgets to individual campaigns, you may find at the end of the week that some campaigns tapped out their budgets and could have spent more while other campaigns had budget to spare.

With Shared Budgets, all campaigns, or a subset of campaigns, can share a single budget that gets doled out automatically to the campaigns that need it.
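As a toy model of the pooling idea (this is not the Bing Ads API, and the campaign names and numbers are invented): campaigns draw from one pot, so budget one campaign leaves unspent is available to the next one that needs it.

```python
# Toy model of a shared budget: each campaign's demand is funded from a
# single pool instead of an individual cap, so unspent budget from one
# campaign remains available to the others.
def allocate_shared_budget(demands, pool):
    spend = {}
    remaining = pool
    for campaign, demand in demands.items():
        spend[campaign] = min(demand, remaining)
        remaining -= spend[campaign]
    return spend

# A $1,000 weekly pool across three campaigns wanting $600/$300/$200
weekly_spend = allocate_shared_budget(
    {"brand": 600, "regional": 300, "display": 200}, 1000
)
```

Under individual budgets of, say, $334 each, "brand" would have tapped out while "display" left money unspent; the pool avoids that mismatch.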

Beta tester David McIntyre from iProspect told Bing Ads, “Shared budgets definitely lightened my workload and saved time related to budget pacing. We used it to essentially combine 400+ regional targeted campaigns into one.”

Shared Budgets is available in the UI. It’s read-only in Editor and the Apps at this point. In the UI, you’ll find the option under the Shared Library.

bing-ads-shared-budgets

Do note that Shared Budgets is not supported by Import or Google Sync at this time. That means if you import updates from AdWords for an existing campaign in Bing Ads that uses a Shared Budget, the campaign budget won’t get updated. Bing Ads says it’s working on that integration for early 2017.

The post Now you can share budgets in Bing Ads, too appeared first on Search Engine Land.