Tuesday, October 31, 2017

‘Ask Me Anything’ with Google’s Gary Illyes at SMX East

At last week’s SMX East conference, Google’s webmaster trends analyst Gary Illyes took questions from the dual moderators — Barry Schwartz and Michelle Robbins — as well as from the audience in a session called “Ask Me Anything.”

In this post, I will cover that question-and-answer dialogue, though what you’ll see below are paraphrases rather than exact quotes. I have grouped the questions and used section headers to help improve the flow and readability.

Off-site signals

Barry: You’ve been saying recently that Google looks at other offsite signals, in addition to links, and some of this sounded like Google is doing some form of sentiment analysis.

Gary: I did not say that Google did sentiment analysis, but others assumed that was what I meant. What I was attempting to explain is that how people perceive your site will affect your business, but will not necessarily affect how Google ranks your site. Mentions on third-party sites, however, might help you, because Google looks to them to get a better idea what your site is about and get keyword context. And that, in turn, might help you rank for more keywords.

Imagine the Google ranking algo is more like a human. If a human sees a lot of brand mentions, they will remember that, and the context in which they saw them. As a result, they may associate that brand with something that they didn’t before. That can happen with the Google algorithm as well.

Mobile First, AMP, PWAs and such

Michelle: Where should SEOs focus their efforts in 2018?

Gary: If you are not mobile-friendly, then address that. That said, I believe the fear of the mobile-first index will be much greater than the actual impact in the end.

Michelle: When will mobile-first roll out?

Gary: Google doesn’t have a fixed timeline, but I can say that we have moved some sites over to it already. We are still monitoring those sites to make sure that we are not harming them inadvertently. Our team is working really hard to move over sites that are ready to the mobile-first index, but I don’t want to give a timeline because I’m not good at it. It will probably take years, and even then, will probably not be 100 percent converted.

The mobile-first index as a phrase is a new thing, but we have been telling developers to go mobile for seven years. If you have a responsive site, you are pretty much set. But if you have a mobile site, you need to check for content parity and structured data parity between your desktop and mobile pages. You should also check for hreflang tags, and that you’ve also moved all media and images over.

Michelle: Where does AMP fit? Is AMP separate from mobile-first? Is the only AMP benefit the increased site speed?

Gary: Yes, this is correct. AMP is an alternate version of the site. If you have a desktop site, and no mobile site, but do have an AMP site, we will still index the desktop site.

Michelle: If half a site is a progressive web app (PWA), and half is responsive, how does that impact search performance?

Gary: PWAs are JavaScript apps. If they can render, they will do pretty much the same as the responsive site. However, we are currently using Chrome Version 41 for rendering, and that's not the latest, so there are newer APIs not supported by V41. If you're using those APIs, you may have a problem. Google is working to get to the latest version of Chrome for rendering, which will solve that issue.

Barry: I’ve seen desktop search showing one result and a mobile device showing a different page as an AMP result.

Gary: This happens because of our emphasis on indexing mobile-friendly sites. AMP is an alternate version of the regular mobile page. First, the mobile page gets selected to be ranked. Then the AMP page gets swapped in.

Michelle: So that means AMP is inconsequential in ranking?

Gary: Yes.

Michelle: Will there be a penalty for spamming news carousels?

Gary: We get that question a lot. I do not support most penalties. I (and many others at Google) would like to have algorithms that ignore those things [like spam] and eliminate the benefit. I’ve spoken with the Top Stories team about this, and they are looking into a solution.

Michelle: What about progressive web apps (PWAs)? Do they get the same treatment as AMP, i.e., no ranking boost?

Gary: If you have a standalone app, it will show up in the mobile-first index. But if you have both a PWA and an AMP page, the AMP page will be shown.

Michelle: What if the elements removed from your mobile-first site are ads? [Would that make the AMP version rank higher?]

Gary: Your site will become faster [by adopting AMP and eliminating these ads]. The “above the fold” algorithm looks at how many ads there are, and if it sees too many, it may not let your site rank as highly as it otherwise might. But when we’re looking at whether sites are ready for the mobile-first index, we’re more concerned about parity regarding content, annotations and structured data than ads.

Michelle: What about author markup?

Gary: Because AMP pages on a media site can show up in the news carousel, the AMP team said that you shouldn’t remove the author info when you’re creating AMP pages.

Search Console

Barry: When will SEOs be able to see voice search query information in Search Console?

Gary: I have no update on that. I’m waiting for the search team leads to take action on it.

Barry: How is the Search Console beta going?

Gary: It’s going well. There are a significant number of sites in the beta. We’re getting good feedback and making changes. We want to launch something that works really well. I’m not going to predict when it will come out of beta.

Barry: When will they get a year’s worth of data?

Gary: They have started collecting the data. Not sure if it will launch. The original plan was to launch with the new UI. [Gary doesn’t know if plans have changed, or when the new UI will launch.]

Barry: Why is there no Featured Snippet data in Search Console? You built it, tested it, and then didn’t launch it.

Gary: There is internal resistance at Google. The internal team leads want to know how it would be useful to publishers. How would publishers use it?

Barry: It would give us info on voice search.

Gary: I need something to work with to argue for it (to persuade the team leads internally at Google that it would be a good thing to release).

This question about how the featured snippet data would be used was then sent to the audience.

Eric Enge (your author) spoke from the audience: I’d like to use the data to show clients just how real the move to voice search is. There are things they need to do to get ready, such as understand how interactions with their customers will change.

Michelle: So, that data could be used to drive adoption. For now, that sounds like more of a strategic insight than immediately actionable information.

Gary: The problem is that voice search has been here for a couple of years. Voice search is currently optimized for what we have, and people shouldn’t need to change anything about their sites. Maybe there will be new technologies in the future that will help users.

Michelle: I think that it’s more complicated than that. There are things that you can do with your content that will help it surface better in search, and brands can invest resources in structuring content that can handle conversations better.

Ads on Google and the user experience

Michelle: As you (Google) push organic results below the fold [to give more prominence to ads and carousels] … is that a good user experience?

Gary: I click on a lot of search ads. (Note that Googler clicks that occur on our internal network don’t count as clicks for advertisers, so this costs you nothing.)

I believe that ads in search are more relevant than the 10 blue links. On every search page, there's pretty aggressive bidding going on for every single position. Since bids correlate to relevance and the quality of the site, this does tend to result in relevant results.

Barry: Sometimes the ads are more relevant than the organic results …?

Gary: Especially on international searches.

Michelle: How is that determined?

Gary: This is done algorithmically.

Michelle: How can you compare ads to organic if the two aren’t working together?

Gary: The concept of a bidding process and the evaluation of quality are used by both sides. The separation between the groups is more about keeping the ads people who talk to clients away from the organic people, so they don't try to influence them. The ads engineering people can talk to the organic side; that's not forbidden.

Ranking factors and featured snippets

Michelle: Does Google factor non-search traffic into rankings?

Gary: First of all, search traffic is not something we use in rankings. As for other kinds of traffic, Google might see that through Analytics, but I swear we do not use Analytics data for search rankings. We also have data from Chrome, but Chrome is insanely noisy.

I actually evaluated the potential for using that data but couldn’t determine how it could be effectively used in ranking.

Barry: What about indirect signals from search traffic, such as pogosticking? Previously, Google has said that they do not use that directly for ranking.

Gary: Yes, we use it only for QA of our ranking algorithms.

Barry: At one point, Jeff Dean said that Google does use them.

Gary: I do not know what he was talking about. The RankBrain team is using a lot of different data sources. There was a long internal email thread on this topic, but I was never able to get to the bottom of it.

Michelle: Is RankBrain used to validate featured snippets?

Gary: RankBrain is a generic ranking algorithm which focuses on the 10 blue links. It tries to predict what results will work better based on historical query results. The featured snippets team uses their own result algorithm to generate a good result. I have not looked into what that means on their side. RankBrain is not involved, except that it will evaluate the related blue link.

Barry: Featured snippets themselves are fascinating. You said that they are changing constantly. Please explain.

Gary: The context for that discussion was about future developments for featured snippets. The team is working around the clock to improve their relevancy. The codebase underlying it is constantly changing.

Michelle: Does the device being used by the searcher factor in?

Gary: I don’t think so.

Schema and markup

Gary: I want to live in a world where schema is not that important, but currently, we need it. If a team at Google recommends it, you probably should make use of it, as schema helps us understand the content on the page, and it is used in certain search features (but not in rankings algorithms).

Michelle: Why do you want to be less reliant on it?

Gary: I’m with Sergey and Larry on this. Google should have algorithms that can figure out things without needing schema, and there really should not be a need for penalties.

Michelle: Schema is being used as training data?

Gary: No, it’s being used for rich snippets.

Michelle: Eventually the algo will not need the schema?

Gary: I hope so. The algorithms should not need the extra data.

Barry: Is there a team actively working on that?

Gary: Indirectly, absolutely. It probably involves some sort of machine learning, and if so, it’s the Brain team that works on it. I do not know if they have an active project for that.

Barry: How did you get entity data in the past?

Gary: From Freebase and the Knowledge Graph.

Panda and thin content

Barry: You said that pruning content was a bad idea. If you’re hit by Panda, how do people proceed?

Gary: Panda is part of our core ranking algorithm. I don’t think that anyone in a responsible position at Google thinks of Panda as a penalty. It’s very similar to other parts of the algorithm. It’s a ranking algorithm. If you do something to attempt to rank higher than you should, it basically tries to remove the advantage you got, but not punish you.

Ultimately, you want to have a great site that people love. That is what Google is looking for, and our users look for that, as well. If users leave comments or mention your site on their site and things like that, that will help your ranking.

Pruning does not help with Panda. It’s very likely that you did not get Pandalyzed because of your low-quality content. It’s more about ensuring the content that is actually ranking doesn’t rank higher than it should.

Barry: Pruning bad content is advice that SEOs have been giving for a long time to try and help people deal with Panda.

Gary: I do not think that would ever have worked. It definitely does not work with the current version of the core algorithm, and it may just bring your traffic farther down. Panda basically disregards things you do to rank artificially. You should spend resources on improving content instead, but if you don’t have the means to do that, maybe remove it instead.

Using disavow

Michelle: Should you use disavow on the bad links to your site?

Gary: I have a site that gets 100,000 visits every two weeks. I haven’t looked at the links to it for two years, even though I’ve been told that it has some porn site links. I’m fine with that. I don’t use the disavow file. Don’t overuse it. It is a big gun.

Overusing it can destroy your rankings in a matter of hours. Don’t be afraid of sites that you don’t know. There’s no way you can know them all. If they have content, and they are not spammy, why would you disavow them?

Sites like this are very unlikely to hurt you, and they may help you. I personally trust the Google filters.

Barry: Penguin just ignores the links.

Gary: Penguin does that, too (Gary's phrasing implies that there are other algorithms that might filter bad links out, as well).

The post ‘Ask Me Anything’ with Google’s Gary Illyes at SMX East appeared first on Search Engine Land.

SearchCap: Google service ads, local search & Halloween

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.



Customer Experience in the Age of Social Media

Join our social media and CX experts as they explain how social customer service tools can help brands provide winning digital customer experiences. They’ll discuss how to manage that experience across multiple social touch points, leverage evolving social customer service tools and platforms to deliver long-term value and act on real-time customer insights to drive social ROI.

Attend this webinar and learn:

  • social strategies that drive loyalty and advocacy throughout the customer journey.
  • social customer service response techniques that meet — and exceed — customer expectations.
  • how global brands use social networks and communities to grow their customer bases.

Register today for “CX in the Age of Social Media,” produced by Digital Marketing Depot and sponsored by Lithium.


Oh, no! AdWords can now spend double your budget. Or not…

In case you hadn’t already heard, AdWords can now spend up to double your campaign’s daily budget… which is pretty darned irritating!

Fortunately, your favorite PPC superhero is here to save the day.

Yep, here I am! So let’s see if we can’t script our way out of this mess.

For 99 percent of campaigns, I'd normally recommend not using budget caps at all — I like to "tap it not cap it," which basically means it's better to control spend through bids (and ROI) rather than closing up shop with budgets.

However, there are certain instances where budgets are not just useful, but essential — for example, if a client has a specific budget attached to a particular campaign. Yes, Google, some people actually have limited marketing budgets!

At the very least, you should know when the overspend is happening, so you can judge for yourself whether said overspend should continue.

If you’d really like to keep a close eye on costs, have a look at our script to track your account’s spend every hour. For those who only want to be alerted when campaigns are over their budgets, this is where the new script comes in!

This latest script from Brainlabs (my employer) checks each campaign’s spend and budget. All you need to do is set a multiplier threshold — if the spend is larger than the budget multiplied by the threshold, then the campaign is labeled. You’ll get an email listing the newly labeled campaigns, along with their spend and budgets. And if you want, you can set another threshold so that if the spend gets too far over your budget, the campaign will be paused.


The script also checks if the campaign’s spend is under your labeling and pausing thresholds, so it can unlabel and unpause them. That means that when it’s a new day and nothing has been spent yet, the labels will be removed, and anything the script has paused will be reactivated. It also means that if a campaign is over budget, but you increase its budget, the labeling and status will reflect the new, increased budget.

To use the script, copy the code below into a new AdWords Script and change the settings at the top:

  • campaignNameContains and campaignNameDoesNotContain filter which campaigns the script will look at. For example, if campaignNameContains is  [“Generic”, “Competitor”] then only campaigns with names containing “generic” or “competitor” will be labeled or paused. If campaignNameDoesNotContain is [“Brand”, “Key Terms”] then any campaigns containing “brand” or “key terms” in the name will be ignored (and can overspend as they like).
    • This is not case-sensitive.
    • Leave blank, [], to include all campaigns.
    • If you need to put a double quote into campaignNameContains or campaignNameDoesNotContain, put a backslash before it.
  • email is a list of addresses that will be emailed when campaigns are labeled or paused.
    • Note that emails will be sent even when the script is being previewed and not making changes.
  • currencySymbol, thousandsSeparator and decimalMark are used to format the budget and spend numbers in the email.
  • labelThreshold determines how much the campaign must spend, compared to its budget, for the script to label it as overspending.
    • For example, if you set labelThreshold to 1, then campaigns will be labeled and you will be emailed if the spend is equal to the budget. If you set it to 1.2, then the campaign is labeled and email sent if spend is 120 percent of the budget.
  • labelName is the name of the label that will be applied to overspending campaigns.
  • Set campaignPauser to true if you want campaigns too far over their budgets to be paused. Set it to false if you do not want the script to pause campaigns, no matter how much they spend (the script will still label and email you according to the labelThreshold).
  • pauseThreshold determines how much the campaign must spend, compared to its budget, for the script to pause it (if campaignPauser is true).
    • This works the same as labelThreshold: If it is 1.2, then campaigns will be paused if their spend is 120 percent of the budget.
    • This must be greater than or equal to the labelThreshold. The script needs the paused campaigns to be labeled so it knows which to reactivate when their spend becomes lower.
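The threshold comparisons described above reduce to a simple decision per campaign on each hourly run. The sketch below is a hypothetical helper (the function name and return values are illustrative; the actual Brainlabs script is structured differently and works through the AdWords Scripts API), showing how spend, budget and the two thresholds interact:

```javascript
// Hypothetical helper illustrating the labeling/pausing logic described above.
// spend and budget are in the account currency; thresholds are multipliers
// (e.g. 1.2 means 120 percent of the daily budget).
function decideAction(spend, budget, labelThreshold, pauseThreshold, campaignPauser) {
  if (campaignPauser && spend >= budget * pauseThreshold) {
    return 'label-and-pause'; // too far over budget: label, email and pause
  }
  if (spend >= budget * labelThreshold) {
    return 'label'; // over budget: label and email, but keep running
  }
  return 'clear'; // under both thresholds: remove the label and reactivate
}
```

The 'clear' branch is what makes the unlabel/unpause behavior work at the start of a new day, and it is why pauseThreshold must be greater than or equal to labelThreshold: any campaign the script pauses is guaranteed to also carry the label it later uses to find and reactivate it.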

Preview the script to make sure it’s working as expected (and check the logs in case there are any warnings). Then set up a schedule so the script runs hourly.

A few things to note:

  • The script only works with search and display campaigns! It can’t help with video, shopping or universal app campaigns.
  • This script can’t completely prevent your going over budget. The script only runs hourly, so the campaign can go over the spend threshold between runs. And there’s a 15- to 20-minute lag in the spend data.
  • Scheduled scripts don’t run on the hour, so campaigns will not be reactivated as soon as a new day begins. Instead, they’ll be reactivated when the script first runs on the new day, sometime between midnight and 1:00 a.m. Most campaigns receive little traffic at this time anyway — but if that’s not the case for you, you might want to set up automated rules to unpause things exactly as midnight strikes.
  • You can set labelThreshold to be less than 1. For example, if you set it as 0.9, you’ll get an email when the campaign reaches 90 percent of its budget.


The ever-growing local search universe

For those who missed it, Whitespark’s overhaul of the US Local Search Ecosystem interactive tool was recently released, and it does a fantastic job of showing how vast and complex the search industry has become. The ecosystem visualizes the web of search engines, data providers, publishers, directories and other businesses that use local data about businesses to power one simple action that people do every day: search online.

For example, the infographic identifies Infogroup, Acxiom, Neustar/Localeze and Factual as the primary data aggregators, which collect and validate location data from businesses and share that data with publishers such as Apple, Bing, Foursquare and Google. (I refer to data aggregators and large publishers collectively as data amplifiers because they share a business’s location data not just directly with searchers, but also with other apps, tools, websites and businesses that, in turn, reshare that data to people across the digital world.)

In Whitespark’s words, the ecosystem “shows how business information is distributed online, who the primary data providers are, how search engines use the data, and how it flows.” The interactive tool helps you understand the importance of sharing accurate location data and the consequences of maintaining inaccurate data.

For example, because data aggregators influence a web of businesses across the ecosystem, it’s imperative that businesses meet the data formatting requirements of each aggregator. And as you can see, the ecosystem is complex:

The Local Search Ecosystem

Local search expert David Mihm originally developed this infographic in 2009, and over the years, the ecosystem has changed dramatically to reflect the rich palette of destinations that people weave together throughout the process of discovery, as well as the number of companies that influence whether a business’s location data appears as it should when, say, a searcher finds them on Facebook, Yelp or Uber.

A post on the Whitespark blog by Nyagoslav Zhekov dramatizes this evolution, tracing some of the businesses that have joined and departed. For instance, back in 2009, Apple did not even appear on the ecosystem, and Myspace did. In 2017, Apple is one of the principal data amplifiers, and Myspace is not a factor. You can tell by a quick glance at the 2009 version of the infographic how far the industry has grown:

2009 Local Search Ecosystem

Now, here’s the interesting part: As far-reaching as the new infographic is, it’s just the tip of the iceberg. The infographic does not come close to identifying all the companies that license business information from data amplifiers or use it as a starting point to build out their own curated business directory. For instance, a quick glance at the following three lists of local citation sources shows dozens of additional places where business information exists:

Many of the businesses that appear on these lists overlap with those on Whitespark’s local search ecosystem, and they have the same role: receiving and sharing location data that influences which locations appear in search results. But many names on the top citations lists didn’t make the cut and are not part of the infographic. Why? Because of two factors that influence each other:

  • Consumers are using search in more far-reaching and sophisticated ways. They’re using apps, social media sites, websites, search engines and a host of other touch points to do increasingly refined searches for things to do, places to go, services to use and things to buy. They expect the digital world to provide instant access to restaurants, plumbers, museums, tattoo parlors, places for Magic the Gathering meet-ups, places to find spoken poetry and so on. Because of this behavior, the thousands of mobile app platforms, ad networks, navigation systems, data services, social media companies, search engines, directories and so on currently using business information provided by the data amplifiers would make the infographic difficult to comprehend — similar to the Marketing Technology Landscape.
  • At the same time, the ephemeral nature of many of these tools means that the infographic would rapidly be out of date as the various startups or branches of larger organizations either sunset or consolidate into a larger entity. I find it interesting that the fundamental reason the infographic can never be a truly representative look at the scale of the local search ecosystem is the exact reason that focusing your location data management on the data amplifiers is so critical today — something the infographic illustrates well.

The 2017 local search ecosystem is a brilliant foundation to get businesses grounded in the most influential sources of location data. But as the above examples demonstrate, the scope of location data companies far exceeds the Whitespark infographic. Put another way: Consider each wedge on the infographic to be a gateway to even more specialty sites by category.

The scope of location data directories, publishers and aggregators can seem overwhelming. But if you manage multiple brick-and-mortar storefronts, don’t despair. You need not have a presence on every directory on the lists I’ve cited. It’s far more important to focus your efforts on building relationships with data amplifiers. When you share your data with the core aggregators and publishers, you create two advantages for yourself:

  • Amplifiers do the heavy lifting for you by disseminating your data to all the places that require it, however obscure.
  • You stay up-to-date on the emerging technologies and products that the data amplifiers create. Google alone constantly updates its algorithms and products to improve search. By having a relationship with Google — such as publishing your data on Google My Business — you are on the ground floor when product updates happen and when Google launches new products.

Understand the scope and richness of the location data ecosystem. Make sure you are constantly optimizing your data and content to be found everywhere. And let the data amplifiers help you succeed across the ecosystem.


Halloween Google doodle tells the story of Jinx, the lonely ghost

For this year’s Halloween holiday, Google pulled together its full team of doodlers to develop and produce a ghost story.

The story — told in a YouTube video called “Jinx’s Night Out” — is about Jinx, the lonely ghost, who wants to be part of the trick-or-treating activities, but thinks he must first find a costume to hide his true identity.

“The Doodle team took their time crafting a bewitching storyline, adding a little hocus pocus to make the designs dreadfully engaging,” the Google doodlers write on their blog. “Each sequence has its own color scheme, bringing the characters to (after)life with an entirely new animation process.”

The YouTube video doubles as today’s doodle, and includes a sharing icon, along with a link that leads to “Halloween” search results.

Four doodle team members — Melissa Crowton, Cynthia Chen, Sophie Diao and Helene Lerous — worked on backgrounds and design for the “Jinx’s Night Out” mini-movie. Doodler My-Linh Le was the producer; D.E. Levison did the video’s music, and Paulette Penzvalto was the “Scribbler.”

The Doodle team shared everything from initial sketches for Jinx to the following story board on the Google Doodle Blog:

Here’s the final video that’s being shared on Google’s US home page today, in addition to a number of its international pages:

“No bones about it, this was one of the most enjoyable doodles we have worked on,” writes the Doodle team, offering a few words of advice: “Don’t be afraid to show who you really are or let superstition get in the way of a new friendship and you’ll be a graveyard smash.”


Google home services ads program rebrands, expanding to 30 cities by end of 2017

Google is rebranding and rolling out its advertising and verification program for local service providers, which launched in beta in 2015 as Google Home Services. Now known as Google Local Services, the program has expanded to 17 US cities, Google announced Tuesday, with plans to be in 30 cities by year-end.

Service providers can manage their campaigns and appointments through a new Local Services app, available on iOS and Android, rather than via AdWords Express. Businesses can control the number of leads they receive through the program by pausing and enabling their ads in the app. A personalized profile page shows reviews, contact info and unique aspects about the business. Ratings can come via Google My Business or from leads received through the program. Those reviews can then be verified by Google.

Providers can manage leads and requests throughout the day in the Local Services app.

Instead of the typical bidding auction, leads are priced by Google for each job type in each area. Businesses can see the price of a lead when they sign up in the app. Product director of Local Services Kim Spalding said in a phone interview Monday that the pricing is based on “balancing what we know about cost of jobs and overall demand.”

Advertisers set a weekly budget determined by the number of leads they want to receive. Google won’t say specifically what factors go into the rankings in the ad unit, but Spalding said there’s a focus on quality (ratings and reviews), the ability to connect right away, location and a number of other factors.

The results appear on desktop and mobile. The locksmith, plumber, electrician, HVAC and garage door service categories are covered in all of the current cities, which include Atlanta, Boston, Chicago, Dallas, Detroit, Miami, New York, Philadelphia, Phoenix, Seattle, Washington DC, and the California cities of Los Angeles, Riverside, Sacramento, San Diego, San Francisco and San Jose. Some of those cities also have additional categories, such as handyman and house cleaning services. The program was in just five metro areas as of July.

Three Google Local Services ads appear at the top of desktop search results for “plumber Philadelphia.”

The ad units launched on mobile in 2016.


The current iteration on mobile shows two results at the top. Clicking the “More” button leads to a page to enter more details about the job.

Businesses that want to participate need to go through a verification process. Each employee goes through a background check. Spalding says it takes about two weeks to sign up and get certification. Each verified business gets the Google guarantee badge that ensures Google will cover claims up to the job invoice amount if a customer is unsatisfied with the work.

Spalding says they’ve found users prefer the speed of calling to messaging for more urgent types of jobs. In these cases, the ads will often show with just the phone option. Others include both phone and messaging options. Users can also submit lead forms through the service.

Google is competing with several other players in the local services space, including Amazon and Angie’s List.

The post Google home services ads program rebrands, expanding to 30 cities by end of 2017 appeared first on Search Engine Land.

Monday, October 30, 2017

9 reasons why search marketers have been at the cutting edge of marketing technology

As marketing functions increasingly rely on technology, Scott Brinker, aka “Chief MarTech,” laid out nine reasons he believes search marketers are poised for leadership, in a keynote presentation at SMX East in New York City last week.

Search marketers, of course, employ any number of tools and technologies in their work, and the industry has spawned hundreds of products and solutions. Brinker outlined how the work of search marketers touches 22 of the 49 categories he has identified in the Marketing Technology Landscape infographic he has been compiling to track the growth in marketing technology companies.

Search marketers engage with the marketing technology categories circled in blue.

Brinker, program chair for the MarTech Conference series and VP of platform ecosystem at HubSpot, highlighted the core functions of search marketing — testing, analysis, conversion optimization and so on — that encompass the overlap of marketing, technology and management.

With more and more companies creating the role of chief marketing technologist, Brinker says search marketers have long been on the cutting edge of this growing trend.

The post 9 reasons why search marketers have been at the cutting edge of marketing technology appeared first on Search Engine Land.

SearchCap: Bing Ads grows, Amazon ad revenues & PPC audits

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

The post SearchCap: Bing Ads grows, Amazon ad revenues & PPC audits appeared first on Search Engine Land.

Microsoft search revenue grew 15% last quarter, after hovering around 10% previous 4 quarters

In its first quarter results of fiscal year 2018, ending September 30, 2017, Microsoft reported search advertising revenue grew by 15 percent year over year, excluding traffic acquisition costs (TAC). Gross revenue from search advertising rose by $210 million, compared to $124 million the previous quarter. The majority of Microsoft’s search advertising revenue comes from its Bing search engine.

The first quarter of the company’s fiscal year 2018 marks the first meaningful increase in revenue growth for Bing Ads since Windows 10 first came on the scene.

The bump is surprising after four stagnant quarters. Last quarter, Microsoft CFO Amy Hood said total search revenue growth would slow with the renegotiated Yahoo deal and associated change in revenue recognition having passed the one-year mark.

As it has for the previous five quarters, Microsoft cited “increased revenue per search and search volume” for the revenue growth.

LinkedIn generated $1.1 billion in revenue for the quarter, with sessions up more than 20 percent year over year. Both metrics are on par with the previous quarter. Microsoft acquired LinkedIn in December 2016 for $27 billion, nearly all of that in cash. In September, Corporate VP of Microsoft Search Advertising Rik van der Kooi said the company has begun work integrating the LinkedIn Graph with Microsoft’s Audience Intelligence Graph, which includes search history. Advertisers are keen to synthesize LinkedIn’s professional user data with Microsoft’s audience and intent data. In addition to building on LinkedIn’s business, Microsoft is focusing on using the acquisition to grow Office 365 and Dynamics 365 products. Office 365 revenue rose 42 percent over last year.

Microsoft CEO Satya Nadella told analysts on the earnings call, “You’ll see more product integration in fiscal 2018 as we continue to accelerate our innovation to connect the world’s leading professional cloud with the world’s leading professional network.”

Overall, Microsoft reported $6.58 billion in net income, or $0.84 per share, beating analysts’ expectations. That compares to $5.67 billion in profit, or $0.72 per share, the previous year.

The post Microsoft search revenue grew 15% last quarter, after hovering around 10% previous 4 quarters appeared first on Search Engine Land.

How to ensure your external PPC account audit isn’t a waste of time


If you run a PPC agency, you’ll know it’s not that unusual for clients to occasionally bring in an outside auditor to review their PPC accounts.

Sometimes, your client will let you know in advance; sometimes, you’ll find out when you see a request to access the account.

And sometimes, you won’t find out until after the fact, when the final report is forwarded to you for discussion!

I completely understand why some clients like to have an outside audit of their PPC accounts. For some companies, it’s simply part of their due diligence. For others, an executive will come up with the idea and push it through. And for some, it’s impossible to resist the allure of a “free” audit.

I can also understand why clients might hesitate to inform their PPC agency of their decision. They might feel embarrassed or uncomfortable about the situation. Or they may feel ambivalent about the audit itself.

In some cases, it may be that the client doesn’t trust the agency not to do some quick “fixes” in anticipation of the audit. (Although I have to say, if you don’t trust your agency enough to let them know of the audit in advance, you definitely shouldn’t trust them to run your campaigns!)

But whatever the situation, external audits are something that PPC agencies have to expect. But what’s it like to go through one? And how could the process be improved?

Today, I’m going to tell you about a recent external audit one of our clients initiated and some of the issues the process raised.

When your client brings in an external auditor

In this case, our client let me know up front that they were bringing in an external auditor, which I appreciated. But at the same time, I was rather surprised, too. This was an account we’d held for about five years, and we had good communication with them. Moreover, we’d gotten them some excellent results, and everyone seemed very happy all round.

As we learned later, the audit came about because a different executive in the company had been approached with the offer of a free PPC audit, and he felt the company had nothing to lose. So they agreed to it.

Meanwhile, my contact at the company reassured me that they were happy with our work. She said they had worked with “good” and “bad” agencies before and knew the difference. She also recognized that the outside auditor wasn’t entirely neutral in this process. (Was this free audit a marketing strategy by the auditor? We weren’t sure. But assuredly, any “free” audit has strings attached.)

At the same time, I reminded myself that my agency had never lost a client due to an audit (knock wood!). More importantly, we had nothing to hide, and I had total confidence in my team and our work.

And who knows? Maybe the report would have some helpful recommendations. Having a fresh set of eyes on an account is never a bad idea.

Besides, how detailed would a “free” audit be?

A few days later, my client presented me with the report. And it was huge! It ran about 35 pages and was very detailed and thorough. At first, I was excited. Surely this would yield all kinds of valuable information! But once I started to dig into it, my enthusiasm started to flag.

Because, as it turned out, the report suffered from two major problems:

  1. It mostly regurgitates what’s currently happening with the account.
  2. It contains a lot of incorrect assumptions.

Problem #1: A regurgitation of existing data

Unfortunately, the report didn’t contain anything surprising or new. It was mostly a detailed recounting of what was currently happening with the account. And of course, we already knew what was happening with the account.

If my client had asked, I could have easily filled her in on account details without going to an outside auditor. And my team and I do make a concerted effort to communicate with our clients. We usually have weekly or bimonthly standing calls with them, and we also provide them with relevant reports.

Is it possible that the client was looking for information we weren’t providing? Possibly. But again, if we had been alerted to this need, we would have been more than happy to provide it. (If nothing else, the lesson here is to occasionally check in with the client to see if they want more detailed, or different, reporting.)

Much more problematic than the redundancy in the report was its lack of recommendations. The vast bulk of the report was focused on current account status, not suggestions for changes or improvements — which seemed like a lost opportunity.

Problem #2: Incorrect assumptions

Another major issue with the report was that many of its conclusions were based on incorrect assumptions.

The auditor lacked the context to clearly understand what was going on with the account. Repeatedly, the auditor found “errors” that weren’t errors at all — which he would have known if he’d had more background information.

Without this context, the value of the whole audit exercise comes into question.

What kind of information was the auditor lacking? I can think of four specific areas the auditor should have inquired about before even logging into AdWords:

1. What is the company’s business? What are its goals?

Whenever we land a new client, we ask the owner or marketing team to complete an onboarding questionnaire. The questionnaire allows us to better understand their business and its goals. It only seems logical that an auditor would go through a similar process.

After all, how can you audit a PPC account when you know little about the company?

We can also extend this “context for understanding” to PPC tools. Not everything happens in AdWords. In this case, my team and I were using Google Analytics for some of our tracking, and the auditor missed this point completely.

2. What tests are the agency currently running?

As an agency, we use labels religiously to clarify what we’re doing in client accounts — especially in terms of testing. But not all agencies do. And even so, it can be impossible to capture the complexity of these tests in one little label. Auditors would need to get more detailed information outside of the account to fully understand what’s being tested and why.

For example, we were in the process of testing the “optimize for clicks” setting on some of our client’s campaigns. Of course, the auditor saw this setting selected and immediately marked it with a big red “X” in the report.

We knew (and the client knew) why we were testing this setting. But the auditor didn’t — and so he spent several paragraphs explaining why this isn’t an optimal setting in most cases.

3. What strategies and tactics have been tried in the past and haven’t worked?

Similarly, it would be helpful for the auditor to know what things we’ve tested in the past — and the results.

For this particular client, the auditor noted that we didn’t have any non-branded keywords live. Why? Because the nature of this client’s business is seasonal. And in the past, we had heavily tested non-branded keywords in peak season, with disappointing results each time.

This year, we decided (in consultation with the client) to ditch non-branded keywords during peak season and expand our Google Display Network efforts instead. The result: a major success!

But of course, the auditor didn’t know any of this. So he marked another big X and wrote a few more paragraphs explaining why non-branded keywords are important.

4. What projects are slated for testing in the next quarter or two?

As with all our clients, we had plans in place for testing over the next few months, including device adjustments and audience tests.

But again, the auditor wasn’t aware of these plans. When he noted their absence, he assigned more red Xs and gave more lengthy explanations for why they should be done. But we knew that already.

Make your audit worth your time

Based on this experience, I can only conclude that audits can eat up a lot of hours. The client had to spend time arranging for the audit and reviewing the report. I had to spend time reviewing the report and responding to the findings. And I can only imagine how many hours the auditor spent auditing the accounts and writing his report.

Therefore, we can conclude that even a free audit comes at a cost. So if you decide to move forward with one, whether free or not, make it worth your time by ensuring that the auditor has answers to the questions outlined above. And suggest that they put more emphasis on making recommendations than recapping current status.

Hopefully, by putting these pieces in place, you’ll end up with an accurate and valuable final report — that doesn’t immediately get filed in the circular folder.

The post How to ensure your external PPC account audit isn’t a waste of time appeared first on Search Engine Land.

13 outdated SEO tactics that should terrify you


As we approach Halloween and our Netflix queues again fill up with all manner of spooky, startling and downright horrifying monsters, I’m reminded of another kind of monster we should all be afraid of: outdated SEO tactics.

These tactics range from harmless but ineffective (like Casper the Friendly Ghost) all the way to completely devastating (like Freddy Krueger). And much like the bad guy in so many of the horror movies we all grew up watching, these tactics never seem to die, despite common sense, SEO professionals, and even Google warning people away from them.

So today, we’re going to delve into 13 outdated SEO tactics that you should be terrified of and avoid at all costs.

1. Link and article directories

Link directories are generally useless today, with the exception of high-quality, niche-specific directories that follow strict editorial guidelines.

Long before search engines were as powerful and effective as they are today, link directories served as a way to categorize websites so that people could find what they were looking for. Thanks to the simplicity of installing and using the software that powers them, marketers’ insatiable appetite for fast and easy links, and website owners’ hunt for additional revenue streams, link directories exploded in popularity.

But, since they didn’t provide any real value to visitors, search engines began to ignore many of these link directories — and they quickly lost their effectiveness as a link-building tactic. Eventually, link directories became a toxic wasteland of low-quality links that could actually get your website penalized.

Article directories are even worse. What started off as a way to share your brilliant insight with a larger audience while earning links was quickly abused: marketers began using software to “rewrite” their articles and submit them to thousands of article directories at a time.

As with link directories, article directories — now bloated with low-quality content — simply hit a point at which they provided no value to visitors. Marketers just used them for fast and easy links. Indeed, the glut of low-quality content flooding the web through these article directories appeared to be the proverbial straw that broke the camel’s back right before the release of Google’s Panda update, which decimated countless websites.

With the exception of high-quality, niche-specific link directories — and you may only find one or two in any given industry — you should avoid link and article directories entirely.

2. Exact-match domains

For a while, exact-match domains (EMDs) were a hot topic because they became a silver bullet for search engine optimization. It was easy to throw up a microsite on an exact-match domain and rank far more quickly than a traditional, branded domain — often in weeks, sometimes in days.

With an EMD, your domain matches the exact keyword phrase you’re targeting. For example:

  • residentialarchitectmiami.com
  • tampacontractor.com
  • airconditioningrepairstpete.com

But much like a werewolf when the full moon wanes, EMDs quickly lost their power as Google adjusted their algorithm.

Exact-match domains have the potential to rank as well as any other domain, but they also seem to have a higher potential to be flagged for spam, either algorithmically or manually. They become an even riskier proposition when you consider that they generally aren’t as “brandable,” and as a result, the domain will generally be viewed as less trustworthy, which can reduce conversions and make link building more difficult.

3. Reciprocal linking

Search engines view a link to another site as a “vote” for that site — so reciprocal linking is essentially saying, “If you vote for me, I’ll vote for you.” This is the very definition of manipulative linking practices, yet that didn’t stop millions of marketers from blindly trading links, even with websites that had zero relevance to theirs.

Worse yet, rather than links embedded within valuable content, these links were often simply dumped on a “links” or “resources” page, sometimes broken into categorical pages, along with hundreds of other links, offering no value to visitors.

This tactic, though ineffective today, still stumbles slowly along like a putrid and rotting zombie, more than a decade after its death.

4. Flat URL architecture

This isn’t really a “tactic” as much as it is just the default way WordPress is set up, and most people don’t know that they need to change it.

Ex. 1: http://ift.tt/1f1fzRH

vs.

Ex. 2: http://ift.tt/2gOSNBS

A flat URL structure (Ex. 1) makes it more difficult for search engines to understand the hierarchy of your website because all of your pages are treated with the same level of importance. A faceted or nested URL structure (Ex. 2), by contrast, clearly communicates the importance of each page in relation to every other page within your website.

The first step is to change your default permalink settings. Then, if you haven’t already, publish your second-level pages, and create corresponding blog categories; or, if they already exist, move them and set up any applicable redirects.

The slugs for your categories must exactly match the slugs for your second-level pages. This little detail is critical because it determines how search engines will value each page within your website relative to other pages within your website.

Once properly configured, each third-level page and blog post will appear as a sub-page of the applicable second-level page based on the blog category it is assigned to. In other words, each third-level page/post adds more authority to the page it appears nested under.

It’s important to think this through thoroughly because changing it later means having to redirect all of the pages in your website and potentially losing ranking.
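To make the planning concrete, a move from a flat to a nested structure boils down to mapping each post slug to its category slug and issuing a 301 redirect for every old URL. A rough sketch, with hypothetical slugs:

```python
# Sketch (hypothetical slugs): moving from a flat structure (/post-slug/)
# to a nested one (/category/post-slug/). Every old flat URL needs a 301
# redirect to its new nested location, or rankings can be lost.

posts = {
    "fixing-a-leaky-faucet": "plumbing",   # post slug -> category slug
    "choosing-a-thermostat": "hvac",
}

def redirect_map(posts):
    """Build an old-flat-path -> new-nested-path map for 301 rules."""
    return {f"/{slug}/": f"/{category}/{slug}/"
            for slug, category in posts.items()}

for old, new in redirect_map(posts).items():
    print(f"301: {old} -> {new}")
```

The category slug in each new path is exactly the slug of the matching second-level page, which is the matching requirement described above.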

5. Indiscriminate guest blogging

Contrary to what some people claim, guest blogging is far from dead. However, it has changed dramatically. To fully understand the context, it’s important to understand the evolution of guest blogging over the years.

Guest blogging has roots in traditional public relations. The basic premise is that you’re trying to leverage a larger, existing audience by publishing your article on an established publication. This helps you to:

  • create more exposure.
  • build authority, credibility, and trust.
  • demonstrate your expertise.
  • build a personal brand.

In the early days, you would seek out publications for guest posting opportunities based on the size and, more importantly, the relevance of their audience. The intent was to get in front of more of the right people, and this involved writing killer content that their audience would find valuable, which would usually include a short bio, and maybe even one or more links back to your own website.

Website owners attempting to keep Google happy by constantly adding fresh content were all too eager to publish a steady stream of posts from guest authors, and because links are the lifeblood of SEO, people quickly latched onto this tactic to build links and sucked the life out of it like a ravenous vampire.

Marketers soon began submitting guest posts to any website that would accept them in an attempt to acquire a link.

Your website is about construction? Great! Let me submit an article on construction trends, along with a bio that includes a link back to my crochet website — relevance be damned! The next predictable step was that many marketers began submitting completely off-topic articles, and website owners eagerly published them.

This is why we can’t have nice things.

Google understandably showed up like a mob of angry villagers with pitchforks and torches to put an end to this nonsense and, as they often do, created a lot of collateral damage in the process. Websites were penalized, and while some took years to recover, many never did, so their owners had to start over on a new domain. A lot even went out of business.

For a while, people shied away from guest blogging, but today, it’s returned to its traditional roots.

6. Keyword stuffing

Back when search engines were only capable of interpreting simple signals, like keyword density, stuffing keywords by the truckload into a web page to make it seem more relevant was all the rage. What should have been just a few instances of a particular phrase sprinkled throughout a web page grew faster than a zombie outbreak.

This doesn’t work — and more importantly, it makes it look like you employ drunk toddlers to write your copy, which doesn’t do much to inspire trust in your company.
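Keyword density itself is trivial to compute, which is exactly why it was so easy to game. A quick sketch of the metric (the regex tokenizer is a simplification, and the sample text is invented):

```python
# Keyword density: the crude signal early engines leaned on, i.e. the
# share of the page's words taken up by occurrences of a target phrase.
import re

def keyword_density(text, phrase):
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count sliding-window matches of the full phrase
    hits = sum(words[i:i + n] == phrase_words
               for i in range(len(words) - n + 1))
    return hits * n / len(words)

stuffed = "Tampa contractor Tampa contractor best Tampa contractor in Tampa"
print(round(keyword_density(stuffed, "Tampa contractor"), 2))  # → 0.67
```

A density like that (two-thirds of all words) is exactly the “drunk toddler” copy that stuffing produces; natural writing lands orders of magnitude lower.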

7. Exact-match anchor text

At one point, anchor text — the clickable text of a link — was a huge ranking factor. For example, if you wanted to rank for “Tampa contractor,” you would have tried to acquire as many links using Tampa contractor as the anchor text as you could.

Marketers predictably abused this tactic (seeing a trend yet?), and Google clamped down on it and dropped the ranking for websites with what they deemed to be unnatural amounts of keyword-rich anchor text backlinks.

The anchor text distribution for a natural link profile will generally have a lot of variety. That’s because if 100 different people linked to the same page on your website, each link would likely be used in a different context within their content. One person might link to your web page using anchor text that describes the product (“blue widgets,” for example), while another may link using anchor text that describes the price, and yet another might even link using nondescript anchor text like “click here” or something similar.

Below is an example of the anchor text distribution for Search Engine Land.


The majority of your anchor text will not be an exact match to the keyword topics you’re targeting unless they are part of your brand or domain name. And this is OK because today, rather than anchor text, Google places more emphasis on:

  • the relevance of the linking website to your website.
  • the authority of the linking website.
  • the number of relevant links from authoritative websites pointing to your website.

I wouldn’t put too much effort into controlling the specific anchor text that others use to link to your website — it’s a waste of time, and it can potentially harm your ranking if you go overboard and create an unnatural pattern. The majority of anchor text for most websites with a natural link profile will generally be for branded terms anyway.
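The distribution itself is easy to profile if you have an export of your backlink anchors. A small sketch, with invented anchor texts:

```python
# Sketch: profiling the anchor-text distribution of a backlink export.
# The anchors below are invented for illustration; a natural profile is
# dominated by branded and nondescript anchors, not exact-match keywords.
from collections import Counter

anchors = [
    "Search Engine Land", "searchengineland.com", "click here",
    "Search Engine Land", "SEO news", "this article", "Search Engine Land",
]

def anchor_distribution(anchors):
    """Share of total links per (case-insensitive) anchor text."""
    counts = Counter(a.lower() for a in anchors)
    total = len(anchors)
    return {a: round(c / total, 2) for a, c in counts.most_common()}

print(anchor_distribution(anchors))
```

Here the branded anchor accounts for the largest share, with everything else scattered, which is roughly the healthy shape described above; a profile dominated by one exact-match commercial phrase is the unnatural pattern to avoid.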

8. Pages for every keyword variation

Keyword phrases, in the traditional thinking, are dead. The old approach involved creating a separate page for every keyword variation, but fortunately, search engines are a lot smarter today, so this isn’t necessary.

Google’s Knowledge Graph, based on latent semantic indexing, started to kill off traditional thinking, but RankBrain drove a stake into its heart. Today, websites that still follow this antiquated tactic perform a lot like the zombie hordes you see mindlessly wandering around in a George Romero movie in search of fresh brains to devour.

RankBrain is just a catchy name for Google’s machine-learning artificial intelligence system (Skynet was already taken, apparently) that helps it to better understand the user intent behind a query. It can even help Google to (appropriately) rank a web page for keyword phrases that aren’t in the content!

This means that if you write content for a page about HVAC services, RankBrain understands that it would also very likely be a good match for a user entering any of the following queries:

  • HVAC repair.
  • HVAC maintenance.
  • HVAC tune-up.
  • HVAC cleaning.

If you’ve created individual pages for each keyword variation in the past, you may be tempted to leave them and just stop doing it in the future, but that’s not enough. You need to prune the unnecessary pages, merge content that can be merged, and create any applicable 301 redirects, because these unnecessary pages will have a negative impact on how Google views your website, and how often and how thoroughly it is crawled.

So, instead of creating an individual page for every keyword phrase you want to rank for, create a more comprehensive page for a keyword topic. Using the HVAC example we mentioned earlier, this would involve creating a page about HVAC services, along with a subheading and content for each of the additional highly-related phrases.
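The pruning step above reduces to mapping every variation URL to the one comprehensive topic page. A sketch with hypothetical URLs:

```python
# Sketch: consolidating keyword-variation pages into a single topic page.
# URLs are hypothetical. Each pruned variation page gets a 301 redirect
# to the comprehensive page so its equity and visitors aren't lost.

variation_pages = [
    "/hvac-repair/",
    "/hvac-maintenance/",
    "/hvac-tune-up/",
    "/hvac-cleaning/",
]
topic_page = "/hvac-services/"

redirects = {old: topic_page for old in variation_pages}
for old, new in redirects.items():
    print(f"301: {old} -> {new}")
```

On the surviving topic page, each former variation typically becomes a subheading with its own content, as described above.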

9. Paid links

Paying for PageRank-passing links has been a clear violation of Google’s webmaster guidelines for a long time, but like the machete-wielding villain at Camp Crystal Lake, this one simply refuses to die.

I take a pragmatic view of buying links: They can work to improve your ranking in the short term, but you may eventually get caught and penalized, so is it really even worth it?

You might think you can be really careful — buy just a few links to get some traction and stay under Google’s radar — but that’s not going to happen. They are always hunting for both link buyers and link sellers, and it’s shockingly easy because all they have to do is follow the links.

You might be thinking, “Pffft… I know what I’m doing, Jeremy! I’m careful when I buy links!” Sure you are. But can you say the same thing about the site owners you buy the links from? Or everyone else who buys links from them?

Let’s say Google catches one link buyer by identifying an unnatural pattern of inbound links — all they need to do next is evaluate the outbound links of anyone linking to that buyer to identify more link sellers. In turn, that will uncover more link buyers, which again uncovers more link sellers.

See how fast it all goes downhill? So just don’t buy links.

10. Low-quality content

I recently gave a presentation on digital marketing to a group of franchisees of a large national brand. While discussing the type of content they should be producing for their websites, one of the franchisees frustratedly said, “I can’t write articles for my website — it takes too much time and effort just to do what I’m doing now!”

Effective SEO requires you to regularly produce amazing content — which is, understandably, difficult for time-strapped marketers. A lack of time and resources can often lead to rushing content creation, or worse yet, outsourcing it to non-English speakers or budget services like Fiverr or Upwork. The resulting content is often the text equivalent of the unintelligible grunts from Frankenstein’s monster.

The days of simply producing content just for the sake of publishing something are, fortunately, far behind us, thanks to Google’s Panda update in 2011. Since then, Panda has been further refined and folded into the core algorithm.

Your content should be robust, well-written, accurate and engaging. There is no minimum or maximum ideal length; it just needs to be long enough to serve its purpose. Sometimes that may mean just a few hundred words, and other times, that may mean several thousand words.

While we’re on the subject of writing content…

11. Writing for bots rather than people

If you’ve ever seen a web page or an article that repeats a particular keyword over and over, awkwardly forces a keyword phrase into a sentence in a way that doesn’t make sense or incorporates unnecessary heading tags, then you’ve probably seen an example of someone writing for bots rather than people.

SEO has come a long way since the early days, when we had to really spell everything out in order for the search engines to understand and rank a page. You don’t need to do that anymore. Write for people, because they will be the ones buying your products or services.

12. Creating multiple interlinked websites

There are two approaches to creating multiple interlinked websites — and neither one is an effective SEO tactic today.

The first approach is interlinking several legitimate websites that you own. This is the lesser of two evils because if done properly, it won’t result in a penalty. However, it also won’t have much impact, if any, on your SEO efforts, since search engines place a high value on the number of linking root domains, not just the total number of links. Another black mark against this approach is that it reduces the resources you can direct to marketing your primary website.

An example of this being done properly would be when a residential home builder links to a mortgage company that they also own, because there is a high relevance between both websites.

The second approach, which is unquestionably black hat, is to create a series of websites just for the purpose of linking to other websites you own. Since this tactic requires you to create an ever-growing network of websites on such a scale that the only way to describe it would be a gremlin pool party, it’s an absolute certainty that you will also create a pattern that Google can identify, which will result in a penalty.

Instead of trying to build, manage and market multiple websites just to acquire a few measly links, focus your efforts on earning lots of high-quality links from other legitimate websites. An added benefit is that as those websites become more authoritative, their links to your website will become more powerful.
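The earlier point about linking root domains can be sketched as well: four raw links may collapse to just two voting domains. (The naive netloc split below ignores multi-part public suffixes like .co.uk; a real tool would use the public suffix list, and the URLs are invented.)

```python
# Sketch: why interlinking your own sites adds little. Search engines
# weigh unique linking root domains, so many links from one domain
# count as roughly one "vote".
from urllib.parse import urlparse

backlinks = [
    "https://blog.example.com/post-1",
    "https://blog.example.com/post-2",
    "https://news.example.com/story",
    "https://other-site.org/review",
]

def root_domains(urls):
    """Unique registered domains (simplified: last two netloc labels)."""
    return {".".join(urlparse(u).netloc.split(".")[-2:]) for u in urls}

print(len(backlinks), len(root_domains(backlinks)))  # → 4 2
```

Four links, but only two root domains: the two example.com subdomains collapse into a single vote, which is why a network of your own interlinked sites moves the needle so little.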

13. Automated link building

When links became an essential part of SEO, marketers predictably sought ways to maximize their link-building efforts using a variety of automated software programs. They blasted their links into guestbooks, blog comments and forums, submitted their websites to bookmarking services and link directories and spun poorly written articles by the thousands, for submission to every article directory they could find.

I’m all for automating certain tasks to improve efficiency within your business, but link building is not one of them because the only kind of links that can be built this way violate Google’s webmaster guidelines.

You can call me a purist, but there is simply no way to automate high-quality link building. That requires creating amazing content and developing relationships to earn links to it. There are no shortcuts.

The post 13 outdated SEO tactics that should terrify you appeared first on Search Engine Land.

Rothy’s uses Twitter Ads to drive brand awareness

Creating an impactful social strategy can be a difficult task. How do you ensure your content is on brand? How can you leverage social to drive brand awareness?

One brand that stands out with their engaging Tweets and impactful Twitter Ads is sustainable fashion start-up @rothys. We chatted with their marketing team to learn about their Twitter strategy.

Tell us a bit about yourself.

Lacey Young: I’m the brand manager here at Rothy’s. I’m passionate about content creation, social strategy, and leveraging digital media to build brands.

Jenny Robinson: And I’m Rothy’s interim VP of e-commerce and digital marketing. Previously, I led e-commerce and digital marketing for luxury beauty brands.

What does a typical work day look like for you?

We are a lean team at a fast-growing brand, so we wear many hats. Our team focuses on brand strategy, acquisition, and honing our messaging. We tackle everything from marketing strategy, site optimization, and performance marketing to the planning and execution of photo shoots and content creation. Part of our time is spent interacting with our community on social media and figuring out how best to delight our customers.

All while wearing our Rothy’s, of course.

You all have done a great job at getting your brand out there. What’s your marketing strategy?

Word-of-mouth marketing from our passionate customers plays a critical role in our marketing strategy. Another key piece is focusing on reaching potential customers online where they are — like on Twitter. By leaning on the visual appeal and stylishness of the shoe, we let the product do most of the talking in our marketing.

How does Twitter fit into your marketing mix?

We use a data-driven approach with our marketing initiatives to ensure we have the ideal media mix across channels. Twitter acts as a catalyst for reaching people where other platforms can’t — when people are on the go, looking to consume content. This is in contrast to the way people behave on other platforms, where they may not be looking to engage with brands.

Describe the Rothy’s brand.

Rothy’s are a chic, front-of-the-closet shoe, designed for life on the go. They are made from recycled materials and are out-of-the-box comfortable. We make them with little waste, caring about the planet as much as good design.

What are a few Tweet examples that you think really capture the Rothy’s brand?

Part of telling our brand story is educating our customer on Rothy’s product benefits. The shoes are machine washable, for example, which someone might not realize right away.

Our team members and loyal customers are important to our brand. We highlight them by sharing beautiful images of them wearing their Rothy’s.

We stand out from other fashion brands by using strong imagery that features our sustainable materials.

How do you use Twitter Ads to amplify your marketing efforts?

Twitter is instrumental in building and maintaining awareness for Rothy’s. Part of our marketing strategy is to maintain brand awareness wherever our customers’ eyeballs are. Since both prospective and current customers are consuming content and engaging on Twitter, serving ads on the platform lets us be in the right place at the right time — which is perfect for our growing brand.

How do you plan and create Twitter Ads that will resonate with your target audience?

When people scroll through their Twitter feed, they’re looking for concise, unique content, more so than on other platforms. People want to see information that is relevant and interesting to them — at lightning speed.

In order to catch someone’s attention in such a fast-paced environment, it’s important to have original creative that’s eye-catching. For us, this means bright images and short copy that pops. The fact that people scroll through their feed quickly scanning for interesting content drives all of our decisions when crafting creative and messaging.

Here are a few standout Twitter Ads which feature some of our secret sauce:

Any final tips for brands trying to increase brand awareness with Twitter?

Here are our top three tips to keep in mind when creating Twitter Ads:

  • Pair eye-catching creatives with concise copy. This is paramount to grabbing people’s attention.
  • Set your bids in line with your performance goals, optimizing for each audience and ad creative combination.
  • Ensure that your audience targeting is relevant to your brand and target demo. Don’t forget retargeting and email match!

Know a brand doing interesting things on Twitter? Tweet us @TwitterBusiness.

The post Rothy’s uses Twitter Ads to drive brand awareness appeared first on Search Engine Land.

Friday, October 27, 2017

Amazon Q3 ad revenues surpass $1 billion, up roughly 2X from early 2016

Yesterday Amazon announced third-quarter earnings. The company reported sales growth of 34 percent to $43.7 billion. A year ago Amazon reported $32.7 billion in sales.

For the purposes of this post, the noteworthy part is Amazon’s “other” revenue, which is basically advertising. Buried at the bottom of the Net Sales chart in the press release was this line item:

Other is defined by Amazon to include “sales not otherwise included above, such as certain advertising services and our co-branded credit card agreements.” It’s a safe bet, then, that ad sales for the quarter were $1+ billion, which represented 58 percent year-over-year growth. Since Q2 of 2016, ad sales have basically doubled.
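The growth figures above can be sanity-checked from the numbers cited in the post. A minimal sketch (using only the $43.7 billion and $32.7 billion net sales figures reported, and the "$1+ billion at 58 percent growth" estimate for the "other" line — the $1.0B base used below is an assumed round floor, not a reported figure):

```python
# Sanity-check the growth figures cited in the post ($ billions).
q3_2017_sales = 43.7  # reported Q3 2017 net sales
q3_2016_sales = 32.7  # reported Q3 2016 net sales

# Year-over-year net sales growth — should match the "34 percent" cited.
yoy_sales_growth = (q3_2017_sales - q3_2016_sales) / q3_2016_sales
print(f"Net sales growth: {yoy_sales_growth:.1%}")  # ~33.6%, rounds to 34%

# If "other" (ad) revenue was at least $1.0B and grew 58% YoY, the
# implied year-ago base is at least $1.0B / 1.58. ($1.0B is an assumption.)
implied_base = 1.0 / 1.58
print(f"Implied Q3 2016 'other' revenue: >= ${implied_base:.2f}B")  # ~$0.63B
```

Note the implied year-ago base of roughly $0.6 billion is consistent with the claim that ad sales have about doubled since the first half of 2016.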

On the earnings call, Amazon CFO Brian Olsavsky said “Advertising revenue continues to grow very quickly and its year-over-year growth rate is actually faster than the other revenue line item that you see there [in the ‘other’ category].”

The fact that Amazon is now on par with, or surpasses, Google in product search is not lost on retailers and brand advertisers. Reflecting the company’s intensifying effort to attract ad revenue from search marketers and agencies, Amazon made its first appearance at SMX East in New York City this week to promote its Amazon Marketing Services advertising offerings.

The post Amazon Q3 ad revenues surpass $1 billion, up roughly 2X from early 2016 appeared first on Search Engine Land.

SearchCap: Google search location changes, GOOG earnings & Facebook marketplace

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

From Search Engine Land:

Recent Headlines From Marketing Land, Our Sister Site Dedicated To Internet Marketing:

Search News From Around The Web:

The post SearchCap: Google search location changes, GOOG earnings & Facebook marketplace appeared first on Search Engine Land.