SearchCap: ‘Yext for Food’, Google quality score & Local Ads

Below is what happened in search today, as reported on Search Engine Land and from other places across the web.

68% Of SEOs Do Their Work Without Log File Analysis

A few weeks ago, I conducted a poll asking SEOs whether they do their job with or without log files. I discussed how log files can be an awesome way to uncover SEO issues, including crawling, indexing, and even some ranking problems.

The poll showed that most SEOs, 68%, do their SEO work without looking at log files. We had over 350 responses, and even with my poll disclaimer, the results are pretty revealing.

Almost 30% of SEOs said they do not need log files to do their job, and an additional 38% said they would use them but are "rarely" able to get them. Only 15% said they always get log files, and 18% said they often get them.

Yext begins to verticalize local business listings syndication with ‘Yext for Food’

Business listings with more content see more engagement, tend to rank higher and perform better overall. And as more searches take place on mobile devices (and eventually smart speakers and virtual assistants), marketers will need to expose more local business attributes and enhanced data for discovery and competitive advantage.

According to previous Google research, 50 percent of smartphone users conducting local-intent searches visit business locations within 24 hours. These numbers are even higher and more immediate for restaurants, which often see searches translate into visits within a few hours or less.

TripAdvisor found that “Restaurants with hours of operation on their TripAdvisor listing see 36 percent more engagement than those without them.” Yelp reports, “Businesses who complete their profiles see, on average, 5x the customer leads each month.”

Both sites also point out the importance of images on profiles. TripAdvisor said restaurants with between 11 and 20 photos see “double the amount of diner interaction over others with no photos at all,” and Yelp reports that “a business with 1-5 reviews and at least 10 photos sees 200 percent more user views than a business with the same number of reviews and no photos.”

Markets with home service ads: Service-area businesses are coming back to the local results

After my column about Home Service Ads came out last week, I got a message from Google with some great news. They told me two things:

  1. Google plans to add pure service-area businesses (SABs) back into the local results — this includes home-based businesses.
  2. The disappearance of results for home-based businesses in markets without Home Service Ads was due to a bug (not intentional), which Google says should be resolved soon.

So, almost a year after deciding to remove service-area businesses from the local results, I’m starting to see that Google is adding them back.

This morning, I spotted a search result that now includes service-area businesses. A few days ago, every listing in that same result had a directions icon, meaning the address was showing on the listing.

Although owners of service-area businesses will be extremely excited about this change, service-area businesses aren’t the only listings returning to the local results.

The return of spam

One of the good things about Google's decision to take SABs out of the results was that it eliminated the majority of spammy listings (but definitely not all of them). Looking at just one example, one of the listings that just returned to the local results is a keyword-stuffed duplicate for a business that already has a listing in a neighboring city; businesses aren't allowed two listings. That listing in the neighboring city also uses an address that doesn't exist.

I recently shared at the State of Search event how, after combing through a client's competitors, I got 17 of the 28 home security business listings in one market removed from the local results because they weren't eligible for listings on Google My Business.

Spam is, unfortunately, alive and well.

The return of other junk

Not all the results that don’t qualify for a listing are necessarily “spam.” The term “spam” connotes that there is malicious intent. (“I know about the guidelines, and I don’t care that I’m breaking them because I want more business.”)

Quality score in 2017: Should you care?

You’ve got to hand it to the folks at Google — the idea of quality score is pretty brilliant. Unlike most search engines born in the ’90s, Google realized that the success of paid search advertising was directly tied to the quality and relevance of their paid search ads.

After all, if someone searches for “best dog food for rottweilers,” and the first result they see on the SERP is a handful of text ads selling Toyota hatchbacks, they aren’t likely to be wowed by your search engine. If people think your search engine is lousy, they won’t use it… which means no one will pay to advertise on your search engine, either.

But, if you incentivize advertisers to create ads that are relevant to a user’s search, you can maintain the quality of your SERP and still make money from paid search advertising.

The solution? Quality score.

Now, if you’ve been doing paid search advertising for a while, quality score probably isn’t a new concept. Paid search platforms like Google look at your click-through rate, ad relevance and landing page experience and assign your ads a quality score. As your quality score goes up, your average position tends to go up and/or your average cost per click tends to go down.
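
For the mechanics, the simplified model Google has used in its own explainer videos is that ad rank is your bid times your quality score, and you pay just enough to beat the advertiser ranked below you. The real auction weighs more signals, so treat this sketch (with hypothetical numbers) as an illustration only:

    # Simplified AdWords auction model: Ad Rank = max CPC bid x quality score.
    # The real auction includes more signals; all numbers here are hypothetical.

    def ad_rank(max_cpc, quality_score):
        return max_cpc * quality_score

    def actual_cpc(rank_of_ad_below, your_quality_score):
        # Pay just enough to beat the next ad down, plus one cent.
        return rank_of_ad_below / your_quality_score + 0.01

    you = ad_rank(max_cpc=2.00, quality_score=8)    # rank 16.0
    rival = ad_rank(max_cpc=4.00, quality_score=3)  # rank 12.0

    # You outrank a rival bidding twice as much, and pay about $1.51 per click.
    print(actual_cpc(rival, your_quality_score=8))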

Seems simple enough, right? The better your quality score, the better your advertising results will be.

But is it really that simple? Sure, quality score is great for Google, but should optimizing quality score be a key part of your paid search advertising strategy? To answer that question, let’s take a look at the data.

Quality score and cost per conversion

When it comes to quality score and cost per click, the evidence is pretty clear: improving your quality score decreases your cost per click. Since your cost per conversion is essentially your cost per click divided by your conversion rate, you’d expect that improving your quality score would also improve your cost per conversion.
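
That naive expectation is just arithmetic. A quick sketch with hypothetical numbers:

    # Cost per conversion (CPA) = cost per click / conversion rate.
    cpc = 1.50               # average cost per click, in dollars
    conversion_rate = 0.03   # 3 percent of clicks convert

    print(cpc / conversion_rate)   # 50.0 -- each conversion costs $50

    # If a better quality score cuts CPC by 20% and the conversion rate
    # holds steady, CPA should drop by 20% too:
    print(1.20 / conversion_rate)  # 40.0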

Sadly, that’s not how it actually works out.

Now, you might be thinking, "But Jake, I know I've seen research somewhere showing how a higher quality score is associated with a lower cost per conversion." And it's true. Odds are, you've run into an article discussing the results of this study by Wordstream or this study by Portent.

In both of these studies, cost per conversion typically dropped by around 13 to 16 percent for every point of increase in quality score.

Chart: the relationship between cost per conversion (CPA) and quality score in AdWords.

At Disruptive Advertising (my employer), we’ve audited thousands of AdWords accounts, so we decided to use our database to replicate Wordstream’s study. Not surprisingly, we got about the same results: Every point of increase in quality score resulted in a 13 percent decrease in cost per conversion.

A graph with thousands of data points (like the one above) is a bit hard to interpret, so I’ve used a small representative subset of our data to make things easier below:

Given the consistency of this data, you’re probably wondering how I can say that improving quality score does not reliably decrease cost per conversion. I mean, look at the graphs! There’s clearly a connection between quality score and cost per conversion!

Or is there?

Unfortunately, while these graphs look compelling, it turns out that the trendline has an R² of 0.012. In non-statistical speak, that means quality score explains only about 1 percent of the variation in cost per conversion; the other 99 percent comes from something else.

Would you put a lot of time and effort into a marketing tactic that explains only 1 percent of your results? Neither would I.
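
If you want to check this on your own keyword data, the R² calculation is easy to reproduce. A minimal numpy sketch; the arrays below are hypothetical stand-ins for your exported per-keyword quality scores and costs per conversion:

    import numpy as np

    # Hypothetical per-keyword data; replace with your own export.
    quality_score = np.array([2, 3, 4, 5, 6, 7, 8, 9, 10, 5, 7, 3])
    cpa = np.array([80, 95, 60, 75, 90, 40, 85, 55, 70, 30, 100, 50])

    # Fit a straight line, then measure how much variance it explains.
    slope, intercept = np.polyfit(quality_score, cpa, 1)
    predicted = slope * quality_score + intercept

    ss_res = ((cpa - predicted) ** 2).sum()
    ss_tot = ((cpa - cpa.mean()) ** 2).sum()
    print(1 - ss_res / ss_tot)  # R^2 near 0: quality score explains little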

Why quality score is a poor predictor

There are a lot of reasons quality score is an unreliable predictor of cost per conversion. However, I believe that the biggest reason is also the simplest reason: Quality score is Google’s metric, not yours.

Quality score matters to Google because it helps Google make money, not because it helps you make money. No one sees your ad on the SERP and thinks, “My, what a fine quality score they must have! Anyone with a quality score like that deserves my business.”

While Google cares about providing a relevant experience to their users, they don’t really care about whether or not you’re sending potential customers to your page or getting conversions at an affordable price. You got your click and they got their cash, so Google’s happy.

You, however, still need to drive conversions at an affordable price.

To do that, though, you can’t rely on the metrics Google cares about. Sure, your ad might make Google happy, but if that ad isn’t driving the right people to the right page, you could be wasting a ton of money — even on a keyword with a quality score of 10!

Case in point, over the course of our AdWords audits, we’ve discovered that the average AdWords account wastes 76 percent of its budget on keywords and search terms that never convert.
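
You can estimate the same number for your own account from a search terms report. A minimal pandas sketch, assuming the export has "Cost" and "Conversions" columns (rename these to match your actual headers):

    import pandas as pd

    # Search terms report exported from AdWords; column names are assumptions.
    df = pd.read_csv("search_terms_report.csv")

    # Spend on search terms that have never converted, as a share of total.
    wasted = df.loc[df["Conversions"] == 0, "Cost"].sum()
    print(f"Wasted ad spend: {wasted / df['Cost'].sum():.0%} of budget")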

Here’s how that wasted ad spend affects your cost per conversion (using the same data subset as before):

As it turns out, this data is even scarier than the quality score data. Each 10 percent increase in wasted ad spend increases your cost per conversion by 44 to 72 percent. And while this correlation isn't perfect, it has an R² of 0.597, which means wasted ad spend explains about 60 percent of the variation in cost per conversion.

That’s a lot more compelling than 1 percent.

In fact, we’ve frequently helped clients significantly reduce their cost per conversion by reducing their wasted ad spend. For example, here’s what happened to one client as we reduced their wasted ad spend from 91 percent to 68 percent:

If you think about it, it makes sense that core account factors like wasted ad spend would have a much bigger impact on your cost per conversion than an external metric like quality score. After all, as we pointed out earlier, you can have a great quality score and still be driving people who will never buy to your site.

How to use quality score

All that being said, I still believe that quality score is a valuable metric to track and optimize. Quality score affects your cost per click and average position, which can do wonders for your account — provided that you aren’t hemorrhaging money in other areas.

So if you're not wasting a ton of money on irrelevant clicks, and you feel confident in the quality of your traffic and landing page, optimizing quality score can be a great way to improve your paid search account.

First, open your AdWords account, go to the Keywords tab, and ensure that you’ve added Quality score as a column:

Next, pick a meaningful date range (I’m always partial to the last 6 to 12 weeks), and export your results as a spreadsheet. Open your spreadsheet in Excel, and create a pivot table:

The following settings will allow you to see how much you are spending on each level of quality score:
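
If you'd rather skip the pivot table, the same rollup takes a few lines of pandas. This sketch assumes the exported spreadsheet was saved as a CSV with "Quality score" and "Cost" columns (rename to match your export):

    import pandas as pd

    # Keywords report exported from AdWords; column names are assumptions.
    df = pd.read_csv("keywords_report.csv")

    # Share of total spend at each quality score level.
    spend_by_qs = df.groupby("Quality score")["Cost"].sum()
    print(spend_by_qs / spend_by_qs.sum())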

In this client's data, 12 percent of the budget is being spent on keywords with a quality score of 1. If we assume that those ads are driving relevant traffic (maybe they're bidding on the competition's branded terms?), bumping the quality score of those ads up from 1 to 2 could save them thousands!

Alternatively, if you want to see exactly how much you’re spending on specific keywords with a given quality score, you can set your pivot table up like this:

In this case, I’ve included a filter for cost that allows me to see keywords with a quality score of 1 that the client has spent more than $500 on. This gives me nine high-priority keywords (representing the majority of ad spend on keywords with this quality score) to focus on, which should be a fairly workable number.
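
The same filter is a one-liner in pandas, reusing the assumed column names from the sketch above:

    import pandas as pd

    df = pd.read_csv("keywords_report.csv")  # same assumed export as above

    # Keywords with a quality score of 1 and more than $500 in spend.
    priority = df[(df["Quality score"] == 1) & (df["Cost"] > 500)]
    print(priority.sort_values("Cost", ascending=False)[["Keyword", "Cost"]])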

AdWords rolls out new interface to all advertisers

Google has announced that the new AdWords interface is now available to all advertisers. The new “experience” was unveiled last year, followed by a rollout over several months, from August of last year until the present.

Echo and Home will probably have to tell you they’re always listening — in Europe

A number of Google Home Mini devices that were distributed to members of the press had a defect that caused them to record everything being said around them. This discovery renewed privacy concerns surrounding smart speakers as surreptitious listening devices in our homes.

The problem was first discovered by Android Police. Once notified, Google investigated and fixed the issue:

The Google Home team is aware of an issue impacting a small number of Google Home Mini devices that could cause the touch control mechanism to behave incorrectly. We immediately rolled out a software update on October 7 to mitigate the issue.

Who is affected: People who received an early release Google Home Mini device at recent Made by Google events. Pre-ordered Google Home Mini purchases aren’t affected.

As a general matter, Google Home and Amazon Alexa devices must "listen" to surrounding conversations to capture the "wake words" (e.g., "Alexa," "OK Google") that activate them. Some privacy advocates have sounded alarms about this and expressed concern that these devices could be abused by unscrupulous law enforcement or other malevolent state actors (see Orwell's Telescreen).

In a well-publicized criminal case in Arkansas, local prosecutors sought recordings on an Amazon Echo in a murder investigation. Amazon fought to prevent authorities from getting access to these recordings without a warrant. The defendant in the case ultimately consented to the release of any stored data, so the warrant issue was never formally ruled on by a court.

As Internet of Things devices proliferate, privacy warnings about personal data collection will intensify. It’s very likely that there will be more than 30 million smart speakers in US homes by year-end. Google and Amazon are competitively discounting and aggressively marketing them. Google’s $49 Home Mini was introduced as a low-cost answer to the Amazon Echo Dot, which Amazon just discounted to be $5 cheaper than the Mini.

These devices are also widely available in Europe, which raises the question of how they will be addressed under the forthcoming General Data Protection Regulation (GDPR) taking effect in May 2018. Millions of smart speakers will be installed in European homes by then.

In order to process “personal data,” companies must obtain opt-in consent from users:

Consent must be clear and distinguishable from other matters and provided in an intelligible and easily accessible form, using clear and plain language. It must be as easy to withdraw consent as it is to give it. Explicit consent is required only for processing sensitive personal data — in this context, nothing short of "opt in" will suffice. However, for non-sensitive data, "unambiguous" consent will suffice.

It’s safe to say that these devices will be “processing sensitive personal data” and that explicit consent will be required in every case.
