Google Look Out

Quality score in 2017: Should you care?

You’ve got to hand it to the folks at Google — the idea of quality score is pretty brilliant. Unlike most search engines born in the ’90s, Google realized that the success of paid search advertising was directly tied to the quality and relevance of their paid search ads.

After all, if someone searches for “best dog food for rottweilers,” and the first result they see on the SERP is a handful of text ads selling Toyota hatchbacks, they aren’t likely to be wowed by your search engine. If people think your search engine is lousy, they won’t use it… which means no one will pay to advertise on your search engine, either.

But, if you incentivize advertisers to create ads that are relevant to a user’s search, you can maintain the quality of your SERP and still make money from paid search advertising.

The solution? Quality score.

Now, if you’ve been doing paid search advertising for a while, quality score probably isn’t a new concept. Paid search platforms like Google look at your click-through rate, ad relevance and landing page experience and assign your ads a quality score. As your quality score goes up, your average position tends to go up and/or your average cost per click tends to go down.
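To see why a higher quality score tends to mean cheaper clicks and better positions, here is a minimal sketch of the commonly cited simplification of the AdWords auction. The numbers and the two-advertiser setup are purely illustrative assumptions, and the real auction includes more factors than bid and quality score.

```python
# Simplified sketch of the AdWords auction (illustrative only; the real
# auction considers more than bid * quality score).

def ad_rank(max_cpc_bid, quality_score):
    # Ads are ordered by bid multiplied by quality score.
    return max_cpc_bid * quality_score

def actual_cpc(rank_to_beat, your_quality_score):
    # You pay just enough to beat the ad ranked directly below you.
    return rank_to_beat / your_quality_score + 0.01

# Hypothetical competitor below you: a $4.00 bid with a quality score of 5.
rank_to_beat = ad_rank(max_cpc_bid=4.00, quality_score=5)  # 20.0

# With a quality score of 4, holding that position costs about $5.01 per click;
# with a quality score of 8, the same position costs about $2.51.
print(actual_cpc(rank_to_beat, your_quality_score=4))
print(actual_cpc(rank_to_beat, your_quality_score=8))
```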

Seems simple enough, right? The better your quality score, the better your advertising results will be.

But is it really that simple? Sure, quality score is great for Google, but should optimizing quality score be a key part of your paid search advertising strategy? To answer that question, let’s take a look at the data.

Quality score and cost per conversion

When it comes to quality score and cost per click, the evidence is pretty clear: improving your quality score decreases your cost per click. Since your cost per conversion is essentially your cost per click divided by your conversion rate, you’d expect that improving your quality score would also improve your cost per conversion.

Sadly, that’s not how it actually works out.

Now, you might be thinking, “But Jake, I know I’ve seen research somewhere showing how a higher quality score is associated with a lower cost per conversion.” And it’s true. Odds are, you’ve probably run into an article discussing the results of this study by Wordstream or this study by Portent.

In both of these studies, cost per conversion typically dropped by around 13 to 16 percent for every point of increase in quality score.
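To make that claim concrete, here is a quick sketch, with made-up numbers, of how cost per conversion follows from cost per click and conversion rate, and what a flat 13 percent drop per quality-score point would imply if it actually held:

```python
def cost_per_conversion(cpc, conversion_rate):
    # CPA = cost per click / conversion rate.
    return cpc / conversion_rate

# Hypothetical starting point: a $2.00 CPC and a 2% conversion rate give a $100 CPA.
cpa = cost_per_conversion(cpc=2.00, conversion_rate=0.02)
print(f"Baseline CPA: ${cpa:.2f}")

# If every extra quality score point really cut CPA by ~13%, three points of
# improvement would compound to roughly a 34% reduction.
for points in range(1, 4):
    cpa *= 1 - 0.13
    print(f"After +{points} quality score point(s): ${cpa:.2f}")
```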

Chart: the relationship between cost per conversion (CPA) and quality score in AdWords.

At Disruptive Advertising (my employer), we’ve audited thousands of AdWords accounts, so we decided to use our database to replicate Wordstream’s study. Not surprisingly, we got about the same results: Every point of increase in quality score resulted in a 13 percent decrease in cost per conversion.

A graph with thousands of data points (like the one above) is a bit hard to interpret, so I’ve used a small representative subset of our data to make things easier below:

Given the consistency of this data, you’re probably wondering how I can say that improving quality score does not reliably decrease cost per conversion. I mean, look at the graphs! There’s clearly a connection between quality score and cost per conversion!

Or is there?

Unfortunately, while these graphs look compelling, it turns out that the trendline has an R² of 0.012. In non-statistical speak, that means quality score explains only about 1 percent of the variation in cost per conversion, so the 13 to 16 percent average drop per point tells you almost nothing about what will actually happen to any particular keyword.

Would you put a lot of time and effort into a marketing tactic that only behaves predictably 1 percent of the time? Neither would I.
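If you want to check this against your own account rather than taking anyone's charts at face value, here is a minimal sketch of how a trendline and its R² are computed. The CSV file and column names are hypothetical placeholders for whatever your keyword export actually contains.

```python
import pandas as pd
from scipy import stats

# Hypothetical keyword-level export: one row per keyword, with its quality
# score and its observed cost per conversion over the date range.
df = pd.read_csv("keyword_report.csv")  # columns: quality_score, cost_per_conversion

fit = stats.linregress(df["quality_score"], df["cost_per_conversion"])

# The slope is the average change in CPA per quality score point; squaring
# rvalue gives R², the share of CPA variation the trendline actually explains.
print(f"Average CPA change per QS point: {fit.slope:.2f}")
print(f"R²: {fit.rvalue ** 2:.3f}")
```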

Why quality score is a poor predictor

There are a lot of reasons quality score is an unreliable predictor of cost per conversion. However, I believe that the biggest reason is also the simplest reason: Quality score is Google’s metric, not yours.

Quality score matters to Google because it helps Google make money, not because it helps you make money. No one sees your ad on the SERP and thinks, “My, what a fine quality score they must have! Anyone with a quality score like that deserves my business.”

While Google cares about providing a relevant experience to their users, they don’t really care about whether or not you’re sending potential customers to your page or getting conversions at an affordable price. You got your click and they got their cash, so Google’s happy.

You, however, still need to drive conversions at an affordable price.

To do that, though, you can’t rely on the metrics Google cares about. Sure, your ad might make Google happy, but if that ad isn’t driving the right people to the right page, you could be wasting a ton of money — even on a keyword with a quality score of 10!

Case in point: over the course of our AdWords audits, we’ve discovered that the average AdWords account wastes 76 percent of its budget on keywords and search terms that never convert.
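If you'd like a rough estimate of your own wasted ad spend, the calculation is simple. This sketch assumes a search terms report exported with hypothetical cost and conversions columns:

```python
import pandas as pd

# Hypothetical search terms report: one row per search term, with its total
# cost and total conversions over the chosen date range.
terms = pd.read_csv("search_terms_report.csv")  # columns: search_term, cost, conversions

wasted = terms.loc[terms["conversions"] == 0, "cost"].sum()
total = terms["cost"].sum()

print(f"Wasted ad spend: {wasted / total:.0%} of total budget")
```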

Here’s how that wasted ad spend affects your cost per conversion (using the same data subset as before):

As it turns out, this data is even scarier than the quality score data. Each 10 percent increase in wasted ad spend increases your cost per conversion by 44 to 72 percent. And, while this correlation isn’t 100 percent accurate, it has an R² of 0.597, which means that wasted ad spend explains about 60 percent of the variation in your cost per conversion.

That’s a lot more compelling than 1 percent.

In fact, we’ve frequently helped clients significantly reduce their cost per conversion by reducing their wasted ad spend. For example, here’s what happened to one client as we reduced their wasted ad spend from 91 percent to 68 percent:

If you think about it, it makes sense that core account factors like wasted ad spend would have a much bigger impact on your cost per conversion than an external metric like quality score. After all, as we pointed out earlier, you can have a great quality score and still be driving people who will never buy to your site.

How to use quality score

All that being said, I still believe that quality score is a valuable metric to track and optimize. Quality score affects your cost per click and average position, which can do wonders for your account — provided that you aren’t hemorrhaging money in other areas.

So if you’re not wasting a ton of money on irrelevant clicks, and you feel confident in the quality of your traffic and landing pages, optimizing quality score can be a great way to improve your paid search account.

First, open your AdWords account, go to the Keywords tab, and ensure that you’ve added Quality score as a column:

Next, pick a meaningful date range (I’m always partial to the last 6 to 12 weeks), and export your results as a spreadsheet. Open your spreadsheet in Excel, and create a pivot table:

The following settings will allow you to see how much you are spending on each level of quality score:

Looking at the data above, about 12 percent of this client’s budget is being spent on keywords with a quality score of 1. If we assume that those ads are driving relevant traffic (maybe they’re bidding on the competition’s branded terms?), bumping the quality score of those ads up from 1 to 2 could save them thousands!

Alternatively, if you want to see exactly how much you’re spending on specific keywords with a given quality score, you can set your pivot table up like this:

In this case, I’ve included a filter for cost that allows me to see keywords with a quality score of 1 that the client has spent more than $500 on. This gives me nine high-priority keywords (representing the majority of ad spend on keywords with this quality score) to focus on, which should be a fairly workable number.
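If you'd rather build these views in pandas than in an Excel pivot table, here is a minimal sketch of both: total spend by quality score, and individual quality-score-1 keywords with more than $500 of spend. The file and column names are hypothetical placeholders for your own export.

```python
import pandas as pd

# Hypothetical export from the AdWords Keywords tab.
kw = pd.read_csv("keywords_report.csv")  # columns: keyword, quality_score, cost

# View 1: how much is being spent at each quality score level.
spend_by_qs = kw.pivot_table(index="quality_score", values="cost", aggfunc="sum")
print(spend_by_qs)

# View 2: high-priority keywords (quality score of 1 and more than $500 spent).
high_priority = (
    kw[(kw["quality_score"] == 1) & (kw["cost"] > 500)]
    .sort_values("cost", ascending=False)
)
print(high_priority[["keyword", "cost"]])
```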

No, Apple Isn’t Deliberately Slowing Your iPhone So You’ll Buy A New One, And Here’s Proof

Everyone loves a conspiracy theory – especially if said theory lets us all believe that we’re unwilling victims being forced to hand over all our hard-earned money twice a year.

But unfortunately for everyone convinced that Apple is deliberately slowing down your iPhone to make you buy the newest model, it just isn’t true. 

In spite of the seeming correlation between slowness and Apple’s assembly line gearing up for the release of the next generation, a new study has definitively shown there is absolutely no truth in the claims.

The huge debunking comes courtesy of Futuremark, the company behind 3DMark – an app that tests the performance of smartphones and is designed to emulate how a real game would operate on your device.

Running a demanding series of tests on both GPU and CPU, the benchmark produced results (hundreds of thousands of them, according to the testers) that decisively show phones don’t just drop off a cliff once they reach a certain sell-by date.

Look at the data for the iPhone 5s – the effect should be most obvious on the devices that have been around longest, right? Well, as you can see from the chart below, there is seemingly no effect.

The data, shown in a variety of charts (you can see all of them right here), shows only small fluctuations in performance over a long period of time.

In fact, these variations are so slight, according to FutureMark, that they would not be perceptible to a regular everyday user.

If that wasn’t enough, Futuremark goes further than defending Apple and actually says that we should all be grateful for the company’s extensive support for older models.

“Our benchmarking data shows that, rather than intentionally degrading the performance of older models, Apple actually does a good job of supporting its older devices with regular updates that maintain a consistent level of performance across iOS versions,” said a spokesperson.

So then – why do we perceive this to be the case?

It seems far more likely that software updates take up more space and require more processing power, so they drain your phone more quickly.

Not only that, but every time you update non-Apple apps, they are likely to take up more and more space. And if you don’t update them, they just start to produce more glitches as they are out of sync with your software.

Plus, if we’re all honest with ourselves, we know that the allure of a newer model being available is probably skewing how we perceive our current phone, and that it is just a convenient excuse to upgrade.

iPhone Scam Using Fake iCloud Login Screen Could Trick You Into Giving Up Your Password

While Apple’s iOS software has many great features, it is not perfect by any stretch.

One of those less-than-perfect examples is that sometimes you will get an awful lot of iCloud or iTunes Store login pop-ups. We all know the ones: they appear after we’ve updated our phones and seemingly won’t disappear until we’ve entered the right password.

Well, now a mobile app developer has discovered that it is shockingly easy to recreate these login boxes and trick users into handing over their email address and password.

In a blog post, Felix Krause shows how you can create a fake login box that looks pretty much identical to the official Apple login box.

Comparing the two side by side, there’s no way a person would be able to tell them apart.

In creating the fake login box, Krause called the whole process “shockingly easy” while pointing out that it perfectly capitalises on a now almost subconscious action that we all perform.

These boxes appear so often that it has just become second nature for many of us to fill them in without thinking just to get them to disappear.

So how do we protect against these?

As Krause points out, there are a number of reasons why you’re very, very unlikely to ever encounter a fake login box.

For starters, the fake box has to be built into an app, which means getting it past Apple’s very strict approval process. Secondly, you would need to have downloaded that malicious app in the first place, which can be avoided by checking an app’s permissions and the like before installing.

Most important, though, is activating two-factor authentication.

Daily Search Forum Recap: October 11, 2017

AdWords rolls out new interface to all advertisers

Google has announced that the new AdWords interface is now available to all advertisers. The new “experience” was unveiled last year, followed by a rollout over several months, from August of last year until the present.

AMP up your call conversions: 5 things you need to know

Echo and Home will probably have to tell you they’re always listening — in Europe

A number of Google Home Mini devices that were distributed to members of the press had a defect that caused them to record everything being said around them. This discovery renewed privacy concerns surrounding smart speakers as surreptitious listening devices in our homes.

The problem was first discovered by Android Police. Once notified, Google investigated and fixed the issue:

The Google Home team is aware of an issue impacting a small number of Google Home Mini devices that could cause the touch control mechanism to behave incorrectly. We immediately rolled out a software update on October 7 to mitigate the issue.

Who is affected: People who received an early release Google Home Mini device at recent Made by Google events. Pre-ordered Google Home Mini purchases aren’t affected.

As a general matter, Google Home and Amazon Alexa devices must “listen” to surrounding conversations to capture “wake words” (e.g., “Alexa,” “OK Google”) that activate them. Some privacy advocates have sounded alarms about this and expressed concern that these devices could be abused by unscrupulous law enforcement or other malevolent state actors (see Orwell’s Telescreen).

In a well-publicized criminal case in Arkansas, local prosecutors sought recordings on an Amazon Echo in a murder investigation. Amazon fought to prevent authorities from getting access to these recordings without a warrant. The defendant in the case ultimately consented to the release of any stored data, so the warrant issue was never formally ruled on by a court.

As Internet of Things devices proliferate, privacy warnings about personal data collection will intensify. It’s very likely that there will be more than 30 million smart speakers in US homes by year-end. Google and Amazon are competitively discounting and aggressively marketing them. Google’s $49 Home Mini was introduced as a low-cost answer to the Amazon Echo Dot, which Amazon just discounted to be $5 cheaper than the Mini.

These devices are also widely available in Europe, which raises the question of how they will be addressed under the forthcoming General Data Protection Regulation (GDPR) taking effect in May 2018. Millions of smart speakers will be installed in European homes by then.

In order to process “personal data,” companies must obtain opt-in consent from users:

Consent must be clear and distinguishable from other matters and provided in an intelligible and easily accessible form, using clear and plain language. It must be as easy to withdraw consent as it is to give it.​ Explicit consent is required only for processing sensitive personal data — in this context, nothing short of “opt in” will suffice. However, for non-sensitive data, “unambiguous” consent will suffice.

It’s safe to say that these devices will be “processing sensitive personal data” and that explicit consent will be required in every case.

‘High-quality content’ tips from Google’s own style guides

Google has long stressed the importance of “high-quality content” but has provided little, if any, help for those seeking to create it. Until now.

Last month, Google’s Developer Relations Group publicly published five different guides aimed at helping its own creators who are “striving for high-quality documentation.” And “documentation,” when posted online, means digital content.

To put this in context, consider that these documents represent just a few of the many guides Google uses internally. The information provided is not new, unique, original, or even complete. That said, Google’s Developer Documentation Style Guides are an excellent resource for anyone interested in creating the type of high-quality content that users value and search engines reward.

Each guide reinforces the idea that high-quality pages — the kind that rank well in search — are a combination of high-quality code, content and UX.

Here is a quick overview of Google’s Developer Documentation Style Guide tips for content creators:

  • Use a friendly, conversational tone with a clear purpose — somewhere between the voice you use when talking to your buds and that you’d use if you were a robot.
  • Try to sound like a knowledgeable friend who understands what users want to do.
  • Use standard American spelling, grammar, punctuation and capitalization.
  • Craft clear, concise, short sentences with simple words that users will understand.
  • Implement effective and descriptive link text.
  • Use accessible words and short sentences that will translate well to other languages.
  • Consider numbered lists for sequences of events.
  • Ensure outbound links are to sites that are “high-quality, reliable and respectable.”

Here is a quick overview of Google’s Developer Documentation Style Guide tips for developers/technical creators:

  • Consider SVG files or optimized .png files with ALT text.
  • Use tables and/or lists correctly. For example, only use a table when you have multiple columns of information.
  • Include <strong> or <b> as appropriate — <b> is for visual emphasis and <strong> is for items of strong importance.
  • Select HTTPS for embedded resources when possible, especially images, media files, CSS and scripts.
  • For HTML templates, use HTML5 in UTF-8 without byte order marks (BOMs).
  • Consider three-character hexadecimal notations instead of six characters for colors, as they are shorter and more succinct.
  • Use HTML for structure and CSS for visual style.

You Can’t Always Get What You Want, But If You Try Sometimes You Might Find Technology Gives You What You Need

Whether the retail industry realises it or not, it stands at the forefront of the technology revolution.

One only needs to go to an industry show these days to see that the sheer number of tech firms in attendance trying to sell into that market is staggering. These tech firms, however, are all offering to do the same thing: provide data analytics to retailers about their customers.

This is all aimed at ensuring that people are exposed to more targeted and more effective adverts based on past behaviour, preferences and what consumers may be interested in buying next. In effect, we have a multitude of companies (which I assume make a profit, as they still exist) serving up to retailers the answer to the question: “What do my customers want to buy next?”

This is in an attempt to reach what many consider the Holy Grail of retail – to know exactly what an individual wants. Leaving aside the dangers and ethical issues in collating such a huge amount of personal data on people, is this really the right direction for the tech industry to be going?

Retailers naturally want to sell. That is why they exist. On the surface, the collection of data on their customers makes perfect sense: The ability to know what the people they are targeting want should dramatically improve sales.

However, there is something wrong with the basic concept of data collection that is leading retail, and by extension much of technological development, to a dead end. If all the focus is on giving people what they want today, is there any place for innovation? In other words, is there any place for looking at what they will need tomorrow?

As Henry Ford said: “If I had asked people what they wanted, they would have said faster horses.”

From the iPhone (more than 516 million sold) and iPad (more than 68 million sold) to the Rubik’s Cube (more than 350 million sold) and the Ford Model T (more than 15 million sold), the great success stories of retail came from innovation. It is what Peter Thiel called the “zero to one” moment: the creation of something that fulfils a need we didn’t know we had.

It is the fulfilment of needs, not the chasing of wants, that should be driving technology in this sector and all others. Yet technological development of this kind is woefully lacking.

Take the example of selling ‘smart’ technology to the hotel industry. There is a multitude of companies offering smart doors, smart blinds, smart everything, all built around the idea of ‘isn’t this tech cool’ while seemingly offering basically the same thing. Except for one firm I have come across.

This company started from a customer-centric approach and considered what the hotels it targeted actually needed. The answer was to end one of the biggest costs for hotels the world over with just $5 of technology. Its solution, an IoT system with motion sensors, timers and the like, stops the flow of water from hotel taps. This may sound like something small, but a cup dropping into a sink and blocking the drain can lead to flooding that destroys 13 floors and costs millions to repair – as happened at one hotel – and flooding is the biggest damage cost to hotels the world over. It is a solution hotels need. Not shiny, not ‘exciting’, but needed.

This all poses the question: how do we get out of this blind alley of finding out what people want, and get back to servicing their needs?

The answer is to bring the person back in. People are not a collection of data points, no matter how much some may wish they were. They are individuals who possess, as Abraham Maslow showed, a hierarchy of needs. Rather than finding new and better ways to collect data on their customers so they can be pushed more things they might want, retailers should find new and better ways to make their customers happy by fulfilling their actual needs.

One example comes immediately to mind: I was recently shopping for a new buggy for my daughter. As any parent will know, there is a monumental amount of choice in this area, with different qualities and price points. However, I only ever really considered one company, John Lewis. While it was at the more costly end for the type of buggy I wanted, I was happy to spend the extra. From the start of the purchase and beyond the transaction itself, John Lewis cares about keeping its customers happy. To them, I am an individual who is treated as such, not just a set of data points to sell to.
