Last month, Google announced some changes in how Quality Score is being reported.
While it’s been described as just a reporting change (as opposed to a change in how Quality Score is calculated), some of my respected colleagues have pointed to the news as evidence that Quality Score somehow doesn’t matter anymore!
I respectfully disagree with that point of view. In this article, I’d like to share some additional data and insights about this often-confusing metric: how Quality Score has historically been calculated and reported, and my take on why so many SEMs get confused and frustrated by it.
Quality Score Is Essentially Click-Through Rate
At a high level, you can get a decent idea of what Quality Score is by plotting the impression-weighted average Quality Score of search network campaigns (or keywords, or ads) that meet a minimum significance threshold against their associated click-through rate.
Figure A: The Relationship Between Quality Score and Click Through Rate
The figure above shows a manual analysis I did based on accounts that WordStream manages. As you can see, there is a clear relationship between Quality Score (QS) and click-through rate (CTR) — the higher your CTR, the higher your Quality Score.
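If you want to reproduce this kind of chart against your own account data, here is a minimal sketch of the aggregation: bucket keywords by Quality Score and compute the impression-weighted average CTR per bucket. The field names and the minimum-impressions cutoff are my own illustrative choices, not WordStream’s actual methodology.

```python
# Sketch: impression-weighted average CTR per Quality Score bucket.
# Field names ('qs', 'impressions', 'clicks') and the min_impressions
# threshold are illustrative assumptions.
from collections import defaultdict

def weighted_ctr_by_qs(keywords, min_impressions=100):
    """keywords: iterable of dicts with 'qs', 'impressions', 'clicks'."""
    clicks = defaultdict(int)
    imps = defaultdict(int)
    for kw in keywords:
        if kw["impressions"] < min_impressions:
            continue  # drop low-volume keywords as statistically insignificant
        clicks[kw["qs"]] += kw["clicks"]
        imps[kw["qs"]] += kw["impressions"]
    # weighted average CTR = total clicks / total impressions per QS bucket
    return {qs: clicks[qs] / imps[qs] for qs in sorted(imps)}

sample = [
    {"qs": 7, "impressions": 1000, "clicks": 60},
    {"qs": 7, "impressions": 3000, "clicks": 150},
    {"qs": 3, "impressions": 2000, "clicks": 20},
    {"qs": 5, "impressions": 50, "clicks": 5},  # filtered out: too few impressions
]
print(weighted_ctr_by_qs(sample))  # → {3: 0.01, 7: 0.0525}
```

On real data, you would pull these rows from an AdWords keyword performance report and plot the resulting buckets to get a chart like Figure A.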
Don’t Blame Quality Score — Your CTR Sucks
Right off the bat, one thing many of the Quality-Score-haters don’t realize is just how high an average CTR Google expects of your keywords and ads in the first place.
For example, the other day, a guy complained to me about having sucky Quality Scores, despite having great keywords, ads and landing pages!
I did a quick AdWords Audit of his account and learned that he had an account average CTR of 1.2% for an average position of around 2.3 — which, as you can see in my chart above, is simply way too low.
I thought he actually deserved the low Quality Scores of 2-3 he was getting. I told him to quit being so lazy and, instead of complaining, actually get stuff done in the account, and to talk to me after he’d at least doubled his average click-through rate! (Sure enough, increases in CTR led to increases in QS!)
But I think there’s more going on here than just people being uninformed or lazy. While there’s a strong correlation between CTR and Quality Score on average, there are also plenty of cases where high-CTR keywords get low Quality Scores and vice versa. It’s almost as though AdWords is just messing with us sometimes, and we have a natural tendency to dismiss what we can’t explain. But is there a reason behind the Quality Score numbers we’re seeing?
Cleaning Up The Noise
The high-level Quality Score data above included all sorts of search campaigns all mashed into one, including campaigns with:
- Different average ad positions
- Different match types
- High and low impression volumes
- High spend and more moderate spend
- High advertiser competition and low competition
- And more
What if we could cut through this noise and isolate some of the variables to try to get a better sense of what’s actually happening here?
Quality Score Is Normalized By Ad Position
Google has already told us that Quality Score is normalized by ad position, the idea being that ads in more prominent ad positions are naturally predisposed to score higher average click-through rates. Thus, it would seem reasonable to have higher expected CTRs for ads in higher ad positions.
Here’s what the relationship between Quality Score and CTR looks like if I only look at campaigns with average ad positions of between 2.1 and 2.2:
Figure B: Quality Score vs. CTR for Keywords with Average Ad Position between 2.1 and 2.2
And here’s what the relationship looks like if I only look at campaigns with average ad positions of between 4.1 and 4.2:
Figure C: Quality Score vs. CTR for Keywords with Average Ad Position between 4.1 and 4.2
Analyzing the difference between the various figures, I can deduce the following:
- The slope of the line is much steeper in Figure C than in Figure B. This means that in a lower ad position, smaller increases in average CTR yield larger increases in average Quality Score.
- The Y-intercept is larger in Figure C than in Figure B. Essentially, out of the gate, Google has a lower expected average click-through rate for keywords/ads in lower average positions.
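To make the slope and intercept comparison concrete, here is a small sketch that fits an ordinary least-squares line of Quality Score against CTR for two position buckets. The (CTR, QS) data points are made up purely to illustrate the pattern described above, not taken from the figures.

```python
# Sketch: fit QS ≈ slope * CTR + intercept per average-position bucket
# via ordinary least squares. Data points are hypothetical.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx  # (slope, y-intercept)

# Hypothetical (CTR, QS) pairs for two average-position buckets
pos_2 = [(0.02, 3), (0.05, 5), (0.08, 7), (0.11, 9)]   # positions ~2.1-2.2
pos_4 = [(0.01, 4), (0.03, 6), (0.05, 8), (0.07, 10)]  # positions ~4.1-4.2

s2, b2 = fit_line(*zip(*pos_2))
s4, b4 = fit_line(*zip(*pos_4))
# With these numbers, both the slope and the intercept come out larger
# for the lower-position bucket, matching the two observations above.
print(f"pos ~2: slope={s2:.1f}, intercept={b2:.1f}")
print(f"pos ~4: slope={s4:.1f}, intercept={b4:.1f}")
```

Running the same fit on your own per-position keyword data is a quick way to check whether your account shows the same normalization pattern.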
While this isn’t exactly a shocking new insight, the point is that Quality Score isn’t just a random number generator that you should ignore if you disagree with the number. With a bit of analysis, it’s definitely possible to clean up the “noise” and figure out what’s going on by isolating different variables.
How Quality Score is actually calculated isn’t all that difficult to figure out, but you can read more about what goes into it in “Quality Score Explained By A Former Googler.” 🙂
What Does It All Mean?
1. Hopefully, I’ve convinced you that Quality Score is largely CTR, normalized by ad position, which, to Google’s credit, is really what they’ve been saying all along. So if you’re seeing QS numbers that don’t seem fair to you, it’s very likely that your CTR just isn’t that great compared to what Google expects. The key to raising that number is usually to raise your average CTRs, not to ignore the metric completely.
2. I’ve previously written at length about how Quality Score plays a huge role in determining your average cost per click and also your average cost per conversion. Since Quality Score is just a proxy for click-through rate, you can absolutely interchange the two concepts and the advice remains unchanged — because optimizing for Quality Score is essentially the same strategy as optimizing for CTR.
3. This certainly doesn’t mean that you shouldn’t pursue keywords that happen to have low Quality Scores/CTRs and favorable CPA metrics.
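To make point 2 concrete: in Google’s own public explanations of the classic AdWords auction, your actual CPC is just enough to beat the Ad Rank (bid times Quality Score) of the advertiser below you, plus one cent. Here is a simplified toy model of that formula, with made-up numbers:

```python
# Simplified model of the classic AdWords auction pricing formula:
# actual CPC = (Ad Rank of the advertiser below you / your QS) + $0.01,
# where Ad Rank = max CPC bid * QS. Numbers below are illustrative.
def actual_cpc(your_qs, competitor_ad_rank):
    return competitor_ad_rank / your_qs + 0.01

competitor = 2.00 * 6  # advertiser below: $2.00 bid, QS 6 -> Ad Rank 12
print(actual_cpc(4, competitor))  # QS 4 -> $3.01 per click
print(actual_cpc(8, competitor))  # QS 8 -> $1.51 per click
```

Doubling Quality Score against the same competition roughly halves what you pay per click, which is why raising CTR (and with it QS) flows straight through to your average CPC and cost per conversion.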
All of the data I used in my article today was based on the previous Quality Score reporting system — my article next month will focus on changes to the new Quality Score reporting system (if any).
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land. Staff authors are listed here.