
A Different Way of Thinking About Core Updates

These days, Google algorithm updates seem to come in two main flavors. There are the very specific updates — like the Page Experience Update or the Mobile-Friendly Update — which tend to be announced well in advance, come with very specific information on how the ranking factor will work, and finally arrive as a slight anti-climax. I’ve spoken before about the dynamic with these updates. They are obviously intended to manipulate the industry, and I think there is also a degree to which they are a bluff.

This post is not about those updates, though; it is about the other flavor, which is the opposite in every respect: these updates are announced only when they are already happening (or have already happened), they come with incredibly vague and repetitive guidance, and they can have cataclysmic impact on affected sites.

Coreschach tests

Since March 2018, Google has taken to calling these sudden, vague cataclysms “Core Updates”, and the type really gained notoriety with the advent of “Medic” (an industry nickname, not an official Google label) in August 2018. The advice from Google and the industry alike has evolved gradually over time in response to changing Quality Rater Guidelines, varying from the exceptionally banal (“make good content”) to the specific but clutching at straws (“have a great about-us page”). To be clear, none of this is bad advice, but compared to the likes of the Page Experience Update, or even the likes of Panda and Penguin, it demonstrates an extremely woolly industry picture of what these updates actually promote or penalize. To a degree, I suspect Core Updates and the accompanying era of “E-A-T” (Expertise, Authoritativeness, and Trustworthiness) have become a bit of a Rorschach test. How does Google measure these things, after all? Links? Knowledge graphs? Subjective page quality? All of the above? Whatever you want to see?

If I am being somewhat facetious there, it is born out of frustration. As I say, (almost) none of the speculation, or the advice it results in, is actually bad. Yes, you should have good content written by genuinely expert authors. Yes, SEOs should care about links. Yes, you should aim to leave searchers satisfied. But if these trite vagaries are what it takes to win in Core Updates, why do sites that do all these things better than anyone lose as often as they win? Why does almost no site win every time? Why does one update often seem to undo another?

Roller coaster rides

This is not just how I feel about it as a disgruntled SEO — this is what the data shows. Looking at sites in the MozCast corpus affected by Core Updates since (and including) Medic, the vast majority show mixed results.

Meanwhile, some of the most authoritative original content publishing sites in the world actually have a pretty rocky ride through Core Updates.

I should caveat: this is in the MozCast corpus only, not the general performance of Reuters. But still, these are real rankings, and each bar represents a Core Update where they have gone up or down. (Mostly down.) They are not the only ones enjoying a bumpy ride, either.

The reality is that pictures like this are very common, and it’s not just spammy medical products like you might expect. So why is it that almost all sites, whether they be authoritative or not, sometimes win, and sometimes lose?

The return of the refresh

SEOs don’t talk about data refreshes anymore. This term was last part of the regular SEO vocabulary in perhaps 2012.

This was the idea that major ranking fluctuation was sometimes caused by algorithm updates, but sometimes simply by data being refreshed within the existing algorithm — particularly if this data was too costly or complex to update in real time. I would guess most SEOs today assume that all ranking data is updated in real time.

But, have a look at this quote from Google’s own guidance on Core Updates:

“Content that was impacted by one might not recover—assuming improvements have been made—until the next broad core update is released.”

Sounds a bit like a data refresh, doesn’t it? And this has some interesting implications for the ranking fluctuations we see around a Core Update.

If your search competitors make a bunch of improvements to their sites, then when a Core Update comes round, under this model, you will suddenly drop. This is no indictment of your own site; it’s just that SEO is often a zero-sum game, and suddenly a bunch of improvements to other sites are being recognized at once. And if they go up, someone must come down.
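To make the refresh model concrete, here is a toy sketch. This is my own illustration with made-up site names and scores, not Google's actual algorithm: quality improves continuously, but rankings are only recomputed when the data is refreshed, so accumulated competitor improvements all land at once.

```python
# Toy sketch of a "data refresh" ranking model (hypothetical, not
# Google's real system). Quality scores change continuously, but
# visible ranks only update at refresh events.

def rank_of(site, scores):
    """1-based rank of `site` when sites are ordered by score, best first."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return ordered.index(site) + 1

# Underlying quality as of the last refresh: you rank #1.
scores = {"you": 80, "rival_a": 70, "rival_b": 60}
print(rank_of("you", scores))  # -> 1

# Between refreshes, everyone improves (rivals more than you),
# but visible rankings haven't moved yet.
true_scores = {"you": 82, "rival_a": 85, "rival_b": 84}

# The Core Update "refreshes" the data: all accumulated competitor
# gains are recognized at once, and you drop despite improving.
scores = true_scores
print(rank_of("you", scores))  # -> 3
```

The point of the sketch is that the drop at the refresh tells you nothing negative about your own site; it only tells you when competitor improvements were recognized.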

This kind of explanation sits easily with the observed reality of tremendously authoritative sites suffering random fluctuation.

Test & learn

The other missing piece of this puzzle is that Google acknowledges its updates as tests.

This sounds, at face value, like it is incompatible with the refresh model implied by the quote in the previous section. But, not necessarily — the tests and updates referred to could in fact be happening between Core Updates. Then the update itself simply refreshes the data and takes in these algorithmic changes at the same time. Or, both kinds of update could happen at once. Either way, it adds to a picture where you shouldn’t expect your rankings to improve during a Core Update just because your website is authoritative, or more authoritative than it was before. It’s not you, it’s them.

What does this mean for you?

The biggest implication of thinking about Core Updates as refreshes is that you should, essentially, not care about immediate before/after analysis. There is a strong chance that you will revert to the mean between updates. Indeed, many sites that lose in updates nonetheless grow overall.

The below chart is the one from earlier in this post, showing the impact of each Core Update on the visibility of www.reuters.com (again — only among MozCast corpus keywords, not representative of their total traffic), except this chart also has a line showing how total visibility nonetheless grew despite these negative shocks. In other words, they more than recovered from each shock between shocks.

Under a refresh model, this is somewhat to be expected. Whatever short-term learning the algorithm does rewards this site, but the refreshes push it back toward an underlying algorithm, which is less generous. (Some would say that short-term learning could be driven by user behavior data, but that’s another argument!)

The other notable implication is that you cannot necessarily judge the impact of an SEO change or tweak in the short term. Indeed, causal analysis in this world is incredibly difficult. If your traffic goes up before a Core Update, will you keep that gain after the update? If it goes up, or even just holds steady, through the update, which change caused that? Presumably you made many, and equally relevantly, so did your competitors.

Experience

Does this understanding of Core Updates resonate with your experience? It is, after all, only a theory. Hit us up on Twitter, we’d love to hear your thoughts!
