Google's January 2020 Core Update

source: Moz.com, "Google's January 2020 Core Update: Has the Dust Settled?"


image: Google January 2020 core update almost done rolling out

On January 13th, MozCast measured significant algorithm flux lasting about three days (the dotted line shows the 30-day average prior to the 13th, which is consistent with historical averages) …

image: Google January 2020 Core Update (MozCast temperature graph)
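
If you want to reproduce that kind of baseline yourself, here's a minimal sketch of a 30-day trailing average, assuming a hypothetical CSV of daily MozCast-style temperatures (the file name and column names are illustrative stand-ins, not Moz's actual data):

```python
import pandas as pd

# Hypothetical input: one "temperature" reading per day, in MozCast's style.
# "mozcast_daily.csv" and its columns are invented for this example.
daily = pd.read_csv("mozcast_daily.csv", parse_dates=["date"]).set_index("date")

# The dotted line: a 30-day trailing average ending the day before the update.
baseline = daily.loc[:"2020-01-12", "temperature"].tail(30).mean()

# Days in the update window that run hot against that baseline.
window = daily.loc["2020-01-13":"2020-01-16", "temperature"]
print(f"30-day baseline: {baseline:.1f}°F")
print(window[window > baseline])
```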

That same day, Google announced the release of a core update dubbed the January 2020 Core Update (in line with their recent naming conventions).

On January 16th, Google announced the update was “mostly done,” aligning fairly well with the measured temperatures in the graph above. Temperatures settled down after the three-day spike.

It appears that the dust has mostly settled on the January 2020 Core Update. Interpreting core updates can be challenging, but are there any takeaways we can gather from the data?

Which verticals were hit hardest?

MozCast is split into 20 verticals, matching Google AdWords categories. Single-day movement across categories can be tough to interpret, since they naturally vary, so here's the data for the full range of the update (January 14–16), restricted to the seven categories that topped 100°F on January 14.
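
As a rough illustration of that 100°F cutoff, here's a minimal sketch in pandas, assuming a hypothetical long-format file of per-category temperatures (the file and column names are invented; MozCast's vertical feed isn't public):

```python
import pandas as pd

# Hypothetical long-format data: one temperature per (category, date) pair.
# "mozcast_verticals.csv" is an invented stand-in for MozCast's vertical feed.
temps = pd.read_csv("mozcast_verticals.csv", parse_dates=["date"])

# Temperatures on the first full day of the update.
jan14 = temps[temps["date"] == "2020-01-14"]

# Categories that topped 100°F, hottest first.
hot = (jan14[jan14["temperature"] >= 100]
       .sort_values("temperature", ascending=False)
       .loc[:, ["category", "temperature"]])
print(hot.to_string(index=False))
```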

Health tops the list, consistent with anecdotal evidence from previous core updates. Broadly speaking, sites impacted by one core update seem more likely to be impacted by subsequent ones.
Who won and who lost this time?

Winners/losers analyses can be dangerous for a few reasons. First, they depend on your particular data set. Second, humans have a knack for seeing patterns that aren't there. It's easy to take a couple of data points and over-generalize. Third, there are many ways to measure changes over time.

We can’t entirely fix the first problem — that’s the nature of data analysis. For the second problem, we have to trust you, the reader. We can partially address the third problem by making sure we’re looking at changes both in absolute and relative terms. For example, knowing a site gained 100% SERP share isn’t very interesting if it went from one ranking in our data set to two. So, for both of the following charts, we’ll restrict our analysis to subdomains that had at least 25 rankings across MozCast’s 10,000 SERPs on January 14th. We’ll also display the raw ranking counts for some added perspective.

Here are the top 25 winners by % change over the 3 days of the update. The “Jan 14” and “Jan 16” columns represent the total count of rankings (i.e. SERP share) on those days.
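
The underlying computation is straightforward. Here's a minimal sketch, assuming a hypothetical serp_counts.csv with one row per subdomain and its ranking counts on the two endpoint days (the file and columns are invented stand-ins for MozCast's 10,000-SERP data set):

```python
import pandas as pd

# Hypothetical ranking counts per subdomain on the two endpoint days.
# Columns mirror the table: subdomain, jan14, jan16.
counts = pd.read_csv("serp_counts.csv")

# Restrict to subdomains with at least 25 rankings on January 14th, so a
# jump from one ranking to two can't masquerade as "+100%".
stable = counts[counts["jan14"] >= 25].copy()

# Relative change in SERP share over the three days of the update.
stable["pct_change"] = (stable["jan16"] - stable["jan14"]) / stable["jan14"] * 100

# Top 25 winners by % change, with the raw counts kept for perspective.
winners = stable.sort_values("pct_change", ascending=False).head(25)
print(winners[["subdomain", "jan14", "jan16", "pct_change"]].to_string(index=False))
```

A mirror-image losers list would simply sort ascending, and keeping the raw Jan 14 and Jan 16 counts in the output means a big percentage can't hide a tiny base.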