Why is my Core Web Vitals report not updating in Google Search Console?

May 9, 2023 / Core Web Vitals / By Kathy Alice


Does this sound familiar? 

You implemented one or more fixes that should improve your Core Web Vitals metrics. Day after day, you monitor the Core Web Vitals report in Google Search Console, but nothing changes. You were so sure that you had significantly improved your site’s LCP (or CLS), but now, with the reports continuing to show no improvement, you start questioning yourself.

Why are the Core Web Vitals reports not updating in Google Search Console?

The fact is, while the Core Web Vitals (CWV) reports can be helpful in identifying fixes, they are not ideal for getting quick feedback on the impact of your fixes. Because the CWV report is an aggregation of the previous 28 days of data, it can be slow to update. 

If you are serious about improving your CWV metrics, you’ll need a good understanding of how the GSC CWV report works, as well as some alternative ways to monitor your progress.

New to Core Web Vitals?

Core Web Vitals are metrics that Google uses to measure your site’s performance along three criteria: speed (LCP), interactivity (FID), and stability (CLS). Core Web Vitals are part of the Google Page Experience ranking factor, which rolled out for Mobile in 2021 and for Desktop in early 2022.

While many SEOs don’t consider Core Web Vitals to be a strong ranking factor, passing Core Web Vitals can give you an edge against your competitors, especially in highly competitive markets. And, looking beyond SEO, speed in particular is critically important for conversion, especially on eCommerce sites. 

There are lots of explainer articles on the web about Core Web Vitals, so I’m not going to spend much time on that here. Try this article to get an overview. I would also suggest spending some time on the web.dev site, which is full of useful information.

Most people start their workflow with a sample URL from the CWV report, and then run that URL through both the PageSpeed Insights tool and the Performance tab in Chrome DevTools. These are both useful tools for narrowing the issue down to the element(s) causing a problem (JS, CSS, images, HTML, etc.).

With the Performance tab in Chrome DevTools, you can find out which HTML element on your page is the LCP (Largest Contentful Paint). If you fail CLS, you can see what’s moving on the page by reviewing the timeline. You can also identify long tasks that block interactivity and cause poor FID or TBT scores.
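If you’d rather confirm this programmatically, here is a minimal sketch using the browser’s standard PerformanceObserver API (nothing site-specific is assumed) that you can paste into the DevTools console:

```js
// Log each LCP candidate the browser records, including the DOM element.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('LCP candidate at', Math.round(entry.startTime), 'ms:', entry.element);
  }
}).observe({type: 'largest-contentful-paint', buffered: true});

// Log long tasks (over 50 ms) that block the main thread and hurt FID/TBT.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log('Long task:', Math.round(entry.duration), 'ms');
  }
}).observe({type: 'longtask', buffered: true});
```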

Background Images

As an aside, one thing I’ve run into a lot recently is slow LCPs due to images being loaded as a background image rather than via an img src tag. Unfortunately, this problem is easy to overlook, as it is not highlighted by either tool currently. I was pleased to see this issue listed as one of the top recommendations from the Core Web Vitals team for 2023. If you are still diagnosing CWV problems with your site, this article could help you prioritize what to focus on. As for background images, ideally they should be moved to an img src tag, but you can also get the browser to load the image earlier with rel="preload".
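As a quick sketch of those two options (the file path and dimensions below are placeholders for your own image):

```html
<!-- Option 1: keep the CSS background image, but hint the browser
     to fetch it early with a preload link in the <head>. -->
<link rel="preload" as="image" href="/images/hero.jpg">

<!-- Option 2 (preferred): move the image to an img tag, so the
     browser's preload scanner discovers it in the initial HTML. -->
<img src="/images/hero.jpg" alt="Hero banner" width="1200" height="600">
```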

Did you fix the pages that are in scope?

One tricky bit with the Core Web Vitals (and the Page Experience) GSC reports is that they only report on pages that get significant traffic.

The PageSpeed Insights tool is helpful for diagnosing Core Web Vitals problems, but it has a sneaky aspect to it. When you run your URLs through the tool, you will want to pay close attention to the “This URL/Origin” toggle.

Origin-level PageSpeed Insights results. Awesome, my page passed! … Oh, wait a minute

Recently a little info popup has been added, but I still think it is easy to overlook. If “This URL” is greyed out, there isn’t enough real-user data for this URL, and aggregate data for your site (“Origin”) is shown instead. 

Focus your fixes on high traffic pages

You’ll want to focus on fixing the pages that get enough traffic for PageSpeed Insights to show you results for that page (and not the Origin). If you only fix low-traffic pages that don’t have enough CrUX data, then the GSC reports will not improve until you fix the pages that DO get significant traffic.

What is CrUX?

CrUX is short for Chrome User Experience Report. This is a newer component in the Google Universe to be aware of. A common misunderstanding that I run into is the assumption that Core Web Vitals data is gathered by Googlebot. But CrUX, which provides the data that the Core Web Vitals ranking factor is based on, is not related at all to Googlebot crawling.

CrUX data is actually gathered from real users as they browse your website.

Some site owners assume that since, for most sites, Googlebot primarily crawls as a mobile device (Googlebot-Smartphone), the Mobile Core Web Vitals report is more important. This might be true if your users are primarily visiting your site on a mobile device. But if most of your traffic is Desktop, then for Core Web Vitals, I would focus there first.

CrUX data is only gathered from Chrome browsers that have opted in, so already you can see that this limits the data, especially for low-traffic pages.
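If you want to check for yourself whether a URL has enough field data, you can query the CrUX API directly. Here’s a minimal sketch, assuming you’ve created an API key in the Google Cloud console (the key and page URL below are placeholders):

```js
const CRUX_API_KEY = 'YOUR_API_KEY'; // placeholder: create one in the Google Cloud console

async function queryCrux(url) {
  const resp = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${CRUX_API_KEY}`,
    {
      method: 'POST',
      headers: {'Content-Type': 'application/json'},
      body: JSON.stringify({url, formFactor: 'PHONE'}),
    }
  );
  // A 404 means CrUX has no data for this URL -- the same "not enough
  // real-user data" situation that makes PageSpeed Insights fall back to Origin.
  if (!resp.ok) {
    console.log(`No CrUX data for ${url} (HTTP ${resp.status})`);
    return null;
  }
  const {record} = await resp.json();
  console.log('p75 LCP (ms):', record.metrics.largest_contentful_paint.percentiles.p75);
  return record;
}

queryCrux('https://example.com/some-page');
```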

The fact that CWV is based on visitor CrUX data has some interesting ramifications. For example, your page’s LCP may vary widely based on where your users are actually located relative to your servers (I know, it’s that pesky real-world physics). Your page is likely a lot slower for a visitor from Nigeria if you are hosted in the US. That’s not something you would need to consider with Googlebot, which always crawls out of Mountain View, California.

If you have global traffic from many countries then consider a CDN, which can serve content to your users from different points in the world.

When it comes to passing Core Web Vitals, fixing your popular pages will have an outsized impact on your metrics.

The 28 days

You might have noticed that GSC CWV reporting can sometimes lag. For example, you may have had the experience of finding a URL that GSC reports as having a problem (e.g. Mobile Usability), or as not indexed for a suspect reason, but when you test the URL with the URL Inspection tool, it’s perfectly fine and indexable. Similarly, there is often a lag in the Core Web Vitals and Page Experience reporting.

Note that Google has announced that the Page Experience report will be going away at the end of 2023, but the Core Web Vitals reports will stick around.

Here’s what you need to know:

The Core Web Vitals reports are based on a rolling 28 days of CrUX data. So the most recent point on the chart is an aggregate of the previous 28 days. It’s like a moving average on a stock chart. 

So if you fix something, say, on January 18th, and you look at the chart a few days later on January 22nd, only 4 days have passed. Those 4 days represent only 14% of the 28-day window that the chart shows for January 22nd. You’ll likely have to wait longer so that the 28 days include more “post-fix” days. How long you have to wait will depend on how impactful the change was.
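For back-of-the-envelope planning, that arithmetic is simple enough to sketch in a few lines:

```js
// What share of the rolling 28-day CrUX window reflects your fix?
function postFixShare(fixDate, reportDate, windowDays = 28) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysSinceFix = Math.floor((reportDate - fixDate) / msPerDay);
  return Math.min(daysSinceFix, windowDays) / windowDays;
}

const share = postFixShare(new Date('2023-01-18'), new Date('2023-01-22'));
console.log(`${(share * 100).toFixed(0)}% of the window is post-fix`); // ~14%
```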

The Wild Swings 

Maybe you have seen a chart like this:

Example of the swings seen in the Core Web Vitals report in Google Search Console

Sometimes you’ll see swings that go back and forth, seemingly at random. What is going on here?

If you are familiar with Core Web Vitals, you know there are thresholds that need to be achieved to get a good score. To pass Core Web Vitals, a page needs to achieve a “good” result for all three metrics (LCP, CLS, and FID).

For example, a score of 0.1 or less is a good CLS (cumulative layout shift) score. 

The three Core Web Vitals and their thresholds

This is where our analogy to a stock chart’s moving average breaks down. With a stock moving average, you’ll always see a smooth line. Not so for our GSC Core Web Vitals and Page Experience reports.

To understand why, let’s take LCP for example. Having a good LCP experience means that a visitor sees the largest HTML element (which can be an image or text) above the fold of your web page appear in 2.5 seconds or less. You’ll see a green line on the CWV report when at least 75% of your site’s visitors in the past 28 days had a good LCP experience.

What happens when that 75% slips to 74% or lower? It means that not enough users are having a good LCP experience to clear the threshold, and your site’s pages (or a group of pages) have slipped into the Needs Improvement category. This is when the swings happen (in this particular case, the orange line will pop up and the green line will swing down).
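To make the threshold mechanics concrete, here’s a sketch of the pass/fail check (the sample values are made up for illustration):

```js
// Does a set of real-user LCP samples (in ms) clear the 75% "good" threshold?
const GOOD_LCP_MS = 2500;

function passesLcp(samples) {
  const goodShare = samples.filter((ms) => ms <= GOOD_LCP_MS).length / samples.length;
  return {goodShare, passes: goodShare >= 0.75};
}

console.log(passesLcp([1800, 2100, 2400, 2600, 3200, 1900, 2300, 2450]));
// -> { goodShare: 0.75, passes: true }
// One more slow visit and goodShare dips below 0.75: the line swings orange.
```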

So now that we understand the CWV report better and why it may not immediately reflect the improvements that we made, what do we do? 

R.U.M. solutions

One avenue to consider is having your own real user monitoring (R.U.M.) data. Instead of relying on a delayed report that likely represents just some of your pages, you can choose the pages to monitor with your R.U.M. solution and get feedback daily on what users are actually experiencing on your site.

Some R.U.M. vendors include New Relic, Dynatrace, SpeedCurve and Pingdom. Cloudflare also provides some RUM data. None of these are free (that I’m aware of) and it is usually enterprise sites that have the budget for them.

Fortunately, there is a free R.U.M. library that you can install on your site. You’ll need to add an additional, but lightweight, JavaScript library to your site’s pages. This code will report as events into your Google Analytics. It’s not perfect, as it includes all data points, including the outliers that 75th-percentile reporting effectively ignores, but you should be able to see quickly whether your fixes worked or not.
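If the library you pick is Google’s open-source web-vitals package, and your site uses gtag.js, the wiring looks something like this sketch (it mirrors the pattern documented on web.dev):

```js
import {onCLS, onFID, onLCP} from 'web-vitals';

// Forward each Core Web Vitals measurement to Google Analytics as an event.
function sendToGoogleAnalytics({name, delta, value, id}) {
  gtag('event', name, {
    value: delta,        // deltas can be summed across events for a page load
    metric_id: id,       // groups events belonging to the same page load
    metric_value: value, // the current metric value
    metric_delta: delta,
  });
}

onCLS(sendToGoogleAnalytics);
onFID(sendToGoogleAnalytics);
onLCP(sendToGoogleAnalytics);
```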

LCP values by country, as seen in a Google Analytics world-map dashboard built from the LCP event data

One thing this library has helped me with is identifying situations where LCP is failing because many overseas visitors get a slower experience, even though local users see the LCP appear quickly. This scenario is something you won’t see looking at lab data gathered in the same country where your site is hosted.

Keep in mind that these events can add a lot of volume to your Google Analytics property, which might push you beyond the free limits if you don’t have Google Analytics 360.

Two simple monitoring solutions

If you made it to this point of the article, you might be in a situation where you don’t have the ability to install a JS library and don’t have the budget for a R.U.M. solution. Not to worry, I’ve got your back.

One solution would be a DebugBear subscription, which has a free trial and starts at $25 monthly. DebugBear will pull both field and lab data for your pages, and over time you can easily see the trends indicating a growing problem or improving metrics. I don’t make any money if you buy a subscription, but it is a great solution that I’m happy to recommend, as I’ve used DebugBear for a CWV project and it was quite helpful.

DebugBear: each of the above rows corresponds to a URL; note how LCP is getting worse for the URL represented by row #2

Otherwise, you can use my “poor man’s” screenshot solution to track improvements. Here’s how it works.

The PageSpeed Insights tool provides an expanded view of the CrUX (field) data.

Click on “Expand View” to see the percentages in each category: Good, Needs Improvement, and Poor for each Vital. Click “Collapse View” to close.

Expanded view of PageSpeed Insights results for a URL that fails Core Web Vitals on Desktop

Now that you know about the expanded view, what you want to do is choose a representative page (or a few pages) and take screenshots of the expanded view for both Mobile and Desktop for each page every few days. If your fixes are working you should see the percentage of “Good” experiences increase, showing that you are on the right track and you just need to wait for the thresholds to be reached.

Last year I worked on a project where we optimized the images for a site. I didn’t want to wait for the CWV report to show us whether we had improved LCP enough, so I used the screenshot method on two of their highest-traffic pages. I initially took the screenshots as a baseline, and when I ran PageSpeed Insights again three days later, I could see that Good LCP experiences had increased by 4 percentage points. Eight days later, enough visitors had had a good LCP experience that we shifted into the green.

Expanded view of PageSpeed Insights results for a URL that now passes Core Web Vitals on Desktop

In this particular case we added a CDN to serve the images (and we also removed an unnecessary JavaScript file). The CDN delivered properly sized images using efficient formats and it resulted in a better user experience and helped us pass Core Web Vitals for key pages.

At some point I would like to automate this solution, but that is on my ever growing list of projects that maybe one day I’ll get to. 
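If you do want to script it, the PageSpeed Insights API exposes the same Good / Needs Improvement / Poor percentages you see in the expanded view. Here’s a rough sketch (the page URL is a placeholder; for anything beyond occasional checks you’d want to attach your own API key):

```js
// Pull the CrUX field-data distributions for a URL from the PSI API (v5).
async function fieldData(url, strategy = 'mobile') {
  const endpoint = new URL('https://www.googleapis.com/pagespeedonline/v5/runPagespeed');
  endpoint.searchParams.set('url', url);
  endpoint.searchParams.set('strategy', strategy);

  const resp = await fetch(endpoint);
  const data = await resp.json();

  // loadingExperience holds the field (CrUX) data for the URL, including
  // the Good / Needs Improvement / Poor proportions for each metric.
  const lcp = data.loadingExperience?.metrics?.LARGEST_CONTENTFUL_PAINT_MS;
  if (!lcp) {
    console.log('No field data for this URL');
    return;
  }
  const [good, ni, poor] = lcp.distributions.map((d) => (d.proportion * 100).toFixed(1));
  console.log(`LCP: ${good}% good, ${ni}% needs improvement, ${poor}% poor`);
}

fieldData('https://example.com/');
```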

Other resources:

https://blog.chromium.org/2020/05/the-science-behind-web-vitals.html

https://web.dev/optimize-lcp/

About the Author Kathy Alice


Kathy Alice Brown is an SEO expert specializing in Technical SEO and Content. In her spare time she loves to get outside.
