When Google repackaged and released a new set of site metrics capturing the core pieces of a webpage's performance, it was the tech giant's attempt to simplify and clarify what it was looking for from websites, in order to encourage improvements to the user experience.
Yet questions remain – of course! – over their role in total page performance, the differences between desktop and mobile, and more.
So, to shed light on some of the unknowns surrounding Core Web Vitals, we have come up with some questions and answers. Let's take a look!
Question: why are Core Web Vitals scores different for desktop than for mobile?
Answer: at the moment, the metrics used to determine page experience only apply to mobile. Having said that, if you are analyzing Core Web Vitals using a RUM tool, it's likely that your desktop scores will differ from your mobile scores.
And that makes sense: desktop and mobile users have two completely different experiences of the same page, shaped by factors such as network connectivity, viewport size, and ways of interacting.
Question: do Core Web Vitals influence search engine ranking?
Answer: Core Web Vitals are integrated into the wider page experience landscape, which includes mobile-friendliness, HTTPS security, safe browsing, and the intrusive interstitial guidelines. Given Google's emphasis on Core Web Vitals – and given that, taken together, they make up half of these page experience signals – if you're looking to enhance your page's experience, it would be best to consider Core Web Vitals first.
Question: why are my Core Web Vitals scores poor despite my page being fast?
Answer: the same page will be experienced differently depending on a wide range of factors, such as device type, network connection, and geographic location. A user at home on a laptop over a secure network will experience a page differently from a user outdoors in a rural area on a phone over a public network.
A company can analyze and modify its Core Web Vitals as much as it likes, yet some situations are simply out of its hands.
Ultimately, however you define your 'fast' benchmark, remember that Core Web Vitals look at more than speed. For example, Cumulative Layout Shift captures user annoyances such as content elements that move around the page while it loads. Additionally, you may be using synthetic testing tools that try to emulate a user, but that emulation may differ from what your real users experience.
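As a rough illustration of how Cumulative Layout Shift is scored, here is a hedged sketch – not Google's implementation. Layout shifts are grouped into "session windows" (shifts less than a second apart, with each window capped at five seconds), and the page's CLS is the largest window total; the entry shape used below is an assumption loosely modeled on the browser's `layout-shift` performance entries.

```javascript
// Hedged sketch: compute a CLS-style score from layout-shift entries.
// Assumed entry shape: { startTime: ms, value: shift score }.
// CLS is the largest "session window" total: shifts less than 1s apart,
// with each window capped at 5s of total duration.
function cumulativeLayoutShift(entries) {
  let maxScore = 0;
  let windowScore = 0;
  let windowStart = 0;
  let prevTime = -Infinity;
  for (const { startTime, value } of entries) {
    const gapExceeded = startTime - prevTime > 1000;
    const windowExceeded = startTime - windowStart > 5000;
    if (gapExceeded || windowExceeded) {
      windowScore = 0; // start a new session window
      windowStart = startTime;
    }
    windowScore += value;
    maxScore = Math.max(maxScore, windowScore);
    prevTime = startTime;
  }
  return maxScore;
}
```

Note how two small shifts close together count as one larger window, while a shift long after page load starts a window of its own – which is why CLS can stay poor even on a page that loads quickly.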
Question: if Core Web Vitals are poor, how will this impact site traffic?
Answer: there is no conclusive connection between poor Core Web Vitals scores and low traffic. After all, many major brands have poor Core Web Vitals yet receive heavy volumes of traffic. It's important to keep in mind, though, that slow site speeds contribute strongly to users abandoning a page and heading elsewhere.
Question: what info can I see about Core Web Vitals in the Search Console?
Answer: Search Console provides a Core Web Vitals report, which is powered by data from the Chrome UX Report. This helps highlight current issues with a page's user experience.
Question: does a session that doesn't report First Input Delay count as a bounced session?
Answer: put simply, no. First Input Delay does not include scrolling, so a session in which the user only scrolls reports no FID at all. Bounce rate and abandonment rate are defined within your analytics suite, and so are not measured by Core Web Vitals.
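To make that concrete, here is a hedged sketch of how a FID value is (or isn't) produced for a session – the event shape is an assumption, loosely modeled on the browser's `first-input` performance entries. FID is the gap between a discrete input occurring and its handler starting to run, and a scroll-only session yields no value at all.

```javascript
// Hedged sketch: derive a FID value from a session's input events.
// Assumed event shape: { type, startTime, processingStart } in ms.
function firstInputDelay(events) {
  // Scrolls and wheel gestures are not discrete inputs,
  // so they never produce a FID value.
  const discrete = events.filter(
    (e) => e.type !== "scroll" && e.type !== "wheel"
  );
  if (discrete.length === 0) {
    return null; // no FID reported -- not the same thing as a bounce
  }
  // FID is measured on the first discrete input only.
  const first = discrete.reduce((a, b) => (a.startTime <= b.startTime ? a : b));
  return first.processingStart - first.startTime;
}
```

A session that returns `null` here simply never clicked, tapped, or typed – the user may still have read (and scrolled through) the entire page.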
Question: which pages are affected by Google's page experience assessment?
Answer: page experience is only one chunk of the metric environment used to rank a page. Remember that the intent of the search query remains an incredibly strong signal in itself, tied more to the text-based parts of a page. So a page with poor Core Web Vitals may still rank very well, depending on its relevance to the search query.
Question: what is the role of Core Web Vitals for sites with a user base that uses slower networks or older devices?
Answer: Core Web Vitals, ultimately, measure the quality of a user's page experience. The user constituency of each site differs, and some sites may have significant populations of users on older devices or slower networks.
In these cases, websites should adapt their content so that these users still receive a good user experience, and so that pages still meet the recommended Core Web Vitals benchmarks.
Question: there are no errors in Lighthouse, so why do I see errors in the Search Console report?
Answer: Lighthouse displays lab data, while Search Console shows how pages are performing based on real-world user data (i.e. field data). Lab data is ideal for debugging performance issues, yet it cannot surface real-world user issues.
So, it is recommended to use both reports to improve the overall experience of a user on a webpage, given that they each show you different aspects of a page's experience.