
100 points for Google Core Web Vitals: Here’s how we did it


In May 2020, Google introduced its Core Web Vitals, a set of metrics designed to measure page ‘health’ in terms of a seamless user experience. Made up of three components (content loading speed, interactivity, and visual stability), Core Web Vitals are important for SEO and play a key role in a publisher’s visibility and ranking in search results. In this feature, publisher AmoMama explains exactly how they achieved a perfect score…

Headquartered in Europe, AmoMama is a celebrity and lifestyle publisher attracting 40M readers in 4 languages (English, German, French, and Spanish). We are also highly active on social media, with 27M subscribers on Facebook alone.

This year, after a long and sustained effort, we achieved 98 points for our mobile version and 100 points for our desktop version. Page loading speeds now range from just 0.1 to 0.6 seconds. How did we do it?

The first thing we did was audit the current state of our site: not a superficial pass, but a deep analysis of our entire website architecture. We then drew up a list of solutions to the issues the audit surfaced and started to implement them.

While implementing our solutions, we had to continually track their impact on performance, scrupulously monitoring our metrics to justify each further feature. This audit-and-monitoring loop is an integral part of the path to high Core Web Vitals scores, and it also prepares us for the release of new Core Web Vitals metrics or changes in Google’s algorithm.

Moving to the Green Zone

All of your ideas, solutions, and fixes must be validated by real users and by Google’s measurement algorithms. In our case it was tougher still, because we needed to change the website’s architecture and start afresh from a blank sheet of paper.

Our team then spent four months developing the new site, and another two months waiting for the real-user experience data in Google Search Console to update.

It took six months to transfer all four language versions to the new site architecture without putting our work at risk. Fortunately, we still had some time before the metrics began to affect our site’s ranking.

Since SEO is critical for media, and there is still no reliable SEO solution for a pure SPA (Single Page Application), we chose a stack with server-side rendering: React + Next.js. Within the new architecture, we then began to optimize many elements for our specific needs.
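To make the SEO point concrete, here is a minimal sketch of what server-side rendering looks like in Next.js. The page path, API URL, and Article type are hypothetical placeholders, not our actual code, but the pattern is the standard one: getServerSideProps runs on the server, so crawlers receive fully rendered HTML rather than an empty SPA shell.

```tsx
// pages/posts/[slug].tsx -- a hypothetical article page; the API URL and
// the Article shape are placeholders for illustration.
import type { GetServerSideProps } from 'next';

interface Article {
  title: string;
  body: string;
}

// Runs on the server for every request, so search engine crawlers get
// complete HTML instead of a blank shell that needs JavaScript to fill in.
export const getServerSideProps: GetServerSideProps<{ article: Article }> =
  async (ctx) => {
    const slug = ctx.params?.slug as string;
    const res = await fetch(`https://api.example.com/articles/${slug}`);
    const article: Article = await res.json();
    return { props: { article } };
  };

export default function PostPage({ article }: { article: Article }) {
  return (
    <main>
      <h1>{article.title}</h1>
      <article dangerouslySetInnerHTML={{ __html: article.body }} />
    </main>
  );
}
```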

Key advice

Work with your embedded content

Since our website is a content platform, our articles contain a lot of text and pictures as well as embedded media from social networks: Twitter, Facebook, Giphy, Imgur, Instagram, TikTok, and YouTube. Initially, we used a separate JavaScript library from the package manager to display each type of embed. In that first version, a post page weighed several megabytes, so of course we needed to fix this. Our articles are long, and most readers will never see some of the embeds because they will not scroll down to them. So why load them all at once?

We then switched to a single JavaScript library that lazy loads our media content. This approach reduced our page size drastically. Incidentally, as an alternative to separate libraries, you can replace them with a single third-party service, usually a paid one; for some embeds, for example, we use iFramely.
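The article does not name the library we used, so here is a minimal sketch of the underlying technique using the browser’s built-in IntersectionObserver: the embed’s iframe is only created once its container is about to scroll into view.

```tsx
// LazyEmbed.tsx -- a sketch of lazy-loaded embeds via IntersectionObserver;
// illustrative only, not the specific library mentioned above.
import { useEffect, useRef, useState } from 'react';

export function LazyEmbed({ src, height }: { src: string; height: number }) {
  const ref = useRef<HTMLDivElement>(null);
  const [visible, setVisible] = useState(false);

  useEffect(() => {
    const observer = new IntersectionObserver(
      ([entry]) => {
        if (entry.isIntersecting) {
          setVisible(true);      // create the iframe
          observer.disconnect(); // one-shot: stop observing after first hit
        }
      },
      { rootMargin: '400px' }    // start loading a bit before it scrolls in
    );
    if (ref.current) observer.observe(ref.current);
    return () => observer.disconnect();
  }, []);

  // The container reserves space up front; the iframe appears only on demand.
  return (
    <div ref={ref} style={{ height }}>
      {visible && <iframe src={src} width="100%" height={height} />}
    </div>
  );
}
```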

next/dynamic (a function in the Next.js framework) is a very powerful tool for reducing the overall weight of the application. With its help, we split off components that weigh a lot but are not always used, for example a block of related or recommended posts, or advertising with a lot of code. As a result, the entire advertising code is loaded as a separate JS file only if it is used on the page.
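A minimal sketch of that pattern, with hypothetical component names:

```tsx
// next/dynamic splits each heavy component into its own JS chunk, fetched
// only when the component actually renders. Component paths are hypothetical.
import dynamic from 'next/dynamic';

const RelatedPosts = dynamic(() => import('../components/RelatedPosts'));

// ssr: false also keeps the ad code out of the server-rendered HTML.
const AdSlot = dynamic(() => import('../components/AdSlot'), { ssr: false });

export default function PostFooter({ showAds }: { showAds: boolean }) {
  return (
    <footer>
      <RelatedPosts />
      {showAds && <AdSlot />} {/* the ad chunk is downloaded only if used */}
    </footer>
  );
}
```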

The lazy loading approach has one drawback: it can badly hurt your CLS (Cumulative Layout Shift) score. Google watches how the position of elements on the page changes and punishes you if, for example, text shifts sideways, up, or down during loading or scrolling. Since embeds are loaded on the client, not on the server, this has to be solved. We decided to cut off part of each embed by fixing the width and height of the container around the media content.
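In code, the fix is simply a wrapper with fixed dimensions, something like the sketch below (the 550x620 values are illustrative, not the numbers we settled on):

```tsx
// A fixed-size wrapper that reserves layout space before the embed loads,
// so nothing shifts when the client-side content arrives.
import type { ReactNode } from 'react';

export function EmbedContainer({ children }: { children: ReactNode }) {
  return (
    <div
      style={{
        width: '100%',
        maxWidth: 550,    // illustrative embed width
        height: 620,      // fixed height: taller embeds are cut off
        overflow: 'auto', // the cut-off part remains scrollable
        margin: '0 auto',
      }}
    >
      {children}
    </div>
  );
}
```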

This helped us reduce the number of URLs with a poor CLS score tracked in Google Search Console. There are disadvantages, however: it is hard to find the ideal height for embeds. In some cases the image is cropped too much, in others there is a lot of empty space after it, and the ability to scroll within the embed can interfere with scrolling on mobile screens. We plan to A/B test a new solution and fix this in the near future.

Show your site to users quickly

The most important thing is to render your site for all users as quickly as possible. To address this, we did two important things. 

First: when a user lands on our site, we load only the basic markup and the critical styles for the first screen, so that they can start consuming our content immediately, and we load the other important elements of the page in the background.
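As a sketch (using the function-component form of _document supported in recent Next.js versions), the critical styles can be inlined in a custom _document so they ship inside the HTML itself; the rules shown are placeholders:

```tsx
// pages/_document.tsx -- inlining critical CSS for the first screen;
// the style rules are placeholders for whatever is above the fold.
import { Html, Head, Main, NextScript } from 'next/document';

const criticalCss = `
  body{margin:0;font-family:Helvetica,Arial,sans-serif}
  /* ...only the rules needed to paint the first screen... */
`;

export default function Document() {
  return (
    <Html>
      <Head>
        {/* No extra network round-trip before the first paint */}
        <style dangerouslySetInnerHTML={{ __html: criticalCss }} />
      </Head>
      <body>
        <Main />
        <NextScript />
      </body>
    </Html>
  );
}
```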

Second: we analyzed the weight of our web application with Webpack Bundle Analyzer (Webpack is the static module bundler behind modern JavaScript applications), which shows all of your project’s dependencies and highlights what you can improve.

We found several dependencies that were no longer used, and some cases where only a small part of a library was actually needed. This tool is still really helpful for us: we constantly track the size of the site and try to make it load faster.
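For Next.js projects, the analyzer is usually wired in through the official @next/bundle-analyzer wrapper; a minimal config looks roughly like this:

```js
// next.config.js -- enable the analyzer only when ANALYZE=true is set,
// e.g. `ANALYZE=true npm run build`.
const withBundleAnalyzer = require('@next/bundle-analyzer')({
  enabled: process.env.ANALYZE === 'true',
});

module.exports = withBundleAnalyzer({
  // ...the rest of your Next.js configuration
});
```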

Other necessary steps

A must-have for publishers is content caching. We use advanced caching in the browser (via Service Workers) and proxy caching (via CloudFront).
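A minimal sketch of the browser side: a cache-first Service Worker for static assets. Production setups usually add versioning and invalidation, for example via Workbox, and the asset paths here are placeholders.

```js
// sw.js -- cache-first Service Worker for static assets (sketch).
const CACHE = 'static-v1';

self.addEventListener('install', (event) => {
  event.waitUntil(
    caches.open(CACHE).then((cache) =>
      cache.addAll(['/styles/full.css', '/js/main.js']) // placeholder paths
    )
  );
});

self.addEventListener('fetch', (event) => {
  // Serve from the cache when possible, fall back to the network.
  event.respondWith(
    caches.match(event.request).then((hit) => hit || fetch(event.request))
  );
});
```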

It’s also important to minify your code and to compress responses; the most effective compression algorithm in our experience is Brotli.
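In production the compression is normally handled by the CDN or web server rather than in application code, but Node’s built-in zlib makes it easy to see the difference for yourself:

```js
// Compare gzip and Brotli on a chunk of repetitive HTML (sizes will vary).
const zlib = require('zlib');

const html = '<html>' + '<p>article text</p>'.repeat(1000) + '</html>';

console.log('raw:   ', Buffer.byteLength(html));
console.log('gzip:  ', zlib.gzipSync(html).length);
console.log('brotli:', zlib.brotliCompressSync(html).length); // usually smallest
```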

One problem area is fonts: while loading, they cause a jump that affects the core metric, CLS. We have now abandoned custom fonts altogether and use only system fonts; our site now uses Helvetica.
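A sketch of what a system-font setup can look like in Next.js with styled-jsx (which ships with the framework): Helvetica first, as above, with common platform fallbacks, so no font file is ever downloaded and text never jumps after load. The fallback list is illustrative.

```tsx
// GlobalFonts.tsx -- a global style block rendered once, e.g. in _app.
export function GlobalFonts() {
  return (
    <style jsx global>{`
      body {
        font-family: Helvetica, -apple-system, 'Segoe UI', Roboto, Arial,
          sans-serif;
      }
    `}</style>
  );
}
```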

Third-party scripts also have a strong negative impact on performance. For publishers, the big ones are analytics and advertising, and they are also the most business-critical.

Although each of these providers advises putting its script at the very beginning of the document, we take another route: we wait until the page has fully loaded and only then connect all third-party resources. Of course, we lose some ad impressions, but we realized that the performance cost of loading ads early outweighs the potential earnings.
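A minimal sketch of that pattern; the script URLs are placeholders, not real provider tags:

```ts
// Inject third-party scripts only after the window 'load' event, so ads and
// analytics never compete with our own content for bandwidth and CPU.
function loadThirdParty(src: string): void {
  const s = document.createElement('script');
  s.src = src;
  s.async = true;
  document.body.appendChild(s);
}

window.addEventListener('load', () => {
  loadThirdParty('https://ads.example.com/tag.js');     // placeholder
  loadThirdParty('https://analytics.example.com/a.js'); // placeholder
});
```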

Where we are now

At the beginning of our journey, our sites were mostly in the red zone for all the major performance metrics. After developing the new site, the situation improved significantly, but we were still in the yellow zone. Only by implementing all the steps described above, as well as many others, did we manage to push the sites into the green zone.

It should be added that the situation is quite dynamic: new features are added and Google changes its algorithms, so the process of improving our performance is never over.

Our success at a glance:

We built the monitoring we needed, which allows us to track lab data, imitate field data, and build meaningful charts using Grafana and Sitespeed.io; this now helps us validate all our hypotheses. These tools can also imitate different types of Internet connection, which helps us make our sites convenient and fast even for users with outdated devices and poor connections.
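As an illustration, a sitespeed.io run that pushes metrics to a Graphite/Grafana backend and throttles the connection can look roughly like this (the host and flags are examples; check the sitespeed.io docs for your version):

```bash
docker run --rm -v "$(pwd):/sitespeed.io" sitespeedio/sitespeed.io \
  https://www.example.com/some-article \
  --graphite.host graphite.example.com \
  --browsertime.connectivity.profile 3g   # imitate a slow mobile connection
```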

Another success factor is monitoring the impact of advertising providers. If we see a strong negative impact, we contact the provider and try to solve the problem together. In some cases we succeed; in others, we have to change ad partners.

The most important success factor, however, is a cohesive, results-oriented team, and above all a dynamic one that never stops offering ideas and implementing new solutions for our audience.

Bohdan Kladkovyi
Delivery Manager, AMO