Case Study: How the Cookie Monster Ate 22% of Our Visibility

The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.


Last year, the team at Homeday, one of the leading property tech companies in Germany, made the decision to migrate to a new content management system (CMS). The goals of the migration were, among other things, increased site speed and a state-of-the-art, future-proof website with all the important features. One of the main motivators for the migration was to allow content editors to work more freely in creating pages, without the help of developers.

After evaluating several CMS options, we decided on Contentful for its modern technology stack and its superior experience for both editors and developers. From a technical point of view, Contentful, as a headless CMS, lets us choose which rendering strategy we want to use.

We're now carrying out the migration in several stages, or waves, to minimize the risk of issues that have a large-scale negative impact. During the first wave, we encountered an issue with our cookie consent, which led to a visibility loss of almost 22% in five days. In this article, I will describe the problems we faced during this first migration wave and how we resolved them.

Setting up the first test wave

For the first test wave, we chose 10 SEO pages with high traffic but low conversion rates. We set up an infrastructure for reporting and monitoring those 10 pages:

  • Rank tracking for the most relevant keywords

  • SEO dashboard (Data Studio, Moz Pro, SEMrush, Search Console, Google Analytics)

  • Regular crawls

After a thorough planning and testing phase, we migrated the first 10 SEO pages to the new CMS in December 2021. Although a few issues occurred during the testing phase (increased loading times, a bigger HTML Document Object Model, etc.), we decided to go live, as we didn't see any significant blockers and we wanted to migrate the first test wave before Christmas.

First performance review

Very excited about reaching the first stage of the migration, we took a look at the performance of the migrated pages the next day.

What we saw next definitely didn't please us.

Overnight, the visibility of tracked keywords for the migrated pages decreased from 62.35% to 53.59%. We lost 8.76 percentage points of visibility in one day.

As a result of this steep drop in rankings, we conducted another in-depth round of testing. Among other things, we checked for coverage/indexing problems, whether all meta tags were included, structured data, internal links, page speed, and mobile friendliness.

Second performance review

All the articles had a cache date after the migration, and the content was fully indexed and being read by Google. We could also exclude many migration risk factors (changes to URLs, content, meta tags, layout, etc.) as sources of error, as there hadn't been any changes.

The visibility of our tracked keywords dropped again to 40.60% over the next few days, making it a total drop of almost 22% in five days. This was also clearly shown in comparison to the competition for the tracked keywords (here measured as "estimated traffic"), where the trend looked analogous.

Data from SEMrush, specific keyword set for the tracked keywords of the migrated pages

As other migration risk factors besides Google updates had been excluded as sources of error, it had to be a technical problem. Too much JavaScript, low Core Web Vitals scores, or a larger, more complex Document Object Model (DOM) could all be possible causes. The DOM represents a page as objects and nodes so that programming languages like JavaScript can interact with the page and change, for example, its style, structure, and content.
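As an aside, the difference in DOM size between old and new pages can be made concrete with a quick measurement. The sketch below is only a rough, illustrative approximation (a real audit would run `document.querySelectorAll("*").length` in the browser console or use a proper HTML parser); the example page is invented:

```javascript
// Rough estimate of DOM size: count the opening tags in the rendered HTML.
// This ignores text nodes and is only a ballpark figure.
function estimateDomNodeCount(html) {
  // Matches opening tags like <div class="x">, but not closing tags.
  const matches = html.match(/<[a-zA-Z][^>]*>/g);
  return matches ? matches.length : 0;
}

const smallPage =
  "<html><head><title>t</title></head><body><p>Hi</p></body></html>";
console.log(estimateDomNodeCount(smallPage)); // → 5
```

Comparing this figure for a migrated and a non-migrated page gives a first hint of how much extra markup a framework or plugin injects.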

Following the cookie crumbs

We had to identify the issues as quickly as possible, fix them fast, and limit further negative effects and traffic drops. We finally got the first real hint of the technical cause when one of our tools showed us that the number of pages with high external linking, as well as the number of pages exceeding the maximum content size, went up. It is important that pages do not exceed the maximum content size, as pages with a very large amount of body content may not be fully indexed. Regarding the high external linking, it is important that all external links are trustworthy and relevant for users. It was suspicious that the number of external links went up like this.

Increase of URLs with high external linking (more than 10)
Increase of URLs which exceed the specified maximum content size (51,200 bytes)

Both metrics were disproportionately high compared to the number of pages we had migrated. But why?
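To make the two checks concrete, here is a minimal sketch of what such a crawler flags. The 51,200-byte content limit and the threshold of 10 external links come from the crawl settings mentioned above; the `auditPage` helper and the sample page object are invented for illustration:

```javascript
// Thresholds from the crawl settings.
const MAX_CONTENT_SIZE = 51200; // bytes of body content
const MAX_EXTERNAL_LINKS = 10;

// Flags a page the same way the crawler did: too much body content,
// or too many external outgoing links.
function auditPage(page) {
  return {
    url: page.url,
    tooLarge: Buffer.byteLength(page.bodyContent, "utf8") > MAX_CONTENT_SIZE,
    tooManyExternalLinks: page.externalLinks.length > MAX_EXTERNAL_LINKS,
  };
}

// A migrated page with an indexed cookie consent form behaves like this:
const flagged = auditPage({
  url: "https://www.example.com/migrated-page",
  bodyContent: "x".repeat(60000), // ~60 KB of body content
  externalLinks: new Array(25).fill("https://vendor.example.org"),
});
console.log(flagged); // → tooLarge: true, tooManyExternalLinks: true
```

A page whose DOM suddenly carries a full cookie consent form trips both checks at once, which is exactly the pattern we saw.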

After checking which external links had been added to the migrated pages, we saw that Google was reading and indexing the cookie consent form on all migrated pages. We performed a site search for the content of the cookie consent and saw our theory confirmed:

A site search confirmed that the cookie consent was indexed by Google

This led to several problems:

  1. There was a lot of duplicated content created for every page, due to the indexing of the cookie consent form.

  2. The content size of the migrated pages increased drastically. This is a problem because pages with a very large amount of body content may not be fully indexed.

  3. The number of external outgoing links increased significantly.

  4. Our snippets suddenly showed a date on the SERPs. This would suggest a blog or news article, while most pages on Homeday are evergreen content. In addition, because of the date appearing, the meta description was cut off.

But why was this happening? According to our service provider, Cookiebot, search engine crawlers access websites simulating full consent. They therefore gain access to all content, and copy from the cookie consent banner is not indexed by the crawler.

So why wasn't this the case for the migrated pages? We crawled and rendered the pages with different user agents but still could not find a trace of the Cookiebot in the source code.

Investigating Google DOMs and searching for a solution

The migrated pages are rendered with dynamic content that comes from Contentful and from plugins. The plugins contain just JavaScript code, and sometimes they come from a partner. One of these plugins was the cookie manager partner's, which fetches the cookie consent HTML from outside our code base. That is why we didn't find a trace of the cookie consent HTML in the HTML source files in the first place. We did see a larger DOM, but traced that back to Nuxt's default, more complex, bigger DOM. Nuxt is the JavaScript framework we work with.

To verify that Google was reading the copy from the cookie consent banner, we used the URL Inspection tool in Google Search Console and compared the DOM of a migrated page with the DOM of a non-migrated page. Inside the DOM of a migrated page, we finally found the cookie consent content:

Within the DOM of a migrated page we found the cookie consent content
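The manual comparison in the URL Inspection tool boils down to a simple check: does a known banner phrase appear in the rendered DOM of one page but not the other? A minimal sketch of that check (the banner text and page snippets are invented stand-ins, not our real markup):

```javascript
// Strips tags and checks whether a text fragment appears in the rendered DOM.
function containsFragment(dom, fragment) {
  return dom.replace(/<[^>]+>/g, " ").includes(fragment);
}

const bannerText = "We use cookies to personalize content";
const migratedDom =
  '<body><p>Find your new home.</p><div class="consent">We use cookies to personalize content and analyze traffic.</div></body>';
const oldDom = "<body><p>Find your new home.</p></body>";

console.log(containsFragment(migratedDom, bannerText)); // → true
console.log(containsFragment(oldDom, bannerText)); // → false
```

Running this against the rendered HTML from the URL Inspection tool makes the difference between the two page types immediately visible.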

Something else that caught our attention was the JavaScript files loaded on our old pages versus the files loaded on our migrated pages. Our website has two scripts for the cookie consent banner, provided by a third party: one to show the banner and capture the consent (uc), and one that imports the banner content (cd).

  • The only script loaded on our old pages was uc.js, which is responsible for the cookie consent banner. It is the one script we need on every page to manage user consent. It shows the cookie consent banner without the content being indexed and saves the user's decision (whether they agree or disagree to the usage of cookies).

  • For the migrated pages, apart from uc.js, there was also a cd.js file loading. If we have a page where we want to show the user more information about our cookies and have the cookie information indexed, then we have to use cd.js. We thought that the two files depended on each other, which is not true: uc.js can run alone. The cd.js file was the reason the content of the cookie banner got rendered and indexed.

It took a while to find, because we thought the second file was just a prerequisite for the first one. We found that simply removing the loaded cd.js file was the solution.
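Conceptually the fix is a one-line change: stop injecting cd.js on regular pages. A hedged sketch of the idea (the URLs and the helper are invented; uc.js and cd.js are the real file names from our consent provider):

```javascript
// uc.js: shows the banner and stores the consent decision (needed on every page).
// cd.js: injects the indexable cookie details (only wanted on a cookie policy page).
const consentScripts = [
  "https://consent.example.com/uc.js",
  "https://consent.example.com/cd.js",
];

// Decide which consent scripts a given page should load.
function scriptsForPage(isCookiePolicyPage) {
  return consentScripts.filter(
    (src) => isCookiePolicyPage || !src.endsWith("/cd.js")
  );
}

console.log(scriptsForPage(false)); // → only uc.js
console.log(scriptsForPage(true)); // → both scripts
```

With cd.js gone from regular pages, the banner copy no longer ends up in the rendered DOM, so there is nothing for Google to index.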

Performance review after implementing the solution

The day we deleted the file, our keyword visibility was at 41.70%, still 21 percentage points lower than pre-migration.

But the day after deleting the file, our visibility increased to 50.77%, and the day after that it was almost back to normal at 60.11%. The estimated traffic behaved similarly. What a relief!

Quickly after implementing the solution, the organic traffic went back to pre-migration levels


I can imagine that many SEOs have dealt with small problems like this one. It seems trivial, but it led to a significant drop in visibility and traffic during the migration. This is why I recommend migrating in waves and blocking enough time for investigating technical errors before and after the migration. Keeping a close eye on the site's performance in the weeks after the migration is also essential. These are definitely my key takeaways from this migration wave. We just completed the second migration wave at the beginning of May 2022, and so far no major bugs have appeared. We'll run two more waves and hopefully complete the migration by the end of June 2022.

The performance of the migrated pages is almost back to normal now, and we will continue with the next wave.
