If you’re managing an SEO strategy for a website that uses JavaScript, or have recently migrated to a JavaScript-based platform, you may be wondering whether JavaScript introduces SEO challenges of its own. It can: after migrating one of our clients’ sites to a JavaScript-based platform, we noticed a decline in their SEO performance. While Google can crawl JavaScript, the process can be significantly slower than crawling static HTML pages (taking up to 9 times longer), and this delay can introduce serious indexing issues that hurt SEO.

It’s not just search engines that struggle with JavaScript-rendered content — AI platforms and large language models (LLMs) face similar challenges. These models, such as ChatGPT, Perplexity, or Claude, are trained on static HTML data and do not execute JavaScript like browsers do. As a result, dynamically generated content often isn’t visible to them unless it’s been rendered server-side or pre-rendered as static HTML.

Even if your website is already JavaScript-based, it’s important to identify and address these rendering and crawling challenges to ensure your site’s SEO and discoverability remain strong — both for search engines and the evolving landscape of AI-powered tools. In this article, we’ll walk you through the steps we took to uncover JavaScript-related SEO issues and how we addressed them for better search visibility.

1. Understand what you lost

Before diving into technical fixes, it’s crucial to understand exactly what you’ve lost. While you may have experienced a drop in organic traffic, it’s important to ask why. Did your rankings decrease, or are your positions the same but with a decreased click-through rate (CTR)?

This distinction can provide insight into what changed in Google’s perception of your site. In our case, we noticed that our rankings remained stable, but we lost sitelinks and featured snippets, which led to a decrease in clicks. This loss pointed us to the importance of how these elements were loaded and rendered both before and after the migration. Our investigation focused on whether Google was rendering and indexing these elements differently, which could explain the performance drop.
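If you have before-and-after rows from a Google Search Console performance export, this distinction can be checked mechanically. The sketch below is illustrative only: the dictionary shape, function name, and thresholds are assumptions for the example, not anything GSC defines.

```python
# Sketch: classify whether a traffic drop is rank-driven or CTR-driven,
# using before/after rows from a Search Console performance export.
# The thresholds below (one position lost, CTR down 20%) are assumptions.

def classify_drop(before: dict, after: dict) -> str:
    """before/after: {'clicks': int, 'impressions': int, 'position': float}"""
    ctr_before = before["clicks"] / before["impressions"]
    ctr_after = after["clicks"] / after["impressions"]
    position_worsened = after["position"] > before["position"] + 1.0
    ctr_fell = ctr_after < ctr_before * 0.8
    if position_worsened:
        return "rank-driven"
    if ctr_fell:
        # Same positions, fewer clicks: look at SERP features
        # (sitelinks, featured snippets) rather than rankings.
        return "ctr-driven"
    return "no clear signal"

# Stable position but lost sitelinks/snippets: clicks fall while rank holds.
print(classify_drop(
    {"clicks": 1200, "impressions": 40000, "position": 3.1},
    {"clicks": 700, "impressions": 41000, "position": 3.2},
))  # ctr-driven
```

In our case the output would have pointed at the CTR side, which is exactly where the investigation below went.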

2. Turn off JavaScript and review your website

To better understand your website’s dependency on JavaScript, turn off JS and see what your site looks like without it. You can do this easily in your browser’s developer tools; Google’s developer documentation explains how. Focus on key elements that are critical for ranking, such as:

  • Header tags
  • Paragraph tags
  • Internal links to pages
  • Menus and navigation
  • Product links

If important elements are missing, this is a good indication that your site may be too reliant on JavaScript. You’ll need to have discussions with your development team to explore ways to include these elements directly in the source HTML, whether through actual HTML or pre-rendering techniques.
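Before involving developers, you can also quantify the problem. The sketch below counts ranking-critical tags in the raw HTML, i.e. the response you get from curl or urllib, where no JavaScript ever runs. The tag list is an assumption; adjust it to what matters for your templates.

```python
# Sketch: count ranking-critical elements in the raw (pre-JavaScript) HTML.
# Fetch the page without a browser so no JS executes, then feed the string in.
from html.parser import HTMLParser

KEY_TAGS = ("h1", "h2", "p", "a", "nav")  # assumption: adjust per site

class KeyTagCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.counts = {tag: 0 for tag in KEY_TAGS}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

def audit_static_html(html: str) -> dict:
    parser = KeyTagCounter()
    parser.feed(html)
    return parser.counts

# A page whose body is just an empty app container is fully JS-dependent:
print(audit_static_html('<html><body><div id="root"></div></body></html>'))
# {'h1': 0, 'h2': 0, 'p': 0, 'a': 0, 'nav': 0}
```

All zeros for headings and links in the static response is a strong signal that crawlers and LLMs see little or nothing of your content.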

3. Compare source HTML, rendered page, and indexed page using ChatGPT

To gain a deeper understanding of what Google can actually render and index, inspect the URL in Google Search Console and look at the crawled page. Comparing the code of the page as Google sees it with what’s rendered in the browser will help you identify discrepancies between what Google is able to crawl and what users see.

For a more detailed comparison, export the source code, the rendered page code, and the indexed page from Google Search Console, and use AI tools (I used ChatGPT-4o) to analyze and summarize the differences between the three versions. This can save time and provide a quick overview of the critical changes. Focus on the relevant differences that could impact SEO. However, always follow up with a more thorough manual check to ensure no critical issues are overlooked. Treat the ChatGPT output as a helpful starting point for further investigation, not a final verdict.
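Before (or after) handing the exports to an AI tool, a plain diff gives you a ground-truth view of what actually changed between versions. A minimal sketch using Python’s difflib, with short stand-in snippets in place of the real exports:

```python
# Sketch: diff the source HTML against the rendered (or indexed) HTML,
# so you can sanity-check any AI-generated summary of the differences.
import difflib

def html_diff(source_html: str, rendered_html: str) -> str:
    """Unified diff of two HTML snapshots, one markup line per diff line."""
    return "\n".join(difflib.unified_diff(
        source_html.splitlines(),
        rendered_html.splitlines(),
        fromfile="source.html",
        tofile="rendered.html",
        lineterm="",
    ))

# Stand-in snippets: the reviews block is empty until JavaScript fills it.
diff = html_diff(
    '<h1>Example</h1>\n<div id="reviews"></div>',
    '<h1>Example</h1>\n<div id="reviews">4.8 stars</div>',
)
print(diff)  # lines prefixed with + exist only in the rendered version
```

Content that appears only as `+` lines is JavaScript-dependent; content that appears only in the source but not in the indexed export is what Google dropped.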

4. Issues we encountered

While comparing the initial HTML and the rendered HTML, we encountered some issues.

One critical issue we found was that the <h1> tag was not being indexed correctly. Google read the <h1> as a single concatenated string, for example “ExampleExampleExampleExample” instead of “Example”. This happened because the heading’s text was split across multiple <span> and <p> elements inside the <h1> tags (a <p> is not actually valid inside a heading, which only permits phrasing content), which likely caused search engines to misinterpret the structure. Google appeared to ignore the inner elements and index all of their text as one word.

<h1 class="MuiTypography-root MuiTypography-h1 brand mui-style-6io6kn">
<span class="MuiBox-root mui-style-1a6rg1m"><p>Example</p></span>
<span class="MuiBox-root mui-style-1bzxb7w"><p>Example</p></span>
<span class="MuiBox-root mui-style-14mu5f1"><p>Example</p></span>
<span class="MuiBox-root mui-style-k83gc8"><p>Example</p></span>
</h1>

This is the improved HTML structure we implemented:

<h1 class="MuiTypography-root MuiTypography-h1 brand">Example</h1>
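The concatenation behaviour is easy to reproduce: extracting the text nodes of the nested <h1> with a simple parser runs them together with no separator, matching what Google indexed. A minimal sketch using Python’s built-in html.parser (class attributes omitted for brevity):

```python
# Sketch: text nodes inside the nested <h1> run together with no
# separator, reproducing the concatenated string Google indexed.
from html.parser import HTMLParser

class H1Text(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_h1 = False
        self.text = ""

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.in_h1 = True

    def handle_endtag(self, tag):
        if tag == "h1":
            self.in_h1 = False

    def handle_data(self, data):
        if self.in_h1:
            self.text += data.strip()

nested = ("<h1><span><p>Example</p></span><span><p>Example</p></span>"
          "<span><p>Example</p></span><span><p>Example</p></span></h1>")
parser = H1Text()
parser.feed(nested)
print(parser.text)  # ExampleExampleExampleExample
```

With the flattened heading, the same extraction yields just “Example”, which is what we wanted Google to index.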

Another issue was that some product details, such as product quantities and customer reviews, were not rendered or indexed by Google at all. These elements are crucial for eCommerce websites and directly affect search visibility and rankings. We discussed this with the client, and they agreed to pre-render these elements.

5. What if Google rendered everything?

During our analysis, we quickly found that most of the key elements, including the text, links, and menus, were being rendered and indexed by Google. Apart from the components and tags mentioned above, we found few additional rendering problems.

The client was already using various pre-rendering techniques, which ensured that important elements were included in the initial HTML response. At first glance, this suggested that rendering itself was not the problem.

However, despite the proper rendering of content, there was still a noticeable decline in SEO performance. So, we took a deeper dive into the implementation.

6. Investigating the loss of featured snippets and sitelinks

Our primary clue was the loss of featured snippets and sitelinks in the SERPs. Before the migration, these elements were sourced directly from text on product and category pages. After the migration, we noticed that the format of this important content had changed.

Pre-Migration:
The product and category pages contained a plain HTML text block of around 400 words, which included internal links.

Post-Migration:
The same text was now placed in an accordion-style dropdown, initially hidden with aria-expanded="false" and revealed through a JavaScript event (a click, in this case).

This change explained why Google wasn’t using this content for featured snippets and sitelinks: since the content was hidden from users by default, Google likely considered it less important and excluded it from those features.

7. How to fix this?

There are several ways to resolve this issue: for example, removing the accordion entirely, or setting aria-expanded="true" so the content is initially visible to both users and search engines. Our client opted for the latter, as removing the accordion was not feasible. With the component expanded by default, the content remains visible at all times without requiring any user interaction.
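As a quick regression check after a fix like this, you can scan the rendered HTML for content that is still collapsed by default. This is a naive, regex-based sketch rather than a full DOM audit; it simply counts aria-expanded="false" occurrences:

```python
# Sketch: count elements rendered collapsed by default. Naive regex
# check, not a DOM audit: it only matches the aria-expanded attribute.
import re

def collapsed_accordion_count(html: str) -> int:
    return len(re.findall(r'aria-expanded\s*=\s*["\']false["\']', html))

page = (
    '<button aria-expanded="false">Details</button>'
    '<button aria-expanded="true">Shipping</button>'
)
print(collapsed_accordion_count(page))  # 1
```

Run against the rendered HTML of key templates, a non-zero count flags pages where important text may still be hidden behind a click.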

8. Further technical issues discovered

During our investigation, we identified additional technical issues that could be affecting SEO performance.

One such issue was the deeper nesting of elements in the DOM structure post-migration. Headers, list items, links, videos, and other structural elements were nested more deeply than before, which can make it harder for crawlers to understand the contextual relevance of these elements. While deeper nesting could affect crawling efficiency, we did not expect it to have as significant an impact on SEO performance as the other issues.
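Nesting depth is easy to measure if you want to compare snapshots from before and after a migration. A rough sketch using Python’s built-in html.parser, assuming well-formed markup with explicit closing tags (void elements like <br> or <img> would need extra handling):

```python
# Sketch: measure maximum DOM nesting depth of an HTML snapshot.
# Assumes well-formed markup where every opened tag is closed.
from html.parser import HTMLParser

class DepthMeter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0
        self.max_depth = 0

    def handle_starttag(self, tag, attrs):
        self.depth += 1
        self.max_depth = max(self.max_depth, self.depth)

    def handle_endtag(self, tag):
        self.depth -= 1

def max_dom_depth(html: str) -> int:
    meter = DepthMeter()
    meter.feed(html)
    return meter.max_depth

# A pre-migration heading vs. the same heading buried in wrappers:
print(max_dom_depth("<main><h1>Example</h1></main>"))  # 2
print(max_dom_depth(
    "<main><div><div><span><h1>Example</h1></span></div></div></main>"
))  # 5
```

Comparing these numbers per template gives you a concrete before/after metric instead of an impression that the DOM “feels deeper”.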

While we didn’t address all the issues identified, such as the deep nesting of DOM elements, we focused on optimizing the areas with the greatest potential to enhance organic performance. By ensuring that key content, like product details, reviews, headers, and internal links, is rendered and indexed by Google, we expect a significant impact on rankings. We’ve already rolled out some of these improvements, with more planned for the next development sprint. In several markets for this client, we’ve already begun to see positive results, including the return of sitelinks. We believe that with these updates, along with other ongoing SEO efforts, we’ll be able to recover the SEO performance seen prior to the migration.

9. Key takeaways

Migrating from a static HTML website to a JavaScript-based one presents unique challenges for SEO. Simply comparing the browser-rendered page with the version indexed by Google in Search Console will help you identify some problems. But if your client is already using pre-rendering techniques and Google renders most important elements, it may be time to look deeper into how those elements are presented.

By analyzing your data, identifying exactly what has changed in terms of both performance and site structure, and comparing that with Google’s ability to render and index your content, you can pinpoint the specific issues impacting your SEO.

When working with JavaScript-based websites, it’s crucial to ensure that content is both properly rendered and structured in a way that Google can easily understand and index. A strong SEO strategy should not only focus on optimizing for search engines but also consider AI platforms, which rely on static data and do not execute JavaScript. This approach will help future-proof your site, ensuring visibility for both Google and AI models and supporting long-term SEO success.