
Google’s JavaScript Warning & How It Relates To AI Search


A recent discussion among the Google Search Relations team highlights a challenge in web development: getting JavaScript to work well with modern search tools.

In Google’s latest Search Off The Record podcast, the team discussed the growing use of JavaScript and the tendency to use it when it’s not needed.

Martin Splitt, search developer advocate at Google, noted that JavaScript was created to help websites compete with mobile apps, bringing features like push notifications and offline access.

However, the team cautioned that the excitement around JavaScript functionality can lead to overuse.

While JavaScript is convenient in many cases, it is not the best choice for every part of a website.

The JavaScript spectrum

Splitt described the current landscape as a spectrum between traditional websites and web applications.

Splitt says:

“We’re in this weird state where websites can be just that — websites, basically pages and information that’s presented across multiple pages and linked, but they can also be apps.”

He offered the following example of the JavaScript spectrum:

“You can browse apartments in a browser… it’s a website because it shows data like square footage, what floor is this on, what’s the address… but it’s also an app because you can use the 3D view to walk around the apartment.”

Why is this important?

John Mueller, Google Search Advocate, noted a common tendency among developers to over-rely on JavaScript:

“A lot of people love these JavaScript frameworks and use them for things where JavaScript really makes sense, and then they think, ‘Why shouldn’t I just use it for everything?'”

While listening to the discussion, I remembered a study I covered a few weeks ago. According to that study, over-reliance on JavaScript can cause problems for AI search crawlers.

Given the growing importance of AI search tools, I felt it was important to highlight this conversation.

While JavaScript is usually well supported by traditional search engines, its implementation requires more attention in the age of AI search.

A study reveals that AI bots account for a growing share of search crawler traffic, but these indexing bots cannot execute JavaScript.

This means you could lose traffic from AI search engines such as ChatGPT Search if you rely too heavily on JavaScript.
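To make that concrete, here is a rough sketch (mine, not from the podcast or the study) of what a crawler that cannot execute JavaScript ends up seeing. The helper `visibleTextWithoutJs` is a hypothetical simulation, not a real crawler's pipeline:

```javascript
// Simulate what a non-JS crawler "sees" by stripping <script> blocks and
// remaining tags from a page, leaving only the static text content.
function visibleTextWithoutJs(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "") // drop script blocks entirely
    .replace(/<[^>]+>/g, " ")                   // drop all remaining tags
    .replace(/\s+/g, " ")
    .trim();
}

// A client-rendered page: the content only exists after JavaScript runs.
const clientRendered = `<div id="app"></div>
<script>document.getElementById("app").textContent = "2-bed apartment, 850 sq ft";</script>`;

// A server-rendered page: the same content is in the initial HTML.
const serverRendered = `<div id="app">2-bed apartment, 850 sq ft</div>`;

console.log(visibleTextWithoutJs(clientRendered)); // "" — nothing for the crawler
console.log(visibleTextWithoutJs(serverRendered)); // "2-bed apartment, 850 sq ft"
```

The client-rendered version is invisible to a bot that never executes the script, even though both pages look identical in a browser.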

Things to consider

The use of JavaScript and the limitations of AI crawlers present several important considerations:

  1. Server-side rendering: Since AI crawlers cannot execute client-side JavaScript, server-side rendering is essential to ensure your content is visible to them.
  2. Content accessibility: Major AI crawlers such as GPTBot and Claude have clear preferences for content consumption. GPTBot prefers HTML content (57.7%), while Claude focuses more on images (35.17%).
  3. A new development approach: These new limitations may require rethinking the traditional “JavaScript-first” development strategy.
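The first point above can be sketched in a few lines. This is a minimal, illustrative server-side render function; `renderListingPage` and the listing fields are hypothetical names I chose for the example, not an API from the podcast:

```javascript
// Minimal server-side rendering sketch: the page content is embedded
// directly in the HTML string the server sends, so a crawler that cannot
// execute JavaScript still sees it. The listing data is hypothetical.
function renderListingPage(listing) {
  return `<!doctype html>
<html>
  <head><title>${listing.address}</title></head>
  <body>
    <h1>${listing.address}</h1>
    <p>Floor: ${listing.floor}, ${listing.squareFootage} sq ft</p>
    <!-- Interactive extras (e.g. a 3D viewer) can still load via JS -->
    <script src="/viewer.js" defer></script>
  </body>
</html>`;
}

const page = renderListingPage({
  address: "12 Example Ave",
  floor: 3,
  squareFootage: 900,
});
console.log(page.includes("12 Example Ave")); // core content present without JS
```

The interactive "app" parts of the page can still load through JavaScript afterwards; the point is that the core information does not depend on them.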

The way forward

As AI crawlers become more important to how websites are discovered, you need to balance modern features with accessibility for AI crawling.

Here are some recommendations:

  • Use server-side rendering for key content.
  • Be sure to include the base content in the initial HTML.
  • Apply progressive enhancement techniques.
  • Be careful when using JavaScript.
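The progressive enhancement recommendation can be sketched like this. The names here (`enhanceGalleryLink`, `/gallery`, `data-viewer`) are illustrative choices of mine, not anything the article prescribes:

```javascript
// Progressive enhancement sketch: the baseline HTML works with no
// JavaScript at all; a separate step layers extra behavior on top.

// Baseline: a plain server-rendered link. It works for every crawler
// and every user, with or without JavaScript.
const baseline = '<a class="gallery" href="/gallery">View floor plan</a>';

// Enhancement: when scripting is available, mark the link so client code
// can open an in-page viewer instead of a full navigation. The href stays
// in place, so nothing breaks if this step never runs.
function enhanceGalleryLink(html) {
  return html.replace('<a ', '<a data-viewer="inline" ');
}

const enhanced = enhanceGalleryLink(baseline);
console.log(enhanced.includes('href="/gallery"')); // fallback link survives
```

In a real browser you would attach event listeners rather than rewrite strings, but the design choice is the same: the no-JavaScript path is the foundation, and scripting only adds to it.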

To succeed, make your website work for both traditional search engines and AI-driven indexing tools, while maintaining a good user experience.






