A recent discussion among the Google Search Relations team highlights a challenge in web development: getting JavaScript to work well with modern search tools.
In Google’s latest Search Off The Record podcast, the team discussed the growing use of JavaScript and the tendency to use it when it’s not needed.
Martin Splitt, search developer advocate at Google, noted that JavaScript was created to help websites compete with mobile apps, bringing features like push notifications and offline access.
However, the team cautioned that the excitement around JavaScript functionality can lead to overuse.
While JavaScript is convenient in many cases, it is not the best choice for every part of a website.
Splitt described the current landscape as a spectrum between traditional websites and web applications.
He says:
“We’re in this weird state where websites can be just that — websites, basically pages and information that’s presented across multiple pages and linked, but they can also be apps.”
He offered the following example of the JavaScript spectrum:
“You can browse apartments in a browser… it’s a website because it shows data like square footage, what floor is this on, what’s the address… but it’s also an app because you can use the 3D view to walk around the apartment.”
John Mueller, Google Search Advocate, noted a common tendency among developers to over-rely on JavaScript:
“A lot of people love these JavaScript frameworks and use them for things where JavaScript really makes sense, and then they think, ‘Why shouldn’t I just use it for everything?'”
While listening to the discussion, I remembered a study I covered a few weeks ago. According to the study, over-reliance on JavaScript can cause problems for AI-powered search engines.
Given the growing importance of AI search tools, I felt it was important to highlight this conversation.
While JavaScript is usually well supported by traditional search engines, its implementation requires more attention in the age of AI search.
A study found that AI bots account for a growing share of crawler traffic, but these bots cannot render JavaScript. Any content that only appears after client-side scripts run is invisible to them, which means you could lose traffic from tools like ChatGPT Search if you rely too heavily on JavaScript.
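To make the difference concrete, here is a minimal sketch (the markup and extractor are hypothetical, for illustration only) contrasting what a non-JavaScript crawler sees on a server-rendered page versus a client-rendered one, using Python's standard-library HTML parser:

```python
from html.parser import HTMLParser

# Hypothetical markup: the same apartment listing served two ways.
SERVER_RENDERED = """
<html><body>
  <h1>2-bedroom apartment</h1>
  <p>Square footage: 850 sq ft, 3rd floor</p>
</body></html>
"""

CLIENT_RENDERED = """
<html><body>
  <div id="root"></div>
  <script src="/bundle.js"></script>
</body></html>
"""

class TextExtractor(HTMLParser):
    """Collects visible text while skipping <script> contents --
    roughly what a crawler that cannot execute JavaScript sees."""
    def __init__(self):
        super().__init__()
        self.in_script = False
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script":
            self.in_script = True

    def handle_endtag(self, tag):
        if tag == "script":
            self.in_script = False

    def handle_data(self, data):
        if not self.in_script and data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

print(visible_text(SERVER_RENDERED))  # apartment details are indexable
print(visible_text(CLIENT_RENDERED))  # empty: the content lives in bundle.js
```

The client-rendered version yields no text at all, because everything meaningful is injected by JavaScript that such a bot never executes.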
The use of JavaScript and the limitations of AI crawlers present several important considerations:
As AI crawlers become more important for website indexing, you need to balance modern features with accessibility to those crawlers.
Here are some recommendations:
To succeed, make your website work for both traditional search engines and AI-powered indexing tools, while still delivering a good user experience.
Listen to the entire podcast episode below:
Featured Image: Ground Picture/Shutterstock