
Google Is Requiring JavaScript To Block SEO Tools


Google has changed how search results are served, a move that also helps protect against bots and scrapers. Whether this will further affect SEO tools, or whether they can work around it with headless Chrome running JavaScript, remains an open question for now, but it is likely that Google uses rate limiting to reduce the number of pages that can be requested in a given time period.

Google search now requires JavaScript

Google has quietly updated its search box to require all users, including bots, to have JavaScript turned on when searching.

Browsing Google Search without JavaScript enabled results in the following message:

Please enable JavaScript to continue searching
The browser you are using has JavaScript turned off. Turn it on to continue searching.

Screenshot of Google search JavaScript message

In an email sent to TechCrunch, a Google spokesperson shared the following details:

“Enabling JavaScript allows us to better protect our services and users from bots and new forms of abuse and spam, …and provide the most relevant and up-to-date information.”

JavaScript likely enables personalization in the search experience, which is what that spokesperson may mean by providing the most relevant information. But JavaScript can also be used to block bots.

Using the latest version of Chrome, I copied a piece of JavaScript and ran it through ChatGPT to ask what it was doing. One part may relate to limiting the abuse of document requests.

Screenshot of Chrome Dev Tools

ChatGPT gave me the following feedback:

“Basic functionalities
Random value generation (rdb)

Generates a random value based on the properties (D_d, idc and p4b) of the input object a, bounded by p7d.
This can be used for rate limiting, exponential backoff, or similar logic.

Purpose and context
Based on its components, the script:

Probably handles request retries or access control for web resources.

It implements a policy enforcement system, where:

The rules determine whether requests are valid.

Errors are logged and retries are sometimes made based on the rules.

Random delays or constraints can control the retry mechanism.

It appears optimized for fault handling and resiliency in distributed or high-traffic systems, likely within a Google service or API.”

ChatGPT said the code may use rate limiting, which is a way to limit the number of actions a user or system can take within a certain time period.

Rate limiting:

It is used to impose a limit on the number of actions (e.g., API requests) that a user or system can perform within a specified time frame.
In this code, the random values generated by rdb can be used to introduce variability in when or how often requests are allowed, which helps manage traffic efficiently.
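To make the pattern concrete, here is a minimal token-bucket rate limiter sketch. This is purely illustrative of the general technique ChatGPT described; the class and method names are my own assumptions, not anything from Google's actual script.

```javascript
// Hypothetical sketch of a token-bucket rate limiter.
// Each request spends one token; tokens refill at a fixed rate,
// so bursts are allowed up to `capacity` and sustained traffic is capped.
class TokenBucket {
  constructor(capacity, refillPerSecond) {
    this.capacity = capacity;          // max burst size
    this.tokens = capacity;            // start full
    this.refillPerSecond = refillPerSecond;
    this.lastRefill = Date.now();
  }

  refill() {
    const now = Date.now();
    const elapsedSec = (now - this.lastRefill) / 1000;
    this.tokens = Math.min(
      this.capacity,
      this.tokens + elapsedSec * this.refillPerSecond
    );
    this.lastRefill = now;
  }

  tryRequest() {
    this.refill();
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;  // request allowed
    }
    return false;   // request rejected: over the rate limit
  }
}

// Allow a burst of 5 requests, refilling at 1 request per second.
const bucket = new TokenBucket(5, 1);
const results = [];
for (let i = 0; i < 8; i++) {
  results.push(bucket.tryRequest());
}
console.log(results); // first 5 allowed, remaining 3 rejected
```

A real service would track one bucket per client (per IP, cookie, or session), which is how a scraper requesting thousands of pages gets throttled while normal users are unaffected.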

Exponential backoff:

ChatGPT explained that exponential backoff is a way to limit how often a user or system retries a failed action: the waiting period between retries grows exponentially.
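The idea above can be sketched in a few lines. This is a generic illustration of exponential backoff with random jitter, not Google's code; the function name and parameters are assumptions.

```javascript
// Hedged sketch: exponential backoff with "full jitter".
// The retry window doubles on every failed attempt, capped at maxMs,
// and the actual wait is randomized so many clients don't retry in lockstep.
function backoffDelay(attempt, baseMs = 100, maxMs = 10000) {
  // Window doubles each attempt: 100, 200, 400, 800, ... up to maxMs
  const windowMs = Math.min(maxMs, baseMs * 2 ** attempt);
  // Randomize within [0, windowMs) to spread retries out
  return Math.random() * windowMs;
}

// Show how the retry windows grow for the first five attempts.
for (let attempt = 0; attempt < 5; attempt++) {
  const windowMs = Math.min(10000, 100 * 2 ** attempt);
  console.log(`attempt ${attempt}: wait up to ${windowMs} ms`);
}
```

The random jitter is the part that connects back to the rdb function ChatGPT flagged: combining exponential growth with randomness prevents a wave of failed bots from all retrying at the same instant.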

Similar logic:

ChatGPT explained that random value generation can be used to manage access to resources to prevent malicious requests.

I don’t know for certain that this is what that specific JavaScript does; that is ChatGPT’s interpretation. But it does match Google’s statement that it uses JavaScript as part of its bot-blocking strategy.


