Google Declares War on SEO Tools with JavaScript

Google confirms that Search now requires JavaScript, a change meant to block bots and scrapers, including SEO tools.

Google has updated how it serves search results, a change that also strengthens its defenses against bots and scrapers. How this affects SEO tools remains uncertain, particularly whether they can continue to use headless Chrome with JavaScript enabled. It is also likely that Google is applying rate limiting to restrict how many pages can be requested within a given timeframe.

Update: SERPrecon

Ryan Jones’ SERPrecon is back up and running, according to a tweet:

“Good news. We are back up and running. Thanks for bearing with us.”

SERPrecon allows users to compare their search results with competitors over time using vectors, machine learning, and natural language processing. It is likely one of the most valuable SEO tools available at a reasonable price.

Google Search Now Requires JavaScript

Google has updated its search to require all users, including bots, to have JavaScript enabled in order to search.

Visiting Google Search with JavaScript turned off results in the following message:

Turn on JavaScript to keep searching
The browser you’re using has JavaScript turned off. To continue your search, turn it on.

In an email to TechCrunch, a Google spokesperson shared the following details:

“Enabling JavaScript allows us to better protect our services and users from bots and evolving forms of abuse and spam, …and to provide the most relevant and up-to-date information.”

JavaScript may enhance personalization in the search experience, which could be what the spokesperson meant by providing the most relevant information. However, it can also be used to block bots.

Using the latest version of Chrome, I copied some of the JavaScript via Chrome Dev Tools and ran it through ChatGPT to understand its functionality. One component of the code appears to be related to limiting abusive requests for documents.

[Screenshot: the JavaScript code as shown in Chrome Dev Tools]

ChatGPT gave me the following feedback:

“Core Functionalities

Randomized Value Generation (rdb)

- Generates a random value based on properties (D_d, idc, and p4b) of the input object a, constrained by p7d.
- This may be used for rate-limiting, exponential backoff, or similar logic.

Purpose and Context

From its components, the script:

- Likely handles request retries or access control for web resources.
- Implements a policy enforcement system, where:
  - Policies determine if requests are valid.
  - Errors are logged and sometimes retried based on rules.
  - Randomized delays or limits might control the retry mechanism.
- Appears optimized for error handling and resilience in distributed or high-traffic systems, possibly within a Google service or API.”

ChatGPT mentioned that the code may implement rate limiting, which is a method used to restrict the number of actions a user or system can perform within a certain timeframe.
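To make the concept concrete, here is a minimal sketch of a fixed-window rate limiter in JavaScript. It is illustrative only, not the code from Google's page, and the limit and window values are arbitrary:

```javascript
// Minimal fixed-window rate limiter (illustrative only, not Google's code).
// Allows at most `limit` requests per `windowMs` milliseconds per client key.
class RateLimiter {
  constructor(limit, windowMs) {
    this.limit = limit;
    this.windowMs = windowMs;
    this.hits = new Map(); // key -> { count, windowStart }
  }

  allow(key) {
    const now = Date.now();
    const entry = this.hits.get(key);
    if (!entry || now - entry.windowStart >= this.windowMs) {
      // Start a fresh window for this client.
      this.hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    if (entry.count < this.limit) {
      entry.count += 1;
      return true;
    }
    return false; // Over the limit: reject or delay the request.
  }
}

// Example: allow at most 10 requests per minute per client.
const limiter = new RateLimiter(10, 60_000);
console.log(limiter.allow('client-a')); // true until the limit is reached
```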

Exponential Backoff:

Exponential backoff is a retry strategy in which a failed action is retried after a waiting period, with the interval between successive retries increasing exponentially.
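A minimal sketch of the pattern in JavaScript (again illustrative, not Google's code; the retry counts and delays are arbitrary):

```javascript
// Illustrative exponential backoff: the wait time doubles after each
// failed attempt, up to a fixed number of retries.
async function withBackoff(fn, maxRetries = 5, baseDelayMs = 500) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return await fn(); // Success: return the result immediately.
    } catch (err) {
      if (attempt === maxRetries) throw err; // Out of retries: give up.
      const delayMs = baseDelayMs * 2 ** attempt; // 500, 1000, 2000, 4000, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```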

Similar Logic:

ChatGPT mentioned that random value generation can be used to manage access to resources and prevent abusive requests. While I can't confirm that this specific JavaScript is being used for that purpose, it aligns with the information shared by Google, indicating that they incorporate JavaScript into their strategy for blocking bots.
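One common use of randomization in this context is retry "jitter": spreading retries out randomly so that many blocked clients do not all hit the server at the same instant. The following sketch shows the idea; it is a generic pattern, not code taken from Google's script:

```javascript
// "Full jitter": pick a random delay between 0 and an exponentially
// growing cap, so retries from many clients are spread out over time.
function jitteredDelayMs(attempt, baseDelayMs = 500, maxDelayMs = 30_000) {
  const cap = Math.min(maxDelayMs, baseDelayMs * 2 ** attempt);
  return Math.random() * cap;
}
```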

Semrush, for its part, commented in a LinkedIn discussion that a delay users experienced was due to maintenance and was unrelated to Google's JavaScript requirement:

“Hi Natalia Witczyk, the delay you saw yesterday was due to general maintenance within our Position Tracking tool, we are not experiencing any issues related to the event with Google but will continue to monitor the situation. We’d recommend refreshing your project, if you are still having issues please send us a DM or reach out to our support…”

Will This Make SEO Tools More Expensive?

Search marketers commenting on social media have noted that working around the JavaScript requirement increases the demand for crawling resources. That increased demand may, in turn, be passed on to users as higher rates.

Vahan Petrosyan, Director of Technology at Search Engine Journal, observed:

“Scraping Google with JavaScript requires more computing power. You often need a headless browser to render pages. That adds extra steps, and it increases hosting costs. The process is also slower because you must wait for JavaScript to load. Google may detect such activity more easily, so it can be harder to avoid blocks. These factors make it expensive and complicated for SEO tools to ‘turn on’ JavaScript.”
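To illustrate why rendering is heavier than a plain HTTP request, here is a generic sketch using Puppeteer, a Node.js headless-Chrome library. It assumes Puppeteer is installed and uses a placeholder URL; it is not taken from any particular SEO tool:

```javascript
// Fetching rendered HTML with a headless browser (Puppeteer, Node.js).
// This launches Chrome, loads the page, and waits for JavaScript to run,
// which costs far more CPU and memory than a simple HTTP GET.
const puppeteer = require('puppeteer');

async function fetchRenderedHtml(url) {
  const browser = await puppeteer.launch({ headless: true });
  try {
    const page = await browser.newPage();
    // Wait until network activity settles, i.e. the page's scripts have run.
    await page.goto(url, { waitUntil: 'networkidle2', timeout: 30_000 });
    return await page.content(); // The post-JavaScript DOM as HTML.
  } finally {
    await browser.close();
  }
}

fetchRenderedHtml('https://example.com').then((html) => {
  console.log(`${html.length} bytes of rendered HTML`);
});
```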
