Google has made a change to how its search results are served, which will also help secure them against bots and scrapers. Whether this will have a further effect on SEO tools, or whether they can adapt by using a headless Chrome that executes JavaScript, remains an open question at the moment. It is likely that Google is also using rate limiting to throttle how many pages can be requested within a set period of time.
Update: SERPrecon
Ryan Jones’ (LinkedIn profile) SERPrecon is back up and running, according to a tweet:
“Good news. We are back up and running. Thanks for bearing with us.”
SERPrecon enables users to compare their search results against competitors over time, using “vectors, machine learning and natural language processing.” It is quite likely one of the more useful SEO tools available, and it is reasonably priced, too.
Google Search Now Requires JavaScript
Google quietly updated its search to require all users, including bots, to have JavaScript turned on when searching.
Surfing Google Search without JavaScript turned on results in the following message:
Turn on JavaScript to keep searching
The browser you’re using has JavaScript turned off. To continue your search, turn it on.
Screenshot Of Google Search JavaScript Message
In an email to TechCrunch, a Google spokesperson shared the following details:
“Enabling JavaScript allows us to better protect our services and users from bots and evolving forms of abuse and spam, …and to provide the most relevant and up-to-date information.”
JavaScript may enable personalization in the search experience, which could be what the spokesperson meant by providing the most relevant information. But JavaScript can also be used to block bots.
Using the latest version of Chrome, I copied some of the JavaScript via Chrome Dev Tools and ran it through ChatGPT, asking what it does. One part of it may relate to restricting abusive requests for documents.
Screenshot Of Chrome Dev Tools
ChatGPT gave me the following feedback:
“Core Functionalities
Randomized Value Generation (rdb): Generates a random value based on properties (D_d, idc, and p4b) of the input object a, constrained by p7d. This may be used for rate-limiting, exponential backoff, or similar logic.
Purpose and Context
From its components, the script:
Likely handles request retries or access control for web resources.
Implements a policy enforcement system, where:
Policies determine if requests are valid.
Errors are logged and sometimes retried based on rules.
Randomized delays or limits might control the retry mechanism.
Appears optimized for error handling and resilience in distributed or high-traffic systems, possibly within a Google service or API.”
ChatGPT said that the code may use rate-limiting, which is a way to limit the number of actions a user or a system can take within a specific time period.
Rate-Limiting:
Used to enforce a limit on the number of actions (e.g., API requests) a user or system can perform within a specific time frame.
In this code, the random values generated by rdb could be used to introduce variability in when or how often requests are allowed, helping to manage traffic effectively.
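To make the concept concrete, here is a minimal rate limiter in Python. This is a generic sketch of the technique, not the code running on Google’s servers; the TokenBucket class, its rate and capacity parameters, and the five-requests-per-second figure are invented purely for illustration.

```python
import time

class TokenBucket:
    """Simple token-bucket rate limiter: allows up to `capacity` requests,
    refilled at `rate` tokens per second."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate
        self.capacity = capacity
        self.tokens = float(capacity)
        self.last_refill = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last_refill) * self.rate)
        self.last_refill = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Example: allow at most 5 requests per second per client (illustrative numbers).
limiter = TokenBucket(rate=5, capacity=5)
for i in range(10):
    print(i, "allowed" if limiter.allow() else "throttled")
```

Requests that arrive faster than the configured rate are rejected until the bucket refills, which is the general idea behind throttling how many pages can be requested within a set period of time.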
Exponential Backoff:
ChatGPT explained that exponential backoff is a way to limit the number of retries a user or system is allowed to make for a failed action. The waiting period between retries increases exponentially (a sketch of this appears below).
Similar Logic:
ChatGPT explained that random value generation could be used to manage access to resources to prevent abusive requests.
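Again, this is not Google’s actual implementation, but a minimal Python sketch can show how exponential backoff and random values typically work together; the retry_with_backoff function, its parameters, and the commented-out fetch_page call are all hypothetical, written only to illustrate the pattern.

```python
import random
import time

def retry_with_backoff(operation, max_retries=5, base_delay=0.5, max_delay=30.0):
    """Retry `operation`, doubling the wait after each failure and adding a random
    jitter so repeated requests don't all arrive at the same moment."""
    for attempt in range(max_retries):
        try:
            return operation()
        except Exception:
            if attempt == max_retries - 1:
                raise  # give up after the final allowed attempt
            delay = min(max_delay, base_delay * (2 ** attempt))
            time.sleep(delay + random.uniform(0, delay))  # exponential backoff plus jitter

# Example usage with a hypothetical flaky function:
# result = retry_with_backoff(lambda: fetch_page("https://example.com/"))
```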
I don’t know for certain that this is what that specific JavaScript is doing; that is ChatGPT’s interpretation of it. But it does match Google’s statement that JavaScript is part of its strategy for blocking bots.
Semrush Says It Was Never Affected
Semrush commented in a LinkedIn discussion that the delay some users saw was due to maintenance and had nothing to do with Google’s JavaScript requirement.
They wrote:
“Hi Natalia Witczyk, the delay you saw yesterday was due to general maintenance within our Position Tracking tool, we are not experiencing any issues related to the event with Google but will continue to monitor the situation. We’d recommend refreshing your project, if you are still having issues please send us a DM or reach out to our support…”
Will This Make SEO Tools More Expensive?
One observation made by search marketers across social media is that working around the blocks may increase the resources required for crawling, a cost that may be passed on to users in the form of price increases.
Vahan Petrosyan, Director of Technology at Search Engine Journal, observed:
“Scraping Google with JavaScript requires more computing power. You often need a headless browser to render pages. That adds extra steps, and it increases hosting costs. The process is also slower because you have to wait for JavaScript to load. Google may detect such activity more easily, so it can be harder to avoid blocks. These factors make it expensive and complicated for SEO tools to simply “turn on” JavaScript.”
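To illustrate why that is, the sketch below shows what fetching a JavaScript-rendered page with a headless browser typically involves, using the Playwright library in Python. The URL is a placeholder, the fetch_rendered_html function is hypothetical, and any real crawl would still run into the kinds of blocks described above.

```python
# A minimal sketch; assumes Playwright is installed:
#   pip install playwright && playwright install chromium
from playwright.sync_api import sync_playwright

def fetch_rendered_html(url: str) -> str:
    """Launch headless Chromium, let the page's JavaScript run, and return the rendered HTML."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)   # each browser instance consumes real CPU and RAM
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")     # wait for scripts and network activity to settle
        html = page.content()                        # HTML after JavaScript has executed
        browser.close()
    return html

# Placeholder URL for illustration only.
print(len(fetch_rendered_html("https://example.com/")))
```

A plain HTTP request returns in a single round trip, while the headless-browser version has to start a browser process and wait for scripts to finish loading, which is where the extra compute, time, and hosting cost come from.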