
Google’s John Mueller Predicts Dynamic Rendering Won’t Be Needed in a Few Years

Google’s John Mueller predicts that dynamic rendering will only be a temporary workaround for helping web crawlers process JavaScript.

Eventually, Mueller believes, all web crawlers will be able to process JavaScript, so in a few years relying on dynamic rendering may no longer be necessary.
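For context, dynamic rendering in its usual form means switching on the requesting user agent: known crawlers receive pre-rendered static HTML, while regular visitors receive the client-side JavaScript app. Here is a minimal sketch in TypeScript using Express; the bot list, file paths, and pre-rendering step are all hypothetical, not a reference implementation:

```typescript
// Minimal dynamic-rendering sketch (hypothetical setup): known crawlers
// get pre-rendered HTML, everyone else gets the client-side JavaScript app.
import express from "express";

const app = express();

// Partial list of crawler user-agent substrings; a real deployment
// would maintain a much more complete list.
const BOT_AGENTS = ["googlebot", "bingbot", "twitterbot", "facebookexternalhit"];

function isBot(userAgent: string | undefined): boolean {
  if (!userAgent) return false;
  const ua = userAgent.toLowerCase();
  return BOT_AGENTS.some((bot) => ua.includes(bot));
}

app.use((req, res) => {
  if (isBot(req.get("user-agent"))) {
    // Static HTML generated ahead of time by a pre-renderer such as a
    // headless browser. The directory layout here is illustrative.
    res.sendFile(`/var/prerendered${req.path}/index.html`);
  } else {
    // Regular visitors receive the JavaScript single-page app.
    res.sendFile("/var/www/app/index.html");
  }
});

app.listen(3000);
```

Mueller's later point about reusing the same setup for users maps onto full server-side rendering, where everyone receives the pre-rendered HTML and the JavaScript simply hydrates on top of it.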

Mueller made this prediction during a recent Google Webmaster Central hangout when a site owner asked if there’s any reason why they shouldn’t use dynamic rendering.

Here is the question that was submitted:

“We’re thinking of the option to start only serving server-side rendering for bots on some of our pages. Is this an accepted behavior by Google & friends nowadays? Or do you see any objections on why not to do this?”

In response, Mueller said dynamic rendering is definitely something that Google considers to be an acceptable solution. In the near future, however, sites won’t need to rely on it as much.

Googlebot can already process nearly every type of JavaScript page, and Mueller suspects other crawlers will follow suit.

Mueller says dynamic rendering is a temporary workaround until other crawlers catch up, though he clarifies that “temporary” might mean a couple of years.

What makes this prediction particularly interesting is that dynamic rendering was only introduced last year at Google I/O 2018.

Now, a little over a year later, Mueller predicts this innovative solution for serving JavaScript to bots will only be needed for a few years.

It will be interesting to look back on this and see how Mueller’s prediction pans out.

Here is Mueller’s full response, beginning at the 18:38 mark of the hangout:

“So you can definitely do this, from our point of view. This is what we call, I believe, dynamic rendering, which is basically when you’re pre-rendering the pages for a specific set of users. Usually, that includes crawlers, social media user agents, all of those things that are basically not normal users that wouldn’t be able to process JavaScript.

That’s certainly something you could do. Sometimes it also makes sense to use server-side rendering for users as well. Sometimes you can significantly speed up the delivery of HTML pages to them. So it’s not something that I’d consider that you only need to do for bots, it’s probably worthwhile to check to see if there are ways you can leverage that same setup for users as well. Maybe you can, maybe that doesn’t make sense in this specific case.

In any case, from our side specifically, it’s something that you can do. I suspect over time, over the long run, it will be something that you’ll have to do less and less. Googlebot is able to crawl pretty much every JavaScript-type page nowadays. I suspect other user agents will follow up with that over time as well.

So I would see this as something kind of as a temporary workaround – where temporary might mean a couple of years – but it’s more of a time-limited workaround. At some point pretty much every relevant user agent will be able to process JavaScript.”

Matt G. Southern, Senior News Writer at Search Engine Journal