
Google Can’t See Content Behind Captchas

Hiding content behind captchas is a bad SEO practice as Google's web crawler can't see what's behind them.


Google says websites can run into problems if they hide content behind captchas, as its web crawler won’t be able to see it.

Googlebot doesn’t interact with anything when it crawls webpages.

If it lands on a page with a captcha blocking the main content, it will assume that’s the only thing on the page.

There are ways around this, however. While captchas can be problematic, there’s no reason to stop using them.

Google’s John Mueller addressed this during the Search Central SEO office-hours session recorded on September 24, 2021.

The owner of a directory site writes in asking Mueller if the captchas they’ve implemented to avoid scraping can impact SEO.

In short — yes, they can impact SEO.

But there’s a way to use content-blocking captchas that doesn’t interfere with crawling or indexing.

Here’s what Mueller advises.

Google’s John Mueller on Captchas That Block Content

Mueller makes it clear that Googlebot doesn’t fill out captchas — even Google-based captchas.

If the captcha has to be filled out before accessing the content, then the content won’t get crawled.

Google will be able to index the page, but none of the content behind the captcha will be used for ranking.

“Googlebot doesn’t fill out any captchas. Even if they are Google-based captchas we don’t fill them out. So that’s something where if the captcha needs to be completed in order for the content to be visible, then we would not have access to the content.

If, on the other hand, the content is available there without needing to do anything, and the captcha is just shown on top, then usually that would be fine.”

As Mueller says, you can safely use captchas if the main content is readily accessible.

To be sure a captcha isn’t blocking Google’s view, Mueller recommends using the URL Inspection tool in Search Console.

“What I would do to test is use, in Search Console, the inspect URL tool and fetch those pages and see what comes back.

On the one hand [check] the visible page to make sure that matches the visible content. And [then check] the HTML that is rendered there to make sure that includes the content you want to have indexed. That’s kind of the approach I would take there.”

That’s one solution, but there’s still another.

If you want to completely block content with a captcha and keep it Google-friendly at the same time, you can do that too.

It involves a technique you may think is against Google’s guidelines, but Mueller confirms it doesn’t violate any policies.

Serve Googlebot a different version of the page than regular users see.

Googlebot can have a captcha-free version of the page, while users have to complete the captcha before viewing any content.

Then the content will get used for ranking, while you can still accomplish whatever your goal is with the captcha.

“From a policy point of view we’re okay with situations where you serve us the full content, and you require a captcha on the user side. If you need to do that slightly differently for Googlebot or maybe other search engines than you would for the average user from our point of view that’s fine.”
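As a rough sketch of that approach, a server could decide whether to gate the page behind a captcha based on who is asking. The function and variant names below are hypothetical, and a real setup should verify Googlebot with the reverse-plus-forward DNS lookup Google documents, rather than trusting the User-Agent header alone, since that header is trivially spoofed.

```python
# Hypothetical sketch: choose which page variant to serve.
# Variant labels and helper names are illustrative, not an official API.

GOOGLE_CRAWLER_TOKENS = ("Googlebot",)  # substring Google's crawler sends in its User-Agent
GOOGLE_HOST_SUFFIXES = (".googlebot.com", ".google.com")  # suffixes Google documents for its crawlers


def looks_like_googlebot(user_agent: str) -> bool:
    """Cheap first-pass check on the User-Agent header (easily spoofed)."""
    return any(token in user_agent for token in GOOGLE_CRAWLER_TOKENS)


def hostname_is_google(hostname: str) -> bool:
    """Check a reverse-DNS hostname against Google's documented suffixes.

    In production you would resolve the client IP (e.g. with
    socket.gethostbyaddr), check the suffix, then forward-resolve the
    hostname to confirm it maps back to the same IP.
    """
    return hostname.endswith(GOOGLE_HOST_SUFFIXES)


def page_for(user_agent: str, verified_hostname: str = "") -> str:
    """Return which variant of the page to serve."""
    if looks_like_googlebot(user_agent) and hostname_is_google(verified_hostname):
        return "full-content"   # crawler gets the content with no captcha
    return "captcha-gated"      # regular visitors must complete the captcha first


ua = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
print(page_for(ua, "crawl-66-249-66-1.googlebot.com"))  # full-content
print(page_for("Mozilla/5.0 (Windows NT 10.0)"))        # captcha-gated
```

Because Mueller says this differentiation is allowed from a policy standpoint, the key design point is only that the crawler-facing variant contains the same content you want indexed.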

Hear Mueller’s full response in the video below:


Featured Image: getronydesign / Shutterstock

Matt G. Southern, Senior News Writer at Search Engine Journal

Matt G. Southern, Senior News Writer, has been with Search Engine Journal since 2013. With a bachelor’s degree in communications, ...