
Google’s John Mueller Q&A: 4 SEO Questions Answered

Four common questions about technical SEO issues are answered by Google's John Mueller in a new video.


Google’s John Mueller answers four rapid-fire questions about common technical SEO issues that almost everyone runs into at one point or another.

Mueller addresses questions sent in by viewers about:

  • Blocking CSS files
  • Updating sitemaps
  • Re-uploading a site to the web
  • Googlebot’s crawl budget

These questions are answered in the latest installment of the Ask Googlebot video series on YouTube.

Traditionally, those videos focus on answering one specific question with as much detail as Google is able to provide.

However, not every question about SEO takes a whole video to answer. Some can be answered in one or two sentences.

Here are some quick answers to questions that are often asked by people just getting started in SEO.

Can Blocking CSS Files In Robots.txt Affect Rankings?

Yes, blocking CSS can cause issues, and Mueller says you should avoid doing that.

When CSS is blocked in robots.txt, Googlebot cannot render a page as visitors would see it.

Being able to see a page completely helps Google understand it better and confirm that it’s mobile-friendly.

That all contributes to a webpage’s ranking in search results.
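
If you want to confirm that a robots.txt rule isn’t blocking your CSS before Googlebot finds out the hard way, you can test the rules locally. Here’s a minimal sketch using Python’s built-in urllib.robotparser; the rules and URL are hypothetical examples, not taken from the video:

    from urllib.robotparser import RobotFileParser

    # Hypothetical rules: this Disallow is exactly the pattern Mueller
    # warns against, because it hides all CSS from every crawler.
    rules = [
        "User-agent: *",
        "Disallow: /assets/css/",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    css_url = "https://example.com/assets/css/main.css"
    if not parser.can_fetch("Googlebot", css_url):
        print(f"Blocked: Googlebot cannot fetch {css_url} to render the page")

Removing the Disallow line (or adding an explicit Allow for the CSS path) lets Googlebot render pages the way visitors see them.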

How Should I Update The Sitemap For My Website?

There’s no single solution for updating sitemaps that works across all websites, Mueller says.

However, most website platforms have built-in sitemap solutions of their own.

Consult your site’s help guides for a sitemap setting, or for a compatible plugin that creates sitemap files.

Usually it’s just a matter of turning the setting on, and you’re all set.
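
If your platform has no built-in option, a sitemap is just an XML file that follows the sitemaps.org protocol. Here’s a minimal sketch using Python’s standard library; the page URLs are placeholders:

    import xml.etree.ElementTree as ET
    from datetime import date

    # Placeholder URLs; a real generator would pull these from your CMS.
    pages = ["https://example.com/", "https://example.com/about"]

    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
        ET.SubElement(url, "lastmod").text = date.today().isoformat()

    ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Regenerating the file whenever pages change, and referencing it from robots.txt or Search Console, keeps Google’s copy of it current.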

What Is The Correct Way To Reintroduce A Site To Google?

It’s not possible to reset indexing for a website by deleting its files and re-uploading them.

Google will automatically focus on the newest version of a site and drop the old version over time.

You can move this process along faster by using redirects from any old URLs to the new ones.
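
A permanent (301) redirect is the signal that tells Google an old URL has moved for good. In practice the rule usually lives in your server config or a CMS plugin, but here’s a minimal sketch of the mechanism using Python’s built-in http.server, with a hypothetical URL mapping:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical mapping from old URLs to their new homes.
    REDIRECTS = {"/old-page": "https://example.com/new-page"}

    class RedirectHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            target = REDIRECTS.get(self.path)
            if target:
                # 301 marks the move as permanent, so Google indexes
                # the new URL and drops the old one faster.
                self.send_response(301)
                self.send_header("Location", target)
                self.end_headers()
            else:
                self.send_response(404)
                self.end_headers()

    HTTPServer(("", 8000), RedirectHandler).serve_forever()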

Would Deleting RSS Feeds Improve Googlebot Crawling?

A person writes in to Mueller saying 25% of Googlebot’s crawl budget is going to the RSS feed URLs referenced in the <head> section of every page.

They ask if deleting the RSS feeds would improve crawling.

Mueller says the RSS feeds are not problematic, and Google’s systems balance crawling across a website automatically.

Sometimes that results in Google crawling certain pages more often, but pages will only be re-crawled after Googlebot has seen all the important pages at least once.


Featured Image: Screenshot from YouTube.com/GoogleSearchCentral
