Google’s John Mueller illustrated how Googlebot processes sitemap files with a memorable analogy involving energy drinks.
In a Reddit thread, a user asked whether it’s redundant for a client site to have multiple sitemap files. Mueller replied that it doesn’t matter: all of a site’s sitemap files are imported together and given to Googlebot as one big set of data. Or, as Mueller puts it:
“All sitemap files of a site are imported into a common, big mixing cup, lightly shaken, and then given to Googlebot by URL in the form of an energy drink. It doesn’t matter how many files you have.”
As Mueller describes the process, Googlebot never knows how many sitemap files a site has, because all it receives is a single list of URLs. The number of sitemaps is therefore irrelevant; Google treats them all as one file anyway.
One thing that does matter, Mueller adds, is providing Googlebot with last-modification dates. Site owners should supply this information and make sure the dates on the URLs in their sitemap files are consistent and accurate.
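For reference, here’s a minimal sketch of how a last-modification date appears in a sitemap file under the sitemaps.org protocol. The URLs and dates are placeholders, not taken from Mueller’s comments:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- lastmod should reflect when the page's primary content last changed -->
  <url>
    <loc>https://www.example.com/blog/post-1</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/post-2</loc>
    <lastmod>2020-11-03</lastmod>
  </url>
</urlset>
```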
The last-modification date can be a useful signal when used correctly, Mueller continues in another comment. It isn’t useful to set the last-modification date to the date/time when the sitemap was generated, as that’s not when the primary content actually changed.
SEOs and site owners should also watch out for servers that dynamically return the current date as the last-modification date for every URL, as that isn’t useful either.
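Both pitfalls produce the same telltale pattern: a sitemap where every lastmod value is identical and tied to when the sitemap was built rather than to actual content changes. A hypothetical sketch of what to avoid, again with placeholder URLs and dates:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Anti-pattern: every URL carries the sitemap's generation date -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/blog/post-1</loc>
    <lastmod>2021-02-01</lastmod> <!-- when the sitemap was built, not when content changed -->
  </url>
  <url>
    <loc>https://www.example.com/blog/post-2</loc>
    <lastmod>2021-02-01</lastmod> <!-- the same date on every URL signals nothing -->
  </url>
</urlset>
```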