
No, These 15 Things Are Not Ranking Factors for Google

Are social signals, accessibility, XML sitemaps, content length and more actually Google Search ranking factors? Here's what you need to know.

In the wake of a few recent Twitter and SEO forum arguments about ranking factors, I wanted to dispel some common misconceptions about what is and is not a ranking factor.

There are plenty of things related to, correlated to, or associated with ranking factors that are not (or most likely are not) ranking factors themselves.

Why do we assume some of these non-factors are part of Google’s algorithm? Usually because they correlate with things that really are ranking factors.

In this post, you’ll find some of the non-factors most commonly brought up by other SEO professionals or clients. I’ve tried to explain why each isn’t technically a ranking factor and included comments from Googlers where relevant.

Website Age

I keep seeing this one in all of the ranking factors lists out there, despite the fact that Google has said it isn’t a factor.

Sure, these things are correlated, but correlation doesn’t equal causation.

Domains that have been around for a while have had that much longer to accrue all of the signals that go into ranking.

If your site is older, it likely has more content and links, as it’s had more time to get customers and word of mouth, etc.

Age is not the factor here. It’s the other signals that tend to come with age, none of which actually require age to earn.

Domain Registration Period

The same goes for domain registration length. Registration length is something you simply buy, and it wouldn’t make sense for Google to reward a signal anybody can pay for.

Users don’t care how long you’ve registered your domain. It doesn’t make your site more or less relevant to their query.

Does it correlate? Sure, because spammers usually don’t pony up for multiple years of registration.

You know who else doesn’t pony up for multiple years? Small businesses or companies who don’t want that expense all at once.

With auto-renew features on registrars now, it’s not really an issue to go yearly. When you own hundreds or thousands of domains, it’s better for tax reasons too.

There are better ways of determining authority.

Google does have a patent on using registration length, but that doesn’t mean they’re using it for ranking purposes. That’s not how patents work; companies patent plenty of methods they never put into production.

Take this Time Machine Patent, for example. Getting a patent on a methodology doesn’t mean that using it actually produced a positive result.

Pogo-Sticking

First, let’s clarify the terms. A bounce is when a user visits a single page and leaves without taking any action or viewing any other pages; bounce rate is the percentage of sessions that do that.

Pogo-sticking is the act of a user visiting a page and then clicking back to the search results immediately (often clicking another search result). This is often mentioned as a ranking factor by SEO pros despite Google saying otherwise in a video.

It’s not a factor.

It may be used for internal testing, comparing ranking changes against each other, quality control, and other things, but (aside from personalization) it doesn’t appear to be a factor in the core algorithm.

There are also a lot of cases where pogo-sticking is a good thing. I pogo-stick every morning when I search for “Detroit Red Wings” news and bounce back to Google between the several articles I read.

The same goes for any click-based metric. They’re very noisy, often don’t mean what we think they mean, and can be easily manipulated.

This doesn’t mean Google doesn’t use things like pogo-sticking to evaluate two versions of a search results page. But they likely don’t use it at a site or URL level.

Total Amount of Page Content or Word Count

This one is just silly.

Sure, more useful content is better.

More complete content is better. More relevant content is better.

But simply more content? Nope.

Think like a user.

If I’m searching for the area code in Detroit, would I want the page that just says “Detroit’s area code is 313” or the one that builds up to the answer with 3,000 words of elegant prose?

If you were wondering, frequency of content updates isn’t a factor (in non-news search) either.

If I’m searching for a chicken soup recipe, I don’t need Grandma’s life story – just tell me what I need and how to make it.

Unlinked Mentions

This is a case of SEO pros slightly misunderstanding some Google comments.

Google has told us they don’t treat unlinked mentions as links. Eric and Mark even did a test that showed no improvement in rankings.

What’s likely happening here is that unlinked mentions are used for the knowledge graph and determining entities, but not directly for ranking.

Does the knowledge graph influence rankings? Likely yes, in many indirect ways. But if so, it’s the knowledge graph itself we should list as the factor, not the things that may partially feed into it.

Direct Website Visits, Time on Site, Bounce Rate & GA Usage

None of these are factors.

According to W3Techs, only 54% of websites use Google Analytics. Most big brands and Fortune 500 sites use Adobe Analytics instead. Chrome only has a 45-60% market share, depending on which source you look at.

In other words, there’s no reliable way for Google to get these metrics for more than half of the web.

Big brands are dominating rankings and Google doesn’t have their analytics data. Even if they did, it’s way too noisy of a signal.

For many sites, a high bounce rate is fine. Take a weather site: most users only look up the weather in one location, so a bounce is normal.

For other sites, a low time on site is good, too. Take Google itself: its goal is to get you off the search results page and onto a result as quickly as possible.

If you don’t believe me, here’s Gary saying it in 2017.

AMP

Not a ranking factor. Page speed is a ranking factor, but AMP is not the same thing as page speed.

For many queries, page speed itself is just a minor ranking factor. There is no scenario where Google is going to rank a faster page ahead of a more relevant page.

You won’t find a user saying, “I know I searched for Pepsi, but this Coke page is so much faster…”

Does AMP improve page speed? Yes, it does. But speed is still the ranking factor, not AMP.

(Note: AMP is required for the Top Stories carousel, and that carousel does show up above the regular results, but that’s a search feature, not part of the ranking algorithm, so it doesn’t count.)

LSI Keywords

This is one of those misinformation trends in SEO that keeps popping up every once in a while. All it means is that the person saying it has no understanding of LSI at all.

Seriously, the L stands for latent, which means hidden or not directly observable. Latent semantic indexing was designed to surface relationships that aren’t explicitly stated in the text, which contradicts how most SEO professionals go on to use this phrase.

Here’s a relevant post that explains it way better than I can.

TF-IDF Keywords

Again, using this term just tells the rest of the community that the person saying it lacks computer science knowledge. TF-IDF is a concept from information retrieval, but it’s not really used in ranking.

Besides, there are way better ways of doing stuff right now than using TF-IDF. It doesn’t work nearly as well as modern methods, and it’s not really about ranking at all.

When it comes to analysis, TF-IDF isn’t something that you as a webmaster can do at a page level. It depends on the corpus of results in the index.

Not only would you need all the other relevant documents, but you’d need the non-relevant ones to compare them to, as well.

You can’t realistically scrape the search results (relevant ones only) and then apply TF-IDF and expect to learn much. You’re missing the other half of the required data for the calculation.

Here’s a very simple primer. If you want to learn more, pick up an information retrieval textbook and read about these concepts.

I recommend “Information Retrieval” by Stefan Büttcher, who works at Google.
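
Until then, here’s a minimal sketch of the classic TF-IDF calculation (the toy corpus and example terms below are made up purely for illustration). The point to notice is that the IDF half of the formula needs a document count over the whole collection, which is exactly the data you can’t get by scraping the top results.

```python
import math
from collections import Counter

# A toy corpus standing in for "every document in the index" (made up for illustration).
corpus = [
    "detroit area code 313",
    "detroit red wings news and scores",
    "chicken soup recipe with a long family story",
]

def tf_idf(term, doc, corpus):
    """Classic TF-IDF: how often a term appears in one document, weighted by
    how rare that term is across the entire collection."""
    words = doc.split()
    tf = Counter(words)[term] / len(words)
    docs_with_term = sum(1 for d in corpus if term in d.split())
    # The IDF half needs a count over the WHOLE corpus; scraping only the
    # top search results can't give you this denominator.
    idf = math.log(len(corpus) / (1 + docs_with_term))
    return tf * idf

print(tf_idf("detroit", corpus[0], corpus))  # appears in most docs, weight driven toward 0
print(tf_idf("313", corpus[0], corpus))      # rarer term, noticeably higher weight
```

The formula itself isn’t the interesting part; the denominator is. It comes from the whole index, and only the search engine has that.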

Quality Raters & E-A-T

Quality raters don’t affect your site at all. They aren’t rating your site in any way that’s used by the algorithm.

They help rate algorithm changes against one another and create (for lack of a better term) training sets of data.

Basically, some algorithm changes that Google makes will go to the quality raters first to see if they really achieved what they wanted to achieve. They’ll do something like look at two search results pages and “rate” which one is better for that query.

If it passes, they’ll consider putting the change live.

I know, I used to be a quality rater some years ago. Nothing in my job duties had me affecting the rankings of individual websites.

Also, just because something is in the quality rater guidelines doesn’t mean that it’s a ranking factor. The quality rater guidelines are a simplified way of explaining in plain English what all the actual factors are trying to measure.

A good example is E-A-T. Google has said there’s no such thing as an E-A-T score.

E-A-T is just a conceptual model for humans to explain what the algorithm is trying to emulate.

(If you want my opinion, E-A-T is still mostly measured by PageRank, but that’s another post.)

XML Sitemaps

My pet peeve is seeing “no XML sitemap” on every SEO audit I come across. Seriously, I just wrote about it.

XML sitemaps have nothing to do with ranking. At all. They are a method by which Google will discover your pages — but if Google is already indexing all of your pages, adding an XML sitemap will do nothing.

Not every site needs one. It won’t hurt, but if you have a great taxonomy and codebase it won’t help, either.

They’re kind of a band-aid for sites that have crawl issues.

Also, if you really want to go down this rabbit hole, here’s John Mueller saying that HTML sitemaps aren’t a ranking factor.

Should you still do an XML sitemap?

Probably. There are lots of non-ranking benefits for doing it – including more data available in Search Console.
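
If you do add one, the format itself is trivial. A minimal sitemap is just a list of URLs following the sitemaps.org protocol; the URL and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/some-page/</loc>
    <lastmod>2023-01-15</lastmod>
  </url>
  <!-- repeat <url> entries for every page you want Google to discover -->
</urlset>
```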

Accessibility

Is accessibility important? Yes, it is.

Is there a flag in the search algorithm to say whether a site is accessible? No, there isn’t.

Currently, accessibility is not a ranking factor.

Several things that accessibility requires, such as alt attributes and proper heading usage, are ranking factors. But search engines are looking at those individual signals, not at whether your page passes an accessibility audit.
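
To make the overlap concrete, here’s a minimal markup sketch (the heading text, file name, and alt text are invented) where the same elements serve both accessibility and the signals mentioned above:

```html
<!-- A logical heading hierarchy helps screen readers navigate and search engines understand structure -->
<h1>Chicken Soup Recipe</h1>
<h2>Ingredients</h2>

<!-- Alt text describes the image for assistive technology and for image understanding -->
<img src="chicken-soup.jpg" alt="Bowl of chicken soup with noodles and carrots">
```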

That doesn’t mean you shouldn’t make your page accessible, though. Not doing so is a good way to get sued.

(Note: I predict a world where search engines will eventually pay attention to accessibility so that when users with assistive devices do a search, they get back only results that will work for them. But we aren’t there yet. This could be a fun 10% project for some Googlers.)

Content Accuracy

Google and Bing want to display accurate content, but that’s a really hard problem to solve.

Google and Bing know less about what’s accurate and more about what the consensus of the web says. The web isn’t always right.

More importantly, though, the engines are trying to match query intent and use other signals (cough, cough, links!) to gauge authority.

The focus right now isn’t on whether the data is right or wrong (as this is very hard to do). It’s more on whether or not the site is showing it is authoritative and reputable. Here’s Danny Sullivan saying just as much.

Since search engines only see what the majority of people say, they aren’t really measuring “correctness” but popularity or web consensus. It’s why we see wrong information in the knowledge graph all the time.

It’s also sort of how Google Translate works, and it’s why we see some gender bias and other issues appear in there. Unfortunately, that’s how the majority of the text on the internet is written.

Social Signals

As far back as 2010, Matt Cutts told us that Google doesn’t use social signals. (Except for that period when they actually used their own Google+ signals.)

Google isn’t using friend counts, follower counts, or any metrics that are specific to social networks.

They can’t.

Most social networks block them from crawling. Many users set their profiles to private. They simply can’t access much of that data.

But assume they could. What would happen if they were using it and Twitter suddenly put up a robots.txt blocking them? The rankings would drastically change overnight.
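
For context, cutting a crawler off is a two-line change. A robots.txt like this hypothetical one would block Google from the entire site:

```
User-agent: Googlebot
Disallow: /
```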

Google doesn’t want that. They’re all about making things robust and scalable.

Having said that, though, they do crawl social networks when and where they can – but they likely treat them just like any other page on the internet.

So if you have a high-PageRank social page that has links on it, those will count as links and some of that authority may pass.

I’ve always joked that I want to create a search engine that uses only social signals. But imagine how awful it would be to search for sensitive medical information and get back a bunch of memes making fun of the condition.

For many topics, the way people share stuff on social is not the way people search.

Just imagine what a search engine that only looked at social shares would show for your most (or least) favorite politician and you’ll see why social signals aren’t the best inputs for Google to use.

Subdomains or Subdirectories

Google doesn’t care.

There may have been a time when they did. But search engines have gotten way better at determining whether you’re using a subdomain as a separate site or as a part of your main site and treating it as such.

When it comes down to subdomains vs directories, it’s all about how you use them and how you interlink them to everything else, not the actual domain or directory itself.

Yes, I know you’ve seen a ton of studies out there that say moving from one to the other caused a dip. However, in every one of those studies they didn’t just do a move – they changed the navigation, UX, and linking structure, too.

Of course, removing a ton of links to subpages and replacing them with one link to a subdirectory will have an effect on your SEO. But that’s all because of links and PageRank, not the actual URL structure.

Summary

I hope this helps clear up a lot of the confusion around these specific factors.

Whenever we debate whether something is or isn’t a factor, I like to think about how I’d code or scale it.

Often, just doing that mental exercise can show me all the problems with using it.

I honestly believe that Google and Bing are not lying to us when they tell us this thing or that is not a ranking factor.

Sometimes they are intentionally ambiguous in their answers, and they do choose their words carefully.

But I don’t think they lie to us.


Ryan Jones
VP, SEO at Razorfish and founder of SERPrecon.com