Both Google and Bing told an audience at South by Southwest (SXSW) that they are actively working on an over-optimization penalty in order to make their ranking algorithms better. Naturally, this raised eyebrows among many people who wondered whether it might translate into penalties for their websites that are being “optimized” by SEOs.
While no official details have been provided beyond the few comments made during the SXSW session, there is a lot of speculation as to just what this might mean and how it would affect sites that are being optimized. We can glean some answers from the comments themselves, but I wouldn’t expect much more clarification beyond what is specifically designed to frighten SEOs and Web marketers into greater compliance with their webmaster guidelines.
Then again, following such guidelines is generally good practice for building a user-friendly website. Search engines, after all, just want to be treated like any other visitors (except that they don’t place orders for your products or services). Just for fun, let’s look at some “what if?” scenarios. These are all possibilities, and, even if there is little chance of them being worked into the algorithm, they might give you some good ideas about what you can do to better your online marketing efforts.
Over-Optimization What-Ifs
What if Google penalized “optimization signals” even if they were beneficial to the site visitor?
I admit this was one of the first things that came to my mind. When you say “over-optimization penalty,” it makes it sound as if the search engines want to penalize optimization, period. Sure, they don’t want to throw the baby out with the bath water, but it makes you wonder if they want to give unoptimized sites a way to compete with the optimized sites. They could do this by looking at specific things that SEOs typically do that “normal” sites don’t. What could those things be?
- Always placing the keyword at the front of the title tag
- Always placing the keyword at the front of the H1 tag
- Always using keywords in the heading tags
- Using keywords consistently in navigation and content links
- Too many instances of a single phrase OR variations of that phrase
Are any of these possible? Sure. Likely? Probably not.
Some of these are already part of the algorithm, such as too many keywords on the page, though I find it curious that this was specifically mentioned. This is something (good) SEOs have been preaching against for years. All in all, I’d say most sites that are optimized in ways that make the site better for visitors are not going to be negatively affected – unless they cross the line in other areas.
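To make the “too many instances of a single phrase” idea concrete, here is a back-of-the-envelope keyword-density check in Python. Everything here is illustrative: the 2% alarm threshold is invented, and nothing like this has been published by Google.

```python
import re

def keyword_density(text, phrase):
    """Fraction of the words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Count consecutive-word matches of the phrase.
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return (hits * n) / len(words)

def looks_stuffed(text, phrase, threshold=0.02):
    """Flag pages where the phrase dominates the copy.
    The 2% threshold is a made-up illustration, not a known value."""
    return keyword_density(text, phrase) > threshold
```

A page that repeats “blue shoes” in every other sentence would trip this check, while normal copy that mentions the phrase a handful of times would not.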
What if Google started penalizing too many links?
This was also mentioned, and it raises the question: what does it mean to have “too many” links? Perhaps only Google knows the answer.
It may be as simple as comparing your link count with that of your competitors and seeing if it’s significantly out of proportion. In some cases, this can be a good indicator that the site is manipulating links to get an advantage in rankings. However, sites can also get a number of quick, legitimate links through (positive or negative) PR, contests, viral videos, social bursts and more. I would bet that this will be a part of the over-optimization algorithm, but it would need to be used in conjunction with other factors.
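If that comparison were implemented literally, the simplest version would be a ratio test against the competitive norm. Here is a minimal Python sketch; the 5x threshold and the use of a median baseline are my own invented assumptions for illustration.

```python
from statistics import median

def link_count_outlier(site_links, competitor_links, max_ratio=5.0):
    """Flag a site whose backlink count dwarfs the competitive baseline.

    `competitor_links` is a list of backlink counts for comparable sites;
    `max_ratio` is an invented illustration threshold, not a known value.
    """
    baseline = median(competitor_links)
    return baseline > 0 and site_links / baseline > max_ratio
```

As the post notes, a flag like this could not stand alone: a viral video or a PR burst produces the same spike, so it would have to be combined with other signals before any penalty applied.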
What if Google started looking for a proper “balance” of link text?
This is something that could easily be considered and has been talked about by SEOs already. It’s possible that “natural” sites have a “typical” ratio of keyword-rich link text pointed at their pages vs. some generic link text. SEO’d sites, on the other hand, may have a greater percentage of keyword-linked text that can set off some alarms. The search engines might decide to either devalue many of those links or put an outright penalty on the site for “optimized” linking practices.
If this “what-if” becomes part of the algorithm, it means that link building efforts will go on but with less keyword optimization. It’s possible that this could be used in conjunction with the “too many links what-if” above. Together, they might give the search engine a good idea that the site is gaming the system.
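The anchor-text “balance” idea also reduces to a simple ratio. The sketch below, in Python, counts what share of inbound anchors are exact keyword matches versus generic text (“click here,” the brand name, a bare URL). The 80% alarm line is made up purely for illustration.

```python
def anchor_text_ratio(anchors, keywords):
    """Share of inbound link anchors that exactly match a target keyword."""
    if not anchors:
        return 0.0
    keyword_set = {k.lower() for k in keywords}
    optimized = sum(1 for a in anchors if a.lower().strip() in keyword_set)
    return optimized / len(anchors)

def unnatural_anchor_profile(anchors, keywords, threshold=0.8):
    """A made-up alarm line: most natural profiles are dominated by
    brand and generic anchors, not exact-match keywords."""
    return anchor_text_ratio(anchors, keywords) > threshold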
What if Google matched link growth to social engagement growth?
If the search engine sees links growing consistently, but the social engagement is low or non-existent, it can be a signal that the links are manipulated rather than natural. I think it would take a pretty aggressive link campaign for this type of thing to be clearly noticeable to the search engines, but it’s certainly possible. The idea would be to match link growth with social growth. Anything that is disproportionate would trigger an alarm.
In truth, this could work both ways. If a site is getting a lot of social “engagement” but very few links, it might tell the engines that the social sphere is being manipulated, as well. Either way, you would want your site links and social engagement to grow consistently (even if not directly proportionately). I would think a site that is naturally interesting to visitors would grow in both areas at the same time.
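Since the post argues the mismatch could cut both ways, a sketch of the signal would compare the two growth rates symmetrically. In this Python illustration, both the 10x tolerance and the choice of a simple ratio are my own assumptions, not anything the search engines have described.

```python
def growth_mismatch(link_growth, social_growth, max_ratio=10.0):
    """Flag when new-link growth far outpaces social-engagement growth,
    or vice versa, over the same period. Thresholds are illustrative.

    `link_growth` and `social_growth` are counts of new links and new
    social interactions in the period (hypothetical inputs).
    """
    if social_growth <= 0:
        return link_growth > 0  # links with zero social echo look suspect
    ratio = link_growth / social_growth
    return ratio > max_ratio or ratio < 1.0 / max_ratio
```

A site gaining 500 links a month with three shares would be flagged; one gaining links and shares at roughly the same pace would not, which matches the post’s point that the two should grow consistently even if not proportionately.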
What if Google measured social engagement rather than just retweets and shares?
This would be an addendum to the “what-if” above. Rather than just monitoring social growth, the search engine would also be looking at true social engagement. We can all ask our friends to tweet or retweet our links; however, how many of them are doing nothing more than passing the information on? Are they commenting? Are they talking back and forth? Or are they just sharing and forgetting about it?
True social engagement could be a factor in how much people are really interested in what you are offering. Social media without engagement is like speaking into a megaphone in an empty desert. It doesn’t matter what you say or how many times it’s re-broadcast; if no one is interacting or even caring, it’s not truly social.
What if Google started measuring traffic through every link or social share before allowing it to influence rankings?
This is my favorite “what if.”
Google discovers thousands of links every day. If you’re building links, the idea is to get Google to find each link and filter that into the algorithmic valuing of your site. But what if a link wasn’t a link until it was actually clicked and followed? And what if the value of that link increased as it continued to get clicked? That would mean you can get links from articles, directories, comments, blog posts and whatever else, but none of it would matter unless someone is clicking the link to your site. And, the more clicks that link gets, the more value it transfers.
The hurdle to implementing such an algorithmic signal is that Google has to be able to know when every link is clicked. And it can only know that if you have Google Analytics installed on your site. Not every site does. But it’s possible that they can use the data they have to surmise if other links on similar sites and pages are delivering traffic to your competitors. It’s a bit far-fetched, but it would certainly make for an interesting algorithm. Google would somehow have to measure site engagement time after the click to prevent SEOs from manipulating that, as well.
This factor could easily apply to every link and every social share on the Web. Bottom line: no traffic, no value!
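The click-weighted link idea above can be sketched in a few lines. In this hypothetical Python model, a link is worth nothing until clicked, and additional clicks add value with diminishing returns (a log scale), so a click farm can’t inflate a link without bound. The formula and numbers are invented for illustration only.

```python
import math

def link_value(clicks, base_value=1.0):
    """Hypothetical click-weighted link value: zero until the link is
    actually followed, then growing logarithmically with click count."""
    if clicks <= 0:
        return 0.0  # "no traffic, no value"
    return base_value * math.log1p(clicks)
```

Under this model a directory link no one ever clicks contributes nothing, while a link that steadily drives visitors keeps gaining weight, which is exactly the behavior the what-if describes.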
What if Google really didn’t mean it as it sounded?
This rates the highest on the probability scale. Google wasn’t talking to SEOs; they were talking to regular business owners. The “over-optimization” penalty may be nothing more than what Google has been doing for years – making their algorithm better by looking at certain signals more than others and adding or removing signals as needed. It’s entirely possible that Google is just now catching up to some of the things that SEOs have been preaching for years on how to build good websites that people and search engines love.
But speculation is fun (if not a bit scary), no?