If you’ve been in SEO for more than a few weeks, it’s a safe bet you’ve seen poorly moderated blog comments or forum threads that have been polluted with spammy links. It’s also likely you’ve found evidence of “link hacking,” the unauthorized placement of links on an otherwise quality website via some sort of hacking mechanism. These are forms of link pollution, and much like real-world pollution, they can damage our search environment if left unchecked.
A must-read example of link pollution is documented in an SEOMoz blog post from late January, which shows that blog comment spam, forum spam, and hacked links placed on trusted websites led to spammy Google search results on a variety of competitive terms. This post is one of dozens that illustrate the immense task faced by Google and Bing: not only must they deliver excellent search results, but they must do so while dealing with an ever-increasing amount of link pollution.
While some people propose algorithmic adjustments to counter the effects of link pollution, I think it’s time to put a simpler and more obvious solution on the table: force website owners to take responsibility for allowing this pollution in the first place.
To be blunt, link pollution is often caused by careless or incompetent website management.
- Moderation of blog comments and forum posts has never been easier – there is no excuse for spammy links on blogs and forums, yet this problem doesn’t seem to be going away.
- When we find links that were placed by hackers, we’re reminded that password security is often the culprit. A 2010 password study shows that nearly 50% of Internet users have easily compromised passwords, and it’s a safe bet that many FTP passwords, WordPress passwords, etc. fall into the “easily compromised” category. How else could all these hacked links be explained?
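To make the "easily compromised" point concrete, here is a minimal sketch of the kind of check a CMS or FTP service could run before accepting a password. The `COMMON_PASSWORDS` set below is a tiny hypothetical sample for illustration, not a real breach list, and the length threshold is an assumption:

```python
# Illustrative sketch: screening passwords against commonly compromised
# choices. COMMON_PASSWORDS is a hypothetical sample, not a real list.
COMMON_PASSWORDS = {"123456", "password", "qwerty", "abc123", "letmein"}

def is_easily_compromised(password: str) -> bool:
    """Return True if a password is short or matches a known-common choice."""
    return len(password) < 8 or password.lower() in COMMON_PASSWORDS

print(is_easily_compromised("Password"))     # True: on the common list
print(is_easily_compromised("tr0ub4dor&3"))  # False: passes both checks
```

A real deployment would check against a large breached-password corpus rather than a handful of strings, but even this trivial filter would block many of the passwords the 2010 study flagged.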
While it’s true that even the best security and anti-spam measures can be compromised, it’s also true that these occurrences are relatively rare and – if a site is properly managed – easily corrected. If Google created a system that penalized irresponsible website management, link pollution would be drastically curtailed. Here is what I would propose:
1. Google should only index sites registered with Webmaster Tools. Assuming that all website owners valued Google search traffic, Webmaster Tools registration would be nearly universal. This would give Google a direct connection to website managers.
2. Mandate regular interaction. If Google were to send website managers a monthly “suspicious link report” – and then require the website manager to acknowledge receipt of this report – they would encourage website owners to actively manage the security and quality of their website.
3. Alert website managers to obvious spam links. While it is impossible for Google to detect every instance of link spam, there are certain occasions when a site has obviously been spammed. In these instances, Google should email the website manager immediately and request that the site be fixed ASAP.
4. De-index sites that fail to acknowledge alerts. Once Google has contact information for each and every website owner, there’s no excuse for failing to respond to Google spam link alerts in a timely manner. If a website manager fails to remedy a warning within a certain time frame, the site should be de-indexed for a period of days or weeks. Repeat offenders should be de-indexed for longer and longer periods until they either a) fix the problems or b) give up and go away.
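The "obvious spam" detection in step 3 need not be sophisticated. As a hedged sketch, here is one heuristic a moderation pipeline might apply to incoming comments; the link threshold and the `SPAM_ANCHOR_WORDS` sample are illustrative assumptions, not anything Google has published:

```python
import re

# Hypothetical heuristic for "obvious" comment link spam: flag comments
# that cram in many URLs or use classic spam anchor-text vocabulary.
SPAM_ANCHOR_WORDS = {"cheap", "viagra", "casino", "payday"}  # illustrative sample

def looks_like_link_spam(comment: str, max_links: int = 2) -> bool:
    """Return True if a comment exceeds the link budget or hits spam words."""
    links = re.findall(r"https?://\S+", comment)
    words = set(re.findall(r"[a-z]+", comment.lower()))
    return len(links) > max_links or bool(words & SPAM_ANCHOR_WORDS)

print(looks_like_link_spam(
    "Great post! http://a.com http://b.com http://c.com"))     # True
print(looks_like_link_spam(
    "Thanks, this helped me fix my config."))                  # False
```

Real spam classifiers combine many more signals (link destinations, posting velocity, account age), but the point stands: if a simple filter like this can catch the worst offenders, a site owner who lets them through is not moderating at all.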
While my plan to de-index non-conforming websites might sound draconian, imagine the benefits:
- Webmasters would close security gaps, reduce the incentive for hackers to attack websites, and subsequently make all of our sites a little safer.
- Blogs and forums would improve their moderation systems.
- A reduction in link spam would probably make paid links easier to detect.
- The rewards for ethical link-building would become even greater.
- Social sites like Twitter might do more to reduce the incredible amount of spam generated on their platforms.
- Most importantly, Google would provide consumers with a better search experience.
To those who would argue against my plan on the grounds that universal webmaster registration would make Google too powerful, I would counter that the current “unknown webmaster” free-for-all is itself indefensible. If we can’t drive a car without a driver’s license, why should we expect Google (or any other search engine) to send visitors to our website if we don’t identify ourselves to them first?
Besides, it’s not as if participation in my plan is compulsory. If a website owner doesn’t want to register with Google, they don’t have to…they just won’t be listed in Google search results.
To be clear, I’m not picking on Google. Google’s results are only as good as the websites they index. If tens of thousands of poorly managed websites are compromised with spammy links, it’s unreasonable to expect any search engine to overcome this problem algorithmically. My plan is focused on Google because of their prominence in the marketplace, but Bing could just as easily take action. Perhaps Google and Bing could even collaborate.
Whatever the solution, it’s clear that the current system has a fundamental flaw: there is no penalty levied against websites that are a source of link pollution. While our search environment is resilient enough to deal with some link pollution, regulation is the only way to prevent long-term damage.