SEO Link Building Q&A with an Ex-Google Webspam Team Member

A talk with former Google webspam team member Andre Weyher offers an inside look at how Matt Cutts' webspam team operates.

I recently caught up with an ex-member of Google’s webspam team, Andre Weyher. Andre worked directly on Matt Cutts’ team and agreed to offer some valuable insight into how Cutts’ team operates, what they look for with regard to inbound link profiles (and manipulation of them), and how SEOs and webmasters can conform to Google’s webmaster guidelines now and going forward.

What follows is my interview with Mr. Weyher.

1. What was your role on Matt Cutts’ team, and how long were you a part of it? 

The spam team is a pretty large organisation within Google. It consists of many people working towards one goal: keeping the organic search results free of poor-quality sites and penalising the ones that got their ranking through techniques that are against the Google guidelines. It's often confused with the engineering team that's responsible for creating the actual algorithm; these are two separate units within the organisation. It's also not the external reviewers team that you often hear about. Within the spam team, people usually get their own speciality. I was responsible for content quality and backlink profiles. I was with Google for 4.5 years, two of those in Matt Cutts' team.

2. What’s Google’s process for determining when to apply a manual penalty to a website based on its inbound link profile?

Very good question. Of course there are elements to it that are kept very secret internally, but the process is in principle very straightforward. I often see people taking a very strict and mathematical approach to assessing a backlink profile. It's good to do it this way if you are in doubt, but it's also important to use your intuition here. When reviewing a profile, the spam fighter would look at the quality of the pages where the links are hosted and the anchors used in the links. If the profile and anchors are not coherent with what a "natural" profile would look like, action would be taken. Let's take the example of a travel website: if there are 100,000 links coming in and 90,000 of them use an anchor like "cheap flights" or "book flight", it would immediately arouse suspicion, because this would never be the case if the links were natural. The quality of the pages linking in is of critical importance. Is the page authentic, or does it purely exist to host the link?
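
Google's internal tooling is of course not public, but the anchor-text check Weyher describes can be approximated with a very simple heuristic: measure how concentrated a site's inbound anchor text is. The sketch below assumes you already have a list of anchor strings from a backlink export; the function name and the 50% threshold are purely illustrative, not anything Google has published.

```python
from collections import Counter

def anchor_concentration(anchors):
    """Return the most common anchor text and its share of all inbound anchors."""
    counts = Counter(a.strip().lower() for a in anchors)
    top_anchor, top_count = counts.most_common(1)[0]
    return top_anchor, top_count / sum(counts.values())

# Toy version of the "cheap flights" example above.
anchors = ["cheap flights"] * 90_000 + ["travelsite.example"] * 10_000
anchor, share = anchor_concentration(anchors)
if share > 0.5:  # threshold chosen only for illustration
    print(f"Suspiciously concentrated anchor text: '{anchor}' ({share:.0%})")
```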

3. How does Google’s Penguin algorithm determine what domains to penalize? 

First of all, it's important to stress that being affected, or as people commonly say, "slapped" by Penguin, is not the same as a penalty. It's just a new, unfortunately disadvantageous, ranking. A penalty is more severe. Penguin is a huge and very complicated update, and there are very few people who know it in its entirety. It is safe to say that it has been specifically designed to combat the most commonly used blackhat SEO techniques. A lot of this would have been handled manually before Penguin. Now it's all automated, so it has become even more difficult for spammers to get away with things that worked not very long ago. The most obvious element it focuses on is ranking due to a large number of bad-quality backlinks, but it also takes into account spammy on-page techniques like keyword stuffing and over-optimization of tags and internal links.
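
Penguin's actual signals are not public, but the on-page issue mentioned above, keyword stuffing, comes down to something you can sanity-check yourself: how often a target keyword appears relative to the rest of the copy. The sketch below is only a rough illustration; the regex and the notion of "density" are simplifications of my own, not Penguin's logic.

```python
import re

def keyword_density(text, keyword):
    """Share of words in the text that exactly match the keyword (a crude stuffing check)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

page_text = "Cheap flights! Book cheap flights today. Cheap flights to anywhere."
print(f"{keyword_density(page_text, 'flights'):.0%}")  # unusually high values hint at stuffing
```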

4. How does Google spot blog networks and/or bad neighborhoods?

Search engines rely on website fingerprinting to identify clusters of ownership. If a particular website is relying on techniques that are not abiding the guidelines, it’s likely that the other sites owned by the same person are doing the same. They have very advanced techniques to figure out all the sites within one neighbourhood and would often penalise it entirely if similar techniques are found everywhere. Unfortunately I cant go into deeper detail about the tools they use but they are very advanced and can sniff out anything!
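
The fingerprinting tools themselves are proprietary, but the underlying idea, grouping sites that share tell-tale footprints such as analytics account IDs or hosting IPs, can be illustrated in a few lines of code. The domains and footprints below are invented for the example.

```python
from collections import defaultdict

# Hypothetical crawl output: footprints (analytics IDs, hosting IPs, ...) seen on each site.
sites = {
    "blog-one.example":  {"UA-1111-1", "ip:203.0.113.7"},
    "blog-two.example":  {"UA-1111-1", "ip:198.51.100.4"},
    "unrelated.example": {"UA-9999-9", "ip:192.0.2.55"},
}

# Invert the mapping: which sites share a given footprint?
by_footprint = defaultdict(set)
for site, footprints in sites.items():
    for fp in footprints:
        by_footprint[fp].add(site)

# Any footprint seen on more than one site ties those sites into one "neighbourhood".
for fp, owners in sorted(by_footprint.items()):
    if len(owners) > 1:
        print(f"{fp} is shared by: {sorted(owners)}")
```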

5. What’s the best way to recover a website that has been sent a notification via Google Webmaster Tools of manual spam action? 

That very much depends on the type of penalty that has been applied. There are two scenarios here: one regarding the quality of the content on the page itself, the second regarding the links coming in to it. In the first case it's "merely" a question of adding value to your site. In most of these cases the penalty would be applied to a site that has affiliate links but does not offer the user any added value apart from clicking away to a third party. If this is the case, the webmaster should focus fully on adding valuable content to the page and showing Google that there are more reasons to use the site than enriching the owner.

In the second case it's a bit tougher. If you have been relying on poor-quality link building, you have to get rid of as many bad links as you can. This used to be a very time-consuming and difficult process, but luckily the new disavow tool in Webmaster Tools has made it much easier. You do have to be very careful with what you choose to disavow! Again, use your intuition here. Don't just cut all the links below a certain PR; a low-PR website is not necessarily bad. The relevance of the website's topic and, above all, its authenticity are much more important than the PR alone.
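
For reference, the disavow tool accepts a plain-text file with one entry per line: either a full URL or a "domain:" rule, with "#" marking comment lines. Below is a minimal sketch of generating such a file; the domains and URL are placeholders, and every entry should be reviewed by hand before it goes into a real disavow file.

```python
# Placeholder examples only; review each link manually before disavowing it.
bad_domains = ["spammy-directory.example", "paid-links.example"]
bad_urls = ["http://low-quality-blog.example/widget-guest-post"]

with open("disavow.txt", "w") as f:
    f.write("# Links I was unable to get removed manually\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")  # disavows every link from this domain
    for url in bad_urls:
        f.write(f"{url}\n")            # disavows a single URL
```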

6. What’s the best way to recover a website affected by Google Penguin? 

This is a bit trickier. In the case of a penalty, it's often clear what needs to be done, but after an automatic demotion due to Penguin, it's in many cases not clear what the reason was. I wish there were an easy, straightforward answer I could give here, but the only thing I can recommend is to take a very critical look at your website and try to figure out what it is that Google saw that was not entirely in line with the guidelines. From what I have seen since I left the team, a lot of webmasters are relying on techniques that they know are risky. After Penguin it's very difficult to get away with this, so my advice would be to stop any grey activity and focus on creating compelling content and leveraging social signals. These have become very important.

7. What are some of the biggest misconceptions or myths you’ve seen about “bad links” and link profile penalties in the SEO community? 

I think I could write a book about this topic! SEO is an unprotected title, and anyone can call himself or herself an SEO. The result is that there are almost as many opinions as there are SEOs. Some of the biggest misconceptions I have seen out there include "directories are altogether bad" and "anything that is below a certain PR is considered spammy by Google". I see a lot of people panicking and cutting off the head to cure the headache due to a lack of knowledge. The most dangerous misconception of all, I would say, is the opinion that if an automated link building scheme is expensive, it must be good. Google has made it very clear that it wants links to be a sign of a real reason to link, an AUTHENTIC vote of confidence if you will. Anything that is paid for is not considered quality by Google, and participating in such schemes puts your site at risk!

8. What do SEOs need to know right now to prepare for future link profile-related algorithm updates? 

It's hard to predict what the future will hold, but you can be sure that Google will become more and more effective at fighting everything it is fighting currently. So if there are still people out there getting away with spammy techniques, it's only a matter of time before Google finds a new way of identifying them and penalizing the ones that do it. I think the best way of preparing yourself for future updates is to build an SEO strategy that depends on smart on-page techniques and internal linking on one side, and relationship-based link building on the other. This means that the links you obtain should come from a source that has a genuine reason to link to your site. The relevance of your linking partner to the topic of your site is the key!

9. You left your job in Google not long ago, what are your plans?

I have fulfilled a long-held dream and moved to Australia! Sydney is an amazing city with a great startup community. I have started my own company here and am very excited about it. It's called Netcomber.com, the first intelligent website fingerprinting service on the net. After you type in a URL, we will show you, based on over 3,000 factors, what other websites are owned or developed by the same person. We're in beta, though we've just finished crawling over 200 million websites and used elements like hosting, account IDs, and even coding style to determine who owns what on the web… exciting times! The new version will be up in a few weeks.

Image Credit: © JJAVA – Fotolia.com

Jayson DeMers, CEO at AudienceBloom

Jayson DeMers is the founder & CEO of EmailAnalytics. You can contact him on LinkedIn or Twitter.