
#pubcon Matt Cutts and Amit Singhal Answer Questions and Offer Advice



Why is ranking data not available in Analytics?

Cutts: Over 96% of sites get all of their searches within the 1,000-query limitation. Covering the remaining 4% of sites would require two to three times more data storage.

Due to the Panda update, lower quality sites are outranking an authority site. Why?

Singhal: Google’s preference is always algorithmic – it is scalable across all sites, countries, and languages. Overall, the Panda update has been a very positive change – the scientific measurements say the Google user experience is better than it used to be. However, they understand that no algorithm is perfect and want people to submit reports of instances like this so they can improve the algorithm.

Cutts: Google is listening. Unfortunately, the changes take time to implement. They use the aggregated reports to try to improve the algorithm. The algorithm is under active development and they want to get it right.

When we search for appliances, why do we only get Sears and other major stores?

Cutts: The web is one of the only places where a small business can move faster than the big guys. The big companies are often big for a reason, and as a result they can outrank other pages. However, search does give the small business a chance. Google Webmaster Tools is somewhat of an equalizer, and small businesses should use it – for example, big businesses are more likely to bury text in images or Flash, while a small business can know better. Small businesses should also concentrate on their small niche.

Are they trying to make the algorithm so perfect that they are missing the user experience?

Singhal: All scientific measures and manual reviews indicate that the algorithm is getting better and that search quality is improving (improving search quality = more relevant, higher quality results).

A Google Places page got shut down by a competitor – is there a better process to stop this type of behavior?

Cutts: The web used to be the “wild west,” and there is still a small element of this, especially in local. The local area is changing fast, and a combination of manual spam fighters and algorithmic changes will get this under control. They are open to ideas on how to prevent malicious deletions of other businesses and are working on this.

Where is the balance between privacy and data with SSL encryption?

Cutts: The trend is that search is becoming more personal, and this should continue, which means SSL is important to Google. People are unhappy that they have lost some of their keyword data. However, if you download your data from Google Webmaster Tools, 96% of people can still see all of their keyword data. They will not back down on SSL – if anything, they may move further in this direction, and advertisers may not get the data in the future. People want to know that they are not being snooped on.

Are PRWeb and press releases considered black hat due to duplicate content?

Cutts: Press releases amount to going to other people and asking them to write about you. Instead, work hard to produce high-quality content on your site and people will want to write about you. It is harder to fake natural than to be natural.

Singhal: The content must be high quality and useful from a reader’s perspective. If the content is high quality and you work hard for the users, it is OK.

If I do doorway pages will the whole site get penalized or just the doorway pages?

Cutts: Are you asking how to do doorway pages (incredulously)?! There is an answer, though – it depends on the amount of spam. If there is a huge amount of great content, they will probably only penalize the portion of the site that is using doorway pages. However, if the site is mostly doorway pages, the penalty is more likely to hit the whole site.

Singhal: Don’t do it, man.

Everyone says I need more links. How do links improve the quality of the site? I don’t want to play this game and I don’t want to do this.

Cutts: What matters is the bottom line. Links are a part of search – they represent online reputation. Although there are many tools that report links, none of them can tell you which links are trusted by Google (not even Google’s own tools). While the link structure can look bad from the outside, the actual link graph that Google uses and trusts looks much better. When the New York Times complained about a site with 10,000 spammy links, Google investigated the site and found that not a single one of those links had slipped through Google’s filters. Only the links Google trusts count.

Is Google going to give more data to webmasters?

Google can either give more data (e.g. 2,000 queries instead of 1,000) or give a longer timeframe (e.g. 60 days). They are leaning toward more data, figuring that people can simply download their data periodically and still have access to past data. An informal survey of the audience disagreed – 60% want the longer timeframe and 40% want more queries.
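The workaround mentioned here – downloading the report periodically so you keep your own history – is easy to automate once the export exists. Below is a minimal Python sketch of that idea; it assumes you have already exported a file named top_queries.csv by hand, and the folder and file names are hypothetical, not anything Google provides.

```python
# Minimal sketch (assumptions: "top_queries.csv" is a keyword report you exported
# yourself; the archive folder and file names are hypothetical, not a Google layout).
import csv
import shutil
from datetime import date
from pathlib import Path

EXPORT_FILE = Path("top_queries.csv")   # the CSV you downloaded manually (assumed name)
ARCHIVE_DIR = Path("query_archive")     # local folder that accumulates dated snapshots


def archive_export() -> Path:
    """Copy today's export into the archive under a dated filename."""
    if not EXPORT_FILE.exists():
        raise FileNotFoundError(f"Export the report first: {EXPORT_FILE}")
    ARCHIVE_DIR.mkdir(exist_ok=True)
    target = ARCHIVE_DIR / f"top_queries_{date.today().isoformat()}.csv"
    shutil.copy(EXPORT_FILE, target)
    return target


def combined_rows():
    """Yield rows from every archived snapshot, oldest first."""
    for snapshot in sorted(ARCHIVE_DIR.glob("top_queries_*.csv")):
        with snapshot.open(newline="") as fh:
            yield from csv.DictReader(fh)


if __name__ == "__main__":
    print(f"Archived to {archive_export()}")
    print(f"{sum(1 for _ in combined_rows())} rows available across all snapshots")
```

Run after each export and you keep a growing local history even though the online report only covers the most recent window.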
