How to fix what is broken and not break what is not
This post is part of a series. See Part 1, Part 2, and Part 3.
What to Make of It
Some results should cause the link to be ignored completely and even reflect poorly on the webmaster's trust; others should get full voting power, or even extra power for an ideal rating such as B=0, S=0, E=10. Everything in between depends on how much the information is trusted and on other strategic decisions made by the search engine.
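To make this concrete, here is a minimal sketch of what such a weighting function could look like, assuming the B, S, and E ratings from the earlier parts of this series are values from 0 to 10 and the engine keeps a trust score between 0 and 1 per site. The threshold, the formula, and the bonus factor are all invented for illustration; they only demonstrate the idea of ignoring distrusted links, granting extra power to the ideal rating, and scaling everything in between.

```python
def link_voting_power(b, s, e, trust=1.0):
    """Hypothetical voting power for a link carrying B/S/E ratings
    (each assumed 0-10) and a 0..1 trust score the engine assigns
    to the linking site. Purely illustrative numbers throughout."""
    if trust < 0.2:                 # distrusted source: ignore the link entirely
        return 0.0
    # High E and low B/S push the base score toward 1.0
    base = (e / 10.0) * (1 - b / 10.0) * (1 - s / 10.0)
    if (b, s, e) == (0, 0, 10):     # the ideal rating earns extra voting power
        base *= 1.25
    return base * trust
```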
These attributions are on-site factors and easy for the webmaster to manipulate. However, Google also has information from other sites that makes a statement about the page: not from the site itself, but from all the other sites that refer to it. It should be possible to compare what the webmaster says about his own site with what other webmasters say and with what Google already knows about both sites. Google supposedly knows[i] a lot about detecting linking schemes and other artificial link networks.
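A rough sketch of such a cross-check might look like the following. Everything in it is hypothetical: the declared value, the signal aggregated from referring sites, and the reward and penalty step sizes.

```python
def cross_check_trust(declared_commercial, inbound_commercial_ratio, trust):
    """Hypothetical cross-check of on-site claims against off-site signals.
    `declared_commercial` is what the webmaster says about his own page (0..1),
    `inbound_commercial_ratio` is the share of referring pages the engine
    considers commercial, and `trust` is the site's current 0..1 trust score.
    All thresholds and step sizes are invented for illustration."""
    gap = abs(declared_commercial - inbound_commercial_ratio)
    if gap < 0.2:        # the claim matches what other sites suggest: reward
        trust = min(1.0, trust + 0.05)
    elif gap > 0.6:      # the claim contradicts the off-site picture: penalize
        trust = max(0.0, trust - 0.20)
    return trust
```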
Spam on .EDU sites would become much easier to detect, unless universities start linking en masse to e-commerce sites rather than to other educational content. A review site, on the other hand, naturally links to e-commerce sites with the intention that the visitor buys the product if he likes the review. Those are just two examples of how these attributes could help detect spam.
The attributes I presented are only an example. I used them to demonstrate how webmasters could express and communicate important information to the search engines. Similar or even completely different attributes might be more useful for achieving the same thing in the end.
Conclusion
Not everybody will adopt these or similar attributes, especially not right away. For those sites, Google would have to set the values itself, based on factors it already knows and on comparisons with similar sites that did provide attributes.
The next step is to match all of this with the intention of the user who performs the search.
Yahoo! and Microsoft are both experimenting with matching user intention to search results, and I am sure that Google is working on this problem as well.
Yahoo! Mindset[ii], for example, lets the user express the intent behind a search on a sliding scale from 1 to 10, where 1 stands for shopping, 10 for researching, and 2 through 9 for everything in between, putting more or less weight on one or the other.
Microsoft has, at its MS adCenter Labs, a search feature called “Detecting Online Commercial Intention”[iii].
It accepts two kinds of input: either a query of keywords or key phrases, or a website URL.
The tool returns a number between 0.0001 and 1.00000. The closer the number is to 1.00000, the higher the determined probability that the query or webpage has commercial intent.
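The toy function below mimics only the interface of such a detector, not its internals. The real tool uses a trained model; the keyword list and word-ratio scoring here are invented for illustration, with the output clamped to the same 0.0001 to 1.00000 range described above.

```python
# Invented keyword list; a real detector would use a trained model.
COMMERCIAL_TERMS = {"buy", "price", "cheap", "discount", "order", "shop"}

def commercial_intent(query):
    """Toy stand-in for a detector like “Detecting Online Commercial
    Intention”: returns a probability-like score near 1.0 for shopping
    queries and near 0.0001 for research queries."""
    words = query.lower().split()
    if not words:
        return 0.0001
    hits = sum(1 for w in words if w in COMMERCIAL_TERMS)
    return max(0.0001, min(1.0, hits / len(words)))

# commercial_intent("buy cheap shoes")   -> ~0.67 (mostly commercial)
# commercial_intent("history of rome")   -> 0.0001 (research)
```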
Tools like these are steps in the right direction. Give users what they want, and let honest webmasters help you match them with their sites and weed out spam.
If the user wants to “visit Disneyland”, show reviews and sites where he can book the trip. If the user wants to “buy porn”, show him a site where he can buy porn. For Google to improve on this, webmasters must be honest and earn more of Google's trust over time; the benefits they get from that trust will motivate them to do it.
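As a hypothetical sketch of that matching step, a ranking signal could simply reward agreement between the commercial intent of the query and the commercial intent of the page, with both scores assumed to come from detectors like the one sketched above.

```python
def intent_match_boost(query_intent, page_intent):
    """Hypothetical rank boost in 0..1: highest when the commercial
    intent of the query (e.g. “buy porn” near 1.0) matches the
    commercial intent of the page, lowest when a research query
    lands on a pure storefront or vice versa."""
    return 1.0 - abs(query_intent - page_intent)
```

Under this sketch, a mixed query like “visit Disneyland” would favor pages that combine reviews with booking over pure storefronts, while a clearly commercial query would push shops to the top.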
Carsten Cumbrowski
Cumbrowski.com – Internet Marketing Resources
[i] Rustybrick (November 17, 2005), “Google Knows Link Networks Well”, Search Engine Roundtable
[ii] Yahoo! Mindset, experimental search, Yahoo!
[iii] “Detecting Online Commercial Intention”, MS adCenter Labs; determines the probable commercial intention behind a query or URL