Google is constantly refining its search engine, making one or more changes to its search algorithm every day. The company's objective is simple: to provide users with the most relevant, highest-quality search results possible. To better achieve this, Google will be paying closer attention to sites attempting to exploit the system, starting in 2011.
As reported over at Search Engine Land, Google's Matt Cutts posted on Twitter that the search giant would be stepping up its anti-cloaking efforts.
For those unfamiliar with cloaking, it's the practice of serving search engine robots a different page or set of content than users actually see when visiting the same URL. While it's done for many reasons, including outright deceiving search engines about the type of content on a page, the most common motive is to maintain one version of a page optimized for crawlers and another tuned for human visitors. Webmasters have tried to cloak via redirects, invisible content, and other tricks for years, and it's one of the most explicitly prohibited practices on Google's blacklist, quickly resulting in removal from the Google index.
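To make the mechanism concrete, here is a minimal sketch of the user-agent sniffing that cloaking typically relies on. This is purely illustrative, not taken from any real site: the function name, bot signatures, and page strings are all hypothetical, and real cloakers (and Google's detection) use far more signals than the User-Agent header.

```python
def serve_page(user_agent: str) -> str:
    """Hypothetical cloaking sketch: return different content for
    crawlers than for ordinary visitors, based on the User-Agent."""
    # Illustrative bot signatures; real crawlers also verify via reverse DNS
    bot_signatures = ("googlebot", "bingbot")
    if any(sig in user_agent.lower() for sig in bot_signatures):
        # Keyword-optimized version shown only to search engine robots
        return "crawler-optimized page"
    # Version shown to human visitors
    return "visitor-facing page"


# A crawler and a browser requesting the same URL get different content:
crawler_view = serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)")
browser_view = serve_page("Mozilla/5.0 (Windows NT 6.1) Chrome/8.0")
```

Because the two requests hit the same URL but receive different pages, this is exactly the header-based discrepancy Google's detection aims to catch.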
Apparently, some sites are still getting away with it, but that won't continue for long. Cutts' tweet stated that Google would be focusing more on cloaking during the first quarter, warning that "Not just page content matters; avoid different headers/redirects to Googlebot instead of users."
So, whether you're sending Google's bots to a different page for minor search engine optimization tweaks or for shadier reasons, your time is quickly running out. Sites that haven't removed cloaking by the time these new detection methods roll out will likely be blacklisted from the SERPs entirely.