In the early 2000s, when Google announced an update, it usually meant a significant change in the search results.
Google’s Florida Update in 2003 was significant. In fact, it was the first major Google algorithm update in what would become a decade filled with huge updates.
This article discusses what the Florida Update probably was and why it still matters today.
Google Florida: The Update That Changed SEO Forever
Google’s Florida update rolled out on November 16, 2003, just before the Christmas shopping season and just before Pubcon Florida in Orlando.
Florida was immediately perceived as a change in how links are calculated at Google.
Many innocent, non-spam sites lost rankings; these sites were labeled “false positives.”
Sadly, those false positives put many small retailers and affiliates out of business overnight.
The blowback was so massive that Google promised to try not to roll out a significant update before the holidays again. (Google kept that unofficial promise until 2011, when it rolled out a flurry of Panda updates in November and December.)
The shake up continued well into 2004. Google’s Matt Cutts actively sought examples of false positives. Many consultants sent example URLs of innocent sites that had been affected.
Rankings didn’t begin to stabilize until sometime in January or February of 2004, after Google had sorted through the false positives.
I’m not sure how much in-house testing Google had done prior to the release of Florida, but in my experience, it felt as if there was little pre-release modeling of how it would affect innocent sites.
What Was Google’s Florida Update?
At the time there were many theories about what Google was doing.
I remember some prominent (black hat at the time) SEO folks speculating that Google was using OCR to identify the “buy” button on ecommerce sites in order to weed them out of informational queries.
But the predominant theory was a general sense that this affected links.
I believe that is what Florida really was. It was a link analysis algorithm.
At the time, nobody, including myself, knew much about how link analysis worked.
Google Discloses Algorithm Clues
At a Search Engine Strategies San Jose session about Googlebot, Marissa Mayer revealed that Google devalued links from irrelevant pages.
This was important because up until then a high PageRank link could help a site rank, regardless of topic.
Was this what the Florida update was about? I didn’t think so at the time.
PubCon 2005 Clues
In 2005, at Pubcon New Orleans, Google engineers revealed that they were using statistical link analysis to weed out spam sites.
It was announced at a super session featuring ten Google engineers. An engineer spoke about statistical analysis, then opened the floor to informal one-on-one discussions.
That was the first I’d heard of statistical analysis, and it was a mind-blowing revelation, even more important than the earlier revelation that Google devalued PageRank from irrelevant sites.
There’s no confirmation that statistical analysis was a part of the Florida Update, but could it have been?
What Was the Specific Algorithm Behind Florida?
Google’s Florida update was a major disruption, far more than a simple devaluation of irrelevant links. Google has never disclosed what Florida was, but in my opinion the obvious candidate is statistical analysis.
What Is Statistical Analysis?
Statistical analysis for links is the process of plotting the characteristics of a webpage or website on a graph. You can tally statistics such as the average number of outbound links per webpage, the percentage of outbound links that contain keyword-rich anchor text, and so on.
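To make the idea concrete, here is a minimal Python sketch of that kind of analysis. It is purely illustrative, not Google’s actual algorithm: the site names, feature values, and z-score threshold are all hypothetical, and a real system would operate on a web-scale corpus rather than four sites.

```python
# Illustrative sketch of statistical link analysis (NOT Google's actual
# algorithm): compute per-site link features, then flag sites whose
# features are statistical outliers relative to the rest of the corpus.
from statistics import mean, stdev

# Hypothetical per-site features:
# (avg outbound links per page, fraction of links with keyword-rich anchors)
sites = {
    "site-a.example": (12.0, 0.10),
    "site-b.example": (15.0, 0.12),
    "site-c.example": (14.0, 0.08),
    "site-d.example": (90.0, 0.95),  # extreme on both features
}

def outliers(features, threshold):
    """Return sites whose z-score on any feature exceeds the threshold."""
    flagged = set()
    n_features = len(next(iter(features.values())))
    for i in range(n_features):
        column = [v[i] for v in features.values()]
        mu, sigma = mean(column), stdev(column)
        for name, values in features.items():
            if sigma and abs(values[i] - mu) / sigma > threshold:
                flagged.add(name)
    return flagged

# A low threshold is used here only because the sample is tiny.
print(outliers(sites, threshold=1.4))  # → {'site-d.example'}
```

A site with a wildly above-average outbound-link count and nearly all keyword-rich anchors stands out on the graph, which is exactly the “unnatural” footprint statistical spam detection looks for.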
Google had been researching statistical properties of links since at least 2001. The paper Who Links to Whom: Mining Linkage between Web Sites (PDF) details work on modeling statistical properties of webpages and websites. The authors also noticed that certain properties seemed to indicate the presence of spam.
One of the authors of this study is Krishna Bharat, who would later go on to found and head Google News. He is a co-author of the famous Hilltop algorithm and a creator of Google’s LocalRank algorithm.
By June of 2004, Microsoft had published the famous research paper Spam, Damn Spam, and Statistics (PDF).
It is this research paper that says out loud what search engines had been developing in secret: it reveals the mature vision of using statistical analysis to find spam.
If you have never read this research paper, I strongly encourage you to read it. It will give you a good idea about what statistical analysis is in relation to spam fighting and SEO.
Was Florida Based on Link Analysis?
The timeline for the development of statistical link analysis fits the late-2003 timing of Florida.
Seven months later, Microsoft was publishing research papers about it.
Google had been researching mining the web graph since at least 2001 (and that paper cites research that was published before 2001).
It is reasonable to assume that Florida was Google’s first attempt at using statistical analysis to find spam.
However, it was a bumpy debut, which only highlighted how new and important this algorithm was.
Link Analysis Today
Link analysis changed how we talk about SEO. It ushered in phrases like “looking natural” in relation to linking patterns.
Even today, the SEO industry is still worried about “looking natural” – and for good reason.
There aren’t many research papers that focus on link analysis these days. It may be because the technology has fully matured.
The emphasis today is on machine learning in the areas of understanding concepts, understanding content, and identifying user intent.
Nevertheless, link analysis, whether or not it was part of Florida, may well be a part of Google’s core algorithm today. It’s an easy way to catch obvious spam and remove it.
The changes Florida brought to how we approach the task of SEO are still a part of the SEO vocabulary.
Image Credit: Paulo Bobita