Anticipating SEO in 2012 – Competitive Advantage

Hi there!  This is an odd situation for me – as this article is being published, I’m attending SMX Advanced in Seattle – and tomorrow morning, I’ll be speaking on the Google Survivor Tips panel.  In my presentation, I’ll be showing people how to weather the Google roller coaster, and how to dramatically increase organic search traffic beyond Google.  So I felt bad that you all couldn’t attend, and would have to wait until afterward for me to do a full article on it.

To compensate, I’m sharing with you some insights that I only briefly touch on in that presentation – insights I believe will help you take action that gives you a competitive advantage.  And you’re getting this advice early enough that by this fall, you’ll be ahead of the curve!

Get Your Competitive Advantage On

When the May Day update happened last year, I saw the writing on the wall, and spent the next few months taking my client site planning in a direction I believed would be the future.  It turned out that future was Panda.  And guess what? Every one of my clients who adapted according to my recommendations lost no organic traffic this spring; in fact, some saw a bump.

It comes down to the ability to anticipate and adapt.  And Panda showed us that with some changes, you can’t easily adapt AFTER the fact.  Your site could be significantly hammered with no clear path to recovery.

Schema.org – A New Paradigm

While there have already been some great articles breaking it down to basics, as well as making a strong case for why you should care at all about Schema.org, I want to touch on specific aspects of this new structure and how I believe they’re going to help search engines – and, in turn, the SEOs who adopt them – do what we all do for a living.  That means those who adopt early will have (at least initially) a competitive advantage over those who don’t.

The Bad News (depending on the color of your hat)

If you are like me, you’ll already see how these will be considered fair game for people wanting to use tactics of the “hat color that shall remain nameless”.  But hey – that’s part of the nature of search rankings already – this is just going to be yet one more sub-arena people will try to game.

The Good News

Even though people who come from the “hat color that shall remain nameless” camp will creatively look to push the boundaries of fairness and all that entails, I also believe this new system will aid the search engines in being able to do a better job, believe it or not, at detecting such tactics.  And I’ll touch on that concept at the end of this article.

The Really Annoying News

If you thought it was challenging trying to get people to implement even the simple semantic markup for breadcrumbs, products, events, recipes, and the like, wait until you get a load of how complex Schema.org is.  I mean we’re talking about dozens of content (data) types, each one potentially having dozens of elements to provide content for.

And if you’re creating web pages by hand, or even if you’re developing your own sites using a CMS, it’s now going to be a bitch and a half to get this stuff implemented properly. Every major CMS, from WordPress to Joomla to Drupal, to Magento and beyond, is going to need to be reworked.  Every dev company that has its own custom CMS is going to have to find the budgetary room to do the same…
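
To give a concrete sense of what “implemented properly” means here, below is a bare-bones microdata sketch.  The itemscope, itemtype, and itemprop attributes are the actual mechanics the new format uses; the product name and price are purely made-up placeholders:

  <!-- hypothetical product snippet – the values are placeholders -->
  <div itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Example Widget</span>
    <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
      Price: <span itemprop="price">19.99</span>
    </div>
  </div>

Now picture that nesting applied to every content type, on every template, across an entire site, and you can see why the real work lives at the CMS level.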

The Semi-Good News

The light at the end of this developer’s worst nightmare tunnel comes from a few points.  It’s a “standard” that the big three have embraced.  No more picking and choosing between microformats, RDFa, or some other competing schema that isn’t fully recognized by all three.

Additionally, for the major CMS’s, once they’ve updated their core systems to work with this, the majority of the heavy lifting will be a one-time shot on their end.  Except, of course, every time a new schema type or element-set comes along.  And except, of course, for any plug-ins that were developed for any of them prior to this roll-out.  And except, of course, for any major changes the CMS communities want to make in the future, which will then have to account for the schema as well.

But hey – at least it’s one methodology that everyone (who is wise) will be able to embrace!

Common Elements – Doing The Work For The Search Engines

Many people say things like “you don’t need to submit your site to the search engines – they’ll discover it.” I’ve never held that view – I’ve instead always taken the perspective that it’s better to help the search engines along.  The more I can do this, the more likely my client sites will be indexed and ranked sooner – and more accurately for what I want them to be found for.

Many will say to wait on Schema.  Let’s see if this thing really takes off, or whether it’ll even be worthwhile.

I believe that’s also a very big mistake, because the Schema.org microdata structure will do an amazing job of helping search engines.

Core Elements You Should Care About

Here are the common elements, found across many content types, that I see as critical for SEO moving forward (a markup sketch follows the list).

  • aggregateRating
  • author
  • datePublished
  • genre
  • headline
  • interactionCount
  • keywords
  • offers
  • publisher
  • reviews
  • breadcrumb
  • isPartOf
  • mainContentOfPage
  • primaryImageOfPage
  • significantLinks
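
As a rough illustration – and nothing more than that – here’s how a handful of those properties might be woven into an article page.  The property names come straight from the Schema.org vocabulary; the author, date, and rating values are invented for the example:

  <article itemscope itemtype="http://schema.org/Article">
    <h1 itemprop="headline">Anticipating SEO in 2012</h1>
    <p>By <span itemprop="author" itemscope itemtype="http://schema.org/Person">
      <span itemprop="name">Jane Example</span></span>
      on <time itemprop="datePublished" datetime="2011-06-07">June 7, 2011</time></p>
    <span itemprop="keywords">seo, schema.org, microdata</span>
    <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
      Rated <span itemprop="ratingValue">4.5</span> out of 5 by
      <span itemprop="reviewCount">27</span> readers
    </div>
  </article>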

Page Segmentation in 2012

And now the most relevant of all (a segmentation sketch follows the list):

  • SiteNavigationElement
  • WPAdBlock
  • WPFooter
  • WPHeader
  • WPSideBar
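
And here’s a loose sketch of how those segmentation types could wrap a typical page layout – the structure shown is just an illustration, not a prescribed template:

  <body itemscope itemtype="http://schema.org/WebPage">
    <div itemscope itemtype="http://schema.org/WPHeader">...site logo and tagline...</div>
    <div itemscope itemtype="http://schema.org/SiteNavigationElement">...main menu links...</div>
    <div itemprop="breadcrumb">Home > SEO > Schema.org</div>
    <div itemprop="mainContentOfPage" itemscope itemtype="http://schema.org/WebPageElement">
      ...the actual article or product content...
    </div>
    <div itemscope itemtype="http://schema.org/WPSideBar">...related links and widgets...</div>
    <div itemscope itemtype="http://schema.org/WPAdBlock">...ad unit...</div>
    <div itemscope itemtype="http://schema.org/WPFooter">...copyright and legal links...</div>
  </body>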

Why These Are So Relevant

Think about it – the more clearly defined your content, the less guessing search engines have to do about what the topic of the page is, what the relationships are between pages and sections, and who the person or organization is that originated the content…

How Spam Detection Will Be Easier

Okay, so there are all sorts of ways to try to abuse this.  The keywords element alone is going to be played.  And yes, scrapers will write scripts to replace author/publisher elements.

Except that with so many aspects of any single page clearly defined, in a standardized structure, I also believe search engines will be able to look at any single element, or any combination of elements, and ask: “Does this type of data/content belong here, compared to what normally goes in these elements across the majority of sites in this market?”

And the more sites they pull into their search indexes that use microdata, the more accurate their model will become over time.

Adopt Or Perish

Remember how I said earlier how challenging it’s going to be to get developers and CMS creators to implement this stuff?  Well, whoever gets it right in the development arena is also going to be ahead of the pack. So keep your eyes open on that front.  Or start campaigning for your preferred CMS platform’s development team / community to get on board.

And I also believe that everyone who adopts early on will have that much of a lead on whoever ignores it or struggles to adopt.  It’s that significant a change to search at the semantic level, which is at the heart of SEO.

My Plan

The first day this broke on Twitter, I immediately sent out an email to my biggest client’s development team.  I let them know how important this is, and that we need to start planning NOW for adoption over the next few months.  They have their own CMS, so it’s going to involve a lot of work.  Yet I am quite confident it’ll pay off.

Think I’m Right?  Think I’m Overreacting?

I’d love to hear your thoughts on this.  So please – leave a comment and share them here.  Just forgive me if I’m not able to respond right away, as I’ll be caught up in the whirlwind that happens at most conferences – especially this one, since I’m both speaking on a panel and putting on one of my #EpicDinner events!
