
Protect your site against a potential penalty

With the robots.txt protocol, a webmaster or site owner who implements it correctly gains a real measure of protection. Domain names are plentiful on the Internet today, and there is a multitude of sites on just about any subject a person can think of.
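As a brief illustration of how the protocol works (the directory names below are hypothetical, not a recommendation for any particular site), a robots.txt file placed at the root of a domain simply tells compliant crawlers which parts of the site they may and may not request:

    # Applies to all compliant crawlers
    User-agent: *
    # Keep crawlers out of these (hypothetical) directories
    Disallow: /cgi-bin/
    Disallow: /private/

The file is purely advisory: well-behaved crawlers honor it, but it is not a security mechanism and does not hide content from visitors.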

Most sites offer good content that is of value to visitors and can help with just about any query. However, as in the real world, what you see is not always what you get.

There are a lot of sites out there that spam the engines. Spam is best defined as search engine results that have nothing to do with the keywords or key phrases used in the search. Visit any good SEO forum today and most of the daily spam threads point to hidden text, keyword stuffing in the meta tags, doorway pages and cloaking. Thanks to newer and more powerful search engine algorithms, the domain networks that spam the engines are increasingly being penalized or banned altogether.

The inherent risk of getting a web site banned for spam increases proportionately if it appears to carry duplicate listings or duplicate content. Rank for $ales does not recommend machine-generated pages, because such pages tend to generate spam. Most of those so-called “page generators” were never designed with search engines in mind and are not search engine-friendly.
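If machine-generated or duplicate pages must remain on the server for some reason, one defensive use of robots.txt is to keep crawlers away from them so they never reach the index in the first place. A minimal sketch, assuming those pages live in directories of their own (the directory names here are made up for illustration):

    User-agent: *
    # Hypothetical directories holding auto-generated or duplicate pages
    Disallow: /generated-pages/
    Disallow: /print-versions/

Blocking such directories keeps near-identical URLs out of the engines, which is exactly the duplicate-content exposure the paragraph above warns about.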

Click here to continue reading this article on the Rank for $ales website.

Posted by Serge Thibodeau, author.
