Updated (see below)
I’m always following the SERPs and big companies out there getting slapped by Penguin and Panda updates. At SMX last week, Matt Cutts talked about a Panda update that rolled out this past Friday. Was Digg impacted by this update? It appears so. Could this have anything to do with their announcement that they are building a Google Reader replacement?
If you search for “Digg” in Google you will get the following results:
As you can see, Digg totally got slapped (leaked by DataDial). Could this have anything to do with Digg announcing last week that it’s going to start up a Google Reader-type service, since Google is shutting the Google Reader service down shortly?
Is Digg getting slapped because of the millions and millions of spam links out there pointing to it? Could this be the latest Panda slap? As you can see from the results below, Digg is no longer ranked in the SERPs:
There are ZERO results for Digg.com. So what exactly was the update that took Digg out of the rankings? Does anyone have any insight into what’s going on? What do you think happened?
My guesses on why Digg got slapped:
- They got TONS of links, with a million people linking to their site in a day. This is a site with a lot of links in the past, but not a whole lot of recent ones. Then all of a sudden a million links from a TON of sources started pointing to them. Google’s algo slaps them and triggers a manual review.
- They finally got caught for buying links!
- In the past, people had gone to link networks to buy links to their bio pages; those networks finally got slapped with the latest update. Tons of links from bad sites were pointing to Digg, so they got slapped.
- Shady link building
- Link networks finally catching up with them because they or their users bought links
- “sex capsules” is their 10th highest anchor text, with hundreds of links
- Google going hard on Duplicate Content
I haven’t looked much into their link profile, but these could be early signs that Google is going after every site out there that’s doing things wrong (according to them/users).
Let us know what you think of this, and share any additional insight in the comments below!
Updated
When we broke the news about Digg earlier this morning, we checked their robots.txt file. There wasn’t anything there.
If they didn’t want to be crawled or wanted to take their site out of the index, their robots.txt would have read:
User-agent: *
Disallow: /
But there was nothing there as of this morning. Right now it reads:
User-agent: *
Disallow:
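The difference between those two files is just the trailing slash: “Disallow: /” blocks crawlers from the entire site, while an empty “Disallow:” blocks nothing. Here’s a quick sketch of that distinction using Python’s standard urllib.robotparser (the digg.com URL is just for illustration; this parses the rules locally rather than fetching Digg’s live file):

```python
from urllib.robotparser import RobotFileParser

# "Disallow: /" tells every crawler to stay off the entire site
blocked = RobotFileParser()
blocked.parse(["User-agent: *", "Disallow: /"])

# An empty "Disallow:" disallows nothing -- the whole site stays crawlable
open_rules = RobotFileParser()
open_rules.parse(["User-agent: *", "Disallow:"])

print(blocked.can_fetch("Googlebot", "http://digg.com/"))     # False
print(open_rules.can_fetch("Googlebot", "http://digg.com/"))  # True
```

So the file Digg has up right now allows crawling; only the version with the slash would have pulled them out of the index on purpose.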
Google contacted me directly a few minutes ago: “We’re sorry about the inconvenience this morning to people trying to search for Digg. In the process of removing a spammy link on Digg.com, we inadvertently applied the webspam action to the whole site. We’re correcting this, and the fix should be deployed shortly.”
I think there is more to this story, unless there was some sort of Panda/Penguin update. I truly think Digg was updating their site and forgot to put their robots.txt file back, but now it’s in there. Here is proof:
This screenshot was taken at 9:31 PST, around 20 minutes before I posted the info.
Having robots.txt issues doesn’t mean you’ll be banned or de-indexed from Google search results, but it can cause crawling issues, and those crawling issues may lead to your site disappearing from search results. Is this the other part of the story?
Also, Google said it has to do with a spammy link. See the screenshot below of the only spammy link I could find:
Is Google penalizing this site? I see it fully in the rankings and it seems like a legit site, but is it a spammy link farm and the type of site Google is going after? Also, was there an update today that went after these types of sites?
What is going on? Whose fault is this?