What To Do When Google Fails To Give Proper Attribution To Your Source Content

Note: Eventually I do cover 4 steps to take when your content is the victim of an over-eager algorithm. But they’re towards the end ;-)

SEO mastermind Graywolf took a look at my post on the negative consequences of a front page Digg. He concluded that my original source article is getting hit by a duplicate content penalty. His inference is based on this search:

“This question got serious consideration for the top spot”

What this search shows is that Google drops the original source of this quote all the way to the very, very bottom of the search results. When an exact-match search on a unique phrase shows your content at the very bottom of the results, you can be fairly sure your content has been penalized by an algorithm… and in this case, a stupid algorithm.
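If you want to run this kind of check without manually repeating searches, here is a minimal Python sketch of my own – not anything Google or Graywolf provides. It uses Google’s Custom Search JSON API, which requires an API key and a Programmable Search Engine ID; the API key, cx value, and article URL below are placeholders, and Custom Search results don’t always match regular google.com results, so treat the rank it reports as a rough signal only.

    # Rough sketch: find where a URL ranks for an exact-match phrase query
    # via Google's Custom Search JSON API. API_KEY, CX, and the URL below
    # are placeholders; Custom Search results can differ from google.com.
    import requests

    API_KEY = "YOUR_API_KEY"          # from the Google Cloud console
    CX = "YOUR_SEARCH_ENGINE_ID"      # Programmable Search Engine ID

    def exact_phrase_rank(phrase, url_prefix, max_results=50):
        """Return the 1-based rank of the first result whose link starts
        with url_prefix for the quoted-phrase query, or None if absent."""
        rank = 0
        for start in range(1, max_results + 1, 10):  # API pages 10 at a time
            resp = requests.get(
                "https://www.googleapis.com/customsearch/v1",
                params={"key": API_KEY, "cx": CX,
                        "q": f'"{phrase}"', "start": start},
                timeout=10,
            )
            resp.raise_for_status()
            items = resp.json().get("items", [])
            if not items:
                break  # ran out of results
            for item in items:
                rank += 1
                if item.get("link", "").startswith(url_prefix):
                    return rank
        return None

    rank = exact_phrase_rank(
        "This question got serious consideration for the top spot",
        "http://example.com/original-article",  # hypothetical URL
    )
    print("Not found in top results" if rank is None else f"Ranked #{rank}")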

While Graywolf is probably right, it’s very hard for me to believe that something else isn’t going on. After all, a billion-dollar company should be able to see obvious clues like “100s of backlinks” and properly identify the source as the one article that all those scrapers and editorial references link back to. It seems other-worldly that Google, the billion-dollar beast, would fail, in such an obvious case, to identify the original source of the content.

Which leads me to this conclusion: Google is trying to crack down on “scramble scrapers” – scrapers who take content from lots of different sources and piece it together. Why do I think this? Because the original article referenced above received a steady stream of comments that probably looked, to Google, like a scraper piecing together more and more content from different sources.

I’m not 100% sure about this. But I really can’t think of any other good explanation.

But that’s not the point of this article!!!

The point of this article is that Google gives publishers like myself no good, surefire, scalable solution for dealing with situations like this. Instead, we’re left grasping at straws. But grasping at straws, I suppose, is better than nothing at all.

So, to help those of you who face similar problems in the future, I’ve put together a list of things I do whenever I think a dup-content penalty has been applied mistakenly.

  1. I fill out a search quality form complaining that the article I was searching for did not show up in my search results.
  2. If the problem is not fixed, I ask a few friends each day to take the time to fill out a search quality form, again complaining that Google did not show them the article they were looking for. (Checking each day whether the problem has been fixed can be automated; see the sketch after this list.)
  3. I then take things a bit further, log in to Webmaster Tools, and start filing spam reports against all the duplicate content that is competing with my original source article.
  4. If this doesn’t work, I go on Twitter and some private forums that I belong to, and get my issue in front of people who are smarter, and more powerful, than me.
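To automate the “is it fixed yet?” check from step 2, here’s a small follow-on sketch that re-runs the exact_phrase_rank function from the sketch earlier in this post once a day until the article climbs back into the top ten – a threshold I picked arbitrarily. It assumes exact_phrase_rank is defined in the same file.

    # Sketch: re-check the exact-match rank once a day until the original
    # article recovers. Assumes exact_phrase_rank from the earlier sketch
    # is defined in this file; the top-10 threshold is arbitrary.
    import time

    PHRASE = "This question got serious consideration for the top spot"
    URL = "http://example.com/original-article"  # hypothetical URL

    while True:
        rank = exact_phrase_rank(PHRASE, URL)
        if rank is not None and rank <= 10:
            print(f"Recovered: now ranked #{rank}")
            break
        print(f"Still buried (rank: {rank}); checking again tomorrow")
        time.sleep(24 * 60 * 60)  # wait a day before the next check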

Honestly, I hate having to get to steps 3 and 4 because I have no problem with duplicate content so long as the duplicators provide proper reference back to the source. I hate, in a severe way, being forced by a billion-dollar monopolistic company to file spam reports against sites that I don’t honestly view as spam.

But when your hand is forced, your hand is forced. Google has created an unfortunate state of affairs where a stupid algorithm makes stupid mistakes far too often (I’ve heard credible references to a 25% false-positive rate on many of their filters), yet it hasn’t built in the checks to catch and fix the stupid mistakes it makes.

But let’s get back to the point of this article. What do you do when your content is getting unfairly punished (erased without explanation) by Google? Do you lower yourself to the point of filing spam reports against websites you don’t really consider to be spam? Or do you have a better method? Something we could put before steps 3 and 4 in my list above?

2 thoughts on “What To Do When Google Fails To Give Proper Attribution To Your Source Content”

  1. Google really needs to start looking at ways to communicate penalizations to publishers, because right now there is just no communication whatsoever, and publishers are left completely clueless when something goes wrong and potentially destroys their business.

    Google seems to think that 1 page of webmaster guidelines is enough – but it’s not. Even with loose and hard-to-interpret webmaster guidelines (which can change at any minute), Google is still making too many mistakes in its algorithm and penalizing (and sometimes destroying) legitimate sites.

    The excuse that they can’t communicate with publishers because it would help spammers is just not cutting it anymore.

    The only way to solve this issue seems to be through some sort of legislation where Google is held accountable for penalizing sites, must work to fix mistakes within a standard time frame, and must keep the publisher informed throughout. And if the publisher has in fact done something wrong, then they must be informed of the nature of what they have done wrong and given a chance to fix it.

  2. It does seem totally weird that pages that only link to the original are ranking higher. It does seem like a penalty, but a dupe-content penalty? Does Google, with all those PhDs, think an article that all the other pages they rank link directly to is a dupe? Hmmmm… very odd.
