How Long to Remove Duplicate Content Penalty?

  • Thread starter KristinaW
KristinaW

Member
Joined: May 7, 2021
Messages: 127
Reaction Score: 0
Points: 18
  • #1
My Google rankings dropped over the past month, and I found out that it's due to duplicate content on my website. I blocked the duplicate pages in the robots.txt file and I'm now wondering how long it will take to recover. Google isn't very specific about this in any of the reading I've done, so I'd like to get some opinions from people who may have experienced the same thing. The duplicate content on my site came from an entire section, so I blocked the URL pattern so Google can't crawl it anymore. Please let me know if you know anything.
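
For context, the rule I added looks roughly like this (the path here is made up, but mine follows the same pattern):

    User-agent: *
    Disallow: /duplicate-section/

From what I understand, that one Disallow line covers every URL under that directory.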
 
Newman

Member
Joined: May 11, 2021
Messages: 106
Reaction Score: 0
Points: 23
  • #2
I'm sure you have already read this a hundred times: "Google doesn't have a duplicate content penalty!" Yeah right. That's like saying that there's no punishment for stealing. "We'll just keep you resting in this little jail over here. It's not a punishment, mind you. It's just a time out." Sure it is.

Google certainly does have a duplicate content penalty. They may not call it one, but it exists nonetheless. If you have substantial duplicate content on your website that Google can't canonicalize to another URL, your rankings will drop. No? Tell that to all the people who have experienced ranking drops and then recoveries once they got rid of the duplicate content. Anyway...it's just so annoying to read everyone and their mother copying each other and saying that there's no such thing as a penalty. They get so caught up in semantics.

It's good that you blocked the questionable content with the robots.txt file. That's another area of contention. If there's something you don't want crawled on your website or that's causing you problems, block it in that file. People say to create 301 redirects, add canonical tags, or use noindex, but I've never had any luck with any of those things. I have had luck, however, with the robots.txt file. I had a website back in 2008 where I blocked every single URL that wasn't beneficial to Google, and within a few months, my traffic went from nothing to about 2,000 visitors a day. It was sudden too. No gradual climb. It was like the opposite of a cliff.
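
For anyone who wants to try those other routes anyway, here's roughly what people mean by them (illustrative markup with a placeholder domain, not taken from any specific site):

    <!-- canonical tag: goes in the <head> of the duplicate page -->
    <link rel="canonical" href="https://example.com/original-page/">

    <!-- noindex: asks Google to keep the page out of the index -->
    <meta name="robots" content="noindex">

Just know that both of those only work if Google keeps crawling the duplicate pages to see the tags, which is part of why I prefer the robots.txt block.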

Another benefit of using the robots.txt file is that Google won't crawl those low quality duplicate pages anymore, which will most likely improve how Googlebot crawls the rest of your site. When Google sees lots of thin, duplicate, noindexed, and redirected pages, it reduces its crawling of a website. Since you've blocked pages like that, you'll probably get more visits from Googlebot.
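
If you want to be sure your pattern actually keeps Googlebot out before you settle in to wait, you can test it with Python's standard-library robots.txt parser (the URLs below are placeholders):

    from urllib import robotparser

    # Load the site's live robots.txt file
    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()

    # Prints False if the Disallow rule matches the URL
    print(rp.can_fetch("Googlebot", "https://example.com/duplicate-section/some-page/"))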

To answer your question, it'll probably take a few months to clear up your issue, depending on how severe it was and how popular your site is. If I had to guess, I'd say six months. From what I've experienced, if a page isn't linked to on your website and it's blocked in the robots.txt file, it'll take about 90 days to drop from the index once Google discovers that it's been blocked. If it's still linked to, then you may never see the listing disappear completely. Meaning, a bare reference may still show up when you perform the "site:" command on Google, but the page's content will be dropped from the index if it was ever indexed in the first place.
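
To keep an eye on a specific section, you can run a search like this on Google (placeholder domain again):

    site:example.com/duplicate-section/

Any results that still show up there are the lingering references I'm talking about.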

My advice to you is to block any pages you don't like and just forget about them. Never look back. Don't get crafty and think that you have a better way next month. Once a page is blocked, keep it blocked forever and move on. Good luck.
 