I was at a training session the other day that happened to discuss duplicate content.
It reached an interesting but very logical conclusion: duplicated content is both nonsense and fact. How can that be the case? Read on…
So Is Duplicate Content Really Bad?
The answer: it depends.
Think about it this way: what about news sites? How many IDENTICAL stories appear across news-style sites? Duplicated content all over the place, right? So why do sites like Mashable still rank well when sites of that kind share duplicated content? If we listened to everyone who proclaims "duplicated content is bad for your SEO", this would be a disaster for them. I'll be honest with you; I believed all this stuff too! I've been sharing content all over the place and it hasn't affected my site in ANY way. I include a link back to the source, which is all that's needed. Google's robots don't rank sites according to which one published the information first. If they did, they would have to crawl every site in SEQUENCE to work out where the content originated.
What IS bad is duplicated pages on your own website. Say you have 20 pages on your site and you create a separate page for each area you serve. You may have 3 totally unique pages, but the other 17 are effectively duplicates, differing only in the name of the area targeted. What happens then is this:
Google crawls your site and sees that pages 1, 2 and 3 are unique, but once it starts finding duplicated content, your pages are only crawled until the duplication is detected. Google then STOPS indexing your pages! You are losing valuable Googlebot crawl time.
Avoid Duplicated Content On Your Site
What you need to do is go back and change your duplicated pages. They can certainly be similar, but NOT the same. Make this a priority for your site now. This is especially true if you are trying to rank your business locally and have a page for each area. Your services won't change and your keywords may not change either, but writing each page differently enough to avoid exact duplication is important.
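To put a rough number on "similar but NOT the same", here is a minimal sketch for comparing two of your own pages. It is my own illustration, not anything Google publishes: the `shingles` and `similarity` helpers are hypothetical names, and the approach (Jaccard similarity over overlapping word runs) is just one common way to spot near-duplicates.

```python
# Hypothetical helpers (illustration only, not a Google method): estimate how
# similar two pages are by comparing their word "shingles" (overlapping runs
# of words). A score near 1.0 means near-duplicate pages worth rewriting.

def shingles(text, size=3):
    """Return the set of overlapping `size`-word sequences in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + size]) for i in range(len(words) - size + 1)}

def similarity(page_a, page_b):
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    a, b = shingles(page_a), shingles(page_b)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Two "area" pages that differ only in the place name score far higher
# than two unrelated pages:
london = "We offer expert plumbing services in London for homes and offices."
leeds = "We offer expert plumbing services in Leeds for homes and offices."
print(round(similarity(london, leeds), 2))
```

If most of your area pages score close to 1.0 against each other, that is exactly the on-site duplication described above.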
Look again at the logical explanation regarding the news sites. If exact duplication across a network of sites really were something Google penalised, why don't the authority news sites have problems with it?
Consider the number of articles written every single day. Is it truly realistic NOT to see some duplication? Think about music and composing too: among the MILLIONS of tunes out there, can you honestly say nothing is duplicated? I know of at least 3 works by famous composers that use exactly the same theme and melody. If music were on Google and scrutinised the way we have been told content is, ALL of these musical masterpieces would be BANNED!
There is certainly a valid argument that duplicated content posted across different sites is NOT a bad thing. My own practice is to keep content as unique as possible, but the point made here is worth considering. My advice is to TEST it first; NEVER take anything for granted!
Totally unique content on your own site is THE thing to aim for. Avoid duplicated content there at all costs! We all want our sites to be indexed regularly, and that only happens if the crawlers keep finding unique content. Duplicate content can stop this, and that is very BAD for you and your business.
If you found this useful, please comment and share. Also, check out my PAC membership page HERE.