SEO duplicate content: can the same content be published on two different sites without being penalized by Google?
I have a site with a huge collection of Q&As that we share with a larger site that has huge organic traffic, so we can leverage their SEO and earn more advertising dollars (we have a rev-share agreement).
Currently we are not indexing our content on Google, so neither of us gets penalized. Is this how it's supposed to be, or can I index my content too?
Answers
For the most part, I concur with the previous answers.
I do want to clarify: there is no Google penalty for duplicate content. A Google penalty is a specific kind of action. Duplicate content, generally speaking, can certainly hurt your site's performance, but there's no penalty.
So much depends on the specifics of your situation. If this other site is the primary source of your revenue (or leads), then you probably want to do what will add gas to that fire.
Again, it depends on your business objectives. Providing them with content, traffic, links, whatever, doesn't build you any long-term "SEO equity," but that may be OK.
A couple of additional ideas to consider, if appropriate:
1) Index the pages on your site, but add a rel=canonical tag that points Google to the source of the content (i.e. the huge-organic-traffic site); see the example tag after this list. You might get a little more traffic, and you won't hurt the other site.
2) Come up with a creative way to promote those Q&A's with original content on your site.
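To illustrate idea 1 above, a cross-domain canonical is just a link element in the head of each Q&A page on your site that points at the partner's copy. A minimal sketch with placeholder URLs (not real pages):

    <!-- On yoursite.com's copy of the question page (hypothetical URLs) -->
    <link rel="canonical" href="https://www.bigpartnersite.com/questions/example-question/" />

With that in place, you are asking Google to consolidate ranking signals on the partner's URL, which is why it doesn't hurt the other site.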
If this content is totally duplicate and they already have their copy indexed (on a bigger site), you'll want to keep your pages as "noindex, follow" with your meta robots tag. But one of you should be the "owner" of that content and keep it indexed, not both of you. That will depend on A) who actually created the content and B) who should own the content and traffic. When your rev share expires, whoever gets to keep the content should be the one that has it indexed.
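For reference, the "noindex, follow" directive mentioned above is a single meta tag in the head of each duplicated page; this is a generic sketch, not tied to any particular platform:

    <!-- Keep this page out of the index but still allow its links to be followed -->
    <meta name="robots" content="noindex, follow">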
You also have the option of telling Google not to crawl these pages at all using your robots.txt file. The downside is that noindexed content can still pass link value to other pages of your website, but only if Google can crawl it; pages blocked in robots.txt can't pass that value. The upside is that it gives Google more time to crawl the pages of your site that are actually indexed. This relates to crawl budget (how much Google is willing to crawl your site over a given period) and crawl prioritization (making sure they crawl the good stuff).
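For comparison, a robots.txt block looks like the lines below; the /questions/ path is only an assumed example of where the duplicated pages might live:

    User-agent: *
    Disallow: /questions/

Unlike the meta tag, this prevents crawling entirely, which is why the link-value trade-off described above applies.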
Interesting question. From what you are saying, I wonder what your reason would be for wanting to have your content indexed.
If you do decide to do it, you are definitely putting both sites at risk IMO. Even if you take all of the precautions to tag your content as the original source, you will likely be damaging the money site as they would risk losing visibility for the content you are providing.
I would want to know a little more about the motives and objectives of each business, but my initial recommendation would be not to do it based on what I see here. It has the potential to damage your business relationship.
If you are trying to take the business away for yourself, the answer might be a little different and would need to be very carefully thought out.
As Bill mentioned, duplicate content doesn't trigger a penalty. There are filters in place to eliminate duplicate results from within the same site, but technically not across two sites. What can happen, though, is that one site will rank higher than the other for the content, and you could end up with the wrong one ranking (if there really is a "wrong one").
That being said, I could point to thousands of situations where the exact same content shows up in multiple positions within the top 10 (think news sites syndicating articles), so it's not a guarantee that either site will take a hit. You can let both sites be indexed without too much worry as long as the pages aren't complete copies of each other all the way down to the template (which could result in a true penalty if caught, though not just because of the duplicated content). Ideally, if you really want to do it right and give both sites a shot at ranking, each should offer some kind of unique value-add that serves visitors, so there shouldn't be a reason to get flagged.
Search engines love content, just not duplicate content. Google claims this isn't a punishment: they don't penalize sites with duplicate content; rather, they simply don't reward them. It's like saying "no dessert" is not a punishment; you just weren't rewarded.
There is a white hat way to "have your cake and eat it too". If one of the sites adds additional content to each page (at least 200 words) or simply rewrites the content, it will most likely not be viewed as duplicate content. You both can get SEO credit. Finding a writer to do this should be very easy on oDesk or Zerys. Depending on the technical aspects of the content, you might even be able to find a college intern to work on the project.
When duplicate content is present, site owners can suffer rankings and traffic losses. In most cases, website owners do not intentionally create duplicate content. If many different websites sell the same items, and they all use the manufacturer's descriptions of those items, identical content winds up in multiple locations across the web.
You can read more here: https://moz.com/learn/seo/duplicate-content
Besides, if you do have any questions, give me a call: https://clarity.fm/joy-brotonath
Related Questions
- How can I convince a client to sign a 12-month SEO contract?
The best way to work around something like this is to map out the long-term strategy in phases. Build out a brief project map that outlines what they will receive within the 1-3 month period, the 4-7 month period, and the 8-12 month period. Set micro-objectives for each period; this will give the client a bit more confidence in the short-term plans as well as the long. The key thing to remember here is that the client will often be worried about being tied into a contract that doesn't deliver results, so you need to show why you need the time that you do. One thing that I often throw in is an extra incentive for longer contract lengths, for example an extra PR/content campaign or some paid-advertising extras. Try to assure them of some shorter-term results that you can obtain as 'quick wins' and build their confidence this way. The major targets will always be longer term, but if you can demonstrate that there will be progress in between, they will be a lot more receptive.
- How do you build a high traffic niche website?
Obviously, no two situations are alike, and multiple factors affect any outcome; practically, the number of answers is infinite. But one factor I've looked at intensively, full time, for years is the role played by the brand name and/or the site's domain(s). Think of doing business, online or off, as moving along a path. Some paths are rocky or go through quicksand; others can be made straight and smooth. Obstacles can be cleared, or the surface may be lubricated. For most niches, you'll see brand names and domains that add friction, friction that is compensated for by extra marketing inputs of effort or money. Suppose your niche were nicotine patches. Ideally you might own NicotinePatch(es).com to simplify brand recognition, add trust, increase click-through rates, and so forth. Traffic can be built up without an exact-match domain, no doubt about that. Still, not all domains and names perform equally well online or in the minds of an audience. Answers aren't always so clear cut. However, since the internet is built on domain names, domains and names are worth evaluating very deliberately.
- For SEO, is it better to use sub-directories or sub-domains when geo-targeting by country?
I have been working in SEO for over 10 years and have built search agencies from the ground up. I've also worked on international SEO with some big brands like Active.com, as well as many US-based chains like Extended Stay Hotels. I suggest sub-directories, as they are MUCH easier to manage and promote via SEO. You can't have specific local IPs with sub-directories, but you can with sub-domains. That's not hyper-critical, though; there are WAY more important factors, and you can geo-target in Google Webmaster Tools. Your current structure for sub-directories is perfect: http://drivingtests101.com/Canada/Ontario
With subdomains, each will be treated like a brand new site, so you would have to work very hard to get each one to gain good authority SEO-wise. Plus, managing and hosting lots of subdomains is a ton of work!
In order to not get penalized you need to:
- Implement the "lang" and "hreflang" tags in the HTML of each country directory and all pages within.
- Have unique title tags and descriptions written in each language.
- Have unique content for each country and state page, written in each respective language.
Yes, it's a TON of work, but hey, you are trying to take over the world! If you are serious about international search traffic then you will do these things, and the costs should be balanced out by the extra traffic you'll get. If you can make the root domain super authoritative, it will boost ALL of the internal pages; this is also key to ranking well for international search. I'm open to a follow-up call to elaborate on any of my responses or to answer other questions.
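As a rough illustration of the "lang" and "hreflang" point in that list, each page in a country directory would declare its own language and reference its alternates. The Ontario URL reuses the structure cited in the answer; the other paths are made up for the example:

    <html lang="en-CA">
    <link rel="alternate" hreflang="en-ca" href="http://drivingtests101.com/Canada/Ontario" />
    <link rel="alternate" hreflang="en-us" href="http://drivingtests101.com/USA/New-York" />
    <link rel="alternate" hreflang="x-default" href="http://drivingtests101.com/" />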
- Will redesigning a website (ground up) ruin the search ranking of a site?
TL;DR -> Yes, you will risk it if you don't perform a proper audit and migration from the original platform. Any type of architecture change can 100% ruin your SEO if you are not migrating content and topics correctly. Many people assume this is limited to URI structure but underestimate the power of topical hierarchy; the content, internal linking, and URI structure all play a crucial role in any migration effort. If you have any questions about the migration process, give me a shout. Decent free migration checklist: https://searchengineland.com/site-migration-seo-checklist-dont-lose-traffic-286880
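One concrete safeguard behind that kind of checklist, not spelled out in the answer itself, is mapping every old URL to its new equivalent with a 301 redirect so existing rankings and links carry over. A minimal sketch in nginx syntax with made-up paths:

    # hypothetical old-to-new URL mapping applied at redesign launch
    location = /old-blog/seo-tips/      { return 301 /blog/seo-tips/; }
    location = /old-blog/link-building/ { return 301 /resources/link-building/; }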
- How can Google index or reindex my page as quickly as possible whenever the content changes?
Hi, your site is a Q&A, so I presume that every time someone creates a new question it generates a unique URL. Whether or not that question has an answer yet, you want Google to crawl and index your site as quickly as possible so it can start analyzing the new page and bringing traffic to it. The best way to expedite this re-crawling is to use a ping service that you trigger after your user answers the question. PS: If your site is updated frequently you shouldn't have a problem with crawling, because Google usually identifies this type of website really quickly. Drop me a call; it's free this week. Best,
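As one concrete way to send that "something changed" signal (an assumption on my part, not something this answer specifies), many Q&A sites keep an XML sitemap whose lastmod values are updated whenever a question page changes, and submit that sitemap in Google Search Console so changed URLs get re-checked sooner. The URL and date below are placeholders:

    <url>
      <loc>https://www.example.com/questions/how-do-i-renew-my-license/</loc>
      <lastmod>2025-01-15</lastmod>
    </url>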