How do you re-structure duplicate URLs which were already indexed by search engines?
Answers
If you'd only like the content to live at one location, and you're worried about legacy links that people shared across the internet breaking, then a 301 redirect from the old URLs to the new URLs is what you're looking for.
Every HTTP request receives a response code from the web server. Response codes in the 200s mean "everything went well," response codes in the 400s mean "user error" (like the famous 404, "we couldn't find the page you requested"), and response codes in the 500s mean "server error" (Twitter's famous 503 Fail Whale).
Response codes in the 300s indicate some sort of redirection, and a 301 specifically tells the requesting client that this piece of content has moved to a new URL *permanently* and will never be back at the old one.
When web browsers receive a 301 redirect, they'll cache it and automatically forward all future requests for the old URL to the new one without even asking the server. Similarly, when search engine crawlers encounter a 301 redirect for a URL they had already indexed, they update their records to move that old content (and all of its old ranking information) to the new URL. This means you don't have to "start over" to get a piece of content to rank if you move its URL.
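For illustration, here's a minimal sketch of what issuing a 301 can look like at the server level, using only Python's standard library (the URL mapping and port are made up for the example):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping of retired URLs to their new permanent homes
OLD_TO_NEW = {
    "/old-article": "/blog/new-article",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in OLD_TO_NEW:
            # 301 Moved Permanently: browsers cache this, and crawlers
            # transfer the old URL's index entry to the new location
            self.send_response(301)
            self.send_header("Location", OLD_TO_NEW[self.path])
            self.end_headers()
        else:
            # Anything unrecognized gets the famous 404
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("", 8000), RedirectHandler).serve_forever()
```

In practice you'd configure this in your web server or framework rather than hand-rolling a handler, but the response code plus the Location header is the whole trick.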
However, your question makes it sound like you actually *do* want the exact same content to be served out of multiple URLs, but you'd ideally like only one copy of that content to be indexed and ranked to avoid splitting your inbound links and any other duplicate content issues.
In that case, I'd check out adding so-called "rel-canonical" tags to your pages to indicate which version is the "canonical" version of that piece of content, and have all other versions point to the canonical one with the tag. Google is pretty good about honoring these, and Bing says it uses them as a "hint." More info:
https://support.google.com/webmasters/answer/139394?hl=en
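As a concrete sketch of the tag itself (the routes and URLs here are invented), here's how two pages serving the same content might both declare a single canonical version, shown with Flask:

```python
from flask import Flask, render_template_string

app = Flask(__name__)

PAGE = """<html>
  <head>
    <!-- every duplicate version points at the one canonical URL -->
    <link rel="canonical" href="{{ canonical }}">
  </head>
  <body>The same content lives at both paths.</body>
</html>"""

# Two routes serve identical content, but both declare the same canonical
@app.route("/products/widget")
@app.route("/catalog/widget")  # hypothetical duplicate location
def widget():
    return render_template_string(
        PAGE, canonical="https://example.com/products/widget"
    )

if __name__ == "__main__":
    app.run()
```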
In a world where websites might choose to organize the exact same content in a number of different ways, adding rel-canonical tags to your pages adds a lot of semantic meaning and helps you avoid a number of the issues that duplicate content brings. Hope that helps!
I'm with Hartley on this one. But it all depends on your content.
If you are going to keep (and maintain) duplicate content on all of these URLs, then the redirects from the old URLs to the new URLs only take care of half of your problem. You will still be cannibalizing the rankings of these pages: they will be 'dinged' for duplicate content (worst case scenario), or at the very least, search engines will not know which version is representative, will not be able to discern any difference, and instead of trying to figure out which version should rank higher, both will receive a lower relevancy score. This is where the rel=canonical tag comes into play. It essentially tells Google, "hey, this is the representative version of this content that I want indexed." To implement it, you would add <link rel="canonical" href="final destination URL goes here" /> to every URL that has duplicate content, all pointing back to the "champion" version.
Whenever you change URLs that you wish to remain indexed, you must do a 301 redirect for continuity of page-level authority and so users/crawlers can make it to the correct final destination.
There are a few ways to do it:
1. 404: This would be a bad option.
2. 301: You can do this, but since you have a huge number of URLs, it would not be a practical approach either.
3. Canonical: This would be the best way in your case; along with it, you could also add noindex and nofollow to all of these duplicate pages (a minimal sketch follows this list).
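If you do go the noindex/nofollow route, one way to emit those directives (sketched with Python's standard library; the port and page body are hypothetical) is the X-Robots-Tag response header, which is equivalent to the robots meta tag. Be aware that canonical and noindex send somewhat conflicting signals, so many practitioners pick one or the other:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class DuplicateHandler(BaseHTTPRequestHandler):
    """Serves duplicate pages while telling crawlers to skip them."""

    def do_GET(self):
        self.send_response(200)
        # Header equivalent of <meta name="robots" content="noindex, nofollow">
        self.send_header("X-Robots-Tag", "noindex, nofollow")
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        self.wfile.write(b"<html><body>Duplicate page body</body></html>")

if __name__ == "__main__":
    HTTPServer(("", 8001), DuplicateHandler).serve_forever()
```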
I would do 301 redirects for the unwanted URLs for a couple of months and then let them 404. This would require more work, but it will have less impact on your search rankings and traffic.
Going straight to 404 is bad for SEO and might hurt your traffic.
Simple questions, simple answers.
This is clearly a 301-redirect case, because there is no canonical value in just switching URLs... however... the most important line in this whole thread is coming up...
You have to do a search in Google to see which pages are ranking better based on URL, and keep those to get maximum ROI. Doing this properly requires a rank checker and a keyword list to see how the URLs rank.
This should be a snap for an SEO specialist with Screaming Frog and a proper rank checker -- at least to identify the right URLs to keep. Now, setting up the 301 redirects should probably be done programmatically to save time (a rough sketch follows)...
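As one way to sketch that programmatic approach (the file names and CSV columns here are invented), you might take the keep/retire decisions from your rank checker and emit redirect rules in bulk:

```python
import csv

# Hypothetical export from the rank audit: each row pairs a path to
# retire with the better-ranking path to keep, e.g.
#   retire_path,keep_path
#   /old-widget,/products/widget
with open("url_decisions.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Emit one nginx-style permanent (301) rewrite rule per retired path
with open("redirects.conf", "w") as out:
    for row in rows:
        out.write(f"rewrite ^{row['retire_path']}$ {row['keep_path']} permanent;\n")
```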
Cheers --
Nick
The best answer I could suggest is rel-canonical.
Related Questions
- SEO: Subdomain or subdirectory for blog?
Google's official stance is that they are "roughly equivalent," and it recommends doing whatever is technically simpler to implement (source: https://www.youtube.com/watch?v=_MswMYk05tk). With that said, I'd recommend a directory over a subdomain. Doing this consolidates signals to a single domain, which should then theoretically build more authority for all pages on that domain. This consolidation of authority results in rank increases, which have been documented here: http://moz.com/community/q/moz-s-official-stance-on-subdomain-vs-subfolder-does-it-need-updating. A subdomain would split signals between the blog and the rest of the root domain's content. So while Google "says" they're roughly equivalent, SEOs have seen tangible evidence that sticking to a single domain can be beneficial. If you're able to go with www.iconery.com/editorial/, I'd choose that. Hope this helps!
- Which domain will have higher SEO chances?
Neither domain name option is a very good idea. I'll explain why in a second, but first I'll answer your actual question.
Although there might arguably be some slight advantage in having an exact-match domain of the form Name.TLD, as opposed to a domain with additional keywords alongside the name, that advantage is probably negligible. Google algorithm updates, as I understand them, withhold that exact-match-domain advantage until a website has many other reinforcing signals of authority. (Their goal has been to downgrade spammy, low-quality websites.) Whichever domain version you might choose, Google will find the brand name CUJO mentioned all over your actual website and in the referring links. Those signals will be plenty for search engines to pick up on and hence plenty for SEO, and I'd expect them to overshadow the tiny difference between the 2 domains.
Your choice shouldn't be based on SEO. Stop trying to please search engines, and start paying attention to your actual human audience. Really, your decision ought to be made based on the memorability and first impression of the domains. Is the extra keyword in .COM better than a name without that extra keyword in .IO? For humans, that is.
Either way, you'll run into competition from CUJO.com. And that's a potential problem. Another problem would be pronunciation ambiguity. Spanish and English speakers will see the name very differently, based on that "J". Spelling isn't altogether clear either – Koojo, Kujo, Coojo, Cujo? The main problem I see, however, is that Cujo is a murderous dog in a Stephen King novel. Since most searches for Cujo will aim at that meaning, your site will be perceived by Google as usually irrelevant in comparison with searchers' intentions. And that doesn't help SEO.
- Is it possible to increase my site's SEO by getting blogs/other websites to provide back links to my site?
If you got 5000 sites to link to your site using the same keyword, you'd likely be flagged for spam and for attempting to manipulate the search results. That is an old-school attempt at SEOing a site that Google and the other search engines have already developed algorithmic answers to. There are three aspects to building up your search rankings:
1) On-Site Optimization: Your site has to be coded in a way that is search- and mobile-friendly. You need to optimize your content for searchers' topical interests (keywords) and give your visitors a great on-site experience by focusing on usability.
2) Content: You need to create and publish awesome content that fills the needs of the audience you're trying to reach. Write blog posts and create other forms of content that answer questions, provide tips, and map out solutions that truly illustrate you are an authority on the topic.
3) Social Engagement / Links: Links are an important part of the algorithm, but getting a bunch of sites to link to you using keywords is the wrong approach. You need to be engaging on social media and (to a far lesser extent) socializing your content. The more you engage, the more others will socialize your content for you, which is where authority is really built.
- What is a reasonable price for SEO services?
The cost of SEO depends a great deal on three things:
1) The specific services being offered (SEO, social media, content strategy, etc.)
2) The degree to which those services will be implemented (how many hours per month)
3) The skill and experience level of the SEOs involved.
$800/month is a pretty small investment if you consider all the layers to making a web marketing campaign successful. My company will usually not touch any fully-managed web marketing campaign for less than $1500/month, and that's at the low end of the aggressiveness meter even for a pretty basic site. A larger ecommerce site might start around $5K and go up from there.
Overall, you want to look for value and results. You'll want to know how long the SEO has been in business, and when looking at proposals you need to understand the amount of time the company will be investing in the campaign. You also want to spell out your goals and establish expectations on how/when those goals will be achieved. When the SEO and client don't have the same expectations, that can lead to issues later on. But if you know what the goals are, you can both be on the same page from the start.
- How important is it for SEO to have an up-to-date sitemap for an online marketplace?
Hi there - There are two parts to the answer.
First, it's very important to have a site map that is both current and does not contain any URLs that return a status code other than a 200. Even 301-redirected URLs should not be in there. Also, a site map should not have more than 50,000 URLs; if it has more, create multiple site maps and list them in an index site map.
Second, site maps are a great way to get content *discovered*, but they won't necessarily make it rank well. You need the URLs to also be easily discoverable on your site through internal linking.
I hope this helps. Feel free to book a call with me if you'd like to discuss more.
John
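To make the 50,000-URL limit concrete, here's a rough sketch (the URL list, file names, and hosting path are invented) of splitting a large URL set into child site maps plus an index site map:

```python
MAX_URLS = 50_000  # per-sitemap limit from the sitemap protocol

def write_sitemaps(urls, base="https://example.com/sitemaps"):
    # Chunk the full URL list into groups of at most 50,000
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")
    # The index site map lists each child so crawlers can find them all
    with open("sitemap-index.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{base}/sitemap-{n}.xml</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```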