How to relaunch a website and keep its rankings and SERP positions
Fact is: any website relaunch means losses. Losses of traffic, of rankings, of SERP positions, and ultimately of money. I've seen losses of 4% and of 40%, and those aren't the extremes. The main objective of SEO in a relaunch is to minimize this negative impact.
In short, every website relaunch is a change of design and/or content structure. If the content structure changes, the URL structure inevitably changes with it. So there are generally two areas where the new version of the website can meet poor acceptance and then lose traffic:
- users may dislike the new design,
- search engines may stumble over indexing the new URL structure, so that some pages drop out of searchers' view.
Our main principle should be:
to win a battle, avoid it.
How to handle a design relaunch
Switching the new design on and then starting a battle to convince visitors that it is better is unproductive. Don't confront your visitors with a completely new design from one day to the next. Let your visitors decide which design improvements are better. And the better improvements are simply the ones your visitors like :) Run A/B tests with small design improvements and make the successful ones permanent. This way you profit in three ways:
- you get a website that converts better and better,
- you completely avoid scaring your visitors away with a sudden redesign, and
- you completely avoid the stress of a time-pressed relaunch.
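The incremental approach above can be sketched in code: a deterministic bucketing function (a minimal sketch; the visitor ID and experiment name below are placeholders) assigns each visitor a stable variant, so returning visitors always see the same version for the duration of a test.

```python
import hashlib

def ab_variant(visitor_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a visitor to a test variant.

    Hashing visitor + experiment means the same visitor always sees
    the same version, without storing any assignment server-side.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# The assignment is stable across page views:
print(ab_variant("visitor-42", "new-header"))
```

Because the assignment is a pure function of the inputs, no cookie or database lookup is strictly required, though a cookie is still useful for analytics.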
How to handle a relaunch of the content structure
Let's talk about the case where the URL structure of the whole website changes. I will call the site with the new URL structure "the new version":
- Every old URL must be redirected to its corresponding new one with a 301 redirect.
- Try to change as few URLs as possible; change a URL only if there is no other way.
- Create a URL map: a spreadsheet of old URLs and their corresponding new ones. Such a list is very helpful, e.g. when you chase down 404s after the relaunch, and the developer who creates all the redirects will be thankful for it too. Mark the old URLs that have no counterpart in the new version, and the new URLs that have no counterpart in the old one.
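That URL map can also feed the redirects directly. A minimal sketch, assuming a two-column CSV export (the column names `old_path` and `new_path` are assumptions) and Apache-style `Redirect 301` rules:

```python
import csv
import io

def redirect_rules(url_map_csv: str) -> list[str]:
    """Turn a two-column URL map (old_path,new_path) into Apache
    'Redirect 301' lines. Rows with an empty new path (old URLs
    without a counterpart) are skipped; those should get a proper
    404 page instead of a redirect."""
    rules = []
    for row in csv.DictReader(io.StringIO(url_map_csv)):
        if row["new_path"]:
            rules.append(f'Redirect 301 {row["old_path"]} {row["new_path"]}')
    return rules

url_map = """old_path,new_path
/products.html,/shop/
/about.html,/company/about/
/legacy.html,
"""
for rule in redirect_rules(url_map):
    print(rule)
```

Generating the rules from the same spreadsheet the team reviews keeps the redirects and the documentation from drifting apart.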
- If there are old URLs without corresponding URLs in the new version, try to create content pages for them.
- If there are new URLs without a counterpart in the old version, try to get external backlinks to these URLs.
- For old URLs that definitely remain without a new counterpart, create helpful 404 pages, and make sure the server actually responds with a 404 header.
- Keep sitemaps free of redirects: the old site's sitemap lists old URLs, the new site's sitemap lists new URLs.
- Don't forget to update the canonical tags from the old pages to the new ones at the same time as you create the 301 redirects.
- If the new version uses a different protocol than the old one (http→https) and/or different document extensions (html→php etc.), don't forget to reflect these changes in the new URL structure and the redirects too.
- If the relaunching site is so big that creating a full URL mapping isn't possible, and the URL structure is so irregular that redirects can't be built with regular expressions and patterns, you must select which pages get a manually created 301 redirect and which go without. Look into your Google Analytics (at least the last year) and select for manual 301 redirects the pages with:
- most social signals,
- highest PageRank,
- highest number of conversions,
- most entrances and visits,
- most pageviews, or
- highest commercial value.
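The prioritization above can be sketched as a simple scoring pass over an Analytics export. The field names and weights below are assumptions; adjust them to whatever your export and business priorities actually look like:

```python
def top_pages_for_redirects(pages: list[dict], limit: int = 3) -> list[str]:
    """Rank pages from an Analytics export and return the URLs that
    most deserve a manually created 301 redirect. The weights are
    placeholder assumptions, not a recommendation."""
    def score(p: dict) -> float:
        return (p.get("conversions", 0) * 10
                + p.get("entrances", 0)
                + p.get("pageviews", 0) * 0.1
                + p.get("social_signals", 0) * 2)
    ranked = sorted(pages, key=score, reverse=True)
    return [p["url"] for p in ranked[:limit]]

pages = [
    {"url": "/a", "conversions": 5, "entrances": 100, "pageviews": 900},
    {"url": "/b", "conversions": 0, "entrances": 20, "pageviews": 5000},
    {"url": "/c", "conversions": 1, "entrances": 900, "pageviews": 100},
]
print(top_pages_for_redirects(pages, limit=2))
```

Even a rough score like this beats picking pages by gut feeling, because it forces the criteria to be written down and discussed.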
- Another voluminous but necessary task that lies ahead is to re-establish the internal linking structure based on the new URLs (beyond the new menu structure, of course).
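To find internal links that still point at old URLs, the URL map can be reused. A regex-based sketch (a real audit would parse the HTML properly and crawl every page):

```python
import re

def stale_internal_links(html: str, url_map: dict[str, str]) -> list[tuple[str, str]]:
    """Find hrefs that still point to old URLs and report the new
    target from the URL map. Regex matching of href attributes is a
    shortcut for the sketch; use an HTML parser for production."""
    stale = []
    for href in re.findall(r'href="([^"]+)"', html):
        if href in url_map:
            stale.append((href, url_map[href]))
    return stale

html = '<a href="/products.html">Shop</a> <a href="/shop/">New</a>'
url_map = {"/products.html": "/shop/"}
print(stale_internal_links(html, url_map))  # [('/products.html', '/shop/')]
```

Internal links should point directly at the new URLs rather than ride on the 301 redirects, so a report like this belongs in the relaunch checklist.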
- To keep things clean, create a new Google Analytics ID and a new Google Webmaster Tools property for the new version, and implement them just before activating the 301 redirects.
- Don't hurry! Leave the old version of the URLs on the web, fully available to Google, after the 301 redirects are activated. For how long? At least until the new URLs are indexed. After they are indexed you can begin... no, not deleting the old version! Once you have made sure the new URLs are indexed, mark the corresponding old URLs as deprecated: set a meta robots noindex, nofollow on them, or disallow them in robots.txt.
This way Google understands that these URLs are devalued, but they don't fly out of the index instantly.
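If you go the robots.txt route, the deprecation list can be generated from the old URLs. A minimal sketch (note that `Disallow` stops crawling; it does not remove URLs from the index immediately):

```python
def robots_disallow(old_paths: list[str]) -> str:
    """Build a robots.txt section that blocks crawling of deprecated
    old URLs. Disallow prevents crawling only; pages already indexed
    drop out gradually, which is exactly the slow devaluation wanted
    here."""
    lines = ["User-agent: *"] + [f"Disallow: {p}" for p in sorted(old_paths)]
    return "\n".join(lines)

print(robots_disallow(["/old-category/", "/old-page.html"]))
```

One caveat: if a page is disallowed in robots.txt, crawlers can no longer see a meta robots noindex on it, so pick one mechanism per URL rather than combining both.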
- If you want to speed up Google's indexing routine... you can try, but manually :) Go into your Google Webmaster Tools; under Crawl you'll see the option "Fetch as Google". After clicking it you get a text field where you can enter the new URL you want indexed; then click "Fetch" and "Index".
- If you want to see whether and how many of your new URLs are already indexed, just perform a Google search with site:your-domain.tld and extract the URLs from the result pages.
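As one way to do that extraction, result URLs can be pulled out of a saved copy of the results page. The href pattern below is an assumption about the markup and will likely need adjusting to what Google actually serves:

```python
import re

def extract_result_urls(serp_html: str) -> list[str]:
    """Pull outbound result URLs out of a saved search results page.
    The href pattern is an assumption; inspect the saved HTML and
    adapt it, since the markup changes over time."""
    urls = re.findall(r'<a[^>]+href="(https?://[^"]+)"', serp_html)
    # Drop Google's own navigation links.
    return [u for u in urls if "google." not in u]

html = '<a href="https://example.com/page">r</a> <a href="https://www.google.com/x">g</a>'
print(extract_result_urls(html))  # ['https://example.com/page']
```

Comparing the extracted list against your URL map shows which new URLs are still waiting to be indexed.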
- Before going live, test everything: block every page of the site with the new URL structure from indexing with a meta robots tag (meta robots ensures the pages are reliably kept out of the index),
- activate your 301 redirects,
- take the crawling tool of your choice, e.g. Screaming Frog,
- give Screaming Frog the list of old URLs and make sure the number of URLs is correct, and
- let the crawling tool crawl your site with the new URL structure criss-cross, then review and fix errors such as missing 301 redirects, 404s, wrong Analytics implementation etc.
- Only after this procedure remove the blocking meta robots tags.
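The crawl review in the steps above can be automated against the URL map. A sketch that compares crawl results (old URL mapped to observed status code and Location header; gathering these with your crawler of choice is assumed) with the expected targets:

```python
def redirect_errors(url_map: dict, crawl: dict) -> list[str]:
    """Compare crawl results with the URL map and report every old
    URL that is not a correct 301 to its expected new target.

    url_map: old URL -> expected new URL
    crawl:   old URL -> (status_code, location_header_or_None)
    """
    errors = []
    for old, new in url_map.items():
        status, location = crawl.get(old, (None, None))
        if status != 301:
            errors.append(f"{old}: expected 301, got {status}")
        elif location != new:
            errors.append(f"{old}: redirects to {location}, expected {new}")
    return errors

url_map = {"/products.html": "/shop/", "/about.html": "/company/"}
crawl = {"/products.html": (301, "/shop/"), "/about.html": (404, None)}
print(redirect_errors(url_map, crawl))
```

Run this after every redirect deployment, not just once: redirect rules have a habit of being overwritten by later server configuration changes.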