As an SEO consultant, site relaunches stress me the hell out. Sure, they’re a necessity. To stay competitive in search, you have to keep your site on pace with (or ahead of) trends in technology, user experience, and, of course, Google’s capricious guidelines.
But relaunching without an airtight, SEO-focused plan is a death sentence.
In three years, I’ve consulted on over 200 site relaunches. And every time, I’ve dug through search results for a rock-solid SEO site launch checklist. Yes, I should know the steps by heart by now. But that’s what this post is for: to serve as a master reference for preserving rankings and traffic through the sometimes-grueling site relaunch process.
2017 Site Launch SEO Checklist
Big or Small Site?
Something important to note: you can handle your relaunch differently depending on your site’s size.
If you have a small site – let’s say fewer than 10,000 pages – you’re clear to relaunch all at once. Don’t be scared. Rip that bandage right off.
For those massive sites, though, it’s perfectly acceptable to migrate in batches. That way, you can test things out slowly and catch and fix errors with relative agility.
Data Collection and Benchmarking
- Block crawlers from accessing and indexing the development site
- Crawl and export current site. We love Screaming Frog for this step. Crawl your live site to collect important data on meta tags and status codes, but also to perform a quick pre-launch audit. Website relaunches are times for reflection. Use your crawler to diagnose SEO issues on the live site and correct them during the build.
- Check index status. Search site:yoursite.com or check Google Search Console’s Index Status report to see how many of your site’s pages are currently indexed.
- Refresh your keyword research. Make sure you’re targeting the highest-value keywords you can. Refreshed keyword research will also help greatly as you determine new URL structures.
- Determine your most valuable pages:
- Pages with highest organic traffic
- Pages with most external links. These are the pages you want to preserve, obviously. Take extra care throughout the process to make sure you don’t muck them up.
- Benchmark average weekly organic traffic for key pages (total traffic, mobile traffic, and desktop traffic)
- Benchmark rankings (we love STAT for this)
- Collect server logs (for diagnosing crawl issues later)
- Determine final URL structures
- Architecture follows parent-and-child structure
- Key URLs incorporate keywords
- Important legacy code still functional
- Robots.txt file created, uploaded
- Disallows crawl traps
- Disallows private user sections
- References XML sitemaps
- XML sitemaps created (including specialty sitemaps for images, videos, blog, etc. where applicable)
- XML sitemaps are free from “dirt” like non-200 and non-canonical URLs
- Site has a mobile version (standalone, dynamic-serve, or responsive)
- Site features function properly on mobile
- No pages 404ing on mobile only
- Mobile site free from interstitials
- Content is sized for mobile viewports
- Tap targets are finger-friendly
- Text size is readable on mobile
- Site enables browser caching
- Site enables compression
- Total resource requests (and round trips) kept to a minimum
- Images (and other hosted resources) are under 100kb
- Page HTML is under 35kb
- Body copy is thorough, user-focused, and incorporates topical and semantic keywords
- Page only contains one H1 tag per section (HTML5). We used to say "one H1 per page." HTML5, however, introduced content sections. So, for example, you can have an H1 in the <header> and one for your <article> or <section>.
- Hierarchical use of header tags
- Thoughtful use of keywords in headers
- Body copy contains internal links to pages you intend to rank
- Internal links are absolute, not relative
- Title tags are roughly 50-70 characters and feature keywords towards the front
- Meta descriptions are concise and feature calls to action to entice clicks
- Body content is not duplicated on other pages
- No hidden content or cloaked links
- 404 page is helpful
- Images have descriptive alt text attributes and file names that include keywords. Remember: the image alt text attribute is first and foremost an accessibility feature. Write alt text that would be helpful to a blind 3rd-grader using a screen reader. If you can incorporate keywords, great, but focus on the user.
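Several of the on-page checks above (title length in particular) are easy to automate against a crawl export. Here's a minimal sketch in Python; the inline CSV and its "url"/"title" headers are stand-ins for whatever columns your crawler (e.g., Screaming Frog) actually exports:

```python
import csv
import io

def audit_titles(rows, min_len=50, max_len=70):
    """Flag pages whose title tags are missing or fall outside the target length range."""
    issues = []
    for row in rows:
        title = row["title"].strip()
        if not title:
            issues.append((row["url"], "missing title"))
        elif len(title) < min_len:
            issues.append((row["url"], "title too short"))
        elif len(title) > max_len:
            issues.append((row["url"], "title too long"))
    return issues

# Inline CSV standing in for a real crawl export (hypothetical URLs).
sample = io.StringIO(
    "url,title\n"
    "/widgets,Blue Widgets for Sale | Acme Widget Co. Free Shipping\n"
    "/about,About\n"
)
for url, problem in audit_titles(csv.DictReader(sample)):
    print(url, problem)  # → /about title too short
```

Run the same kind of pass for missing meta descriptions and empty alt attributes before launch, not after.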
You may hear normalization referred to as canonicalization; the two terms mean the same thing. In a perfect SEO world, every unique page resolves to one clean URL. You can make this happen through your server configuration files, like web.config on IIS or .htaccess on Apache servers.
- Duplicate pages resolve to single canonical URL:
- Single case
- Naked or www. (or whatever your subdomain is)
- HTTP or HTTPS
- Trailing slash or not
- Pages contain self-referential rel=”canonical” tags
- Print pages contain rel=”canonical” tags pointing to non-print versions
- Install web analytics
- Reconfigure goal tracking to capture any new site conversions
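The hostname and protocol rules above can be expressed directly in server config. A minimal sketch for Apache's .htaccess (example.com is a placeholder hostname, and this assumes mod_rewrite is enabled; IIS users would do the equivalent with a rewrite rule in web.config). Combining the conditions into one rule means a single 301, not a daisy chain:

```apache
RewriteEngine On

# Anything non-HTTPS or non-www redirects once to https://www.example.com/...
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```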
Just Before Launch
If you can only spend time on one part of site relaunch SEO, then spend it on your redirects.
- Map old URLs to their closest counterpart new URLs. We're going for as close to a 1-to-1 page match as possible. Consider how much time you have and prioritize redirection based on the benchmarking you definitely did. Don't miss your most-clicked and most-linked pages; redirect those to the most similar pages on the new site.
- Redirects are ready to implement
- Server isn’t defaulting 301s to 302s. Some servers (looking at you, IIS) come with funky rules that switch 301s to 302s. Check for that.
- No daisy-chained redirects (or, if they're unavoidable, fewer than five hops in a chain)
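Both of the last two checks can be scripted. Here's a sketch of walking a redirect chain with a pluggable fetch function so the logic is testable offline; in practice, fetch would issue an HTTP HEAD request and return the status code and Location header. All URLs and the CHAIN mapping below are hypothetical:

```python
def walk_redirects(url, fetch, max_hops=5):
    """Follow a redirect chain, reporting the final URL, hop count,
    whether any hop used a non-301 status, and whether the chain is too long."""
    hops, non_301 = 0, False
    while True:
        status, location = fetch(url)
        if status not in (301, 302, 307, 308) or location is None:
            return {"final": url, "hops": hops,
                    "non_301": non_301, "too_long": hops > max_hops}
        if status != 301:
            non_301 = True  # e.g., a server silently downgrading to 302
        url = location
        hops += 1
        if hops > max_hops:
            return {"final": url, "hops": hops,
                    "non_301": non_301, "too_long": True}

# Stand-in fetcher: a dict of url -> (status, location) pairs.
CHAIN = {"/old": (301, "/interim"), "/interim": (302, "/new"), "/new": (200, None)}
fake_fetch = lambda u: CHAIN.get(u, (404, None))
print(walk_redirects("/old", fake_fetch))
```

Run every old URL from your redirect map through a check like this before launch; any result with non_301 or too_long set is a bug in your server config.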
During Launch Window
- Verify your property in Google Search Console. GSC's property sets are a godsend for site relaunches that involve switching or combining domains or subdomains. Verify your current and development sites as a property set, if possible, and you'll have a much clearer view of the transition.
- Verify your property in Bing Webmaster Tools
- Submit XML sitemaps to GSC and BWT
- Use GSC robots.txt Tester to make sure your directives are working properly
- Crawl the newly live site and check for:
- Unintended 400 and 500 errors
- Missing meta data
- Crawl traps
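The missing-metadata check in that crawl can be sketched with Python's standard-library HTML parser; your crawler likely does this for you, but a quick script like this (the sample HTML is illustrative) is handy for spot-checking individual pages during the launch window:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from one HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def missing_meta(html):
    parser = MetaAudit()
    parser.feed(html)
    problems = []
    if not parser.title.strip():
        problems.append("missing title")
    if not (parser.description or "").strip():
        problems.append("missing meta description")
    return problems

print(missing_meta("<html><head><title>Hi</title></head></html>"))
# → ['missing meta description']
```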
Consider the post-launch period "however long it takes the dust to settle." Depending on the size of your site, that can be anywhere from two weeks to three months or more, especially if you made major alterations to your URLs.
- Check GSC Crawl Errors report for unexpected 404s, 500s, etc.
- Check GSC Crawl Stats report
- Analyze your server logs to see where crawlers are spending their time
- Monitor traffic against benchmarks
- Monitor rankings against benchmarks
- Give yourself a round of applause
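The server-log step above can be sketched in a few lines of Python. This assumes access logs in Apache/Nginx Combined Log Format, one request per line; note that matching on the user-agent string alone can be spoofed, so for real verification you'd also reverse-DNS the requesting IP:

```python
import re
from collections import Counter

# Simplified Combined Log Format pattern: path, status code, user agent.
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+)[^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)

def googlebot_hits(lines):
    """Count which paths Googlebot requested, given raw access-log lines."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group(3):
            hits[m.group(1)] += 1
    return hits

# Two illustrative log lines: one Googlebot request, one regular visitor.
sample = [
    '66.249.66.1 - - [10/Oct/2017:13:55:36 +0000] "GET /widgets HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.9 - - [10/Oct/2017:13:55:40 +0000] "GET /widgets HTTP/1.1" 200 2326 "-" "Mozilla/5.0"',
]
print(googlebot_hits(sample))
```

If Googlebot is burning its crawl budget on parameters, crawl traps, or redirected URLs instead of your most valuable pages, this is where you'll see it first.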
That's it. You did it. You relaunched a brand new site while preserving rankings, traffic, and revenue. You're a hero, and your hero status will be reflected in your paycheck.