
Ecommerce SEO Migration Checklist

Migrating an ecommerce site is a large undertaking, with many complex considerations and a long list of things to do. This guide will help ease your mind by providing an actionable checklist to ensure you maintain as much organic traffic as possible through the cutover. We’ll cover some SEO basics as well as more complex configuration considerations.

We’ll be reviewing:

● Redirect Strategy

● Unique Page Titles & Descriptions

● Sitemap.xml Configuration

● Robots.txt Configuration

● URL Naming Conventions and Facet Configuration

● Schema Markup

● Custom 404 Pages

● Using a Crawler to Identify Potential Errors

● Monitoring Changes & Errors in Search Console

● Modifying Default Canonical Behavior

Note: This is not an exhaustive list. There is a lot to consider around benchmarking your old site, testing before going live, and post-implementation monitoring. We highly suggest consulting with digital marketing experts who have experience with site migrations.

Redirect Strategy

An important element to nail down before making the cutover from your old website is ensuring that all of your pages (particularly the high-traffic pages with backlinks) are 301 redirected to the new relevant URL.

To ensure this is the case, you’re going to want to get a list of all the pages currently on your website. We’d recommend gathering URLs from:

● Current Sitemap

● Website Crawl

● Analytics data from the last year

● Export of backlinks pages from Search Console

Without getting too tangled in the technical weeds, all you’ll want to do is make a list of every page on your current website, prioritized by page traffic and backlinks. Working down the list, ensure every URL on your current site has a redirect set up to the new corresponding URL.
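If you want to script the inventory step, here’s a minimal Python sketch that merges URL exports from your sitemap, crawl, and analytics into one prioritized worklist. The file names and column names (url, sessions, backlinks) are assumptions; adjust them to match whatever your tools actually export.

```python
import csv

# Hypothetical exports -- adjust file and column names to your tools.
# sitemap_urls.txt / crawl_urls.txt: one URL per line
# analytics.csv: url,sessions   backlinks.csv: url,backlinks

def read_lines(path):
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def read_metric(path, column):
    metrics = {}
    with open(path) as f:
        for row in csv.DictReader(f):
            metrics[row["url"]] = int(row[column])
    return metrics

urls = read_lines("sitemap_urls.txt") | read_lines("crawl_urls.txt")
sessions = read_metric("analytics.csv", "sessions")
backlinks = read_metric("backlinks.csv", "backlinks")
urls |= set(sessions) | set(backlinks)

# Sort so the highest-traffic, most-linked pages get redirects first.
prioritized = sorted(
    urls,
    key=lambda u: (sessions.get(u, 0), backlinks.get(u, 0)),
    reverse=True,
)

with open("redirect_worklist.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["old_url", "sessions", "backlinks", "new_url"])
    for u in prioritized:
        # new_url column left blank: fill in the mapping by hand
        writer.writerow([u, sessions.get(u, 0), backlinks.get(u, 0), ""])
```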

Depending on what platform you were previously using, you may not have to make numerous changes to your categories or product pages. It really depends on how your new site is configured from a URL standpoint. 

If you’re coming off a legacy system, you may have random numbers in your URL strings, such as example.com/category-78/product-232098, while your new product URL would be example.com/product-name.

If this is the case (a product URL changing from a subfolder to the root level), you’re going to need to redirect a lot of URLs. This can be done individually via import, or in bulk with redirect rules using regex. Ideally, though, your new site would be configured so that not many of these URLs change.
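To illustrate the regex approach, here’s a hedged Python sketch that pulls the legacy product ID out of an old URL and looks it up against a slug table exported from your catalog. The URL pattern and the slug_by_id mapping are hypothetical; the same logic could be used to generate a bulk redirect import file.

```python
import re

# Hypothetical mapping from legacy product IDs to new URL slugs,
# e.g. exported from your product catalog.
slug_by_id = {
    "232098": "product-name",
}

legacy_pattern = re.compile(r"^/category-\d+/product-(\d+)$")

def new_url(old_path):
    """Return the new root-level path for a legacy URL, or None."""
    match = legacy_pattern.match(old_path)
    if match and match.group(1) in slug_by_id:
        return "/" + slug_by_id[match.group(1)]
    return None  # no rule matched; handle this URL manually

print(new_url("/category-78/product-232098"))  # -> /product-name
```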


Tadpull Insight: To avoid redirect chains, make sure to update any of your old redirects to the new corresponding URL on launch. 

For example, if you have a historic redirect of:

example.com/shoes → example.com/shoes-and-boots

But your new URL is:

example.com/shoes-boots 

You’ll want to set up a redirect for:

example.com/shoes → example.com/shoes-boots

This cuts down on the number of hops needed to resolve the redirect, which makes it faster. Realistically, having a redirect chain of three URLs probably isn’t going to make you lose all your link equity. But as the number of redirects in a chain increases, the likelihood of a crawler continuing to follow them decreases.

Google’s John Mueller recommends no more than 5 redirects in a chain. That said, if you have the ability to make a redirect more direct, you might as well clean it up before it gets out of hand.

Tadpull Insight: You’re still going to want to have a redirect set up for /shoes-and-boots → /shoes-boots. But having one from your original URL will make your page load slightly faster. 
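If you want to spot-check for chains after launch, a short script like this sketch (using the third-party requests library) reports how many hops each old URL takes before resolving. The URLs listed are placeholders.

```python
import requests

OLD_URLS = [
    "https://example.com/shoes",
    "https://example.com/shoes-and-boots",
]

for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = resp.history  # one Response per redirect followed
    print(f"{url} -> {resp.url} "
          f"({len(hops)} redirect(s), final status {resp.status_code})")
    if len(hops) > 1:
        # More than one hop means a chain worth flattening.
        print("  chain detected:", " -> ".join(r.url for r in hops))
```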

Unique Page Titles and Meta Descriptions

Unique titles and meta descriptions are crucial first impressions: they set user expectations about what will be found on a page, and they’re what Google uses to contextualize a page in its massive catalog of information.

If we have a bunch of page titles that are the same, Google may see this content as duplicated and be less likely to rank it. For example, if we sell t-shirts and we have 30 pages titled “Grey T-Shirt”, it’s unlikely that all of these pages will be indexed and eligible to show up in the search results.


Most ERPs can be configured to make this process a whole lot quicker by dynamically pulling in your product’s title and description and using it as the page’s title and meta description. Similar configurations can be done with commerce categories, although we’d recommend putting a bit more time into these — many times these category pages will outrank product pages.

  • Product Page Example

    Title: Patagonia Lightweight Rain Jacket

    Description: This jacket is designed from lightweight…
  • Commerce Category Dynamic Example

    Title: Outerwear > Jackets > Rain Jackets

    Description: nothing maps cleanly, so Google will pull in random elements of the page

Product pages have many fields that could be easily mapped to the page title and meta description. But commerce categories are a bit harder, as there isn’t a field that easily maps to the meta description. This is one of the reasons we’d recommend creating custom titles and meta descriptions for your commerce categories.

When creating custom titles, we like to review the following:

  • What our category is currently ranking for 
  • Potential for various keywords we’re not ranking particularly well for 
  • Associated search volume 
  • How the competition is positioning their category 

After reviewing all of these, we can craft a unique title and meta description that speaks to the category from both a user and keyword perspective — without discounting the keywords we’re already ranking for.
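For instance, a hand-crafted category head might end up looking something like this (the copy is purely illustrative):

```html
<head>
  <!-- Leads with the keyword we already rank for, then broadens -->
  <title>Lightweight Rain Jackets for Hiking &amp; Travel | Example Outfitters</title>
  <meta name="description" content="Shop packable, lightweight rain jackets
    from top outdoor brands, tested on the trail. Free returns on every order.">
</head>
```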

There are many situations where we’re already ranking on some good keywords. For example, let’s say we’re in position 2 for “lightweight rain jackets”. If we see that “rain jacket” has far more search volume and then over-optimize around it, we may end up losing our position for “lightweight rain jackets”.

Robots.txt File

Your robots.txt file controls how search engines and other crawlers access the content of your website. It’s one of the first places they’ll go to understand what content you want crawled and indexed. Keep in mind that robots.txt is powerful and shouldn’t be played around with: a single wrong directive in this file can stop your whole site from being indexed.

The main thing you should have in your robots.txt file is a link to your sitemap(s) and directives that would stop a crawler from accessing gated content or parameterized URLs on your website that you don’t want to be ranked.
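As a sketch, a minimal ecommerce robots.txt might look like the following. The disallowed paths here are placeholders; swap in the gated or parameterized paths that actually exist on your site.

```
# Example robots.txt -- adjust paths to your own site
User-agent: *
Disallow: /checkout/
Disallow: /account/
Disallow: /*?sort=

Sitemap: https://example.com/sitemap.xml
```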

Tadpull Insight: It’s hard to predict what Google will crawl and index. We’ve seen many situations where Google ends up crawling and ranking a page, even though — theoretically — it shouldn’t be, based on our robots.txt directives.

For small sites with fewer than 1,000 products, it’s unlikely that you need many directives in this file. Even if Google crawled every page and every parameterized URL variation (e.g. /category?color=blue,red), it would still have enough crawl budget to get to all of your pages.

The robots.txt file is particularly helpful for sites with so many pages that Google would run out of crawl budget if it decided to crawl every facet and option available. Google will only dedicate so many resources to crawling your site, so we want to make sure we’re not wasting them on URLs that don’t add any value (e.g. parameterized URL variations).

Many ecommerce websites won’t need very much in this file; more often than not, indexing is controlled at the template level with meta robots directives. Meta robots tags are essentially mini robots.txt directives on each of your pages that tell crawlers whether you want that page indexed and crawled.
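For example, a page template you don’t want indexed would include a tag like this in its <head>:

```html
<!-- Keep this page out of the index, but still follow its links -->
<meta name="robots" content="noindex, follow">
```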

Sitemap.xml

A sitemap gives search engines a map of all of the pages on your site that you deem worthwhile to crawl. The robots.txt and sitemap.xml files are among the first places a crawler will go, which is why linking your sitemap from robots.txt is beneficial. Most platforms automatically generate sitemaps, but it's important to check that any pages you do not want crawled are not included.
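For reference, a sitemap.xml is just an XML list of URLs, along these lines (the URLs and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/jackets</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/jackets/brand/patagonia</loc>
  </url>
</urlset>
```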

You'll also want to ensure you submit your sitemap URL to Google Search Console after the new site launches, so Google knows the new location.

Facets SEO Subtab

Facets, by default, will show up as subfolders in the URL for newer versions of NetSuite. 

This is a good “failsafe” for those who aren’t comfortable with SEO. That said, there are specific advantages to having some of your facets show up as subfolders:

Facet As Parameter

example.com/jackets?brand=patagonia

Facet As Subfolder

example.com/jackets/brand/patagonia

In the vast majority of situations, you wouldn’t want a facet as a subfolder — you only want this if you want that URL indexed. A couple of questions to ask yourself are:

  • Does this facet typically have enough unique content to justify having the URL indexed?
  • Would the content of this faceted search provide better ranking opportunities for valuable keywords?

If you answered yes to both, then set up the facet as a subfolder. 

If you’re a retailer that sells multiple brands, we’d typically recommend having “Brand” set up as a subfolder. This is because there are many brand-specific searches, and having your URLs structured in a way that allows you to rank for these keywords can be beneficial.

Parameterized URLs are typically not set up to be indexed, because the default canonical behavior in NetSuite ensures that only your main category gets indexed. A URL such as example.com/jackets?brand=patagonia would have a canonical tag pointing to example.com/jackets, which tells search engines that we only want /jackets to rank.
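Concretely, the difference comes down to the canonical tag in each page’s <head>. These snippets are illustrative:

```html
<!-- On example.com/jackets?brand=patagonia (default): points at the main category -->
<link rel="canonical" href="https://example.com/jackets">

<!-- On example.com/jackets/brand/patagonia (facet as subfolder):
     self-referencing, so the page itself can be indexed -->
<link rel="canonical" href="https://example.com/jackets/brand/patagonia">
```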

By setting up “brand” as a subfolder, we’re changing the canonical behavior to allow example.com/jackets/brand/patagonia to be indexed. This is a good idea because many people search for branded products. 

For example:

● Patagonia jackets

● Black Diamond Headlamp

● Salomon Hiking Shoes

Having a unique, indexable URL like /jackets/brand/patagonia provides more ranking opportunities than funneling everything into one broad URL like /jackets, which would otherwise be competing for a lot of different branded searches on its own.

Different industries will have different use cases for having facets listed as a parameter or subfolder, so we’d recommend thinking through what pages you actually want to rank.

Final Word from Tadpull: Nailing down your URL structure before launch is incredibly important. If you end up changing this a few months in, it's a huge undertaking from a redirect standpoint and has the potential to significantly damage the rankings you’ve worked so hard to build.

Facet Naming Conventions

NetSuite gives you the option to control the URL component of each facet:

Setup > SuiteCommerceAdvanced > URL Components for Facets 

Make sure to name each of your facets’ URL components with a user- and search-engine-friendly name: instead of a non-user-friendly component such as /custitem_size/, make sure it has /size/ as the URL component.

Creating user-friendly URLs allows for a more focused keyword strategy for each URL, provides greater opportunities for ranking, and simply looks better.

Schema Markup

Another important element to get right before launching — particularly if you plan on running Google Shopping campaigns — is schema markup. Schema is a form of microdata that helps search engines better understand and categorize your content. Typically Tadpull recommends having organization, product, and review schema at a minimum. Many ecommerce platforms have this functionality built in out of the box, but if you're doing a lot of customization or not completely filling in your product data, it can lead to errors.
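For reference, product schema is typically emitted as JSON-LD in the page’s <head>. Here’s a trimmed, illustrative example; the values are placeholders, and you should check Google’s documentation for the full list of required and recommended fields.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Patagonia Lightweight Rain Jacket",
  "image": "https://example.com/images/rain-jacket.jpg",
  "description": "This jacket is designed from lightweight...",
  "brand": { "@type": "Brand", "name": "Patagonia" },
  "offers": {
    "@type": "Offer",
    "price": "129.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/InStock"
  },
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "38"
  }
}
</script>
```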

Troubleshooting Your Schema Markup

There are a few ways to troubleshoot your page’s schema markup:

  1. Google’s Rich Results Test: The most important page types to test are your product pages. While category pages and your homepage will also have markup, make sure your product page markup is dialed in.

    Drop one of your product pages into the tool; if you have multiple page templates for product pages, do the same for each template. At a bare minimum, meet all of the required fields listed by Google.
  2. Site Crawler: Prior to go-live, crawl your site with one of the many crawling tools across the web — such as ScreamingFrog, DeepCrawl, and SEMrush — to verify that all your various page templates are marked up appropriately.
  3. Google’s Search Console: If you’ve got Search Console set up on your site (and do that now if you haven’t already), navigate to Enhancements > Products.

    Here, you will find the number of products Google has markup for and any associated errors.

Tadpull Insight: This one is more helpful post-launch; you won’t be able to see your new site’s schema markup here.

Custom 404 Pages

With any site migration, regardless of how careful you are, a handful of users might visit one of your old pages that you forgot to redirect. In this scenario, having a custom 404 page that is helpful or fun is a great way to create a positive brand experience. 

Standard elements on a 404 page include:

● Letting the user know this page no longer exists

● Imagery or GIFs that are fun to interact with or make you laugh

● Links to other helpful resources that they might be looking for

● Search functionality so the user can find what they're looking for faster

In an ideal world, the links you supply would be closely related to the URL that a user was originally seeking, but many times this is a code-intensive endeavor and you might not have all the relevant information from your old platform available. If that’s the case, stick to the ideas above to make sure you give context as to why a user isn’t seeing what they expected. Make them laugh and do your best to keep them on your site with relevant links. Here's a fun example from Lego.com.
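To make the list above concrete, here’s a purely illustrative HTML sketch of a 404 template covering those elements; the copy, image path, and search endpoint are all placeholders:

```html
<!-- Illustrative custom 404 template -->
<main class="not-found">
  <h1>Oops! That page has moved or no longer exists.</h1>
  <!-- A fun image or GIF softens the dead end -->
  <img src="/images/404-fun.gif" alt="Our mascot looking confused">
  <!-- Search so the user can find what they came for -->
  <form action="/search">
    <input type="search" name="q" placeholder="Search the store">
    <button type="submit">Search</button>
  </form>
  <!-- Helpful links to keep the user on the site -->
  <p>Or try one of these:</p>
  <a href="/jackets">Jackets</a> · <a href="/sale">Sale</a> · <a href="/">Home</a>
</main>
```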

Using a Crawler to Identify Issues with Your Site

Looking at your site from the standpoint of a crawler helps you better understand what search engine crawlers are seeing and helps you identify technical issues you might not otherwise catch. There are a plethora of tools available to do this, including ScreamingFrog, SEMrush, DeepCrawl, Ahrefs, etc. All of these tools work in a similar fashion, crawling every page on your site and following all of the links in your pages’ HTML. Many of them also generate automated reports highlighting key technical issues with parts of your website. 

These tools identify issues with your schema markup, broken links on your site, problems with canonical tags, URL generation issues, and orphaned pages, just to name a few. We won’t get into what all of these mean in this post, but it’s worthwhile to note that — if you’re unfamiliar with technical SEO and crawl issues in general — you may get your first report back and be incredibly worried. 

But don’t immediately panic. Not everything that is identified as an error is detrimental to your website’s health or ability to rank on a given keyword term. Every website has its imperfections, and just because a crawler says you have 200 duplicate page titles and descriptions, it doesn’t mean they can’t rank.

Prioritize the errors that will have the largest impact on rankings. Broken links, 404ing pages, major canonical issues, and issues with URL generation will probably be where you’ll want to start cleaning things up.
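As a rough sketch of that triage, assuming your crawler can export issues to a CSV with url and issue columns (an assumption; column names vary by tool), you could bucket the findings like so:

```python
import csv
from collections import Counter

# Severity buckets are a judgment call; tune to your own priorities.
HIGH_PRIORITY = {"broken link", "404", "canonical", "redirect chain"}

counts = Counter()
high_priority_urls = []

with open("crawl_issues.csv") as f:  # hypothetical crawler export
    for row in csv.DictReader(f):
        issue = row["issue"].lower()
        counts[issue] += 1
        if any(tag in issue for tag in HIGH_PRIORITY):
            high_priority_urls.append((row["url"], issue))

print("Issue counts:", counts.most_common())
print("Fix these first:")
for url, issue in high_priority_urls:
    print(f"  {url}: {issue}")
```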

The information we just covered can be a lot to digest. Got questions? Reach out to us for a consultation here.

Featured Image: freepik
