A Checklist To Avoid Unexpected SEO Mistakes Before Launching Your eCommerce Site On NetSuite

A checklist for final SuiteCommerce SEO pre-launch considerations.

Go-Live Checklist for NetSuite SEO on SCA

Migrating a site to NetSuite SCA can be a large undertaking — one with many complex considerations and a list of things to do. 

This guide will help ease your mind by providing an actionable checklist to ensure you maintain as much of the organic traffic as possible while making the cutover.

We’ll cover some SEO basics, where to find them in NetSuite, and more complex configuration considerations within the platform.

We’ll be reviewing:

● Your Redirect Strategy

● Unique Page Titles & Descriptions

● Sitemap.xml Configuration

● Robots.txt Configuration

● URL Naming Conventions and Facet Configuration

● Schema Markup

● Custom 404 Pages

● Using a Crawler to Identify Potential Errors

● Monitoring Changes & Errors in Search Console

● Modifying Default Canonical Behavior

Tadpull Insight: This is not an exhaustive list. There is a lot that needs to be considered for benchmarking your old site, testing before going live, and post-implementation monitoring. We plan to cover these at a future date.

Your Redirect Strategy

An important element to nail down before making the cutover from your old website is ensuring that all of your pages (particularly the high-traffic pages with backlinks) are 301 redirected to the new relevant URL.

To ensure this is the case, you’re going to want to get a list of all the pages currently on your website. We’d recommend gathering resources from:

● Current Sitemap

● Website Crawl

● Analytics Data from last year

● Export of backlinks pages from Search Console

Without getting too tangled in the weeds with technical details, all you’ll want to do is make a list of every page on your current website, prioritized by page traffic and backlinks.

Working down the list, ensure every URL on your current site has a redirect set up to the new corresponding URL.

Depending on what platform you were previously using, you may not have to make numerous changes to your categories or product pages. It really depends on how your new site is configured from a URL standpoint. 

If you’re coming off a legacy system, you may have random numbers in your URL strings — such as example.com/category-78/product-232098

In NetSuite, meanwhile, your new product URL would be example.com/product-name.

If this is the case (a product URL changing from being in a subfolder to the root level), you’re going to need to redirect a lot of URLs. 

This can be done individually via import, or with redirect rules using Regex.
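As a rough sketch (the exact fields and syntax vary by platform and NetSuite version, so treat this as illustrative rather than a definitive implementation): a single regex rule can cover thousands of patterned URLs, while pure-ID URLs need a one-to-one import.

# Hypothetical regex rule, assuming the old URLs end in a reusable slug:
# Old: /category-78/lightweight-rain-jacket-232098  New: /lightweight-rain-jacket
Match:       ^/category-\d+/([a-z0-9-]+)-\d+$
Redirect to: /$1  (301 permanent)

# If the old URLs are pure IDs (e.g. /category-78/product-232098), there is no
# slug to capture, so map each old URL to its new one via a CSV import instead:
Old URL,New URL
/category-78/product-232098,/lightweight-rain-jacket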

But, ideally, your new site would be configured in such a way that not many of these URLs are changing.


Tadpull Insight: To avoid redirect chains, make sure to update any of your old redirects to the new corresponding URL on launch. 

For example, if you have a historic redirect of:

example.com/shoes → example.com/shoes-and-boots

But your new URL on NetSuite is:

example.com/shoes-boots 

You’ll want to set up a redirect for:

example.com/shoes → example.com/shoes-boots

This essentially cuts down on the number of hops needed to resolve the redirect, which makes it quicker.

Realistically, having a redirect chain of three URLs probably isn’t going to make you lose all your link equity. But as the number of redirects in a chain increases, the likelihood of a crawler continuing to follow them decreases. 

Google’s John Mueller recommends no more than 5 redirects in a chain. That said, if you have the ability to make the redirect more direct, you might as well clean it up before it gets out of hand.

Tadpull Insight: You’re still going to want to have a redirect set up for /shoes-and-boots → /shoes-boots. But having one from your original URL will make your page load slightly faster. 
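Put together, the redirect map at launch would look like this, with both rules pointing straight at the new URL and no chains in between:

example.com/shoes → example.com/shoes-boots (updated from its old target)
example.com/shoes-and-boots → example.com/shoes-boots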

Unique Page Titles and Meta Descriptions

Unique titles and meta descriptions are crucial first impressions: they set user expectations about what they’ll find on a page, and they’re what Google uses to contextualize a page in its massive catalog of information.

If we have a bunch of page titles that are the same, Google may see this content as duplicated and be less likely to rank it.

For example, if we sell t-shirts and we have 30 pages titled “Grey T-Shirt”, it’s unlikely that all of these pages will be indexed and eligible to show up in the search results.


NetSuite can be configured to make this process a whole lot quicker by dynamically pulling in your product’s title and description and using it as the page’s title and meta description. 

Similar configurations can be done with commerce categories, although we’d recommend putting a bit more time into these — many times these category pages will outrank product pages.

  • Product Page Example

    Title: Patagonia Lightweight Rain Jacket

    Description: This jacket is designed from lightweight…

  • Commerce Category Dynamic Example

    Title: Outerwear > Jackets > Rain Jackets

    Description: Google will pull in random elements of the page

Product pages have many fields in the ERP that could be easily mapped to the page title and meta description. 

But commerce categories are a bit harder, as there isn’t a field that easily maps to the meta description. This is one of the reasons we’d recommend creating custom titles and meta descriptions for your commerce categories.

When creating custom titles, we like to review the following:

  • What our category is currently ranking for 
  • The potential of various keywords we’re not ranking particularly well for 
  • Associated search volume 
  • How the competition is positioning their category 

After reviewing all of these, we can craft a unique title and meta description that speaks to the category from both a user and keyword perspective — without discounting the keywords we’re already ranking on.
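For instance, the finished result in a category page’s <head> might look something like this (all copy here is hypothetical):

<title>Lightweight Rain Jackets | Patagonia, Arc'teryx and More | Example Store</title>
<meta name="description" content="Shop lightweight, packable rain jackets for hiking and travel. Free shipping on orders over $50.">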

There are many situations where we’re already ranking on some good keywords. For example, let’s say we’re in position 2 for “lightweight rain jacket”. If we see that “rain jacket” has way more search volume and then over-optimize around that, we may end up decreasing our position for “lightweight rain jacket”.

Robots.txt

Your robots.txt file controls how search engines and other crawlers access the content of your website. It’s one of the first places they’ll go to help them understand what content you want to be crawled and indexed.

Keep in mind that robots.txt can be pretty powerful and shouldn’t be played around with. You can easily stop your whole site from being indexed by writing the wrong command in this file.

The main thing you should have in your robots.txt file is a link to your sitemap(s) and directives that would stop a crawler from accessing gated content or parameterized URLs on your website that you don’t want to be ranked.
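A minimal sketch of what that might look like (the blocked paths here are hypothetical; yours will depend on your site's structure):

# Block gated content and parameterized URLs you don't want crawled
User-agent: *
Disallow: /checkout/
Disallow: /*?sort=

# Point crawlers at your sitemap(s)
Sitemap: https://www.example.com/sitemap.xml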

Tadpull Insight: It’s hard to predict what Google will crawl and index. We’ve seen many situations where Google ends up crawling and ranking a page even though, theoretically, it shouldn’t, based on our robots.txt directives.

For small sites with fewer than 1,000 products, it’s unlikely that you need a lot of directives in this file. Even if Google crawled every page and URL variation with different parameters (e.g. /category?color=blue,red), it would likely still get through all of your pages.

The robots.txt file is particularly helpful for sites with so many pages that Google would run out of crawl budget if it decided to crawl every facet and option available. 

Google will only dedicate so many resources to crawling your site, so we want to make sure we’re not wasting them on URLs that don’t add any value (e.g. parameterized URL variations).

When Might I Need to Use This?

NetSuite provides the option for facets to be displayed as parameters or subfolders. If all your facets show up as parameters, you shouldn’t have to modify your robots.txt file, but if you have certain facets as subfolders (such as example.com/jackets/brands/…) you may want to have some rules in your robots.txt file that stop Google from crawling all the potential variations.

For example, you may have 15 brands of jackets. Selecting a couple of brands (example.com/jackets/brands/patagonia,arcteryx,columbia) could create a lot of different URL variations.

We wouldn’t want Google wasting its time crawling all those. So, in that case, we would update our robots.txt with:

Disallow: */brands/*

There are configuration options that let you control how many of these variations you want to be indexed, but — from our experience — crawlers don’t always obey the rules. 

Robots.txt is a good way to “force” crawlers to avoid these URL variations.

(We’ll get into the configuration options a bit later to ensure you don’t end up with a large number of unnecessary URLs being ranked.)

Many eCommerce websites won’t need to have very much in this file. More often than not, indexing is controlled at the template level with meta robots directives. Meta robots are essentially mini robots.txt instructions on each of your pages that tell crawlers whether you want that page crawled and indexed.
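For example, a page you want kept out of the index would carry a tag like this in its <head>:

<!-- Tells crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">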

Sitemap.xml

A sitemap gives search engines a map of all of the pages on your site that are deemed worthwhile to crawl.

One of the first places a crawler will go is your robots.txt file and sitemap.xml file. This is why having these linked is beneficial.

NetSuite’s sitemap generator has all the options you need: it allows you to select which page types to include, in addition to manually adding landing pages you deem relevant.

Sitemap Advanced Configuration Options

You’ll have the option to include:

● Last Modified Date

● Change Frequency

● Priority  

But while you have all these options, the only one you really benefit from is “Last Modified Date”. According to Google’s John Mueller, Google hasn’t paid much attention to change frequency or priority since 2015.
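For reference, a single entry in the generated sitemap looks something like this (URL and date hypothetical):

<url>
  <loc>https://www.example.com/shoes-boots</loc>
  <lastmod>2021-03-15</lastmod>      <!-- the one field worth populating -->
  <changefreq>weekly</changefreq>    <!-- largely ignored by Google -->
  <priority>0.8</priority>           <!-- largely ignored by Google -->
</url>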

To learn more about setting up your sitemap, view NetSuite’s documentation on the subject.

Facets SEO Subtab

Facets, by default, will show up as parameters in the URL on newer versions of NetSuite. 

This is a good “failsafe” for those who aren’t comfortable with SEO. That said, there are specific advantages to having some of your facets show up as subfolders:

Facet As Parameter

example.com/jackets?brand=patagonia

Facet As Subfolder

example.com/jackets/brand/patagonia

In the vast majority of situations, you wouldn’t want a facet as a subfolder — you only want this if you want that URL indexed. A couple of questions to ask yourself are:

  • Does this facet typically have enough unique content to justify having the URL indexed?
  • Would the content of this faceted search provide better ranking opportunities for valuable keywords?

If you answered yes to both, then set up the facet as a subfolder. 

If you’re a retailer that sells multiple brands, we’d typically recommend having “Brand” set up as a subfolder. This is because there are many brand-specific searches, and having your URLs structured in a way that allows you to rank for these keywords can be beneficial.

Parameterized URLs are typically not set up to be indexed, because the default canonical behavior in NetSuite makes it so that only your main category gets indexed. 

A URL such as example.com/jackets?brand=patagonia would have a canonical tag pointing to example.com/jackets, telling search engines that we only want /jackets to rank.
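In other words, the two setups produce canonical tags like these (a sketch using the hypothetical URLs above):

<!-- On example.com/jackets?brand=patagonia (facet as parameter): -->
<link rel="canonical" href="https://example.com/jackets">

<!-- On example.com/jackets/brand/patagonia (facet as subfolder): -->
<link rel="canonical" href="https://example.com/jackets/brand/patagonia">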

By setting up “brand” as a subfolder, we’re changing the canonical behavior to allow example.com/jackets/brand/patagonia to be indexed. This is a good idea because many people search for branded products. 

For example:

● Patagonia jackets

● Black Diamond Headlamp

● Salomon Hiking Shoes

Having a unique, indexable URL like /jackets/brand/patagonia provides more ranking opportunities than funneling everything into one broad URL.  

Otherwise, your /jackets page would be competing for a lot of different branded searches on its own.

Different industries will have different use cases for having facets listed as a parameter or subfolder, so we’d recommend thinking through what pages you actually want to rank.

Final Word from Tadpull: Nailing down your URL structure before launch is incredibly important. If you end up changing this a few months in, it's a huge undertaking from a redirect standpoint and has the potential to significantly damage the rankings you’ve worked so hard to build.

Facet Naming Conventions

NetSuite gives you the option to control the URL component of each facet. 

Setup > SuiteCommerce Advanced > URL Components for Facets 

Make sure to give each of your facets’ URL components a user- and search-engine-friendly name. Instead of a raw field ID such as /custitem_size/, use /size/ as the URL component.

Creating user-friendly URLs allows for a more focused keyword strategy for each URL, provides greater opportunities for ranking, and simply looks better.

Schema Markup

Another important element to get right before launching — particularly if you plan on running Google Shopping campaigns — is schema markup.

Versions 20.1 and newer of SuiteCommerce and SuiteCommerce Advanced include the option to enable JSON-LD markup, the industry best practice for schema markup. 

This is enabled in SuiteCommerce Advanced > Configuration.

A full support doc from NetSuite walks you through the setup here.

The new schema markup works well on the newest version of NetSuite. But if you’re reading this guide and aren’t on the newest version, there’s hope! 

While your current version of NetSuite’s schema markup might not be working out of the box, Tadpull has developed custom solutions for various versions of NetSuite.

Pre-20.1 NetSuite Versions

If you’re on an earlier version of NetSuite and haven’t modified any of your markups, chances are your site is using NetSuite’s default microdata markup. 

While this does offer many of the “required” fields from a markup testing standpoint, we’ve found — many times — that there are some bugs with it.

The issues with earlier releases’ schema markup typically revolve around the “offer” property, and more specifically around matrix items with child variants. 

Many times this markup won’t play well with Google Shopping campaigns, as Google will often find that the price supplied in the microdata doesn’t match the feed data.
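For reference, here’s a minimal sketch of the kind of JSON-LD Product markup Google expects for a matrix item, using an AggregateOffer to cover the variants’ price range (all values hypothetical):

{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Lightweight Rain Jacket",
  "sku": "JKT-001",
  "brand": { "@type": "Brand", "name": "Patagonia" },
  "offers": {
    "@type": "AggregateOffer",
    "priceCurrency": "USD",
    "lowPrice": "129.00",
    "highPrice": "149.00",
    "offerCount": "6",
    "availability": "https://schema.org/InStock"
  }
}

Whatever the structure, the prices in your markup need to agree with the prices in your Google Shopping feed.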

Troubleshooting Your Schema Markup

There are a few ways to troubleshoot your page’s schema markup:

  1. Google’s Rich Results Test: The most important page types to test are your product pages. While category pages and your homepage will also have markup, make sure your product page markup is dialed in.

    Drop one of your product pages into the tool; if you have multiple page templates for product pages, do the same for all of your templates.

    At a bare minimum, meet all of the required field requirements listed by Google.
  2. Site Crawler: Prior to go-live, crawl your site with one of the many crawling tools across the web (such as ScreamingFrog, DeepCrawl, or SEMrush) to verify that all your various page templates are marked up appropriately. 

    Pay particular attention to the markup differences between matrix and non-matrix items and any custom page templates you may have.
  3. Google’s Search Console: If you’ve got Search Console set up on your site (set that up now if you haven’t already), navigate to Enhancements > Products.

    Here, you will find the number of products Google has markup for and any associated errors.
Tadpull Insight: This one is more helpful post-launch; you won’t be able to see your new site’s schema markup here.

Custom 404 Pages

With any site migration, regardless of how careful you are, a handful of users might visit one of your old pages that you forgot to redirect. 

In this scenario, having a custom 404 page that is helpful or fun is a great way to create a positive brand experience. 

Standard elements on a 404 page include:

● Letting the user know this page no longer exists

● Imagery or GIFs that are fun to interact with or make you laugh

● Links to other helpful resources that they might be looking for

In an ideal world, the links you supply would be closely related to the URL that a user was originally seeking, but many times this is a code-intensive endeavor and you might not have all the relevant information from your old platform available. 

If that’s the case, stick to the three ideas above to make sure you give context as to why a user isn’t seeing what they expected. Make them laugh and do your best to keep them on your site with relevant links. 

OptinMonster has a good post on 11 Brilliant 404 Pages that will provide inspiration for your own page.

Using a Crawler to Identify Issues with Your Site

Looking at your site from the standpoint of a crawler helps you better understand what the robots are seeing and identify technical issues you might not otherwise see. 

There are a plethora of tools available to do this, including ScreamingFrog, SEMrush, Deepcrawl, Ahrefs, etc. All of these tools work in a similar fashion by crawling every page on your site and hitting all of the links in your page’s HTML.

Many of these tools also generate automated reports highlighting key technical issues with parts of your website. 

Of the tools we’ve tested, DeepCrawl seems to be the best at this, with easy-to-visualize reports that stack-rank the issues that need to be fixed.

These tools identify issues with your schema markup, broken links on your site, problems with canonical tags, URL generation issues, and orphaned pages, just to name a few.

We won’t get into what all of these mean in this post, but it’s worthwhile to note that — if you’re unfamiliar with technical SEO and crawl issues in general — you may get your first report back and be incredibly worried. 

But don’t immediately panic. Not everything that is identified as an error is detrimental to your website’s health or ability to rank on a given keyword term.

Every website has its imperfections, and just because a crawler says you have 200 duplicate page titles and descriptions, it doesn’t mean they can’t rank.

Prioritize the errors that will have the largest impact on rankings. Broken links, 404ing pages, major canonical issues, and issues with URL generation will probably be where you’ll want to start cleaning things up.

Learn More

The information we just covered can be a lot to digest. Got questions? Reach out to us for a consultation here.

