Why Faceted Navigation Is Not Good For SEO


Most large websites, such as ecommerce shops and listings sites, use faceted navigation to filter or sort results by combinations of independent attributes. Faceted navigation gives users an easy way to narrow down to their desired results quickly. What most people don't know is that this type of navigation can undermine their SEO efforts: it can generate millions of URLs that are live, crawled, and indexed by Google when they shouldn't be.

How Faceted Navigation Undermines SEO

With faceted navigation, every possible combination of attributes (which may or may not be related to each other) becomes a unique URL. You are therefore likely to end up with duplicated content, and similar content on multiple web pages is bad for SEO. Additionally, all these extra URLs are crawled by Google, meaning the search engine spends valuable crawl budget on less important pages. Crawling all those near-identical pages can also send Google incorrect signals about which version of a page matters. Most importantly, having many URLs that you don't even want indexed dilutes link equity by passing it to pages that are less critical.
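
To get a feel for how quickly the numbers grow, here is a rough TypeScript sketch. The facets and values are made up for a hypothetical /shoes category; the point is only that a handful of optional filters multiplies into hundreds of crawlable URLs for a single category, and a full catalogue multiplies that into the millions mentioned above.

```typescript
// Illustrative facets for a hypothetical /shoes category page.
const facets: Record<string, string[]> = {
  color: ["black", "white", "red", "blue"],
  size: ["7", "8", "9", "10", "11"],
  brand: ["acme", "zenith", "orbit"],
  sort: ["price-asc", "price-desc", "newest"],
};

// Build every combination of facet values (including "facet not selected")
// and turn each combination into a query string, i.e. a crawlable URL.
function facetUrls(base: string, facets: Record<string, string[]>): string[] {
  let urls: string[] = [base];
  for (const [name, values] of Object.entries(facets)) {
    const next: string[] = [];
    for (const url of urls) {
      next.push(url); // this facet not applied
      for (const value of values) {
        const sep = url.includes("?") ? "&" : "?";
        next.push(`${url}${sep}${name}=${value}`);
      }
    }
    urls = next;
  }
  return urls;
}

const urls = facetUrls("/shoes", facets);
console.log(urls.length); // 480 distinct URLs from just four facets on one category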

What’s The Best Solution?

If you approach an SEO company in Toronto and insist on keeping faceted navigation to improve user experience, you may be presented with a few solutions that ensure it doesn't hinder your search rankings.

Using The Noindex Tag

First things first, you need to decide what you want search engines to index. A noindex tag tells search engine bots which pages to keep out of the index, so you can use it to remove many of the duplicate-content pages created by filtering. Remember, though, that noindexed pages are still crawled and still receive link equity, so this solution does not solve the underlying problems of wasted crawl budget and diluted link equity.
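
One common way to apply this is to emit a robots meta tag whenever the requested URL carries a filter parameter. The sketch below assumes faceted pages can be recognised by their query parameters; the parameter names and domain are placeholders, not a prescription.

```typescript
// Facet parameters that should never produce indexable pages.
// The parameter names here are illustrative; substitute your own.
const FACET_PARAMS = new Set(["color", "size", "brand", "sort", "price"]);

// Decide whether a requested URL is a faceted variant, and if so
// return the robots meta tag to emit in the page <head>.
function robotsMetaFor(requestUrl: string): string {
  const url = new URL(requestUrl, "https://www.example.com");
  const hasFacet = [...url.searchParams.keys()].some((p) => FACET_PARAMS.has(p));
  // Faceted variants get noindex; Google can still crawl them and
  // follow their links, but they stay out of the index.
  return hasFacet
    ? '<meta name="robots" content="noindex, follow">'
    : '<meta name="robots" content="index, follow">';
}

console.log(robotsMetaFor("/shoes?color=black&size=9"));
// <meta name="robots" content="noindex, follow">
```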

Using Canonical Tags

Canonical tags let you tell search engines which version of a group of similar pages should receive the credit. They are primarily used to deal with duplicate content issues. Just like noindex tags, however, they do not solve the wasted crawl budget, because Google still has to crawl each URL to see the tag. This solution is still better, since it consolidates link equity onto the page versions that are important to you.
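
In practice this usually means every filtered variant declares the clean category URL as its canonical. A minimal sketch, again assuming facets live in the query string and using placeholder parameter names and domain:

```typescript
// Build the canonical <link> for a faceted URL by stripping the facet
// parameters, so every filtered variant points back to the category page.
const FACET_PARAMS = ["color", "size", "brand", "sort", "price"];

function canonicalLinkFor(requestUrl: string): string {
  const url = new URL(requestUrl, "https://www.example.com");
  for (const param of FACET_PARAMS) {
    url.searchParams.delete(param);
  }
  // Rebuild the URL, dropping the "?" if every parameter was removed.
  const canonical =
    url.origin + url.pathname +
    (url.searchParams.toString() ? `?${url.searchParams}` : "");
  return `<link rel="canonical" href="${canonical}">`;
}

console.log(canonicalLinkFor("/shoes?color=black&size=9"));
// <link rel="canonical" href="https://www.example.com/shoes">
```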

Using robots.txt To Disallow Sections Of The Website

Another solution is to disallow certain sections of your website in robots.txt. Applying this change is very simple, and the rules can be tailored to match specific URL patterns or query parameters. The major drawback is that robots.txt only blocks crawling, not indexing: Google can still index a disallowed URL if other pages link to it, and it will never see a noindex or canonical tag on a page it is not allowed to crawl.
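
Google supports wildcard patterns in robots.txt, so faceted URLs can be blocked by query parameter. The sketch below simply assembles those Disallow rules; the parameter names and sitemap URL are placeholders for your own.

```typescript
// Generate robots.txt rules that block crawling of faceted URLs.
// The query parameters listed here are placeholders for your own facets.
const blockedParams = ["color", "size", "brand", "sort", "price"];

const robotsTxt = [
  "User-agent: *",
  // One wildcard rule per facet parameter keeps crawlers out of
  // filtered variants while leaving clean category URLs crawlable.
  ...blockedParams.map((param) => `Disallow: /*?*${param}=`),
  "",
  "Sitemap: https://www.example.com/sitemap.xml",
].join("\n");

console.log(robotsTxt);
// User-agent: *
// Disallow: /*?*color=
// Disallow: /*?*size=
// ...
```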

There are many other solutions that can address the issues that come with faceted navigation, but the best approach is to avoid them where you can. If you can build faceted navigation that doesn't change the URL (usually done with JavaScript), you avoid the SEO issues mentioned above; see the sketch below. However, you may still need some manual checks to ensure all your critical landing pages remain indexable.
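
As a rough illustration of that approach, the sketch below filters products entirely in the browser, so the address bar never changes and no new URLs are created for Google to crawl. The element selectors and data attributes are invented for the example.

```typescript
// Client-side faceted filtering that never changes the URL, so no
// extra crawlable pages are created. IDs, data attributes, and class
// names are illustrative.
function initFacetFilters(): void {
  const checkboxes = document.querySelectorAll<HTMLInputElement>(
    'input[type="checkbox"][data-facet]'
  );
  const products = document.querySelectorAll<HTMLElement>(".product-card");

  function applyFilters(): void {
    // Collect the currently selected values for each facet.
    const selected = new Map<string, Set<string>>();
    checkboxes.forEach((box) => {
      if (!box.checked) return;
      const facet = box.dataset.facet!;
      if (!selected.has(facet)) selected.set(facet, new Set());
      selected.get(facet)!.add(box.value);
    });

    // Show a product only if it matches every selected facet.
    products.forEach((card) => {
      const matches = [...selected.entries()].every(([facet, values]) =>
        values.has(card.dataset[facet] ?? "")
      );
      card.style.display = matches ? "" : "none";
    });
  }

  // Filtering happens in the DOM only; the address bar stays on the
  // category URL, so search engines never see filter URLs.
  checkboxes.forEach((box) => box.addEventListener("change", applyFilters));
}

initFacetFilters();
```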
