29 | Filters and Facets on Ecommerce Sites with Sean Butcher


Filters and facets are an issue for all ecommerce sites. Think I sound cocky? Well, keep listening, because you may have an issue and not even know about it…

Sean Butcher is Head of SEO at Blue Array, a consultancy and agency, or "Consulgency" as they've patented it. They are planning to double the team this year, so if you're interested and want an introduction, let me know 😉

Blue Array work with many clients, from scrappy start-ups to large ecommerce brands. Clients include Simply Business, Netmums, Photobox and many more.

Sean has been in SEO for 10 years. As Head of SEO he needs to keep his knowledge broad, he said, but he does specialise in the more technical side of the search industry. He has given a number of Brighton SEO talks on canonical tags and website indexing. We actually met at the speakers' party!

Why is Filtering a Problem on Ecommerce Sites?
There are a number of complexities with facets and filters, which we'll go through in this episode. You can't get away from them if you have a broad range of products, and when that range is replicated onto a website it becomes difficult for Google to understand the differences between all the resulting pages. Due to the nature of a filter, the pages are often duplicates of each other, or very similar apart from a word or two. These are known as thin pages (pages lacking content).

Overall, filters are fantastic for users, Sean says. We don't want to be negative about using them, as they provide real usability benefits. The main advantage is letting users drill down into the product range, which, in turn, means better conversion and more sales.

A big issue is how they should be handled from an SEO perspective. You're potentially creating tens of thousands of extra URLs on your site, which has implications for content uniqueness, content value and crawl budget. Ecommerce businesses fall into the trap of adding filters to the website without considering the implications for SEO.

Other issues we've seen include filter pages being created and made available even though they offer limited value from a search perspective (a lack of search volume, i.e. nobody is searching for them), or a technical setup that doesn't help those pages get crawled or indexed.

Too many filters often mean not enough products to show on the page, which should also be a consideration. Yes, they may help users drill down to find what they're looking for, but you'll end up with a thin content page, which Google doesn't like.

We've also seen a robots.txt file that disallowed any URL with a filter in it. That may be a good way to avoid wasting crawl budget and to remove duplicate content issues, but at the same time you may be missing an opportunity.
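As a purely illustrative sketch (the parameter names here are invented, not taken from the episode or any client site), this is the kind of blanket rule we mean. It keeps crawlers away from every filtered URL, whether or not the page has any search demand:

```
# Hypothetical robots.txt rules blocking every URL that contains a filter parameter
User-agent: *
Disallow: /*colour=
Disallow: /*size=
Disallow: /*material=
```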

How to Make Filters & Facets Work for You if You Have an Ecommerce Site
If you're indexing filter pages, do it like I do: hand-pick them. Add text to the top of the pages, even if it's just a sentence, to contextualise the page. Only index the valuable filter pages, which will require keyword research, and noindex any pages without enough products. We have seen great results; pages that were nowhere to be seen jumped to page 2 in the space of a few days.
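For the filter pages that aren't worth indexing, the usual mechanism is a robots meta tag in the page head. A generic, hedged example (not code from the episode):

```
<!-- On a low-value filter page: keep it out of the index but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```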

Even little changes to the content to make it unique help. Choose the right pages to index, tailor them to the user and match the pages to what they want to see. Check the SERPs if you're unsure.

There are tools available that automate something similar, like Foundit (previously OneHydra, from the agency Greenlight). The system looks at search volumes and recommends pages you could be creating to match that intent.

The main (bad) things sites do with filters & facets:
1: Sites blocking too much from being crawled and indexed, so important content is missed. They might noindex a filter page or canonicalise it to the main category page (because of duplicate content), or add a nofollow to the links pointing to the filters. That's why it's important to do keyword research to show whether a filter page may be important: if it's not in Google's index it can't rank, which is a missed opportunity. An extreme example was a client who disallowed every filter page (without knowing it was happening) and was not ranking for a lot of pages they could have been.

2: Sites leaving everything open and making it a free-for-all. In this case you'll suffer from crawl budget issues. Not managing filters effectively and leaving far too much open can also cause thin content, where pages with multiple filters selected are effectively the same. Google's Panda algorithm distrusts that content, which can (and does) affect the whole site. Duplicate content becomes an issue when there's no differentiation between pages: filter by colour and the page shows the same or very similar products, so it looks identical, which is why adding text to contextualise each page helps.

Example: A clothing website is selling men's jumpers. A user might want to see blue woollen men's jumpers, so surely they want a page that reflects those products. Update the H1, meta title and meta description, and add unique content to the page too. Automating that content is a great idea, so that every page gets its own unique selling point. A rough sketch of what such a page's markup might look like follows.
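Assuming a made-up URL and copy purely for illustration, the tailored page might look something like this:

```
<!-- Hypothetical indexable filter page at /mens-jumpers/blue/wool/ -->
<head>
  <title>Blue Wool Men's Jumpers | Example Store</title>
  <meta name="description" content="Shop blue wool men's jumpers, from chunky knits to fine merino.">
  <link rel="canonical" href="https://www.example.com/mens-jumpers/blue/wool/">
</head>
<body>
  <h1>Blue Wool Men's Jumpers</h1>
  <!-- Even one or two unique lines like this help differentiate the page -->
  <p>Warm and easy to layer, our blue wool jumpers for men come in crew, roll and V-neck styles.</p>
</body>
```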

3: Another problem Sean sees is where the same filters are selected, but selecting them in a different order creates new URLs when it shouldn't. Colour then size, or vice versa, ends up as two completely different URLs. That's fine if there's a master version or the duplicates are noindexed, but frequently that's not the case and it causes duplicate content issues. It's better to force the URL structure so that, say, size or material always comes first. Have a pattern or schema so that no matter what journey is taken, it always resolves to the same URL, which is ideal for crawl budget. Maybe also think about showing your most popular filters first.
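As a minimal sketch of that idea in Python (the parameter names and their ordering are assumptions, not how any particular platform does it), the fix is simply to enforce one fixed ordering of filter parameters so every selection journey resolves to the same URL:

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

# Assumed, site-specific ordering: material first, then size, then colour
FILTER_ORDER = ["material", "size", "colour"]

def canonical_filter_url(url: str) -> str:
    """Rewrite a faceted URL so its filter parameters always appear in one fixed order."""
    parts = urlparse(url)
    params = dict(parse_qsl(parts.query))
    ordered = [(key, params[key]) for key in FILTER_ORDER if key in params]
    # Any non-filter parameters go after the filters, in alphabetical order
    ordered += sorted((k, v) for k, v in params.items() if k not in FILTER_ORDER)
    return urlunparse(parts._replace(query=urlencode(ordered)))

# Both selection journeys collapse to the same URL
print(canonical_filter_url("/mens-jumpers/?colour=blue&size=m"))
print(canonical_filter_url("/mens-jumpers/?size=m&colour=blue"))
# Both print: /mens-jumpers/?size=m&colour=blue
```

A self-referencing canonical tag on that master URL then mops up any stray parameter orders that slip through.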

Why are Filters and Facets Important?
Long tail keywords are great for conversion: if someone knows exactly what they're looking for, they're more likely to buy. It's worth putting in the effort to get these things done, to increase not just traffic but quality traffic that brings more sales. Sean's clients may not rank for generic terms, so a great way to get more traffic is to compete for and show up in longer tail searches; you're more likely to show up for something niche.

Open up a couple of filter pages and get results that way to prove your case. Look at the traffic coming in as a result of the changes you've made, clearly prove it's working, then open it up further.

Have a list of pages that will work for the brand or client, and if there needs to be a business case, it might be worth running a test beforehand to prove its worth.

Essentially, it's done on a case-by-case basis. Do your keyword research to gather enough data, so the decision is based on numbers.

Options to use: noindex, robots.txt disallow, or canonicals.
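The first two are sketched above; for completeness, the canonical option looks something like this (a generic example with an invented URL):

```
<!-- On a duplicate filtered view, pointing search engines at the preferred master version -->
<link rel="canonical" href="https://www.example.com/mens-jumpers/?size=m&colour=blue">
```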

Having extra text definitely still works, such as banner text; we saw a quicker reaction from Google in rankings and visibility. Adding contextualisation informed by your keyword research is worth the time and effort, for sure. Even one or two lines offers enough unique value and differentiation.

Other episodes you may be interested in:

Crawling & Indexing

Guaranteeing SEO Results

Google Bugbears

Music credit: I Dunno (Grapes of Wrath Mix) by spinningmerkaba (c) copyright 2017 Licensed under a Creative Commons Attribution (3.0) license. http://dig.ccmixter.org/files/jlbrock44/56346 Ft: Jlang, 4nsic, grapes.

Links:
https://www.bluearray.co.uk/

