SEO podcast

31 | Technical SEO in 2019 with Fili Wiese

Fili Wiese is an SEO expert at Search Brothers. He used to work at Google, specifically in Search Quality and now helps clients with visibility. He specialises in technical SEO and I’ve seen him do a site speed presentation (at SearchMetrics) that blew my mind.

Fundamental Technical SEO Areas

It’s important to start with the accessibility and crawlability of your website. You need to know which pages are indexable and which are not. You should also have your log files (more on this below). Know where Google, Bing and other bots spend their time. Are those pages good landing pages?

It’s similar to PPC thinking, where the aim is drawing in the user and converting them. SEO is similar, but you need to go further: your pages need a unique selling proposition (USP). If they don’t have one, why should these pages rank in the first place? It’s fundamental for visibility and even for your business model; understanding what sets you apart from the competition.

Example: a previous episode covers Filters & Facets, and we speak about how each page may seem similar to search engines, so it’s super important that each one has a USP.

A USP isn’t just for Google but for the user. You should mostly care about the user: if they’re happy, Google is happy.

Ecommerce example: blue boxes in different shapes. Do you need a landing page for every shape? It depends on the users. If they’re searching for those shapes then yes; otherwise you don’t need separate pages.

Log Files

It’s very important to collect log files over a long period of time; the bare minimum you should have is 12 months. Fili has had clients who deleted log files because they were taking up too much space. Make sure you keep access to the log files. If your development team deletes them after two weeks to save space, keep them yourself by downloading them and storing them somewhere else. There are plenty of free cloud storage options.

So, why? With 12 months of data or more, you can see over time how different search bots access your site and start piecing things together. For example, you can tie a dip in traffic or visibility to a lot of 500 error pages, or identify 404 pages, redirects that aren’t working properly and so on. Personally, I used log files for an international migration last year to make sure it was tidy and to check whether crawlers understood that we had moved from subdomains to subfolders.
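
To make that concrete, here is roughly what a single request looks like in an access log (a made-up line in the common "combined" format); the status code, the URL and the user agent are the fields you end up piecing together:

```
66.249.66.1 - - [12/Mar/2019:06:25:24 +0000] "GET /boxes/blue/ HTTP/1.1" 500 1832 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
```

Counting lines like this by status code and user agent across 12 months or more is essentially what the log analysis tools below automate.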

I’ve been looking at tools, and Fili uses as many tools as he can. He says they’re all good because they’re all slightly different. Screaming Frog is affordable, then there are DeepCrawl, OnCrawl and Botify. It isn’t just how they show the data but how they get the data, for example whether the processing is done before or after the crawl.

Site Speed

Fili says that fonts and images are a good place to start.

FONTS. Only load the fonts that you need and set them up as optional, which means that if the connection is too slow, the font is not shown and the fallback is used instead. The user gets the content as soon as possible, even if the web font never loads. Web fonts tend to be large, and on a 2G connection that’s a problem; around 75% of current users tend to be on 2G or 3G. And if you’re at a bus stop, for example, or in the middle of a supermarket (depending on where you are), you get a worse experience even if you have a 4G subscription.
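
For illustration, "optional" maps to the font-display property in CSS. A minimal sketch, assuming a single self-hosted web font (the font name and file path are made up):

```css
/* Hypothetical self-hosted web font. */
@font-face {
  font-family: "BrandSans";
  src: url("/fonts/brandsans.woff2") format("woff2");
  /* optional: if the font isn't available almost immediately,
     the browser keeps the fallback for the rest of the page view. */
  font-display: optional;
}

body {
  /* The system fallback renders straight away if BrandSans never loads. */
  font-family: "BrandSans", Arial, sans-serif;
}
```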

IMAGES. You want to serve newer web formats, like WebP, as this lowers the number of kilobytes you have to load. In HTML, use the picture tag with one or more source elements for the different formats, and an img tag inside as the fallback; the browser only falls back to the img if it can’t use the WebP source, so you can really optimise for that. Here’s Fili’s presentation to help you out on site speed further, and you may be able to find even more info via Fili.com.
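
A minimal sketch of that pattern (the file names are hypothetical):

```html
<picture>
  <!-- Only downloaded by browsers that support WebP. -->
  <source srcset="/img/blue-box.webp" type="image/webp">
  <!-- Fallback for everything else; this element also carries the alt text. -->
  <img src="/img/blue-box.jpg" alt="Blue box" width="600" height="400">
</picture>
```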

Make sure that JavaScript is minimal and avoid frameworks. Unlike images and fonts, it’s more about processing power than kilobytes, and processing power on mobile is very limited: mobile CPUs are like early-2000s computers and can’t handle processing JS that well. JS is meant as an add-on to a website, to upgrade the user’s journey, but we should be able to do everything without it. Most of it exists for conversion’s sake, like signing up to a newsletter or booking a hotel, but you need to be able to do this without JS. Also, a lot of people block JS, because of security policies on their networks, for data-saving reasons, because of AMP, and because of ad blockers (15-20% of users on average).
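
As a sketch of "it still has to work without JS" (the endpoint and IDs here are hypothetical), a newsletter signup can be a plain HTML form that posts to the server, with JavaScript only layered on top as an enhancement:

```html
<!-- Works with JS disabled or blocked: a normal POST to the server. -->
<form action="/newsletter/subscribe" method="post" id="newsletter-form">
  <label for="email">Email address</label>
  <input type="email" id="email" name="email" required>
  <button type="submit">Subscribe</button>
</form>

<script>
  // Optional enhancement: submit in the background when JS is available.
  // If this script is blocked or never runs, the form above still works.
  document.getElementById("newsletter-form").addEventListener("submit", function (event) {
    event.preventDefault();
    fetch(this.action, { method: "POST", body: new FormData(this) });
  });
</script>
```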

Backend site speed optimisation. Don’t forget this. If your website takes too long to process and respond to requests, it takes a long time to load. Use global variables. In WordPress, for example, the theme queries for things like the title of the blog on every single part of the page, so why not hard-code that into the theme and speed up the process? Fili recommends you continuously measure site speed and use tools so you can test response times, database connections, code execution and so on.

Compress code via plugins, and get rid of useless plugins too; it will really reduce the amount of unnecessary code running in the backend. Examples: plugins that calculate redirects or insert internal links based on keywords use processing power you can possibly skip. Hard-code the result so the work is only done once, via the edit/publish process, and integrated that way rather than on every user visit.

JavaScript (JS)

We mention JS in relation to site speed, but we delve deeper here. Google understands JS better than it did a couple of years ago, but it relies on a separate rendering step that uses slightly different tools. Still, it’s better to use a minimal amount, and things need to work without JS, as mentioned above.

Bloggers tend to buy out-of-the-box templates that are purely JS. Fili says the key thing is progressive enhancement: what does the user need in order to access the primary content, and what is shown in the most basic browser? If you optimise for people with accessibility issues, most of the time you’re also optimising for Google. Whatever we need in order to generate and publish the content, does the user also need it? If a template only uses JS for editing content, that’s fine, but if you’re trying to reach a larger audience, make sure the published content itself is accessible. What we publish is HTML; it doesn’t matter what tool you use to edit it, as long as the frontend is independent of the backend. More on progressive enhancement.

Favourite Google Suggestion that was Actioned on

Fili was a senior support engineer at Google who developed a lot of internal tools for Search Quality, and he found it a lot of fun. One of the things he really enjoyed was the environment for feedback: every piece of code is shared within the business, you get feedback, and you’re expected to rewrite the code until everyone is happy with it. It was a great place for ideas too. The idea of Fili’s that was implemented (unrelated to his then-current team) was for Gmail. He suggested approaching Gmail like Google Reader and offering the option to hide a label when it’s empty, so you can limit the view to the labels that have unread messages in them. Google has a great feedback loop, internally and online, and in a scalable way; the same feedback culture behind the code reviews Fili describes also shows up in the search results.

Intelligent Architecture Technical SEO Tips

The best way to help Google understand the hierarchy of your website is to use breadcrumbs. You have to define where a piece of content belongs and create a structure. Breadcrumbs create great internal links, whereas Google doesn’t use URL structure to determine hierarchy. URLs matter for the user, for things like memorability. Google can deal with a flat URL structure, even with everything on the root level or with numeric URLs, and still understand the hierarchy thanks to breadcrumbs.
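
As a sketch (the categories and URLs are made up), visible breadcrumbs are simply a trail of internal links that mirrors the hierarchy you’ve defined:

```html
<nav aria-label="Breadcrumb">
  <ol>
    <li><a href="/">Home</a></li>
    <li><a href="/boxes/">Boxes</a></li>
    <li><a href="/boxes/blue/">Blue boxes</a></li>
    <li>Square blue boxes</li><!-- current page, no link -->
  </ol>
</nav>
```

Many sites also add schema.org BreadcrumbList markup on top, but the internal links themselves are what carry the hierarchy signal discussed here.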

User experience teams may not allow breadcrumbs at the top of the page, so recommend placing them at the bottom instead, which offers the same SEO benefits. It’s better to have breadcrumbs: they help users who don’t know what they want and scroll to the very bottom, as well as Googlebot, because of the internal linking structure.

SEO Navigation

Use navigation to pass authority to categories. Make it adaptive: if you have tens of thousands of products, you want a navigation that changes two or three levels deep to show more links to the subcategories of the page you’re on. Don’t show the third level on the first or second level! Create links between categories too when relevant; think ‘people also like similar products’, as this improves conversion and can potentially increase average order values.
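
A rough sketch of that adaptive idea, using a hypothetical category tree: on a second-level category page the navigation opens up that branch’s subcategories and adds a few related-category links, rather than exposing the whole third level everywhere:

```html
<!-- Navigation as it might appear on /boxes/blue/ (a second-level page). -->
<nav aria-label="Categories">
  <ul>
    <li><a href="/boxes/">Boxes</a>
      <ul>
        <li><a href="/boxes/blue/">Blue boxes</a>
          <ul>
            <!-- Third level shown only because we are inside this branch. -->
            <li><a href="/boxes/blue/square/">Square blue boxes</a></li>
            <li><a href="/boxes/blue/round/">Round blue boxes</a></li>
          </ul>
        </li>
        <li><a href="/boxes/red/">Red boxes</a></li>
      </ul>
    </li>
  </ul>
</nav>

<!-- Cross-links between related categories: the "people also like" idea. -->
<aside aria-label="Related categories">
  <a href="/storage-crates/blue/">Blue storage crates</a>
  <a href="/gift-wrap/blue/">Blue gift wrap</a>
</aside>
```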

Suggested article:

How to move content to a new location

Relevant episodes:

Filters & Facets with Sean Butcher

Crawling & Indexing with Charlie Williams

Music credit: I Dunno (Grapes of Wrath Mix) by spinningmerkaba (c) copyright 2017 Licensed under a Creative Commons Attribution (3.0) license. http://dig.ccmixter.org/files/jlbrock44/56346 Ft: Jlang, 4nsic, grapes.
