21 | Technical Bugbears with Google | GDPR, JS & HTML

Alina Ghost

Simon Cox has been in the SEO industry for a few years but has been building websites for 25 years. He has worked on different aspects of SEO since 1998, when keywords were a thing! For a long time, he worked with HSBC until he went freelance last year, after dragging the bank into the 21st century. He used to specialise in graphic design as well as page layout and therefore prefers technical SEO over other areas.

We go through a handful of our technical bugbears with Google, especially when it comes to geoblocking for GDPR purposes, Google being forgiving with shoddy HTML and more:


A quick introduction to GDPR, which came into force in May 2018 and covers how people’s emails and personal data are handled, alongside cookie and privacy policy legislation. From an SEO point of view there’s not much to worry about, but these two things do cross over.

4:45. We in the UK are treated differently by Google. For example, Simon wanted to look up how big the world’s biggest flag is and saw a featured snippet with a picture of a flag on a dam. But when he tried to visit the site to read more, he got a 403 error: he was blocked from seeing the content due to GDPR geo-targeting, so it came down to the location he was in. So why is a featured snippet showing if he’s unable to read more about it?

Simon suggests that it’s cheaper and faster for publishers not to serve people in a particular region than to look through all the GDPR rules, so they’re cutting whole countries out. Simon knows it was a geographical block due to GDPR even though the page didn’t say so, because other sites display a message explaining that you can’t access their content because of GDPR. The problem isn’t the legislation itself but that companies are not prepared to sift through it, understand it and act on it in other ways.
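The blanket block Simon describes amounts to something like the following, a minimal hypothetical sketch (the function name and country list are assumptions for illustration, not any real site’s code): rather than handling GDPR consent, the server simply returns a 403 for any visitor whose IP resolves to an EU country.

```python
# Hypothetical sketch of a blanket GDPR geo-block: instead of serving a
# consent flow, the server refuses EU visitors outright with a 403.
# The country list and function are illustrative assumptions only.

EU_COUNTRY_CODES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE", "GB",
}

def respond(country_code: str) -> int:
    """Return an HTTP status code for a visitor from the given country."""
    if country_code.upper() in EU_COUNTRY_CODES:
        return 403  # blocked outright, with no GDPR notice served
    return 200

# A UK visitor is blocked while a US visitor gets the page.
print(respond("GB"), respond("US"))
```

This is why Simon saw a bare 403 with no explanation: the cheapest implementation serves no message at all, which is exactly the poor user journey he complains about.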

This shows a clear quality issue that Google hasn’t addressed yet, although John Mueller did say that they are aware of it.

Simon Cox’s Tweet & Correspondence:

It’s a unique issue in the UK: as part of Europe, the EU legislation includes us. Maybe the rules will change for us after Brexit, but as that’s half a year away, it’s unlikely anything will change quickly enough.

Simon puts plainly that…

“The results are providing ‘rubbish’ and shouldn’t appear in the results.”

10 mins. I personally disagree with pop-ups and, as a reader or customer, find it frustrating to see multiple messages about newsletter sign-ups, cookies and policy terms. Would a bounce be seen as a negative metric, and should pop-ups therefore not be used in the future?

Simon says a pop-up is currently OK to use if it doesn’t cover too much of the screen and waits a few seconds before it shows up. Google is aware of them, and so Simon believes they might disappear in the near future. He has no doubt that will come but, in the meantime, people use them because they convert well and bring in traffic and conversions. His recommendation to clients is always that they shouldn’t use them unless they really need to.

14 mins. Google understands JavaScript (JS) better and is crawling it better now. Simon, as an HTML website developer, finds JavaScript hard to learn, and I agree. Certain sites might need JS for a good reason, so SEOs will need to understand how it’s indexed and crawled. Making JS crawlable is a technical SEO’s biggest challenge: a crawler will render it, but there are situations where it doesn’t render correctly, so as a tech SEO you’ll need to figure out how and why it’s doing what it’s doing.
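The crawlability problem is visible in the raw HTML itself. As a hedged sketch (both pages here are invented for illustration), a server-rendered page carries its content in the initial HTML response, while a client-rendered page ships an empty container that only JavaScript fills in, so a crawler that doesn’t execute JS sees nothing:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect the visible text a non-rendering crawler would see."""
    def __init__(self):
        super().__init__()
        self.chunks = []
    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

def visible_text(html: str) -> str:
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks)

# Server-rendered: the article text is in the HTML response itself.
ssr = ("<html><body><main><h1>World's biggest flag</h1>"
       "<p>It hangs on a dam.</p></main></body></html>")

# Client-rendered: an empty shell until the JS bundle runs.
csr = ('<html><body><div id="app"></div>'
       '<script src="/bundle.js"></script></body></html>')

print(visible_text(ssr))  # the content is available without executing any JS
print(visible_text(csr))  # an empty string: nothing to index without rendering
```

This is the gap a rendering crawler has to close, and the gap a non-rendering tool (or a rendering failure) falls into.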

16:22. I always recommend staying away from JS from an SEO perspective. Will Critchlow from Distilled posted research he did on JS showing that pages relying less on JS perform better for organic visibility.

18:00. There are good reasons to use JS, and not just for website design but for other channels, such as APIs. The future is augmented reality and voice search channels, so think about data management systems. Simon explains what he means with an example from HSBC, who produce a lot of content (more than all UK newspapers put together, daily) and are able to share it between divisions and countries using a database; it’s a way of governing information. Themeforest.com has plenty of JS sites, and people are pushing websites that have little crawlability, which frustrates me, as normal people wouldn’t know about the downside for SEO and visibility online. Simon worked on a music site and said it probably took longer to rebuild it from JavaScript to make it search friendly than it would have taken to build from scratch. People don’t think about SEO; they think about how a site looks rather than whether it’s accessible and crawlable.

23:55. Simon took part in Jason Barnard’s UX & HTML5 webinar. He touched on the basics, explaining that if you write HTML5 and structure it correctly, you have semantically correct content that crawlers can read. You can technically use multiple H1s (header 1), but in reality, for SEO, it’s best to have just one and then multiple H2s. He gives the example of a list of articles: each summary could arguably be an H1 for its article, but the counter-argument is that the list itself has a title, and that title should be the only H1. Simon, from both a web dev and an SEO point of view, knows the middle ground, and this is his recommendation.
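A quick way to sanity-check Simon’s recommendation on a page is to count its heading tags. This is a small sketch using Python’s standard-library parser; the sample markup is invented for illustration:

```python
from collections import Counter
from html.parser import HTMLParser

class HeadingCounter(HTMLParser):
    """Tally heading tags so pages with more than one H1 stand out."""
    def __init__(self):
        super().__init__()
        self.counts = Counter()
    def handle_starttag(self, tag, attrs):
        if tag in {"h1", "h2", "h3", "h4", "h5", "h6"}:
            self.counts[tag] += 1

page = """
<article>
  <h1>Technical SEO bugbears</h1>
  <h2>Geo-blocking and GDPR</h2>
  <h2>JavaScript and crawlability</h2>
</article>
"""

counter = HeadingCounter()
counter.feed(page)
print(counter.counts["h1"], counter.counts["h2"])  # 1 2: one H1, multiple H2s
```

A result of one H1 and several H2s matches the middle ground Simon recommends; more than one H1 flags a page worth a second look.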

26:35. Google is forgiving with invalid HTML. Tags that aren’t finished off correctly are still rendered correctly because the parser assumes that is what was meant, even though it was ‘forgotten’. Semantically correct content crawls better, but at the same time there’s no positive gain for it from Google in terms of rankings or visibility, which Simon believes there should be, as he produces quality content!
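Browser-style parsers illustrate this forgiveness. As a small sketch (the sloppy snippet is invented), Python’s standard-library parser happily reports every start tag from markup whose closing tags were all ‘forgotten’:

```python
from html.parser import HTMLParser

class TagLogger(HTMLParser):
    """Record start tags, even when the closing tags are missing."""
    def __init__(self):
        super().__init__()
        self.opened = []
    def handle_starttag(self, tag, attrs):
        self.opened.append(tag)

sloppy = "<ul><li>first<li>second<li>third"  # no </li> or </ul> anywhere

parser = TagLogger()
parser.feed(sloppy)
print(parser.opened)  # ['ul', 'li', 'li', 'li']: the structure is recovered
```

The list is still fully recoverable, which is exactly why shoddy HTML gets rendered, and why there’s no ranking incentive to close your tags properly.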

27:30. If you don’t write AMP perfectly, then it doesn’t validate, and invalid AMP simply won’t rank or appear in results; hence the validation tool. This suggests two different standards, which Simon questions. Simon also wants Google to take note of valid HTML and appreciate the sites that can be bothered to do it, because valid HTML also means valid accessibility. Assistive technology such as the JAWS screen reader, built for audio and visual impairment, should make us more conscious of this. Here are the W3C Accessibility Standards. In the UK there are over 12 million people registered as disabled, all of whom can buy your products. An accessible site should be more easily crawled, and the biggest ‘blind users’ are bots, although that was more true in the past, as bots now render pages rather than just crawling the markup.

31 mins. Do you think that, with so many websites out there, Google allows for mistakes? Simon says no.

Google renders sites using Chrome 41, so it can see a page as a browser does rather than relying on the markup alone. Companies can pay web developers to focus on something more exciting than accessibility, perhaps.

33:40. To conclude: Google has things to work on, especially around GDPR and improving the user journey for sites that geo-block. HTML is still leading the way, accessibility is important, and we’d like to see more kudos for getting coding right.

Final thoughts from Simon. He says it’s not difficult. One of the best resources for website development and coding for the web is the A List Apart blog. There is also Jeffrey Zeldman’s book Designing with Web Standards, which I myself will go and read!

Thanks for reading and listening.

Music credit: I Dunno (Grapes of Wrath Mix) by spinningmerkaba (c) copyright 2017 Licensed under a Creative Commons Attribution (3.0) license. http://dig.ccmixter.org/files/jlbrock44/56346 Ft: Jlang, 4nsic, grapes.
