Don’t lose ground when it comes to site speed
Fili says: “Optimize for site speed. It's not just about individual pages; you really want to optimize the entire website.
We are talking about things like Core Web Vitals, but also how you approach your codebase, like JavaScript.”
Is JavaScript something that’s significantly impacting site speed?
“It can. JavaScript is a great coding language, and it has gained popularity in the last few years. However, it isn't necessarily the most efficient language. A lot of people have JavaScript in the backend, not just the frontend. When they do, they need to hydrate or pre-render their pages and content in order to optimize the delivery of that content to the user. Not everyone is doing that, and that can definitely be optimized.
On the frontend, we also see a lot of JavaScript dependency. For example, JavaScript is used to render or inject critical technical SEO elements or content elements. Unless you are using it for a sophisticated app, like a game, the individual landing page should not be that dependent on JavaScript to deliver content.
A while ago, the UK government did a study to determine who has JavaScript disabled. It turns out that every user is a user without JavaScript until JavaScript is loaded, so every user is potentially affected. On top of that, network issues (including mobile connections), ad blockers, firewalls, and other factors can block the occasional request. It might not be every request for every user, but as you surf and browse the web, certain JavaScript files may not always load or execute. If your website is dependent on them, that can cause an issue.
On top of that, JavaScript also has a potential rendering cost, including on the Googlebot side. If you inject your technical SEO signals with JavaScript, and your content depends on JavaScript, Googlebot has to render the page in order to see them. Rendering a page is much more costly than fetching a page, which is what Googlebot does initially. Google is getting a lot better at this, but it’s not perfect and you shouldn’t assume that this will always be the case.
You can't make your content or your technical SEO signals JavaScript-dependent. You should always include the critical parts of your content and technical signals in the initial HTML, so they do not require JavaScript to render.”
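For illustration, a minimal sketch of what that looks like in practice: the critical technical SEO signals and the core content are present in the initial HTML response, and JavaScript only enhances the page. The URLs and copy below are placeholders, not a specific recommendation.

```html
<!-- Initial HTML response: title, meta description, canonical and the core
     content all exist before any JavaScript runs (placeholder values). -->
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Blue Widgets – Example Shop</title>
    <meta name="description" content="Hand-made blue widgets, shipped worldwide.">
    <link rel="canonical" href="https://www.example.com/blue-widgets/">
  </head>
  <body>
    <h1>Blue widgets</h1>
    <p>The core landing-page copy is served in the HTML itself.</p>
    <!-- JavaScript enhances the page; it does not create the content. -->
    <script src="/assets/enhancements.js" defer></script>
  </body>
</html>
```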
Is this an issue for SEOs working on the edge, who use JavaScript to make changes without relying on an IT team to update the CMS?
“Yes, it is. Working on the edge is a great way to test things, but you have to keep in mind that it is not the final solution. It is not the final way of presenting the content down the line. The edge is a good way to build a case and experiment to see what works and what doesn't, without having to involve the IT team. However, once you have the answer to that, then involve your IT team and try to make it part of the actual codebase, because you don't want that dependency to be long-term.
Also, you build a dependency on whichever cloud provider you use on the edge. If you make any changes there – or if they have any issues or make any changes – that might negatively impact your ability to serve the right signals. That can happen overnight without you noticing, which can have a huge impact – especially in larger enterprises.”
What other elements of site speed would you highlight?
“A lot of SEOs optimize for site speed on the frontend – fewer kilobytes in CSS files or JavaScript files, etc. There are still things that you can improve further. For example, how do you load your JavaScript? When do you load your JavaScript? When do you load your CSS? Do you have your above-the-fold critical CSS preloaded and non-critical CSS loaded afterwards?
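For illustration of that last point, one widely used pattern (a sketch, not the only approach; file names are placeholders) is to inline the critical above-the-fold CSS and load the rest without blocking rendering:

```html
<head>
  <!-- Critical above-the-fold styles inlined so first paint is not blocked. -->
  <style>
    header, .hero { /* minimal above-the-fold rules */ }
  </style>

  <!-- Non-critical CSS loaded without blocking rendering
       (a common preload-then-apply pattern; placeholder file name). -->
  <link rel="preload" href="/css/non-critical.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/non-critical.css"></noscript>
</head>
```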
Which fonts do you use? How many fonts do you use? To be honest, system fonts are fine nowadays. Why load an extra font if a system font will do just as well? System fonts are often designed for readability, so your content will be well-presented and very readable.
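A typical system font stack looks something like this (one common variant, shown only as an example); it requires no extra font files at all:

```css
/* A common system font stack: nothing extra to download. */
body {
  font-family: system-ui, -apple-system, "Segoe UI", Roboto,
               "Helvetica Neue", Arial, sans-serif;
}
```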
Designers want to load more custom fonts to give a unique impression of the website, the logo, etc. However, in the end, I would prefer to load that as an SVG file – and a cacheable one, meaning with a separate file path, rather than injected into the HTML or recreated with another font. If you can use an SVG file to give the same impression that the font would, that's going to be smaller and more editable as a vector, which is much more useful in the long term.
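In practice, that could be as simple as referencing the logo as its own cacheable SVG file (the path and dimensions below are placeholders):

```html
<!-- Logo served as a separate, cacheable SVG file instead of a web font
     or inline markup (placeholder path and sizes). -->
<img src="/assets/logo.svg" alt="Example Shop" width="160" height="40">
```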
You should also look at the backend. It's incredible how many people don't actually look at the crawl stats within Google Search Console and how fast their average response time to Googlebot is. A lot of websites have an average response time to Googlebot of above 200 milliseconds. Ideally (especially for larger sites), you want this under 100 milliseconds, or even under 50. The lower it is, the faster Google can crawl your site.”
How can an SEO forecast the impact that significant increases in traffic volume may have on site performance?
“It's hard to do, but you can forecast using previous data. See how your site was performing previously, measure how site speed improvements have affected conversion rates over time, and then apply that to a potential forecast of what you could expect in the future.
It’s important to use a source like the Chrome User Experience Report to get real user metrics. Of course, these metrics are collected from a Chrome perspective, so this data may not be as relevant for you if you have a lot of Safari or Apple users. That being said, they are real user metrics, and you can assess how your website has been doing over time and how much things have improved.
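For illustration, the same Chrome UX Report field data can be pulled programmatically via the CrUX API. This is a rough sketch: it assumes you have your own API key, the origin is a placeholder, and the response handling is deliberately simplified.

```js
// Rough sketch: query the Chrome UX Report (CrUX) API for an origin's field data.
// Requires your own API key; origin and form factor are placeholders.
const API_KEY = 'YOUR_API_KEY';

async function queryCrux(origin) {
  const res = await fetch(
    `https://chromeuxreport.googleapis.com/v1/records:queryRecord?key=${API_KEY}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ origin, formFactor: 'PHONE' }),
    }
  );
  const data = await res.json();
  // data.record.metrics holds histograms and percentiles for LCP, CLS, INP, TTFB, etc.
  return data.record;
}

queryCrux('https://www.example.com').then((record) => console.log(record.metrics));
```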
There's always room for improvement. This report is based on Core Web Vitals, and we're getting a new major metric to optimize for in early 2024: Interaction to Next Paint (INP). Right now, about 77% of websites have a good score in that regard, which means that a lot do not. Currently, around 50% of websites do not have a good score for Cumulative Layout Shift (CLS), Largest Contentful Paint (LCP), and INP combined. There is still a lot of possible optimization left to do.”
For Core Web Vitals, how do you know when a score is good enough and how much effort you should put in?
“It's good to compare with your competitors. If they’re doing better on one score, and they're ranking a lot better, then improving one of the factors where they outperform you may help you over time. It's important to keep in mind, though, that a lot of the data needs to be confirmed by Google and Googlebot. They have to recrawl your website and see that it’s faster – so you need to improve the backend as well.
One of the key things that a lot of people forget is that LCP depends on how fast you load some of those resources. Also, how big do you make your packages? How complicated do you make your packages? How do you code your packages? This is where CLS and INP come in. That can all be improved.
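As one illustration of speeding up the LCP resource (assuming the LCP element is a hero image; the path is a placeholder), the browser can be asked to fetch it early and with high priority:

```html
<!-- If the LCP element is a hero image, hint the browser to fetch it early
     and at high priority (placeholder path). -->
<link rel="preload" as="image" href="/images/hero.webp" fetchpriority="high">
```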
Measuring the success that comes out of that, and deciding how much you should invest, depends on your current scores. If your scores are okay, then your primary focus may not be on site speed in Core Web Vitals, but rather on what else you can improve. That could be the backend, like your response time and Time to First Byte (TTFB). I would look at that straight away and see what your average TTFB is.
That’s another web vital that is tracked. It's not a Core Web Vital, but a lot of the other signals depend on it. If your TTFB is slow, everything else is going to be slow too.”
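As a quick field check (not a full monitoring setup), TTFB for the current page load can be read from the Navigation Timing API in the browser console:

```js
// Approximate Time to First Byte for the current page load.
const [nav] = performance.getEntriesByType('navigation');
if (nav) {
  // responseStart is measured from the start of the navigation,
  // so it approximates TTFB in milliseconds.
  console.log('TTFB (ms):', nav.responseStart);
}
```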
Does JavaScript impact First Input Delay (FID)?
“First Input Delay is going to be deprecated as a Core Web Vital. It will still be around, but 95% of websites are fine with that particular signal. It's not as useful anymore, which is why they have started looking at INP.
FID comes from different sources, and JavaScript can have an impact there. It depends on how the page renders, when it's available, when the forms are ready, etc.”
Is CLS still going to be key in 2024?
“Absolutely. If you want to test how your website performs on that front (especially if you have some bad scores and you want to identify what the issue is), you can check out my CLS Debugger.”
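Independently of that tool, individual layout shifts can also be logged straight from the browser console with the Layout Instability API. This is a rough sketch (a simple running total of shift values, not the exact session-window calculation CLS uses, and not a replica of Fili's debugger):

```js
// Rough sketch: log each layout shift and a running total of shift values.
// Shifts caused by recent user input are excluded, as in CLS.
let total = 0;
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (!entry.hadRecentInput) {
      total += entry.value;
      console.log('Layout shift:', entry.value.toFixed(4), 'running total:', total.toFixed(4));
    }
  }
}).observe({ type: 'layout-shift', buffered: true });
```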
If an SEO is struggling for time, what should they stop doing right now to spend more time doing what you suggest in 2024?
“If you’re buying pay-to-rank links, stop immediately. Use that budget to improve the site speed of your website because those links are not going to benefit you anyway. You risk a penalty, and it's wasted money. If you pay for links for pay-to-rank purposes, you're basically tossing money away. Renting links is even worse, but it does happen, especially in competitive industries.
There is a distinction between what type of links you buy. If you buy links for converting traffic and you add ‘nofollow’, etc., then it's perfectly fine. That's the way you actually benefit from those links.
Remember, the web is built on links. Tim Berners-Lee invented the hyperlink, which is what created the web. Clicking from one document to the next is what enables the World Wide Web to exist. Before that, we had hypermedia (images, documents, etc.) and we could search on a computer. However, we did not have a way to navigate from one document to the next, from within the document itself. That was the invention of the World Wide Web.
Links are the cornerstone of what we consider to be the World Wide Web or the Internet. Without them, there is no web. That's why Google initially put so much emphasis on links in its PageRank algorithm. It's also why we have menus on our websites to allow people to navigate.
Linking and link building are not a problem. The problem is doing it solely to manipulate rankings, which is not desirable in the eyes of the search engines. If you do it for converting traffic, you're not wasting your money because you're getting traffic that converts. You're generating business.
However, if you’re buying pay-to-rank links, please stop. Invest that money elsewhere, like site speed.”
Fili Wiese is an SEO Expert, and you can find him over at SEO.Services.