Improve your SEO audits and make them more meaningful
Nikki says: “Create a meaningful SEO audit.
Anyone can create a website, and there are even automated tools that will audit those websites for you (which I wouldn’t recommend). The real differentiator is the outcome of that audit. It's all well and good having a list of issues for a website, which is usually what you get from these tools, but the website owner, the client, the stakeholders, and everybody else involved have to know what needs to be done about that list.
I’ve been refining the process of working with relevant people at all levels of the business and developing a methodology that ensures that everybody involved can get what they need. This is something that any SEO professional can do, and there's no set way that it needs to look.
Essentially, instead of a list of issues that just says ‘X number of canonicals are missing’, you include all of the information that is needed to fix each one. I tend to lay my audits out on a spreadsheet, and I don't go into a huge amount of detail in this section. For each issue, I include what it is, why it matters, how to fix it, what type of issue it is, where it can be found, all the specific URLs, and how to replicate it.
I talk about how it can be fixed, who's responsible for the fix, and who's responsible for ensuring that it's done correctly. There's an important distinction between those two because they can be different people or different departments. The last few elements that I include are the level of priority, any repercussions of not doing it, the potential value of doing it, and when it needs to be done by. You're basically going through the what, who, when, why, and how of each one.
These are all laid out in separate columns on a spreadsheet, with varying amounts of detail for each one. This may seem really straightforward, but you'd be surprised by how many audits don't include any of this information. They'll just state the issue that was found.
Lay this out on a spreadsheet, on a separate tab from the main audit, so that everything is in one place and easily accessible to everyone who needs it. If more clarification is needed, you can add it when you come to work on each issue. For that, I use a developer briefing document that I've developed. That's where you can include more detail about each issue, the technical specs, and links to any additional resources.
It’s not about changing what you audit, but how you interpret that audit. Add actions and interpretations that are relevant to your stakeholders. Most SEOs have some sort of checklist that they use for their audit. I'm talking about taking that and presenting it to the client in a way that they can actually use. It doesn't matter if you have a checklist of 20 issues or 200 issues – it’s what the client can take from that.”
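To make that structure concrete, the same issue log can be prototyped in code. The Python sketch below is illustrative only: the field names are paraphrases of the columns Nikki describes, not her actual template, and each instance corresponds to one spreadsheet row.

```python
from dataclasses import dataclass


@dataclass
class AuditIssue:
    """One row of the issue log. Field names paraphrase the columns described
    above; they are illustrative, not Nikki's actual template."""
    issue: str                # what the issue is, e.g. "Missing canonical tags"
    why_it_matters: str       # the impact on the site and the business
    how_to_fix: str           # the recommended fix
    issue_type: str           # "technical" | "on-page" | "off-page"
    affected_urls: list[str]  # every specific URL, not just a count
    how_to_replicate: str     # steps to reproduce the issue
    fix_owner: str            # who is responsible for making the fix
    verify_owner: str         # who checks it was done correctly (can differ)
    priority: str             # "Red" | "Amber" | "Green"
    risk_if_ignored: str      # repercussions of not doing it
    potential_value: str      # what fixing it is worth
    due_by: str               # when it needs to be done by
```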
In the tabs of your audit, should you have different actions and interpretations for each stakeholder?
“Yes. I break it down at a very top level. You first identify whether it's technical, on-page, or off-page and then you can see who the stakeholders are. That dictates the level of detail that you need.
That's why it's so much more straightforward and transferable in this format. You can just have the top-level detail for each person but then, if a manager needs more detail, you can include that in another tab or in the briefing document.
It's a living document. You have access to it now, but you also have access to it months down the line while you keep working on it. Department heads, C-suite individuals, and people like that are only likely to be interested in the big picture. Also, the people who are working in content will need different levels of information from people working in the tech space.”
Can you automate the analysis of this data to create a summary or is that a relatively human-driven task?
“For me, it's still a human-driven task – and I want to keep it that way. Automated tools are fine but they don't include any of the context that you need. They don't understand everything else that might be happening around the business.
As humans, we pick up that information in the initial discussions that we're having. We're able to take things into consideration like the size of the dev queue, how frequent the sprints are, when there's a code freeze, and what resources the client has. These all feed into the sheet.
I call it my RAG matrix: Red, Amber, and Green. All of this information that I talked about is included there, but there is also prioritisation. Red is for the most crucial issues, and green is for issues that we can come to further down the line. This is all refined based on the human elements that we take into consideration, which automated tools are not going to do.”
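Nikki doesn't spell out a scoring formula, and stresses that the prioritisation is a human call. Purely as an illustration of how a Red/Amber/Green bucket might be derived, here is a toy sketch with invented thresholds:

```python
def rag_priority(impact: int, effort: int, dev_capacity: int) -> str:
    """Assign a Red/Amber/Green bucket. The thresholds are invented for
    illustration; in practice this is a human judgement call that weighs
    context such as the dev queue, sprint cadence, and code freezes."""
    if impact >= 8:
        return "Red"    # crucial: schedule as soon as capacity allows
    if impact >= 4 and effort <= dev_capacity:
        return "Amber"  # worthwhile and feasible in the current cycle
    return "Green"      # safe to come back to further down the line


print(rag_priority(impact=9, effort=5, dev_capacity=3))  # -> Red
```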
How do you encourage dev teams to implement your recommended fixes more quickly?
“It's all about speaking their language. You can use the briefing document and you can make sure that you've got a seat at the table. You want to be speaking directly to the people responsible for implementing the fixes you're recommending.
A lot of people might put in a request to fix the canonicals, but how do you want them to be fixed? Is it the same blanket fix across the site? What is it that you need them to do? When I'm creating these documents and having these conversations with the people who are responsible for these fixes, I make sure that I'm talking about the scope of the issue. Is it one canonical or is it a thousand? What do you need to do about them?
Provide steps on how to reproduce them and any technical specifications. You can go into a bit more detail here. The issues might only be affecting certain browsers, operating systems, or mobile devices. It can even be different between mobile and desktop. You should also talk about the success metrics and KPIs, any considerations, and any blockers.
The success criteria are important because that’s how they will know that the issue has actually been fixed. You identify what they need to look for to be confident it's fixed, and they can pass it back to you to validate. You can also include things like approvals and any sign-off that they need to get.
These are not documents where you're explaining what a canonical is or why they are used. These documents should speak the language of the developers. They're designed to cut out the back-and-forth and ensure that they have the information they need to crack on and do what they do best.
Make sure that you're having conversations with these people and asking them what they need to know to be able to do their job. It may seem like a simple step, but it is often overlooked. Skipping those conversations creates a huge amount of frustration and back-and-forth for everyone involved.
You can even take this further and ask to be included in the handover or review calls. When the work is finished, and the developers are confident in it and getting ready to push it live to production, they will often have a call where they explain what they’ve done and how it works. If you’re on that call, and you have access to the staging site where you've tested it, you can confirm whether the issue has really been fixed or not.
It’s another way that you can check on the progress of these things and make sure that you're having the right conversations with the right people – before any work begins but also before anything is pushed live to the production site.”
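To illustrate the 'scope the issue' step: before briefing developers on a canonical fix, you can measure whether it affects one URL or a thousand. The Python sketch below is not from Nikki's briefing document; it assumes the requests and beautifulsoup4 packages, a URL list taken from your own crawl, and a deliberately naive canonical check (it ignores relative hrefs and redirects).

```python
# Scoping a canonical issue before briefing developers: is it one page or a
# thousand, and does the same fix apply everywhere? Illustrative sketch only.
from collections import Counter

import requests
from bs4 import BeautifulSoup


def canonical_status(url: str) -> str:
    """Naively classify a page's canonical tag."""
    html = requests.get(url, timeout=10).text
    tag = BeautifulSoup(html, "html.parser").find("link", rel="canonical")
    if tag is None:
        return "missing"
    return "self-referencing" if tag.get("href") == url else "points elsewhere"


# Hypothetical URL list; in practice this comes from your own crawl export.
urls = ["https://example.com/", "https://example.com/products/"]
print(Counter(canonical_status(u) for u in urls))
```

The same script, pointed at the staging site, can double as a success check when the fix comes back for validation.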
How often do stakeholders want to hear from you with a refreshed or updated audit?
“I didn't want to fall back on that old cliché, but it depends. It depends on the size of the website that you're working on. When I have all these items on an issue log, I plan them out month by month. Again, it’s a living document, so the priority and the focus items change each month based on where you are and what that month's priorities are. That roadmap usually lasts about a year but, if there are big algorithm changes or a migration that will affect things, we'll do a refresh.
I also have regular, scheduled crawls running on my client sites, looking for anything that's changed or been impacted in any way. I always keep the audit updated in that way. I'm not still talking about the two canonicals that were an issue in January if I can see that it's now 22 canonicals, or whatever the case may be. We've always got the latest information when we're talking about each item.
It is a live Google Doc that is updated all the time. I find that these are a lot easier to share with people at all levels. Some businesses, depending on their industry, don't like Google Docs and might prefer Excel. However, it's easily transferable – or you can use it on calls and share it with them that way.
In 99% of cases, Google Docs and Google Sheets work well. You share the link once and everybody has access to it, especially if they bookmark it. It makes everything a lot easier. It's also really handy for collaboration, and they can change the status of any items if they need to. If you're waiting for them to implement something like a page title change, they can update the status themselves rather than you having to keep chasing them. It really does become a living document that everybody refers back to.”
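One way to picture the refresh Nikki describes: diff each scheduled crawl's issue counts against the previous crawl, so the log never quotes January's numbers in March. The data below is invented for illustration; real exports vary by crawler.

```python
# Comparing this month's scheduled crawl against last month's, so the issue
# log never quotes stale counts. Data shape is invented for illustration.
last_month = {"missing canonicals": 2, "duplicate H1s": 40}
this_month = {"missing canonicals": 22, "duplicate H1s": 35, "broken hreflang": 6}

for issue in sorted(set(last_month) | set(this_month)):
    before, after = last_month.get(issue, 0), this_month.get(issue, 0)
    if before != after:
        print(f"{issue}: {before} -> {after}  (update the issue log)")
```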
If an SEO is struggling for time, what should they stop doing right now so they can spend more time doing what you suggest in 2024?
“Stop relying on these automated tools. What I mean by automated tools are the ones that audit your site for you and try to tell you what you need to work on. They tend to say that certain items are an issue when they're not.
I’ve had tools say that duplicate H1s are the number one issue on a website when they're not, because there are big rendering issues or broken hreflang that will have a far greater impact.
Do use these tools – I'm not saying avoid them altogether – but don't rely on them for your audit summaries or your prioritisation. It will always take too much of your time to sift through their output to see what the biggest priorities are. You're much better off getting hands-on and keeping that human element in the audits you're working on.
These tools can also miss things. The other day, I saw a chatbot that popped up on mobile devices and stopped users from clicking the checkout button. That was something we could fix quickly and easily. However, if we'd been relying on tools to audit the site for us, we'd never have picked that up, because it's a user experience issue – and a conversion issue as well. It took the human element to spot it and see which mobile devices it was happening on.”
Nikki Halliwell is Tech SEO Lead at Journey Further, and you can find her over at JourneyFurther.com.