Focus on what really counts: fresh, relevant data
Kaspar says: “Use relevant, fresh data for your SEO. Focus on what really counts; in particular, server logs for large websites.
Do not get distracted by short-lived SEO trends. SEO really is only about relevant data, and nothing else.”
What signals should you be tracking with data?
“There are a lot of sources of data out there that we, as SEOs, can tap into, and anything else is guesswork. Conducting SEO, building a strategy, and introducing changes (on the content, technical, or off-page side) without a data foundation is really just guesswork. That's a business risk, and you don't want that.
On the very basic level, every webmaster should be tapping into Google Search Console and Bing Webmaster Tools if they want their website to do well in Google Search. Bing Webmaster Tools is very important because that's another source of data that demonstrates how search engines perceive the signals from your website, but it doesn't stop there. Really large websites can do so much more.
One thing that is frequently neglected, if not completely ignored, is server logs. In my experience, large websites ignore server logs nine times out of ten. They don't record them, or they only record them for a brief period of time and subsequently ignore the data or overwrite it as time passes. That's really unfortunate because there are so many things that can be done with server logs recorded over an extended period of time.
To begin with, you can start understanding how much overlap there is between desirable landing pages that should be in the XML sitemap, and the volume of landing pages that are being frequently crawled and re-crawled by Google in order to understand the changes that are being introduced. That's just beginning to scratch the surface.”
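As an illustration of the sitemap-overlap check described above, here is a minimal sketch in Python. The "combined" access-log format and all sample data below are assumptions for illustration, not taken from the interview:

```python
# Hypothetical sketch: measure how much of the XML sitemap Googlebot has
# actually crawled, using Apache/Nginx "combined" access-log lines.
# Log lines, paths, and IPs are invented sample data.
import re

LOG_LINE = re.compile(r'"(?:GET|HEAD) (\S+) HTTP/[\d.]+"')

def crawled_paths(log_lines):
    """Paths requested by Googlebot, identified via the User-Agent field."""
    paths = set()
    for line in log_lines:
        if "Googlebot" not in line:
            continue
        m = LOG_LINE.search(line)
        if m:
            paths.add(m.group(1))
    return paths

def sitemap_coverage(sitemap_paths, log_lines):
    """Fraction of sitemap URLs Googlebot has fetched, plus the never-crawled set."""
    crawled = crawled_paths(log_lines)
    covered = sitemap_paths & crawled
    return len(covered) / len(sitemap_paths), sitemap_paths - crawled

sample_log = [
    '66.249.66.1 - - [10/Jan/2024:06:00:00 +0000] "GET /products/a HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Jan/2024:06:05:00 +0000] "GET /blog/filler HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '203.0.113.9 - - [10/Jan/2024:06:06:00 +0000] "GET /products/b HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
]
sitemap = {"/products/a", "/products/b"}
ratio, never_crawled = sitemap_coverage(sitemap, sample_log)
print(ratio)          # 0.5 -- only /products/a was fetched by Googlebot
print(never_crawled)  # {'/products/b'}
```

A low coverage ratio, or a large set of never-crawled sitemap URLs, is exactly the kind of gap this overlap analysis is meant to surface.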
What's the difference between Bing Webmaster Tools and Google Search Console?
“Of course, Google Search Console is far more relevant in terms of market share. The vast majority of online marketers primarily care about Google Search because the market share is so dominant; it’s still around 90-95%.
That remaining 5-10% might not seem like much, but it can be a lifeline when your organic traffic from Google isn't forthcoming for some reason or another. That could be because of a Google algorithm update, because something was done on the website unintentionally, or maybe because a legacy issue is holding the website back. That 5-10% can make a big difference.
In terms of the differences between the two, the data you get from Google Search Console is much more comprehensive and interesting. However, with Bing data at hand, you can verify things – or at least look at them from a different perspective. It isn't completely one-sided.
Most importantly, it is also cost-effective. All of that data is available free of charge, and you don't have to pay for an external tool. There are great tools out there, but many of them are paid solutions, so they require additional budget. When it comes to data, Google Search Console is the absolute minimum. If you want to take a tiny step beyond that, Bing Webmaster Tools will also help you to understand how your website is being perceived, how it's being crawled, and whether there are any issues. It doesn't stop there, of course; there's so much more that can be done.”
Can you automatically combine the data from Bing Webmaster Tools and Search Console in something like Looker Studio to get a more holistic view?
“It is important to export data in bulk and retain it because that data isn't going to be available to us in perpetuity. It is being overwritten. That data is a snapshot of the last 90 days or so. It is possible to export it and import it into other tools. We do it slightly differently (we have our own proprietary software and our own approach), but it can be done.
If that data is being exported and utilised, it needs to be done by someone who can understand what it actually means. You either need the capacity to read it, understand it, and draw conclusions from it in-house, or you will need a third party to help read and understand it.
It doesn't have to be done on a weekly, or even monthly, basis. There are large websites with 100 million or 500 million landing pages, where the data volume justifies the resources required to ensure that the website is being continuously improved and optimized. For smaller websites, though, it doesn't have to be done on a daily or weekly basis.
If you have a car, you take it to the garage for an annual checkup – to check the fluids, the brakes, the lights, and everything else that you want to work perfectly when you go on a family trip to the beach. You don't want to be stuck in the middle of the road because you've got a flat tire or the fluids have run low. It is similar with websites. For smaller websites, that data needs to be reviewed on a regular basis, but far from weekly or even monthly. Once per year is enough.
The important thing is that this is being done, and it is being done by somebody who is capable of understanding what the data means and how to translate that into actionable advice.”
What's the optimum way to use server logs?
“Server logs are my favourite topic. I really love server logs because they allow us unprecedented insights into how the website’s server interacts with bots – and not just Googlebot.
There is a multitude of things that can be understood, but a very important one is the crawl budget. You need to have server log data covering an extended period of time to verify how much of the website is being crawled on a regular basis, knowing that the homepage is going to be crawled much more frequently than other pages. Only with that data at hand can you understand how big your website is and how long it is going to take for Google to recrawl it.”
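The recrawl question can be made concrete: given timestamps of Googlebot hits per path, the average gap between successive hits approximates the recrawl interval. A minimal sketch, with invented paths and timestamps:

```python
# Hypothetical sketch: estimate how often Googlebot re-crawls each path by
# measuring the gaps between successive hits found in the server logs.
# All paths and timestamps below are invented sample data.
from datetime import datetime
from collections import defaultdict

def recrawl_intervals(hits):
    """hits: iterable of (path, timestamp). Returns the mean gap in days per path."""
    by_path = defaultdict(list)
    for path, ts in hits:
        by_path[path].append(ts)
    intervals = {}
    for path, stamps in by_path.items():
        stamps.sort()
        if len(stamps) < 2:
            continue  # crawled only once; no interval to measure yet
        gaps = [(b - a).total_seconds() / 86400 for a, b in zip(stamps, stamps[1:])]
        intervals[path] = sum(gaps) / len(gaps)
    return intervals

hits = [
    ("/", datetime(2024, 1, 1)), ("/", datetime(2024, 1, 2)), ("/", datetime(2024, 1, 3)),
    ("/offers/rome", datetime(2024, 1, 1)), ("/offers/rome", datetime(2024, 1, 15)),
]
print(recrawl_intervals(hits))  # {'/': 1.0, '/offers/rome': 14.0}
```

Consistent with the point above, the homepage shows a much shorter interval than a deep landing page; a two-week gap on a time-sensitive offer page is exactly the kind of risk discussed next.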
Is how often the website is being recrawled important?
“It depends. For instance, if you are in the travel industry or the retail industry then it’s critically important. If Google doesn't recrawl the changes introduced to your landing pages, you're running a huge risk of having expired, unusable content that is still ranking. Users can't purchase that travel package or the item they were looking for. If those landing pages still rank, it’s bound to create very poor user signals, which are going to destroy your rankings.
You want to understand how much of the website is being crawled, and what is being crawled. Is it the FAQ pages and the press releases? Is it the supplemental blog that is really just filler content or is it the actual sales pages? Is it the landing pages that you want to rank because they are the cash cows? Those are just a few critically important things.
Having server logs at hand, you can also understand how much waste there is, and how often the website is being scraped and crawled by fake bots.
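Spotting fake bots usually follows Google's documented Googlebot verification: reverse-DNS the requesting IP, check the hostname sits under googlebot.com or google.com, then forward-resolve the hostname and confirm it maps back to the same IP. In this sketch the resolver functions are injected so the logic runs without network access; in practice you would pass `socket.gethostbyaddr` / `socket.gethostbyname`. All DNS data below is assumed for illustration:

```python
# Hypothetical sketch of the reverse/forward DNS check used to verify Googlebot.
# reverse_dns(ip) -> hostname or None; forward_dns(host) -> ip or None.

def is_real_googlebot(ip, reverse_dns, forward_dns):
    try:
        host = reverse_dns(ip)
    except OSError:
        return False
    if host is None or not host.endswith((".googlebot.com", ".google.com")):
        return False  # hostname is not under a Google crawl domain
    try:
        return forward_dns(host) == ip  # forward-confirm the reverse lookup
    except OSError:
        return False

# Fake resolvers standing in for real DNS (assumed records, not verified):
reverse = {"66.249.66.1": "crawl-66-249-66-1.googlebot.com"}.get
forward = {"crawl-66-249-66-1.googlebot.com": "66.249.66.1"}.get

print(is_real_googlebot("66.249.66.1", reverse, forward))  # True
print(is_real_googlebot("203.0.113.9", reverse, forward))  # False
```

Log hits whose User-Agent claims Googlebot but fail this check are the scrapers and fake bots mentioned above; excluding them keeps the crawl-budget numbers honest.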
I penned an article on Search Engine a while ago that outlines the benefits of introducing server logs, and there are very few downsides. The only downsides, or roadblocks, originate from two departments. The first is often the legal department saying that server logs are problematic at a time when data is being protected. That’s true to an extent, but it doesn't apply to server logs because you’re only looking at bot data, not user data. There is no risk of utilising any data in an inappropriate way.
The other issue that is often brought to the table is that it's expensive. It is going to cost money, but very little. Hard drives are as cheap as they've ever been, and server log data is relatively small. It can be compressed and Gzipped, so it can be stored on a physical hard drive in perpetuity. It doesn't cost a lot of money, but the potential of tapping into that data in the course of conducting an SEO audit is huge.”
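The storage-cost point is easy to demonstrate: access-log lines are highly repetitive, so gzip shrinks them dramatically. A small sketch with an invented sample line:

```python
# Hypothetical sketch: gzip a block of near-identical access-log lines to
# illustrate how cheaply server logs can be archived. Sample line invented.
import gzip

line = ('66.249.66.1 - - [10/Jan/2024:06:00:00 +0000] '
        '"GET /products/a HTTP/1.1" 200 512 "-" "Googlebot/2.1"\n')
raw = (line * 10000).encode()   # roughly 1 MB of repetitive log data
packed = gzip.compress(raw)
print(len(raw), len(packed))    # the compressed size is a small fraction of the raw size
```

Real logs vary more than a single repeated line, but ratios of 90% or better are common, which is why long-term retention costs so little.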
How does an SEO define what is critically important for them to focus on?
“The honest answer is that it depends. It depends on every individual website and organisation. They're all different. If you are looking from a commercial perspective, and you want to do well in Google search, your unique selling proposition is going to be critical.
What is the one thing that makes you stand out? It could be because you have the best price ever, so your unique selling proposition would be founded on having the most attractive pricing. It could be that there is a great community behind the product, that you have the best selection of products in your vertical, etc. The important thing is to convey that message to the users and make sure they understand that this real brand has a unique selling proposition.
Then, you also need to convey that message on the landing pages, and in the snippet representations of landing pages that are being crawled, indexed, and ranked. Doing so will help you to have a massive positive impact on user signals. If your landing pages rank for relevant queries – and you meet or exceed the expectations behind those queries on the landing page – that is going to translate into positive user signals for Google.
That is what's going to help your website rank well. Google shows a preference for websites that are popular with users. Putting the user first, via your USP, is a winning strategy. It's not the only thing that needs to be done, especially in competitive environments, but without a unique selling proposition, it is very difficult indeed.”
If an SEO is struggling for time, what should they stop doing right now so they can spend more time doing what you suggest in 2024?
“Our industry isn't very different from others. There are a lot of myths and a lot of concerns on the client side that do not really need to be addressed. One thing you can stop doing is talking about, or even thinking about, domain authority. It is a poor allocation of your time and resources.
That is because Google doesn't care about that value. Whatever your perceived domain authority may be, it is not something that Google views as good or bad. It's of no consequence, so we can stop thinking about it in the SEO industry.”
Kaspar Szymanski is the director at Search Brothers, and you can find him over at SearchBrothers.com.