Slides from BrightonSEO 2020 - Competitor Site Audits using Free Tools and Data by Sophie Gibson, SEO Strategist at Rise at Seven.
These slides will cover how to use free tools and data to audit a competitor site from a technical point of view, how to put monetary figures on the issues found, and how to use this information to your advantage – using no enterprise solutions and only free tools and free online data. A case study on the Next website.
Only 23% of visitors that encounter a 404 page make a second attempt to find the missing page.
https://www.impactbnd.com/blog/intelligent-404-pages
I'm here to talk to you about… using free tools and data to audit a competitor site from a technical POV, putting a ££ figure on their issues, and using this to your advantage. Or…
(competitor site audits… for free) -
Maybe I could go into some details of why some people might be limited to free tools - they're in-house, or they're freelance.
I want to run through three main issues that I found from my own site audit, using…
using… zero enterprise solutions
And using only free tools
And free online data
Just a caveat - this is a case study, but they aren’t a client.
This is a case study of the Next.co.uk website - but just to clarify, I don't work with or for Next.
Or: how I found an estimated loss of revenue of…
This is a story about a site audit I completed, which found approximately £314,000 worth of issues.
How did I find these issues - and how did I estimate the value?
Let's investigate. So this was the homepage - I noticed they had these banners linking to specific trends - now this was November - which is great for those more specific searches you get.
Found a nice category page here… but what is this strange URL? Let's take a closer look.
So this has /0-homepage at the end of the URL - huh, that's weird. I wonder what happens when you take this part off…
Looks familiar huh? How similar?
Almost the same amount of products
So if there are two identical pages, how big of an issue is this for them? Let's have a look at… indexing:
Yep, Google is free, and can uncover…
First thing to do with any indexing check is to figure out how many pages are in the sitemap: so I visited the robots.txt file and grabbed the sitemap URL - it's an index, so let's have a look inside.
Because when I visited the sitemap, it's such a big site that they have split it into a bunch of separate sitemaps. How do I figure out how many pages are in the sitemap? To do that we can use…
… a free tool.
Using Screaming Frog - I'm talking about the free version here, which does have a limit: you can only crawl 500 URLs. But we don't actually need to crawl the site - we just need to see how many pages are in the sitemap… which you can do by...
Then upload - and you can either select Download Sitemap or Download Sitemap Index. Seeing as we have a sitemap index, we're going to select that option.
A pop up box will appear...
And you plug in your sitemap URL - here's our Next sitemap URL.
We just need to see how many pages are in the sitemap - we don't even need to press OK, because we don't need to actually crawl the site. As I mentioned, with the free version you can only crawl 500 URLs.
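If you'd rather skip the GUI entirely, the same count can be sketched in a few lines of Python. This is a rough stdlib-only sketch, assuming the sitemap index follows the standard sitemaps.org schema; the commented-out Next URL is just a placeholder for whatever you find in robots.txt.

```python
# Rough sketch: count URLs across a sitemap index without crawling the site.
# Assumes standard sitemaps.org XML; gzipped child sitemaps would need
# extra handling.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def fetch_xml(url):
    with urllib.request.urlopen(url) as resp:
        return ET.fromstring(resp.read())

def count_sitemap_urls(index_url):
    total = 0
    index = fetch_xml(index_url)
    # <sitemap><loc> entries in the index point at the child sitemaps
    for loc in index.findall("sm:sitemap/sm:loc", NS):
        child = fetch_xml(loc.text.strip())
        total += len(child.findall("sm:url", NS))
    return total

# count_sitemap_urls("https://www.next.co.uk/sitemap.xml")  # placeholder URL
```

This only reads the sitemap files, so it stays well clear of the 500-URL crawl limit.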
Basic audit checks - does their sitemap roughly reflect the number of URLs indexed in Google? (Yes, this isn't really accurate.)
… a free tool to check indexed pages in Google is…
Google search operators
Here we can use the site: search operator (yes, this isn't really accurate and can fluctuate, but as a general rule of thumb it's useful).
So that's about five hundred and sixty-nine thousand results.
So let’s dig deeper - if they’re getting links from the homepage wrong, what else are they getting wrong?
Yep - gross, isn't it? Look at that meta. Let's have a look to see how many pages with this in the title are indexed…
Let's have a look at how many are indexed - over 4,000 URLs.
So, considering how many links are on the homepage (it's not 4,000, for sure), that must mean they've either been there a long while, or the issue is widespread across the site.
So let's dig deeper - if the links from the homepage are duplicated, where else are key contenders for duplication issues?
They have a master page with an A-Z list - if this is just 'A', how many potential duplicates are there going to be?
Let's find out - so to do this, we're going to need… (next slide)
… a free tool.
After we've got our number, we now need to compare this with the number of results for the whole site.
And when you look at how many results there were for the whole site - this is a massive…
30% of indexed URLs
So let's dig deeper - let's look at specifics, just to check our assumptions.
Brand example - Mela. Now there are 496 results, and only the first two have custom meta data - the rest look auto-generated. So, why do we think this is the issue?
Going back to our partywear page. We've got some category selections - if they've got a lot of index bloat, what are the reasons we have multiple categories coming up? With ecommerce, one issue which always pops up is how filters are handled.
Let's pick jumpsuits, for example.
Now look what happens to the URL - it changes. Now, this doesn't mean anything on its own - so we can use a nice free tool to check this out...
…. Now we need some help from… A free tool.
SeeRobots will visually show you at a glance what the index status of a particular page is - it shows a coloured square, which means the following:
So not only does the URL change - the page has an index,follow tag on it, which means this URL is accessible and indexable for Google.
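If you don't have the extension to hand, the same check is easy to script. A minimal stdlib sketch, assuming the directive lives in a standard `<meta name="robots">` tag (the example page below is made up):

```python
# Sketch: read a page's meta robots directive, mirroring what the
# SeeRobots extension shows at a glance.
from html.parser import HTMLParser

class MetaRobotsParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.robots = None

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots = attrs.get("content", "")

def meta_robots(html):
    parser = MetaRobotsParser()
    parser.feed(html)
    # No tag at all is treated as index,follow by search engines
    return parser.robots if parser.robots is not None else "index,follow (implied)"

# Hypothetical filtered category page carrying an explicit tag
page = '<html><head><meta name="robots" content="index,follow"></head></html>'
print(meta_robots(page))  # → index,follow
```

Feed it the HTML of each filtered URL and you can spot indexable filter pages in bulk.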
And if every single category or filter has this tag on… this could be thousands of additional URLs being indexed. But to be sure…
Let's find out - so to do this, we're going to need… (next slide)
So I used this 'isort' term with the inurl: operator to find out how many URLs were affected by this issue. Now, it's not as big as it could potentially be.
Now, seeing Google states there is no information available for these pages - this makes me think that these isort terms have been excluded in the robots.txt file, which means the large majority of these pages may not be crawled. But as they are linked internally, this does have the potential to become an issue later down the line. But...
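You can sanity-check that theory with Python's stdlib robots.txt parser. The Disallow rule below is hypothetical - I don't know Next's actual file - and note the stdlib parser only does prefix matching, not wildcard rules:

```python
# Sketch: check whether sort-order URLs would be blocked by a robots.txt
# rule. The Disallow line is an illustrative guess, not Next's real file.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /shop/partywear?isort",  # hypothetical prefix rule
])

print(rp.can_fetch("*", "https://www.next.co.uk/shop/partywear?isort=price"))  # False
print(rp.can_fetch("*", "https://www.next.co.uk/shop/partywear"))              # True
```

Swap in the live robots.txt (via `rp.set_url(...)` and `rp.read()`) to test real URLs instead of the parsed sample.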
What else could be causing these issues?
Now, going back to our partywear page - there is also the word 'promotion' in the URL. I clicked around from the same banner area and found other pages with the same structure… so to check this, we go back to…
You guessed it - back to our handy inurl: search operator.
Here are the results for that search.
With around 25 thousand results for URLs with this in the URL - a much bigger issue.
And this led me to notice this quirky meta title data, which is listed on these top search terms: 'From the Next UK' - … hmm, okay, I think we all know what my next question is.
Let's find out - so to do this, we're going to need… (next slide)
https://ahrefs.com/blog/google-advanced-search-operators/ search operators again! But different - if I want to
Back to our handy inurl search operator
Now, the page title I found was a little longer: "from the next uk online store".
And this had 123 thousand results -
And when you look at how many results there were in Google when doing a site: search - this is a massive…
20% of indexed page titles have this super-long boilerplate - which could mean they're losing out on traffic or affecting click-through rate, seeing as non-optimised page titles are widespread.
So now we've got all of this information about issues on the site - what do we need to do now?
Yep, we need to show the money - we need to try and put a figure on these issues, so we're going to need some stats and values from somewhere. Where…
You can use statistics from case studies.
But how do I find these case studies?
Think with Google has a nice selection of case studies with useful stats in.
You can do a regular search for this - but sometimes I find the articles brought back aren't always what I'm looking for - so we can go back to our trusty search operators. We can use…
inurl:… or
intitle:
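As a rough illustration, queries along these lines can surface case studies - the exact phrasing is just an example, not the queries used in the talk:

```text
site:thinkwithgoogle.com intitle:"case study" seo
site:thinkwithgoogle.com inurl:case-study organic traffic
"case study" "increase in organic revenue"
```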
Or, for this, I just used exact match terms.
I found an article which said they had achieved a 22% increase in traffic and a 7% increase in revenue from organic. But what does that mean in terms of solid £££?
…. Now we need some help from… SpyFu, a competitor keyword tool...
And you click on the estimated monthly clicks section.
It brings you to the SEO overview tool. Here you can see a trend of how many keywords they're ranking for. I noticed this drop-off - I can see they went from ranking for 78k keywords in February…
…and in May this dropped to 25k - so, something happened there.
Going back to the study we found, that site had a 7% increase in organic revenue - a 7% increase on £1.78m =
This would net you roughly an extra £124k - which you are currently just throwing away.
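The arithmetic behind that figure, sketched out - the £1.78m is the organic-revenue estimate derived from the clicks data, and the 7% is the case-study uplift:

```python
# Sketch of the uplift maths from the case-study stat.
estimated_organic_revenue = 1_780_000  # £1.78m estimated organic revenue
case_study_uplift = 0.07               # 7% organic revenue increase from the study

potential_gain = estimated_organic_revenue * case_study_uplift
print(f"£{potential_gain:,.0f}")  # → £124,600
```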
When I was looking around, I stumbled upon the 404 page - which wasn't great. No menu on the page, just three static links to other category pages. Frustratingly, the logo didn't take you to the homepage either.
Let's find out - so to do this, we're going to need… (next slide)
You can use statistics from case studies.
Found this in a study, which found that only 23% of visitors make a second attempt.
Which is a 77% loss in traffic - but how do we find out how many people might reach a 404 page?
… a free tool.
Competitor
Back to our statistics - I found a case study which showed that typically ecommerce websites in the fashion industry have a 2.35% conversion rate.
And, taking a look at the general price of products, I took a guesstimate of around £30 for the average order value, as we don't have any more specific data.
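Those stats combine into a simple loss estimate. The monthly 404-sessions number below is a placeholder - in the talk it comes from free competitor-traffic data - while the other inputs are the figures above:

```python
# Sketch: putting a £ figure on 404 losses.
monthly_404_sessions = 10_000   # hypothetical: sessions hitting a 404 each month
abandon_rate = 0.77             # 77% never make a second attempt
conversion_rate = 0.0235        # 2.35% fashion ecommerce conversion rate
avg_order_value = 30            # £30 guesstimated AOV

lost_revenue = monthly_404_sessions * abandon_rate * conversion_rate * avg_order_value
print(f"£{lost_revenue:,.2f} per month")
```

Swap in the real 404 traffic figure to turn the placeholder into an actual estimate.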
… a free tool.
All non-standard implementations.
Site tags firing 4 pageview requests - category page: 13 tags!
Multiple analytics properties & GTM containers. Too many tracking codes for the same thing causes inaccurate Analytics data.
If you don’t know what online channels are working - how do you know where to best invest the marketing budget?
An increase in third-party tracking scripts slows down page loading speed.
… a free tool.
To SEMrush
You need to create an account
And you get 10 free checks
According to SEMrush, 70% of traffic has ZERO referral data and goes into the direct bucket - this means that potentially they have 11m sessions without attribution. (You'd need to check GA for specific figures.)
This study showed that each script added to a page increases load time by 34.1 ms.
If there are 13 tags on the site, this adds up to…
442 ms in total
Back to our statistics - I found a case study which showed that typically ecommerce websites in the fashion industry have a 2.35% conversion rate.
34 ms per tag
13 tags on the site
= 442 ms
Reducing scripts by just 50%
= 2.21% potential increase in revenue
= £39k per month
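The speed maths laid out. One caveat: the "1% revenue per 100 ms" conversion is my assumption, chosen because it reproduces the 2.21% figure on the slide (it echoes the oft-quoted Amazon latency stat, not a number from this deck):

```python
# Sketch of the tag-bloat maths from the slides.
ms_per_script = 34     # ~34.1 ms added per script, per the study (rounded)
tags_on_page = 13      # tags seen firing on the category page

total_delay_ms = ms_per_script * tags_on_page   # 442 ms
saved_ms = total_delay_ms * 0.5                 # halve the tags -> 221 ms saved
uplift_pct = saved_ms / 100 * 1.0               # assumption: 1% revenue per 100 ms
print(f"{uplift_pct:.2f}% potential revenue increase")  # → 2.21%
```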
If you add all this together you get….
A net loss of 304 thousand visits.
Yes, the figures are estimated - these findings are not going to be super accurate without data access or paid tools, and we're making a lot of assumptions.
And technically, they've not 'lost' this revenue - these figures are potential revenue they're missing out on due to specific issues.
But that being said
A pitch that's rough and ready with good insights can make a bigger difference than something super accurate which took 10x longer and says less - especially if you're telling a story to get someone else on board, or if you're super strapped for time.
You don't need an arsenal of tools to be able to audit large, enterprise-level sites. Don't be intimidated by their size, either.
You can find a lot of issues just clicking around a website - Once you’ve found one main thread, it’s easier to find related issues from the same angle.
You can use this information to bolster client pitches - "If we can figure out what issues you might be having from external data sources, imagine what we can do with access to your own data."
We can use this information to add additional insight into pitches - we can infer hidden problems by making some assumptions. For example, in this case we might infer that teams may not be working together that well, as it looks like the marketing department is creating new banners and 'featured categories' without SEO in mind.
Automated meta is still being used in sections, so there's seemingly no process in place when adding them.
You can use this information to show them other services which you could help with.
Create workflows / support the Next.co.uk product team with maintaining site health.
Nothing gets resources or focus in the areas you want to go after better than telling your boss you'd be getting one over on the competitors if they can swoop in RIGHT THIS SECOND.