What is a Technical SEO Audit?
Search engine optimization (SEO), at its core, is about making sure your web pages can be crawled, understood, and ranked well by search engines. Getting your site to rank well takes a lot of work, and to get the best possible results it’s critical that your site is secure, optimized, and error-free.
A technical SEO audit is a review of your website, usually to identify and fix issues that might affect your site’s rankings.
The key elements of a technical audit are:
- Crawling
- Indexing
- Rendering
- Site speed
- Site structure
Technical SEO audits are an important part of any SEO campaign, but they are time-consuming. Done right, a technical SEO audit will tell you which issues your site has and how to fix them, though it does require an eye for detail and some technical know-how.
How long should a technical audit take?
Depending on the size of the website, the number of pages, and of course the number of issues that show up along the way, a technical audit can take anywhere from a day to an entire week.
When performing a technical SEO audit, it’s important to recognise that SEO is also a moving target.
The goalposts are constantly moving, and algorithm updates can pull the rug from under your feet. It’s what makes SEO exciting for some, and incredibly frustrating for those who are hit with issues and updates.
Without going too deep down the technical rabbit hole, we’ll focus on some common errors I always tend to find, as well as a few issues that just often get overlooked.
So, here is a step-by-step process for your DIY technical audit to ensure your website has a strong foundation.
Before we start, let’s cover the basics: what you need, and what you should check before deep-diving into a technical audit.
We can start with:
Access To Google Search Console:
If you already have access and have registered your site with Google Search Console, then great. Skip ahead!
If you are not already set up, then I highly recommend starting with Rank Math, a free and very powerful SEO plugin that will come in handy in the long run. It’s a great alternative to Yoast and takes the hassle out of the technical setup.
It will also get you up and running pretty quickly with Search Console.
You can download Rank Math for free, and the guide to getting started with Search Console will have you up and running in no time.
If you prefer the manual route, then here is an in-depth, step-by-step guide to get set up with Search Console.
Check For Manual Actions:
Once you are up and running with Search Console, it may take a few days for data to populate, so you may need to bookmark this step and come back to it.
Once you have Search Console data, log in and, in the left-hand navigation panel, select Security & Manual Actions > Manual Actions.
If you have “No Issues Detected” then great.
We can move ahead.
If you have a manual action, then resolving it is a priority before going any further.
Essentially, Google has flagged your site for being shady/malicious/not worthy of being shared with other people online.
Maybe it’s something you’ve done without ill intent, but your site will be penalized, and your pages or even your entire site can be demoted in rankings or de-indexed from Google completely.
It’s rare, but it happens. And it will take time to fix.
Make Sure Your Site Is Mobile Friendly:
Since 2020, Google has moved to a mobile-first approach. This means the mobile version of your site is given priority over the desktop version.
So when designing a website you should make sure mobile users have a great experience before you start bringing in traffic.
If your page is mobile friendly then we are off to a good start. If not, you should consider some design changes to get your site on Google’s good side.
The next step is to head over to Google Search Console. In the left-hand menu, click Experience > Mobile Usability. Here you can check for any further warnings or errors that might be holding back your mobile usability.
We will also be checking some other issues later, so you might need to come back to this one again.
Check Your Site Is Being Indexed:
To check your site is being indexed by search engines you can go directly to the search engine and do a manual check. Just head to Google (and Bing, don’t forget Bing 🙃) and enter:
site:yoursite.com
You will see how many pages are being indexed. (Although this is not completely reliable, it is a good indicator that your site is being seen by search engines.)
If you notice anything unusual here, like no pages at all, or just a handful on a large website, it’s time to head back over to Google Search Console.
Navigate the left side menu to Index > Coverage:
Then look for any errors or warnings, and see if Google is flagging issues with your pages.
If you are having trouble with any errors here, head over to ContentKing’s Guide to Finding & Fixing Coverage Errors.
If your pages are valid and you are happy with what you see, it’s time to move on and start digging a little deeper.
We’ve made it this far. Not too technical so far, right?
Now on to the fun stuff!
HTTP Status:
The best place to start any SEO audit is by evaluating HTTP status codes. You can begin with some manual checks, and if there is an issue here, it’s a valuable one to catch early.
What is the issue?
An HTTP (Hypertext Transfer Protocol) status code is the server’s response that tells the browser whether a request succeeded or something went wrong.
Let’s start with HTTP vs HTTPS. That ‘s’ at the end makes all the difference!
Why is this an issue?
The boring part: the HTTP protocol is used for transferring data between web servers, web browsers, and other network clients. HTTPS is a secure version of HTTP that encrypts data as it travels between the visitor’s browser and your server.
This means that the data cannot be intercepted by a third party and decoded. It also means that you can be sure that the data has not been tampered with on its journey.
When your site loads unsecured, you will see a “Not Secure” warning in the URL bar, and visitors may even get a full browser warning when trying to load the page, which most users simply click away from.
How to fix:
The fun part: the best place to start your check is Google itself. If Google is indexing unsecured pages, then it’s best to keep Google happy and make those pages a little more secure.
You can start with a site search as follows in Google:
site:yourdomain.com -inurl:https
This search will show any indexed pages that aren’t on your main HTTPS domain.
It’s always a good manual start to a tech audit, as it can turn up some pretty weird stuff at times: subdomains, staging sites, duplicate non-secured pages, and any other non-secured pages indexed by Google.
[Note: These inurl: searches aren’t as reliable on Google these days, but it’s still always worth trying.]
These should be your first fix. Or at least your first point to take notes of anything odd that turns up here.
You can usually fix these issues with a simple redirect to an HTTPS version of the URL.
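If you want to script this check, here’s a minimal Python sketch, assuming the requests library is installed and using placeholder URLs; it confirms each HTTP URL redirects to an HTTPS equivalent:

import requests

# Placeholder URLs: swap in anything the site: search above turned up
urls = ["http://example.com/", "http://example.com/old-page"]

for url in urls:
    r = requests.get(url, allow_redirects=False, timeout=10)
    location = r.headers.get("Location", "-")
    # A healthy setup returns 301 with an https:// Location for every http:// URL
    print(url, r.status_code, "->", location)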
Another issue could be with canonicals, which we will cover in more detail later. Just to touch on it here: Google could surface a non-HTTPS URL because your canonicals reference the HTTP version of the page instead of the HTTPS one.
Take, for example, a site that is secured but whose canonical doesn’t reference the secured version of the page.
Or, you could be internally linking to the HTTP version from links across your own site. You can find this with a Screaming Frog crawl (or an alternative crawler), which we will also cover a little later.
Files not secured:
Another common issue: a page can load as unsecured, even though it is served over HTTPS, because images or other files on the page are loaded over HTTP.
This could be caused by moving your site over from staging, or just uploading files with the wrong URLs. The great thing is you can usually find this with a Google Lighthouse audit.
The issue will fall under “Best Practices”, and it will generally show you the exact file causing the problem. You can just search for the file in your site media, or cPanel, and change the source to HTTPS. Sorted!
Now you can appease Google, get rewarded with that little lock icon beside all your URLs, and sleep soundly at night.
We’ll come back to other HTTP status codes later.
Using Screaming Frog:
It’s pretty important in a technical audit to have a program that can crawl your entire site efficiently and save you hours, days, or weeks of manual checks.
Sure, you can do a lot manually for free, but it’s a lot of hassle.
There are a lot of great options out there now, and most of them are very affordable for what they do. But Screaming Frog is free for up to 500 URLs, so we can start with that.
You can download it here: https://www.screamingfrog.co.uk/
See our guide on how to automate a Screaming Frog audit as well.
How to set up Screaming Frog:
When you first open it up and start, it’s a bit like jumping into the Matrix for the first time. Honestly, it’s not so bad when you know what to look for.
So we’ll keep this one simple, especially if you’re just using the free version, and in the spirit of not being too technical.
Before you add your site there are a few configurations to make.
- Configuration > Spider: this controls what the crawler checks; for a simple first crawl you can leave it as is.
- Configuration > Speed: this one is important. Keep it to a max of 2 or 3 URLs/s. If you let Screaming Frog crawl your site at warp speed, your website could crash. I’ve taken down my own site doing this; granted, the hosting package was terrible, but it was a lesson learned regardless.
- Add your site and click Start.
Now that you’ve got your crawl finished we can start making sense of that mountain of data.
Check Canonicalisation
Here’s a sheet to follow along with this one and check canonicalisation on your own URLs.
What is the issue?
Canonical tags tell Google what the “Main” URL is on your site.
This comes in handy if you have duplicate pages for whatever reason, either by accident or for design/split-testing purposes. Whatever the case, it’s always a good idea to let Google know which page you consider your priority URL.
For example:
https://website.com/landing-page-test
https://website.com/landing-page
You want to make whichever page you drive organic traffic and internal links to the canonical page, e.g. a <link rel="canonical" href="https://website.com/landing-page"> tag in the <head> of both pages.
Why is this an issue?
Canonical tags can very easily be set up incorrectly. They could be pointing to an HTTP page as mentioned already, pointing to a redirect, or pointing to a URL with a trailing slash when all your URLs end without one. (Read more)
How to fix:
Just follow along with the sheet above and the data from Screaming Frog. Check that all your canonicals are consistent; if not, it’s definitely something to look into updating.
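If you’d rather spot-check a handful of pages by script, here’s a small sketch, assuming the requests and beautifulsoup4 libraries and using the example URLs from above as placeholders; it prints the canonical each page declares so you can compare it against the URL you expect:

import requests
from bs4 import BeautifulSoup

# Placeholder URLs: use the list exported from Screaming Frog
urls = ["https://website.com/landing-page", "https://website.com/landing-page-test"]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    tag = soup.find("link", rel="canonical")
    canonical = tag.get("href") if tag else "MISSING"
    # Flag anything where the canonical isn't the page you consider the priority URL
    print(url, "->", canonical)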
This leads us into the next step.
Slash or No Slash /
What is the issue?
URLs are kind of like a Guns N’ Roses tour: sometimes they have a slash, and sometimes they don’t.
The issue is if you have both. (URLs, not guitarists.)
When your web page loads both with and without a trailing slash, you are telling Google you have 2 separate pages.
Generally, websites are set up with a redirect already configured, but if your site doesn’t there are a few things to look out for and prioritise.
Why is this an issue?
It’s not a deal breaker, but if it’s an easy fix then it’s worth sorting out. The problem here lies in duplicate content, wasted crawl budget, and oftentimes wasted links. The links are the biggest one to look out for: if you have links pointing to two versions of a URL, you are essentially diluting your priority page.
Traditionally, URLs that pointed to files did not include the trailing slash, while URLs that pointed to directories did. This means that:
https://website.com/example/ is a directory, while
https://website.com/example is a file
How to fix:
If your entire site loads with both versions, redirecting page by page is going to be a monumental task if you have hundreds or thousands of pages.
You can force a sitewide redirect using .htaccess, or via your permalink settings in WordPress (changing permalinks is not recommended on a site with hundreds or thousands of pages). If that’s your choice, then here is the best guide to setting up your preferred URL structure.
Or, you can use Google Search Console to see which version of each page is driving the most traffic, then redirect to the best-performing one.
This is the best approach to ensure you are keeping your link equity.
https://website.com/example/ is getting 1000 visitors a month
https://website.com/example is getting 10 visitors a month
You need to redirect https://website.com/example to https://website.com/example/
You can also choose to 404 the page you don’t want to keep, but it’s a better user experience to ensure a 301 redirect is in place.
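Before setting up redirects, you can quickly confirm whether both versions really resolve as separate pages. A minimal sketch, assuming the requests library and a placeholder URL:

import requests

url = "https://website.com/example"  # placeholder
for variant in (url, url + "/"):
    r = requests.get(variant, allow_redirects=False, timeout=10)
    # Two 200s means two duplicate pages; ideally one 301s to the other
    print(variant, r.status_code, r.headers.get("Location", "-"))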
404 Errors:
What is the issue?
A 404 error, also known as a not-found error, is displayed when you search for or click a link for a page but it can’t be displayed.
This might happen for several reasons:
Internal 404 errors: if you link to a page that was deleted, or just typed a URL wrong in your link, you may be sending users to a 404 page.
Inbound 404s: where someone else has linked to your page, but it was deleted, moved, or they got the URL wrong on their end. These are important!
Outbound 404s: where you link to another site but are sending users to a broken page. Bad for your content, and bad for the other site too.
Redirect chains: after about 5 redirects, bots will give up on your page. John Mueller gave some advice on Reddit about this one, saying “Search engines just follow the redirect chain (for Google: up to 5 hops in the chain per crawl attempt)”. So at the end of a long redirect chain, it’s likely a page will effectively behave like a 404.
Why is this an issue?
You don’t want to annoy your visitors. Your goal is to keep them onsite as long as possible and give them the information that they want. So, sending them to a 404 page is probably not going to help.
Maybe if your 404 page is awesome like this one: Readytogosurvival
But even still, as tempted as you might be to show off your incredibly well-designed 404 page, you should probably sort out your broken links. That way your 404 pages will eventually just disappear from the face of the earth and not bother anyone again.
How to fix:
The first step is to find all the broken links you are sending people to across your site. You can do this with a Screaming Frog crawl by filtering for 4xx response codes; you are looking for anything with a 404 status.
Then open the “Inlinks” tab, and you can see exactly where you are linking to these 404 pages from. You can decide whether to remove these links and let the page wither away and disappear, or 301 redirect it to a more relevant page.
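For a quick scripted spot-check of a single page’s links outside of Screaming Frog, here’s a rough sketch, assuming requests and beautifulsoup4 and a placeholder page URL:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page = "https://website.com/blog-post"  # placeholder
soup = BeautifulSoup(requests.get(page, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(page, a["href"])
    if not link.startswith("http"):
        continue  # skip mailto:, tel:, and similar links
    try:
        # HEAD keeps it light; a few servers reject HEAD, so treat errors as suspect
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "error"
    if status == 404 or status == "error":
        print(page, "links to a broken URL:", link)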
More on 301s next!
301 Redirects:
What is the issue?
301 redirects are permanent redirects used when you want to send someone from an old URL to an updated one.
Even going from HTTP to HTTPS, which we looked at earlier, involves a 301 redirect, as it is a new URL.
When you set up a 301 you are telling search engines “That piece of content you crawled and indexed before now lives over here”
And most importantly, you are passing link strength through these redirects, so any links pointing to your old URL are not wasted when you redirect from it.
Redirects are important to have in place in a lot of situations, but the issue is you could be slowing down your site with redirects, and possibly losing link power to certain pages.
Why is this an issue?
First issue: 301 redirects can add loading time when pages are reached through internal links. Rather than linking directly to your new URL, you could be linking to an old URL, sending your site users through an extra “hop” to get where they need to be.
It’s kind of like having 2 doors in a corridor to get to the next room. It doesn’t harm anyone having them, but do they need to be there?
The bigger issue: link equity. You could be passing link equity through a 301 redirect rather than directly to the main URL. This isn’t a massive issue; however, according to John Mueller, “301 redirects cannot pass 100% page rank”.
So really, you are wasting some of your website’s potential by passing key pages and key internal links through redirects.
How to fix:
Using Screaming Frog, we can pull back a list of all 301 redirects on a website. This one from Wired.com is a great example: it shows they have fixed their trailing slash issue by redirecting the non-trailing-slash page to a canonical with the trailing slash.
However, they are still internally linking to the non-trailing-slash version, which causes an extra hop to reach that page.
Using the “Inlinks” tab, you can see where these links are coming from. This is something you can choose to clean up by pointing the internal link straight at the canonical.
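To see exactly how many hops a redirect chain takes, here’s a small tracer sketch, assuming the requests library and a placeholder starting URL:

import requests
from urllib.parse import urljoin

url = "http://website.com/old-page"  # placeholder
for _ in range(10):
    r = requests.get(url, allow_redirects=False, timeout=10)
    if r.status_code not in (301, 302, 307, 308):
        print(url, "final status:", r.status_code)
        break
    nxt = urljoin(url, r.headers["Location"])
    print(url, r.status_code, "->", nxt)
    url = nxt
# More than one hop on an internal link is worth cleaning up;
# Google follows up to 5 hops per crawl attempt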
You can also 404 these pages, unless you have been receiving links from other websites to both pages and want to keep hold of some of that link equity.
So, for example, if
https://www.wired.com/story/health-business-deprivation-technology had 5 links from other sites and
https://www.wired.com/story/health-business-deprivation-technology/ had 50, then the URL with the most links should be your canonical, and you keep the 301 redirect in place to capture the link equity from the other 5 links.
Duplicate Homepages
What is the issue?
It’s possible to have multiple homepage URLs. Here are just some examples of possible homepage variations:
Yoursite.com
Yoursite.com/index.php
Yoursite.com/index.html
Yoursite.com/home.php
www.Yoursite.com
www.Yoursite.com/index.php
www.Yoursite.com/index.html
www.Yoursite.com/home.php
So, that’s a lot of homepage URLs, some of which you most likely haven’t seen or taken notice of before.
Why is this an issue?
- Your internal and external links can be split between multiple homepages, giving you less link equity on your key page. And your homepage is your key page!
- Google can index variations of your homepage. Strangely enough, I’ve seen Google index different homepages based on location searches. This splits traffic across multiple homepages when you’re trying to track your campaigns, and having multiple homepages indexed increases the chance of links being sent to multiple homepages.
- Fixing this is great peace of mind going forward: it’s simply less that can go wrong. The fewer issues the better.
How to fix:
The easiest fix here is to 301 redirect each variation to the page you want to treat as canonical.
So the redirects here should be set up individually to your best performing key page like so:
Yoursite.com/index.php > Yoursite.com
Yoursite.com/index.html > Yoursite.com
Yoursite.com/home.php > Yoursite.com
www.Yoursite.com > Yoursite.com
www.Yoursite.com/index.php > Yoursite.com
www.Yoursite.com/index.html > Yoursite.com
www.Yoursite.com/home.php > Yoursite.com
Once your redirects are complete, you want to make sure all internal links point to your canonical as well. So, in the example above, if your logo links to www.Yoursite.com, you should change that to Yoursite.com.
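To verify the redirects, here’s a quick sketch, assuming the requests library and placeholder homepage variations; each variant should come back as a 301 pointing at your canonical homepage:

import requests

variants = [
    "https://yoursite.com/index.php",
    "https://yoursite.com/index.html",
    "https://yoursite.com/home.php",
    "https://www.yoursite.com/",
]
for v in variants:
    r = requests.get(v, allow_redirects=False, timeout=10)
    # Expect 301 with Location pointing at https://yoursite.com/
    print(v, r.status_code, "->", r.headers.get("Location", "-"))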
Did your site have duplicate homepages? Yeah, it’s not an easy one when you first see it, but hopefully the steps above are a good guide.
Website URL structure:
Here’s a handy sheet for this one. URL Title & Meta Length Checker.
What is the issue?
Your site might be set up in such a way that a lot of your URLs look like this:
Yoursite.com/super-long-description-with-lots-of-keywords-that-looks-way-too-long-and-not-very-user-or-search-engine-friendly-2/
Why is this an issue?
You might just let your CMS or SEO plugin create a URL for every page you add to your site. Unfortunately, it can sometimes go a little overboard, leaving you with some eyesore URLs.
It’s best to set this manually yourself each time a new page is added to the site.
How to fix:
You can grab your URLs from Screaming Frog and paste them into the Google Sheet attached. You can also pull your page titles and meta descriptions while you’re there and get a good look at anything you might want to work on.
Important.
If you want to change URLs that already exist, it’s a good idea to check each page for links and traffic before committing to any changes. And if you do make changes, always update your internal links to the new URL so you aren’t linking through a 301 redirect.
I would suggest for this one, just take note of it going forward and follow best practices from here on in rather than overhauling your existing setup. Unless it’s a new site with little traffic, then go crazy if you want to.
There is no real definitive guide for a perfect URL structure, and your URL isn’t (as far as we know) going to stop you ranking, but there are definitely best practices to follow.
By the way, if you are on a Shopify store skip to the end for an added bonus tip.
Best practices for URLs are:
1. Include the keyword:
If your page is about technical SEO audits, then your URL should ideally be yourwebsite.com/technical-seo-audit
It just looks better in the SERP and helps your potential traffic, and Google, better understand your page.
2. Avoid using dates:
Your guide to the best thing ever for 2022 might be great.
yourwebsite.com/great-thing-to-do-2022
But in 2023, that URL is not going to look too fresh.
3. Use hyphens instead of underscores:
Matt Cutts, formerly of Google, explains this one.
Essentially, Google treats hyphens in URLs as separators, while underscores don’t separate words. So blog-post is viewed as “blog post”, whereas blog_post is viewed as “blogpost”.
4. Keep them short and to the point:
Longer URLs are not fully displayed in search results, so a potential visitor might not see the descriptive keyword for your page.
It’s advisable to aim for around 75 characters for the entire URL. You can use the Google Sheet attached to help with this one.
5. Use lowercase characters:
Search engines can treat capital letters differently from lowercase ones. So
yourwebsite.com/technical-seo-audit
is actually a different URL from
yourwebsite.com/Technical-seo-audit
So if someone links to your site, or you link internally or create canonical URLs, it’s easy to type the URL with different capitalisation, splitting link equity away from your canonical page.
And again, just to note: unless your URL structure really is a complete mess, it’s best to change habits and follow best practices going forward rather than making wholesale changes that could potentially lead to even more work down the line.
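If you want to screen a crawl export against these best practices, here’s a rough sketch, assuming a plain text file of URLs (one per line, e.g. exported from Screaming Frog; the filename is a placeholder); the checks mirror the points above:

from urllib.parse import urlparse

with open("urls.txt") as f:  # placeholder filename
    for url in (line.strip() for line in f if line.strip()):
        path = urlparse(url).path
        problems = []
        if len(url) > 75:
            problems.append("over 75 characters")
        if "_" in path:
            problems.append("contains underscores")
        if path != path.lower():
            problems.append("contains uppercase")
        # Crude date check: any 4-digit chunk in the slug, e.g. -2022
        if any(part.isdigit() and len(part) == 4 for part in path.strip("/").replace("/", "-").split("-")):
            problems.append("possible date in slug")
        if problems:
            print(url, "->", ", ".join(problems))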
Images Being Indexed As Pages:
This is always an interesting one to find. Any WordPress website over 5 years old will likely have this issue too!
What is the issue?
For one reason or another, Google may index your images as their own unique pages. Now, you might think: that’s great, my images are showing in Google! It’s not what you think (probably).
These pages show up in the main search results, not image search. And they appear as their own unique canonicalised pages with absolutely no content other than the image itself. This can create a mountain of thin-content pages you need to take care of.
Why is this an issue?
Thin content and crawl bloat mostly.
If your website runs on WordPress, it’s very likely you use Yoast SEO. Yoast is a fantastic SEO plugin for WordPress sites, one of the originals, used widely across almost every site that gave some thought to SEO. But, like anything, it’s prone to a mistake or two.
Each time you upload an image, WordPress creates an attachment page for it. So it’s likely you will have hundreds, maybe thousands, of images with their own pages, creating thin content of absolutely no value. This is bad.
Fortunately, by default, Yoast redirects each attachment page to the image itself. This is good.
In 2018, during the 7.0 update, Yoast changed a pretty important media setting.
It said No when it should have said Yes. This caused all the redirects to disappear and created multiple sitemaps on large sites with tonnes of images, which were sent to Google for indexing. By the time the bug was fixed, a lot of attachment pages had likely been indexed, and blocking them in robots.txt actually stopped Google from re-crawling and de-indexing them. This is bad.
How to fix:
Here is a very useful guide to the Yoast bug.
In Yoast, go to SEO > Search Appearance > Media and find the “Redirect attachment URLs to the attachment itself?” option. Select “Yes”; as noted above, this is the setting the 7.0 bug flipped to “No”. With the redirects back on, attachment pages will stop being indexed as their own thin pages.
If you want to remove these in bulk from Google you can use the removals tool.
Robots Txt File:
What is the issue?
If you want to know all the ins and outs of your robots.txt file, Google has a pretty in-depth robots.txt guide.
Your website doesn’t necessarily need a robots.txt file, but as your site grows, and especially for an ecommerce site, it can be pretty important to understand and set up.
The issue we are looking for here is a possible misconfiguration of the robots.txt file that is blocking key pages or possibly your entire website being crawled.
A robots.txt file generally looks something like this:
User-agent: *
Disallow: /wp-admin/
Sitemap: https://yoursite.com/sitemap.xml
To check your robots.txt file, just go to:
yoursite.com/robots.txt
It should bring you to a plain text page like the example above.
Not very exciting, and what does any of this even mean?
Why is this an issue?
A single stray Disallow rule can block search engines from crawling key pages, or, with Disallow: /, your entire site. Pages that can’t be crawled can drop out of the rankings, and new content won’t get picked up.
How to fix:
Open yoursite.com/robots.txt and check every Disallow line against the pages you actually want crawled. Remove or narrow any rule blocking important sections, and make sure your sitemap URL is listed.
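Python’s built-in robotparser gives you a quick way to test whether key pages are crawlable under your current rules. A minimal sketch with placeholder URLs:

from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://yoursite.com/robots.txt")  # placeholder domain
rp.read()

key_pages = [
    "https://yoursite.com/",
    "https://yoursite.com/blog/",
    "https://yoursite.com/products/",
]
for page in key_pages:
    # Googlebot is the user agent that matters most here
    allowed = rp.can_fetch("Googlebot", page)
    print(page, "crawlable" if allowed else "BLOCKED by robots.txt")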
Sitemap Optimization
If you already use Yoast, Rank Math, or any other SEO plugin, and you’re connected to Search Console, then you should have a sitemap in place. It’s probably already been created for you.
If not, here’s a quick guide to creating your first sitemap.
You can check your sitemap at:
yoursite.com/sitemap.xml
Or
yoursite.com/sitemap_index.xml
If your sitemap isn’t set up, you can follow Google’s guidelines to create one, then submit it to Search Console. You can also drop your sitemap link into your robots.txt file.
What is the issue?
Your sitemap tells Google what pages you consider important and worth crawling and indexing.
Why is this an issue?
The problem here is that your sitemap could contain errors, or pages you don’t actually want indexed. This confuses Google, and it’s probably best not to do that.
How to fix:
You can use Screaming Frog to crawl your sitemap and check for errors.
Open Screaming Frog.
Navigate to Mode > List
Navigate to Upload Sitemap > Download XML sitemap
Upload your sitemap link
Then click OK and let the crawl begin.
What to look for:
Pretty much anything we have covered so far: 301s, 404s, other 3xx/4xx codes, anything that looks weird, image attachment pages returning status 200.
These are all URLs you don’t want Google to crawl and index.
You want to fix any of these errors as shown above, or create a new sitemap with these pages removed and submit it through Search Console.
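If you’d rather script the sitemap check, here’s a hedged sketch, assuming the requests library, a standard single XML sitemap (not a sitemap index), and a placeholder URL:

import requests
import xml.etree.ElementTree as ET

sitemap_url = "https://yoursite.com/sitemap.xml"  # placeholder
ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

root = ET.fromstring(requests.get(sitemap_url, timeout=10).content)
for loc in root.iter(ns + "loc"):
    status = requests.head(loc.text, allow_redirects=False, timeout=10).status_code
    if status != 200:
        # Anything that isn't a clean 200 doesn't belong in the sitemap
        print(loc.text, "returned", status)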
Site Speed:
What is the issue?
Website loading speed has been a ranking factor since around 2010.
To get a good indication of how your website performs, you can test your website speed for free using these websites.
It can get pretty technical, but let’s look at the main elements here. Take GTmetrix, for example. Your overall score should be pretty telling, but how do you know what needs to be worked on?
Let’s scroll down to the page details. These are all the elements making up your page: fonts, images, JS, CSS, HTML, and the mysterious “Other”.
If your images are taking up a couple of MBs here, this can be a huge speed-saving opportunity.
Maybe you can minify some code for a slight speed boost.
It could even just be a sign you need to upgrade your hosting package to something a little more high end.
Now that we have a good idea about page speed & page size, we can check out any further issues on Google Lighthouse before determining what action can be taken.
Google Lighthouse Audit:
Carrying on with site performance and website speed, you can run a Google Lighthouse audit from the Chrome browser. I always prefer opening an incognito window, in case browser extensions affect the check.
Just navigate to the page you want to check, right-click the page, and hit Inspect. This opens a panel with a few tabs; you’re looking for “Lighthouse”. Once you open Lighthouse you have 2 options, Mobile and Desktop. Google says mobile-first indexing, so let’s check mobile speed first.
Just click “Analyze page load” and let it do its thing. Once finished, you will be given a set of results covering:
- Performance
- Accessibility
- Best Practices
- SEO
- PWA
This might look pretty daunting as you start to scroll through and see the sheer amount of data, and start seeing words you have probably never heard before and don’t care very much for.
The great thing about this audit is Google will give you details about what each element is, what the issue is, where to find the issue, and will generally explain what each element means for your site.
Anything flagged here can be inspected further; expanding the view on each set of metrics will go into more detail about them.
How to fix:
At the risk of getting very technical here, we’ll focus on the priorities: the areas a lot of websites, big and small, get wrong with performance.
For a thoroughly in-depth look at each of these elements, Search Engine Journal covers this in great detail in their Technical SEO Guide to Lighthouse Metrics.
What we will focus on here, is the importance of performance and some quick fixes for your site if you are having issues here.
Performance:
Let’s focus on the elements slowing your page down. One of the most common problems here is image and file size. You found that perfect image and uploaded it straight onto your site without resizing or compressing it? It’s probably way bigger than it needs to be.
Let’s see if Google thinks you can do anything better here.
Look for the “Properly size images” element under Opportunities. If you see anything here, it means your images can be compressed further for file-size savings and a faster load time.
You can use an image compression plugin, but if your issues aren’t spread across thousands of pages, it’s quite easy to do this manually.
Download your images and take them over to Squoosh, a free image compression app. Once you are happy with the size and quality, download them, ideally in JPG or WebP format, then re-upload to your site, replacing the larger originals.
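If you’d rather script the resizing, here’s a minimal sketch using the Pillow library, with placeholder filenames and an assumed 1200px maximum display width:

from PIL import Image

img = Image.open("hero-image.png")  # placeholder filename
img = img.convert("RGB")  # JPEG has no alpha channel

# Resize down if wider than the largest size your layout actually displays
max_width = 1200
if img.width > max_width:
    ratio = max_width / img.width
    img = img.resize((max_width, int(img.height * ratio)))

img.save("hero-image.jpg", "JPEG", quality=80, optimize=True)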
Install a performance plugin:
If your website is on WordPress, then the opportunities here are endless. You can get your hands on some very powerful plugins that are easy to install and set up, will do most of the heavy lifting for you, and give you a much-needed performance boost.
WP Rocket:
WP Rocket is probably the most widely used performance plugin for WordPress sites.
Perfmatters:
Perfmatters is a pretty new plugin but it ticks all the right boxes.
Use a CDN:
A CDN (Content Delivery Network) makes your site load faster for visitors by caching static pages, images, CSS, and other resources, then serving them from a server close to each visitor’s location.
Cloudflare is a free and simple CDN that is extremely reliable.
It’s also worth checking with your hosting service if they can provide a CDN as part of your hosting package. Even still, Cloudflare is a great free start that also offers some great value paid services too.
Files not secured:
Let’s end where we began. As covered earlier, a page can load as unsecured, even though it is served over HTTPS, when images or other files on the page are loaded over HTTP.
Run a Lighthouse audit and check under “Best Practices”; it will generally show you the exact file causing the issue. Find the file in your site media, or cPanel, and change the source to HTTPS. Sorted!
Schema Markup:
Schema markup is like a dictionary for your code: structured data that spells out for search engines exactly what your content is about.
What is the issue?
Missing or invalid schema markup. Without it, Google has to guess what your page represents; with broken markup, you can lose eligibility for rich results.
Why is this an issue?
Schema powers rich results such as review stars, FAQs, product prices, and breadcrumbs in the SERP, which can noticeably improve your click-through rate. Broken or missing markup is a wasted opportunity.
How to fix:
Run your key pages through Google’s Rich Results Test, fix any errors it flags, and add structured data (usually JSON-LD) for the content types that fit your site.
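To eyeball what structured data a page already declares, here’s a small sketch, assuming requests and beautifulsoup4 and a placeholder URL; it prints the @type of each JSON-LD block it finds:

import json
import requests
from bs4 import BeautifulSoup

url = "https://yoursite.com/product-page"  # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        print("Invalid JSON-LD block on", url)
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        if isinstance(item, dict):
            # @type tells you which rich result the markup targets, e.g. Product or FAQPage
            print("Found markup of type:", item.get("@type", "unknown"))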
Bonus For Shopify: Check For Bloat
Shopify has grown at an exponential rate over the past few years. However, its SEO foundations have always left a lot to be desired. It’s not really the fault of Shopify itself, but more so the theme developers.
This is certainly one worth checking in your own theme if you run a Shopify store.
What is the issue?
Poor URL structure leading to massive crawl bloat, duplicate content & linking issues.
Why is this an issue?
Let’s say you have a collection called “shirts”.
And a product with the URL “red-shirt-for-men”.
Then you will have the URL yoursite.com/collections/shirts/products/red-shirt-for-men
Not a great URL.
However, the bigger issue is that your canonical URL will be
yoursite.com/products/red-shirt-for-men
The collection URL will be a duplicate version of this page. Essentially, with this URL structure, navigating through a collection to find a product lands you on a different URL than going directly to the product.
This can lead to internal and external links going very wrong. What if your product got picked up on a few blogs and each article linked to a different URL, depending on where they accessed the product from?
How to fix:
You can check if you have this issue by searching:
site:yoursite.com inurl:collection
If you see a handful, or thousands, of URLs in the form /collections/name/products/ then this is something you might want to address.
Alternatively, you can do a crawl in Screaming Frog and see if your product pages are indexed with a collection URL.
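As a quick scripted alternative, this sketch (assuming requests and beautifulsoup4, with a placeholder collection URL) lists product links exactly as a collection page renders them, flagging any that carry the /collections/ prefix:

import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

collection = "https://yoursite.com/collections/shirts"  # placeholder
soup = BeautifulSoup(requests.get(collection, timeout=10).text, "html.parser")

flagged = set()
for a in soup.find_all("a", href=True):
    link = urljoin(collection, a["href"])
    # These duplicate the canonical /products/ URL and shouldn't be linked
    if "/collections/" in link and "/products/" in link:
        flagged.add(link)

for link in sorted(flagged):
    print("Collection-prefixed product URL:", link)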
Start in your Shopify dashboard by going to Online Store > Themes > Customize > Theme Actions > Edit Code > Templates > product.liquid
In here you are looking for a specific piece of code that says:
“within: collection”
When you find this piece of code you need to delete just “within: collection”
Depending on your theme, the code may appear in a number of other sections too, so unfortunately you may have to do a little digging around to find all instances of it.
You’ll know it worked when you navigate to a collection, click a product, and get a URL like:
yoursite.com/products/red-shirt-for-men
Conclusion
Conducting a technical SEO audit doesn’t have to be overly complex. By focusing on these key areas, you can identify and address the most common issues affecting your site’s search engine performance:
- HTTPS implementation
- Proper canonicalization
- URL structure consistency
- Handling of 404 errors
- Efficient use of 301 redirects
- Site speed optimization
- XML sitemap submission
Remember, SEO is an ongoing process. Regularly auditing your site and staying up-to-date with best practices will help ensure your website maintains a strong technical foundation for optimal search engine visibility.
If you need help with a smart audit, feel free to check our startup package or contact us if you’re a scaling business