How to Perform an SEO Audit: In-Depth Guide
Whether you have a company introduction website, a personal portfolio, an ecommerce store, or any other type of site, you will want it to rank higher in Google search results.
Unfortunately, with increasing competition in most industries, you might find it difficult and time-consuming to reach the first page of Google for valuable terms that consistently generate leads.
The first thing to do is make sure the website is optimized for what search engines look for. You can do this by performing an SEO audit, finding all the technical and content issues on your website, then improving those areas.
In most cases, simply optimizing your website to the best of your ability will lead to significant increases in search engine rankings.
In this post we will go through the primary sections of an SEO audit. I will show you how to find the issues in each section, then give recommendations and solutions for the most common ones.
If you haven’t found exactly what you’re looking for, or have run into an issue that isn’t mentioned here, feel free to comment below this blog post and I will try to give you some recommendations on how to fix it.
Since this post is highly detailed and very long, and some of you might be looking for very specific solutions, I’ve created a table of contents so you can jump straight to specific sections of this post.
Introduction to SEO audits
You might be asking - how does performing an audit on your website help increase its traffic?
Most of the issues holding back your search performance can be fixed once you know what they are and where exactly they appear on your website. That is exactly what an SEO audit helps you find.
Fixing these issues will result in increased organic traffic, a lower bounce rate, more time spent on your website, and so on.
It could be anything from a wrong link structure or excessive load time to a lack of content on specific pages, the wrong keywords being targeted, or missing meta descriptions. These, and the other issues we’ll cover in this post, impact your website’s rankings negatively and therefore lower its potential.
Finding these issues is what a proper SEO audit does. We will go through the primary sections of the website audit I’ve been performing for years: how to find the issues, how to fix them, and how to put a structure in place so the same issues won’t reappear in the future.
There are online and offline tools that automate the whole process, and there’s nothing wrong with them, but most of the reports they create are only understood by SEO specialists, and the information in them is generated without considering your website’s content type, industry, competition, or the SEO strategy currently in place.
The only way to do a proper SEO audit is to perform it manually, paying close attention to the specific statistics that affect your website as well as your competition.
You can perform SEO audits on competitor websites as well, to understand more about their structure and potentially find areas where you can outcompete them.
Technical SEO audit
The first SEO audit should be performed as soon as your website is created. Most likely you won’t have much content yet, so the primary things to look at are the technical setup: redirects and broken pages, link structure, loading speed optimization, structured data markup, internal and external link structure, meta content setup, sitemaps, heading setup, image optimization, and responsiveness / mobile optimization.
Going forward, it’s recommended to perform these technical SEO audits whenever major changes are made to your website (switching domains, moving to a secure https address, changing the type of content, launching a new website design, and so on).
Major changes can introduce technical issues, and if you find those issues before the new website structure is even indexed by search engines, that’s the absolute best-case scenario for you.
As soon as search engines find an issue on your website, rankings can take a negative hit, which isn’t something you want.
We will start with the analysis that is most relevant after major changes to a website.
Redirect setup and broken pages
Once a new website is created or major changes are made, it’s common to have some broken pages or broken links left behind. It might be as simple as the link structure changing, or the old website linking to a page that no longer exists.
Google and other crawlers don’t like seeing your website lead to broken pages - to them it looks like a website that isn’t maintained. So it’s important to find these links, then either point them to the right pages, delete them, or create redirects for the pages they lead to.
To find these broken links, we will use the Screaming Frog crawler. Screaming Frog has one of the best crawlers for technical website analysis; the software is free with some limitations and easy to use.
Once you download and install the software, launch Screaming Frog SEO Spider. The primary dashboard should look something like this:
Next to the Screaming Frog logo is a field for a URL. Enter your website’s URL here and press the “Start” button on the right side.
Wait for the crawler to go through all your pages, then hit “Response Codes”, right below the URL field. This brings up a menu of the different response codes within your website. To head into the broken link section, select the “Client Error (4xx)” option in the right-hand menu.
For the purposes of this guide we will look at the Apple website.
The software will provide you with a list of broken URLs within your domain. Scroll the list to the right and find the ones with a 404 status code and more than 0 in the “Inlinks” column. These are broken links that some sections of your website lead into.
Since internal pages link to them, you want to fix them. If Google’s crawler finds sections of your website that aren’t accessible but are still linked to, it may conclude your website hasn’t been updated recently, which can have a negative impact on your rankings.
Depending on the circumstances, you will use one of these three strategies to fix the issue:
- If the link structure was changed or there’s a simple error in a single URL - replace that URL with the right one.
- If the page it links to has since been deleted - delete the link as well.
- If many different pages link to one specific broken page, it might be easier to create a redirect for that page instead of replacing hundreds of links.
At this point you need a list of the pages containing those broken links. Select the box in the “Inlinks” column on a row with a “404” status code, then hit the “Inlinks” tab at the bottom of the Screaming Frog window.
Alternatively, if you want to export the URL list - right-click the “Inlinks” element, select “Export”, and hit “Inlinks” in the dropdown menu.
This exports an Excel file to your computer with a full list of links that lead into that page.
Now you know where the changes need to be made and what needs to be done.
Replacing or deleting links is very easy to pull off and needs no further comment, but redirects can be trickier to create, so I will talk about the way you want to make them.
The first thing to do is open your favourite text editor, such as Notepad, and copy the following line into it:
redirect 301 /old-directory http://www.example.com/new-directory
Now replace the old path and the destination URL with your own. For example, to redirect https://apple.com/retail to https://apple.com/products (assuming this is your new link), you would need the following code:
redirect 301 /retail https://apple.com/products
Once you have this line of code, go into your website’s file manager - usually found through your hosting provider’s dashboard - and open the primary folder. Find the .htaccess file and paste your line at the very bottom of that file.
This creates a permanent redirect from the old, no-longer-existing section of your website to the new one, so any pages that link into it will automatically redirect to the right page. The same approach works when external links lead into a broken page.
Since you can’t replace or delete links on other websites you don’t control, the only way to make those links worth anything is to create a redirect from the broken URL to a new one.
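If you have many broken URLs to map, writing the .htaccess lines by hand gets tedious. As a rough sketch (the paths and destination URLs below are made up for illustration), a few lines of Python can generate them from a mapping of old paths to new URLs:

```python
def build_redirects(mapping):
    """Generate 'redirect 301' lines, one per entry, ready to be
    pasted at the bottom of an .htaccess file."""
    return "\n".join(
        f"redirect 301 {old_path} {new_url}"
        for old_path, new_url in mapping.items()
    )

# Hypothetical old-to-new mapping, mirroring the example above.
rules = build_redirects({
    "/retail": "https://apple.com/products",
    "/old-blog": "https://apple.com/newsroom",
})
print(rules)
```

Paste the printed lines at the bottom of your .htaccess file, exactly as described above.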
Now that you’ve learned how to fix redirect and broken page issues on your website, let’s look at the overall link structure - what it should look like and what potential issues to look out for.
Link structure
Making sure the whole structure of your website is set up correctly for both search engine crawlers and the humans browsing it is one of the more important parts of the technical setup, and it’s exactly what we’re going to talk about next.
One of the most important things to remember is that links should be highly readable. You get the most SEO benefit from a link that is easy for humans to understand and contains no unnecessary details.
Let’s take, for example, the permalink options WordPress provides for posts on your website.
Which one would you say has the best readability? The “Plain” and “Numeric” options fall off right away, because they don’t contain any part of the title or any context about what the page is about.
The month and day options aren’t great either, because there’s no need to have the exact date in your link. If the publication date is relevant to the content you produce, make sure that information shows up on the page itself, not in the URL. I’ve seen the day option used by major newspapers, but beyond that one example, there’s no use for a link structure like that.
We can take another example with product page URLs. I’ve looked at the most common URL structures and will give my feedback after listing the options:
All of these are taken from real websites, with some details changed and their root domain replaced with domain.com.
What works best is highlighted with a green checkmark - those links are the only ones with really good readability and no unnecessary elements.
There’s nothing wrong with using categories or the word “product” in your URL, which brings me to my next point: using primary categories for product pages, or something like /article/ or /blog/ between your root domain and the post, is still fine - in some cases even better than not using it.
If you want specific products to have a slightly higher chance of ranking for category search terms, or want Google to understand that the link leads to an article about the topic, it’s best to use those terms between your primary URL elements.
In that case you would use https://domain.com/cat-food/product/grain-free-chicken-formula-dry-cat-food/ instead of https://www.domain.com/grain-free-chicken-formula-dry-cat-food/. A person clicking this link will understand that they’re going to a product page, not a page where you give your opinion on a very specific type of cat food.
In the end it all comes down to user experience. It’s best to serve URLs that people can understand, share, and not be afraid to click. If your URL were made of numbers or random elements, people would be more hesitant to click it, which hurts your click-through rate and indirectly impacts SEO.
This brings me to my next suggestion - avoid punctuation and special characters in your URLs. Common characters like the underscore, hyphen, or slash won’t create issues, but characters like the hashtag, question mark, percent sign, or asterisk can break browsers, crawler code, or other websites’ code (for example, when someone enters a link to your website, it brings up an error). Links containing these characters might be punished, directly or indirectly, by the same search engines you’re trying to rank on.
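To see what “no special characters” looks like in practice, here’s a sketch of a slug generator in Python: it lowercases a title, strips accents and punctuation, and joins words with hyphens (the title below is just an illustration):

```python
import re
import unicodedata

def slugify(title):
    """Turn a page title into a readable URL slug: ASCII-only,
    lowercase, hyphen-separated, no punctuation or special characters."""
    # Drop accents and any non-ASCII characters.
    ascii_text = (
        unicodedata.normalize("NFKD", title)
        .encode("ascii", "ignore")
        .decode("ascii")
    )
    # Collapse every run of non-alphanumeric characters into one hyphen.
    slug = re.sub(r"[^a-z0-9]+", "-", ascii_text.lower())
    return slug.strip("-")

print(slugify("Grain-Free Chicken Formula: Dry Cat Food!"))
# grain-free-chicken-formula-dry-cat-food
```

Most CMSs do something similar automatically, but a helper like this is handy when generating URLs in bulk.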
One of the last things to look out for is keyword stuffing. Having your primary keyword in the URL still helps you rank for it, but stuffing too many keywords in there might hurt your website in the long run. So if you sell the previously mentioned cat food and want the best chance of ranking for the keyword “cat food”, you should avoid a link structure similar to this:
This link contains the keyword “cat food” in different forms three times and would be considered spammy, which could negatively affect the ranking of a page like this.
There’s also an unwritten rule that URLs above 100-115 characters are too long. Google and other search engines aren’t against longer URLs as such, but an SEO audit also considers overall user experience, which is what suffers in this case.
Those are the general rules to follow. Whenever you analyze your own or a client’s website, keep an eye on the overall link structure as well as the length of the links. This doesn’t mean going through every single link on the website; to understand the overall URL structure, go through several pages from each category.
So if your website sells products and also has a blog, go through several random blog posts and several random products and check their URLs - that will give you a good idea of the whole structure. It’s extremely unlikely that different products or blog posts use different URL structures, so it’s not worth going through all of them.
Loading speed optimization
The loading speed of your website has both a direct and an indirect impact on your SEO. Google uses page speed as one of the ranking factors in their search results, and a slower website can cause indexation issues due to the way their crawlers are set up.
In January 2017, Google released an article that explains crawl budget and how it’s used when crawling websites. Essentially, if you have a slow-loading website, their crawlers might not fetch all of your pages or read all of your content in the time they spend on your website.
Loading speed also affects user experience. Most people won’t wait more than 3-4 seconds for a website to load, which increases your bounce rate. Google interprets a high bounce rate as a sign that your website isn’t relevant to the search term it appears for (they see that people quit within the first few seconds), so this can have an indirect negative impact as well. If Google sees your page as irrelevant because people leave within a few seconds, they might stop showing it as high - to them, showing the absolute best result is the highest priority.
It’s therefore really important to keep an eye on loading speed when doing an SEO audit of your website. The tools I’ve primarily used to check loading speed are Pingdom and GTmetrix. Both have proved to be of great help in determining major loading speed issues.
Pingdom speed testing
Let’s start with Pingdom. First, head to the Pingdom homepage. Around the middle of the screen you can enter the URL you want to analyze and select the server to test the website speed from.
For this example we will look at the Apple.com website from a US server. Once the analysis finishes, the first thing you see is a quick overview of the primary statistics. It will look something like this:
Here you can see your website’s overall loading speed score (A+ through F), the loading speed from the selected server, how it compares to other websites, the overall page size, and the number of file requests made.
This gives you a better idea of how you compare to other websites. Anything above 2.5 seconds is usually too slow, so it’s recommended to get your website’s loading speed down to at most 2.5 seconds in the primary region you’re targeting with that website.
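If you want a quick command-line sanity check against that 2.5-second guideline, here is a rough Python sketch using only the standard library. Note that it only times the raw HTML download, not images, scripts, or rendering, so a real tool like Pingdom will report higher numbers:

```python
import time
from urllib.request import urlopen

def measure_load(url, threshold=2.5, timeout=30):
    """Time a plain fetch of a URL and check it against a
    loading-speed threshold in seconds. Returns (elapsed, passed)."""
    start = time.monotonic()
    with urlopen(url, timeout=timeout) as response:
        response.read()  # download the full response body
    elapsed = time.monotonic() - start
    return elapsed, elapsed <= threshold

# Example (requires network access):
# elapsed, ok = measure_load("https://www.apple.com/")
# print(f"{elapsed:.2f}s - {'OK' if ok else 'too slow'}")
```

Treat this as a rough lower bound only; the dedicated tools below measure the full page render.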
Right below this summary is the most important part of the loading speed audit - suggestions explaining how to further improve your loading speed.
Pressing any of the issues found on your website also brings up additional details about it. Each of these can also be Googled for plenty of answers, because many people use Pingdom or tools built on its API, so the same suggestions come up across many websites.
Let’s look at caching first. Browsers cache resources so they don’t have to be downloaded again every time a visitor returns to your website. In most cases cached resources are saved within the browser, so the next time you visit, the website loads from there instead of from its servers.
Cloudflare improves on this by serving cached resources from the server closest to the visitor’s location instead of the hosting server’s location, which makes loading even faster than with normal caching. I would highly recommend giving Cloudflare caching a try.
Their plan for personal websites or small blogs is absolutely free, and the professional version doesn’t cost a fortune either at $20 a month per domain. It also includes basic security for your website (a shared SSL certificate, DDoS mitigation, and a web application firewall).
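If you’d rather configure basic browser caching yourself, it can also be done in the same .htaccess file we used for redirects. This is only a sketch, and it assumes your host runs Apache with mod_expires enabled; the exact file types and durations are up to you:

```apacheconf
# Tell browsers how long they may cache static resources.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png "access plus 1 month"
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Re-run the speed test after adding rules like these; the caching suggestion should disappear for returning visitors.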
With caching covered, let’s look at the next issue mentioned in Pingdom’s tool - static content being served from a domain that sets cookies.
The whole point of cookies is to gather information about the user and use it to tailor parts of the website to their needs. Static resources don’t change based on cookie data, so in theory it’s better to serve them from a section of your website that doesn’t set cookies. Fortunately, with HTTP/2 and other new protocols that improve website performance, this suggestion can mostly be ignored - “fixing” it would cost more in loading speed than serving all static resources from the same domain.
Now that we’ve looked at the primary issues Pingdom highlights, let’s see what GTmetrix has to offer.
GTmetrix speed testing
GTmetrix works in a similar way to Pingdom, though in my experience it looks at more potential issues.
The first step is to head to the GTmetrix homepage, enter your URL, and hit the “Analyze” button.
Unfortunately, GTmetrix doesn’t let you select a test region unless you’re a paid subscriber to their GTmetrix Pro plan, which offered 28 locations as of February 2018. By default their test server is located in Vancouver, Canada, and all tests for free users are performed from there.
For the purpose of this guide, we will analyze Apple.com again. Once again, the first thing it provides is a quick overview of the primary statistics - loading speed from the test server, overall score, page size, and the number of file requests performed.
Same as with Pingdom, right below this are suggestions for loading speed optimization, and pressing them brings up more details.
What I find superior about GTmetrix versus Pingdom, on top of the extra metrics analyzed (such as image optimization for loading speed, compression, parsing issues, etc.), is that GTmetrix provides more detail on how to fix each issue and better explanations of how it affects SEO and loading speed.
If you press any suggestion, then hit the “What’s this mean?” button at the top right, a small window opens explaining the issue in detail. If those details aren’t enough, press “Read more” within the newly opened window for the extended version.
That brings you to a short article explaining what the issue is, why it affects your loading speed, how it affects your website, and how to fix it. Pingdom’s descriptions are pretty vague, so I consider GTmetrix the superior tool on that basis alone.
Personally, I use GTmetrix more often than Pingdom, unless I need specific loading-time data from a region far from Canada (like Australia, Europe, or Asia). For the suggestions and data provided in each report, I recommend GTmetrix.
Structured data markup
This is a topic I spent a few days writing about. I felt it couldn’t fit well within this SEO audit guide and deserved its own post, so I created a separate guide explaining what structured data markup is, how it affects SEO, the benefits of using it, how to create it, how to find the most common issues in structured data markup code, and how to add it to a website. The article is called All You Need To Know About Structured Data Markup.
I recommend reading that article whenever you want to optimize your website or are doing an SEO audit of your own or a competitor’s website. I wrote it while working on this guide, but by the time I finished, it had taken two days and grown to 7,700 words on its own, so I made it a separate guide. It does indeed contain everything you need to know about structured data markup.
Internal link structure
Google’s Webmaster Guidelines used to recommend no more than 100 links per page, but that rule has been removed - and for good reason: any number of links per page is fine as long as they are relevant to the topic.
Now you can have a resource page with 100+ links as long as it’s relevant to the topic you’re covering. Look at Wikipedia, for example - its articles have hundreds (in some cases even thousands) of links, but all of them lead to resources highly relevant to the topic.
So, the common question people ask: how many internal links should an average page have? It’s difficult to answer, because there is no magic number of links that maximizes your chances of ranking.
It’s recommended that content pages (articles, blog posts, recipes, etc.) have at least 1-3 internal links leading to relevant pages, but that is by no means mandatory.
Take the previous section of this article as an example - I linked to an article I previously wrote about structured data markup. It was relevant, because this post is about performing SEO audits, and that article covers checking structured data markup as well as adding it to your website.
The purpose of that link is to give users additional reading on a closely related topic. That is essentially how internal links should be built.
If you have a large website and can’t keep track of all the internal links, there’s a tool I found that lists them all. Head to its website, enter your URL in the primary section of the page, fill out the captcha, and hit the “Perform check” button just below it. It will provide the total number of internal links, the number of duplicate links, no-follow links, anchor text, anchor type, link type, and the links themselves.
There’s not much else to remember when it comes to internal links - use anchor texts relevant to the topic and aim for at least 1-3 links per content page, although it’s not mandatory. Do avoid unrelated internal links, though, as Google might consider them spammy.
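If you’d rather do a quick check yourself, here is a small sketch in Python using only the standard library: it parses a page’s HTML and splits its links into internal and external (the domain and HTML below are made up for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkCounter(HTMLParser):
    """Split the <a href> links on a page into internal and external."""

    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.internal = []
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href")
        if not href:
            return
        host = urlparse(href).netloc
        # Relative URLs and URLs on our own domain count as internal.
        if not host or host == self.own_domain or host.endswith("." + self.own_domain):
            self.internal.append(href)
        else:
            self.external.append(href)

counter = LinkCounter("domain.com")
counter.feed(
    '<a href="/blog/cat-food-guide">guide</a>'
    '<a href="https://www.domain.com/products">products</a>'
    '<a href="https://example.org/study">study</a>'
)
print(len(counter.internal), "internal,", len(counter.external), "external")
# 2 internal, 1 external
```

Feed it the HTML of a page (downloaded however you like) and compare the internal count against the 1-3 guideline above.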
External link structure
External link analysis almost deserves a separate post as well - there’s a lot to discuss, and external links have one of the highest impacts on your rankings, so performing a proper backlink SEO audit is necessary.
Backlinks are especially important for a website looking to rank for keywords at the national or international level. For a local business targeting local rankings (city + product / service), they’re less critical, but they still affect your ranking potential on search engines.
When it comes to SEO, the quality of links matters more than the quantity. You’re better off building 20 high-quality links than 2,000 low-quality ones, which is why we’re going to focus on analyzing the quality of your links in this part of the SEO audit.
Fortunately, Google allows you to disavow low-quality links, essentially removing any impact they might have on your website. This is great, because low-quality or spammy links can hurt your website and could even get it penalized.
It’s a moderately common practice for competitors to target their rivals with thousands of lowest-quality links, hoping to push them down in rankings or get them penalized, even though this violates Google’s guidelines. That’s why you want to find the lowest-quality links leading to your website and learn to disavow them, which is what we’re going to discuss in this section.
There are several tools you can use for this process, but most of them either won’t show all the links you have or require most of the work to be done manually. The tool I’ve used most is the SEMrush backlink detox tool.
Head to their detox tool page and you will be able to try it out for free. The tool crawls all your backlinks, which you can then sort by toxicity score. A high toxicity score means low link quality.
As you can see in the screenshot above, there’s a “Toxic Score” to the right of each website, showing exactly how toxic it is. If you press the “Toxic” tab (highlighted in the screenshot) at the top of your dashboard, it opens a list of the links the crawler determined to be toxic.
This list of toxic links needs to be exported and uploaded to the disavow section for your website. Google has some strict requirements for this file, described in one of their developer answer posts.
The file needs to be in .txt format, encoded in 7-bit ASCII or UTF-8, and formatted in a certain way, which SEMrush handles for you without any extra input, so the file can be uploaded straight away. The place to upload it is Google’s disavow tool in Search Console.
There you have to select the web property this disavow file should be added to, hit the “Disavow Links” button, and upload the encoded file you received from SEMrush. If for some reason your property isn’t in the selection list, make sure your website is connected to Google Search Console.
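For reference, the disavow file itself is plain text: one URL or domain per line, with lines starting with # treated as comments. A minimal example (the domains here are made up) looks like this:

```text
# Disavow a single spammy page:
http://spam.example.com/bad-page.html
# Disavow every link from an entire domain:
domain:shadyseo.example.net
```

SEMrush generates this format for you, but knowing it helps if you ever need to edit the file by hand.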
Besides links that should be disavowed, there is one more thing to look at during an SEO audit - the anchor texts of your links. A proper anchor text structure is still important for your rankings.
There isn’t much you can do about anchor text optimization when you’re not building links manually, but if you do build them yourself, there are several tips you can follow.
You have a higher chance of ranking for specific keywords when they’re used in the anchor texts of your links, but at the same time you don’t want to overuse those keywords.
Your anchor profile will look most natural if roughly a third of your links use your brand name or website as the anchor text, a third use the primary keyword you’re ranking that page for, and a third use generic anchors like “click here”, “read more”, etc.
Your current backlinks can be checked with the Ahrefs or Majestic tools. Just enter your website’s URL into the primary search bar and hit the search button, then scroll to the very bottom of the dashboard in either tool to check anchor texts.
Majestic shows how many backlinks in total use a specific anchor, while Ahrefs shows the same with the number of domains instead of backlinks.
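To sanity-check those ratios against an exported anchor list, a few lines of Python will do. This is just a sketch - the anchor texts below are invented:

```python
from collections import Counter

def anchor_shares(anchors):
    """Return each anchor text's share of the total backlink anchors."""
    counts = Counter(anchors)
    total = len(anchors)
    return {anchor: count / total for anchor, count in counts.items()}

# Hypothetical anchor list exported from a backlink tool.
shares = anchor_shares([
    "my brand", "my brand", "my brand",
    "cat food", "click here", "read more",
])
for anchor, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{anchor}: {share:.0%}")
```

Compare the printed percentages against the rough thirds described above to see whether one bucket dominates.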
If the ratios for specific anchor texts are off, the best way to fix them is to start building links manually. I have created a free guide where you can learn the whole process of high-quality link building.
Learn how I built thousands of high quality backlinks with a single strategy using SEMrush
Meta content setup
Meta content - more specifically meta titles and meta descriptions - determines how your pages show up in Google search results, so it’s important to find any issues related to meta content while performing an SEO audit.
By default, Google displays titles that take up at most 600 pixels; if your title exceeds that, not all of it will be shown. Google shows your page’s title next to your website’s name, separated by either a dash or a vertical bar.
I’ve prepared an incorrect and a correct example of meta data. First, let’s look at the incorrect example:
As you can see in the image above, the whole title doesn’t fit into the search result, nor does the description of that discussion thread. The description also doesn’t describe what the discussion is about, when it should - though with this specific example it’s understandable, because the post is a discussion thread the moderators have no control over.
Now let’s look at the correct meta content setup:
In this example it’s clear from the title and description what the page will be about, so meta content like this will generate a higher click-through rate.
In most cases you want to keep the title under 60 characters. The most optimal format is “Keyword - Secondary Keyword | Website Title”, but don’t push for unnatural titles just to optimize for search results. It’s best to look at it from a potential visitor’s perspective and make the title understandable and explanatory of what the page is about.
Meta descriptions work in a very similar way. Google has recently increased the meta description limit to roughly 300 characters. The exact number isn’t known - I’ve seen meta descriptions of 375 characters and ones cut off at 295 - but it’s recommended to keep them under 300.
The description is mostly used as one of the primary tools to generate a high click-through rate to your site. It’s important to write it in a way that explains in detail what the page is about and what information people will find there - essentially, advertise your page. Remember that the meta description is the primary thing people look at when deciding whether to visit your website, which is why it’s important to develop good ones.
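A quick way to flag over-length meta content in bulk is a simple character check. This sketch uses the rough limits discussed above (60 characters for titles, about 300 for descriptions); Google actually measures titles in pixels, so treat character counts as an approximation:

```python
def audit_meta(title, description, title_limit=60, desc_limit=300):
    """Return a list of warnings for meta content that exceeds
    the rough character limits discussed above, or is missing."""
    warnings = []
    if len(title) > title_limit:
        warnings.append(f"title is {len(title)} chars (limit {title_limit})")
    if len(description) > desc_limit:
        warnings.append(f"description is {len(description)} chars (limit {desc_limit})")
    if not description:
        warnings.append("description is missing")
    return warnings

print(audit_meta(
    "How to Perform an SEO Audit: In-Depth Guide",
    "A step-by-step guide to auditing your own website.",
))
# []
```

Run it over every page’s title and description (scraped or exported from your CMS) to get a quick list of pages needing attention.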
Adding a meta title and description can be done with a bit of code in the <head> section of your website. The exact code you need for a title:
<title>Replace this with a title that fits your page</title>
Code you need for a meta description:
<meta name="description" content="Replace this with a proper description that fits your page">
Remember to replace the placeholder text in both snippets, and to add them to the <head> section of your website, not the <body>.
If you’re using WordPress, adding a meta title and description is even easier with one of the most used plugins - Yoast SEO. Install it on your WordPress website, activate it, and below every post, product page, category, or any other type of page you will be able to edit the meta title and description by pressing on the search result preview it shows.
If you’re using Shopify - scroll right below your post, product page or category page and look for the “Search engine listing preview” section. You will be able to edit both the page’s title and description in this section.
One last thing to remember - avoid using the same description on more than one page of your website. Duplicates confuse crawlers and make it harder for them to understand what each page is about. Unfortunately, if you leave meta descriptions to be generated automatically - this will be a common issue.
To check which pages have duplicate content or wrong meta content setup - please read my “Best tools for SEO analysis and SEO audits“ guide, specifically section about SERPstat. I discuss a way of auditing your website using SERPstat in there and duplicate, too short or too long meta content is part of their audit document.
There are several types of sitemaps that could be created, but we will talk specifically about XML sitemaps that have to be uploaded to Google Search Console. This sitemap indirectly helps you with indexation of pages.
While your pages will be indexed only if Google determines them to be relevant and search-engine-worthy, submitting an XML sitemap in Search Console lets Google know that the pages in your sitemap are ones you’ve determined to be of high enough quality to consider. This doesn’t mean that each and every page will be indexed, but it does mean you will have a higher chance of them being indexed and shown in Google search results.
Remember that you only want pages that are important for users to show up in sitemaps, not necessarily every page on your website. For example, I have sitemaps on this website, but I haven’t added my admin login page, pages for each media resource, or post category and tag pages to the sitemap, because I don’t really want them indexed or ranked. Only the pages you see in my primary menu and guides similar to this one are added to the sitemap, because those are the only pages that matter to search engines and users.
Google has created their own extensive guide on how to build a sitemap and upload it to their search console. I would recommend you follow those instructions, create and upload the sitemap as soon as possible, because it does impact your chances of having the right pages indexed.
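To give you an idea of what you’ll end up with, here is a rough sketch of a minimal XML sitemap listing two pages (the URLs and dates are hypothetical placeholders - Google’s guide covers the full set of optional tags):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/seo-audit-guide/</loc>
    <lastmod>2024-02-01</lastmod>
  </url>
</urlset>
```

Each <url> entry holds one page you want considered for indexing - this is where you list only the important pages discussed above.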
If you’re using WordPress with Yoast or Shopify, it’s likely that a sitemap has already been created and maybe even added to your robots.txt file, but those tools don’t upload the sitemap to Google Search Console - this step still has to be done manually. Once again, you can follow Google’s guide provided above.
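As a side note, referencing the sitemap from robots.txt is a single line - this helps crawlers other than Google discover it as well. A sketch, assuming your sitemap lives at the site root (replace the placeholder URL with your own):

```
Sitemap: https://www.example.com/sitemap.xml
```

This line does not replace the manual Search Console submission described above - it simply makes the sitemap discoverable to any crawler that reads robots.txt.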
Whether you have a sitemap on your website can be analyzed using [SEO Site Checkup tool for sitemap testing][seo-sitemap-tool]. Unfortunately it doesn’t show whether the sitemap is uploaded to Google search console, but you can check that yourself by going into their sitemap list page.
The page can be accessed either by clicking the link provided above or by going into your Search Console, selecting the “Crawl” category and the “Sitemaps” subcategory.
Whenever you’re looking at heading tags during an SEO audit, there are two primary things to check: that the primary keywords you’re trying to rank for are used within your headings, and that the right heading tag structure is used across the website.
First let’s discuss keyword use in heading tags. Google uses heading tags (especially H1 headings) to determine what a page is about, so having your primary keywords in headings does have an impact on its chances to rank.
The effect from this change isn’t huge, but it’s recommended to have your primary keywords from each page used within at least a single heading tag.
The overall structure of your headings should make sense as well. Headings should be used in the same way as you would use them in a word document - headings with a lower number are used for the primary topics of each page, while higher-numbered headings split it into smaller sections, sub-categories and topics.
H1 headings should be followed by H2 headings, those should be separated into H3 headings and so on. If all of the headings on a single page were H1s, or H4 headings followed H1 headings without anything else in between - this would confuse crawlers and make it more difficult for them to contextualize your pages.
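To illustrate the idea, a properly structured page outline might look like this in HTML (the heading texts are hypothetical placeholders - indentation is only added here for readability):

```html
<h1>How to Perform an SEO Audit</h1>
  <h2>Meta titles and descriptions</h2>
    <h3>Title length guidelines</h3>
    <h3>Description length guidelines</h3>
  <h2>Sitemaps</h2>
```

There is a single H1 describing the whole page, H2s break it into major sections and H3s subdivide one of those sections - no level is skipped on the way down.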
It is just as important to follow a proper structure when creating headings on your website. You can check whether your pages follow the right structure by using [SERPstat audit tool][serpstat-audit-tool]. It’s a process I’ve described in detail in my previous post - Best Tools for SEO Analysis and SEO Audits (pay closest attention to the SERPstat section of the post).
When performing an SEO audit, image optimization is as important as other content optimization. Having a wrong setup could negatively impact your loading speed or confuse crawlers, because no context is provided for visual content.
Image optimization could be separated into several major topics - image file size optimization, which impacts loading speed and keyword optimization with the use of alt text, which gives more context to search engines for your visual content. We will discuss both of these in this section.
Image file size optimization
It’s something we spoke about briefly in the loading speed optimization section, because GTmetrix shows you issues like this on the analyzed page. The most common file size issues are a lack of file compression, or images being served at full size and only then scaled down by the browser.
If “Serve scaled images” is an issue mentioned in your GTmetrix report - the only fix is to scale your images down to appropriate sizes before uploading them to the server.
Fortunately GTmetrix will let you know which images need to be resized, and to which dimensions, in order to optimize them.
Please note that if you analyzed only your homepage - GTmetrix will show issues only for that homepage and not for any other pages.
Another issue that can be found by analyzing your website on GTmetrix would say “Enable gzip compression”.
Gzip compression for websites works in a very similar way to normal file compression - before files are sent to the browser, they are compressed so they take up less space, which improves website loading time considerably. If you don’t have gzip compression enabled, this should be one of the most important things to take care of when it comes to loading speed and image optimization.
Gzip compression can be added to your website by following GTmetrix’s own instructions.
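As one common example, on an Apache server gzip can typically be enabled through the mod_deflate module in your .htaccess file - this is a sketch for that one server type, so do check GTmetrix’s instructions for the setup matching your own hosting:

```
<IfModule mod_deflate.c>
  # Compress text-based responses before sending them to the browser
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

After adding it, re-run the GTmetrix analysis - the “Enable gzip compression” warning should disappear for the compressed content types.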
Additionally, for WordPress and Shopify websites you have several additional options to choose from when it comes to image compression. Crush Pics addon could be used for Shopify and WP SmushIt plugin could be used for WordPress websites.
Using such measures gives you the most optimized image files, which reduces the amount of data the browser has to download and helps with loading speed optimization.
Alt text optimization
Alt texts should be added to images to make them better optimized for search engine results. Essentially, alt text is a piece of content that will be displayed instead of your image if it fails to load. It also provides search engines with more context about the images used on your website.
If you have a lot of visuals on your website, such as visuals you see on guides I create - it’s extremely important to create alt texts for all your images. An alt text should contain a short description of what the picture shows.
For example, a website selling industrial furniture might have an image of a wooden coffee table on one of its product pages.
The alt text for that image could be something like “Industrial style wooden coffee table” or “Modern wooden coffee table”, because it explains what the picture represents.
In this example, the following code needs to be added to your website for image to have the alt text:
<img src="/images/table.jpg" alt="Industrial style wooden coffee table">
Just remember to replace the src value with the URL of your image and the alt value with alt text that fits your needs. If you already have an image on your website and just want to alter it - add only the part of the code starting with “alt=” inside the same <img> tag, as shown in the example.
When using platforms like WordPress or Shopify - alt text additions are even easier to make. On Shopify you can upload an image and press the small “Alt” tag underneath it, which will let you edit the alt text for that image.
Similarly you have “Alt Text” section to fill out whenever an image is uploaded to WordPress as well. As soon as you upload an image, look to the very right side of your screen and you will see one of the sections with “Alt Text” next to it.
Now the only question you might have is - how do you find images that are missing alt texts? There are several tools that can find these for you. My recommendations are to use either the SEMrush SEO audit tool (which analyzes all technical aspects of your website), Screaming Frog or the SEO Site Checkup image alt test.
Performing an SEO audit on your own website, your client’s websites or your competitor websites will help you better understand what the primary issues with your SEO are, why they need to be fixed and how to do it within your platform.
With the help of tools such as SEMrush or Screaming Frog, most technical issues can be found in no time, but you need to interpret these suggestions well in order to fix them. Before you’re able to fix the issues, you have to understand what impacts your SEO and find those factors within your site, which is exactly what we went through in this guide.
Hopefully this guide helped you optimize your own website or your client’s website. If you have any questions about SEO audits - feel free to ask in the comment section below or use our contact page to reach out to me directly.