As a search engine optimizer, I regularly conduct SEO audits for a wide variety of clients. In this article I'll show you what an SEO audit is, why it's so important for search engine optimization, and how you can do the audit yourself. A complete guide for your own website as well as for freelancers and agencies.
An SEO audit is the ultimate test of your website to assess the current level of search engine optimization and technical optimization. Roughly speaking, all on-page and off-page factors of your website or online shop are checked, down to the smallest detail.
More specifically, an SEO expert uses checklists, online tools and their own experience to evaluate the technical infrastructure and performance, as well as the content and backlinks of the site. I'll show you how to create the audit yourself.
As a rule, a catalogue of measures is developed from this. This lists all the weaknesses and potential opportunities of a website. In short: An SEO audit is the basis for your strategy for the next few months.
The focus is on evaluating the search engine friendliness of a website. From the results, a catalogue of measures is created at the end. Mostly, website owners or the clients of your agency pursue the following goals with this service:
- Evaluation of the website for current SEO trends, traffic etc.
- Third party verification of the success of a current SEO agency
- Increase visibility in organic search results
For agencies, too, SEO audits are an opportunity to:
- Expand the service portfolio
- Convince customers of technical and content-related measures
- Inform potential customers about the state of their website
The biggest advantage of an SEO audit is that the recommendations for action can be implemented by the SEO agency as well as by the client or by yourself.
Why are SEO audits so expensive?
The problem with an SEO audit lies in search engine optimization itself: there are dozens of rules and optimization possibilities against which each URL of the website must be analyzed. In addition, you have to examine the content, or even thousands of backlinks, individually for quality.
This whole process is time consuming. As a rule, an SEO audit takes several hours, days or even weeks. The larger the website, the more time it takes to complete the analysis. In addition to the expert's hourly wage, an SEO agency will employ several tools and charge for their use. And these tools are usually not cheap.
All of these factors mean that an SEO audit can become a very insightful, but expensive, endeavor.
When I was asked by RAIDBOXES for an article on this topic, I thought about it for a long time: How can this very extensive topic be summarized in such a way that anyone is able to perform an SEO audit themselves? Because no tool can replace an expert who deals with the matter regularly. Regularity is the keyword: new developments are constantly being added to search engine optimization, even though the basic rules of the game have remained the same for years.
Google makes thousands of changes to its search algorithm every year, with small adjustments at regular intervals. And the competition never sleeps. Information, tricks and tips are no longer up to date after a few months, or they are copied by several competitors. To successfully complete an SEO audit, you should:
- Continuously engage with the topic of search engine optimization
- Stay informed about current changes and trends
- Have broad knowledge about optimizing your own website
See, for example, the post on the SEO Guide 2019. In the following guide, I'll show you step by step how to discover critical errors on your website yourself, and how to fix them.
For an SEO audit to be holistic, you should make sure you have access to the most important analytics tools and services. These include Google Analytics, Search Console and access to the Google My Business entry.
An SEO audit is a time-consuming undertaking. In order to keep the time spent as low as possible, it is recommended to use professional tools when analyzing the website.
Professional SEO audit tools are for example:
The Screaming Frog SEO Spider is a great tool for analyzing your URLs, although it is mostly aimed at advanced users. Unlike the online tools, Screaming Frog turns your PC into your own crawler. The program is free for up to 500 crawled URLs, which is just enough for small websites.
Other useful (free) tools:
Neil Patel's site audit is free, but it is still miles behind the paid tools. The advice it provides is sometimes very good, though. The tool is constantly being developed and has made great leaps in recent months.
Before we let an SEO audit tool check our website, we need to make sure that it can only be accessed via a single version. There are four different URL variants through which users can access a website:
- http://yourdomain.com
- http://www.yourdomain.com
- https://yourdomain.com
- https://www.yourdomain.com
Open your internet browser and type in each version of your website one after the other. Optimally, all versions should redirect you to a single version:
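If you'd rather check this from the command line than in the browser, here is a small sketch using only the Python standard library. The "yourdomain.com" entries are placeholders for your own domain:

```python
# Follow redirects for each URL variant and collect the final URLs.
# If the resulting set contains more than one entry, the redirects
# are inconsistent and should be fixed.
from urllib.request import urlopen

VARIANTS = [
    "http://yourdomain.com",
    "http://www.yourdomain.com",
    "https://yourdomain.com",
    "https://www.yourdomain.com",
]

def final_urls(variants, fetch=None):
    """Return the set of final URLs after following all redirects."""
    if fetch is None:
        # urlopen follows redirects by default; geturl() is the final URL
        fetch = lambda u: urlopen(u, timeout=10).geturl()
    return {fetch(u) for u in variants}

# Usage (performs live HTTP requests):
# finals = final_urls(VARIANTS)
# print("OK" if len(finals) == 1 else f"Problem, multiple versions: {finals}")
```

If all four variants end up at the same final URL, your redirects are set up correctly.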
If this is not the case, or if you receive the warning message "Not secure" in the browser, you should contact your web host so that they can activate the SSL certificate for your website or adjust your DNS settings correctly.
Tip: RAIDBOXES customers can view these settings directly in the dashboard. You can activate SSL under the menu item of the same name. A Let's Encrypt SSL certificate is always included there free of charge and can be installed with one click.
It's important for your website to be accessible via https:// for two reasons: First, SSL is a ranking factor. Second, you certainly don't want your visitors to be told that your website or online shop might not be secure.
To make sure that all contents of your website are really integrated via a protected connection, you can take an additional look into the developer console of the internet browser:
If the console displays a so-called "Mixed Content" error, this must also be corrected. This usually happens if you have embedded content with http:// instead of https://. Common culprits are external scripts like Google Maps or images from other websites.
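If you want to scan a page's HTML for such http:// references yourself, a rough sketch might look like this. It is a simple regex heuristic, not a full HTML parser, and the sample snippet is hypothetical:

```python
import re

def mixed_content_urls(html):
    """Return http:// URLs referenced in src or href attributes."""
    return re.findall(r'(?:src|href)=["\'](http://[^"\']+)', html)

# Hypothetical example snippet with one insecure image:
sample = '<img src="http://example.com/img.png"><a href="https://ok.example/">x</a>'
print(mixed_content_urls(sample))  # ['http://example.com/img.png']
```

Any URL this reports on an https:// page is a mixed content candidate that the browser console would also flag.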
Tip: If the configuration of your web hosting is too complex for you, you can use a tool like Really Simple SSL, which fixes such errors for you. And not only in this case should you think about managed WordPress hosting.
Now we'll start the SEO audit with an appropriate tool. Of course you can use Neil Patel's Site Audit. However, I'd rather advise you to try a trial version of one of the professional tools (see above). If you like the tool and it works for you, you can continue to use it after the trial.
Important settings, no matter in which tool:
- Respect the instructions in robots.txt. We want to view the website just like a crawler does
- If possible, crawl the website using the sitemap.xml. Common paths for a sitemap are domain.com/sitemap.xml or domain.com/sitemap_index.xml
This process will take some time, depending on the size of your website(s). So let's move on to the next step in the meantime.
Can users find your website?
In the next step, you check whether your website can be found at all, or whether crawlers have problems reading it. Visit google.com and enter the following command (with your own domain) in the search mask: site:yourdomain.com
A listing of all search results follows, limited to your domain:
As you can see, everything seems to be fine here at least. The website can be found in the search results.
Next, go to the Search Console. Google's in-house console shows webmasters information about how their website is indexed. It also notifies you of critical errors. A great and very helpful thing.
Check whether your domain is already registered and verified there. If not, catch up on this step now. It's essential that the correct version of your website is registered in Search Console. In other words, if your website is accessible via https://domain.de but registered as https://www.domain.de, Search Console will not work properly and you will get incorrect results.
Click on the "Coverage" tab. Here you get a list of possible errors as well as excluded and valid pages:
Pages marked as errors here must be corrected! You should also look critically at warnings and remove their causes immediately.
Also interesting: The screenshot shows only 28 valid web pages. But as we just found out, Google shows 45 results for this website. We should investigate this further; it could mean that 404 error pages are indexed in the search.
To check this, there are the following options:
- Jump back into the Google search and use the command "site:yourdomain.de". From there, you can click through the search results and look for anomalies such as 404 errors.
- Or use an SEO tool like the Screaming Frog SEO Spider, or the tool of your choice, to search for 404 pages.
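A third option is a small script: read the URLs from your sitemap.xml and check their status codes. A sketch with the Python standard library (the sitemap URL is a placeholder for your own):

```python
import xml.etree.ElementTree as ET
from urllib.error import HTTPError
from urllib.request import urlopen

# Namespace used by the sitemaps.org protocol
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract all <loc> entries from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(SITEMAP_NS + "loc")]

def status_of(url):
    """HTTP status of a URL; urlopen raises HTTPError for 4xx/5xx."""
    try:
        return urlopen(url, timeout=10).status
    except HTTPError as e:
        return e.code

# Usage (performs live HTTP requests):
# xml_text = urlopen("https://yourdomain.com/sitemap.xml").read()
# for url in sitemap_urls(xml_text):
#     if status_of(url) == 404:
#         print("404:", url)
```

This only covers URLs listed in the sitemap; a full crawler like Screaming Frog will also find 404s reached through internal links.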
Since you're already in the relevant section of Search Console, let's take a quick look at excluded URLs at the same time. Click on the "Excluded" tab and look for the following messages:
404 pages are pretty much the most damaging thing a website has to offer. After all, what visitor wants to click on a result only to find that the desired content no longer exists?
In addition, each of your subpages has a value. If you do not redirect a removed page, this value is lost, and with it all the trust and success that this page has earned with a search engine. High-quality backlinks that you have built up simply fizzle out into nothing.
To prevent exactly that, there are so-called "redirects". Matt Cutts has explained wonderfully what such a redirect is and why this tool is so important for Google:
If you host your website at RAIDBOXES, it is very easy to set up such redirects. Just go to Dashboard -> Settings -> Redirects and define which URL should redirect where:
The difference between the two redirects:
- 301 redirect / permanent redirect: Redirects permanently to the target page. It is used when the content can now be found at another URL.
- 302 redirect / temporary redirect: Redirects temporarily to the target URL. This is used for test pages, for example.
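You can verify which of the two status codes a URL actually returns without following the redirect. A small sketch with the Python standard library (the URL in the usage comment is a placeholder):

```python
import http.client
from urllib.parse import urlparse

def redirect_status(url):
    """Return the raw HTTP status code of a URL without following redirects."""
    parts = urlparse(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    # HEAD is enough: we only care about the status line, not the body.
    # (Query strings are ignored in this sketch.)
    conn.request("HEAD", parts.path or "/")
    return conn.getresponse().status

def classify(status):
    """Human-readable meaning of the common redirect codes."""
    return {301: "permanent redirect (301)",
            302: "temporary redirect (302)"}.get(status, "no redirect")

# Usage (performs a live request):
# print(classify(redirect_status("http://yourdomain.com/old-page")))
```

This is handy when auditing whether old URLs really answer with a 301 rather than a 302.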
However, a URL is not always meant to point to a new page. Sometimes content is simply deleted permanently. For such cases, you can use the Yoast SEO Premium plugin or the WordPress plugin Redirection. Both offer exactly this function, including 301 and 302 redirects:
Fixing 404 errors has top priority for you. Make sure that you only redirect to URLs that have a connection to the deleted pages.
Next, let's take a look at the URLs excluded by noindex in Search Console:
The noindex tag tells crawlers that the corresponding pages should not be indexed. This makes sense, for example, for thank-you pages after a newsletter registration, for member pages or for legally required subpages.
But it can also happen that important pages that should actually be indexed end up in this pot, for example through carelessly configured plugins or sloppily coded WordPress themes. If that is the case, you have to identify the cause.
Tip: If you use the Yoast SEO plugin to optimize your posts, you will find a selection box at the end of each post. With this you can determine whether the "index" or "noindex" tag should be assigned to a post. Other SEO plugins for WordPress, such as All in One SEO Pack by Michael Torbert, offer similar functions.
With SEO Site Checkup you can verify whether the problem has been fixed. If not, it is also possible that you have configured the Yoast SEO plugin incorrectly. In this case, I recommend this tutorial.
An XML sitemap is a guide for search engines. The file contains a list of all URLs of your website and notes when they were last modified. There are all sorts of plugins that automatically generate a sitemap for WordPress and WooCommerce, for example the Yoast plugin. Its sitemap can be accessed at yourdomain.com/sitemap_index.xml.
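For illustration, a minimal sitemap entry following the sitemaps.org protocol looks something like this (URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/sample-post/</loc>
    <lastmod>2019-08-01</lastmod>
  </url>
</urlset>
```

Sitemap plugins generate exactly this structure for you, one <url> entry per page.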
Next, we show Google where the XML sitemap is located, to make it as easy as possible for crawlers to access it. To do this, go to the "Sitemaps" tab in Search Console and add it there:
Next, we call up the robots.txt at yourdomain.com/robots.txt. The robots.txt is a file with instructions for the crawlers that visit your website.
If your WordPress site does not use a robots.txt yet, you can copy the following entry into a file and upload it manually to the root directory of your site via FTP. Alternatively, the robots.txt can also be easily created in the Yoast SEO plugin:
At best your WordPress robots.txt looks like this:
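A commonly used minimal robots.txt for WordPress looks like this (one example among many; the sitemap URL is a placeholder for your own):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.com/sitemap_index.xml
```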
Most website operators, however, forget that they can also reference their sitemap.xml in the robots.txt.
Depending on who has worked on your website, there may be other entries in the robots.txt. Sometimes they exclude certain crawlers or try to prevent certain pages from being indexed.
Note: noindex instructions in robots.txt are no longer supported by Google as of July 2019. Have you excluded pages this way so far? Then you need to implement alternatives such as the noindex meta tag or the X-Robots-Tag HTTP header.
By now, the SEO tool should be done scanning your website and present a whole series of recommendations for action. For the following steps, I use my favorite tool, SEMrush. Keep in mind that most SEO tools only analyze the technical aspects of your website.
Tip: Next to each notice in SEMrush you will find a link with the anchor text "Reason and fix". There SEMrush explains why this error is a problem for the search engine optimization of your website.
Critical recommendations for action
Critical errors must be fixed in any case. They affect the rankings of your websites the most and can have negative consequences.
Attached is a list of issues you should be aware of during an SEO audit:
- Does each web page have only one H1 tag? This tag should be short, concise and clearly describe the topic of the page. If there are several, you should change this.
- Do all my pages have an appealing meta title and an appealing, unique meta description? Avoid duplicates. You can make changes in the Yoast plugin; see this guide.
- Is all the content on my website unique? And if not, do I use noindex tags or canonicals so that duplicates are not indexed?
- Do all my web pages load fast enough (see step 5)?
- Do all internal and external links point to valid websites?
- Are all 404 errors on my website fixed (see step 3)?
- Are all URLs on my website crawlable (see step 2)?
- Can my website only be accessed via a single https:// version (see step 1)?
- Are all images on my website accessible and do not trigger a 404 error?
- Does a valid XML sitemap exist on my website?
- Does a robots.txt file exist for my website? Does it contain only as many instructions as are actually needed?
- Is the XML sitemap included in the robots.txt?
- Multilingual web pages: Is the hreflang tag properly implemented?
- Is a valid SSL certificate activated? Is all content accessible via https:// (secure connection)?
- Have redirect chains been removed?
Important recommendations for action
Medium errors have a less drastic effect than critical ones. They either affect the ranking of your website passively, or they influence the user experience, which in turn has consequences for your ranking.
You should identify and resolve the following issues here:
- The ratio of text to HTML should be higher than 10 percent. A lower value may indicate bloated code.
- URLs with fewer than 300 words: pages with little content are usually an indicator of poor research or lacking quality, meaning the user's question or search intention cannot be answered.
- Meta titles are shorter than 70 characters and unique. Meta descriptions are shorter than 160 characters and unique.
- Each image also has a descriptive ALT attribute. This is important for search engines to understand the content of an image. And for an accessible website.
- Temporary redirects have been removed or changed to 301 redirects.
- URLs use hyphens and not underscores (hyphens are used to separate words).
- All internal links are follow links, i.e. they do not carry a nofollow attribute.
- The site uses modern technologies such as caching, compression or modern image formats like WebP to reduce loading times. See the notes here under "Performance and SEO".
- URLs are as short as possible. Stop words are avoided.
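The first point in the list above, the text-to-HTML ratio, can be checked with a few lines of code. This is a crude sketch that simply strips tags and compares lengths, not the exact formula any particular SEO tool uses:

```python
import re

def text_html_ratio(html):
    """Visible text length divided by total document length (rough heuristic)."""
    text = re.sub(r"<[^>]+>", "", html)  # strip all tags
    return len(text) / len(html) if html else 0.0

print(round(text_html_ratio("<p>Hello world</p>"), 2))  # 0.61
```

Values well below 0.10 on real pages suggest a lot of markup for very little visible text.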
You should also address the notices, even if they are not presented as problems. In this section, SEMrush gives you recommendations for action where the tool cannot tell exactly whether a setting is intentional. For example: Should this page really be marked as noindex? Recommendations that affect your content are also included.
Here's what you should look at during this part of the SEO audit:
- Pages that require more than 3 clicks to reach. This indicates a poor website structure.
- URLs that are only linked internally once, which also indicates poor structure.
- All URLs were checked for the correctness of the noIndex tag and the noFollow tag.
Anyone who has followed SEO developments over the last few years remembers debates like "Mobilegeddon", discussions that sent the entire SEO scene into great turmoil. Mobilegeddon did not happen. However, it's no secret that Google has been putting a lot of effort into making the web faster for years. At the same time, Google encourages webmasters not to flood the Google index with low-quality pages.
Thus, the speed at which a page fully loads has become an important ranking factor. And Google has launched several tools to help website owners optimize these loading times.
Even if website speed is one of the weaker ranking factors, the loading time still has an impact on the user experience of your website, and thus a passive effect on the evaluation of user behavior. Your sales suffer as a result, too, as Amazon has shown in a study.
These tools evaluate the current optimization of your website and give recommendations for action that you should follow. I don't want to go deeper into this complex topic here; instead, see the PageSpeed Test guide.
In addition, the following tools can be used to get a quick assessment of the speed of your web pages:
- GTmetrix.com (server locations in Europe only after free registration)
- Experte.de (tests several URLs for speed)
A good and reliable web host like RAIDBOXES takes a lot of work off your shoulders, because the servers are precisely tailored to WordPress or WooCommerce and the user load. In addition, caching is handled on the server side.
Tip: Can't fall back on server-side caching? Then I recommend a plugin like WP Rocket. The plugin is paid, but compatible with most WordPress plugins.
My checklist for optimizing your load times:
- Does your website have fast web hosting? See: Speed comparison.
- Does the website use caching and compression? See: Caching and Brotli Compression.
- Is every URL on the website quickly accessible? See: Time to First Byte.
- Are services such as CDNs (e.g. Cloudflare) properly integrated? But be careful: Are CDNs useful?
- Are features like lazy load being used?
- Are images compressed and use modern formats? See: Image Compression.
- Is meta information (e.g. EXIF data) removed from images?
- Are publicly viewable documents (PDFs) compressed?
- Were all images uploaded in the correct image size?
- Have external embeds been largely avoided? For example, Google Fonts, social media integrations, YouTube videos, etc.
- Has the number of plugins been reduced to the bare minimum?
- Has the database of WordPress and WooCommerce been cleaned and tidied up?
- Are there redirect chains that can be broken up?
More resources on this topic:
- The 10 most important settings for your WordPress speed
- 13 necessary measures for high traffic
- 13 advantages of managed WordPress hosting
A backlink audit checks how many backlinks point to your website, whether these backlinks point to the most important URLs of your website, and how high their quality is.
Why it matters: In 2019, backlinks are still used to rank a URL. Think of a backlink as a recommendation: every URL that links to your website makes such a recommendation for the content of your site. This makes backlinks one of the most important ranking factors.
However, Google is now very good at evaluating how high quality such a backlink is. The following applies: The higher the quality of the source and the more suitable the match to the topic of your website, the better the quality of a backlink.
Now it happens that not every backlink pointing to your website is also a good recommendation. Possible reasons can be:
- Bought backlinks from dubious sources
- Spam attacks
- The referring website is no longer maintained and its information is no longer up to date
- The referring website was sold or its domain expired and now serves spam
The second case, for example, stood in the way of one of my clients' rankings when nearly ten thousand links from spam domains pointed to his pages.
Fortunately, with the Disavow Tool (more on that in a moment), Google offers a way to permanently exclude such harmful and inappropriate links from the calculation of your rankings. And lo and behold, only a few days after submitting the disavow file, there were already decent successes to be recorded:
There are two ways to make such links discoverable:
Option 1: Find and remove manually
To do this, go to the Google Search Console. Under "Links" you will find the widget "Top referring websites". There you will find a list of all domains that link to your website. Export this list as a table and check each individual domain for its spam content:
At https://www.google.com/webmasters/tools/disavow-links-main you can then declare these domains invalid.
Option 2: Let an SEO tool analyze the backlinks
SEO tools like SEMrush and Ahrefs also give you ways to analyze your website's backlinks for quality. Ahrefs may give the best recommendations of all the tools, as it has become known especially for its large backlink database.
In SEMrush you can find an overview of your backlinks under the menu item "Backlink Audit". Here you can see what percentage of these links come from toxic (low-quality) sources. Most important, however, are the lists under the "Audit" tab.
Here SEMrush evaluates exactly why a URL is considered toxic. From there, you can either mark a URL as "safe" or move the URL or the entire domain directly to the disavow.txt, which you then upload to Google.
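The disavow file itself is just a plain text file: one domain or URL per line, with optional comment lines starting with "#". The entries below are placeholders for illustration:

```
# Spam domains pointing to my site
domain:spammy-links.example
domain:cheap-seo.example

# A single bad URL
https://some-blog.example/comment-spam/
```

Entries prefixed with "domain:" disavow every link from that domain, while plain URLs disavow only that specific page.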
Once you've done that, a few hours later you should be notified via Search Console that the disavow file has been successfully applied:
Tip: Don't forget to give SEO tools full access to your Search Console and Analytics data. This is the only way to get a detailed listing of all your backlinks.
- Does the linking domain have a thematic relation to my website?
- Is the linking domain a spam website (pornographic content, scam, fake websites)?
Check the most linked pages and anchor texts
In the Search Console you will find the "Links" tab. There you will see the widget External Links -> Most linked pages. Open this report:
Now you can see which of the URLs on your website are linked to the most by others. As a rule, your home page "/" will be linked to most often. The simple reason for this is that people prefer to link to brands rather than whole groups of words. And that's a good thing. I'll explain why in a moment.
First, however, we are interested in the number of links that point to subpages. Such links are usually an indicator of high-quality, topic-relevant references, which Google and other search engines rate particularly highly.
Take a closer look at the following screenshot:
231 backlinks point to the start page "/", but hardly any to the important subpages. There are no backlinks at all to the central services. As a search engine optimizer, I would now conclude the following:
- The quality of the subpages is poor, so that they are not linked by other website operators. There is no corresponding incentive.
- Services have not ranked so far simply because there are no thematic backlinks.
Finally, let's take a look at the widget "Top linking text". You can also find this in the Search Console under "Links":
In the best case, you will find a list of anchor texts that refer to your domain or brand. Or to the author of a website. These are very welcome and essential for search engine optimization, since mentions on the net are trust-building for search engines. They help you to build up an authority.
These are closely followed by anchor texts with keywords that are especially important for your site, for example services or team members.
Here, too, the suspicion solidifies that the services have not yet received any backlinks. In addition, the anchor texts suggest that important keywords are still underrepresented. There are also anchor texts from foreign domains, which may indicate spam. These should be examined more closely.
Every SEO audit aims to increase the organic traffic of your websites in the long run. Therefore, in this step we will examine the current rankings of a website and the traffic from Google Analytics.
To do this, first open Google Analytics. Get an overview of the visitor numbers of the last 12 months under Audience -> Overview. Deselect the segment "All Users" and replace it with "Organic Traffic":
In the best case, the traffic of your website increases steadily. If it is stagnating or has even collapsed (see the following case), we need to research the causes:
In such a case, it is possible that your website was affected by a Google update. Unfortunately, Google does not notify websites individually when an update has negatively affected them. Monitoring and control are the webmaster's responsibility.
You can now use Google search to check for updates and announcements during this period:
And lo and behold: major SEO magazines immediately catch your eye, reporting an update on August 1, 2018:
Now you know that your website was actually affected by an algorithm update. This gives you the opportunity to learn more about the causes and effects. Or exchange information with others in order to restore the ranking of your website.
Alternatively, you can open SEMrush's Organic Research and enter a domain of your choice. Known Google updates are noted in the chart and give you a direct clue as to why traffic dropped at a given time:
Unfortunately, Analytics doesn't show any data on your website's current keyword rankings. To get such data, you need a paid SEO tool or UberSuggest. The larger the keyword database of the SEO tool, the better your analysis will be.
To get data on the ranking of a website, we open the organic research in SEMrush again.
Here you can see that everything has been done right so far and that the rankings of the domain are constantly improving. In addition, we get an overview of our URLs: for which keywords and in which positions do they rank?
If you click on one of the keywords, you will see a list of all the important data for this search query. Also a list of the top 10 results. In other words: You see your competition.
The task of our SEO audit here is to evaluate the current rankings and develop new traffic sources. This is the only way to tap into keyword opportunities and increase the website's traffic in the long term. Analyze what the competitors ahead of you are doing better. This can be better content, a better tool or simply better backlinks.
In order not to go beyond the scope of the article: Read my guest post "With keyword research to SEO success". There, I show you step by step how to conduct a keyword research with the help of your current rankings in order to open up new traffic sources.
What do you do when traffic stagnates?
Either your traffic is no longer growing because no new content is being published on the website, or because your website is no longer rising in the rankings. In this case, too, I recommend a competitor analysis for each keyword, or developing new keywords by means of keyword research.
Were your blog posts published a while ago? Then it's also worth refreshing all content by updating it. See step 8.
What do you do when traffic has plummeted?
If you don't find any signs that your website was affected by a Google update, the obvious assumption is that it fell victim to a technical error. Retrace all of your changes: were pages changed or deleted in the weeks before the traffic collapse? Did you add or update WordPress plugins?
Whatever the reason, you need to get to the root cause through the process of elimination.
Search engines love fresh content! It is not without reason that it has become part of every SEO strategy to regularly adapt content on the web and supplement it with new information. And also your visitors will be pleased about fresh articles.
This also keeps the content of your website up to date, and you reduce the risk of competitors contesting first place with newer content. For search engines, there is nothing worse than sending a searcher to a website with outdated content. After all, the job of search engines is to give the user the best possible answer to their search query. If a user bounces off your website, that's not good for the search engine, and it's not good for your user experience. In other words, it has a negative impact on your ranking.
This is exactly why you should regularly subject your content to a content audit. In a content audit, you analyze all the content on your website to see whether it:
- Provides added value for the user
- Is up to date
- Is not a duplicate of other pages
- Generates organic traffic
- Can be combined with other pages
- Offers a good user experience on this URL
As a rule, you then create a list of all URLs and decide for each piece of content whether it:
- Should be merged with another URL
- Should be updated
- Or requires no manual action
You can easily maintain a content audit in a spreadsheet (Excel, Google Sheets or similar). It looks something like this:
In addition, you can examine each URL individually in Google Analytics for dwell time and bounce rate. These values indicate whether your visitors find the content of a page useful or not.
To do this, go to Google Analytics -> Behavior -> Site Content -> Landing Pages. There you will find a list of all URLs and the most visited pages:
You want to learn more about the content audit? Then I recommend the article Content Audit: Analyzing Existing Content by Mareike Doll. In it, she describes in detail how a content audit is organized and carried out.
The book Think Content by Miriam Löffler is also very suitable for this. It contains numerous tips on content marketing and the right content strategy.
Note: Do not overlook meta titles and meta descriptions!
During a content audit, many webmasters focus only on the visible content of their website and forget to also review their meta titles and meta descriptions. These are the short texts that are displayed in the search results and should encourage users to click on your result.
To check the meta information of your website, call up the Google search and enter the command "site:yourdomain.com" in the search mask again:
You will then receive a list of all your indexed pages. Here you can see which meta descriptions and meta titles are too long or too short.
Tip: Always present the content of your pages in an informative way. At the same time, make your search results as attractive and exciting as possible for the user. Meta information is no longer an active ranking factor, but the number of people who click on your result can still have a passive effect on your positioning. The keyword here is click-through rate (CTR).
Alternatively, use a tool like the Screaming Frog SEO Spider. It crawls all URLs of your website and shows whether meta descriptions are too long or even missing:
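If you'd rather script this check for a single page, here is a minimal sketch. It uses a simple regex heuristic (a real crawl should use a proper HTML parser), and the sample document is hypothetical:

```python
import re

def meta_lengths(html):
    """Return (title length, meta description length) of an HTML document."""
    title = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    desc = re.search(
        r'<meta[^>]*name=["\']description["\'][^>]*content=["\']([^"\']*)',
        html, re.I)
    return (len(title.group(1)) if title else 0,
            len(desc.group(1)) if desc else 0)

sample = ('<title>My page</title>'
          '<meta name="description" content="A short description.">')
print(meta_lengths(sample))  # (7, 20)
```

Compare the returned lengths against the guidelines from above: roughly 70 characters for the title and 160 for the description.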
The length of my guide shows how extensive an SEO audit can be. The good news: you've done it! If you have followed all these steps, you should have a good understanding of your website's weaknesses and are now in a position to fix the most important ones.
The bad news: a full SEO audit is usually even more extensive. Some topics like structured data, Google My Business or duplicate content are more complex than I could describe here. Feel free to write in the comments which sub-topics you are missing a suitable tutorial for.
Even after an initial audit, there is still a lot of optimization to be done. For some things you will have to resort to SEO professionals. They have more experience in interpreting the data to tackle the problem with the right approach. Nevertheless, I hope that I could help you with this article. And now: Good luck with your SEO audit!
What questions do you have about the SEO audit? Which tools can you recommend? Feel free to use the comment function. Want more tips on WordPress and performance? Then follow us on Twitter, Facebook or via our newsletter.