As a search engine optimizer, I regularly conduct SEO audits for a wide variety of clients. In this article I show you: What is an SEO audit? Why is it so important for search engine optimization? And how can you do the audit yourself? A complete guide for your own website as well as for freelancers and agencies.
An SEO audit is the ultimate test of your website to assess its current level of search engine optimization and technical optimization. Roughly speaking, all on-page and off-page factors of your web presence or online shop are tested, down to the smallest detail.
More precisely, an SEO expert uses checklists, online tools and their own experience to assess the technical infrastructure and the performance, but also the content and backlinks of the site. I show you how to create the audit yourself.
As a rule, a catalogue of measures is developed from this. This lists all weaknesses and potential opportunities of a website. In short: An SEO audit is the basis for your strategy for the next months.
The focus is on evaluating the search engine friendliness of a website. At the end of the project, a catalogue of measures is drawn up from the results. Usually, website owners or the clients of your agency pursue the following goals with this service:
- Evaluation of the website for current SEO trends, traffic etc.
- Review of the success of a current SEO agency by a third party
- Increase of visibility in organic search results
For agencies, too, SEO audits are an opportunity to:
- Expand the service portfolio
- Convince customers of technical and content-related measures
- Inform potential customers about the status of their website
The biggest advantage of an SEO audit is that the recommendations for action can be implemented by the SEO agency as well as by the client or by yourself.
Why are SEO audits so expensive?
The problem with an SEO audit lies in search engine optimization itself: there are dozens of rules and optimization options against which every URL of the website must be analyzed. In addition, you have to check the content, and sometimes thousands of backlinks, one by one for quality.
This whole process is time consuming. Usually an SEO audit takes several hours, days or even weeks. The larger the website, the more time is needed for a complete analysis. In addition to the hourly wage of the experts, an SEO agency will use several tools and charge for their use. And these tools are usually not cheap.
All these factors mean that an SEO audit can become a very useful but expensive undertaking.
When RAIDBOXES asked me for an article on this topic, I thought about it for a very long time: how do you summarize this very extensive topic in a way that enables everybody to do an SEO audit by themselves? Because no tool can replace an expert who regularly deals with the subject. Regularity is the keyword. New features in search engine optimization are constantly being added, although the basic rules of the game have remained the same for years.
Google makes thousands of changes to its search algorithm every year, with small adjustments at regular intervals. And the competition never sleeps. Information, tricks and tips are no longer up to date after a few months, or they are copied by several competitors. To successfully complete an SEO audit you should:
- Continually deal with the topic of search engine optimization
- Be informed about current changes and trends
- Have a broad knowledge of how to optimize your website
See for example the article on SEO Guide 2019. In the following tutorial I will show you step by step how to detect critical errors of your websites by yourself. And how you can fix them.
For an SEO audit to be holistic, you should make sure you have access to the most important analysis tools and services. These include Google Analytics, the Google Search Console and access to your Google My Business entry.
An SEO audit is a time-consuming undertaking. In order to keep the time spent as low as possible, it is recommended to use professional tools when analyzing the website.
Professional SEO audit tools are for example:
The Screaming Frog Spider is a great tool to analyze your URLs, although it is mostly aimed at advanced users. Unlike online tools, Screaming Frog turns your PC into the crawler. The program is free of charge for up to 500 crawled URLs, which is just enough for small websites.
Other useful (free) tools:
Neil Patel's site audit is free of charge. However, it is still miles behind the paid tools, even if the advice it gives is sometimes very good. The tool is constantly being developed and has made great leaps in the last few months.
Before we let an SEO audit tool check our website, we have to make sure that only one version of it can be reached. There are 4 different URL variants with which users can call up a website:

- http://domain.de
- http://www.domain.de
- https://domain.de
- https://www.domain.de
Open the address bar of your browser and type in each version of your website in turn. Ideally, all versions should redirect you to a single version:
If this is not the case, or if you receive the warning message "Not secure" in your browser, you should contact your web host so that they can activate the SSL certificate for your website or correct your DNS settings.
Tip: Customers of RAIDBOXES can view these settings directly in the dashboard. You can activate SSL under the menu item of the same name. A Let's Encrypt SSL certificate is always included free of charge. It can be installed with one click.
It is important for two reasons that your websites are accessible via https://: Firstly, SSL is a ranking factor. And secondly, you certainly don't want your visitors to be informed that the website or online shop may not be secure.
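If you want to verify the single-version rule programmatically, the check can be sketched in a few lines of Python. This is only an illustration: the helper takes the final URL that each variant resolves to (in practice you would fetch each variant, for example with urllib, and read the response URL) and confirms that all four end up at one https:// version. The domain and the sample data are placeholders.

```python
def canonical_version(final_urls):
    """Given the final URL each variant redirects to, return the single
    canonical https:// version, or None if the variants disagree."""
    targets = set(final_urls.values())
    if len(targets) != 1:
        return None  # the variants resolve to different pages
    target = targets.pop()
    return target if target.startswith("https://") else None

# The four variants a visitor might type, mapped to where they end up
# after following redirects (values here are illustrative):
variants = {
    "http://example.com/":      "https://example.com/",
    "http://www.example.com/":  "https://example.com/",
    "https://example.com/":     "https://example.com/",
    "https://www.example.com/": "https://example.com/",
}
print(canonical_version(variants))  # https://example.com/
```

If the function returns None, at least one variant is missing a redirect and you should talk to your host, as described above.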
To make sure that all the content of your website is really integrated via a protected connection, you can take an additional look at the developer console of your Internet browser:
If the console issues a so-called "Mixed Content Error", this must also be corrected. This usually happens if you have included website content with http:// instead of https://. Common culprits are external scripts like Google Maps or images from other websites.
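To get a rough idea of where mixed content comes from, you can also scan a page's HTML for resources embedded via http://. The sketch below uses only Python's standard library and deliberately ignores normal navigation links (`<a href>`), since those are not mixed content; the sample HTML is made up.

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collects resources embedded via plain http:// on an https:// page."""

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        candidates = [d.get("src"), d.get("poster"), d.get("data")]
        if tag == "link":  # stylesheets and icons are loaded via href
            candidates.append(d.get("href"))
        for value in candidates:
            if value and value.startswith("http://"):
                self.insecure.append((tag, value))

sample = (
    '<img src="http://example.com/logo.png">'              # mixed content
    '<script src="https://example.com/app.js"></script>'   # fine
    '<a href="http://example.com/partner">link</a>'        # navigation, not mixed
)
scanner = MixedContentScanner()
scanner.feed(sample)
print(scanner.insecure)  # [('img', 'http://example.com/logo.png')]
```

Anything the scanner reports should be switched to https:// or removed, just as the browser console would demand.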
Tip: If your web hosting configuration is too complex, you can use a tool like Really Simple SSL to do the work for you. It fixes such mistakes automatically. Not only in this case should you think about managed WordPress hosting.
In the meantime, let's start the SEO audit with a suitable tool. You can of course use Neil Patel's site audit. However, I recommend trying a trial version of one of the professional tools (see above). If you like it, or if you are successful with it, you can keep using the tool after the trial.
Important settings, no matter in which tool:
- Respect the instructions given in the robots.txt. We want to look at the website the same way a crawler does
- If possible: Crawl the website using the Sitemap.xml. Common paths for a sitemap are: domain.de/sitemap.xml or domain.de/sitemap_index.xml
This process will take some time depending on the size of your website(s). While it runs, let's move on to the next step.
Can users find your website?
The next step is to check whether your website can be found at all, or whether crawlers have problems reading it. Visit google.de and enter the following command in the search mask:
What follows is a list of all search results that concern only your domain:
As you can see, at least everything seems to be okay here. The website can be found in the search results.
Next, open the Search Console. Google's in-house console shows webmasters information about the indexing of their website. You will also be notified of critical errors. A great and very helpful thing.
Check whether your domain is already registered and verified. If not, catch up on that step. For the Search Console it is mandatory that the correct version of your website is registered. That is: if your website is accessible via https://domain.de but is registered as https://www.domain.de, the Search Console will not work properly and you will get wrong results.
Click on the tab "Coverage": here you will get a list of possible errors as well as excluded and valid pages:
Pages marked as errors here must be corrected! Warnings should also be viewed critically, and their causes removed immediately.
Also interesting: the screenshot shows only 28 valid pages. But as we just found out, Google shows 45 results for this website. We should investigate this further. It could mean that 404 error pages are indexed in the search.
To check this, there are the following ways:
- Jump back to Google Search and use the command "site:yourdomain.com". From there you can click through the search results and look for anomalies like 404 errors etc.
- Or use an SEO tool like the Screaming Frog Spider, or the tool of your choice, to search for 404 pages.
Since you are already in the relevant section of the Search Console, let's take a quick look at the excluded URLs at the same time. Click on the tab "Excluded" and search for the following messages:
404 pages are pretty much the most harmful thing a website has to offer. After all, what visitor wants to click on a result only to see that the desired content no longer exists?
In addition, each of your subpages has a value. If you do not redirect a 404 page, this value is lost, and with it all the trust and success you have earned for this page with a search engine. High-quality backlinks that you have built up simply fizzle out into nothing.
So that exactly this does not happen, there are so-called "redirects". Matt Cutts of Google explained wonderfully what such a redirect is and why this tool is so important to Google:
By the way, if you host your website with RAIDBOXES, it is easy to set up such redirects. Just go to Dashboard -> Settings -> Forwardings and define which URL should forward to where:
The difference between the two redirects:
- 301 forwarding / permanent redirect: permanently redirects to the destination page. Used when the content has moved permanently to another URL.
- 302 forwarding / temporary redirect: temporarily redirects to the destination URL. Used, for example, for test pages.
However, a URL should not always point to a new page. Sometimes content is simply deleted for good. For this purpose there is status code 410. For such cases you can use the Yoast SEO Premium plugin or the WordPress plugin Redirection. Both offer exactly this function, including 301 and 302 redirects:
Fixing 404 errors is a top priority for you. Make sure that you only redirect to URLs that are related to the deleted pages.
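When you work through a crawl report, it helps to sort URLs by the action their status code suggests. The following Python sketch is one possible way to triage such a list; the status codes follow the rules above (404: redirect or 410, 302: check whether it should be permanent), and the sample URLs are invented.

```python
def triage(crawl_results):
    """Bucket crawled (url, status) pairs by the action they suggest."""
    broken, temporary, ok = [], [], []
    for url, status in crawl_results:
        if status == 404:
            broken.append(url)     # set a 301 to a related page, or a 410 if gone for good
        elif status == 302:
            temporary.append(url)  # intended as temporary, or should it be a 301?
        else:
            ok.append(url)
    return {"broken": broken, "review_temporary": temporary, "ok": ok}

results = [("/old-post", 404), ("/campaign", 302), ("/blog/", 200)]
print(triage(results))
```

The "broken" bucket is your to-do list; the "review_temporary" bucket is worth a second look, since a forgotten 302 often should have been a 301.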
Next, let's take a look at the URLs that the Search Console excluded because of noindex:
The noindex tag tells crawlers that the corresponding pages should not be indexed. This is the case, for example, with thank-you pages after a newsletter registration, with member pages or with legally required sub-pages.
But it can also happen that important pages, which should actually be indexed, end up in this pot, for example through carelessly configured plugins or through sloppily coded WordPress themes. If that is the case, then you have to identify the cause.
Tip: If you use the Yoast SEO plugin to optimize your posts, you will find a selection box at the end of the post. This can be used to determine whether the tag "index" or "noindex" should be assigned to a contribution. Other SEO plugins for WordPress, like All in One SEO Pack by Michael Torbert, offer similar functions.
You can find all tips in our SEO site checkup to verify whether this fixes the problem. If not, it is also possible that you have configured the Yoast SEO plugin incorrectly. In this case I recommend this tutorial.
An XML sitemap is a guide for search engines. The file contains a list of all URLs of your website and when they were last modified. There are many plugins that automatically generate a sitemap for WordPress and WooCommerce, for example the Yoast plugin. Its sitemap can be reached at "yourdomain.de/sitemap_index.xml".
Next we show Google where the XML sitemap is located, to make it as easy as possible for crawlers to access it. To do this, go to the "Sitemaps" tab in the Search Console and add it there:
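If you want to see exactly which URLs your sitemap announces to Google, it can be parsed with a few lines of standard-library Python. The sitemap below is a tiny illustrative example; a real one would be fetched from your domain.

```python
import xml.etree.ElementTree as ET

# A tiny illustrative sitemap; a real one would be fetched from your domain.
SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc><lastmod>2019-08-01</lastmod></url>
  <url><loc>https://example.com/blog/</loc><lastmod>2019-07-15</lastmod></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_entries(xml_text):
    """Return (loc, lastmod) pairs from a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [
        (u.findtext("sm:loc", namespaces=NS), u.findtext("sm:lastmod", namespaces=NS))
        for u in root.findall("sm:url", NS)
    ]

print(sitemap_entries(SITEMAP))
```

Comparing this list with the URLs your crawl found is a quick way to spot pages that are missing from the sitemap, or sitemap entries that no longer exist.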
Next, we open the robots.txt at yourdomain.de/robots.txt. The robots.txt is a file with instructions for the crawlers that visit your website.
If your WordPress site does not yet use a robots.txt, you can copy the following entry into a file and upload it manually to the root directory via FTP. Alternatively, you can easily create the robots.txt in the Yoast SEO plugin:
Ideally, your WordPress robots.txt looks like this:
However, most website operators forget the possibility of storing their sitemap.xml in the robots.txt.
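For orientation, a minimal WordPress robots.txt of the kind meant here might look like the following, including the sitemap line that is so often forgotten. The domain is of course a placeholder:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourdomain.de/sitemap_index.xml
```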
Depending on who has worked on your website, there may be other code snippets in the robots.txt. Sometimes they exclude certain crawlers or try to keep certain pages from being indexed.
Note: noindex statements in the robots.txt are no longer supported by Google since July 2019. Have you excluded pages this way so far? Then you have to implement alternatives like the noindex meta tag.
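You can also test which URLs a robots.txt actually blocks without waiting for a crawler: Python's standard library ships a robots.txt parser. The file contents below are illustrative; in practice you would fetch yourdomain.de/robots.txt (the `site_maps()` call requires Python 3.8 or newer).

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt; in practice you would fetch yourdomain.de/robots.txt.
ROBOTS = """\
User-agent: *
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

print(rp.can_fetch("*", "https://example.com/blog/"))           # allowed
print(rp.can_fetch("*", "https://example.com/wp-admin/x.php"))  # blocked
print(rp.site_maps())  # Python 3.8+: the Sitemap lines, if any
```

This is a quick sanity check that you are not accidentally locking crawlers out of pages that should be indexed.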
By now, the SEO tool should have finished scanning your website - and present a whole range of recommendations for action. For the following steps I use my favourite tool SEMrush. Usually, most SEO tools only analyze the technical aspects of your website.
Tip: Next to each hint in SEMrush you will find a link with the anchor text "Reason and fix". There SEMrush shows you why this error is a problem for the search engine optimization of your website.
Critical recommendations for action
Critical errors must be eliminated in any case. They affect the rankings of your websites most and can have negative consequences.
Below is a list of issues that you should watch out for during an SEO audit:
- Each page has only one H1 tag. This tag is short, concise and clearly describes the topic of the page. If you have more than one, you should change this.
- Do all my pages have an appealing meta title? And an appealing, unique meta description? Avoid duplicates. You can make changes in the Yoast plugin, see this guide.
- Is all content on my website unique? And if not, am I using noindex tags or canonicals so that duplicates are not indexed?
- Do all my websites load fast enough (see step 5)?
- Do all internal and external links refer to valid websites?
- Are all 404 errors on my website fixed (see step 3)?
- Are all URLs on my website crawlable (see step 2)?
- Can my website only be accessed via a single https:// version (see step 1)?
- Are all images available on my website and do not cause a 404 error?
- Is there a valid XML Sitemap on my website?
- Is there a robots.txt file for my website? Does it contain only as much instruction as is actually needed?
- Is the XML sitemap included in the robots.txt?
- Multilingual websites: Is the hreflang tag properly implemented?
- Is a valid SSL certificate activated? Is all content accessible via https:// (secure connection)?
- Have redirect chains been removed?
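Some of these checks are easy to automate. As one example, the single-H1 rule from the list above can be verified per page with Python's standard-library HTML parser; the sample snippets are made up.

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts H1 tags on a page."""

    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def has_single_h1(html):
    counter = H1Counter()
    counter.feed(html)
    return counter.h1_count == 1

print(has_single_h1("<h1>SEO Audit</h1><h2>Step 1</h2>"))  # True
print(has_single_h1("<h1>One</h1><h1>Two</h1>"))           # False
```

Run over every crawled page, this flags exactly the URLs where the heading structure needs attention.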
Important recommendations for action
Medium errors have a less drastic effect than critical errors. They either have a passive effect on the ranking of your website, or they influence the user experience. This in turn has consequences for the ranking of your website.
You should identify and solve the following points here:
- The ratio of text to HTML is higher than 10 percent. A lower value may indicate overloaded code.
- URLs have at least 300 words. Pages with thin content are usually an indicator of poor research or poor quality, meaning the user's question or search intention cannot be answered.
- Meta titles are shorter than 70 characters and unique. Meta descriptions are shorter than 160 characters and unique.
- Each image also has a descriptive ALT attribute. This is important for search engines to understand the content of an image. And for a barrier-free website.
- Temporary redirects have been removed or changed to 301 redirects.
- URLs use hyphens, not underscores (hyphens are used to separate words).
- All internal links use the dofollow attribute.
- The site uses modern technologies like caching, compression or modern image formats such as WebP to reduce the loading time of a web page. See the notes here under "Performance and SEO".
- URLs are as short as possible. Stop words are avoided.
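The text-to-HTML ratio from the first point can be approximated with a short script. This is a crude sketch: it counts all data nodes as text (including script and style content), which is good enough to flag markup-heavy pages; the sample page is synthetic.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects text nodes (crudely: script/style content is included too)."""

    def __init__(self):
        super().__init__()
        self.parts = []

    def handle_data(self, data):
        self.parts.append(data)

def text_html_ratio(html):
    """Length of extracted text divided by length of the full HTML source."""
    extractor = TextExtractor()
    extractor.feed(html)
    return len("".join(extractor.parts)) / len(html) if html else 0.0

lean_page = "<html><body><p>" + "word " * 50 + "</p></body></html>"
print(text_html_ratio(lean_page) > 0.10)  # True: mostly text, little markup
```

Pages that score well below the 10 percent mark are candidates for slimming down the markup.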
You should also fix the hints, even if they are not presented as problems. In this section SEMrush gives you recommendations for actions where the tool cannot judge exactly whether a setting is intended. An example: should a page really be marked as noindex? Recommendations concerning your content are also included.
Here's what you should look at during the SEO audit:
- Pages can only be reached with more than 3 clicks. This speaks for a bad structure of the website.
- URLs of the website are internally linked only once, which also indicates a bad structure.
- All URLs were checked for the correctness of the noIndex tag and the noFollow tag.
Anyone who has followed the SEO development of the last few years remembers discussions like Mobilegeddon. Discussions that brought the entire SEO scene into turmoil. Mobilegeddon did not take place. But it's no secret that Google has been working hard for years to make websites faster. At the same time, Google encourages webmasters not to flood the Google index with low-quality pages.
Thus, the time until a page is fully loaded has become an important ranking factor. And Google has launched a number of tools to help webmasters optimize these loading times.
Even if website speed is one of the weaker ranking factors, the loading time can still affect the user experience of your website. Thus it has a passive effect on the evaluation of user behaviour. Your turnover suffers from this as well, as Amazon published in a study.
Google wants to tackle exactly this phenomenon with its speed test Test my Site with Google. Another important tool that helps you to optimize your website is Google PageSpeed Insights.
Both tools evaluate the current optimization of your website and give recommendations for action that you should follow. I do not want to go into this complex subject in detail here, but would like to refer to the instructions for PageSpeed Test.
In addition, the following tools can be used to get a quick evaluation of the speed of your web pages:
- GTmetrix.com (Server locations in Europe only after free registration)
- expert.com (tests several URLs at once for speed)
A good and reliable web hosting like that of RAIDBOXES takes a lot of work off your shoulders, because the servers are tailored exactly to WordPress or WooCommerce and the user load. In addition, caching is handled server-side.
Tip: You can't use server caching? Then I recommend a plugin like WP Rocket. Although this plugin is a paid service, it is compatible with most WordPress plugins.
My checklist for optimizing your loading times:
- Does your website have a fast web hosting? See: Speed comparison.
- Does the website use caching and compression? See: Caching and Brotli Compression.
- Is every URL on the website quickly accessible? See: Time to First Byte.
- Are services like CDNs (e.g. Cloudflare) correctly integrated? But attention: are CDNs useful?
- Are functions like Lazy Load used?
- Are images compressed and use modern formats? See: Image Compression.
- Is meta information removed from images?
- Are publicly viewable documents (.pdfs) compressed?
- Were all images uploaded in the correct image size?
- Are external embeds largely avoided? For example Google Fonts, social media integrations, YouTube videos etc.?
- Is the number of plugins reduced to the bare essentials?
- Is the database of WordPress and WooCommerce cleaned and tidied up?
- Are there Redirect chains that can be broken up?
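The last point, redirect chains, can be detected mechanically once your crawl has produced a source-to-target map of all redirects. The following Python sketch walks each chain; the map and the paths are illustrative.

```python
def redirect_chains(redirects, max_hops=10):
    """Given a {source: target} redirect map, return chains longer than one hop."""
    chains = []
    for start in redirects:
        path = [start]
        url = start
        while url in redirects and len(path) <= max_hops:  # max_hops guards against loops
            url = redirects[url]
            path.append(url)
        if len(path) > 2:  # more than one hop: collapse into a single redirect
            chains.append(path)
    return chains

# Illustrative crawl output: /old was redirected twice over the years.
hops = {"/old": "/older", "/older": "/final"}
print(redirect_chains(hops))  # [['/old', '/older', '/final']]
```

Each reported chain should be collapsed so that the first URL redirects straight to the final one, saving a round trip on every visit.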
Further resources on this topic:
- The 10 most important set screws for your WordPress Speed
- 13 necessary measures for high traffic
- 13 advantages of managed WordPress hosting
A backlink audit checks how many backlinks point to your website, whether these backlinks point to the most important URLs of your website and how high-quality they are.
Why is this so important? Even in 2019, backlinks are still used to rate a URL. Think of a backlink as a recommendation: every URL that links to your website is recommending the content of your site. Backlinks are therefore one of the most important ranking factors.
However, Google is now very good at evaluating how high-quality such a backlink is. The rule here: the higher the quality of the source and the better the match to the topic of your website, the better the quality of a backlink.
But not every backlink that points to your website is a good recommendation. Possible reasons could be:
- Bought backlinks from dubious sources
- Spam attempts to damage a website
- Linking website is no longer maintained and the information is no longer current
- Referring website was sold or not renewed and is now spam
The second case, for example, stood in the way of the ranking of one of my clients, when almost ten thousand links from spam domains pointed to his pages.
Fortunately, Google's Disavow Tool (more on this in a moment) offers a way to permanently exclude such harmful and inappropriate links from the calculation of your rankings. And lo and behold, only a few days after the Disavow deployment, there were already some decent successes to be had:
There are two ways to make such links findable:
Option 1: Find and remove manually
Go to the Google Search Console. Under "Links" you will find the widget "Top referring websites". There you will find a list of all domains that link to your website. Export this list as a table and check every single domain for its spam content:
On https://www.google.com/webmasters/tools/disavow-links-main you can then have these domains declared invalid.
Option 2: An SEO tool to analyze the backlinks
SEO tools like SEMrush and ahrefs also give you tools to analyze the backlinks of your website for their quality. Of all the tools, ahrefs should give the best recommendations, as it has become especially known for its large backlink database.
In SEMrush you can find an overview of your backlinks under the menu item "Backlink Audit". Here you can see what percentage of these links come from toxic (low-quality) sources. However, the most important are the lists under the "Audit" tab.
Here SEMrush evaluates exactly why a URL is considered toxic. From there, you can either mark it as "safe" or put the URL or the whole domain directly into the disavow.txt, in order to upload it to Google afterwards.
When you have done this, you should be notified via the Search Console a few hours later that the Disavow file has been successfully transferred:
Tip: Don't forget to give the SEO tools full access to your Search Console and Analytics data. This is the only way to get a detailed list of all your backlinks.
- Does the linking domain have a thematic reference to my website?
- Is the linking domain a spam website (pornographic content, scam, fake websites)?
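Once you have answered these questions and collected the offenders, the disavow file itself is just plain text in a fixed shape: `domain:` lines for whole domains, bare URL lines for individual pages, and `#` comments. A small helper like the sketch below (domains and URLs are invented) keeps the output deduplicated and consistently formatted.

```python
def build_disavow(toxic_domains, toxic_urls=()):
    """Build the text of a disavow file: 'domain:' lines for whole domains,
    plain URL lines for individual pages."""
    lines = ["# disavow list generated during the backlink audit"]
    lines += [f"domain:{d}" for d in sorted(set(toxic_domains))]
    lines += sorted(set(toxic_urls))
    return "\n".join(lines) + "\n"

disavow = build_disavow(
    ["casino-spam.example", "linkfarm.example"],
    ["https://blog.example/bought-link"],
)
print(disavow)
```

The resulting text is what you save as disavow.txt and upload via Google's disavow tool, as described above.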
Check the most-linked pages and anchor texts
In the Search Console you will find the tab "Links". There you see the widget External links -> Top linked pages. Open this report:
Now you can see which of the URLs on your website are linked most by others. Usually your homepage "/" will be the most frequently linked. The simple reason is that people prefer to link to brands instead of whole phrases. And that's a good thing. I'll explain why in a moment.
First, however, we are interested in the number of links that refer to subpages. Such links are usually an indicator of high quality and topic-relevant links, which Google and other search engines rate as particularly good.
Take a closer look at the following screenshot:
231 backlinks refer to the start page "/", but hardly any of them to the important subpages. There are no backlinks to the central services at all. As a search engine optimizer I would now conclude the following:
- The quality of the subpages is poor, so that they are not linked by other website operators. There is no corresponding incentive.
- Services have not ranked so far because there are simply no thematic backlinks.
Finally, let's take a look at the "Top-referencing text" widget. You can also find this in the Search Console under "Links":
In the best case you will find here a list of anchor texts that refer to your domain or brand name, or to the authors of a website. These are very welcome and essential for search engine optimization, since such mentions on the net signal trustworthiness to search engines. They help you build authority.
Closely followed by anchor texts with keywords that are especially important for your site. For example services or team members.
Here, too, the suspicion that the services have not yet received backlinks is reinforced. In addition, you can read from the anchor texts that important keywords are still missing. Furthermore, anchor texts from foreign domains can be found, which may indicate spam. These should be examined more closely.
Every SEO audit aims to increase the organic traffic of your websites in the long run. Therefore in this step we will examine the current rankings of a website and the traffic from Google Analytics.
First, open Google Analytics. Under Audience -> Overview you will find an overview of the number of visitors in the last 12 months. Select the segment "All users" and replace it with "Organic traffic":
In the best case, the traffic of your website increases steadily. If the traffic has stagnated or even collapsed (see the following case), we have to investigate the reasons for this collapse:
In such a case it is possible that your website was affected by a Google update. Unfortunately, Google does not notify websites individually about whether an update has negatively affected them. Monitoring and follow-up are up to the webmasters.
You can now use Google Search to check for updates and announcements during this period:
And lo and behold, you immediately notice big SEO magazines reporting on the update of August 1, 2018:
Now you know that your website was actually affected by an algorithm update. This gives you the opportunity to get more detailed information about the causes and effects, or to exchange ideas with others in order to restore the ranking of your website.
Alternatively you can use the organic research of SEMrush and enter a domain of your choice there. The diagram also shows known updates from Google. They give you a direct clue as to why the traffic was down at that time:
Unfortunately, Analytics does not show any data on the current keyword rankings of your website. To get that data, you must rely on a paid SEO tool or UberSuggest. The larger the keyword database of the SEO tool, the better your analysis will be.
To get ranking data for a website, we again open Organic Research in SEMrush.
Here it is also evident that so far everything has been done correctly. And that the rankings of the domain are constantly improving. We also receive an overview of our URLs: To which keyword or at which position do they rank?
If you click on one of the keywords, you will see a list of all important data for this search query. Also a list of the top 10 results. In other words, you see your competition.
Here it is the task of our SEO audit to develop new traffic sources and evaluate the current rankings. This is the only way you can open up keyword opportunities and increase the traffic of the website in the long run. Analyze what the competitors in front of you are doing better. This can be better content, a better tool or simply better backlinks.
So as not to go beyond the scope of this article: read my post "With the keyword research to SEO success". There I show you step by step how to do keyword research based on your current rankings in order to open up new traffic sources.
What do you do when the traffic is stagnant?
Either your traffic does not develop any more because no new content is published on the website, or because your website does not increase in positioning. Also in this case I recommend a competition analysis for each keyword. Or to develop new keywords by means of a keyword research.
Was the publication of your blog posts a while ago? Then it is also recommended to refresh all content by updating it. See step 8.
What do you do when the traffic slumps?
If you don't find any indication that your website was affected by a Google update, then the following suspicion is obvious: your website has fallen victim to a technical error. Trace all your changes back, reverse-engineering style. Check whether pages were changed or deleted in the weeks before the traffic dropped. Or did you add new WordPress plugins or run updates?
Whatever the reason, you have to find the cause by process of elimination.
Search engines love fresh content! It is not without reason that it has become part of every SEO strategy to regularly adapt content on the web and supplement it with new information. And also your visitors will be pleased about fresh articles.
So the content of your website will also remain up to date. And you reduce the risk that your competitors will challenge you for first place with new content. For search engines there is nothing worse than sending a searcher to a website with outdated content. Because the task of search engines is to help the user with the best possible answer to his search query. If the user bounces from your website, it is not beneficial for the search engine. And it's not good for your user experience. In other words: it has a negative effect on your ranking.
That's why you should regularly audit your content. In a content audit, you analyze all the content on your website to determine whether it:
- Offers added value for the user
- Is up to date
- Is not a duplicate of other pages
- Shows organic traffic
- Can be combined with other pages
- And how it affects the user experience of this URL
Usually, you then create a list of all URLs and decide whether a piece of content:
- Should be deleted
- Merged with another URL
- Updated
- Or whether no manual action is required
You can easily maintain a content audit within a spreadsheet/Excel table. It looks something like this:
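As a minimal sketch of such a spreadsheet, the skeleton below generates the table as CSV with Python's standard library. The column names, the allowed actions and the sample rows are illustrative assumptions; adapt them to your own audit.

```python
import csv
import io

COLUMNS = ["URL", "Organic traffic", "Word count", "Last updated", "Action"]
ACTIONS = {"keep", "update", "merge", "delete"}

def audit_to_csv(rows):
    """Serialize content-audit rows to CSV, validating the chosen action."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=COLUMNS)
    writer.writeheader()
    for row in rows:
        if row["Action"] not in ACTIONS:
            raise ValueError(f"unknown action: {row['Action']}")
        writer.writerow(row)
    return buf.getvalue()

rows = [
    {"URL": "/blog/seo-audit", "Organic traffic": 1200, "Word count": 1800,
     "Last updated": "2019-03-01", "Action": "update"},
    {"URL": "/blog/old-news", "Organic traffic": 3, "Word count": 250,
     "Last updated": "2016-05-10", "Action": "delete"},
]
print(audit_to_csv(rows))
```

The CSV opens directly in Excel or Google Sheets, so you can keep working on the audit where it is most comfortable.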
In addition, in Google Analytics you can examine each URL individually for dwell time and bounce rate. These values indicate whether your visitors rate the content of a page as useful or not.
Go to the tab Google Analytics -> Behavior -> Site content -> Landing pages. There you will find a list of all URLs and the most-visited pages:
You want to know more about content audits? Then I recommend you the article Content Audit: Analyse existing content by Mareike Doll. It describes in detail how a content audit is organized and carried out.
Also the book think content by Miriam Löffler is very well suited for this purpose. It contains numerous tips on content marketing and the right content strategy.
Note: Do not overlook meta titles and meta descriptions!
During a content audit, many webmasters focus only on the superficially visible content of their website. They often forget to adjust their meta descriptions and meta titles as a result. These are the short texts that are later displayed in the search results. They should encourage users to click on your result.
To check the metadata of your website, call up the Google search and enter the command "site:yourdomain.de" in the search mask again:
You will then receive a list of all your indexed pages. Here you can see which meta descriptions and meta titles are too long or too short.
Tip: Always provide the content of your pages in an informative way. At the same time, make sure that your search results are as attractive and exciting as possible for your users. Meta data are no longer an active ranking factor. However, the number of people who click on your result can still passively affect your positioning. Focus on your Click-through rate (CTR).
Alternatively, use a tool like the Screaming Frog Spider. It crawls all URLs of your website and shows if meta descriptions are too long or even missing:
With a plugin like Yoast SEO, the meta information of a URL can be edited directly in the WordPress editor. The plugin also shows you whether you have respected the character limits:
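If you prefer to check these limits outside WordPress, a short standard-library script can do it per page. The length thresholds match the guideline values from the checklist earlier in this article; the sample HTML is made up.

```python
from html.parser import HTMLParser

TITLE_MAX, DESC_MAX = 70, 160  # guideline values from the checklist above

class MetaExtractor(HTMLParser):
    """Pulls <title> text and the meta description out of a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and d.get("name") == "description":
            self.description = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def meta_issues(html):
    extractor = MetaExtractor()
    extractor.feed(html)
    issues = []
    if len(extractor.title) > TITLE_MAX:
        issues.append("title too long")
    if extractor.description is None:
        issues.append("description missing")
    elif len(extractor.description) > DESC_MAX:
        issues.append("description too long")
    return issues

page = "<head><title>SEO Audit Guide</title></head>"
print(meta_issues(page))  # ['description missing']
```

Run over all crawled pages, this produces the same too-long/missing report that tools like Screaming Frog show, just in a form you can script against.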
The length of my guide shows how extensive an SEO audit can be. The good news is, you did it! If you follow all these steps, you should know the weaknesses of your website well. You are now able to fix the most important ones.
The bad news is that an SEO audit is usually much more extensive. Some topics like structured data, Google My Business or duplicate content are more complex than I could describe here. Feel free to tell me in the comments which sub-topics you would like a suitable tutorial for.
So even after an initial audit, there is still a lot of room for improvement. For some things you will have to resort to SEO professionals. They have more experience in interpreting the data to address the problem with the right approach. I still hope that I could help you with this article. And now: Good luck with your SEO audit!
What questions do you have about the SEO audit? Which tools can you recommend? Please use the comment function. You want more tips on WordPress and performance? Then follow us on Twitter, Facebook , or subscribe to our newsletter.