Guide to Doing a Complete SEO Audit in 2021


In this post I am going to discuss one of the most important and necessary elements of SEO work: the SEO Audit.

I am going to explain it to you step by step so that you understand each of the points well, and I will also recommend free and paid tools, as well as information on the price of an SEO Audit.

It is important to make it clear from the beginning that there is no single valid method for carrying out an SEO Audit, so we must keep an open mind, learn from other professionals and incorporate interesting ideas into our own methodology. SEO is constant learning and constant revision of one's own postulates and theories.

That said, I will try to provide my definition of what an SEO Audit is.

What is an SEO Audit?

An SEO Audit is the step-by-step analysis of each and every one of the SEO aspects or factors that affect the ranking of a web project. Its purpose is to make as accurate a diagnosis as possible and to detect errors, problems, difficulties and SEO opportunities, so that improvements can be executed later with the aim of ranking better.

As I mentioned above, in SEO there is no single valid way of doing things: each professional or agency has its own method of analysis. However, an SEO audit should cover all the main aspects and key factors that affect positioning.

There is a certain consensus within the SEO world about which ranking factors are most important, although on some aspects there may be different points of view, hence the different methods.

In this post I am going to walk you step by step through our SEO Audit method, the result of experience. Of course, we are open to you leaving us your point of view at the end of the post, because it will surely serve to enrich it even more.

How to do an SEO Audit step by step?

As I just mentioned, the SEO Audit must carefully analyze all the factors that affect positioning, always from the point of view of whoever performs it, as a result of their experience and their vision of SEO.

Taking this into account, and following our criteria, we can group the Audit process into 5 phases or large main blocks of analysis:

  1. Indexability / Crawling
  2. Content / Keywords / CTR
  3. Inbound Links / Domain Authority
  4. Performance / Adaptability / Usability
  5. Code and Tags

1. Indexability and Crawling

Phase 1, Indexability and Crawling, comprises the following important points that must be analyzed:

1.1. Indexability

Indexability refers to the analysis of the site URLs that are indexed, that is, those that can be served by the search engine as the result of a search or, put less formally, the pages of a website that appear in Google.

This first factor is very important. A website must always have its relevant URLs indexed for SEO, that is, those for which there is real search intent and which have been optimized with the intention of ranking for those searches.

Following this line, it is as important that the relevant URLs are indexed as it is that the non-relevant URLs are not.

By non-relevant URLs we mean all those pages of a site that do not provide relevant information for a specific search intent, that have not been optimized for SEO, that offer poor content (thin content), or whose content coincides with or duplicates other URLs on the site and adds nothing new.

By crawling we understand the process by which Google reads the content offered by the URLs of a website. Google prefers not to crawl irrelevant URLs, as this saves crawling time that can be used to crawl other, relevant URLs.

In addition, in the case of very large websites (from about 10,000 URLs upwards), Google might have trouble crawling and indexing all the URLs, due to the limited crawling budget, or Crawl Budget.

This makes it even more necessary to spend some time analyzing potential crawling and indexing issues and fixing them once you discover the reasons why URLs are or are not being crawled or indexed.

✖ Reasons why a URL might not be indexed:

  1. Noindex robots meta tag, set within the web code or, on platforms such as WordPress, through plugins such as Yoast SEO or similar. In this case, the search engine crawls the content but does not index it, following the indication of the meta tag.
  2. A Disallow rule in robots.txt. This directive in the robots file tells search engines which URLs or directories of a website should not be crawled. If they do not receive any links, URLs under Disallow may end up not being indexed.
  3. URLs with duplicate / irrelevant / poor content. Sometimes these pages might not be indexed if the search engine so decides, even though they carry no noindex or disallow indication.
  4. Orphan URLs, that is, URLs that receive no internal links, which Google may therefore skip when crawling and indexing. This does not always happen and depends on each specific case, but in general, if a URL is important to you and you want it indexed and ranked, you should link to it internally.
  5. An excess of URLs that exceeds the crawl budget.
  6. Very large and complex sites with bad internal linking that also lack a sitemap.

As you can see, even if you think all your important content is correctly indexed by Google, sometimes it is not, and analyzing and solving this should be one of the first actions addressed in the SEO Audit.

How to analyze the indexing status of a website?

You can view the URLs that are indexed or not, valid or with indexing problems, with the Coverage functionality of Search Console.

Google’s free tool is an excellent option due to its simplicity of use and, at the same time, its reliability and detail when it comes to detecting indexing problems and assessing possible causes in order to solve them.

Another option to see the indexed URLs of a site is to enter the command site:domain in Google. This way you can see the (approximate) number of URLs that the domain has indexed in Google, and you can also see how the SERPs, or search result boxes, are shown to users.

Another excellent tool (although paid) for analyzing the status of your URLs is Screaming Frog. With it you will not only be able to see whether the URLs are indexable, but also obtain a lot of very valuable and detailed technical information about each URL of a project.

Remember

Do not forget that a website should not have all its content indexed in search engines, only the content that responds to a search intent and is optimized for SEO, avoiding problems of duplication, thin content, poor optimization, etc.

1.2. Robots

As I mentioned above, the robots.txt file is used to give orders to search engines about which parts of the site should or should not be crawled. It does this using the Disallow directive, followed by a relative URL of the type /directory or /url.

You can also add the Allow directive to allow access to certain zones or URLs as an exception to a Disallow.

In the robots file you can also define the exact location of the sitemap to facilitate the work of search engines.
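
As an illustration, here is a minimal robots.txt sketch; the blocked directory and the domain are hypothetical examples typical of a WordPress site, not a recommendation for every project:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap_index.xml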

In the SEO Audit you must check that a robots file exists and that it is correctly configured, depending on the areas of the site that must be blocked or allowed for crawling, according to the needs of the project in terms of indexing, crawl budget, pagination, duplicate content, etc.

How to view and parse the robots.txt file?

You can see the robots file of any web page by typing the URL domain/robots.txt. It is a file that is visible to the search engine and also to users, so you can easily analyze it even if you don't have access to the project's backend.

1.3. Sitemap

The sitemap XML file must contain a list of the site's indexable URLs to facilitate Google's crawling and indexing, avoiding indexing errors and speeding up crawl time, which can benefit the accessibility of your project for users and search engines and optimize your crawl budget.

Therefore, it is recommended that all websites, especially the more complex ones with a large number of URLs, have this file available and added via Search Console so that it can be read by search engines.

The sitemap should contain the relevant URLs that you want to index, not those that are irrelevant or do not offer content for a specific search intention.

There are several ways to organize the index of the sitemap for search engines: it can be classified by types of content (pages, posts, products, etc.), listed by dates, by areas of the web, by priority or most recent URLs, etc.

The important thing is that it is understandable for search engines and that it does not have errors in the list of URLs that it shows, that is, that the relevant URLs are always there and that the irrelevant ones do not sneak in.

How to view and parse the sitemap file?

You can usually see the main sitemap file at the URL domain/sitemap_index.xml. In this location you will find the index of all the sitemaps that exist for the domain.

In the sketch below, each of these secondary sitemaps links to a list of URLs of that type (one links to the list of posts, another links to the list of pages).
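
This is roughly what a sitemap index looks like; it is a minimal sketch, with a hypothetical domain and file names following the conventions of plugins like Yoast SEO:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/post-sitemap.xml</loc></sitemap>
  <sitemap><loc>https://example.com/page-sitemap.xml</loc></sitemap>
</sitemapindex>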

If, instead of checking the existence of these files by hand, you prefer to do it through an application that gives you a simple and fast report, you can use the free SEOptimer tool, which allows you to check in a moment whether a website has the sitemap available, as well as the robots file and many other basic elements for the SEO of a website.

2. Content / Keywords / CTR

Phase 2 comprises the following key aspects of analysis:

2.1. Content

Content is a key factor in SEO and must be analyzed thoroughly to detect potential problems and optimize it to the maximum.

Some of the most important SEO aspects related to content quality that must be analyzed are the Value it provides, Retention, Permanence and User Loyalty, Scannability, Duplicate text and Cannibalization between URLs.

In addition, in this block we will also analyze the Keyword strategy, Architecture, Internal Links, 404 Errors, Broken Links, Redirects, URLs and Image optimization.

  • Value

The value that the content brings to the user, that is, the level of satisfaction the user obtains when consuming the content to meet the specific need expressed through their search.

When the user perceives the content as valuable, they usually perform actions that are positive for the SEO of the site, such as placing links in a natural way (essential for SEO), staying on the page and visiting other URLs, sharing on social networks, adding comments, recommending it to other users, etc.

It is a somewhat subjective factor, since there is no tool that directly measures the value of a URL, but you can measure factors such as dwell time, bounce, links, social shares, positive comments from users, etc.

Remember

Put yourself in the shoes of your user and think about whether the content you have created is the best you could possibly offer or whether, on the contrary, it is just one more piece of content that does not stand out from the others. If you are going to target a keyword, and especially if that keyword is very competitive, try to create the best content out there.

  • Retention, permanence and loyalty

The degree of interest and satisfaction that the content can generate in the user, making them stay on the page as long as possible and consider the content ideal.

The more value you bring to the user, the more likely they are to stay on your page and even come back later, generating more recurring visits in the future and preventing the bounce rate from increasing.

To analyze the degree of optimization of these factors, you have Google Analytics, which will provide you with exact data on how users interact with your content (dwell time, bounce, etc.).

  • Scannability

Scannability is the ease with which the content can be read and understood by the user in a first quick reading, prior to a later, more detailed reading. It is important that in that first quick pass the user can see and grasp the essential, valuable content.

If we give hierarchical texts and elements greater size and visibility compared to the rest, in that first reading the user will be able to understand the content and assess whether it is valuable enough for them to stay and read it.

In those visible, large elements you should place messages that interest the user and push them to decide to stay with your content. In other words, it is not only about adding value, but about reinforcing the appearance of value so that the value is evident to the user.

  • Duplicate

The content should not be literally duplicated, nor coincide in high proportion with content external to the web or with other internal URLs of the web itself.

Google considers duplicate content undesirable: it prefers not to index the same content under many URLs, so that each one contributes value by itself. This way it avoids showing users the same content under different URLs, and avoids crawling URLs with very similar content, saving extra crawling effort.

Duplicate content is sometimes unintentional; that is, it is not always the consequence of having consciously created similar or identical content. It can also be due to:

  • Pages that group URLs by taxonomic or thematic criteria (category pages, tags, etc.)
  • Automatic pagination sequences generated when a list of items does not fit on a single page
  • URLs with parameters of any type that show the same content as the original URL and are not redirected
  • Not having the different versions of the domain redirected (domains that serve the same content with and without www, or over http and https because they are not redirected to the secure version, etc.)

How to fix duplicate content?

  • Generating different content (in all those URLs that are a real positioning objective and target different keywords)
  • De-indexing duplicate URLs if they are not needed to rank, since they usually target the same keyword (cannibalization)
  • Using the canonical tag to tell the search engine which main URL it should index and treat as the priority among a set of similar URLs (see the sketch after this list)
  • Using the rel="next" and rel="prev" attributes in paginated URLs
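
For reference, the canonical tag is placed in the <head> of the secondary URL and points to the main one; the URL here is hypothetical:

<link rel="canonical" href="https://example.com/main-url/" />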

With the free Siteliner tool you can easily analyze if there is internal duplicate content on the website.

  • Cannibalization

It is important that no more than one URL targets the same search intent on the site. That is, each search intent or keyword should be worked on a different URL of the site, so that Google understands well which URL serves each search intent and you can rank better.

In addition, if you concentrate a whole global search intent (a keyword and its related terms) in a single URL, all the authority, links, social shares, traffic, etc. accumulate in that single URL instead of being dispersed across several, which makes it easier to achieve your positioning objective.

In this article about Search Console I explain how to detect cannibalization between the URLs of your website, using the Performance functionality.

  • Content optimization (tags, density, etc.)

In this section we analyze whether the content follows the correct structure of heading (h) tags, that is, whether it is structured in a logical format: an h1 tag for the main title, h2 tags for the main subtitles or headings, h3 for subheadings, and so on.

The assumption is that in these heading tags we will place a good share of the relevant words of the content (keywords and related terms), which helps Google understand that those words are important.
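
To illustrate, a typical heading hierarchy could look like this (the texts are placeholders):

<h1>Main title containing the keyword</h1>
<h2>First main section</h2>
<h3>Subsection of the first section</h3>
<h2>Second main section</h2>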

The <title> tag is also extremely important. It usually contains the title (and the keyword) and usually coincides with the h1 title, although not always. This is the most important tag of all in terms of relevance for Google. Also, what you put in title then appears as the title in the Google search results (SERP) box.

Organizing content this way is good not only for correct understanding and scannability by the user, but also for Google, since the heading tags and the texts you include within them are relevant to the search engine.

In addition, we must analyze keyword density within the content, that is, the total number of times the key expression appears compared to the total number of words in the content. Keyword density should not be very high and should tend towards naturalness, to avoid falling into Keyword Stuffing.
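
As a quick reference: density = (keyword occurrences / total words) × 100. For example, a keyword that appears 8 times in an 800-word article has a density of (8 / 800) × 100 = 1%, a proportion that tends to read naturally.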

Apart from these, there are other tags you can add to your content to optimize SEO and improve how it is displayed on sites other than your own, to increase clicks and traffic. These are the social graph tags (Open Graph and Twitter Cards), which control how your content appears on social networks (images, titles, descriptions, etc.). You can add them with the Yoast SEO plugin.

You can also add rich content tags, or Rich Snippets, from Schema.org, which enhance the appearance of your SERPs on Google with additional content, which can increase your CTR (Click Through Rate). I will talk about this factor more fully in another section.
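
As an orientation, structured data can be added as JSON-LD inside the page's <head>; this is a minimal sketch with placeholder values, not a complete markup:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Guide to Doing a Complete SEO Audit",
  "author": { "@type": "Person", "name": "Author Name" },
  "datePublished": "2021-01-01"
}
</script>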

Be careful: don't forget that Google can only read content written in text format. Texts painted on images, multimedia elements such as video or audio, Slideshare presentations and other elements of this type will not be read by Google along with the rest of the content on the page. For this reason, always consider transcribing them into normal text readable by Google.

A free and very easy-to-use tool to analyze these types of tags and many other important elements of your site is SEOptimer.

Important

Gone are the times when, to rank for a keyword, you had to repeat it over and over within the content. Google values naturalness more every day. Write naturally, without thinking too much about repeating the keyword and without forcing anything; use lexical richness and synonyms. In short: write for the user and you will be writing for Google.

  • Architecture

Web architecture is a very important factor, and one that does not always work correctly: the way a website structures its sections and groups its URLs, the depth of the directories in which the URLs sit, how the information on the site is hierarchized, how the URLs are internally linked to each other, etc.

How does web architecture affect SEO? Its impact is studied on three levels: first, it influences the user's understanding of and ease of navigation through the web (essential for conversion); second, the way search engines crawl and understand the site; and finally, how the internal authority of the web is distributed among its various URLs.

Web architecture is worked not only at a structural level, but also at a semantic level. That is, the keyword research and the key terms that are going to be worked on the site can determine its structure and organization to a certain extent.

When you analyze the web architecture in the SEO Audit, make sure that the site is well structured according to usability and semantic criteria.

Important sections, either because there is a broad search intent behind them or because they are a priority positioning objective, are usually placed in main menus and upper areas, tend to be heavily linked internally, and are usually only a few click levels away from the home page (and not very deep in the directory tree).

It is recommended that relevant information for SEO be no more than two or three click levels from the main domain; that is, do not put the relevant contents in URLs of the type domain.com/directory/directory/directory/content-slug. This favors ease of crawling and the flow of internal link juice.

A typical well-organized horizontal architecture has a maximum of 3 navigation levels. (Be careful with creating excessively flat structures where all or almost all the URLs hang directly from the first level, as you would be dispersing the internal authority among the URLs without ranking them by importance, and offering navigation that is poorly structured and difficult for the user to understand.)

By contrast, a vertical architecture is less organized, with too many navigation levels and even the odd orphan URL (not linked internally). This structure makes it difficult to crawl and to deliver link juice to URLs that sit at too-deep levels.

Often, websites created without taking SEO into account have been built following somewhat arbitrary or capricious criteria, according to the will or personal taste of the client or the developer. This is common when a project is developed without an SEO professional involved from the beginning to make suggestions or set clear architectural guidelines.

When this happens, it is necessary to carry out a global review of the web with SEO criteria and restructure all the content and sections. Therefore, in any web development whose objective is to attract organic traffic, the indications and recommendations of the SEO professional should be followed from the beginning, to avoid architectural problems that have to be corrected afterwards, with the consequent costs in money and time.

You can easily analyze the web architecture and its directory structure with Screaming Frog, using the View > Directory Tree Graph functionality.

  • Internal links

Internal linking is an important SEO factor and must also be analyzed in the audit as an essential aspect of good positioning. Internal links convey authority between the internal URLs of the site.

It is advisable to promote internal linking to give strength to the areas you want to be more relevant within the site, sending link juice to the URLs you want to reinforce.

As we have seen above, internal linking is also part of the web architecture and helps to organize and distribute the internal authority of the web, as well as to facilitate crawling by the search engine.

In addition, it helps lower the bounce rate by recommending other URLs of the site for the user to visit.

To analyze the internal linking of a website you can use tools such as Search Console (free) or Ahrefs (paid).

  • 404 pages

In the audit you must analyze the 404 errors that your website has. But, contrary to what some believe, 404 errors are not penalized by Google, nor are they generally as troublesome as you might think.

What is a 404 error? It occurs when a URL of the site no longer exists, or is incorrectly written in an internal link or badly linked from another website, so that instead of content, what is shown is a 404 status page, which you can customize, as some websites do.

The reasons you have a 404 error may be: you have deleted or changed a URL, it has been incorrectly linked externally from another site, or sometimes a plugin or application installed on your website generates URLs dynamically, and those URLs disappear when you uninstall it.

The main problem with 404 errors arises when relevant URLs of the site cannot be seen by users, such as important site content, product or service pages, landing pages, etc. In this case, the user may get frustrated at not being able to see the content and, as a consequence, leave the site without buying your product or hiring your service, without sharing, linking, commenting or recommending. Ultimately, if there is no content at the URL, there is no user interaction, conversion or benefit for your brand.

Another possible problem, in the case of websites with many URLs, is having many internal links to pages with a 404 error, which forces the search engine to consume crawl budget unnecessarily. In other words, crawl time is wasted following internal links that lead to error pages.

How to fix 404 errors?

Simply set up 301 redirects from the error URLs to the desired site URLs; generally towards equivalent or similar URLs or, failing that, to the home page.

You can make redirects easily with the free Simple 301 Redirects plugin.

It is important to note that it is not necessary to redirect all 404s, only the relevant ones, that is, those generated by pages of the site that previously existed and that had a certain page authority, links, shares, traffic, etc.

You can monitor 404 errors with tools like Screaming Frog and Ahrefs (paid) or Search Console (free). To customize your 404 pages, like the fun examples in the link above, you can use the free 404page plugin.

  • Broken links

Outgoing broken links (links from the site pointing to another site that do not work, because they were mistyped or because the destination web or URL no longer exists) are negative for the user (who cannot access the link, which gives a careless image) and for Google (since they convey the idea that the site is poorly optimized and of poor quality).

Also, if there are a lot of broken links, the search engine wastes time crawling links that lead nowhere, and this can affect your crawl budget.

Linking to relevant sites with a similar theme is positive for Google, because it helps it crawl and also adds value to the user; therefore it is a factor you must take care of and, of course, analyze in the audit.

To detect broken links on a website, you can use the Screaming Frog tool . You also have a free tool called Broken Link Check that works very well for doing this analysis.

  • Redirects

The most typical redirects are 301 (permanent) and 302 (temporary). A redirect is used to send user traffic from an old URL that no longer exists to a new one. The redirect, if permanent, also transfers authority from the old URL to the new one.

In the SEO Audit we analyze the redirects that the web has implemented, both at the domain level (complete redirections from an old domain to a new one) and for specific URLs (pages or posts that no longer exist, products, etc.).

When we use redirects we can slow down the loading of the web by forcing the browser to load another URL, or even several in the case of chained redirects.

Obviously, the impact on loading time is only relevant for SEO if there are many redirects or specific technical problems in those redirects.

It is advisable to implement redirects in the htaccess file on the server, since these consume fewer resources than other types of redirects based on JavaScript, HTML, etc.
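
For reference, a permanent redirect in the .htaccess file of an Apache server can be as simple as the following line; the paths and domain are hypothetical:

Redirect 301 /old-url/ https://example.com/new-url/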

Another important aspect of redirects is that they are made to the correct URLs, bearing in mind that with redirects you transfer authority from one to the other.

Redirects can be analyzed with the Screaming Frog tool. You also have free alternatives such as the Link Redirect Trace extension for Chrome, or the HTTP Status application, which returns all the status codes of a specific URL.

  • URLs

In the URLs section we will analyze whether the URLs are friendly and include the keyword of each page we want to rank.

Every URL of every page on the site has a slug. The slug is the part of the URL that follows the domain: in the URL example.com/slug, the slug is the /slug part.

For ranking purposes it is recommended that the slug be friendly, that is, that it include the keyword, word or key expression around which you want to rank each page of the site (separating the terms with hyphens), and that it not include strange characters, numbers, or stop words such as articles, conjunctions and prepositions*. It is also recommended that slugs tend to be short.

* This aspect is quite debatable, because sometimes the inclusion or not of a preposition can change the meaning of the search.

You can easily analyze all the URLs on a website with Screaming Frog. You can also use the site: command, or footprint, followed by the domain to see, one by one, all the indexed SERPs of the site in Google, including the URLs. For example: site:aulacm.com.

  • Images

How do you know if the images of a site are optimized for SEO? Images are optimized, on the one hand, in terms of size and weight, and on the other, in terms of their semantics.

The first optimization seeks to reduce the weight of the image by removing resolution that is unnecessary for on-screen viewing. A medium-low pixel density is usually sufficient. Size also matters: if an image is going to be displayed on your page at 300 × 300 pixels, upload it at that exact size so that it weighs less.

Regarding the second aspect, the image needs to include text data that the search engine can read, since to this day texts drawn inside the image are still not taken into account by Google.

The text fields you can include in images are the title, the alternative text (alt) and the description. The first two are the relevant ones, and it is recommended that you include a description of the image along with the keyword in them. The description field is used to display a short text in the search results box next to the image.

Another aspect that helps is the name of the image file . If this name includes the keyword next to the file extension, you are providing more data to the search engine and therefore helping it to be positioned based on that keyword.

The context of the image can also help , that is, the text around the image within the content. It is easier to position an image that is inserted within a text related to its theme than an image URL without anything else.

As with many other factors, the Screaming Frog tool can be used to analyze how optimized the images are for SEO, that is, whether they include the text tags necessary for ranking.

To see if images are optimized for weight and size, you can use GTMetrix, which provides endless performance data for the elements on a web page.

2.2. Keywords

What keywords does the project currently rank for, or try to rank for? Are they the right ones given the company's objectives and the current possibilities of the project, or should it be targeting others? What keywords does the competition rank for?

Focusing SEO actions around the right keywords is crucial. If you are trying to rank for the wrong keywords (irrelevant to your business, too difficult to rank for, or with negligible search volumes), you are probably wasting time and resources for nothing.

Therefore, it is of paramount importance that, before developing your content plan and even before structuring the architecture of your website, you do thorough Keyword Research to obtain the right keywords to position your project.

From this, you must develop your content strategy for the future and organize your website with main sections and articles based on the correct keywords.

A web project that has not done keyword research beforehand may not rank for any relevant terms, or may do so only accidentally, which denotes a lack of strategy and will possibly yield few results.

Writing content without taking into account the search intents of users who could potentially be interested in your articles, products or services is a waste of time and possibly a guarantee of failure in the short and long term.

As I said in the content optimization part, once you have found your keywords you must distribute them among the URLs of your site following criteria that are logical for the user and for the SEO architecture, always keeping in mind that each indexed URL of the site must target a separate keyword or search intent, to avoid cannibalization.

How to optimize your keyword within the content to improve SEO?

  1. Write naturally; do not force repetitive use of the keyword
  2. Even so, maintain a certain keyword density, and place the keyword in relevant places on the page (titles, h headings, alt texts, anchor texts, first paragraphs, etc.)
  3. Write a minimum amount of content (say, 300 words), and if you can write more, do it. The key is to satisfy the user's demand for information, not simply to reach a predetermined word count
  4. Use synonyms of the keyword throughout the content: although they look like different keywords, they actually respond to the same search intent and probably also have certain search volumes, because even though they are searched less than the main keyword, some users do use them.

In this article you have a lot of free tools for keyword research, such as Keyword Planner and many others. With the free Yoast SEO plugin you can measure many aspects of the content, including the presence and overall density of the keyword in the URL. With Keyword Density Checker you can also measure its density within the content, that is, the percentage of times the keyword appears compared to the rest of the words in the content.

In summary

The analysis of all these factors is open to some interpretation, given the undeniable subjectivity involved in gauging the value a piece of content offers when you are not its target user.

Despite this, there are objective elements of analysis that can serve as guidance: the average time users stay on the page, the bounce rate, the natural links and social shares it has generated, the number of comments and mentions, etc. Good content earns you links organically, without having to buy or request them; it lets you go viral on social networks; it increases the time users stay on your website; it increases user interaction; and, very importantly, it can help you increase conversion rates.

This double objective / subjective dimension must be applied critically when assessing whether the content offered by a web page is really fulfilling its objectives: to attract , retain and satisfy users.

2.3. CTR

The CTR (Click Through Rate), the proportion of clicks you get on Google with respect to the total number of times your result is shown (impressions), is a very important factor that must of course be analyzed in the SEO Audit, since it has a crucial impact on positioning.
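
In other words: CTR = (clicks / impressions) × 100. For example, a result shown 1,000 times that receives 50 clicks has a CTR of 5%.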

If you detect that your results ( SERPs ) have a low CTR, you should ask yourself if they are attractive and explanatory enough to attract the attention of users and receive clicks.

There are studies that estimate how much CTR is reasonable on average for each position, although this obviously varies depending on the sector (existing competitors, user behavior habits, etc.) and the types of results Google shows for the query.

Nowadays Google shows featured results that push down the first organic results that used to appear at the top. There are also results in map format, rich snippets, etc., which in some cases change the rules of the game considerably.

Suppose that, in your sector, the norm is for the third result to achieve an average CTR of around 15%. In that case, you must analyze whether your results in that position are receiving an adequate or minimally acceptable CTR. Since CTR is a key factor, it pays to think about and develop strategies to optimize it as much as possible.

How to improve the CTR of your results (SERPs) in Google?

  1. Place very attractive titles with words of value and calls to action for the user, using all the available length
  2. Write attractive and explanatory meta descriptions about the value of the content.
  3. Use prominent and attractive visual elements in your titles and meta descriptions to catch the user's attention, such as emojis, special characters, etc.
  4. Use enriched elements (rich snippets) whenever you can, to provide more specific and specialized content to the user and also stand out visually (recipe snippets, events, articles, reviews, star ratings, etc.)
  5. Remove the dates if your content is evergreen or timeless, to prevent the user from thinking it is obsolete and choosing not to click on you.
  6. Use copywriting techniques that work and have been tested by other professionals.
  7. Test, analyze, observe, and try again and again, changing your own SERPs until you find the most suitable formula for the particularities of your sector.
  8. Look at the SERPs of the competition and be inspired by what they do well (careful, I did not say copy it as is) and what they do badly (so as not to repeat it).
  9. Arouse real interest with the texts of your SERPs: try to respond very clearly to a specific question or problem, and be forceful and clear when expressing the value of the article (without overselling it).
  10. Offer valuable content that the user perceives as valuable; do not give a too-commercial touch to content that is meant to provide pure value.
  11. Work on your brand image globally: create good content, be active on social networks, and respond to blog comments, to generate a positive perception of your brand that users remember when deciding which result to click on.

With the Search Console tool you can easily analyze the CTR of all the URLs on your website. In this post about the new version of Search Console you have a step-by-step tutorial to monitor and optimize your CTR through the Performance functionality, with which you can keep track not only of your global average CTR, but also of each URL separately.

Above all, it can be very profitable to optimize the CTR of URLs that are potentially good but have barely-worked SERPs, or of URLs that are already relatively well positioned (first or second page of Google) and are nevertheless getting fewer clicks than expected, especially URLs that rank for keywords with many searches (impressions).

In the example, there is a page sitting between positions 7 and 8 (7.5) that has a CTR of 1.5% (only 22 clicks) out of a total of 1,502 impressions.

In other words, optimizing the SERP of that URL for the keyword "how wordpress works" can achieve good results: positions will likely rise, which will further improve the CTR for a keyword that can potentially get many clicks.

3. Inbound Links / Domain Authority

In phase 3 of the audit we are going to analyze inbound links (backlinks) and domain authority:

3.1. Inbound links

Inbound links or backlinks (links from other websites to yours) have always been a crucial SEO factor, and they still are.

Since Google was born, the quality and quantity of the inbound links a domain has have been among the most relevant and direct factors Google uses to determine its ranking. Links transfer popularity from one domain to another, or from one URL to another.

Therefore, it is important to verify that your domain has a good number of inbound links with a certain authority (see the next section on authority), with a natural appearance, mostly of the follow* type (those that transfer popularity) though not exclusively, and that overall they make up a healthy link profile with references from as many relevant sites related to your topic as possible.

* A follow link is taken into account by Google for the transfer of popularity. A nofollow link, on the other hand, transfers little or no authority. You can see whether your links are follow or nofollow with Ahrefs, filtering by link type, or by hand directly on the link, with the Inspect functionality of Google Chrome:
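
This is what a nofollow link looks like in the code; the URL and anchor text are placeholders:

<a href="https://example.com/" rel="nofollow">anchor text</a>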

As you can see in this example, the value nofollow has been placed inside the rel attribute, so that Google does not follow this link from the source site to the destination.

For all these reasons, it is important to monitor the domain's inbound link profile, to evaluate the quality of the links as well as their health or appearance of naturalness. A toxic or artificial-looking inbound link profile could be penalized by Google Penguin and hurt your SEO.

Sometimes, especially if it has not been monitored before, the inbound link profile of a domain can be full of undesirable links, whether due to random causes, external attacks, or the owner's own purchases of inappropriate links. Google says: keep your links natural. This does not necessarily mean Google knows you have bought links, but it certainly does analyze your links to see whether they appear natural or not.

Here is a very useful post with everything there is to know about links in SEO.

Above all, you should have a certain diversity of follow and nofollow links, preferably from websites with a theme similar to yours and in the same language as your site, and acquired over time rather than all at once, as this type of link profile is the most natural.

You can analyze your inbound link profile with Search Console (free) or Ahrefs (paid). Another free link monitoring tool is Ranksignals .

3.2. Domain authority

The authority, relevance or popularity of a domain can be analyzed from different points of view and with different tools. In general, an authoritative domain is one that has good links, signs of good SEO, and some age.

To evaluate the relevance of a domain, Google has always relied on its own metric, PageRank. Today it is impossible to know this value, so other metrics and criteria are used to estimate the authority of a domain.

You can measure your domain with Moz's Domain Authority, a fairly reliable metric for gauging the health and quality of a domain.

You can also use paid tools such as Ahrefs, whose metric, Domain Rating or DR, is also quite reliable for estimating the quality of a domain.

However, you should bear in mind that these metrics come from tools external to Google and are still estimates based on a series of criteria. Do not take these results as absolute, but rather as an approximate indication of the current SEO status of the analyzed domain and its evolution over time.

4. Performance / Adaptability / Usability

Phase 4 analyzes the performance and speed of the web (also called WPO, Web Performance Optimization), how well it adapts responsively to different devices, and its degree of usability or ease of use.

4.1. WPO performance

In the SEO Audit we also analyze the performance of the web page, that is, its loading speed. This is an important factor nowadays that must be measured and optimized, as Google has made it clear that websites have to be fast and load without problems.

There is no maximum load time defined by Google, so the rule of thumb is: try to make the web load as fast as possible, identifying which of the elements weighing down speed are essential, and which are not really necessary and could be eliminated to gain speed.

There are numerous factors that can affect the loading speed of your web page, which I will list and explain below:

What are the main factors that influence the loading speed of a website?

  1. Hosting. The server where you host your web page is one of the main factors that determine a fast or slow load. I recommend hosting your website on hosting specialized for your system or CMS, with good performance and security, as well as good support. Changing from a bad or mediocre hosting to a good one is one of the simplest actions you can take to improve the loading speed of a website, and one you can recommend to your client before carrying out other optimizations on their project. Good hosting includes advanced features such as Gzip compression, Keep-Alive, advanced cache systems, etc., which are excellent for improving performance. Here I leave you a comparison of hostings so that you can choose the most suitable for your project. In this other link you have an in-depth analysis of the best hosting for WordPress, in case your website is built with this popular CMS. With this tool you can see which hosting a web page is hosted on: https://www.whoishostingthis.com.
  2. Template and plugins. If you are using a content manager or CMS like WordPress or similar, the template or theme you use also matters, as do the plugins or modules you add to the web for extra functionality. A slow template will make speed optimization quite difficult, as will using very heavy plugins or installing too many of them.
  3. Images. Another aspect that is simple to detect and solve. Sometimes a website is slow because its images are not optimized in size and weight, and optimizing them is one of the essential actions every web developer must perform. Do not upload images larger or at higher resolution than necessary. You have free plugins like EWWW Optimizer or WP Smush that work very well for reducing weight. Regarding size (width and height), never upload images without first passing them through Photoshop or similar tools and leaving them at the exact size. If the problem is that your WordPress generates incorrect sizes automatically, you can use this plugin to regenerate the automatic size versions created when uploading images: https://es.wordpress.org/plugins/regenerate-thumbnails/.
  4. Cache. Installing a cache system is one of the most recommended ways to improve the performance and response speed of a website. For this, there are plugins like WP Rocket (paid) or WP Fastest Cache (free), which do real wonders for web performance. A cached web serves its static elements more quickly because they are already preloaded.
  5. Use a CDN. A CDN improves the delivery of your website's data to the user thanks to data centers spread over different geographical locations and all kinds of improvements to increase loading speed and the fluidity of the server's response.
  6. Minimize the code. Reduce the amount of code on your website by cleaning up unnecessary code, and reduce the space it occupies by minifying it and concatenating lines of code to speed up its reading. WP Rocket works very well for this; since it is paid, here is a free alternative: https://wordpress.org/plugins/bwp-minify/.
  7. Asynchronous loading. Loading certain elements of the page in a deferred manner avoids a bottleneck in the header of your website caused by an excessive accumulation of requests for Javascript and other types of files. The elements load slightly after the initial load, without the user perceiving it, improving the initial loading speed. You can implement asynchronous loading with WP Rocket.
  8. Optimize the database. If you regularly clean out unusable and obsolete tables and content from the web database, you will speed up the queries made from the web and thus improve its speed. When you uninstall plugins, tables sometimes remain in the database, as they do when you make revisions or save your content. With this plugin you can easily optimize the size of your database: https://es.wordpress.org/plugins/optimize-database/.
  9. Lazy loading of images and videos (Lazy Load). Make images and videos load progressively as the page is scrolled, rather than all at once, avoiding overloading the initial load with too many multimedia elements. You can implement lazy load with WP Rocket (paid) or with this other free plugin: https://es.wordpress.org/plugins/lazy-load/.
  10. Calls to external services. Avoid installing too many elements on your website, especially if many of them are superfluous or not really important. Keep in mind that you slow down your website every time you insert content from social networks, videos (careful, I am not telling you not to embed them, but rather to control the amount), plugins to connect with external tools or applications, code scripts to insert functionalities on your site, etc.
  11. Review and update the website periodically. Perform optimization and update tasks on your CMS, template and plugins from time to time, so that everything works perfectly, avoiding slowdowns, security problems and incompatibilities between the various components of your website, all of which can sometimes prevent your website from loading smoothly.

In the link that I leave below you have a lot of tips to optimize WordPress and make it work super fast. In this other link you have 4 real cases of WPO optimization.

You can measure the loading speed of a web page with free and very complete tools such as GTMetrix, Pingdom Speed Test or PageSpeed Insights.

4.2. Responsive adaptability

Google analyzes whether websites are responsive as a ranking factor, and has done so for a few years, so the SEO Audit must also assess how adaptable or responsive the website is.

In the SEO Audit we analyze whether the web adapts correctly to each screen size and device (desktop, tablets, smartphones, etc.), not only because Google takes it into account, but also for the sake of the user.

Don't forget: many SEO factors are optimized not only to meet Google's guidelines, but also for the user, because the user, through the way they interact with your website (dwell time, bounce, CTR, etc.), sends Google signals about the suitability and value of that content, and this influences SEO because Google uses these user signals to rank.

If the website is not responsive, or only partially, some users on certain devices may not have a satisfactory user experience, which could affect the time they remain on the web and the actions they take within it: interacting with the sections, generating conversions, etc. We do not only optimize the responsive versions of the web to please Google: we also do it for the user.

Therefore, it is not only about having a responsive website (nowadays, any WordPress template or similar platform already comes with responsive versions implemented), but about how well adjusted and optimized those responsive versions are in detail.

Sometimes you have to make adjustments, with CSS or plugins, to the template you have used to build your website, so that the responsive versions are perfect.

How to analyze if a website is responsive?

You can do two things: one, check the web directly on real devices, observing precisely how the content behaves on each screen size; or two, check it with desktop applications that simulate various screen sizes and models of smartphones, tablets, laptops, etc.

You can check how a web page looks on various devices and screen sizes with the free Screenfly tool. You can also use Google Chrome's Inspect command, which lets you preview how your website would look on a wide range of devices.

To do this, right-click on any area of your website, click on Inspect, and then click on the mobile devices icon of the panel that opens.

4.3. Usability and User Experience

The third point of block 4 of the SEO Audit focuses on usability and user experience, which, although they seem the same, are not exactly the same concept.

What is usability?

Usability is the ability of a website (or any application) to be correctly understood and used by the user.

In other words, a usable website is one whose functionalities the user can use as easily as possible, without serious problems of understanding, operation or navigation through the different areas of the site. A usable website is intuitive and accessible to the user.

A usable website:

  • Does not require too many explanations of use and navigation for the user to move through the menus and find what they need without problems.
  • Has texts in an easily legible size, font and color.
  • Allows the user to understand the objective or purpose of the website, as it offers understandable and coherent messages.
  • Loads smoothly and does not show interrupting or excessively invasive elements.
  • Can be navigated and understood the same on any device or screen size, be it desktop, tablet or smartphone.
  • Allows a satisfactory user experience, which benefits SEO, because a satisfied user tends to stay longer on the page and interact more.

An unusable website, on the contrary, is hard to understand, unintuitive and inaccessible to the user:

  • It can work abnormally or unexpectedly, load slowly, or make it difficult for the user to find the sections they want and to interact correctly with menus, forms, buttons and other applications on the web.
  • It makes it difficult for the user to understand the objective or purpose of the website, since its messages are contradictory or poorly explained.
  • It displays elements that interrupt navigation or annoy the user.
  • It has poorly legible texts: small sizes, hard-to-read fonts, colors poorly contrasted with the background, insufficient line spacing, etc.
  • It does not behave correctly on all devices and screens.

What is the user experience?

The user experience refers to the degree of satisfaction or dissatisfaction that the above usability factors generate in the user: if the site is usable, prompting them to stay on the site, interact, perform actions that are positive for SEO, and convert; or quite the opposite if the site is not usable.

Therefore, if usability refers to how well the website facilitates its own use by the user, the user experience refers to the positive or negative impact the website has had on the user based on all those factors.

Consequently, a positive user experience is beneficial not only for your brand image and your sales or conversions, but also for SEO, since a satisfied user, one who has managed to appreciate or understand the objective or value of the web, usually generates better signals that affect SEO:

  • More dwell time
  • Lower bounce rate
  • Higher CTR
  • More visits to internal pages
  • More shares or recommendations
  • More form submissions
  • More natural links
  • More conversions (leads, sales, etc.)

The usability and user experience derived from the level of usability can be measured from many perspectives and with different tools:

You can monitor the exact behavior of users on the page with a tool like Hotjar: where they click most often, their entire recorded sessions, etc.

More tools: you can check the color contrast of the texts with CheckMyColours, and run a general test of various aspects of the web's coding with the W3C Validator.

You can detect mobile usability issues with Search Console. You can measure the time it takes users to perform certain actions on the site with Usabilla. You can measure load times with GTMetrix.

You can check how visually impaired people see your website with Vischeck. You can calculate the ease of reading and understanding of the texts on the web with Word Count Tool, or the readability index with Juicy Studio. You can even collect user feedback on your website with UserVoice.

The goal of any measurement and analysis is always the same: that the website offers the most satisfactory user experience possible.

5. Code and Tags

Phase 5 of the analysis covers all technical aspects at the code level: specific coding problems in the different languages, or technical implementations that may be hurting or helping the SEO of the site.

5.1. Code and tags

In this section we analyze the project from a technical point of view. We are going to scrutinize the output code of the web (the code Google reads on the public front end), basically the HTML, Javascript and CSS.

We will start with the most common language of all websites, HTML. HTML generates the visual structure of the web page and also uses meta tags, and specific attributes within those tags, to send information to search engines and browsers.

What are the HTML tags and attributes that influence SEO?

  • title
<title> </title>

The most relevant tag of all, and one that must contain the keyword no matter what, because it is also the title shown in the Google results snippet for the URL. It sits at the top of the page's code, and you can see its content by hovering over the browser tab for a moment. It usually matches the h1, but it does not have to. The title lives in the head of the document to be read by the search engine rather than displayed in the visible page (do not confuse it with the meta title tag, which is no longer useful).
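
A minimal sketch of what an optimized title could look like, assuming a hypothetical page targeting the keyword "SEO audit" (the wording is purely illustrative):

<head>
  <!-- Illustrative example: keyword at the front, within roughly 60 characters -->
  <title>SEO Audit: How to Audit a Website Step by Step</title>
</head>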

  • meta description
<meta name = "description" content = "">

This tag defines the description that appears in the Google results snippet, under the title. It carries no direct SEO weight (putting the keyword in the meta description will not directly reinforce your ranking), but it does influence the CTR (Click Through Rate), so you should optimize it for every URL of the site that you intend to index. Optimizing it means filling the roughly 155 characters available with an explicit, attractive and persuasive description of the content, using rich and visually prominent elements. It is about winning the click against your competitors.
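
Continuing the same hypothetical SEO-audit page, a sketch of an optimized meta description (the text is invented for illustration):

<!-- Illustrative example: explicit and persuasive, within roughly 155 characters -->
<meta name="description" content="Learn how to do a complete SEO audit step by step: indexability, content, links, performance and code, with free and paid tools.">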

  • Headings or titles <h>
<h1> </h1>
<h2> </h2>
<h3> </h3>
<h4> </h4>
<h5> </h5>
<h6> </h6>

The h title tags (h1, h2, h3, h4, h5, h6) mostly serve two purposes:

On the one hand, they visually format the content headings in a hierarchical manner, displayed from largest to smallest size and weight as you go from the <h1> tag down to the <h6> tag. That is, the h1 usually looks very large because it contains the title, and the h6 is usually the smallest (although it does not have to be: this can easily be modified with CSS). A correct visual hierarchy makes the content more scannable (the user can grasp it quickly in a first fast scroll), and this can benefit dwell time, bounce rate, conversions, etc.

On the other hand, the h title tags have direct relevance in SEO (with <h1> carrying the most weight and <h6> the least), and it is recommended that you place important texts in them: the keyword, synonyms of the keyword, etc. You should have only one <h1> tag per URL (the title), then the content headings in <h2> tags, <h3> for subheadings, and so on down in relevance to <h6> (it is not essential that you use them all).
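
As an illustration, a sketch of a coherent heading hierarchy for a post like this one (the section names are examples):

<!-- Only one h1; h2 and h3 nest beneath it in order of relevance -->
<h1>Guide to Do a Complete SEO Audit</h1>
<h2>1. Indexability and Tracking</h2>
<h3>1.1. Indexability</h3>
<h2>2. Content, Keywords and CTR</h2>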

  • alt and title of images
<img src="" alt="" title="">

The alt and title attributes exist so that Google (and, in certain cases, the user) can understand images semantically: they are the texts that Google reads from the images. Therefore, in alt and title you should put a description of the image that includes keywords, so that it serves both the accessibility of the site and Google.
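
A sketch along those lines, with an invented file name and invented descriptive texts:

<img src="seo-audit-phases.png" alt="Diagram of the five phases of an SEO audit" title="The five phases of an SEO audit">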

  • og: (Open Graph)
<meta name = "og: title" content = "">
<meta name = "og: description" content = "">
<meta name = "og: image" content = "">
<meta name = "og: url" content = "">
<meta name = "og: site_name" content = "">
<meta name = "og: locale" content = "">
<meta name = "og: type" content = "">

Open Graph tags add valuable meta information about your content for social networks and other platforms. Using the Open Graph image, title, description, site name and so on, you can define how your URLs will look when they are shared on networks. That is, you define the content that automatically appears in the snippet when someone (or you yourself) publishes a URL of the site on networks, whether it is a post, a page, a product or anything else.

The contents of a website tend to generate greater interaction and clicks from social networks if they are well optimized, with attractive titles, explanatory and persuasive descriptions, quality images, etc.
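
A sketch of how these tags might be filled in for a hypothetical post (every value below is illustrative):

<meta property="og:title" content="Guide to Do a Complete SEO Audit">
<meta property="og:description" content="Step-by-step SEO audit method, with tools and prices.">
<meta property="og:image" content="https://example.com/images/seo-audit-cover.jpg">
<meta property="og:url" content="https://example.com/seo-audit-guide/">
<meta property="og:type" content="article">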

  • twitter (card, description, title, etc)
<meta name = "twitter: card" content = "summary_large_image">
<meta name = "twitter: title" content = "">
<meta name = "twitter: description" content = "">
<meta name = "twitter: site" content = "">
<meta name = "twitter: creator" content = "">
<meta name = "twitter: image: src" content = "">

In the case of Twitter, the network has its own tags for displaying content, just as Open Graph does for the other platforms.

  • lang
<html lang="">

The lang attribute of the general <html> tag is defined at the top of the document and indicates the language in which the page is written. If the site has several language versions, this attribute must always declare the language of each URL.

  • hreflang
<link rel = "alternate" hreflang = "">

The hreflang attribute of the <link> tag declares, in the code of each page, which language versions of that page exist, and indicates the URLs where those other language versions are located.

In this way, Google can fully understand the internal structure of the multilingual site and show each indexed URL in the correct language version of the search engine.
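
A minimal sketch for a hypothetical page available in English and Spanish (the URLs are illustrative); the same block would appear in the <head> of both versions:

<link rel="alternate" hreflang="en" href="https://example.com/en/seo-audit/">
<link rel="alternate" hreflang="es" href="https://example.com/es/auditoria-seo/">
<!-- x-default marks the fallback version for unmatched languages -->
<link rel="alternate" hreflang="x-default" href="https://example.com/en/seo-audit/">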

  • meta robots
<meta name = "robots" content = "" />

The meta robots tag defines whether a specific page of the site should be indexed or not, and crawled or not (using the index/noindex and follow/nofollow values). If you insert the value index in the content attribute, Google will index the URL. If you also insert the follow value, Google crawls the page and follows its links. The values are put in pairs separated by commas, like this: "index, follow", "noindex, follow", and so on.
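
For example, a page you want kept out of the index while still having its links followed (say, a thin tag archive) could carry:

<meta name="robots" content="noindex, follow" />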

  • canonical
<link rel = "canonical" href = "" />

The link tag with the rel="canonical" attribute declares which URL is the unique or main version among several URLs that share the same or very similar content, or that form part of a series of similar URLs.
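
A sketch with hypothetical URLs: the filtered variant of a product listing points to its main version so that signals consolidate there.

<!-- Placed in the <head> of https://example.com/shoes/?color=red -->
<link rel="canonical" href="https://example.com/shoes/" />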

  • link title
<a href="" title=""> </a>

This attribute of the <a> link tag provides additional information about the link, and Google crawls it just as it does the anchor text. In both cases, using the keyword can help boost the ranking of the destination URL for that keyword.

  • structured microdata
Example: <div itemscope itemtype="http://schema.org/Recipe"> ... </div>

Structured microdata are more useful for SEO every day, because they help Google understand the content with high accuracy and thus serve it as a more appropriate and precise result for each search.

In other words, if for example I add structured microdata from Schema.org  to a recipe on my gastronomy blog, I am making Google understand perfectly that it is a recipe and possibly show it in its results in recipe format.

The benefit is twofold: on the one hand, it is information that helps Google better understand the semantics, and on the other, the attractive and specific appearance of the SERP for that search can lead to more clicks, that is, increase CTR and visits.
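
A brief sketch of that recipe case marked up with Schema.org microdata (the recipe and all its values are invented):

<div itemscope itemtype="http://schema.org/Recipe">
  <h2 itemprop="name">Spanish Omelette</h2>
  <img itemprop="image" src="spanish-omelette.jpg" alt="Spanish omelette on a plate">
  <!-- PT30M is ISO 8601 duration notation for 30 minutes -->
  <meta itemprop="prepTime" content="PT30M">
  <div itemprop="recipeInstructions">Peel and fry the potatoes, beat the eggs, combine and cook on both sides.</div>
</div>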

  • Attributes rel dofollow / nofollow / ugc / sponsored
Example: <a href="" rel="nofollow"> </a>

Links on a website can be marked with the rel attribute so that search engines understand the relationship between the page and the site it points to.

For example, a link with a nofollow attribute does not transfer authority to the linked site (although since March 2020 this is debatable, as nofollow went from being a directive to a suggestion), unlike a dofollow link, which does (if a link carries no rel attribute, it is considered dofollow by default).

Along with that change, Google also introduced other attributes, such as ugc (for links posted by users, for example in comments and forums) and sponsored (to mark links that are the result of a payment or a commercial relationship and are not completely natural).
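
A quick sketch of the four cases, with placeholder URLs:

<a href="https://example.com/review/">Editorial link (dofollow by default)</a>
<a href="https://example.com/partner/" rel="sponsored">Paid or affiliate link</a>
<a href="https://example.com/comment-link/" rel="ugc">Link left by a user in a comment</a>
<a href="https://example.com/untrusted/" rel="nofollow">Link you do not want to vouch for</a>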

How do you know if you have the tags needed for SEO, and how do you add them if they are missing?

To analyze a website and check whether the tags and other code elements are correctly implemented, you can use a tool like Screaming Frog (paid), free tools like Woorank or Seoptimer, or extensions like SEO Meta in 1 Click.

As you can see, the options are numerous, and they make the task much easier. However, you can always go to the site, right-click on it and choose Inspect to see the code in its raw form.

This way you can check whether you have the correct title and heading tags (h1, h2, h3, etc.), meta description, image alt attributes, lang and hreflang language declarations, Open Graph, Twitter Cards, canonical and the other required HTML elements. You will also see the JavaScript code (especially useful for checking the insertion of scripts such as Google Analytics, Tag Manager and other external tools implemented on the site).

Another alternative is to right-click > View page source, which shows all the code of the page (HTML, JS, CSS, etc.).

What is the price of an SEO Audit? How much does it cost to audit a website?

The price of the SEO Audit depends on the size and difficulty of the project and the areas to be analyzed.

A complete audit of the website can be done, or you can audit only specific aspects that are suspected of causing problems or needing improvement.

When calculating the price of the SEO Audit, you can ask yourself the following questions:

  • How many professionals will be involved in the process?
  • How long will it take depending on the specific difficulties that the project presents?
  • What is the degree of difficulty that the diagnosis will have?
  • How many URLs does the website you are going to audit have and how complex is its architecture?
  • What will be the degree of detail and depth that you are going to apply in the study?
  • What is the system or CMS that the web uses and its degree of difficulty?
  • How many languages does the page have?

That is to say, it is not the same to audit a website with well-founded suspicions of having received a penalty as it is to audit a recent project that has not yet had time to accumulate serious errors.

In the same way, it is one thing to do a complete, in-depth audit of the site and use it as a detailed roadmap for executing future improvements, and quite another to prepare an orientation document on the state of the web project as a first approximation for a later, more advanced analysis.

Each professional or agency may have their own way of calculating the SEO Audit budget, and prices sometimes vary a lot depending on other factors, such as the technical level and prestige of the professional or agency, or the degree of responsibility the project entails depending on the size and importance of the client.

In general, analyzing the prices of agencies and professionals in the sector in Spain, the price of an SEO Audit usually ranges from €200-€300 at the cheapest end, for an approximate analysis of the general state of the website, up to €1000-€1500 or more, depending on the complexity factors discussed above.

In some cases, especially at the beginning, when there is still not much trust or the client has not yet decided whether to hire your services, they may be reluctant to give you inside access to their project: the desktop or backend of the website, analytics tools, reports, etc.

This means that on these occasions you have to lean towards an approximate type of audit at first, until you obtain the access and credentials you need to analyze everything more thoroughly.

With the SimilarWeb tool you can obtain approximate analytics and traffic data for any domain you want to analyze. It is the closest thing to having access to the client's Google Analytics account, and it lets you perform a rough audit while you still lack that access.

Should I do the SEO Audit for free?

I believe there is a premise we all share and that should raise no doubts: no professional should work for free, since it is fair to receive compensation for your work and technical knowledge. And an SEO Audit is a highly technical service.

However, sometimes you can perform part of your services for free in a first phase in order to win a client for whom you will later do the rest of the work and, therefore, collect your fair remuneration.

That is, it is a strategy to attract the client by offering a less detailed demonstration of the service so that they can assess your ability as a professional, in what is undoubtedly a competitive context.

We can call this step a pre-audit. You offer the potential client an approximate document as a first diagnosis of the obvious problems you have detected on their website, making it clear that you understand their project's problems and how to solve them.

It is important that this first document is complete enough to demonstrate your level of technical and strategic knowledge of the project, but incomplete enough that it cannot serve as a valid roadmap the client could use to their benefit by hiring a cheaper professional or executing it themselves.

Herein lies the key to this strategy: contribute enough to win the client, but without giving your work away.

List of 75 SEO Audit Tools classified by area

In this section you have a list of tools for doing SEO Audits, free and paid, for each of the audit actions. I am going to give you the most important ones and those we use most often, although of course there are thousands of tools, and choosing one or another also depends on personal preference.

In many cases, Google’s own tools, such as Search Console, Google Analytics and the Ads Keyword Planner, let us analyze many things for free and reliably. In other cases, for more advanced work, it may be worth paying for a plan on more complete tools such as Ahrefs, Sistrix or Semrush (many of these tools are available for free during the Aula CM SEO Course).

Tools for Indexability and Tracking Analysis

Tools for Content, Keywords and CTR Analysis

Tools for Link Analysis and Domain Relevance

Tools for the Analysis of Performance and Speed, Usability and Responsive Design

Tools for the Analysis of Code and Labels and Other Technical Aspects

Do you want to ask any questions or give your opinion?

Leave a comment if you want me to clarify any specific aspect you have not understood. You can also contribute your point of view on how to do an SEO Audit, or tell us about other SEO tools, tricks and tips, and anything else that enriches the content so that we build it together.
