Error analysis

Every page on a website should return a 200 status code from the server. If so, it means the page loaded correctly and you have things under control.

What other codes can I find, and how can I identify and fix them?

301 redirects: Within this block we find the famous 301, the code that tells us a page redirects permanently to another. Redirects generate extra requests and increase loading times, so we should eliminate as many as possible.

You may also find 302 redirects, which we use to indicate that a redirect is temporary and that the page will be active again in the future. This way the page's authority is not lost.
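As an illustration, on an Apache server a permanent and a temporary redirect can be declared in the .htaccess file. The paths and domain below are placeholders, not real URLs:

```apache
# Permanent (301) redirect: the old URL has moved for good
Redirect 301 /old-page/ https://mydomain.com/new-page/

# Temporary (302) redirect: the page is expected to come back later
Redirect 302 /seasonal-offer/ https://mydomain.com/
```

Nginx and other servers have equivalent directives; the key point is choosing 301 for permanent moves and 302 for temporary ones.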

404 errors: Who has never had a 404 error? When a page does not exist, it returns a 404 code, creating a bad user experience. The solution is to redirect that page to a related one, and also to fix the internal links that still point to the now non-existent page.

500 errors: This code warns you that the server could not process the request for the page.

When a page returns a 500 error it is easy to spot, either by the typical blank page or by the usual message: 500 Internal Server Error.
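The status-code checks above can be sketched as a small helper. This is illustrative code (not part of any crawler or library) that maps each code family to the action this section suggests:

```python
def classify_status(code: int) -> str:
    """Map an HTTP status code to the audit action suggested above."""
    if code == 200:
        return "ok"
    if code == 301:
        return "permanent redirect: update internal links to the final URL"
    if code == 302:
        return "temporary redirect: page expected to come back"
    if code == 404:
        return "not found: redirect to a related page and fix internal links"
    if 500 <= code < 600:
        return "server error: check the server logs"
    return "other: review manually"
```

In a real audit you would feed this function the codes reported by a crawler, or fetch them yourself (for example with `requests.head(url).status_code`).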

Sitemap and Robots

Configuring these two files is fundamental to tell Google which pages make up the site and which areas we do not want the Google bot to crawl.

As for the sitemap, it is an XML file that contains the structure of all the pages that make up the website.

Only the pages that we want indexed, and that return the 200 code we talked about earlier, should appear in this file.

We can submit the sitemap from Search Console. Other optimizations we can make to our sitemap include the following:

If the site has many URLs, we can split the sitemap by sections of the site: categories, articles, products, etc.

If the site has multiple languages, we can create a separate sitemap for each language version.
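A minimal sitemap follows the sitemaps.org XML format. The URLs below are placeholders for your own indexable, 200-status pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://mydomain.com/</loc>
  </url>
  <url>
    <loc>https://mydomain.com/category/example-article/</loc>
  </url>
</urlset>
```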

As for the robots.txt file, in it we define which parts of our site we do or do not want crawled. Using the Disallow directive, we can block the sections we don't want the bot to crawl.

If we want to see the robots.txt of any website, we can access it at the following URL: mydomain.com/robots.txt.
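For example, a robots.txt that blocks a couple of private sections while allowing everything else, and that also declares the sitemap location, might look like this (the paths are placeholders):

```text
User-agent: *
Disallow: /wp-admin/
Disallow: /private/

Sitemap: https://mydomain.com/sitemap.xml
```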

So far we have covered the main technical SEO aspects to review during this audit.

Next, we will analyze all the points related to content.

Content quality

Content is another of the most important parts of this branch of SEO, since it is users, and Google, who will interpret all the text. Therefore, the content you offer has to be well optimized.

Does it meet the needs of your potential users? Is the information well structured and relevant?

These are some of the questions you should ask yourself when determining the quality of a text.

Of course, keywords play a fundamental role in this content, since it is the search engines that will determine when your page should appear in Google for them.

Doing keyword research is the first step before you start writing.

Through keyword research you can discover how often a specific term is searched, as well as new opportunities you may not have considered.

[Image: keyword research in Google Keyword Planner]

Other factors that fall within this optimization, and which we recommend you analyze, are the following:

Duplicate content: if you have repeated fragments of text on different pages, you can run into duplicate content problems and, as a consequence, Google can penalize you for it.

Density: the number of times the keyword appears in the text matters here. We must write as naturally as possible to avoid falling into over-optimization, also known as keyword stuffing.

Thin content: also called poor or low-quality content. Every page that is relevant for SEO should have a minimum text length so as not to be considered low quality in the eyes of Google.

Pages that contribute nothing at the SEO level are best removed from the Google index.

Cannibalization: does your website have several very similar pages targeting the same search intent? If so, you may be facing cannibalization, that is, several of your internal pages competing with each other for the same keyword.
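As a rough illustration of the density check mentioned above: keyword density is simply the number of keyword occurrences divided by the total word count. This is a simplified sketch (real tools also handle stemming, phrase matching and whole-word boundaries):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return keyword occurrences divided by total word count.

    Simplified: counts substring matches in the normalized text,
    so e.g. "car" also matches inside "cars".
    """
    words = re.findall(r"\w+", text.lower())
    if not words:
        return 0.0
    occurrences = len(re.findall(re.escape(keyword.lower()), " ".join(words)))
    return occurrences / len(words)
```

There is no universally "correct" density; the point of measuring it is to spot unnaturally high values that suggest keyword stuffing.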

Meta Tags

All the information that Google shows in its search results comes from meta tags. The best way to indicate the content or theme of a page is to include these HTML tags: on the one hand the title tag, and on the other the description tag.

Some recommendations for creating these tags:

For meta titles:

The title must be unique and not repeated on other pages.

It should always include the keyword, as close to the beginning of the text as possible.

The title must not exceed 65 characters.

For meta descriptions:

They should not be repeated on more than one page either.

The keyword should be included; in this case, its position is not as relevant.

It should include a call to action to attract the user's attention and thus increase the CTR.

The size of the description must not exceed 155 characters.
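Put together, the two tags from the recommendations above live in the page's head. The title and description below are placeholder text, kept under the 65- and 155-character limits:

```html
<head>
  <!-- Unique title, keyword near the beginning, under 65 characters -->
  <title>SEO Audit Guide: How to Review Your Website Step by Step</title>

  <!-- Unique description with the keyword and a call to action, under 155 characters -->
  <meta name="description" content="Learn how to run an SEO audit: status codes, sitemap, robots.txt, content and meta tags. Start improving your rankings today.">
</head>
```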

Images

How do we inform Google of the content of an image?

With the ALT tag we can assign a description that matches the image, using the keywords we are targeting.
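For example (the filename and ALT text below are placeholders):

```html
<!-- Descriptive ALT text that includes the target keyword naturally -->
<img src="/images/seo-audit-checklist.png"
     alt="SEO audit checklist with status codes and meta tags">
```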

[Image: adding an ALT tag to images]

Remember that Google also has a dedicated image search engine, and your page could receive traffic from there.

In terms of loading, one of the first optimizations to pursue with this type of content is reducing image size and resolution as much as possible, adapting each image to the size of the space where it will be placed on the page.


