SEO Website Optimization

SEO Website Optimization Checklist: 18 checkpoints for successful implementation.

To conduct a self-audit of your website, follow the checklist provided below. In the article, we have outlined key aspects that need to be reviewed on your website before promotion or launch.

Technical Optimization

Any analysis begins with crawling the site with a parser. An in-depth crawl immediately reveals problems: for example, how many pages have been deleted, and whether important traffic-collecting pages are among them. The crawl also shows how many pages can be indexed and how many URLs are unexpectedly blocked by noindex or robots.txt.

(Examples of parsing tools: Screaming Frog SEO Spider 9.2, Netpeak Spider, and others.)

1. Check robots.txt

Traditionally, we start by checking robots.txt – a file with recommendations for search engines about content indexing. File rules can be respected or ignored by robots.

What should be closed by the Disallow rule:

  1. CMS login and admin pages (“/bitrix”, “/login”, “/admin”, “/administrator”, “/wp-admin”).
  2. Pages with session identifiers.
  3. Technical pages – authorization, password change, order checkout.
  4. The results of the internal search on the site.
  5. Print versions of pages.
  6. Pages with duplicate content, rss, feed.

What should not be blocked:

  1. Whole site.
  2. Part of the site.
  3. Separate content pages.
  4. Product cards.
  5. Service pages: Contacts, Delivery, Payment, About us.

How to check:

  1. The validity of the file can be checked in the old GSC panel (Crawl – robots.txt Tester), where you can test whether any page is open or blocked for robots.
  2. An alternative tool for checking the file (Robots.txt Analysis) is available in Yandex.Webmaster.
  3. An additional way to check a URL against the Disallow rules in the file is to use a bookmarklet; a quick programmatic spot-check is also sketched below.
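
For a quick programmatic spot-check, Python's built-in urllib.robotparser can report whether a given URL is blocked for a given user agent. A minimal sketch, where the domain and URL list are placeholders to replace with your own:

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain and URLs -- replace with your own.
SITE = "https://site.com"
urls_to_check = [
    "https://site.com/wp-admin/",
    "https://site.com/catalog/product-1/",
    "https://site.com/?s=internal+search",
]

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()  # downloads and parses the live robots.txt

for url in urls_to_check:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")
```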

2. Sitemap

A sitemap is a must for large projects. It is recommended to set up an auto-updating sitemap to which all newly added URLs are written. XML sitemap requirements:

  1. The XML map contains addresses using the site's current protocol (http or https).
  2. The XML map contains only pages that respond with 200 OK.
  3. Does not contain non-canonical pages.
  4. Does not contain pages that the site owner has closed from indexing (noindex meta tags or the X-Robots-Tag HTTP response header).
  5. The correct lastmod and priority values are set. Crawl priority becomes meaningless if all pages have a value of 1. Having the same date in lastmod on all pages is not useful either: it indicates that all URLs were updated at the same time, and based on such data the robot cannot select the pages that actually need to be recrawled.
  6. The maps are validated and added to GSC and Yandex.Webmaster.

An additional advantage of loading a map in GSC is indexing control. For example, if your map is divided into sub-maps, you can submit each sub-map individually and monitor the errors and crawl date for each of them.

The screenshot shows the scan status of an individual map: errors, number of excluded and indexed URLs.
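
The requirements above can also be verified in bulk: fetch the sitemap and confirm that every listed URL responds with 200 OK. A minimal sketch, assuming the sitemap lives at /sitemap.xml, the requests library is installed, and the domain is a placeholder:

```python
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://site.com/sitemap.xml"  # placeholder
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)

# For a sitemap index, the <loc> entries are sub-maps and should be crawled the same way.
for loc in root.iter(f"{NS}loc"):
    url = loc.text.strip()
    # HEAD is usually enough to read the status code without downloading the body
    status = requests.head(url, allow_redirects=False, timeout=10).status_code
    if status != 200:
        print(f"{status}  {url}")
```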

3. Redirect chains

Most often, sites use 301 (permanent) and 302 (temporary) redirects. Detailed information about redirects is in the article “301, 302 or 404? What to apply and in what cases?”. It is important to ensure that URLs returning 301 and 302 codes are not linked within the site structure. Sometimes sites use chains consisting of several redirects. The screenshot shows that the robot reaches the final URL only after 2 redirects. This practice wastes crawl budget and is therefore especially dangerous for large sites.

How to check:

A crawl will help you find addresses returning 3xx codes.

  • In Screaming Frog – tab “Response Codes” – “Redirection (3xx)”.
  • In Netpeak Spider – “Summary” – “Page type” – “Redirect”.
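
A redirect chain can also be spotted with a few lines of Python, since requests records every intermediate hop in response.history. A rough sketch with placeholder URLs:

```python
import requests

urls = ["http://site.com/old-page/", "http://www.site.com/category/"]  # placeholders

for url in urls:
    response = requests.get(url, allow_redirects=True, timeout=10)
    if response.history:  # one entry per intermediate redirect
        hops = " -> ".join(f"{r.status_code} {r.url}" for r in response.history)
        print(f"{url}\n  chain: {hops} -> {response.status_code} {response.url}")
        if len(response.history) > 1:
            print("  WARNING: more than one redirect in the chain")
```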

4. Checking for a 404 error

The 404 error should only occur if the user entered the URL incorrectly. The error page should be designed in the style of the site and offer the user as many options as possible to navigate to other pages.

The subtleties of designing the error page are described in our article “Error 404 – what does it mean, how to find and fix the error”. Check for 404 pages within the site structure: if a robot or user stumbles upon a broken page while browsing the site, they are more likely to leave.

When checking for a 404 page, you need to pay attention to:

  1. The 404 page returns a 404 response code.
  2. Pages that should be indexed do not respond with a 404 error.
  3. Check the correctness of the server response code on all types of pages (main page, category page, product card, article page).

How to check:

Your crawler and the coverage reports in the search consoles will show the pages returning 404 errors, their location, and how their number changes over time.
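
Beyond the reports, a small script can confirm that each page type returns the code you expect – 200 for real pages and 404 for a deliberately broken address. A sketch with placeholder URLs:

```python
import requests

# Placeholder URLs: one of each page type, plus a deliberately non-existent address
checks = {
    "https://site.com/": 200,                     # main page
    "https://site.com/category/": 200,            # category page
    "https://site.com/category/product-1/": 200,  # product card
    "https://site.com/blog/article/": 200,        # article page
    "https://site.com/definitely-not-a-page/": 404,
}

for url, expected in checks.items():
    actual = requests.get(url, timeout=10).status_code
    mark = "OK  " if actual == expected else "FAIL"
    print(f"{mark} expected {expected}, got {actual}: {url}")
```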

5. Site Mirrors

The site must be available at one address only. Duplication on the same domain, but with the www prefix, may negatively affect the evaluation of the main domain. Check the mirrors for availability:

  1. Via both protocols at once, http and https.
  2. With and without the www prefix.
  3. By IP address (the site is available both by the domain name https://site.com and by the IP address 185.42.230.20).
  4. With the port specified, for example, https://site.com:443/.
  5. A complete duplicate of the site on a service subdomain: for example, https://mail.site.com.

How to check:

  • Manually.

As a result, we can see which subdomains of our site receive traffic.

We then check the found subdomains for duplication of the main site's content. If the content is completely duplicated, we close the subdomain from search engines.
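
The host-level mirrors can be checked in one pass: request each protocol/prefix combination and make sure every variant 301-redirects to the single main address. A sketch in which the domain and main address are placeholders:

```python
import requests

MAIN = "https://site.com/"  # the one address the site should resolve to (placeholder)
mirrors = [
    "http://site.com/",
    "http://www.site.com/",
    "https://www.site.com/",
]

for mirror in mirrors:
    r = requests.get(mirror, allow_redirects=True, timeout=10)
    first_hop = r.history[0].status_code if r.history else r.status_code
    if r.url == MAIN and first_hop == 301:
        print(f"OK   {mirror} -> 301 -> {r.url}")
    else:
        print(f"FAIL {mirror} resolves to {r.url} (first response {first_hop})")
```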

6. Checking technical duplicates

Duplicate content may be available at several URL variants:

  • with a slash at the end: http://www.site.com/page1/;
  • without trailing slash: http://www.site.com/page1;
  • with index.html at the end: http://www.site.com/page1/index.html;
  • with index.php at the end: http://www.site.com/page1/index.php;
  • using the protocol http: http://www.site.com/;
  • using the secure protocol https: https://www.site.com/;
  • using capital characters: http://www.site.com/Page1/;
  • with www prefix: http://www.site.com/page1/;
  • without the www prefix: http://site.com/page1/.

How to eliminate duplication:

  • set up a 301 redirect from the duplicate to the target address;
  • set the rel=canonical attribute from the duplicate to the target address;
  • add a robots meta tag with noindex, nofollow values to prohibit indexing;
  • close the duplicate with the X-Robots-Tag: noindex, nofollow HTTP response header.

How to check:

  • It is worth manually checking all types of pages for accessibility with and without a trailing slash.
  • Availability under both protocols will immediately be shown by the parser, for example, Screaming Frog's Protocol tab. A script-based variant check is also sketched below.
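
The same idea works at the path level: take a sample URL and confirm that its trailing-slash, index.html, and uppercase variants redirect to the target address. A rough sketch for the redirect case, with placeholder URLs:

```python
import requests

TARGET = "https://www.site.com/page1/"  # placeholder canonical address
variants = [
    "https://www.site.com/page1",
    "https://www.site.com/page1/index.html",
    "https://www.site.com/page1/index.php",
    "http://www.site.com/page1/",
    "https://www.site.com/Page1/",
]

for variant in variants:
    r = requests.get(variant, allow_redirects=True, timeout=10)
    # A clean setup reaches the target via at least one redirect
    ok = r.url == TARGET and bool(r.history)
    print(f"{'OK  ' if ok else 'DUP?'} {variant} -> {r.status_code} {r.url}")
```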

7. Checking pagination

Two approaches are traditionally used to organize pagination on a site; the choice depends on which search engine is the priority.

What to check:

For Yandex, the canonical attribute is used from all pagination pages to the first http://site.com/category/.

When checking pagination, be sure to check whether the page http://site.com/category/page1/ exists and is accessible. Usually, it is a duplicate of http://site.com/category/.

In that case, the page http://site.com/category/page1/ must be closed from crawling and indexing.
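
Whether the canonical attribute is set correctly can be verified by pulling the rel="canonical" link from a few pagination pages. A rough regex-based spot-check (the category URL is a placeholder; pages rendered entirely with JavaScript would need a rendered crawl instead):

```python
import re
import requests

CATEGORY = "https://site.com/category/"  # placeholder
pagination_pages = [f"{CATEGORY}page{n}/" for n in range(2, 6)]

link_re = re.compile(r'<link[^>]*rel=["\']canonical["\'][^>]*>', re.I)
href_re = re.compile(r'href=["\']([^"\']+)["\']', re.I)

for url in pagination_pages:
    html = requests.get(url, timeout=10).text
    tag = link_re.search(html)                       # the whole <link ...> tag, if any
    href = href_re.search(tag.group(0)) if tag else None
    canonical = href.group(1) if href else "MISSING"
    print(f"{url} -> canonical: {canonical}")
```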

8. Micro markup (structured data)

This is the markup of the pages' HTML code with special attributes according to accepted search engine standards.

Schema markup gives Google's bots more accurate and useful information about your site. It improves snippets, making them more visible, which affects CTR.

What to check:

Most often, websites use markup for:

  • Contacts.
  • Product cards.
  • Breadcrumbs.

Check the correctness of the markup on all standard page types.

How to check:

  • Structured Data Validation by Google
  • Microdata validator.
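
In addition to the validators, you can quickly list which structured-data types a page declares by extracting its JSON-LD blocks. A minimal sketch with a placeholder URL; pages that use microdata or RDFa instead of JSON-LD still need the validators above:

```python
import json
import re
import requests

URL = "https://site.com/category/product-1/"  # placeholder
html = requests.get(URL, timeout=10).text

# Pull every <script type="application/ld+json"> block out of the page
blocks = re.findall(
    r'<script[^>]+type=["\']application/ld\+json["\'][^>]*>(.*?)</script>',
    html, re.S | re.I)

for block in blocks:
    try:
        data = json.loads(block)
    except json.JSONDecodeError:
        print("Invalid JSON-LD block found")
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        print("Declared type:", item.get("@type"))
```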

9. Checking the saved copy of the page

The content that is visible to users should not differ from the content served to the search robot.

You can check how the robot sees your page through a saved copy.

What to check:

  • whether the text is visible in the saved copy;
  • does it match what the user sees;
  • whether navigation links are visible;
  • whether there are elements or text that are not shown to the user.

How to check:

  • Manually.

10. Check download speed

Speed is an important ranking factor, especially on Google. When optimizing for speed, aim for a maximum of 200 ms server response time.

How to check:

  • PageSpeed Insights
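
Server response time can also be measured directly: requests reports the time until the response headers arrive, which is a rough proxy for the 200 ms target. A minimal sketch with a placeholder URL:

```python
import requests

URL = "https://site.com/"  # placeholder

r = requests.get(URL, timeout=10)
# .elapsed measures the time from sending the request until the response
# headers are parsed -- a reasonable proxy for server response time (TTFB)
ttfb_ms = r.elapsed.total_seconds() * 1000
verdict = "OK" if ttfb_ms <= 200 else "SLOW"
print(f"{verdict}: {ttfb_ms:.0f} ms server response for {URL}")
```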

11. Checking human-friendly URLs

URLs must meet the following requirements:

  1. All characters are in lowercase.
  2. There are no URLs longer than 100 characters.
  3. There are no underscores in the URL, spaces, or quotes.
  4. There are no Cyrillic characters in the addresses (for addresses in Latin).

Important! This recommendation matters for young sites that have not been promoted before, or for resources that are closed from indexing. If search traffic already goes to addresses with such errors, do not rush to reformat the URLs. No significant increase in ranking after introducing friendly URLs has been observed, but a temporary drop in positions is inevitable. Converting regular URLs into friendly URLs is worth it only when you have nothing to lose.

How to check:

  • Screaming Frog will show URLs containing an underscore: URL – Underscores.
  • Netpeak Spider shows addresses containing an underscore when a segment is set up (a script-based format check is also sketched below).
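
These formal requirements are easy to enforce with a small validation script. A sketch, assuming the URL list is loaded from your crawler's export (the sample URLs are placeholders):

```python
import re

# Placeholder list -- in practice, load the URL column exported from your crawler
urls = [
    "https://site.com/category/product-1/",
    "https://site.com/Category/Product_1/",
    "https://site.com/о-компании/",
]

def url_problems(url):
    problems = []
    if url != url.lower():
        problems.append("uppercase characters")
    if len(url) > 100:
        problems.append("longer than 100 characters")
    if re.search(r"[_\s'\"]", url):
        problems.append("underscore, space or quote")
    if re.search(r"[а-яА-ЯёЁ]", url):
        problems.append("Cyrillic characters")
    return problems

for url in urls:
    issues = url_problems(url)
    if issues:
        print(f"{url}: {', '.join(issues)}")
```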

12. The site is adapted for mobile devices

In the age of Mobile First Indexing, perfect mobile responsiveness is a must for all websites.

What to check:

It is better to check all the pages of the site manually from a mobile device.

That way you experience the site as a mobile user and can find problem areas in its less visible corners. If we are talking about a large store, check the pages of each category and at least one product card from each category.

How to check:

You can trust the GSC panel to mass-check the status of the mobile version. It constantly monitors the site and how correctly it is displayed on mobile.

  • The https://search.google.com/test/mobile-friendly tool will show the issues on a specific page.
  • In Netpeak Checker, you can get data for a list of URLs from the Google PageSpeed Insights, Mobile-Friendly Test, and Safe Browsing services; you will need a public API key (the key is free and is used for all of these services).

13. Indexing

Check how many of your pages Google has indexed. If the number of indexed pages in one search engine significantly exceeds the number in another, this may already indicate a problem: either sanctions or technical errors that prevent some pages from being indexed.

What to check:

Check the index for the most important pages.

How to check:

  • Use the inurl: and site: operators to spot-check the presence of pages in the index.
  • Checking a URL in GSC shows whether the page is indexed and what problems stand in the way of indexing.
  • https://indexchecking.com/ – checks indexing in batches, with export to Excel.
  • Google Search Console

Internal optimization

14. Unique content

Allowing duplication within the site is a self-inflicted downgrade of the site's rating. The most common problem is duplication of the main meta tags and headings (title, description, h1, h2).

How to check:

Any standard parser is equipped with an option to find, show, and export duplicate content to reports; a script-based example of grouping duplicates from a crawl export is sketched below.
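
If you prefer to work from a crawl export, grouping pages by title (the same works for description or h1) reveals duplicates in a few lines. A sketch assuming a CSV export with hypothetical "url" and "title" columns:

```python
import csv
from collections import defaultdict

# Placeholder file and column names -- adjust to match your crawler's export
pages_by_title = defaultdict(list)
with open("crawl_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_by_title[row["title"].strip().lower()].append(row["url"])

for title, urls in pages_by_title.items():
    if title and len(urls) > 1:
        print(f'Duplicate title "{title}" on {len(urls)} pages:')
        for url in urls:
            print("  ", url)
```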

15. Presence of blank pages

Pages that are empty or have little content can be considered low quality by search engines. The presence of such pages on the resource can lower its overall rating.

How to check:

  • Screaming Frog – Word Count tab. By sorting in ascending order, you will see pages that either need to be filled with content or closed from indexing.
  • Netpeak Spider – open “Options” – “Number of words” (you can filter by number from larger to smaller and vice versa)

16. Title optimization

The title of the page, its content, and keyword optimization directly affect the ranking.

What to check:

  • Use a short and descriptive title.
  • The title displays the content of the page.
  • Important keywords are used towards the beginning.
  • The title is unique within the network.
  • Special characters (emoji) are used where appropriate.
  • Title length is within 70 characters.

How to check:

  • Screaming Frog – Page Titles tab. It contains a drop-down list of filters and can show pages where the title is duplicated, missing, or over the length limit. The title size limits can be left at the defaults, or you can set your own parameters.
  • Netpeak Spider – you can see all errors related to the title by applying a “Segment” with the appropriate settings.
  • The SeoMeta in 1 Click browser plugin shows the size and content of the title of the open page.
  • The service https://www.seoreviewtools.com/bulk-title-tag-checker/ checks the presence and size of the title in the list of URLs.
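
The length and presence rules can also be checked straight from a list of URLs. A minimal sketch with placeholder URLs:

```python
import re
import requests

urls = ["https://site.com/", "https://site.com/category/"]  # placeholders
title_re = re.compile(r"<title[^>]*>(.*?)</title>", re.S | re.I)

for url in urls:
    html = requests.get(url, timeout=10).text
    match = title_re.search(html)
    if not match:
        print(f"MISSING title: {url}")
        continue
    title = " ".join(match.group(1).split())  # collapse whitespace
    flag = "LONG " if len(title) > 70 else "OK   "
    print(f"{flag}({len(title)} chars) {url}: {title}")
```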

17. Description meta tag optimization

The visibility of your snippet in the SERP depends on the content of the meta tag.

What to check:

  • The text in the meta tag consists of no more than 250 characters.
  • Description text draws attention and encourages action.
  • The description contains keywords.
  • Chat buttons from Yandex are added (if an online consultant is connected to the site), and the “Products and prices” module is enabled in Yandex.Webmaster.

How to check: use the tools mentioned in the previous section. When the snippets are ready and optimized, you can check how they will look in the SERP.

Tools:

For example, https://technicalseo.com/tools/google-serp-simulator/ will help you check the snippet

There is a similar function in other tools: https://totheweb.com/learning_center/tool-test-google-title-meta-description-lengths/

https://www.serpsimulator.com/

18. Checking text content

What to check:

  • Article texts are at least 85% unique.
  • Structured and readable content.
  • Texts are optimized for keywords.
  • Check texts for grammatical, spelling, and stylistic errors.

How to check:

  1. Grammarly: Helps identify grammar and spelling errors, and provides suggestions for improvement.
  2. Copyscape: Checks for plagiarism by comparing text content against a vast database of web pages.
  3. Yoast SEO: A WordPress plugin that analyzes content for SEO best practices, readability, and keyword optimization.
  4. Hemingway Editor: Highlights complex sentences, passive voice, and other readability issues to improve the clarity of your writing.
  5. SEMrush SEO Writing Assistant: Provides real-time recommendations for optimizing content based on target keywords and search intent.
  6. Readable: Evaluates text for readability scores, including Flesch-Kincaid, Gunning Fog, and more.
  7. Google Search Console: Offers insights into the performance of your content in search results, including impressions, clicks, and average position.

These tools can assist in improving the overall quality and optimization of your text content for better SEO results.
