SEO Audit

Report Features

  • SEO friendliness and usability ratings
  • HTML tags: recommendations around duplicate values, title tag length, missing description tags, and H1 tags
  • Content quality report, including statistics around page length and duplicate content
  • Keyword density statistics and focus areas
  • Conclusion and overview of recommended improvements and changes
  • Internal link ratings and recommendations
  • Page speed rating, including response times, and page size breakdowns
  • Structured data details and backlink scoring

An SEO Audit is a Vital First Step on the Road to a Well-functioning Business Website

If you are planning to optimise a website, an SEO technical audit is the all-important first step. It provides you with a basis from which to launch SEO campaigns that will help your website steadily move up the rankings. As your website’s ranking improves, you will also see an increase in your conversion rate and sales volume.

What is an SEO Audit?

You can’t build a house without a level foundation, and the SEO audit is the level foundation of your website. It ensures that any optimisation exercise you undertake goes as planned, and that you won’t have to backtrack to fix things the optimisation process has broken. Using SEO audit tools to get the website into a healthy enough shape before optimisation begins saves you hours of reworking problems uncovered along the way.

An SEO audit is a process that ensures a site is in line with industry best practices, giving the webmaster a solid foundation on which to build a successful SEO campaign – and a successful campaign is vital to getting the most out of your website. There are certain steps that must be taken in an SEO audit, and they must be followed in a particular order. Much like a pilot runs through a checklist before taking flight, the audit process ensures that everything is working as it should before you attempt any optimisation of your website.


What Does an SEO Audit Checklist Entail?

This SEO audit checklist is designed to take you through the necessary steps of an audit from beginning to end, ensuring no stone is left unturned. Here are the steps of an SEO audit checklist:

1. Check Robots.txt and XML sitemap

In order to ensure your website can be seen in the search engine results pages (SERPs), it is important that these two files (i.e. Robots.txt and the XML sitemap) are present on the website. Robots.txt is a basic text file placed in your site’s root directory that references the XML sitemap location. It tells the search engine bots which parts of your site they need to crawl, and stipulates which bots are allowed to crawl your site and which ones are not. It acts as a signpost pointing the way to where you have posted relevant keywords that will help you gain more visitors to your website.

The XML sitemap is a file that contains a list of all the pages on your website. This file can also contain extra information about each URL in the form of metadata or thumbnail descriptions of the content found at the respective URL. Together with Robots.txt, the XML sitemap helps search engine bots crawl and index all the pages on your website. You can check the Robots.txt and XML sitemap files by following these three steps:

Screenshot

1) Locate Your Sitemap URL

The XML sitemap can usually be found by appending /sitemap.xml to the root domain. For example, type www.example.com/sitemap.xml into your web browser.
As you can see in the above screenshot, primal.co.th has multiple sitemaps.
This is an advanced tactic that can help to maximize the indexation of your website and increase site traffic in some cases.

The experts over at Moz.com have written in detail about the value of multiple sitemaps. If you don’t find a sitemap on your website, you will need to create one. You can use an XML sitemap generator or utilise the information available at Sitemaps.org.
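
For reference, a minimal XML sitemap looks something like the following sketch – the URLs and dates are placeholders, not values taken from any real site:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want search engines to crawl -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>

Only the <loc> element is required for each URL; <lastmod>, <changefreq> and <priority> are optional hints for the crawlers.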

2) Locate Your Robots.txt file

The Robots.txt file can also be checked by appending /robots.txt to the root domain. For example, type www.example.com/robots.txt into your browser.

If a Robots.txt file exists, check that the syntax is correct – that is, that the directives in the file are spelled and formatted properly. If it doesn’t exist, you will need to create a file and add it to the root directory of your web server (you will need access to your web server). It is usually added to the same place as the site’s main “index.html”; however, the location varies depending on the type of server used.

Screenshot

3) Add the sitemap location to Robots.txt (if it’s not there already)

Open up Robots.txt and place a Sitemap directive in it so that search engines can auto-discover the XML sitemap. The directive is simply the fully qualified URL of the sitemap, for example:

Sitemap: https://www.example.com/sitemap.xml

With the directive added, the robots.txt file will look something like this:

Sitemap: https://www.example.com/sitemap.xml
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-example.php

The above screenshot example from primal.co.th shows what www.example.com/robots.txt should look like once the sitemap has been added for auto-discovery.

2. Check Protocol and Duplicate Versions

The second stage of a technical SEO audit is to check each of the homepage versions listed below to see whether they are all separately accessible or whether they redirect visitors to a single preferred version of the website.

  • https://example.com
  • https://example.com/index.php
  • https://www.example.com
  • https://www.example.com/index.php

It’s important to note that Google prefers sites that use HTTPS rather than HTTP. HTTPS (HyperText Transfer Protocol Secure) is essentially the secure version of HTTP (HyperText Transfer Protocol).

It is particularly important for e-commerce websites to use HTTPS, as they require increased security due to shopping carts and payment systems that handle sensitive bank account and/or credit card information.
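
If more than one of the versions above resolves on its own, the duplicates should be 301-redirected to the single preferred version. As a sketch only – assuming an Apache server with mod_rewrite enabled, and with www.example.com standing in for the preferred host – the .htaccess rules might look like this:

# Redirect all HTTP traffic and the non-www host to https://www.example.com
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

On other servers (e.g. Nginx or IIS) the equivalent redirect is configured differently, but the goal is the same: one canonical protocol and hostname for every page.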

3. Check the Domain Age

It is important to check the domain age of a website, as this can affect SERP rankings. An ageing website that hasn’t been updated for quite a while will suffer in the rankings.

Use whois.domaintools.com to:

  • See if the site is old or new
  • Get a sense of the likely backlink profile (generally, the older the domain, the bigger the backlink profile is going to be)
Screenshot

4. Check the Page Speed

As outlined by the experts at Moz.com, site speed is one of Google’s ranking factors, and slow loading is one of the main factors that deters visitors from exploring your website. Therefore, it’s important that a site’s load time is kept to a minimum.
The loading speed of your website can be quickly checked at tools.pingdom.com. This tool provides information regarding load time and how your site performs in comparison to other websites.

To improve the loading speed of individual pages, webmasters should look at compressing images, adopting a content delivery network and decreasing the server response time. Heavy custom coding and large image sizes can also slow down loading times. Remember to check the site speed of both the mobile and desktop versions of your website.

Screenshot

To learn more about improving page loading speeds, visit Google’s PageSpeed Insights .
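
As one example of the compression and caching improvements mentioned above, on an Apache server a few .htaccess directives can compress text-based responses and let browsers cache static assets. This is a sketch that assumes the mod_deflate and mod_expires modules are available; the file types and cache lifetimes are illustrative only:

# Compress text-based responses before sending them (requires mod_deflate)
AddOutputFilterByType DEFLATE text/html text/css application/javascript
# Let browsers cache static assets instead of re-downloading them (requires mod_expires)
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType text/css "access plus 1 week"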

5. Check URL Healthiness

There are a number of elements to check when assessing a URL’s ‘healthiness’. These include:

Page title

The page title (also referred to as the title tag) defines the title of the page and needs to be accurate and concise in its description of what the page is about. The page title appears in the SERPs (see meta description example below) as well as in the browser tab and needs to be 70 characters or fewer in length. It is used by search engines and web users alike to recognize the topic of a page. So it’s important to ensure it is correctly optimised with the most important and relevant keyword(s).

Meta description

Meta descriptions don’t directly affect SEO, but they do affect whether or not someone is going to click on your SERP listing. They act as explanatory sentences of the content available on the respective page. Therefore, it’s important that the meta description is uniquely written and accurately describes the page in question – rather than simply taking an excerpt of text from the page itself. A snippet optimizer tool makes it easy to create meta descriptions that are the correct length (156 characters or fewer). Note: the below tool also features the page title.

Screenshot
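
For reference, both the page title and the meta description sit in the page’s <head>. The sketch below shows the general form; the wording is placeholder text, not a recommended title or description:

<head>
  <!-- Page title: 70 characters or fewer, leading with the most relevant keyword -->
  <title>SEO Audit Checklist | Example Agency</title>
  <!-- Meta description: 156 characters or fewer, written uniquely for this page -->
  <meta name="description" content="A step-by-step SEO audit checklist covering sitemaps, robots.txt, page speed, on-page elements and backlinks.">
</head>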

Canonicalization

This is a long word that basically means making sure a website doesn’t contain multiple versions of the same page. Canonicalization is important because otherwise the search engines don’t know which version of a page to show users. Multiple versions of the same content also cause duplication issues, which can confuse and frustrate viewers – so it’s important to ensure canonicalization issues are addressed.

If there are multiple versions of one page, the webmaster will need to redirect these versions to a single, dominant version. This can be done via a 301 redirect, or by utilizing the canonical tag. The canonical tag allows you to specify in the HTML header that the URL in question should be treated as a copy, while also naming the URL that the bots should read instead.

For example:
Within the HTML head of the page loading at https://primal.co.th/index.php there would be a tag like this:

Screenshot
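
The tag shown in the screenshot is a canonical link element. In general it takes the following form – this is a reconstruction based on the URLs discussed here, not a copy of the live page:

<!-- Placed in the <head> of https://primal.co.th/index.php -->
<link rel="canonical" href="https://www.primal.co.th/" />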

https://www.primal.co.th/ has been specified as the main version of the homepage URL that the bots should crawl.

Headings (H1, H2, etc…)

When performing an SEO site audit, page headings should be checked to ensure they include relevant keywords. However, they shouldn’t be over-optimised (i.e. the same keywords shouldn’t be used in multiple headings on one page).
Make sure that headings are unique and specific to each page so that no duplication issues arise.
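
As a simple sketch, a well-structured page uses a single H1 containing the primary keyword, with H2s covering related sub-topics rather than repeating the same phrase (the headings below are placeholders):

<h1>SEO Audit Checklist</h1>
<h2>Checking Robots.txt and the XML Sitemap</h2>
<h2>Assessing Page Speed and Mobile Friendliness</h2>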

Index, noindex, follow, nofollow, etc…

These meta tags tell the search engines whether or not they should index a certain page or follow the links that are placed on that page:

  • Index - tells the search engine to index a specific page
  • Noindex - tells the search engine not to index a specific page
  • Follow - tells the search engine to follow the links on a specific page
  • Nofollow - tells the search engine not to follow the links on a specific page
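
These directives are combined in a single robots meta tag in the page’s <head>. For example, to keep a page out of the index while still allowing its links to be followed, the tag would look something like this:

<!-- Exclude this page from the index, but still follow its links -->
<meta name="robots" content="noindex, follow">

If no robots meta tag is present, search engines default to index, follow.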

Response codes – 200, 301, 404 etc.

Check which HTTP response status code is returned when a search engine or web user enters a request into their browser. It’s important to check the response codes of each page, as some codes can have a negative impact on the user experience and SEO.
  • 200: Everything is okay; the page loaded successfully.
  • 301: Permanent redirect; everyone, including any ‘link juice’, is redirected to the new location.
  • 302: Temporary redirect; everyone is redirected to the new location, except for any ‘link juice’ (link equity is not passed on).
  • 404: Page not found; the original page is gone and site visitors may see a 404 error page.
  • 500: Server error; no page is returned, so neither site visitors nor search engine bots can access it.
  • 503: Service unavailable; this response code essentially asks everyone to ‘come back later’.
Moz.com has some excellent additional information regarding response codes.
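
A quick way to check the response code of any individual page is from the command line with curl; the URL below is a placeholder:

# -I requests only the response headers; the first line of the output (e.g. HTTP/1.1 200 OK) shows the status code
curl -I https://www.example.com/some-page/

A site crawler or Google Search Console can then be used to check response codes across the whole site.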

Word counts/thin content

Ever since Google released its Panda algorithm update, thin content has become a real issue for webmasters around the globe. Thin content can best be described as content that is both short on words and light on information; think of short, generic pages that don’t provide any information of real value.

It’s important to make sure that the pages on a website provide visitors (and search engines) with in-depth, informative and relevant content. In most cases, this means providing longer content – e.g. 300-500 words or more, depending on the page and respective topic (note that product pages can usually get away with fewer words).

Screenshot

In-depth, informative content is essential to SEO success.

6. Check the Site’s Mobile Friendliness

According to a comScore report released in mid-2014, internet access via mobile devices has overtaken internet access from desktops in the U.S. – and the rest of the world is heading in the same direction, if it isn’t there already. It’s important to note that mobile friendliness is now one of Google’s ranking factors, thanks to the ‘Mobilegeddon’ algorithm update. It’s therefore critical to ensure that a website is mobile friendly in order to maintain its ranking – something you can check via Google’s Mobile-Friendly Test. This tool checks whether the design of your website is mobile friendly.
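
One basic building block of a mobile-friendly design is the viewport meta tag, which tells mobile browsers to scale the layout to the width of the device screen. A typical sketch:

<!-- Scale the layout to the device screen width -->
<meta name="viewport" content="width=device-width, initial-scale=1">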

Screenshot

Examples of mobile-friendly and non-mobile-friendly websites.

7. Check Backlink Profiles

Backlinks are the links on other sites that point back to the website in question; they can be seen as ‘votes of confidence’ from other websites in favour of the site being linked to.

Generally, the more backlinks a website has the better it will rank, with certain major exceptions. Large quantities of ‘dofollow’ links coming from one domain are frowned upon and penalized by Google, as are links coming from poor quality/unrelated websites.

In other words, effective backlinks are about quality over quantity. Unnatural backlinks can lead to a website receiving a manual penalty or being de-indexed. Google Support offers a detailed guide that explains how to disavow unwanted backlinks. Performing a disavow is a two-step process: download a list of all the links pointing to your website, then create and upload a file to Google that details all the links that need to be disavowed. To avoid having to go through this process, get in the habit of regularly reviewing the links pointing to your website to ensure they are all valid and relevant to the content they link to.
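
The disavow file itself is a plain text (.txt) file with one entry per line; lines beginning with # are treated as comments. A sketch of the format, with placeholder domains:

# Disavow every link from this low-quality directory
domain:spammy-directory-example.com
# Disavow a single unwanted link
http://low-quality-example.com/some-page.html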

Ready for optimization!

Once you have performed an in-depth SEO website analysis, areas in need of improvement will naturally become apparent. Then, it’s simply a matter of rectifying these elements to ensure that the site is better placed to achieve a higher SERP ranking.

Once these areas of concern have been addressed, the site is now essentially ‘up to date’ in terms of SEO; from here on out it’s a matter of building upon this strong foundation to boost rankings, increase your conversion rate and achieve better results!

For experienced and knowledgeable SEO in Thailand and a free SEO audit, get in contact with us. We will provide you with a full strategy outline and can then furnish you with a full range of SEO services. Please contact us and take the next step with your digital marketing agency in Thailand.
