The online promotion of a website is one of the key elements of any digital marketing strategy. By optimizing your website, you can get more traffic from organic search, which is almost free. To rank higher in search results, you need to work on SEO continually, improving individual pages and overall website performance. An essential step in any SEO strategy is the web audit: the process of checking for possible problems and identifying things to improve, so that you can obtain more traffic and sales. It normally consists of:
- Technical Audit,
- User Experience Audit,
- Link Profile Audit,
- Content Audit.
How To Perform A Website Audit?
There are several areas where a website should be audited, and a technical audit is a good place to start. After all, if something doesn’t work or is misconfigured on the site, it will cripple all other work and won’t let you get much traffic.
It’s rigorous, time-consuming work to do manually, especially if you represent a small business and don’t know much about SEO. Fortunately, there are professional SEO tools, both free and paid, that allow you to do it virtually with a single click. So, let’s see what crucial errors can be found and what metrics you should pay attention to when running a web audit.
Technical Audit
This type of web audit deals with the technical aspects of website optimization, from ensuring everything is working properly to checking for duplicate pages. Let’s start with the crawl report. It is recommended to automate this process with specialized SEO tools that replicate the behavior of Google’s crawling bot and can therefore detect errors that may prevent the bot from correctly crawling all your pages. There are several types of crawl errors:
- Unavailability occurs when some part of your website is unavailable due to heavy load on your server, timeouts, or DNS issues.
- Robots.txt issues: This file tells search engine crawlers which parts of your site they may access. If it is misconfigured, it can accidentally block important pages, and if the server returns an error when Google’s crawler requests robots.txt, the crawler may stop crawling your site, resulting in your new pages not being indexed.
- Problems with sitemap.xml: This is a special file that lists all the pages of your website, helping search engine crawlers find them easily. You can generate a sitemap during your next SEO review with SE Ranking’s web audit tool. The following potential problems may occur:
- The sitemap does not exist;
- The sitemap is too big;
- Pages with NoIndex are present in the sitemap;
- The sitemap is not present in the robots.txt.
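As a quick illustration, the robots.txt and sitemap checks above can be sketched with Python’s standard library. The robots.txt and sitemap contents below are invented sample data, not fetched from a real site:

```python
# Sketch: validate robots.txt directives and a sitemap offline, using only
# the Python standard library. All file contents here are hypothetical.
import xml.etree.ElementTree as ET
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /cart/
Sitemap: https://example.com/sitemap.xml
"""

SITEMAP_XML = """\
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/</loc></url>
</urlset>
"""

# 1. Is the sitemap referenced from robots.txt?
sitemap_listed = any(
    line.lower().startswith("sitemap:") for line in ROBOTS_TXT.splitlines()
)

# 2. Which URLs does robots.txt allow Googlebot to crawl?
rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())
cart_allowed = rp.can_fetch("Googlebot", "https://example.com/cart/item")

# 3. Extract the URLs the sitemap declares.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text for loc in ET.fromstring(SITEMAP_XML).findall("sm:url/sm:loc", ns)]

print(sitemap_listed)   # the sitemap is referenced in robots.txt
print(cart_allowed)     # /cart/ is disallowed for crawlers
print(len(urls))        # number of URLs declared in the sitemap
```

A real audit tool runs checks like these against the live files and cross-references the sitemap URLs with the crawl results.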
Metadata And Tags: Check your title tags, meta descriptions, and related tags for the following problems:
- Incorrect length that prevents titles or descriptions from being fully displayed in search results;
- Duplicate tags on different pages;
- “canonical” tag chains, where a canonical URL points to a page that itself declares another canonical;
- “canonical” tags that point to non-HTTPS URLs;
- NoIndex or NoFollow in the HTML or HTTP header.
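A simple length check for titles and descriptions can be sketched as follows. The character ranges below are common rules of thumb, not official Google thresholds, and the sample pages are invented:

```python
# Sketch: flag title tags and meta descriptions whose length risks being
# truncated (or looking thin) in search results. The limits are assumed
# rules of thumb, not official values.
TITLE_RANGE = (30, 60)         # assumed acceptable title length, in characters
DESCRIPTION_RANGE = (70, 160)  # assumed acceptable description length

def check_length(text: str, limits: tuple) -> str:
    low, high = limits
    if len(text) < low:
        return "too short"
    if len(text) > high:
        return "too long"
    return "ok"

pages = {
    "/": {"title": "Home", "description": "Welcome."},
    "/blog/": {
        "title": "Our Blog: SEO Tips And Web Audit Guides For Small Business",
        "description": "Practical guides on technical SEO, usability, link "
                       "profiles and content audits, written for site owners.",
    },
}

for url, meta in pages.items():
    print(url,
          "title:", check_length(meta["title"], TITLE_RANGE),
          "description:", check_length(meta["description"], DESCRIPTION_RANGE))
```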
Most of your website’s pages must be open for search engines to index, but some important pages might be blocked in some way. This can happen due to an SEO error, or when a webmaster forgot to reopen some sections of the site for indexing after a long development phase. However, there may be pages that genuinely don’t need to be indexed, such as a shopping cart page or a wish list page that is typically generated dynamically for each user.
Make sure they are closed to indexing. You’ll see potential issues in the crawl report, so you’ll need to make changes to those files to fix them. There can be quite a few potential problems here when the page you want to index is:
- prohibited in robots.txt;
- blocked by NoIndex;
- blocked by NoFollow;
- blocked by x-robots tag.
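A minimal indexability check covering the NoIndex meta tag and the X-Robots-Tag header might look like this. The sample HTML and headers are hypothetical, and a full check would also consult robots.txt:

```python
# Sketch: decide whether a page is indexable given its HTML and HTTP
# response headers. Sample inputs are invented for illustration; a real
# audit would also check robots.txt rules for the URL.
import re

def is_indexable(html: str, headers: dict) -> bool:
    # X-Robots-Tag header check (e.g. "noindex" or "noindex, nofollow")
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        return False
    # <meta name="robots" content="..."> check in the HTML itself
    meta = re.search(
        r'<meta\s+name=["\']robots["\']\s+content=["\']([^"\']*)["\']',
        html, re.IGNORECASE)
    if meta and "noindex" in meta.group(1).lower():
        return False
    return True

indexable_meta = is_indexable('<meta name="robots" content="noindex, follow">', {})
indexable_plain = is_indexable("<html><body>Hello</body></html>", {})
indexable_header = is_indexable("<html></html>", {"X-Robots-Tag": "noindex"})
print(indexable_meta, indexable_plain, indexable_header)  # False True False
```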
URL Problems: These refer to broken links pointing to non-existent pages. This usually happens when a page’s URL was changed but the links to it were not updated. Also, some pages may no longer exist but still be linked, and some URLs may be too long for crawlers to handle correctly.
Frames Are Used: Frames are old, obsolete web elements. It’s best to avoid them altogether, as they make your content difficult for crawlers to track and index.
Now let’s move on to other possible technical problems with your website that can be detected with a thorough web audit:
Website Security: Using HTTPS on your website improves security and also your search engine ranking. There are several types of HTTPS issues worth checking. If any of them occur, search engines may mark your site as unsafe, and your traffic could drop significantly:
- Another domain name in the certificate: Using a certificate issued for a different domain triggers browser warnings. This often happens when versions of the site exist on different domains for some reason, for example on .io and .online.
- Old protocol version: SSL and TLS have gone through several versions, and older ones (SSL 2.0/3.0, TLS 1.0/1.1) are now considered insecure.
- Non-HTTPS content on the site, as well as HTTP pages in the sitemap or internally linked from HTTPS pages.
- HTTP Status: All your active pages should respond with correct HTTP status codes. If you see error response codes in your crawl report, fix them; in particular, repair or redirect links that lead to pages returning 404 errors.
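The status-code triage described above can be sketched as a small helper. The URL-to-status mapping below stands in for a real crawl report:

```python
# Sketch: sort crawl-report status codes into action buckets. The
# URL-to-status mapping is invented sample data.
crawl_report = {
    "https://example.com/": 200,
    "https://example.com/old-page": 404,
    "https://example.com/moved": 301,
    "https://example.com/api": 500,
}

def triage(status: int) -> str:
    if status == 200:
        return "ok"
    if status in (301, 308):
        return "permanent redirect: update internal links to the final URL"
    if status == 404:
        return "fix the link or redirect to a relevant live page"
    if status >= 500:
        return "server error: investigate hosting/application issues"
    return "review manually"

for url, status in crawl_report.items():
    print(status, url, "->", triage(status))
```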
User Experience Audit
If you follow the latest trends in digital marketing, you will know that one of the main goals of search engines is to provide a positive user experience. That is why search engines actively track multiple parameters of user behavior on websites, such as bounce rate or depth of page views, trying to identify sites with the best UX.
Also, Google is rolling out its Page Experience update, which further focuses on page speed, visual stability, and interactivity. That is why you should pay close attention to the usability audit, and this is what you should check:
- Website Speed: There is plenty of research confirming that people react negatively to slow-loading sites: they leave or prefer not to buy. Such sites are often demoted by search engines in their effort to provide a great user experience. Therefore, it is essential to make sure your pages load fast. You can use Google’s PageSpeed Insights to check your speed and follow up with actions such as:
- Compress images;
- Use Gzip compression;
- Minify HTML, CSS/scripts and remove any unused code;
- Use CDN/caching.
- Mobile device compatibility: The growing popularity of smartphones has led to a continual increase in traffic coming to sites from mobile devices. This became so important that Google adopted “mobile-first” indexing, which primarily uses the mobile version of your pages for indexing and ranking. Use Google’s Mobile-Friendly Test to find any issues with your pages.
- Navigation: While good navigation may be partly subjective, search engines track many behavioral parameters on your pages, such as bounce rate, time on page, or pages per visit, to determine whether your site is easy to navigate. Check the following:
- Calls to action and visual cues: An easy-to-understand interface can make or break website usability. Make sure all your menus, buttons, status bars, etc. are always clearly visible, relevant, and easy to understand.
- Text legibility: Reading the text on your website shouldn’t be difficult by any means. Take into account people who do not have good eyesight, and adjust the font size, and the font itself, if necessary.
- Pop-ups, online chats, etc. – all those fancy things that appear on the screen, often unexpectedly, can really annoy your users. They can increase conversion when used smartly, but make sure they aren’t too intrusive and don’t block important content or navigation.
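To see why Gzip compression (mentioned in the speed checklist above) matters, here is a small sketch that compresses a stand-in HTML payload and reports the savings:

```python
# Sketch: measure how much Gzip compression shrinks a typical HTML
# payload. The HTML below is a trivial stand-in for a real page; real
# savings depend on how repetitive the markup is.
import gzip

html = ("<html><head><title>Demo</title></head><body>"
        + "<p>Repeated paragraph of page content.</p>" * 200
        + "</body></html>").encode("utf-8")

compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0%} of original size)")
```

On a real site, this compression is enabled in the web server configuration (e.g. `gzip on;` in nginx) rather than in application code.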
Link Profile Audit
One more area to check is your link profile. Generally, the more links point to your website, the better, but in 2021 they should be 100% trustworthy. You need to keep an eye on the links you get from other sites because, on the one hand, you need to systematically grow your link profile and, on the other, get rid of “toxic” backlinks.
You can start researching your backlinks in Google Search Console, but it only provides some basic information, such as which sites are linking to yours and the anchor texts.
So, here’s what to check during the link profile audit:
- Domain trust and page trust are similar metrics based on multiple factors, including the quality of referring domains and backlinks pointing to your website as a whole and to specific pages. If your site gets a rating below 50, it is considered weak and you need to improve the situation as soon as possible (remove toxic backlinks and get backlinks from more reliable sources).
- The total number of referring domains is the number of sites that link to yours, and you want it to be as high as possible, as long as they are quality sites. Checking these domains yourself will show how trusted your site looks and let you draw your own conclusions.
- The number of backlinks from all referring domains. Check this parameter from time to time to make sure your link profile is growing and you are not losing backlinks.
- The main backlink anchors are the words or phrases used in the links to your website. Check whether the anchors correspond to your target keywords and to the theme of your website. Generic anchors like “here”, “click”, or “read more” tell search engines little about your page, and an unnatural pattern of anchors can be treated as spam by Google.
- The DoFollow/NoFollow ratio shows how many sites refer to yours with a “DoFollow” tag that is positive for your SEO and how many backlinks do not transfer PageRank to you. If you have about 60% DoFollow backlinks, you’re fine, but it can still be less or more, depending on your backlink strategy.
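The DoFollow/NoFollow ratio can be computed from an exported backlink list. The data below is fabricated for illustration; a real audit would export it from a backlink tool:

```python
# Sketch: compute the DoFollow share of a backlink profile. The backlink
# list is invented sample data standing in for a real export.
backlinks = [
    {"source": "blog-a.example", "nofollow": False},
    {"source": "forum-b.example", "nofollow": True},
    {"source": "news-c.example", "nofollow": False},
    {"source": "dir-d.example", "nofollow": False},
    {"source": "social-e.example", "nofollow": True},
]

dofollow = sum(1 for link in backlinks if not link["nofollow"])
share = dofollow / len(backlinks)
print(f"{dofollow}/{len(backlinks)} backlinks are DoFollow ({share:.0%})")
```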
There is also another useful tool for checking backlinks, Majestic.com. It shows you backlink history over time and also offers a lot of the above data in graph form, which can be a bit easier for your digital marketing team to understand.
Content Audit
This type of audit deals with the quality of the content on your website. Content quality can be subjective, but there are also objective factors like duplication or low word count. Here is what you should look for:
- Content Value: Compare your pages with competitor pages that rank well for the same keywords. You can use the “site:” search operator to search for specific keywords on competitor websites. You can also use a tool like Woopra, which is great for understanding the customer journey and can help you create really engaging content.
- Hierarchy: Google loves not only relevant and interesting content but also well-structured content. Make sure all your text on the website and blog is well formatted, including lists, H1-H4 headings, subheadings, and is broken up into small paragraphs. You can check for any such issues with your content with SE Ranking’s Web Audit tool.
- Duplicate content occurs when you have two or more pages with the same or almost the same content. There are several tactics for removing duplicate pages:
- Delete pages you no longer need (don’t forget to set up redirects for pages that are no longer available).
- Merge duplicate pages into one page.
- Set “canonical” tags to tell search engines which page is the preferred version.
- Content Gaps: There may be keywords that you’d like to rank well for but currently have no page targeting them. A complete audit of your content can help you plan your content marketing and build an efficient search presence for your brand.
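Finding near-duplicate pages can be approximated with a simple word-set (Jaccard) similarity. The 0.8 threshold and the sample pages below are arbitrary illustrations, not values any search engine actually uses:

```python
# Sketch: flag near-duplicate pages with Jaccard similarity over word
# sets. The threshold and page texts are invented for illustration.
def jaccard(a: str, b: str) -> float:
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

pages = {
    "/red-shoes": "buy red shoes online with free shipping",
    "/red-shoes-sale": "buy red shoes online with free shipping today",
    "/blue-hats": "stylish blue hats for every season",
}

urls = list(pages)
for i, u in enumerate(urls):
    for v in urls[i + 1:]:
        sim = jaccard(pages[u], pages[v])
        if sim >= 0.8:
            print(f"possible duplicates: {u} and {v} (similarity {sim:.2f})")
```

Dedicated tools use more robust techniques (such as shingling over word sequences), but the idea of scoring page pairs and flagging the closest ones is the same.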
Your website is arguably your most important digital marketing asset, and you want to make sure it’s in the best possible shape. The process to ensure this is called a web audit and generally consists of four stages: technical, usability, backlink and content audits.
With the help of specialized SEO tools, you can perform web audits with literally one click. And you can perform such checks whenever you want, especially after major updates to your website made by yourself (if you’re a small business owner) or your digital marketing department.
Therefore, follow the guidance in this article and pay attention to the errors and suggestions that appear in web auditing tools. Correct them in time, and your website will stay healthy, generate more and more traffic, and remain profitable for you.