How One Agency Is Using an SEO Audit to Boost a Bakery's Visibility

Anna Kochegura

Jul 28, 2021 · 11 min read

SEO is about much more than placing the right keywords into a page’s copy; there are plenty of behind-the-scenes factors to take into consideration, too, and they can directly impact your ranking potential.

This is where a technical SEO audit comes into play. Skipping the technical audit is like planning a once-in-a-lifetime cross-country road trip without first checking that the car is in good working order. If the car dies ten minutes down the road, it doesn’t matter how much great planning went into that trip; you’re not going to have the experience you imagined.

SEO is no different. Even with visible gains in website performance, deep problems could still be affecting how search engines like Google index the site. It doesn’t matter how great your content is or how strong your link portfolio has become if your entire backend is working against you.

In this episode of our SEO Reality Show, we’re going to show how the expert agency conducted a technical audit for Florida-based Edelweiss Bakery, walking readers through how to perform a technical SEO audit step by step.

Missed the first few episodes? Check out the previous content here: 

What to Assess During a Technical SEO Audit?

A technical SEO audit is going to look at factors beyond how you’re optimizing your copy with keywords. It will typically include assessing the following:

  1. The configuration of robots.txt
  2. The configuration of the main site mirror
  3. Mobile optimization 
  4. Page rendering correctness
  5. Site loading speeds 
  6. The presence of broken or duplicate links

Let’s take a look at how the expert agency used technical SEO audit tools to find errors that can be auto-detected and the manual techniques they used to find anything else of concern. 

Step 1 — Searching for Errors Using the Semrush Site Audit Tool

The first thing the agency needed to do was to assess the baseline level of technical optimization. To do this, they used the Site Audit tool from Semrush. 

They created a new project, used the default settings, and ran the audit. You can adjust the settings or leave them as they are; if you’re not a technical SEO specialist, it’s best to stick with the default options.

1. Creating a new project in the Site Audit tool
2. Stick to the default settings if you are not an SEO King

1. Critical errors and warnings

After the audit was conducted, the agency viewed the reports in the Site Audit section of the project. 

First, they looked at the Errors section. This section flags any serious problems that need to be addressed right away. In this case, the bakery’s site had one critical error: a broken link to the XML sitemap.

It’s worth noting that the presence of a sitemap isn’t always necessary, though if you have a site with up to 1,000 pages, it’s a good thing to have in place. Still, you do want to identify and remove broken links; broken links waste the search robot’s crawling resources and can prevent users from arriving at the right location. This can lower your site’s ranking position, since Google’s robots aren’t able to crawl it properly.
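For reference, a minimal XML sitemap is just a list of the pages you want crawled; here is a sketch with hypothetical URLs (in practice, your CMS usually generates this file for you):

  <?xml version="1.0" encoding="UTF-8"?>
  <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
    <!-- One <url> entry per page you want search engines to find -->
    <url>
      <loc>https://www.example-bakery.com/</loc>
      <lastmod>2021-07-01</lastmod>
    </url>
    <url>
      <loc>https://www.example-bakery.com/menu/</loc>
      <lastmod>2021-07-01</lastmod>
    </url>
  </urlset>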

3. Errors and warnings found through the Semrush Site Audit tool

When checking the sitemap, the agency came across an interesting usability issue worth noting. Going to the address /sitemap.xml (where the XML sitemap is usually located) displayed a non-existent-page design that left much to be desired. There is always a chance of broken links appearing on your site, and so that users don’t get lost when they land on such a page, it’s best to give it an original design with the ability to go to the site’s main page or return to the previous page.

4. View of 404 pages on the site https://www.edelweissbakeryfl.com/

After reviewing the 404 page on the bakery’s website, the agency turned their sights to the “Warnings” section in the Issues tab. There were 45 warnings in total, which aren’t as serious as the critical errors but still needed to be addressed.

These were the warnings they felt were most important to adjust: 

  • Two images without the ALT attribute (which prevents Google from indexing the images and can hurt SEO).
  • Absence of the H1 tag on one page.
  • One page answering with a 302 redirect (which, in this case, should have been a 301 redirect to the final page).
  • Not enough copy on a page (critical if you are going to optimize this page for SEO).
5. Prioritizing warnings to be fixed
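To make two of these warnings concrete, here is what the fixes look like in HTML (the file name, alt text, and heading below are made up for illustration):

  <!-- Before: <img src="croissants.jpg"> with no ALT attribute -->
  <img src="croissants.jpg" alt="Freshly baked croissants on a wooden tray">

  <!-- Every page should carry exactly one H1 describing its content -->
  <h1>Artisan Bakery in Florida</h1>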

When you’re replicating this process for your own site, keep in mind that you don’t need to spend an exorbitant amount of time figuring out which errors are in play, why they matter, and how to fix them; the Semrush audit tool has your back. Each warning comes with quick tips on what’s wrong and how to fix it. Just click on the link “Why and how to fix it” and you will see a tooltip with all the necessary information. You can then send tasks to Trello and Zapier to automate the error-fixing process.

6. Tips from Semrush Site Audit
7. Reports with detailed information about the website health in the Semrush Site Audit tool

The agency then proceeded to carefully move through all the reports while making notes. They found that poor performance of Core Web Vitals and a lack of markup needed to be addressed. 

2. Structured Data

While a lack of markup (aka “structured data”) on the site isn’t necessarily a critical error, it’s a big missed opportunity that could be hurting the bakery’s SEO potential. Structured data allows you to display additional information in the search results snippet while also setting your brand apart. 

It’s exceptionally easy to check your site’s markup presence using Semrush. Just go to the Crawled Pages tab after you conduct your audit, then select “Markup.” You can filter the results by the types of markup you’d like to review. In addition to reviewing JSON-LD markup, you can also check for Open Graph and Twitter Card markup.

8. Checking the page markup in the Site Audit tool

If you need a page-by-page check, you can use Google’s free tool here.

9. Checking markups page-by-page in the Google tool

In the bakery’s case, there was no markup at all. The types of markup that need to be applied to the main page of the site (so far the site’s only page) are LocalBusiness (or Organization) and Restaurant.

10. In our case, the Restaurant markup is applied with the AggregateRating field
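As a sketch of what that could look like, here is a JSON-LD block for Restaurant markup that you would place in the page’s <head>; every value below (address, rating, review count) is illustrative rather than the bakery’s real data:

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Edelweiss Bakery",
    "url": "https://www.edelweissbakeryfl.com/",
    "servesCuisine": "Bakery",
    "address": {
      "@type": "PostalAddress",
      "addressRegion": "FL",
      "addressCountry": "US"
    },
    "aggregateRating": {
      "@type": "AggregateRating",
      "ratingValue": "4.8",
      "reviewCount": "120"
    }
  }
  </script>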

It’s also a good idea to add Open Graph markup to your pages to get beautiful text snippets on social media when someone shares a link to your site. Twitter Cards markup serves the same purpose on Twitter.

To check for the presence of this markup, you can use the markup verification in Semrush, use the official validators from Facebook and Twitter, or look at your site’s source code for Open Graph tags. These will appear in the following formats:

  • og:title
  • og:type
  • og:image
  • og:url

This is how it will appear for Twitter Card markup: 

  • twitter:title
  • twitter:description
  • twitter:creator
  • twitter:site

In the bakery’s case, both types of markup were missing.
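For reference, here is roughly what both sets of tags look like in a page’s <head>; the titles, description, image path, and Twitter handle are placeholders, not the bakery’s real values:

  <!-- Open Graph tags control the preview shown when the page is shared -->
  <meta property="og:title" content="Edelweiss Bakery | Artisan Bread" />
  <meta property="og:type" content="website" />
  <meta property="og:image" content="https://www.edelweissbakeryfl.com/cover.jpg" />
  <meta property="og:url" content="https://www.edelweissbakeryfl.com/" />

  <!-- Twitter Card tags do the same job for Twitter -->
  <meta name="twitter:card" content="summary_large_image" />
  <meta name="twitter:site" content="@example_bakery" />
  <meta name="twitter:title" content="Edelweiss Bakery | Artisan Bread" />
  <meta name="twitter:description" content="Fresh sourdough and pastries baked daily." />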

11. OpenGraph markup? What is it?
12. The Twitter Card is also missing

3. Page Load Speed

Page loading speed is an extremely important indicator and contributing factor when it comes to SEO potential. 

This is because Google’s search engineers are focused on delivering a great user experience. This means they’ll prioritize high-quality content on reliable sites that deliver strong user experiences. Sites that take too long to load simply aren’t compatible with a positive user experience in this day and age.

The Google team even came up with a name for the loading speed and UX indicators and a standard for how these metrics should be measured. If you’re interested, you can read more about Google’s Core Web Vitals (which are ranking signals in their own right) here.

To see the impact of your site loading speed, you can use Semrush’s Core Web Vitals Report. This will show you how close your indicators are to the recommended values so you can optimize your site accordingly.

13. Mass verification of site page indicators for compliance with Core Web Vitals

This tool is perfect if you want to look for pages with weak loading speed, or pages that share a single recurring issue (like an error that appears only on product cards).

When you want to prepare the list of technical specifications that programmers will use to eliminate problems with site loading speed, however, a more detailed analysis is necessary. Google’s PageSpeed Insights is the tool to use. 

Fortunately, the tool is easy to use. Here, the agency entered the bakery’s domain name and quickly got results, organized under three different labels:

  • Green means that things are working perfectly
  • Yellow is okay, but with optimization potential
  • Red indicates serious issues that need to be resolved to improve your loading speed
14. Houston, we have a problem!
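If you’d rather pull these numbers programmatically than through the web UI, PageSpeed Insights also exposes a free JSON API. A minimal Python sketch (standard v5 API usage; only the domain is specific to the bakery):

  import requests  # third-party: pip install requests

  # PageSpeed Insights v5 API returns the full Lighthouse report as JSON
  resp = requests.get(
      "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
      params={"url": "https://www.edelweissbakeryfl.com/", "strategy": "mobile"},
      timeout=60,
  )
  data = resp.json()

  # Overall Lighthouse performance score, reported on a 0-1 scale
  score = data["lighthouseResult"]["categories"]["performance"]["score"]
  print(f"Mobile performance score: {score * 100:.0f}/100")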

Step 1 Summary 

We just went over a lot of information, so here’s a quick summary of what happened. 

During the verification by the Semrush service, the agency identified the following errors on the bakery’s website:

  • The presence of a broken link to the XML sitemap
  • Two images without the ALT attribute
  • No H1 tag on one page
  • One page giving a 302 redirect
  • A small amount of text on one page
  • A lack of markup
  • Poor performance of Core Web Vitals

This isn’t a bad starting point, but the agency knew that looking for more details with a manual audit would be an important step. 


Step 2 — Searching for Additional Errors 

In this section, the agency decided to go beyond Semrush’s initial report, manually looking for additional potential issues. 

1. Verification of robots.txt

The robots.txt file contains the necessary directives that tell search engines which parts of your site should be crawled and indexed, and which shouldn’t be. It may include:

  • System files and folders
  • Sections with commercial information
  • Payment information

There is, however, no universal rule here about what should be indexed, and the composition of the file itself depends on the CMS that you’re using to run the site.

There is one thing you’ll want to do: Ensure your site is open for crawling and indexing by search robots once it’s past the development stage. 

In the case of our bakery, the agency knew that the robots.txt file would require adjustments, because in their experience the default directives in WordPress (the CMS the bakery used for its site) aren’t specific enough.

15. Current robots.txt file
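For comparison, a common starting point for a WordPress robots.txt looks like the sketch below; treat it as an assumption-laden example rather than a drop-in file, since the right directives depend on your plugins and site structure:

  # Block the admin area, but keep the AJAX endpoint crawlable
  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php

  # Point robots at the XML sitemap
  Sitemap: https://www.edelweissbakeryfl.com/sitemap.xml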

2. Mobile Rendering

Mobile site optimization is crucial because so much internet traffic now comes from mobile devices. At this stage, the agency checked the site’s mobile optimization: how well the site worked and how fast it loaded on mobile devices, and how completely the search robot was able to see the site’s mobile pages.

This tool from Google will help you assess how mobile-friendly your site is. It’s a good starting point to see how Google’s search robot sees your mobile site. It may be different from how you see it, which means that Google isn’t crawling it correctly. This can hurt your search potential and (in some cases) the user experience.

The agency found that this exact scenario was the case for the bakery, and knew this was something they’d need to address. 
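At the time of writing, the same check could also be scripted through Google’s Search Console API. A hedged Python sketch, assuming you have created an API key in Google Cloud Console (YOUR_API_KEY below is a placeholder):

  import requests  # third-party: pip install requests

  resp = requests.post(
      "https://searchconsole.googleapis.com/v1/urlTestingTools/mobileFriendlyTest:run",
      params={"key": "YOUR_API_KEY"},  # placeholder: use your own key
      json={"url": "https://www.edelweissbakeryfl.com/"},
      timeout=60,
  )
  result = resp.json()

  # "MOBILE_FRIENDLY" or "NOT_MOBILE_FRIENDLY", plus a list of detected issues
  print(result.get("mobileFriendliness"))
  for issue in result.get("mobileFriendlyIssues", []):
      print("-", issue.get("rule"))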

16. Houston, we have a problem again!

3. Configuring the main mirror of the site

Most sites have “mirrors,” or duplicates. These are often used as archives or even test sites for development. The main mirror of the site is the full domain name of the site that is specified as the main address. It’s considered the “main” site because, unlike the others, it participates in search results. It’s designed to be accessed by users. 

It’s essential for the search engine to understand which of the mirrors is the main site that should actually be served up in search results. To do this, you need to specify it explicitly by “gluing” the mirrors together with redirects.

To check which of your site’s mirrors appear in search, you can use Google Search Console and see what shows up. You can also check manually by typing the site’s domain name into the Google search bar. In our case, the main mirror is https://www.edelweissbakeryfl.com/.

17. In our case, the main mirror is https://www.edelweissbakeryfl.com/

To check the settings of the main mirror, type in variations of your site’s domain name and see whether there is a 301 redirect to the main mirror. You can check this using this service.

The main combinations that need to be checked are:

  • site.com with “www” and without “www”
  • site.com with https and http (if you have an SSL or TLS certificate installed)
  • site.com/index.html
  • site.com/index.php
  • site.com/index.htm
  • site.com/home.html
  • site.com/home.php
  • site.com/home.htm

There may be additional combinations depending on your CMS. If the main mirror is not configured, you’ll need to task your programmer with setting up 301 redirects to it. In the bakery’s case, everything was configured correctly.
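A quick Python sketch for checking those combinations in bulk (the domain is the bakery’s; swap in your own variants). Each variant should answer with a 301 status and a Location header pointing at the main mirror:

  import requests  # third-party: pip install requests

  # URL variants that should all 301-redirect to
  # the main mirror, https://www.edelweissbakeryfl.com/
  variants = [
      "http://edelweissbakeryfl.com/",
      "http://www.edelweissbakeryfl.com/",
      "https://edelweissbakeryfl.com/",
      "https://www.edelweissbakeryfl.com/index.html",
      "https://www.edelweissbakeryfl.com/index.php",
      "https://www.edelweissbakeryfl.com/home.html",
  ]

  for url in variants:
      # Don't follow the redirect; we want to inspect the raw response
      r = requests.get(url, allow_redirects=False, timeout=10)
      location = r.headers.get("Location", "(none)")
      print(f"{r.status_code}  {url}  ->  {location}")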

18. In the bakery’s case, everything was configured correctly

4. Layout errors

Layout errors can significantly complicate how search robots read and interpret the code of your page. They can also impact the display of the site across different browsers or devices. Because of this, it’s a priority to get rid of any layout errors. 

The agency used two different validators to check for layout errors on the bakery’s website: one for the HTML code and one for the CSS code. They found more errors to fix.

19. The presence of errors in the HTML code
20. The presence of errors in the CSS code
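Both validators have web interfaces, and the W3C HTML checker can also return its findings as JSON, so the check is easy to script. A Python sketch (standard checker parameters; only the domain is specific to the bakery):

  import requests  # third-party: pip install requests

  # W3C Nu HTML Checker: validate a live URL and get findings back as JSON
  resp = requests.get(
      "https://validator.w3.org/nu/",
      params={"doc": "https://www.edelweissbakeryfl.com/", "out": "json"},
      headers={"User-Agent": "site-audit-script"},  # the service rejects blank agents
      timeout=60,
  )

  for msg in resp.json().get("messages", []):
      if msg.get("type") == "error":
          print(f"line {msg.get('lastLine')}: {msg.get('message')}")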

Step 2 Summary

During an additional check, the agency found the following errors:

  • Incorrect robots.txt
  • Problems with page rendering
  • Layout errors

Now the agency can be sure they’ve done everything they could, and eliminating the errors found will definitely benefit the site.

Next Up

After the manual assessment, the technical SEO audit was complete. The agency compiled the full list of concerns into a single document for the developers, who then went to work optimizing the site. With the technical SEO fixes in place, the bakery’s site was well on its way to improved search ranking potential.

Our next episode is going to look at on-site optimization, including the layout of the product page, meta tags, and technical specifications for the copy of the home page. This is one episode you definitely don’t want to miss, so stay tuned! 

Get Your Free Technical SEO Template

Infographic: Performing a Technical SEO Audit in 10 Steps

Bringing over 5 years of marketing experience to Semrush’s awesome products in France. Now exploring new horizons of the digital world and shaping inspiring data-driven projects.