SEO Split Test Result: Find the Best Answer for Both Users and Search Engines

Brian Moseley

Apr 28, 2022 · 4 min read

Before you start: if you’re unfamiliar with the principles of statistical SEO split-testing and how SplitSignal works, we suggest you start here or request a demo of SplitSignal.


SEO A/B testing is all about finding the best possible answer. Not only for search engines like Google but also for users. We know from experience that real users can greatly influence the success of a particular optimization, especially when optimizing SERP elements such as page titles. As an SEO, you want your search result to match what a user wants or expects and to do it better than any other search result you compete with.

When it comes to user-oriented testing, finding the best possible response to a search query is a constant journey. To really validate which variant performs better, you need a statistical method of validation. With this case study, we show that it pays to always test multiple variants to ultimately learn what drives users to click on your search result.
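To make the idea of statistical validation concrete, here is a generic illustration of how you might check whether a difference in click-through rate between two groups of pages is statistically meaningful. This is not SplitSignal’s methodology (which is forecast-based, as described later in this article), and the click and impression figures are invented.

```python
# Illustration only: a simple two-proportion z-test on click-through rate.
# SplitSignal itself uses a forecast-based (counterfactual) model, described
# later in this article; the numbers below are made up for demonstration.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(clicks_a, impressions_a, clicks_b, impressions_b):
    """Return (z, two-sided p-value) for the difference in click-through rate."""
    p_a = clicks_a / impressions_a
    p_b = clicks_b / impressions_b
    pooled = (clicks_a + clicks_b) / (impressions_a + impressions_b)
    se = sqrt(pooled * (1 - pooled) * (1 / impressions_a + 1 / impressions_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical example: variant pages vs. control pages over a test window
z, p = two_proportion_z_test(clicks_a=5_640, impressions_a=180_000,
                             clicks_b=5_010, impressions_b=178_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```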

One of the things we wanted to validate for Blokker was the optimal page title for its category pages. Blokker is one of the oldest retailers in the Netherlands. The first Blokker store opened more than 125 years ago, and there are currently more than 400 physical stores. In addition to the stores, Blokker runs a webshop with a marketplace concept comprising more than 500,000 products. Organic traffic is the main source of traffic for the webshop. Together with Koen Leemans from OrangeValley, the team started testing more than 3,000 category pages to increase relevant organic traffic.

The Hypothesis

For this first round of page title testing, we tested four different variants. Research showed that we should focus on transactional intent for the category pages. Each variation consisted of two elements.

The first element, identical across all four variants, was the category name in the singular.* The second element was a call to action (CTA), which differed between the four variants.

*The original page titles used the category name in the plural, but research showed that users tend to search with singular keywords, so we wanted the page title to match.

Original page title: “Stofzuigers koop je online bij Blokker,” which translates roughly to “Buy vacuum cleaners online at Blokker.”

Variant one example: Stofzuiger kopen? Stofzuigers koop je online bij Blokker

CTA translation: “Buy vacuum cleaners online at Blokker”

Variant two example: Stofzuiger kopen? Online bij Blokker

CTA translation: “Online at Blokker”

Variant three example: Stofzuiger kopen? Koop het online bij Blokker

CTA translation: “Buy it online at Blokker”

Variant four example: Stofzuiger kopen? Snel in huis | Blokker

CTA translation: “Fast home delivery”

Please note that the image below shows only the CTAs; the original page titles also include the category names:

[Image: the tested CTA variants]

All variants included the category name (in the singular) and a CTA. We wanted to validate which setup would result in more (relevant) organic traffic to the category pages.
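As a rough sketch of how such title variants could be generated at scale, the snippet below fills in templates modeled on the examples above. The templates and the singular/plural category names are simplified assumptions for illustration; the real titles come from Blokker’s own category data.

```python
# Rough sketch: building the four tested title variants from a category name.
# The templates are modeled on the examples above; singular/plural forms would
# come from the site's own category data, hard-coded here for illustration.
TEMPLATES = {
    "variant_1": "{singular} kopen? {plural} koop je online bij Blokker",
    "variant_2": "{singular} kopen? Online bij Blokker",
    "variant_3": "{singular} kopen? Koop het online bij Blokker",
    "variant_4": "{singular} kopen? Snel in huis | Blokker",
}

def build_titles(singular: str, plural: str) -> dict[str, str]:
    """Return all four page-title variants for one category."""
    return {name: tpl.format(singular=singular, plural=plural)
            for name, tpl in TEMPLATES.items()}

print(build_titles(singular="Stofzuiger", plural="Stofzuigers"))
```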

The Test

We used SplitSignal to set up the four tests. Around 3,000 category pages were selected as either variant or control, which means each test contained around 750 pages. We were able to determine that Googlebot visited 99% of the tested pages.
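For context, here is a minimal sketch of the kind of random assignment this involves, under one possible reading of the setup (four tests of roughly 750 pages each, split into variant and control pages). The actual selection is handled inside SplitSignal, and the URLs and fixed seed are purely illustrative.

```python
# Minimal sketch of one possible reading of the setup: ~3,000 category pages
# split across four tests (~750 each), then each test split into variant and
# control pages. The real selection is handled inside SplitSignal; the URLs
# and fixed seed here are illustrative only.
import random

def split_pages(urls, n_tests=4, seed=42):
    rng = random.Random(seed)              # fixed seed keeps the split reproducible
    shuffled = urls[:]
    rng.shuffle(shuffled)
    tests = {}
    for i in range(n_tests):
        test_pages = shuffled[i::n_tests]  # ~750 pages per test
        half = len(test_pages) // 2
        tests[f"test_{i + 1}"] = {"variant": test_pages[:half],
                                  "control": test_pages[half:]}
    return tests

pages = [f"https://www.example.com/category-{i}" for i in range(3000)]
tests = split_pages(pages)
print({name: {k: len(v) for k, v in groups.items()}
       for name, groups in tests.items()})
```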

The Result

After 21 days of testing, we reviewed the results. Two variants (variant one and variant three) outperformed the modeled control group. For the other two variants, we were unable to detect any significant change, meaning those variants performed neither better nor worse than the modeled control group.

Note that we are not comparing the actual control group pages to our variant pages, but rather a forecast based on historical data, which we compare with the actual data. We use a set of control pages to give the model context for trends and external influences. If something else changes during our test (e.g., seasonality), the model will detect it and take it into account. By filtering out these external factors, we gain insight into the real impact of an SEO change.
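To make this concrete, below is a heavily simplified counterfactual sketch: learn how test-page clicks track control-page clicks before the change, forecast the test window from the control series, and compare that forecast with the clicks actually observed. SplitSignal’s own model is more sophisticated, and the daily click data here is invented.

```python
# Heavily simplified counterfactual sketch (not SplitSignal's actual model):
# learn how test-page clicks track control-page clicks before the change,
# forecast the test period from the control series, and compare with actuals.
import numpy as np

rng = np.random.default_rng(0)

# Invented daily click totals: 60 days pre-test, 21 days test period.
control = 1_000 + 50 * np.sin(np.arange(81) / 7) + rng.normal(0, 20, 81)
test = 0.8 * control + rng.normal(0, 20, 81)
test[60:] *= 1.10                       # pretend the change lifted clicks ~10%

# Fit test ~ a * control + b on the pre-test window only.
a, b = np.polyfit(control[:60], test[:60], 1)

# Counterfactual forecast for the test window vs. what actually happened.
forecast = a * control[60:] + b
actual = test[60:]
lift = actual.sum() / forecast.sum() - 1
print(f"Estimated lift over the modeled baseline: {lift:+.1%}")
```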


We were able to identify the best-performing variant: variant one. It showed a 12.8% increase in organic clicks to the tested pages, at a 99% confidence level.


The Analysis of the Result (Why?)

The outcome of this page title test was determined by users. As mentioned, SEO A/B testing is not just about validating your SEO changes for Google but also for real users. Although Google will always be a factor, Google isn’t alone in tipping the scales.

Knowing what users expect and what users find appealing can make a big difference in how users interact with your search result snippets. For this test, we now know that variant one best suits the target group. Optimizing page titles is not just about including your main keyword(s). There is much more to writing appealing and effective page titles. 

This case study shows that finding ways to stand out and be the most relevant answer is more important than ever. As an SEO, you need to think about and experiment with the different elements that make up a page title. In search, you always compete with other results. If your answer to a search query is not what users want or expect, they can simply click on a different search result.

For Blokker, we now have a new “master page title” template for the category pages to test further. This case study is an excellent example of how testing multiple variants helps you quickly identify the optimal format for SEO-related elements such as page titles. Together with OrangeValley, Blokker has created an SEO A/B test roadmap for continuous testing and improvement. We will publish more interesting insights, so stay tuned.

Have your next SEO split-test analyzed by OrangeValley Agency.
