How to Use Google Analytics Experiments


Google Analytics is an intuitive, widely used platform for analyzing search engine optimization, user behavior, and much more. With a number of features that can be leveraged to help organizations make informed, data-driven decisions about the design of their site, it is one of the most popular analytics platforms in the world.

One feature users can leverage is Experiments, a content-testing framework that enables website administrators to test practically any website change against its impact on a specific objective. For example, imagine a key piece of content is swapped out for a section with more SEO-friendly keywords; Experiments allows administrators to test both versions simultaneously and determine which performs better against specific, measurable objectives, such as a lower bounce rate or an increased conversion rate.


By leveraging data to inform your website content, strategy, and design, you can improve your search engine ranking and the continuity of the user experience. This is accomplished by isolating key pieces of information to identify which changes achieve the desired effects. With direct access for administrators to view an experiment in real time, and the ability to predict a winning variation using cumulative data in Google Analytics, Experiments offers an intelligent, intuitive approach to web design.


Google Analytics employs a three-part process to implement Experiments for your website: configuring the experiment, running it, and implementing it on the property.


There are a number of factors to determine before configuring an experiment for your content.

Define the variations: Based on the content you wish to test, define your variations according to your own preset parameters. A variation can be as small as a single element on a page or far more complex. You simply need to provide a name for each variation, with its Variation ID given by its position in an indexed list. Do note that once the list is set, the order cannot be modified.
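As an illustrative sketch, the variations list inside an experiment body might look like the following (the names and URLs here are placeholders, not part of any real experiment); the indexed-list rule matters because the Variation ID is simply the variation's position in the list:

```javascript
// Illustrative variations list for an experiment body.
// Names and URLs are placeholders.
const variations = [
  { name: 'Original', url: 'https://www.example.com/' },         // Variation ID 0
  { name: 'New hero', url: 'https://www.example.com/?v=hero' },  // Variation ID 1
];

// The Variation ID is the item's index, so reordering the list
// after the experiment is created would change every ID.
const variationId = (name) => variations.findIndex(v => v.name === name);

console.log(variationId('New hero')); // → 1
```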

Set the Serving Framework: Depending on your preference, Google Analytics can handle optimization for the experiment, or you can use an outside service. In Google Analytics, you configure this option by setting the “Serving Framework” field using the Management API, which affects the value of the experiment's JavaScript snippet property. For more information, see Google's Management API reference for Experiments. Options include:

  • Redirect (default): This framework creates the experiment via the Management API, relying on Google Analytics to perform client-side redirects in order to display the specific variation.
  • API: In this framework, you choose the specific variations and control the display of them for users, but still employ Google Analytics to optimize the experiment.
  • External: With a 3rd party optimization platform, you choose the specific variations, optimize the experiment, and report only the chosen variation to Google Analytics.
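In the Management API, this choice is a single field on the experiment resource. A minimal sketch of an experiment body follows; the `servingFramework` values (REDIRECT, API, EXTERNAL) come from the API, while the name, status, metric, and URLs shown are placeholders for illustration:

```javascript
// Illustrative experiment body for a Management API
// experiments insert call. All concrete values here are
// placeholders except the servingFramework options.
const experimentBody = {
  name: 'Homepage headline test',  // placeholder name
  status: 'READY_TO_RUN',
  objectiveMetric: 'ga:bounces',   // the metric to optimize
  servingFramework: 'REDIRECT',    // or 'API' / 'EXTERNAL'
  variations: [
    { name: 'Original', url: 'https://www.example.com/' },
    { name: 'Variant',  url: 'https://www.example.com/b' },
  ],
};

// With EXTERNAL, a third-party platform serves and optimizes
// the variations, and only the chosen variation is reported
// back to Google Analytics.
console.log(experimentBody.servingFramework);
```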

Determine Goals & Metrics: If you set the Serving Framework to API or Redirect, the experiment's goals and metrics are handled internally by Google Analytics (if set to External, they are configured externally instead). In Google Analytics, you will need to define which experiment objective you would like to optimize, either by minimizing or maximizing a Goal or a predetermined metric, such as lowering bounce rate or increasing conversions. Remember that Goals must already be set up in Google Analytics if you choose to use them.


You can run the experiment using the built-in interface or programmatically via the Management API. Using the built-in interface, you choose the objective, then identify the original and variation pages for the comparative analysis. Next, you add the experiment code and verify it within the Experiments program. Lastly, you can adjust the experiment's settings to optimize it before starting the experiment itself. You can modify the experiment while it is running if needed, stop it at any time, and even copy an experiment. For step-by-step instructions, see this helpful guide. Once the experiment is running, you use the experiment ID, along with the list of variations and their unique Variation IDs, to send information to Google Analytics using a collection API/SDK. Experiments can run anywhere from 3 days to 3 months, depending on the objective goals and metrics required.
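When reporting an exposure yourself (for example with analytics.js), the experiment ID and the variation index are sent together as a single dot-joined value. A minimal sketch, with a placeholder experiment ID; the helper function is illustrative, not part of any Google library:

```javascript
// Build the dot-joined "<experiment ID>.<variation index>"
// value used when reporting an experiment exposure.
function expFieldValue(experimentId, variationIndex) {
  return experimentId + '.' + variationIndex;
}

console.log(expFieldValue('ExampleExpId123', 1)); // → "ExampleExpId123.1"

// In the browser, this value would accompany a hit, e.g.:
//   ga('set', 'exp', expFieldValue(experimentId, chosenVariation));
//   ga('send', 'pageview');
```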


Experiments helps choose which variation to show a user based on the user's previous behavior. Returning users are always shown the same variation on each visit, giving continuity to the experiment data; a new user is shown one of the variations based on assignment within the experiment. Once a variation has been chosen for a user, the Experiment ID and Variation ID are sent to Google Analytics along with the relevant data for analysis.

To implement the experiment on the property, you will need to address the following specific concepts to measure the experiment in Google Analytics:

  1. Choosing a variation for new users: The serving framework determines which variation is shown to a new user. If the serving framework is API or Redirect, Google Analytics chooses the variation using a multi-armed bandit approach.
  2. Choosing a variation for returning users: Returning users are automatically shown the same variation each time they return, to ensure continuity of the experiment. The user's Variation ID is stored anonymously in a secure, accessible location so it can be retrieved on subsequent visits.
  3. Sending data to Google Analytics: To ensure complete accuracy, data should be sent to Google Analytics every time a user (whether new or returning) is exposed to a variation. Each time, both the Experiment ID and the Variation ID must be sent as parameters attached to specific hits. Google's collection API/SDK guidelines outline exactly how to initiate data processing, which continues until the experiment's manual stop point is reached or until Google Analytics declares a winner.
  4. Determining an outcome: The serving framework configuration informs how the winning variation is determined. For an External serving framework, the outcome depends on the externally defined objective, so the external service should be used to determine the best-performing variation. For the API and Redirect serving frameworks, the winning variation is determined internally by Google Analytics: with a defined objective and measured conversion events and metrics, Google Analytics declares the winner automatically once the goal/objective is reached.
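The first two steps above can be sketched as a small serving routine. This is an illustrative stand-in only: a real implementation persists the assignment client-side and Google's own multi-armed bandit does the optimizing, whereas here an in-memory map and a plain weighted-random pick simulate the stickiness:

```javascript
// Illustrative sketch of sticky variation serving. The Map is a
// stand-in for the client-side persistence a real implementation
// would use.
const assignments = new Map(); // userId → variation index

function chooseVariation(userId, weights) {
  // Returning user: always serve the stored variation.
  if (assignments.has(userId)) return assignments.get(userId);

  // New user: weighted-random pick, standing in for the
  // multi-armed bandit used by the API/Redirect frameworks.
  const total = weights.reduce((a, b) => a + b, 0);
  let r = Math.random() * total;
  let chosen = weights.length - 1;
  for (let i = 0; i < weights.length; i++) {
    r -= weights[i];
    if (r <= 0) { chosen = i; break; }
  }
  assignments.set(userId, chosen);
  return chosen;
}

// A returning user keeps the same variation on every visit.
const first = chooseVariation('user-42', [0.5, 0.5]);
console.log(first === chooseVariation('user-42', [0.5, 0.5])); // → true
```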

The final data from the experiment is displayed in the Experiments report in the web interface, or is available as experiment dimensions and metrics in the Core Reporting API.
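For example, a Core Reporting API query can break metrics down by the experiment dimensions `ga:experimentId` and `ga:experimentVariant`. A minimal sketch of such a query's parameters (the view ID and date range are placeholders):

```javascript
// Illustrative Core Reporting API (v3) query asking for sessions
// and bounce rate, split by experiment and variant. The view ID
// and dates are placeholders.
const params = new URLSearchParams({
  ids: 'ga:12345678',                                 // placeholder view ID
  'start-date': '2015-01-01',
  'end-date': '2015-01-31',
  metrics: 'ga:sessions,ga:bounceRate',
  dimensions: 'ga:experimentId,ga:experimentVariant',
});

console.log('https://www.googleapis.com/analytics/v3/data/ga?' + params.toString());
```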


At WDG, we integrated Google Analytics Experiments as part of our redesign efforts for Federal News Radio, a large-scale publication website covering non-partisan federal agency news. To increase user retention rates and traffic to its website, Federal News Radio sought an informed design that leveraged a data-driven decision process for the design revision. The new design needed to increase the amount of content visible “above the fold,” thereby improving overall user engagement and fundamentally improving the navigation architecture. Using Google Analytics Experiments, our WordPress development team built a fully redesigned second home page with alternative features to test usability.

Using the variation options, Experiments measured the user journey in terms of bounce rate, time on page, horizontal traffic, and other retention factors. The two pages could be edited simultaneously from the same point in the WordPress backend, meaning the front page retrieves its data from the same place in the CMS for both versions. This was accomplished through careful development adaptations based on the client's existing system, to simplify processes for Federal News Radio administrators. After three weeks of testing, the experiment reported a 100% probability that the new version would be the winner, and it was fully implemented into the redesign. Thanks to our strategic design and development efforts for the client, traffic has greatly increased: Federal News Radio's analytics show a 2.65% lower bounce rate, a 72% increase in average session duration, 25% more page views, and 26% more pages viewed per session. See our case study here.

Want to find out more about using Google Analytics Experiments to make data-driven decisions in your website redesign? Contact our development team or email us at [email protected].
