Set up a custom experiment

You can create a custom experiment from your original campaign and compare how your experiment performs against your original campaign over time. The experiment shares your original campaign’s traffic (and budget) and lets you test changes to your campaign so that you can make more informed decisions on which tactics give you a better return on investment.

This article explains how to set up a custom experiment. You can also set up ad variations and set up video experiments. Learn more about the Experiments page.

Things to note about custom experiments

Keep in mind

If you make changes to your original campaign, those changes won’t be reflected in your experiment. Making changes to either your original campaign or your experiment while the experiment is running may make it harder to interpret your results.

Instructions

Set up a custom experiment

  1. Go to Experiments within the Campaigns menu.
  2. Select the plus button at the top of the “All experiments” table, then select Custom.
  3. Select Custom experiment (experiment with multiple campaign optimizations for any campaign), select a campaign type, then select Continue.
  4. In the “Set up the experiment” section, enter a name for your experiment and, optionally, a description. Your experiment shouldn’t share a name with any of your campaigns or other experiments.
    1. Choose the original campaigns you want to test in your experiment. We’ll display the campaigns you selected and automatically create the experiments.
    2. Label your test campaigns with a suffix to differentiate them from the original campaigns. We’ll append the suffix to all of your test campaigns. For example, “Campaign 2 (Holiday_test)”.
    3. Click Continue to finish the setup of the experiment. The next page will let you make changes and schedule the experiment.
  5. To make changes to the experiment, first select up to 2 goals that measure its success. For each goal, choose a metric and a direction. For example, select “Clicks” and “Increase” for the first goal.
    Note: These success metrics will be emphasized in the campaign report; however, you’ll still find all of the other metrics reported as usual.
    1. In the “Experiment split” section, select how you want to split traffic and budget between your original campaign and your experiment. We recommend a 50% split to provide the best comparison between the original and experiment campaigns.
    2. Select the “Advanced” options you want to use in the experiment:
      • Cookie-based (recommended) randomly assigns users to either your experiment or original campaign and ensures that a given user only views either the original or the experiment.
      • Search-based randomly assigns users to either your experiment or original campaign every time a search occurs. If a user runs multiple searches, the same user could view both the experiment and your original campaign. This option may get statistically significant results faster than a cookie-based split.
  6. In the “Schedule the experiment” section, select the start date and duration in “Experiment dates”. Google Ads determines the end date from your start date and duration.
  7. Select Save to finish creating the experiments. Your experiments are now created and ready to run.
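
If you manage campaigns programmatically, the same flow can be scripted through the Google Ads API. Below is a minimal sketch using the official google-ads Python client, modeled on the API’s Experiment and ExperimentArm resources; the customer ID, campaign ID, names, and dates are placeholders, and you should verify field and method names against the API version you target.

```python
from uuid import uuid4

from google.ads.googleads.client import GoogleAdsClient

# Placeholder IDs -- replace with your own account and campaign.
CUSTOMER_ID = "1234567890"
BASE_CAMPAIGN_ID = "9876543210"

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
experiment_service = client.get_service("ExperimentService")

# Step 1: create the experiment shell (name, optional description, suffix).
experiment_op = client.get_type("ExperimentOperation")
experiment = experiment_op.create
experiment.name = f"Holiday test #{uuid4()}"  # must not collide with existing names
experiment.description = "Tests holiday-specific changes."
experiment.suffix = "(Holiday_test)"  # appended to each test campaign's name
experiment.type_ = client.enums.ExperimentTypeEnum.SEARCH_CUSTOM
experiment.status = client.enums.ExperimentStatusEnum.SETUP
experiment.start_date = "2025-12-01"  # placeholder dates, assumed YYYY-MM-DD
experiment.end_date = "2025-12-31"

response = experiment_service.mutate_experiments(
    customer_id=CUSTOMER_ID, operations=[experiment_op]
)
experiment_resource_name = response.results[0].resource_name

# Step 2: create a control arm (the original campaign) and a treatment arm
# with the recommended 50/50 traffic split. The treatment arm generates a
# draft test campaign that you can modify before the experiment starts.
campaign_service = client.get_service("CampaignService")
arm_service = client.get_service("ExperimentArmService")

control_op = client.get_type("ExperimentArmOperation")
control = control_op.create
control.experiment = experiment_resource_name
control.name = "control arm"
control.control = True
control.traffic_split = 50
control.campaigns.append(
    campaign_service.campaign_path(CUSTOMER_ID, BASE_CAMPAIGN_ID)
)

treatment_op = client.get_type("ExperimentArmOperation")
treatment = treatment_op.create
treatment.experiment = experiment_resource_name
treatment.name = "experiment arm"
treatment.control = False
treatment.traffic_split = 50

arm_service.mutate_experiment_arms(
    customer_id=CUSTOMER_ID, operations=[control_op, treatment_op]
)

# Step 3: schedule the experiment so it starts running on its start date.
experiment_service.schedule_experiment(resource_name=experiment_resource_name)
```

Scheduling kicks off test-campaign creation asynchronously, so any creation errors are reported after the fact, as shown in the next section.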

Fix issues creating an experiment

In some instances, you may encounter an error when trying to create an experiment. Here are a few reasons why this could happen, along with steps you can take to fix these errors.
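
If you create experiments through the API, the asynchronous errors behind a failed creation can also be listed directly. A small sketch, assuming the ExperimentService.list_experiment_async_errors method available in recent API versions; the resource name is a placeholder.

```python
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
experiment_service = client.get_service("ExperimentService")

# Placeholder resource name of the experiment that failed to create.
EXPERIMENT = "customers/1234567890/experiments/111111111"

# Each entry is a google.rpc.Status describing one error that occurred while
# the experiment's test campaigns were being generated asynchronously.
for status in experiment_service.list_experiment_async_errors(
    resource_name=EXPERIMENT
):
    print(f"[{status.code}] {status.message}")
```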

What you can do when you're ready to end your experiment

If you’re happy with experiment results, you can apply your experiment to the original campaign or convert your experiment into a new campaign.

In both instances, your experiment’s performance data will be preserved. Learn more about how to apply your experiment.
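
Both outcomes are also exposed in the Google Ads API: promote_experiment applies the experiment’s changes to the original campaign, while graduate_experiment turns the test campaign into a standalone campaign. A hedged sketch follows; the resource names are placeholders, and graduating requires mapping each experiment campaign to its own non-shared budget.

```python
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
experiment_service = client.get_service("ExperimentService")

EXPERIMENT = "customers/1234567890/experiments/111111111"  # placeholder


def apply_to_original_campaign():
    # Applies the experiment's changes back to the original campaign.
    # Promotion runs asynchronously; poll the experiment afterwards to
    # confirm it finished.
    experiment_service.promote_experiment(resource_name=EXPERIMENT)


def convert_to_new_campaign():
    # Graduates the test campaign into a standalone campaign. Each
    # experiment campaign needs its own (non-shared) budget.
    request = client.get_type("GraduateExperimentRequest")
    request.experiment = EXPERIMENT
    mapping = client.get_type("CampaignBudgetMapping")
    mapping.experiment_campaign = (
        "customers/1234567890/campaigns/222222222"  # placeholder
    )
    mapping.campaign_budget = (
        "customers/1234567890/campaignBudgets/333333333"  # placeholder
    )
    request.campaign_budget_mappings.append(mapping)
    experiment_service.graduate_experiment(request=request)
```

Run one or the other depending on which outcome you want; both preserve the experiment’s performance data.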

If you prefer to simply end your experiment, you can change its end date so that it stops running at the close of that day. You can also select Edit in the upper-right corner of the page and select End now. Learn more about making changes to your experiments.
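
The API equivalent of End now is end_experiment. A minimal sketch, with a placeholder resource name.

```python
from google.ads.googleads.client import GoogleAdsClient

client = GoogleAdsClient.load_from_storage("google-ads.yaml")
experiment_service = client.get_service("ExperimentService")

# Stop the experiment immediately (equivalent to selecting "End now").
experiment_service.end_experiment(
    experiment="customers/1234567890/experiments/111111111"  # placeholder
)
```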

Note: Creatives in campaigns that are under the “setup” or “creating” status won't go through the review process. As a result, the creative status will stay “Under Review” until the draft is applied to the original campaign.
