Split Testing / Comparing Results / Winner

brewhouse
edited December 2016 in Email Marketing

Hi All -

Maybe I’m just overlooking this, but I can’t seem to find anywhere within the reporting interface that shows a comparison of the results when sending a split test campaign.

Is there a screen/page that will show Campaign A vs Campaign B with any of the following metrics:

  • Open Rate
  • Click Rate
  • Bounce Rate
  • Etc.

I hope I’m just overlooking this and that it is an easy find. I think this information would be very useful for determining subject lines and from names for future campaigns.

Thanks!

Comments

  • Brian
    Brian Chicago, IL
    edited December 2016

    Hi!

    If you go to the campaign’s report, you’ll see a filter in the top right corner that allows you to switch between versions.

  • pioneersofchange
    edited June 2017

    Hi there,

    I’m also looking for a results overview.

    We chose “Determine (and send using) the winner” with three different subject lines. We have a winner, fine, but I cannot find the open rates from the moment the system made the decision.

    It would be interesting to know how close the numbers were at that moment. No comparison is possible via the “Filter split test campaign” option, because for the winner the other 70% of emails have already been sent…

    Hopefully I am overlooking some detail here?

    Thank you for your support!

  • marketingnutz
    edited June 2017

    I would also love to see the answer to this question! :slight_smile:

  • gordoncreative
    edited January 2018

    Yet another fail by AC. You can’t even see which version technically had the better open rate, because the numbers are skewed. If you filter by individual campaign it will show you the open rates, but as the last poster said, you can’t determine what they really were.

  • costengineering
    edited July 2019

    Any improvement with this one? I still can’t see the results of the sample.

  • alexwriter2003
    edited September 2019

    I completely agree …

  • erni
    edited October 2019

    Hello, I am experiencing the same issue - I am also missing the comparison of open rates across versions at the time the winner was determined. This is important data for understanding what preferences our target audience has! Could you please include that in the testing results ASAP?

  • asentria
    edited December 2019

    I agree that the “Campaign Overview Report” with the graph is not helpful at all, but you can click on each row in the SUMMARY tab and it’ll take you to a page with more specific data points, where you can filter the split test campaigns and manually compare the numbers. :persevere:

    I’ve contacted Support to see if there is an easier way; I will update this post if I have more news.

  • bos484
    edited March 2020

    I would also like a similar reporting feature. It would be nice to see what the results were at the time the system chose a winner. For example, how do I know whether the winner was chosen by a narrow margin, and whether it is worth re-testing or using the results to shape future email strategy…
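
    One way to judge whether a margin is meaningful, if you can pull the raw send and open counts for each version out of the per-campaign reports, is a two-proportion z-test. A minimal sketch, with entirely hypothetical counts:

    ```python
    import math

    def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
        """Two-proportion z-test for the difference between two open rates."""
        p_a = opens_a / sends_a
        p_b = opens_b / sends_b
        # Pooled proportion under the null hypothesis that both rates are equal
        p = (opens_a + opens_b) / (sends_a + sends_b)
        se = math.sqrt(p * (1 - p) * (1 / sends_a + 1 / sends_b))
        z = (p_a - p_b) / se
        return p_a, p_b, z

    # Hypothetical numbers: 300 opens of 1,500 sends vs. 270 opens of 1,500 sends
    p_a, p_b, z = two_proportion_z(300, 1500, 270, 1500)
    print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
    ```

    With these made-up counts, a 20% vs. 18% open rate gives z ≈ 1.40, short of the ~1.96 threshold for 95% confidence, so a “winner” like that could easily be noise and worth re-testing.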

  • pesmail
    edited April 2020

    The ActiveCampaign reporting views really stink. How on earth are they so far behind on this? I didn’t even think to look at these options before I signed up, because they are so thoroughly standard with every vendor now.

  • chelseanicole
    edited May 2020

    I’m also looking for a way of analyzing split test data (at the time the winner was determined). I’m actually really surprised there’s no way of seeing this information. :frowning:

    I’m really disappointed that this was suggested years back and there’s still no progress, when it is such an obviously important thing. Also, AC’s marketing on the Split Test page makes it seem like you can easily view this: https://www.activecampaign.com/email-marketing/ab-testing – which is very misleading to new customers. Any updates?

  • ihtspirit
    edited October 2020

    I’m wondering if anyone else is having trouble with the split test division in an ABCD test scenario. I’ve been doing these tests for a while, and it’s becoming more common that, during the test period, A, B, and C all deliver to the same number of contacts while D somehow gets shorted, yet the shortage isn’t visible until after the send takes place. Yesterday, for example, I checked the report progress twice during the test window. The first time, A, B, C, and D had all been sent to the same number of contacts. The second time, perhaps 45 minutes into the hour, D had dropped by 400 sends.

    This skews my results because we determine the winner by 1-hour open rate, and D will always have a higher open rate because it didn’t send to as many contacts. I have asked support many times and cannot get a straight answer as to why, if I’m sending to a full distribution of 15,000 but testing each version at 10%, A, B, and C send to 1,500 contacts but D does not. Somewhere there is a glitch, either in the sending or the reporting.

    I don’t see this issue when doing simpler AB tests and haven’t noticed it on ABC tests.
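
    For what it’s worth, the inflation can be shown with quick arithmetic using the numbers above; a small sketch (the 300-open figure is hypothetical, everything else is from my scenario):

    ```python
    # With a 15,000-contact list and a 10% test slice per version,
    # each of A-D should get 1,500 sends.
    total_contacts = 15_000
    test_fraction = 0.10
    expected_sends = int(total_contacts * test_fraction)  # 1,500 per version

    # If D is shorted by 400 sends, the same raw open count
    # produces a noticeably higher open rate for D.
    opens = 300  # hypothetical: same opens for each version
    rate_full = opens / expected_sends           # A, B, C
    rate_shorted = opens / (expected_sends - 400)  # D
    print(f"A/B/C: {rate_full:.1%}  D: {rate_shorted:.1%}")
    ```

    So even with identical engagement, the shorted version “wins” on open rate, which is exactly why a 1-hour-open-rate winner can’t be trusted when the denominators differ.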

  • artsummits
    edited January 6

    Totally agree – why can’t we see what the split test results were at the time the winner was chosen? This is standard even in apps like MailChimp that have less functionality than AC. I need to know how close a split test was… was it a 1% difference, or a 10% difference?
