The results are in. Last month, we asked you, the readers of the MarketingExperiments blog, to write the most effective copy for a Consumer Reports email so that we could test which value factors were most appealing to Consumer Reports donors.
To expand the number of test ideas, we also asked the readers of the Convince & Convert blog.
We’ll get to the results, and the big winner of the MarketingSherpa Summit package, in just a moment. But first, a little more background and a few lessons.
A little background
Every year, prior to MarketingSherpa Summit, with the help of the MarketingExperiments blog audience and the audience of another marketing blog, we run a test with a nonprofit organization.
Partnering with a nonprofit gives us a real audience to test with. More importantly, it allows us to use our collective ability as a community of marketers to create effective messaging for a greater good.
Prior to the test, we work with the nonprofit for a few months, diving into the data, getting an understanding of previous tests and coming up with hypotheses.
Then we reach out to you, tell you what we’ve learned about the nonprofit and give you a chance to win a package for MarketingSherpa Summit (produced by MarketingExperiments’ sister publishing brand). This year, the prize was a ticket to MarketingSherpa Summit 2016 in Las Vegas and a stay at the Bellagio. We award the prize based not on what we just think will be most effective but, instead, on actual results with real customers.
We then use what we learn from this test to run a follow-up test, crafted by the MarketingSherpa Summit audience. This “live test” is launched and the results are reported live onstage at Summit.
The goal of this public experiment is to create a very tangible example of customer-first marketing to help you improve your own marketing efforts while putting the customer first.
Objective: Increase clickthrough to donation landing page.
Problem Statement: The generic value communicated in renewal emails does not match the motivation of why a past donor would want to contribute to Consumer Reports.
Hypothesis: By highlighting specific value claims that communicate the appeal and exclusivity of donating to Consumer Reports, we will increase the overall clarity of value of donating while better matching customer motivations, leading to higher clickthrough to the donation landing page.
Primary Research Question: Which value claim best matches past donors’ motivation to donate again?
Primary Metric: Clickthrough to donation landing page
For this test, we sent an email to Consumer Reports’ previous donors to ask them to renew their membership, which is a donation to Consumer Reports. The goal of the email is to get a click to the donation landing page and ultimately give a renewal gift to Consumer Reports.
After our data sciences team crunched the numbers, we determined that we could run a test with a Control and four treatments and still achieve statistical validity.
“Testing four panels [treatments] plus a control is something we haven’t done as we tend to do one to two test panels at a time. Investing in a test like this where we took the time to come up with four different themes and testing them head-to-head in one test is very valuable to have and has inspired me to challenge ourselves to consider doing more robust testing in one round, rather than straight A/B testing in multiple rounds, to get learnings sooner,” said Dawn Nelson, Director, Fundraising, Consumer Reports.
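For readers curious what running a Control plus four treatments while maintaining statistical validity involves, it is largely a sample size question: with the list split five ways, each arm still has to be large enough to detect a meaningful difference in clickthrough rate. Here is a minimal sketch of a standard two-proportion sample size estimate; the baseline clickthrough rate and minimum detectable lift are hypothetical, and this is not necessarily the calculation the MECLABS data sciences team used.

```python
# A rough, illustrative sample size estimate for an email split test.
# The baseline clickthrough rate and minimum detectable lift below are made up.
from math import ceil

Z_ALPHA = 1.96   # two-sided 95% confidence
Z_BETA = 0.84    # 80% statistical power

def emails_needed_per_arm(baseline_ctr, relative_lift):
    """Approximate emails needed in each arm (the Control or one treatment)."""
    p1 = baseline_ctr
    p2 = baseline_ctr * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil((Z_ALPHA + Z_BETA) ** 2 * variance / (p1 - p2) ** 2)

# Hypothetical: 2% baseline clickthrough, aiming to detect a 20% relative lift
per_arm = emails_needed_per_arm(0.02, 0.20)
print(per_arm, "emails per arm, about", per_arm * 5, "total for a Control plus four treatments")
```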
Here are the Control and the four treatments. The subject line was the same for all of them: “Valued Supporter, Please Read This Important Notice.”
Control: Mixed value proposition
We worked with Consumer Reports and conducted research before we launched this contest to identify which value focuses might be most appealing to donors. The Control that Consumer Reports had been using was a mixture of these value focuses.
Treatment #1: Honest and unbiased reporting
For the treatments, we sought to isolate the value focuses to get an understanding of which elements of value were most impactful for the ideal customer (in this case, Consumer Reports donors).
Here is an explanation of the honest and unbiased reporting value focus:
- Process-Oriented: Not having to rely on advertisers or corporate funding allows Consumer Reports (CR) to deliver honest reporting.
- Consumer-focused
- Independent and unbiased
- Purchases all products that are tested
- Not influenced by corporations
Treatment #2: Personal impact
We described the personal impact value focus as:
- Outcome-Oriented: The research and testing conducted by Consumer Reports has a direct impact on the safety and quality of life for millions of people.
- Makes a difference in the lives of others
- Helps keep me and my family safe
- Helps me make well-informed purchase decisions
- Helps me save money
Treatment #3: Quality of research
We described the quality of research value focus like this:
- Process-Oriented: Trusted, high-quality research is powered by the largest consumer advocacy group in the world, Consumer Reports.
- Consumer Reports has 60 labs plus its own auto test track used to conduct its testing
- CR rates nearly 4,000 products and services each year
- There are nearly 300 CR staff members involved in testing, researching and reporting on all of these products and services. This includes, to name a few, individual testers, scientists, doctors, writers, editors, proofreaders, statisticians, Web developers and market researchers.
Treatment #4: Consumer empowerment
We also wanted to give the entrants in our audience of professional marketers an opportunity to suggest a value focus.
Of the suggested value focuses, we chose consumer empowerment, submitted by Sherice Jacobs of iElectrify, as the option most likely to resonate with Consumer Reports donors.
You can read the full email copy by clicking on the email below, but here is an excerpt from her copywriting to help you understand the value focus she proposed:
“Don’t just make a donation. Make a statement. Help us to continue making sure the best products always rise to the top and quality is always rewarded. From everyone here who works hard to make Consumer Reports the trusted resource it is today — thank you, and let’s keep working together to make 2016 the Year of Consumer Empowerment.”
Results
The Control outperformed all of the treatments. The largest gap was against Treatment 4 (a 29.2% relative decrease in clickthrough at a 99% level of confidence), while the difference against Treatment 3 did not validate (a 6.7% relative decrease at an 85% level of confidence).
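For readers who want to reproduce this kind of readout on their own email tests, here is a minimal sketch of how a relative difference and a level of confidence (LoC) can be computed with a standard two-proportion z-test. The click and send counts are hypothetical, since the actual Consumer Reports numbers are not published here, and this is not necessarily the exact method the MECLABS team used.

```python
# Illustrative only: relative difference and level of confidence (LoC) for
# two email arms, using a standard two-proportion z-test. Counts are made up.
from math import sqrt, erf

def relative_difference(rate_a, rate_b):
    """Relative change of B versus A, e.g. -0.292 for a 29.2% relative decrease."""
    return (rate_b - rate_a) / rate_a

def level_of_confidence(clicks_a, sends_a, clicks_b, sends_b):
    """Two-sided confidence that the clickthrough rates of A and B differ."""
    p_a, p_b = clicks_a / sends_a, clicks_b / sends_b
    pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
    z = abs(p_a - p_b) / se
    return erf(z / sqrt(2))  # e.g. 0.99 means a 99% LoC

# Hypothetical counts for the Control vs. Treatment 4 comparison
control_clicks, control_sends = 500, 25000   # 2.0% clickthrough (illustrative)
t4_clicks, t4_sends = 354, 25000             # ~1.4% clickthrough (illustrative)

print(relative_difference(control_clicks / control_sends, t4_clicks / t4_sends))
print(level_of_confidence(control_clicks, control_sends, t4_clicks, t4_sends))
```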
Level 1 lessons from this public experiment
There are two levels of lessons you can gain from this experiment. At the base level, there is the analysis you must conduct with any test — why did one treatment perform better than the others? Or, in this case, why did the Control win?
“The big takeaway for me on our customer (and something we have suspected and need to test further) is we definitely can talk in a broad sense to our donors and see good results. But if we can do segmentation and talk to those segments in very specific language — we may see better conversions,” Bruce Duesterhoeft, Program Manager, Online Fundraising, Consumer Reports, said.
This public experiment was essentially a mini Research Partnership between the team at Consumer Reports and MECLABS Institute (parent research organization to both MarketingExperiments and MarketingSherpa).
According to an analysis from the MECLABS team (major props to Melissa Mike, Jesse Kraker, Derek Snow, Jonathon Yates and Jordan Baker for their work on this experiment) based on the patented MECLABS conversion heuristic, here are a few key lessons:
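(For context, the MECLABS conversion heuristic referenced here is commonly expressed as C = 4m + 3v + 2(i - f) - 2a, where C is the probability of conversion, m is the motivation of the user, v is the clarity of the value proposition, i is incentive, f is friction and a is anxiety. The Value, Motivation and Friction groupings below correspond to the v, m and f factors.)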
Value
- The value proposition for the Control featured multiple value statements throughout the email, which could have appealed to the motivations of a wider range of donors than the treatments, each of which focused exclusively on one value statement.
- The Control and Treatment 3 (focused on quality of research) increased clarity about Consumer Reports’ testing by being specific about the testing process and about how Consumer Reports would use donations.
- The other treatments did not provide specific value points, which could have hurt the clarity of the value proposition.
- The value proposition of Treatment 4 (consumer empowerment) did not provide as many specific evidentials to support the claim of consumer empowerment, which reduced the clarity and appeal of the email and caused it to perform the worst of all treatments.
- The opening statement of the Control may have provided more relevance to customers and provided continuity from previous customer touchpoints.
Motivation
- As noted under Value, the Control’s mix of value statements could have matched the motivations of a wider range of donors than any of the single-focus treatments.
- Across the Control and all treatments, the most-clicked links were the red “Donate” button CTAs at the top and bottom of the email. However, the next most-clicked link was the membership benefits link in the footer of the email.
- This suggests that length-based friction may not be a deterrent for this audience and that many recipients are motivated to learn about the benefits membership provides.
- Those motivated by the value of the testing and by personal impact are the most likely to donate more money: the personal impact value factor generated 41% more revenue per email delivered than the honest and unbiased value factor, and the quality of research value factor generated 38% more. However, there are fewer Consumer Reports donors motivated by altruistic outcomes than by the product testing process. (A quick sketch of how this metric is calculated follows this list.)
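For clarity, “revenue per email delivered” is simply total donation revenue divided by the number of emails delivered, and the percentages above are relative differences between value factors. Here is a minimal worked sketch; the dollar figures are made up, since the actual Consumer Reports revenue numbers are not published here.

```python
# Illustrative only: "revenue per email delivered" and the relative lift
# between two value factors. All figures are hypothetical.
def revenue_per_email(total_revenue, emails_delivered):
    return total_revenue / emails_delivered

honest_unbiased = revenue_per_email(5000, 25000)   # $0.20 per email (made up)
personal_impact = revenue_per_email(7050, 25000)   # ~$0.28 per email (made up)

relative_lift = (personal_impact - honest_unbiased) / honest_unbiased
print(f"{relative_lift:.0%} more revenue per email delivered")  # 41%
```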
Friction
- Both the Control and Treatment 3 used very similar layouts that mix bullet points, paragraphs and text links.
- This layout reduced difficulty-based friction associated with longer emails, providing clarity for customers trying to absorb the information being presented and understand the overall ask of the email.
“Learning about the value proposition really opened my eyes to looking at our work (and other marketing, too) with a different eye,” Bruce said. “We definitely will be reviewing all of our emails and landing pages using the probability of conversion equation [heuristic] moving forward.”
Level 2 Lesson: Challenge the model
The previous lessons apply specifically to this test, but there is a bigger lesson to be gained from this public experiment.
When I first approached Dawn and Bruce about partnering on this test, they were already successful marketers and very experienced testers.
As Bruce said in the blog post that launched this experiment, “Since 2013 we have grown [donations] from $800,000 a year to looking at over $2.7M this year.”
But they didn’t settle. When approached with a very out-of-the-box idea, they jumped in wholeheartedly. And I can attest, throughout this process, they made many difficult decisions that challenged their current understanding of their customers.
That’s no easy task. We work so hard on our brands, and we think we know our customers. Perhaps we’ve already tested heavily with our customers. To publicly go against the wisdom (we think we have) already gained about our customers in a campaign that needs to generate revenue goes against every bone in our marketing bodies.
But never be complacent. The times change. People change. Our competitive landscape changes. Always challenge your current model. These difficult challenges are where you will find the next opportunities.
You might sometimes get a loss. But you will always get a greater understanding of the customer (if you experiment correctly). And, in the long term, that is how you achieve sustainable business results.
“Challenge what you think you know by doing very specific testing around messaging (first) and then presentation (second),” Dawn advised. “Test the boundaries to ensure there are very distinct differences in your messaging to allow you to really discern what is resonating and what definitely is not. You can learn as much from the ‘losers’ as you can from the ‘winners.’”
“Marketers need to leave their ego at the door and look at their messaging with a different eye. Our instincts are not always the consumers’,” Bruce suggested.
And the contest winner is …
While none of the treatments outperformed the Control, we were also holding a copywriting contest, as I mentioned previously. The winner of the contest is the writer of the best-performing treatment, even if it didn’t beat the Control. After all, these intrepid copywriters were creating a treatment against a Control created by experienced marketers with deep insider knowledge of their brand.
When comparing the treatments against each other, Treatment 3 (quality of research) outperformed all of the other treatments, with its largest validated increase in clickthrough coming against Treatment 4 (a 24.1% increase at a 99% LoC).
So congratulations to the writer of Treatment 3 — Ahnna Pildysh, Product Marketing Freelancer, Malaspina Labs. Ahnna won a ticket to MarketingSherpa Summit 2016 and a stay at the Bellagio Resort.
“Since Treatment 3 themed around ‘quality of research’ was second to the Control in terms of clickthroughs and donations, I learned that this theme resonates and is one we haven’t emphasized as much in past communications and perhaps could emphasize more in our future messages, along with product testing,” Dawn said.
When asked about her approach, Ahnna said, “I chose to use two value focus options which were based on the market research doc that was provided. I focused on the value trigger points of the donor audience and wanted to deliver simple messaging that highlighted how Consumer Reports delivered on those expectations. The market research doc was really helpful in understanding what the audience cared about and helped me narrow down my focus and keep the value prop on target.”
Congratulations again to Ahnna, and thanks to all of the marketers and copywriters who took the time to share their wisdom and provide insights Consumer Reports can use to better serve donors and consumers.
“This contest piqued my interest simply because it gave me an opportunity to test my ability to craft a value prop. I’m a big believer that everything should be tested to gain useful learnings and continuously iterate,” Ahnna said.
You can follow Daniel Burstein, Director of Editorial Content, MECLABS Institute, @DanielBurstein.
You might also like
Top Takeaways of MarketingSherpa Summit 2016 webinar [From MarketingSherpa]
The Writer’s Dilemma: How to know which marketing copy will really be most effective
Subject Line Test: 125% more unique clickthroughs
MECLABS Online Testing online course [From MECLABS, the parent research organization of MarketingExperiments]