On June 15th, EMI was on hand to pick up four awards for our B2B work from the New England Direct Marketing Association. In the first event of the night, we won gold in B2B print for financial technology leader Intuit’s Digital Insight subsidiary. EMI’s interactive work for the same client took silver, and the integrated lead gen campaign took home a bronze. Our DM for upstart Zipcar’s B2B launch also took bronze. Four awards out of more than 430 competitive submissions! Two members of our award-winning creative team, Jen and Nathan, took home the statues. We love the awards, but value results for our clients even more.
At the 2010 NEDMA conference, EMI gave a presentation on how to take an iterative approach to testing to optimize response in a B2B lead generation environment. We used case studies to illustrate two approaches to testing and then refining campaigns through learning, optimization, and subsequent rollouts.
Case Study 1 covered a series of product promotion campaigns, spanning both email and direct mail channels, over the course of two years. Each campaign was optimized based on key learnings from preceding campaigns (e.g., audience targeting, positioning, response channels, and incentives). We discussed what we learned from both success and failure, and how, over time, we’ve developed a knowledge base that allows us to more effectively and efficiently target our audience with messages and creative approaches that drive response. The highlights of this case study were the 80% cost per lead (CPL) reduction from the 2008 to the 2009 direct mail campaigns and the identification of the key drivers of email campaign performance: list selection and a clear, simple call-to-action.
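For readers who want to see the CPL math spelled out, here is a minimal sketch. All dollar and lead figures below are hypothetical placeholders, not our clients’ actual numbers; the post only reports the resulting 80% reduction.

```python
# Cost per lead (CPL) and year-over-year reduction, with hypothetical figures.

def cost_per_lead(total_cost, leads):
    """CPL = total campaign cost / number of leads generated."""
    return total_cost / leads

cpl_2008 = cost_per_lead(50_000, 100)  # hypothetical 2008 campaign: $500/lead
cpl_2009 = cost_per_lead(30_000, 300)  # hypothetical 2009 campaign: $100/lead

reduction = (cpl_2008 - cpl_2009) / cpl_2008
print(f"CPL reduction: {reduction:.0%}")  # these placeholder numbers yield 80%
```

The same two-line calculation works for any channel; what changed our results was not the formula but the targeting and creative refinements feeding into it.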
Case Study 2 covered a true 4-cell test. We tested two approaches to messaging (broad versus niche) with two different creative formats (letter package versus self mailer). Our measure for evaluation was CPL, and a clear winner emerged with a CPL 51% lower than the overall average. Interestingly, the winning cell paired the format that was more expensive on a per-piece basis with the broader positioning. We then reprinted the winning approach and dropped a larger quantity to a larger audience. That effort resulted in a 1.8% response rate and a CPL 27% lower than the CPL for the test.
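The evaluation step of a 4-cell test like this can be sketched in a few lines. The cell structure (2 messages x 2 formats) matches the test described above, but every cost and lead count here is hypothetical, so the margins it produces are illustrative only.

```python
# Evaluating a 4-cell test (message x format) by cost per lead.
# All figures are hypothetical placeholders for illustration.

cells = {
    ("broad", "letter"):      {"cost": 12_000, "leads": 40},
    ("broad", "self_mailer"): {"cost": 10_000, "leads": 20},
    ("niche", "letter"):      {"cost": 12_000, "leads": 24},
    ("niche", "self_mailer"): {"cost": 10_000, "leads": 16},
}

# CPL per cell, plus the blended CPL across all four cells.
cpl = {cell: v["cost"] / v["leads"] for cell, v in cells.items()}
overall = (sum(v["cost"] for v in cells.values())
           / sum(v["leads"] for v in cells.values()))
winner = min(cpl, key=cpl.get)

print(f"Overall CPL: ${overall:.2f}")
print(f"Winner: {winner} at ${cpl[winner]:.2f} "
      f"({(overall - cpl[winner]) / overall:.0%} below average)")
```

Note that the winner is judged against the blended average rather than cell by cell; that is what makes a single “51% lower than the overall average” figure meaningful for the rollout decision.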
At the end of the presentation, we took questions from the audience, which consisted of a range of industry professionals. We wanted to share a couple of those questions here because they raised important considerations.
Q. Do you ever find a piece of content that just works so well you keep going to it?
A. Yes. There are some restrictions to this approach, however. The content has to be “evergreen,” meaning that the topic stays relevant even if market conditions change. Even with content that fits this requirement, the audience overlap among publication email lists and direct mail lists means you eventually hit a saturation point, and response inevitably begins to decline. When this happens, you need to retire that go-to campaign and replace it with something new.
Q. How do you determine list quality when you’re deciding which publication lists to rent?
A. Testing. While some lists can be ruled out based on available demographic information or the reputation of the publication, in our experience, list quality and price do not always go hand in hand. We have gotten very strong response and high-quality leads from inexpensive lists, and very poor response from some of the most expensive lists we’ve used. There are also general audience preferences by list that drive response for certain types of content over others, so testing lists with various pieces of content is also important for determining list preferences.