Every marketing, brand, and strategy plan has a specific audience, and the plan is built out to reach that audience through multiple tactics. This structure is what ultimately drives the campaign to the right people. But how do we make sure that once the content reaches them, they will engage with it? How do we choose the right creative for our audience?
The Élan team takes a holistic approach to narrowing down and selecting creative. Our initial research might include industry data, past and future campaigns, additional audience insights, the platforms the creative will be served on, landing page designs, and more. Throughout the campaign, we also monitor which creative is working and optimize the remainder of the campaign accordingly.
However, the best way to truly strategize and maximize media dollars is to take a test-and-learn approach. The most common form of this is A/B testing. A/B testing matters even if you've previously run very similar campaigns: audiences change, and a fresh test reflects this, giving you the most up-to-date and accurate insight.
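At its core, an A/B test splits the audience into two groups and serves each group one version of the creative. Below is a minimal sketch of how that split could be done; the function name `assign_variant` and the audience size are illustrative assumptions, not part of any real campaign setup.

```python
import random

def assign_variant(user_id: str, seed: int = 42) -> str:
    """Deterministically assign a user to variant A or B.

    Seeding a generator with the user ID (rather than flipping a
    coin per impression) keeps each person in the same group for
    the whole campaign, so results stay clean.
    """
    rng = random.Random(f"{seed}:{user_id}")
    return "A" if rng.random() < 0.5 else "B"

# Example: split a hypothetical audience and check the balance.
audience = [f"user_{i}" for i in range(10_000)]
groups = [assign_variant(u) for u in audience]
share_a = groups.count("A") / len(groups)
print(f"Share assigned to A: {share_a:.1%}")  # close to 50%
```

The deterministic assignment is the key design choice: a user who sees the image creative on day 1 still sees it on day 30.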
The following is an example of an A/B test that was run for a client. Details have been changed to preserve anonymity and confidentiality.
Testing image vs text creative
Step 1: When performing an A/B test for creative, first make sure the campaign runs long enough for the averages to stabilize. For this example, we ran the campaign for 30 days.
Step 2: Narrow down what you will be testing. For this particular test, we compared image vs text creative. Other variables to test include color, copy, imagery, image style, and more.
Step 3: Control all other variables. For this example, we kept the audience, spend, timing, and placements identical for both creatives, so we could confidently attribute any difference in results to the creative itself.
Step 4: Run the test and check in weekly. We saw results within one week, and they continued to build throughout the month.
Perhaps the most interesting finding from this test was how inconsistent the results were across campaigns. In some campaigns, the image-based creative outperformed the text creative by as much as 84%. In others, the two types of creative performed roughly equally, splitting results about 50/50. And in still others, text creative outperformed image-based creative by around 34%.
This type of inconsistency is why A/B testing is so important in every type of campaign.