Campaign Management & Analytics Best Practices

Optimize Your Retail Campaign Testing

Tips on Developing Best Practices to Avoid Pitfalls and Enhance Your Overall Strategy

This document shares our best practices for campaign testing. SDL works with a large number of clients to ensure that their email campaigns are successful. As part of these partnerships, we provide guidance on how to properly conduct tests prior to launching any email campaign. Our client Newsmax sends over 4 billion emails annually and has a consistent delivery rate of 96%. This is not just blind luck: a high degree of planning and knowledge goes into testing email campaigns, which is why Newsmax regularly hits this high rate of deliverability.

In this document we discuss which areas are most important for campaign managers to test, the best approaches to managing the testing process, and specific testing best practices for retailers. We also highlight some of the pitfalls that can undermine a testing program. The major focus is on email, purely because it enables such rapid testing results, but the fundamental principles and practices also apply to other channels.

Developing the right structure for your tests

Organize for success. If you are properly prepared for testing, you will be better positioned to deliver results faster. Being organized means getting a clear plan in place, but also having the right tools to enable you both to run tests and to evaluate the results correctly. In larger organizations where several people are involved in running campaigns, you need some coordination of activities. If you don't, then either the key findings aren't shared accurately or one activity could affect another test. Having campaign management and analytics software in place definitely helps, both for running your tests and for reporting the findings.

Decide what is important to test. It is imperative to agree on exactly what areas you want to test.
Traditional direct marketing had a clear hierarchy of areas to test, with people typically focusing on the product/service, the target audience, the offer and the timing before moving on to format and creative tests. However, with the emergence of digital channels, there is a chance to re-evaluate which areas are most important for you to test. The emphasis will differ from business to business, based on which channels are most important to you, which segments work, how mature you are as a business, and whether you are experiencing a significant fall-off in parts of your business.

The Titanic Principle definitely applies when deciding on your test: there isn't any point being on deck re-arranging the deckchairs if your real challenge is an iceberg.

Contributors: Roger Luxton, Industry Marketing Director; Molly Boyer, Product Marketing Director.

Measure and report. Equally as important as knowing what to test is deciding what you want to measure, since this will drive both the design and the assessment of the test. For instance, in email, if you are testing four different subject lines, one might be more successful at gaining opens, another at generating click-through, another at producing purchases within the first 2 days, while another might result in more sales over a 7-day period. Determining the most important metrics up front will help to avoid any post-rationalization of results. In faster-paced channels, such as email, SMS and the web, the speed at which you can view your results is also clearly important. There is little point in running a test on the best way to display your current sale products if it takes weeks, rather than hours or days, to obtain the results. You will end up learning what worked, but far too late to use the learning in your next activities.

Share your results. Developing a mechanism for sharing your measurement and reporting is also essential. This should include providing easy access to report dashboards so colleagues can quickly see test results themselves. However, we believe it should also include a more formal, regular discussion in marketing team meetings. This enables the latest test results to be discussed, considered and used in other parts of the business.

Review what and why you are testing. Once you've started to structure your testing program and built up knowledge of what works best, it is time to put your findings into action. Organizations who deliver results are constantly reviewing and repeating their testing. This practice ensures that they stay current with the things that matter in driving customer conversion.
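To make the measurement point concrete, the funnel metrics for a single email test cell can be derived from raw campaign counts. This is a minimal sketch with invented numbers and an illustrative function name, not SDL product functionality:

```python
def funnel_metrics(delivered, opens, clicks, purchases):
    """Return the open, click-to-open and conversion rates (as percentages)
    for one test cell, given raw counts from the campaign report."""
    return {
        "open_rate": round(100.0 * opens / delivered, 2),
        "click_to_open_rate": round(100.0 * clicks / opens, 2) if opens else 0.0,
        "conversion_rate": round(100.0 * purchases / delivered, 2),
    }

# Illustrative counts for one subject-line variant sent to 10,000 recipients
print(funnel_metrics(10000, 2200, 660, 95))
# → {'open_rate': 22.0, 'click_to_open_rate': 30.0, 'conversion_rate': 0.95}
```

Computing all three rates for every variant makes the trade-off visible: the variant that wins on opens may not win on conversion, which is why the primary metric should be agreed before the test runs.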
Seven Pitfalls to Avoid

Now that you have implemented a solid testing program, it is important to steer clear of some common pitfalls related to email best practices. We've identified seven pitfalls to avoid:

Not testing. Probably the biggest challenge is not starting to test at all. We see too many organizations that aren't doing any testing. So don't wait: make sure that you start your testing. When done properly, it will provide the results you need to make a difference in your campaigns.

Too much complexity. At the other end of the spectrum, don't be overly sophisticated too early. It is easy to get excited and test everything every time. However, if you aren't careful you can easily get bogged down in minute detail and create extra stress and cost, such as developing a lot of different content, when you could achieve bigger wins by focusing on a smaller number of more relevant activities. Be realistic about how much content you can create and what you want to test. If you only have the budget to test a couple of creative approaches, then choose wisely and don't try to test more than you feasibly can.

Mismatch between ambitions and data available. If you have a very small database, designing too many different test cells in your campaign may produce results which are statistically insignificant, and you may not learn as much as you wanted. It is far better to ensure that your response volumes are large enough to show clear, statistically significant results.

Lack of reporting. Your results need to be measurable and shareable; if they aren't, then you aren't getting the maximum benefit. Businesses who don't share their results, or don't have a measurement and reporting system in place, have no way of tracking results. And if you aren't tracking your results, you cannot produce a clear view of the overall effectiveness of your campaigns.

Not challenging previous learnings.
Testing can give clarity on what works and establish champion messages, offers or rules. However, every so often you need to challenge these. In the pre-digital age, some organizations would stick with these champions for years. With faster response, new channels and changing consumer behavior, you are better positioned to challenge some of these champion approaches much more regularly.

Assuming one size fits all. Consumers don't all react the same way; therefore you need to avoid assuming that any test result will apply equally to different segments of your database. The more you can segment your tests, so that you understand which messages or channels work best for particular groups of your customers, the better the learning.

Lack of focus. Marketers are full of ideas on what to test. It's important, however, not to lose focus on what you want to learn from a test; otherwise you run the risk of not really understanding what the test results tell you. If you have an existing creative which works, but you want to improve it, there might be 10 things you could change. However, if you change four of these items in every email you send out, how will you know what worked? It could have been just one of the items you changed that led to an improvement, but without being able to isolate the four elements you don't learn as much. Structure your tests to learn more than one thing at a time: for instance, you could split your audience into four groups and test one item for each group. This will speed up the rate at which you learn and keep the focus on what you are trying to accomplish and improve.

Best Practice Tips for Email

Email is a direct marketing person's dream. It enables you to test relatively easily and obtain rapid results. It can also be a less expensive route to gaining learning. The first challenge is to focus on what is important to test. With email there are several key stages involved, and to maximize success you need to decide what end results you wish to see, or the goal you are looking to achieve. Do you wish to improve your open rate? Do you wish to improve your click-through rate? Do you wish to increase conversions? By focusing on the goal, rather than the item you initially wish to test, you are more likely to improve performance. After all, this is ultimately all about driving business, rather than finding nice-to-have facts, so let's consider each of these goals and understand what to test.

Improving your open rate

If your goal is to increase the open rate, then you need to influence the factors that are relevant to getting that email opened, particularly the subject line, sender address, time of day and day of the week.

Subject Line. Choosing and testing the right subject line is fundamental for email success. This is the first item that a reader will see in their inbox, so the words you choose will be very important.
From our experience, there are a number of things that are worth testing:

Testing various keywords. It is often good practice to look at your Google AdWords terms, take some of these important words and use them in your subject line.

Tone of the message. Does a service-focused message work better than a pure sales message, or is it best to focus on added-value items? The subject line clearly needs to fit the content of the email to avoid disappointment. We've seen a number of retailers having great success recently with more service- or information-focused messages.

Personalized subject lines. Do these work for your audience, or do your customers think they are a bit spammy?

Subject line length. There are a lot of theories as to whether pithy subject lines or lengthier messages work best. If you are not sure which is best for your open rates, then you need to test with your audience.

Time of day. Most retailers are trying to ensure that their emails appear at the top of people's inboxes when they check their messages. Therefore, it is best to test sending at different times of the day. For instance, do you get more success sending in the morning, so that people open your email when they check messages at lunchtime, or are you better off sending in the afternoon, so they see it when they view emails in the evening?

Day you send your emails. You also need to test which are the best days to send out emails. For instance, are retailers more concerned about driving people to stores just before the weekend or earlier in the week? If you have an upcoming sale, are you better off sending teaser messages before the sale starts, or focusing on regular contact during the sale period itself?

Email address/friendly from. Does your email address just need to be your brand name, or can you test a friendlier sender address which puts the email in context? For instance, does it come from the retail head office, or does it appear to come from the local store?
And for those retailers with strong loyalty programs, are customers more inclined to open an email from a salesperson they are used to doing business with?

Increasing Click Rates

Once you've tested the best approaches for getting your emails opened, you need to ensure readers click through from the email.

Content. The biggest factor within the email is the content and how it is designed. Each retailer has different customers and requirements and therefore will want to test different elements of the content, but the main areas we see tested are the following:

Sentiment. Does a more informational email generate a higher click-through rate than a sale-focused one? Do you find that one works better than another for particular segments within your customer base?

Long vs. short emails. Is your audience interested in succinct emails, or do they prefer reading or seeing more content?
Single product featured vs. 2 to 3 products vs. lots of products. What generates more clicks for you: highlighting one single product, or showcasing as many different products as possible in your email?

Balance between image and text. Is your audience more visual, or do they like to read through content? What seems to be the right balance between your images and copy?

Where you feature links. It is imperative to know which links get clicked. Do clicks come just from your images, or do you find that copy links also work well? We've recently done some work with a number of retailers who've boosted click rates by increasing the links highlighted in their copy.

One hero image vs. several images. Where do you feature images? Does a large hero image perform better than a handful of smaller images?

Product vs. product in situ vs. people enjoying the product. Are your customers product-focused? Do they prefer clean, clear images of the product, or do they like to see it being used or displayed? For instance, if you are selling a floor lamp, does an image of the lamp alone work well, or is it more effective when displayed in a living-room scene? Or do you get more success by showing someone interacting with the light?

Position of call to action. Have you tested the best place to put your main calls to action? Are you putting them in the top left of the email, or are there other places that work better for you?

Personalization. Does your audience like to see personalization within the email, so that they feel they are receiving a very personal message, or do they click through regardless? What appears to be the right balance for different parts of your customer base?

Head office or local store. Do your customers react better to a message that is localized or to one from a more corporate account?
If customers have an affinity with their local stores, then you may get higher click-through rates if the message comes from a local store manager, with images of the store and other specific local content such as parking and store opening hours.

Designing for mobile. According to a July 2012 study by Litmus, mobile has overtaken webmail and desktop email opens: 36% of all emails are opened on a mobile device, and email opens on mobile devices have increased 80% since January 2012. With this shift in consumer interaction, it is imperative that you design your content to be viewed on a mobile device. If your content is not designed for mobile, you have lost critical customer touch points.

Increasing Conversion

Converting shoppers into customers is critical in retail, so you can't afford to lose any opportunities to connect with them. Your content must be relevant and valuable, as this will greatly impact the likelihood of it being clicked, and ultimately the likelihood of a customer continuing on their path to purchase. However, there are a couple of additional areas which should also be considered for testing.

More vs. less information up front. If you have more complex products or particular terms and conditions which might impact conversion, then for particular segments of the customer base, are these better featured up front, or shown only once a customer has clicked through from the email? The chart below shows how testing various segments of your customer base can offer you greater insight into click-through rates.

Landing pages vs. straight to ecommerce site. For many retailers you want to drive customers straight from clicking on your email into your ecommerce site. However, are you better off pushing customers via a landing page first, which can be branded to reflect the campaign messaging, rather than a generic page?

Testing in Other Channels

Similar principles apply when testing your messages in the other media available to you.
However, the slower speed of results and the cost and complexity of running the test can make these channels more challenging.
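Before investing in a slower or more expensive channel test, it is worth checking that your planned cell sizes can show a statistically significant difference at all (the "mismatch between ambitions and data available" pitfall above). A minimal sketch of a two-proportion z-test, with invented cell sizes and response counts:

```python
import math

def two_proportion_z_test(resp_a, n_a, resp_b, n_b):
    """Return (z, two-sided p-value) comparing the response rates of two test cells."""
    p_a, p_b = resp_a / n_a, resp_b / n_b
    pooled = (resp_a + resp_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 500 recipients per cell: 40 vs. 48 responders looks like a 20% uplift,
# but with cells this small the difference is not significant at the 5% level.
z, p = two_proportion_z_test(40, 500, 48, 500)
print(round(z, 2), p > 0.05)
```

The same counts spread across ten cells of 500 would be even less conclusive, which is why a small database is usually better served by fewer, larger test cells.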
SMS. SMS, or Short Message Service, provides you with the ability to test in a similar way to email, but you need to be more cautious with your messaging. It is a much more intrusive medium, so we caution you to be careful about what you test and how frequently you use the channel. You can certainly test different messages, but most consumers tend to view SMS as a service-focused channel rather than a heavy sales channel.

Direct mail. Traditional direct mail can also still be tested extensively by retailers, either for sales-focused mailings or for more substantial catalogue offerings. The production and distribution costs tend to make this a more expensive medium to test in, however. We see many retailers using email to understand the tone and content that appeal to particular segments, and then rolling this out through direct mail. For certain retailers, there is a drive to keep catalogues smaller. We are, therefore, seeing quite a few retailers testing main catalogues vs. smaller catalogues and adjusting the frequency with which catalogues are distributed.

Above-the-line testing. It is still possible to test on channels such as TV, radio, press and inserts using these methodologies. The ability to perform A/B splits through some of these channels can also provide good insight.

Social media. There are also opportunities to test social media impact. For instance, if you launch a new store, the social media channels or messages you use to announce it could differ for each store. Monitoring the social media conversations resulting from these announcements, as well as the sales activity, could give you some useful information.

Other Aspects to Test

As well as testing across channels, there are other things that a successful retailer should be looking to test.

Volume of activity. Every retailer is searching for the best frequency of communication for their individual customers.
The temptation, particularly in email, is that once you have a customer who interacts with your emails, you increase the frequency of the emails that you send. The downside is that you may well reach a point where the overall impact of each email is lessened, and you either experience lower levels of engagement or, worse, see more unsubscribes. It is good practice to test frequency of send across all channels to make sure that you are getting the balance right: a profitable path for you and a pleasant experience for your customer.

Channel combinations. Your customers are likely to have a preference as to which channel they prefer, but that doesn't mean you should use that channel exclusively. Smart testers will look to assess the impact of using more than one channel to see if it boosts overall performance. For instance, do teaser emails before a catalog mailing work well and enhance response? Do emails around regional press promotions help to boost sales in the local stores? Can an occasional direct mail store invitation drive local store traffic, even for customers who tend to buy mostly online?

Direct vs. stores. Most retailers operate both online and via stores. The balance and focus between these are certainly elements to test, and could generate a real sales uplift. For instance, do you have distinct customers who only use one of these channels, or do you have a broader crossover from one to the other? Are there opportunities to encourage migration from direct to store, perhaps via preview evenings, special events, or priority access to a sale? Can you drive more success by appearing more local, with contact coming from the local store?

Measurement and Reporting

Testing is only as effective as your measurement. If you fail to have systems and processes in place, then you will never learn all you can from your tests.

Clarity on your measures. At the heart of your testing should be an understanding of the types of data you are looking to measure.
For instance, if you are trying to maximize email opens, then this is your key metric; whereas if you are looking to maximize click-through or sales, then these are the measurements you should focus on. When designing your test, the objective should clearly state your measures. It won't do you any good to test for testing's sake if you are not clear on what you hope to learn.

Test and control groups. As part of your campaign, you will need to set up and track your control groups. For an individual campaign this is usually straightforward, particularly if you have campaign management software in place. However, if you have a number of campaigns running, and are running some longer-term tests, then you will also need to ensure that you have the appropriate exclusion rules built into all your relevant campaigns. Again, with the right campaign management software in place this is very easy to do.

Campaign tracking. When your test runs, you will want to track and report on it with appropriate frequency. The primary period over which you will want to track a campaign will differ in part by channel and by offer. For instance, an email campaign will
provide responses back within just a few days, while a direct mail catalog will have a much longer shelf life.

- Fix dates in the diary when you will produce an interim and a final report on a campaign.
- For higher-profile campaigns, book a meeting to share results.
- Keep the top-line results easy to understand: what worked best?
- Include more information on differences by segment as well.
- Clearly summarize the top 3 learnings from the test.
- Clearly show how this knowledge should be used in the future.
- Store the results so that they can be referred back to again.
- Make a note in your calendar if you want to evaluate how different segments performed over time.

Longer-term tracking. Along with understanding a campaign's overall performance, you need to evaluate results over a longer period of time. Many retailers have welcome programs designed to bring customers on board and to encourage them to make those important 2nd and 3rd purchases. This might involve a number of communications over a period of time. While you might want to test individual campaign elements, often retailers want to evaluate the overall impact of investing in such a program. To report on this, you need to set up a ringfenced control group: those who are not included in your new welcome program. After your other new customers have gone through the welcome program, you will be able to compare how the welcome customers performed against your un-contacted group. At this stage, you would be comparing how many in each group bought again during the period. However, you'd also want to report on this group again, to see performance after perhaps 3, 6 and 12 months.

Good luck and get started

We certainly hope that by reading this article you are inspired to implement best practices in testing or begin your first testing program.
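As a simple first step, the welcome-program comparison described under longer-term tracking above can be sketched as follows. The purchase data and the function name are invented for illustration:

```python
def repeat_purchase_rate(purchase_counts):
    """Percentage of customers in a group who made a 2nd (or later) purchase."""
    repeaters = sum(1 for n in purchase_counts if n >= 2)
    return round(100.0 * repeaters / len(purchase_counts), 1)

# Purchases per customer over the tracking window (invented data)
welcome_group = [1, 2, 3, 1, 2, 2, 1, 3, 2, 1]   # went through the welcome program
control_group = [1, 1, 2, 1, 1, 1, 2, 1, 1, 1]   # ringfenced, not contacted

uplift = repeat_purchase_rate(welcome_group) - repeat_purchase_rate(control_group)
print(repeat_purchase_rate(welcome_group), repeat_purchase_rate(control_group), uplift)
# → 60.0 20.0 40.0
```

Re-running the same comparison at 3, 6 and 12 months shows whether the welcome program's uplift persists or fades over time.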
The sooner you define and implement your testing program, the sooner you will be positioned to send meaningful communications to your customers and future customers through the channels they prefer. Just remember to follow our tips for success, and avoid those pitfalls. The more you test, the more you learn, and the more you will sell.

To find out more

If you are ready to learn more about optimizing your retail campaign testing programs, contact SDL for more information. We have a team of retail-specific consultants who can help turn your retail ideas into retail best practice.

http://www.sdl.com/solutions/industry/retail/cma.html

SDL Campaign Management & Analytics/Alterian Inc.
1580 Lincoln Street, Suite 480
Denver, CO 80203 USA
E-mail: info@alterian.com
Website: www.sdl.com

Alterian is now the SDL Campaign Management & Analytics Division. SDL's customer analytics, campaign management and email marketing tools enable marketers to develop and act on customer insights by engaging customers with timely, relevant communications. Enterprise-ready technology provides the engine for planning, managing and executing integrated marketing strategies across online and offline channels. More than 300 customers and 50 partners, including one-third of the top 50 global brands, individualize their customers' experience using SDL Campaign Management & Analytics software. SDL's customers include Cisco, Newsmax, Princess Cruises, Marriott, Best Western and Wilson Sporting Goods.

Copyright 2012 SDL plc. All Rights Reserved. All company, product or service names referenced herein are properties of their respective owners.