The Mobile A/B Testing Starter Kit
Table of Contents

Introduction
Give the Users What They Want
Make Navigation Simple and Easy
How to Make the Most of Promotions
Refine Your Call-to-Action
Shares, Shares, Shares
Conclusion
Introduction

What are your goals for improvement? Do you want to improve engagement rates? Increase social sharing? Grow revenue? Enhance the user experience? Enter mobile A/B testing. With it, you can test any change you can dream of and get in-depth analytics, so you know what works and what doesn't.

The best part is that mobile A/B testing is fast and powerful. You can create new versions of your app in minutes and know what really works for real users in a few weeks. Iteration now takes minutes rather than months!

In this ebook, we'll introduce you to some ideas to get you started with mobile A/B testing, and show how different companies have used A/B testing to get great results for their apps. Let's jump straight in with some real-life examples.
Give Users What They Want

Make it easy for users to get value from your app.

HotelTonight saw a 15% increase in bookings.
THE APP

HotelTonight is an app-only hotel booking service shaking up the hotel industry. The app allows users to easily make same-day hotel bookings, enabling people to take advantage of the serendipity of life. Users search for hotels available within a certain radius and make a booking with minimal effort.

THE PROBLEM

HotelTonight had become very popular, and had become well known for its easy and quick payment system. With an already slim funnel, it seemed there was little they could do to improve the process further. However, they still decided to check that they had done all they could to increase conversions.

THE A/B TEST

HotelTonight decided to test whether removing signups would further increase their bottom line. While most mobile apps require sign-ins so that they can collect user data across mobile and web, many online ecommerce sites do not require signups. They created two variants: the control, where users went through the process of signing up before completing a purchase, and the variant, where users could simply search, select a hotel, enter payment details, and book.
THE RESULTS

The variant far outperformed the control, with bookings increasing by 15%. When HotelTonight removed this additional friction from the process, users were more willing to book through the service rather than turn away at the payment page. The HotelTonight team thinks this is because users' mindsets are those of ecommerce consumers, rather than mobile ones. Many ecommerce sites let users purchase without logging in, instead allowing them to check out as guests. The expectation of not having to sign up likely caused many users to be turned off by the sign-in requirement.

The changes led to a 15% increase in bookings.

HotelTonight proved that common best practices, such as making all users sign in, don't apply in every situation. Apps should run tests to make sure they don't have a leaky funnel.
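The 15% figure is simply the relative lift of the variant's conversion rate over the control's. As a quick illustration of the arithmetic (the booking counts below are hypothetical, not HotelTonight's actual numbers):

```python
def relative_lift(conversions_control, n_control, conversions_variant, n_variant):
    """Relative change of the variant's conversion rate over the control's."""
    rate_control = conversions_control / n_control
    rate_variant = conversions_variant / n_variant
    return (rate_variant - rate_control) / rate_control

# Hypothetical counts: 400 bookings out of 10,000 control users vs.
# 460 bookings out of 10,000 variant users.
lift = relative_lift(400, 10_000, 460, 10_000)
print(f"{lift:.0%}")  # -> 15%
```

Note that raw lift alone says nothing about statistical significance; an A/B testing tool also weighs the sample sizes behind each rate.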
Make Navigation Simple and Easy

Test your user interface. You might be surprised by the results.

30-day retention up by 18.3%
THE APP

[Anonymous] is a social fashion app that allows users to follow style trendsetters and also create trends of their own. Fashionistas can easily snap pictures of their outfits and share them with their followers to start a trend. The app monetizes through display ads, so user engagement is the key to a growing revenue stream.

THE PROBLEM

The app's product team runs many user tests to generate ideas for app improvement. One idea that had come up several times was to make certain actions easier for the user to perform. The original version of the app had a hamburger menu icon that revealed a long list of menu items when tapped. Users found this menu overwhelming.

THE A/B TEST

The product team decided to experiment with a tabbed menu containing the 4 most used actions, in addition to the hamburger menu.

[Screenshots: the navigation before and after]

They thought it would not only make it easier for users to navigate, but would also strongly encourage users to take photos of their own to share. The team hypothesized that if people shared their own ideas more, they would feel more attached and loyal to the app.
THE RESULTS

The A/B test yielded surprising results. Having the camera icon easily accessible had no significant impact on the rate at which people shared their own outfits. However, the added nav bar did increase engagement and retention for the app.

Metrics tracked:
- Clicks on menu icons
- Lost users
- Moderate users
- Heavy users
- Users who still opened the app after 30, 60, and 90 days

The team was able to determine that the new menu helped salvage Lost users into Moderate users, while Moderate users were becoming Heavy users. Many users were opening the app more times in the first month with the new menu.

Additionally, when the team looked at 30, 60, and 90-day retention (for both new and existing users), they found that all three numbers increased: 30-day retention by 18.3%, 60-day retention by 10.3%, and 90-day retention by 9.1%.

While the app team hypothesized that the camera feature was going to increase engagement, it turned out that a more visible nav simply made the app easier to use overall, which increased engagement and retention.
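Retention numbers like these can be computed directly from app-open logs. Here is a minimal sketch, assuming a log keyed by user ID; note that it counts a user as retained if they open the app on or after day N, while some analytics tools instead count opens exactly on day N:

```python
from datetime import date

def n_day_retention(first_opens, opens, n):
    """Share of users who opened the app again at least `n` days after
    their first open. `first_opens` maps user -> first-open date;
    `opens` maps user -> list of all open dates."""
    retained = sum(
        1
        for user, start in first_opens.items()
        if any((d - start).days >= n for d in opens.get(user, []))
    )
    return retained / len(first_opens)

# Hypothetical log: two of three users come back 30+ days later.
first = {"a": date(2023, 1, 1), "b": date(2023, 1, 2), "c": date(2023, 1, 3)}
opens = {
    "a": [date(2023, 1, 1), date(2023, 2, 5)],   # returns after 35 days
    "b": [date(2023, 1, 2), date(2023, 1, 10)],  # returns after 8 days
    "c": [date(2023, 1, 3), date(2023, 2, 20)],  # returns after 48 days
}
print(n_day_retention(first, opens, 30))  # -> 0.666...
```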
How to Make the Most of Promotions

Make users happy right before they take important actions.

A 28.1% increase in restaurant bookings
THE APP

[Anonymous] is a popular restaurant reservation app. Users can look up restaurants in a selected area, see restaurant reviews, and reserve a table straight from the app.

THE PROBLEM

The app team knew the average number of restaurants a user browses before making a reservation. This led to the idea that perhaps users just needed help choosing a restaurant out of all the options available in any given category. The app already had a promotions mechanism that offered deals for reservations at restaurants that advertised through the app. It seemed like a good idea to combine the two concepts and highlight deals that applied to whichever food category and distance the user was searching.

THE A/B TEST

The team changed the frontend of the app so that deals were highlighted in both the list view and the map view. However, the algorithm for sorting the restaurants did not change: restaurants were still ranked by a combination of relevance and distance, with the promotion not affecting a restaurant's position in the search results. The tested variants only changed whether the promoted restaurant was highlighted or not.
THE RESULTS

The test turned out to have negative results. The highlighted-deals variant increased the rate at which the promoted restaurants were viewed, but actually decreased the number of reservations made. It also lowered the return rate of users. Highlighting deals was actually turning users away. Users felt manipulated by the blatant promotion of sponsored items. It lowered users' trust in the editorial integrity of the app, even though the sorting algorithm didn't change at all.

ONE MORE TRY

Rather than calling it the end of the experiment, the app team decided to try one more thing. The team tested a surprise-and-delight tactic, where users were told a restaurant had a promotion running only after they were already reading the reviews for that restaurant. While this variant didn't have an effect on the number of restaurants viewed, it increased the booking rate on promoted restaurants by 28.1%. Since these restaurant promotions account for a significant portion of the app's revenue, this test dramatically increased the profit the app was earning.
Refine Your Call-to-Action

Your CTA is critical for conversions. Make sure you have the right one.

Clever Lotto saw a 35% increase in conversions.
THE APP

Clever Lotto is the leading German lottery app, allowing users to manage their entries easily. Through the app, users can enter multiple lotteries and track the winning numbers without having to travel to a physical store. If users do purchase physical tickets, the camera function helps them easily convert those tickets to digital form within the app.

THE PROBLEM

After a specific lottery drawing is over, Clever Lotto lets users play the game again in the next drawing with a simple click of a button. The product team wanted to test adjustments to that button to find ways to increase the rate at which users entered a subsequent drawing. The baseline variant used a green "Jetzt online spielen" ("Play online now") button as the call to action.

THE A/B TEST

Clever Lotto's growth lead Andreas Fuchs created a multivariate A/B test to determine the best color and copy for this call to action using the Visual Apptimizer. The variants took him only five minutes to create.
THE RESULTS

The team predicted that the red variants would work better, since red tends to feel more urgent. However, the results showed that both variants with the copy "NOCHMAL SPIELEN!" ("PLAY AGAIN!") outperformed "Jetzt nochmal spielen" ("Play again now"). Within each copy set, the green outperformed the red.

35% increase in conversions. 5-minute setup.

When we break down the multivariate test to separate the impact of color and copy, we find that green has a 72.40% chance of causing a 5.95% lift over red, while "NOCHMAL SPIELEN!" has a 99.85% chance of causing a 25.38% lift over "Jetzt nochmal spielen". This means that color did not have a statistically significant impact, while the copy change did. Combined, the green "NOCHMAL SPIELEN!" led to 42.31% more clicks on that call to action than a red "Jetzt nochmal spielen". Compared to the baseline (the green "Jetzt online spielen", or "Play online now"), the green "NOCHMAL SPIELEN!" improved clicks by 35.39%. Clever Lotto expects this small copy change to improve the total number of lotteries played in a year by 28%.
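Statements like "a 99.85% chance of causing a lift" come from comparing the distributions of the two conversion rates. One common approach is Bayesian: model each rate with a Beta posterior and estimate the probability that one variant beats the other by sampling. A minimal sketch with hypothetical counts (this illustrates the general idea, not necessarily Apptimize's exact statistics engine):

```python
import random

def prob_b_beats_a(clicks_a, views_a, clicks_b, views_b,
                   draws=100_000, seed=0):
    """Estimate P(rate_B > rate_A) under independent Beta(1, 1) priors
    by Monte Carlo sampling from each posterior."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        rate_a = rng.betavariate(1 + clicks_a, 1 + views_a - clicks_a)
        rate_b = rng.betavariate(1 + clicks_b, 1 + views_b - clicks_b)
        if rate_b > rate_a:
            wins += 1
    return wins / draws

# Hypothetical counts: 620 clicks in 10,000 views for the old copy vs.
# 780 clicks in 10,000 views for the new copy (~25% relative lift).
p = prob_b_beats_a(620, 10_000, 780, 10_000)
print(f"Chance the new copy beats the old: {p:.1%}")
```

With counts this large and a gap this wide, the probability comes out near 100%, which is what lets a tool call the copy change statistically significant while leaving the smaller color effect inconclusive.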
Shares, Shares, Shares

Seek creative ways to get users to share and review your app.

SafeTrek saw a 106% increase in clicks. The results led to more reviews in one month than the app had received in the seven months prior.
THE APP

SafeTrek is an app, available on both iOS and Android, that empowers users with proactive mobile protection in times of emergency. In a dangerous situation, the user opens the app and holds down a button until the danger dissipates, then types in a PIN to disarm the app. If the user lets go of the button and the PIN is not entered within ten seconds, SafeTrek calls the police.

THE PROBLEM

The original app used the standard sharing icon, which opened a popup where the user could tap SHARE or RATE to either share the app or submit a review to the app store. SafeTrek had received several glowingly favorable reviews before implementing their test, but the creators believed they could dramatically increase sharing and rating of the app with a few simple changes. Their idea was to change the standard sharing icon to something more unique and representative of the idea that you are giving a friend access to life by sharing this app.

THE A/B TEST

The team decided to test the traditional sharing icon against two others that they created: a solid heart and a heart with a plus sign inside. In addition to tracking click rates on each icon, they also tracked further down the funnel: clicks on SHARE or RATE in the modal, and the selection of a sharing mechanism.
THE RESULTS

Both heart variants outperformed the baseline, but the heart with the plus in the middle increased clicks on the icon, share, and rate buttons the most. With the heart+ variant, the rate button received 106% more clicks and the share button received 86% more clicks. The heart+ icon itself received 128% more clicks on iOS and 97% more clicks on Android.

This change led to more reviews in the iTunes App Store in one month (May) than the app had received in the seven months since launching (October through April).

The SafeTrek team had worried that, because users are unfamiliar with the heart icons, the change might leave a negative impression even as it drove more clicks. However, the data proved that concern wrong: throughout the experiment, the conversion rate between clicking the icon and clicking the SHARE or RATE buttons remained steady, while the ratings in the store continued to be extremely positive.
CONCLUSION

Now that you've gotten a better picture of what can be done with mobile A/B testing, it's time to start making changes to your own app. It only takes 5 minutes to install the SDK with our no-coding installation. Get started today by signing up for Apptimize!