Spy on your competitors’ A/B tests on Android
With our new A/B testing feature, AppTweak is giving you more insights than ever! Have you ever wondered which elements your competitors were testing in their metadata? Now you can directly see which apps are running A/B tests, on which metadata, and which version they decided to keep. Read the blog below and start fine-tuning your strategy.
What is A/B Testing?
A/B testing is a process that lets you test two versions of the same variable to see which one performs better. It is very often used in marketing to test web pages, ad creatives, emails, newsletters, and more.
Apps can also run A/B tests on each of their 7 metadata elements (subtitle, short description, long description, feature graphic, icon, screenshots, and videos).
The important difference between an A/B test and an update is that an update goes live without a period of testing. Uploading a new version directly can be less expensive, but you risk selecting a version that appeals to your customers less than an alternative would have.
The importance of A/B testing is easy to understand, but it is harder to know how many images you should test, how many days a test should run, and so on. This is why our new feature in the Timeline section will help you understand the A/B testing strategies of your competitors and develop your own.
Our A/B testing algorithm
Now that we have cleared up some definitions, we can dive deeper into how this feature works.
In the Timeline section of the tool, you may have seen differently coloured dots: green for updates, orange for A/B tests, and green and orange together for an A/B test on one metadata element combined with an update on another.
What we consider an update
Our algorithm checks whether a metadata element (icon, description, title, etc.) has been changed on a given day, then looks for another change to the same element during the following two days. For instance, if an icon is changed once and then not changed again over the next two days, we consider it an update.
What we consider an A/B test
On the other hand, if the algorithm sees another change within the next two days, we consider that the app developers are A/B testing that metadata element. In that case, the dots will be orange.
Of course, some days an app will run an A/B test on one metadata element and an update on another. This is why you will sometimes find two coloured dots in the Timeline.
To get this data, we fetch each app's metadata every day and store it in our database. A/B tests are identified using this algorithm, so a test that behaves differently from the pattern we analyze may not be detected.
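The detection rule described above can be sketched in a few lines of Python. This is a simplified illustration based only on the description in this blog, not AppTweak's actual implementation; the function name and data shape are assumptions.

```python
from datetime import date, timedelta

def classify_changes(change_dates):
    """Label each observed change of one metadata element (e.g. the icon).

    change_dates: sorted list of dates on which that element changed.
    A change counts as part of an A/B test if another change to the same
    element happened within two days of it; otherwise it is a plain update.
    """
    window = timedelta(days=2)
    labels = []
    for d in change_dates:
        # Is there any other change to this element within the 2-day window?
        near = any(other != d and abs(other - d) <= window for other in change_dates)
        labels.append("A/B test" if near else "update")
    return labels

# Icon changed on Dec 1, again on Dec 2 (a test), and once on Dec 10 (an update)
changes = [date(2020, 12, 1), date(2020, 12, 2), date(2020, 12, 10)]
print(classify_changes(changes))  # ['A/B test', 'A/B test', 'update']
```

As the blog notes, a real test that changes metadata on a different cadence would slip past a rule like this, which is why some A/B tests may go undetected.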
If all these coloured dots feel overwhelming, or if you want more detail on what an A/B test looked like, click on the dot of your choice and all the details will be shown in a table below.
We looked at the icon changes for the app Gardenscapes: you can see that between the 18th of November and the 8th of December, the app went through two A/B testing phases. They kept the version they selected on the 8th of December.
Understand A/B testing strategies with AppTweak
Discover what and how often competitors are A/B testing
With our colour code, you can directly identify in our Timeline whether an app has been A/B testing or not. When you hover over a dot, you can see which element they are A/B testing. When you click on a dot, you can even see which versions they are testing.
Here, we took 8 gaming apps in the US (Gardenscapes, Homescapes, Wildscapes, Lily’s Garden, Royal Garden Tales - Match 3 Puzzle Decoration, Manor Cafe, Matchington Mansion, Butterfly Garden Mystery: Scapes Match 3 Story), selected all the metadata changes, and looked at the changes on the 16th of December. By clicking on a dot you get the details for that date (here we clicked on one of Matchington Mansion’s dots on the 16th of December).
If you follow a group of competitors, some will likely A/B test a given metadata element more than others. As a result, within this group, you can learn from each of your competitors' separate A/B tests.
Here, for example, we selected a peer group of traveling and booking apps (Expedia, Trip.com, Priceline, Booking.com, Hotels.com, Agoda, KAYAK and Trivago). They are all competing in the same category but each is A/B testing different metadata right before the summer.
During this period, Expedia is testing its feature graphic to see whether a quick overview of their offer performs better than an open-air photo.
Trip.com and Agoda are working on their descriptions: Trip.com changed its entire text structure, while Agoda is testing whether including “newly listed Home discounts” makes a difference.
Booking.com, Hotels.com, and Trivago are testing their screenshots: Booking.com and Trivago are testing the background picture, while Hotels.com is testing which keyword performs better, “Secret Price” or “Instant Savings”.
Seasonal A/B Testing
You will notice that some apps try new versions of their metadata during very popular seasons, in order to follow the trend and get more downloads. You can check what your competitors A/B tested and draw inspiration from the versions they decided to keep at the end of the test. This can give you valuable insights into best practices and into how you should invest in your own A/B testing strategy.
We looked at the feature graphic changes for the app Gardenscapes in the US during the Christmas season. From this example, we notice that Gardenscapes tested a Christmas feature graphic at the beginning of the month. However, they quickly changed it back, replacing it with a graphic unrelated to the Christmas theme. The hypothesis is that with the new picture, the principle of the game was not clear enough and the brand not recognizable enough to drive downloads. As a result, from the 22nd of December they stayed with a feature graphic that showcases the game better, without including Christmas elements.
Learn specific information about your app’s category
Sometimes you can learn not only about the frequency of A/B tests but also about their final results. Let’s say your app is in the Food &amp; Drink category and you are not sure whether to lead with screenshots of delicious food or with your app’s features and specificities. To answer that question, have a look at your competitors and see if they have already tried to answer it.
Here, we took the example of DoorDash, a food delivery company in the US. As you can observe, they started by testing whether two screenshots of food would work better than a screenshot showcasing the app’s functionality followed by a screenshot of food.
At the end of this experiment, we noticed that the second option was kept, and they moved on to deciding which type of food to show first. This started another A/B test that you can also learn from:
For one month, they alternated between three different food options (Asian food, tacos, and burgers) and selected the burger screenshot for the second position in their screenshot set:
Here is the final option they decided to keep: a screenshot of their app’s functionality followed by a screenshot of a pile of burgers.
Start fine-tuning your A/B testing strategy!
To summarize this blog on A/B testing, we want you to keep in mind that with this new addition to the tool, you will be able to:
- Understand the overall situation: if your competitors are A/B testing or not
- Know how often competitors run A/B tests and how long each test lasts
- Scrutinize the strategy of each competitor: which metadata elements are a priority
- See if there are seasonal A/B tests
- Learn the best practices from the leaders of your category
Sign up now and discover our other features!