Example case: Analytics for a mobile game - Ancient Blocks (Part 1)

Over the next few weeks we are going to be publishing some example analytics cases, showing where Calq has been used to provide the necessary insight.

This week's example discusses some key metrics for a mobile game. Our example game, "Ancient Blocks", is actually available on the App Store if you want to see the game in full. This example is meant to be a starting point; it is not an exhaustive list of everything a mobile game should measure.

Common KPIs

The high-level key performance indicators (KPIs) are typically similar across all mobile games, regardless of genre. Most developers will have KPIs that include:

  • D1, D7, D30 retention - the percentage of players who come back 1, 7 and 30 days after install.
  • DAU, WAU, MAU - daily, weekly and monthly active users, a measurement of the active playerbase.
  • User LTVs - the lifetime value of a player, typically measured across cohorts such as gender, location, acquiring ad campaign, etc.
  • DARPU - daily average revenue per user, i.e. the amount of revenue generated per active player per day.
  • ARPPU - average revenue per paying user, a related measurement to LTV but it only counts the subset of users that are actually paying.
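
To make the revenue definitions concrete, here is a minimal sketch (in Python) of how DARPU and ARPPU could be computed from one day of raw revenue events. The event shape and figures are purely illustrative, not Calq's internal format:

# One day of illustrative events: (user_id, revenue). Sessions with no
# purchase are recorded with 0 revenue.
events = [
    ("alice", 0.0), ("bob", 4.99), ("carol", 0.0),
    ("bob", 1.99), ("dave", 0.99), ("alice", 0.0),
]

active_users = {user for user, _ in events}
paying_users = {user for user, revenue in events if revenue > 0}
total_revenue = sum(revenue for _, revenue in events)

darpu = total_revenue / len(active_users)  # revenue per active user
arppu = total_revenue / len(paying_users)  # revenue per paying user only

print(f"DAU: {len(active_users)}, DARPU: ${darpu:.2f}, ARPPU: ${arppu:.2f}")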

There will also be a selection of game specific KPIs. These will give insight on isolated parts of the game so that they can be improved. The ultimate goal is improving the high-level KPIs by improving as many game areas as possible.

Retention

As mentioned in our previous article on retention, player retention is a critical indicator. Arguably it's even more important to measure retention than revenue: if you have great retention but poor player lifetime value (LTV), you can normally refine and improve the latter. The opposite is not true - it's much harder to monetise an application with low retention rates.

Calq provides a retention grid report to easily visualise retention within the example game - Ancient Blocks.

When the game is iterated upon (either by adding/removing features, or adjusting existing ones) the retention can be checked to see if the changes had a positive impact.
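
The underlying calculation is worth understanding: Dn retention is simply the share of an install cohort that is seen again n days later. Below is a minimal sketch with invented install and session data (Calq's retention grid does this for you across every cohort):

from datetime import date

# Illustrative data: install date per user, and the days each user played.
installs = {
    "alice": date(2014, 6, 1),
    "bob": date(2014, 6, 1),
    "carol": date(2014, 6, 1),
}
sessions = {
    "alice": {date(2014, 6, 2), date(2014, 6, 8)},
    "bob": {date(2014, 6, 2)},
    "carol": set(),
}

def dn_retention(n):
    """Fraction of the cohort that played again exactly n days after install."""
    returned = [u for u in installs
                if any((d - installs[u]).days == n for d in sessions[u])]
    return len(returned) / len(installs)

print(f"D1: {dn_retention(1):.0%}, D7: {dn_retention(7):.0%}")
# D1: 67%, D7: 33%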

Active user base

The DAU/WAU/MAU measurements are industry standard measurements showing the size of your active user base. From here it's easy to spot if your audience is growing, shrinking, or flat.

Active user measurements need to be analysed with the additional context of the retention report. Your userbase will look flat if you are gaining lots of new users but losing existing users (churn) at the same rate. If that is the case, time may be better spent keeping existing users rather than investing in acquiring new ones.
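
One way to see churn hiding behind a flat DAU figure is to split each day's active users into new and returning players, as in this sketch (invented data):

# Illustrative daily active-user sets. DAU is flat at 4, but the breakdown
# shows heavy churn: much of the audience is being replaced by new installs.
daily_actives = [
    {"u1", "u2", "u3", "u4"},
    {"u1", "u2", "u5", "u6"},
    {"u5", "u6", "u7", "u8"},
]

seen = set()
for day, actives in enumerate(daily_actives, start=1):
    new = actives - seen
    returning = actives & seen
    seen |= actives
    print(f"Day {day}: DAU={len(actives)} new={len(new)} returning={len(returning)}")
# Day 1: DAU=4 new=4 returning=0
# Day 2: DAU=4 new=2 returning=2
# Day 3: DAU=4 new=2 returning=2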

Game specific KPIs

In addition to the common KPIs, each game will have metrics that are specific to the product in question. These could include data on player progression through the game (such as levels completed), game mechanics and balance, viral and sharing loops, etc. Most user journeys (paths of interaction a user can take through your application, such as going from a menu to starting a new game) will also be measured so they can be iterated on and optimised.

For Ancient Blocks, game-specific metrics include the following (a sketch of the matching events appears after the list):

  • Player progression:
    • Which levels are being completed.
    • Whether players are replaying on a harder difficulty.
  • Level difficulty:
    • How many attempts it takes to finish a level.
    • How long players spend within a level.
    • How many power-ups a player uses before completing a level.
  • In-game currency:
    • When does a user spend in-game currency?
    • What do they spend it on?
    • What does a player normally do before making a purchase?
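
Each of these metrics maps onto an event with descriptive properties. The action and property names below are hypothetical (the game's real schema may differ), but they show the shape of the data needed to answer the questions above:

# Hypothetical payloads Ancient Blocks might send; names are illustrative.
level_complete = {
    "action_name": "Level Complete",
    "properties": {
        "Level": 12,
        "Difficulty": "Hard",   # supports the replay-on-harder-difficulty metric
        "Attempts": 3,          # attempts taken to finish this level
        "Duration": 95,         # seconds spent in the level
        "Power Ups Used": 2,
    },
}

spend_currency = {
    "action_name": "Spend Currency",
    "properties": {
        "Amount": 500,
        "Item": "Extra Moves",  # what the currency was spent on
    },
}
# What a player does *before* a purchase falls out of ordering these
# events by user and timestamp at analysis time.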

In-game tutorial

A typical component of most successful mobile games is an interactive tutorial that teaches new players how to play. This is often the first impression a user gets of your game and as a result it needs to be extremely well refined. With a bad tutorial your D1 retention will be poor.

Ancient Blocks has a simple 10 step tutorial that shows the user how to play (by dragging blocks vertically until they are aligned).

Goals

The data collected about the tutorial needs to show any areas which could be improved. Typically these are areas where users are getting stuck, or taking too long.

  • Identify any sticking points within the tutorial (points where users get stuck).
  • Iteratively improve these tutorial steps to raise the conversion rate (the percentage of users that get to the end successfully).

Metrics

In order to improve the tutorial, a set of tutorial-specific metrics should be defined. For Ancient Blocks the key metrics are:

  • The percentages of players that make it through each tutorial step.
  • The percentage of players that actually finish the tutorial.
  • The amount of time spent on each step.
  • The percentage of players that go on to play the level after the tutorial.

Implementation

Tracking tutorial steps is straightforward with Calq. Ancient Blocks uses a single action called Tutorial Step. This action includes a custom attribute called Step to indicate which tutorial step the user is on (0 indicates the first step). We also want to track how long a user spends on each step (in seconds), so we include a second property called Duration.

Action: Tutorial Step

Properties:
  • Step - The current tutorial step (0 for the first step, then 1, 2, 3, etc).
  • Duration - The time (in seconds) the user took to complete the step.

An example action payload, including both properties:

{
    "action_name": "Tutorial Step",
    "properties": {
        "Step": 2,
        "Duration": 3.5
    }
}
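
On the client side, a small helper can time each step and send the action when the step completes. The track function below is a stand-in for whatever Calq client library call the game actually uses (a hypothetical wrapper, not the real SDK API):

import json
import time

def track(action_name, properties):
    """Stand-in for the real Calq client call; here it just prints the payload."""
    print(json.dumps({"action_name": action_name, "properties": properties}))

class TutorialTracker:
    """Times each tutorial step and sends a Tutorial Step action on completion."""

    def __init__(self):
        self.step = 0
        self.started_at = time.monotonic()

    def complete_step(self):
        duration = round(time.monotonic() - self.started_at, 1)
        track("Tutorial Step", {"Step": self.step, "Duration": duration})
        self.step += 1
        self.started_at = time.monotonic()

tracker = TutorialTracker()
tracker.complete_step()  # sends {"Step": 0, "Duration": ...}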

Analysis

Analysing the tutorial data within Calq is very easy. Most of the metrics can be found by creating a simple conversion funnel, with one funnel step for each tutorial stage.

The completed funnel query shows the conversion rate of the entire tutorial on a step by step basis. From here it is very easy to see which steps "lose" the most users.
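
The same step-by-step figures can be sanity-checked by hand: for each step, count the users whose Tutorial Step events reached at least that far. A rough sketch of the calculation (not Calq's funnel engine):

# Illustrative Tutorial Step events: (user_id, step completed).
events = [
    ("alice", 0), ("alice", 1), ("alice", 2),
    ("bob", 0), ("bob", 1),
    ("carol", 0),
]

total_users = len({user for user, _ in events})
for step in sorted({step for _, step in events}):
    reached = len({user for user, s in events if s >= step})
    print(f"Step {step}: {reached}/{total_users} ({reached / total_users:.0%})")
# Step 0: 3/3 (100%)
# Step 1: 2/3 (67%)
# Step 2: 1/3 (33%)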

As you can see from the results, step 4 has a conversion rate of around 97%, compared to 99% for the other steps. This step would be a good candidate to improve. Even though the gap is only a couple of percentage points, that can still mean around $1k in lost revenue on that step alone. Per month! For a popular game the difference would be much larger.
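
To see where a figure like that can come from (the numbers here are purely illustrative): if 100,000 players start the tutorial each month, a step that loses an extra 2% of them drops 2,000 players, and at an average LTV of $0.50 that works out to 2,000 × $0.50 = $1,000 a month. Plug in your own install volume and LTV to size the opportunity for your game.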

Part 2 continues next week, looking at metrics on game balance and player progression.