Example case: Analytics for a mobile game - Ancient Blocks (Part 2)

This week's post continues on from our last blog post (Part 1). As before, the example game "Ancient Blocks" is available on the App Store if you want to see the game in full.

Measuring gameplay

The core driver of any mobile game is always going to be gameplay. It won't matter how great the artwork is if the gameplay isn't awesome.

The specifics of gameplay worth drilling down into will vary from game to game. Our example game, Ancient Blocks, is a level-based puzzle game, and its action model will reflect that.

Game balance metrics

It's important that a game is correctly balanced. If it's too easy then players will get bored. If it's too hard then players will leave in frustration instead.

The initial metrics we want to record in this example are:

  • The percentage of players that finish the first level.
  • The percentage of players that finish the first 5 levels.
  • The percentage of players that quit without finishing a level.
  • The number of times a player replays a level before passing it.
  • The average time spent playing each level.
  • The number of "power ups" that a player uses to pass each level.
  • The number of blocks a player swipes to pass each level.
  • The number of launches (block explosions) that a player triggers to pass each level.

Implementation

As Ancient Blocks is a reasonably simple game we can get a lot of data from just 3 actions:

  • Gameplay.Start - when a player starts a new level.
  • Gameplay.Finish - when a player finishes playing a level (whether or not they managed to pass it).
  • Gameplay.PowerUp - when a player uses one of the special "power up" abilities whilst playing a level.

The properties recorded with each action are:
Gameplay.Start
  • Level - The number of the level being played (e.g. level 7).
  • Difficulty - The current difficulty setting of the level being played.
Gameplay.Finish
  • Level - The number of the level being played (e.g. level 7).
  • Difficulty - The difficulty setting of the level that was just finished.
  • Duration - The duration (in seconds) the player took to finish this level.
  • Success - Whether the player passed the level (true) or was defeated (false).
  • PowerUps - The number of times a special power up ability was used.
  • Blocks - The number of blocks the player moved during this level.
  • Launches - The number of times a player triggered a launch during this level.
Gameplay.PowerUp
  • Id - The numeric id of the power up that was used.
  • Level - The number of the level being played (e.g. level 7).
  • Difficulty - The difficulty setting of the level being played.
  • After - How far into the level (in seconds) the player was when they used the power up.
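
To make the action model concrete, here is a minimal sketch of how these three actions might be sent from the game. The Analytics.track wrapper is a stand-in for whatever call the Calq SDK actually exposes (the real method name is an assumption here, so check the SDK documentation), and the property values are illustrative only.

    import Foundation

    // Minimal sketch: Analytics.track is a hypothetical wrapper, not the
    // real Calq SDK call - forward to the actual SDK inside this method.
    enum Analytics {
        static func track(_ action: String, properties: [String: Any]) {
            print("track \(action): \(properties)")
        }
    }

    // Player starts level 7 on the "Normal" difficulty.
    Analytics.track("Gameplay.Start", properties: [
        "Level": 7,
        "Difficulty": "Normal"
    ])

    // Player uses power up #2, 42 seconds into the level.
    Analytics.track("Gameplay.PowerUp", properties: [
        "Id": 2,
        "Level": 7,
        "Difficulty": "Normal",
        "After": 42
    ])

    // Player passes the level after 95 seconds.
    Analytics.track("Gameplay.Finish", properties: [
        "Level": 7,
        "Difficulty": "Normal",
        "Duration": 95,
        "Success": true,
        "PowerUps": 1,
        "Blocks": 23,
        "Launches": 9
    ])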

Analysis

Using just the 3 simple actions defined above, it is possible to perform a range of in-depth analysis of player behaviour and game balance.

Initial progression

One of the first things to analyse within Calq is the successful progression through the first 5 levels. This is a good indicator of whether the first few levels are well balanced, and whether players really understood the tutorial that showed them how to play.

This is done by creating a conversion funnel in Calq that describes the player's journey through the first 5 levels (or more if we want). The funnel will need 5 steps, one for each level, and the action to be analysed at each step is Gameplay.Finish.

Each step will need two filters: one on the Level property to restrict the step to the correct level, and another on the Success property to only include plays that passed the level.
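
Funnels are built in Calq's web interface rather than in code, but it can help to see the shape of the steps. The sketch below is purely illustrative (the FunnelStep type is not part of any SDK): one step per level, each analysing Gameplay.Finish with the two filters described above.

    // Hypothetical description of the progression funnel's steps.
    struct FunnelStep {
        let action: String
        let filters: [String: AnyHashable]
    }

    // One step per level: Gameplay.Finish where Level == n and Success == true.
    let progressionFunnel: [FunnelStep] = (1...5).map { level in
        FunnelStep(
            action: "Gameplay.Finish",
            filters: ["Level": level, "Success": true]
        )
    }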

There will normally be natural drop off as not all players will want to progress further into the game. However, if certain levels are experiencing a significantly larger drop off than we expect then those levels are good candidates to be rebalanced. It could be that the level is too hard, it could be less enjoyable, or it could even be that the player doesn't understand what they need to do.

Level completion rates

Player progression doesn't always provide the full picture. It's also good to look at how many times each level is being played compared to how many times it is actually passed.

Taking Ancient Blocks' 3rd level as an example: we can query the number of times the level has been played and break it down into successes and failures.

To do this in Calq we can use the Gameplay.Finish action again, applying a filter to only show the 3rd level. By grouping the results on the Success property and displaying them as a pie chart we can quickly see the failure rate for this level.
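
The same breakdown can be spelt out as a calculation over raw Gameplay.Finish events. The sketch below is illustrative only (FinishEvent is a hypothetical type; in practice Calq does this grouping for you).

    struct FinishEvent {
        let level: Int
        let success: Bool
    }

    // Fraction of plays of a given level that ended in success.
    func completionRate(for level: Int, in events: [FinishEvent]) -> Double {
        let plays = events.filter { $0.level == level }
        guard !plays.isEmpty else { return 0 }
        let passes = plays.filter { $0.success }.count
        return Double(passes) / Double(plays.count)
    }

    // e.g. completionRate(for: 3, in: events) gives the share of 3rd level
    // attempts that were passed.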

The designers of Ancient Blocks were targeting a success rate of 75% on the 3rd level. Our results show it's slightly too hard and needs a little tweaking.

Aborted sessions

Another metric which is incredibly useful for measuring early play is the number of people that start a level but don't actually finish it - i.e. they quit. This is especially useful to measure for the first level after the tutorial has finished. If players are just quitting then either they don't like the game, or they are getting frustrated once the tutorial ends.

We can create a short conversion funnel within Calq to measure this, using the Gameplay.Start action, the Tutorial Step action from last week (so we can account for people that dropped off before the tutorial was even finished), and the Gameplay.Finish action.

The results show that 64.9% of players who finished the tutorial (the conversion between the 2nd and 3rd steps) went on to finish the level. This means 35.1% of players quit the game in that gap. This is a metric for the Ancient Blocks designers to iterate on and improve.
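
As a quick sanity check on the arithmetic, the drop-off is simply one minus the step-to-step conversion. The player counts below are hypothetical; only the 64.9% / 35.1% split comes from the funnel results above.

    // Hypothetical counts: 1,000 players completed the tutorial (step 2)
    // and 649 of them went on to finish the level (step 3).
    let finishedTutorial = 1_000.0
    let finishedLevel = 649.0

    let conversion = finishedLevel / finishedTutorial   // 0.649 -> 64.9%
    let dropOff = 1.0 - conversion                      // 0.351 -> 35.1%
    print("Conversion \(conversion * 100)%, drop-off \(dropOff * 100)%")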

Part 3 continues next week, looking at optimising the flow for in-app purchases (IAPs).