Tuesday, July 20, 2010

Mobile App Analytics (or how I stopped guessing and started measuring)

The only things as important as figuring out your app's user journey and the resulting user experience are prioritizing your feature list and deciding where best to allocate your development resources. Unfortunately, it's hard to get this right the first time: the judgements involved are subjective, based on how you think users will use your app.

Listen to your customers

There are many ways to get feedback from app users. Most telling (though least instructive) are your active installs - people seldom uninstall apps they use and like. Market feedback adds more detail ("Earthquake!" has over 1600 comments and 1700 ratings), along with the regular emails sent to the support address published in the Market.

This feedback is invaluable but, as Chris Pruett noted when reviewing feedback for the excellent "Replica Island", user feedback can be unreliable.

Mobile app analytics packages like Google Analytics for Mobile Applications or Flurry let you measure how users actually use your app to help you make objective decisions on where to focus your attention.

I recently added Google Analytics for Mobile Applications to Earthquake

It's a fairly simple process: add a dummy site to my analytics account (from which to obtain a tracking ID), download the Analytics JAR file, drop it into my project's /lib folder, and add it to the build path (full instructions here).

Within my app I simply get an instance of the tracker:

tracker = GoogleAnalyticsTracker.getInstance();

Start tracking:

tracker.start("UA-MY_CODE-XX", this);

And add a page hit for every event I wish to track:

tracker.trackPageView("/my_event");
All the hits are stored in a local SQLite database, so you can batch the updates and dispatch them the next time your app accesses the Internet. I do it every time I update the earthquake feed:

tracker.dispatch();
The page names you're tracking are totally arbitrary - letting you create a new page for every action you want to track.
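Pulling those pieces together, here's a minimal sketch of the pattern (the Activity name, page names, and event points are illustrative, not Earthquake's actual ones):

```java
import android.app.Activity;
import android.os.Bundle;
import com.google.android.apps.analytics.GoogleAnalyticsTracker;

public class EarthquakeActivity extends Activity {
    private GoogleAnalyticsTracker tracker;

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        tracker = GoogleAnalyticsTracker.getInstance();
        tracker.start("UA-MY_CODE-XX", this);
        tracker.trackPageView("/main");          // one "page" per action
    }

    private void onMapTypeToggled() {
        tracker.trackPageView("/map/toggle_type");
    }

    private void onFeedRefreshed() {
        tracker.trackPageView("/refresh");
        tracker.dispatch();                      // batch upload alongside the feed update
    }

    @Override
    protected void onDestroy() {
        tracker.stop();                          // release the tracker with the Activity
        super.onDestroy();
    }
}
```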

This is relevant to my interests

There are three general categories of data I can analyze:
  • User demographics. I can track the geographic locations (and language settings) of my users and the speed of their connections. I can also track their screen resolutions and if they're viewing the app in landscape or portrait modes.
  • App usage patterns. The real value comes from finding out how people used the app. What options did they enable? Which Activities do they spend most time on? Which menu options were selected? Did anyone long press anything? Did they add the widget? In short: How does their usage confirm or contradict the assumptions I made in the design?
  • Exception tracking. I also tracked every caught exception. Now I can find out which unexpected edge cases are occurring regularly, and try to figure out why.
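As an illustration of the exception-tracking approach (parseEarthquakeFeed and the "/error/..." page names are hypothetical stand-ins for the app's real parsing code):

```java
try {
    parseEarthquakeFeed(feedStream);
} catch (Exception e) {
    // Surface each failure mode as its own "page" so it appears as a
    // separate row in the Analytics content report.
    tracker.trackPageView("/error/parse/" + e.getClass().getSimpleName());
}
```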
My Findings
  • My users are based predominantly in coastal areas near fault-lines. Los Angeles and San Francisco account for nearly 25% of my active user-base (with LA making up nearly half of that).
  • English accounts for 85% of my user-base, followed by German, Japanese, and Spanish.
  • Less than 2% of my users have small screens.
  • The Map View is only marginally more popular than the List View (52% vs 48%).
  • Less than 10% of my users view the app in landscape mode.
  • 4% of users switch the map type. Of those, the average user switches it twice - probably trying the alternative and switching back, which suggests my default selection is the preferred viewing mode.
  • 4% of users center the map to their current position. Very few people do it more than once.
  • Of the users who have long-pressed an earthquake in the List View, almost none do it more than once.
  • 7% of users install the widget.
  • 0.1% of users use the Live Folder.
  • The average number of refreshes is one every 3hrs. The default is once per hour.
  • 10% of users manually refresh the earthquake list, but most do so only once.
  • The app is throwing exceptions when parsing the incoming earthquake feed for 20% of users. Those users see an average of 6 exceptions each per day (approximately equal to the typical number of daily refreshes). There doesn't appear to be a connection between these failures and the user's country, network, or device (pivoting the error page against each of these categories shows proportions similar to the overall user-base).
My new priorities
  • I need to track down the cause of those exceptions!
  • It's probably not worth creating an optimized display specifically for small screens or landscape viewing.
  • I should consider advertising the Widget within the app.
  • If I choose to localize I should prioritize German, Japanese, and Spanish.
  • Both List View and Map View seem equally popular - this runs counter to an assumption I made about likely user preference.
  • Most users are refreshing less often than the default, and very few people are regularly manually refreshing. Without a distribution, the average isn't particularly helpful in figuring out if the options need changing.
  • The menu options aren't being used. Perhaps I should try moving the most popular one to the main UI to see if discoverability is affecting use.

In truth I've probably performed this analysis a little early. To do a more thorough study it would be smart to collect a couple of weeks of data.

I'm now all set to perform A/B testing on future releases. By tracking a unique version number page within each release I can use Analytics' pivot functionality to track changes in behavior, demographics, and exceptions based on the changes I make for each release.
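As a sketch of the version tracking (the "/version/" prefix is an arbitrary naming choice of mine; the PackageManager lookup is the standard way to read the manifest's versionName):

```java
// Record the release as its own "page" once per session, so reports
// can later be pivoted by app version.
try {
    String version = getPackageManager()
            .getPackageInfo(getPackageName(), 0).versionName;
    tracker.trackPageView("/version/" + version);
} catch (PackageManager.NameNotFoundException e) {
    // Shouldn't happen when looking up our own package.
}
```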

I also discovered some gaps in my tracking that need to be filled in the next release: 
  • How many people click an earthquake to view it on the map.
  • Exactly which exceptions are being triggered in the parsing routine.
  • The distribution of update frequencies, not just the average.
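One way to capture that distribution is to map each user's refresh interval to a coarse bucket and report each bucket as its own page, so the content report becomes a rough histogram. A sketch (the bucket boundaries and page names below are my own invention, not the app's actual settings):

```java
public class RefreshBuckets {
    // Illustrative bucket boundaries in minutes; the real preference
    // options in the app may differ.
    private static final int[] BOUNDS = {15, 60, 180, 360};
    private static final String[] LABELS = {
            "/refresh/15min", "/refresh/1hr", "/refresh/3hr",
            "/refresh/6hr", "/refresh/manual_only"};

    // Map a measured refresh interval to a page name, so each bucket
    // shows up as its own "page" in Analytics.
    public static String bucketFor(int intervalMinutes) {
        for (int i = 0; i < BOUNDS.length; i++) {
            if (intervalMinutes <= BOUNDS[i]) return LABELS[i];
        }
        return LABELS[LABELS.length - 1];
    }

    public static void main(String[] args) {
        System.out.println(bucketFor(60));   // /refresh/1hr
        System.out.println(bucketFor(200));  // /refresh/6hr
    }
}
```

Each bucket then appears as a separate row in the content report, which is far more useful than a single average when deciding whether the default options need changing.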


  1. I recently added analytics to my app as well, and what I found out is that analytics FAILS consistently. My app has over 15000 users, mainly from Italy (it's an Italian app...) and yet I get data from OTHER countries (!) But not a SINGLE hit from Italy. Any idea as to why? Thanx Reto...

  2. Re: the point about active installs...

    My app has 50,000+ downloads and 4.5 rating (750+ ratings) but only a 50% active install rate.

    I would say the average rating indicates the quality of the app, whereas the active install rate indicates the usefulness of the app to the users who've been attracted enough to download it.

    A lot of users download apps out of curiosity with no intention of keeping them long term. This is especially true for niche apps.

    Also consider the pre-2.2 problem of lack of space for apps on the device. In that case, even used and liked apps are uninstalled.

  3. Interesting. I need to start playing with it.

  4. One important point you've not mentioned is privacy. Whilst Earthquake is a relatively innocuous app, any app collecting this sort of usage data from users should at the very least warn the user you're doing this, and ideally give them the option to switch off tracking.

  5. @Webreaper: Privacy is an interesting question. Do you expect the same privacy declaration and analytics opt-out from each web site you visit? If not, why? What's the difference between my using the Earthquake app and visiting the web site? Both will record approximately the same data (web analytics will actually capture a bit more).

  6. @Reto Meier: the main difference is that a web site is hosted remotely and so users expect a certain amount of analytics could be going on server-side. An app is hosted locally and so there is a different level of expectation.

    I think the ADT Eclipse plugin has a check box somewhere asking if it's OK to send usage stats to Google.

    It probably boils down to whether you see the app as a glorified website or just software.

  7. Privacy is extremely interesting, especially in Germany where using Google Analytics on your website without informing your visitor is considered a violation of privacy policies. German law does not allow you or Google to store personal data (your IP is regarded to be personal) without consent or a legal obligation. Google did take this into account by adding the _anonymizeIP() methods to the tracker. So if you distribute your app in Germany, you should definitely inform the users (or use _anonymizeIP()).

  8. It's really hard to find information on phone analytics, and it's nice to see someone out there who's spending some time researching it. We are all so busy rushing to get an app out there that we're forgetting that it has to do something for our business... and in order to know if this is happening or not we need to be measuring it as accurately as we would our website.

  9. Anonymous, 3:34 am BST

    Hi Reto,
    I'm curious, what was your approach to code the exception tracking?
    Did you do a huge try/catch and add the tracking code there, or is there a smarter way of doing this integrated into Android?