Mobile Native Browsers vs In-App Browsers

I was surprised to find that Adobe Analytics (Adobe SiteCatalyst or Omniture, depending on how far back you go) doesn’t allow you to report on mobile browsers. I’ve been using Adobe SiteCatalyst for many years and had always assumed the ability was there. I am actually a little ashamed that I hadn’t noticed this before.

SiteCatalyst Device Type with Browser Breakdown

The main reason I looked at this was that a colleague sent me a link about Embedded Mobile Browser Usage on http://www.lukew.com. The blog post is more about how developers should be testing on in-app versions of mobile browsers as well as the native ones, mainly because the in-app versions of Safari and the native Android browser have less functionality and weaker rendering performance. Luke explains that the in-app version of Safari, for example, does not include the Nitro JavaScript engine, certain Safari caches, or the “asynchronous” rendering mode. This means that web sites could appear to load more slowly or contain errors that would normally not be detected if testing on the native versions.

As Facebook and Twitter app usage continues to grow, it’s going to become more and more important to understand the differences between these browser types and how users browse your sites when using these apps. In addition, when users click through to your content from apps such as Facebook and Twitter, the visits appear as either Direct or Typed/Bookmarked (depending on which analytics package you use) unless you are using some form of campaign tracking.
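
One option, if you want those visits attributed correctly, is to tag the links you share with Google Analytics campaign (utm) parameters. Here is a minimal, hypothetical sketch – the URL and parameter values below are placeholders rather than anything from a real implementation:

*** CODE ***

// Hypothetical example: building a link to share on Facebook with Google
// Analytics campaign (utm) parameters, so the visit is attributed to the
// campaign rather than appearing as Direct or Typed/Bookmarked.
var articleUrl = "http://www.example.com/some-article";    // placeholder URL
var taggedUrl = articleUrl +
    "?utm_source=facebook" +        // where the link is being shared
    "&utm_medium=social" +          // the channel
    "&utm_campaign=article-share";  // a campaign name of your choosing

*** END CODE ***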

When I realised that Adobe Analytics does not track mobile device browsers, I turned to our implementation of Google Analytics. Fortunately, GA does track in-app browsers on iOS – so you can see data for Safari and Safari (in-app).
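
If your analytics tool does not split these out for you, a rough user-agent check can provide a similar flag to pass into your own tracking. This is only a heuristic sketch: iOS in-app web views typically report a user agent containing “Mobile” but not “Safari”, while the native Mobile Safari includes both, and individual apps may differ.

*** CODE ***

// Rough heuristic sketch: detect an iOS in-app web view from the user agent.
// In-app (UIWebView) user agents usually include "Mobile" but omit "Safari";
// the native Mobile Safari browser includes both. Not guaranteed for every app.
var ua = navigator.userAgent;
var isIOS = /iP(hone|od|ad)/.test(ua);
var isInAppBrowser = isIOS && /Mobile/.test(ua) && !/Safari/.test(ua);

*** END CODE ***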

What I have found in my initial dive into this is that users browsing our sites with an in-app version of the browser consume less content (in terms of pages per visit), have a higher bounce rate and are on the site for about half the time of users on the native version.

In-app browsers vs native mobile browsers

What is going to be hard to work out is whether the cause of this is behavioural or technical.

If the site renders slowly, contains certain errors, or is not fully optimised for the in-app experience, you could attribute some of the reduction in usage to technical causes.

On the behavioural side, you need to consider what users are doing. If I am in my Facebook app on my phone, it’s probably while I am in the middle of doing something else: potentially on the toilet, in a meeting (hopefully my boss is not reading this) or in transit, so I have limited time to view the content.

To take this point further, if I am in my Facebook app, I’m also interested in seeing what other content my friends have posted. So I might just read the article that a friend posted, then tap back in the app to return to my feed and continue my Facebook journey.

One of the main things that I tell colleagues at my job is that Analytics is great at telling you what happened. Trying to find the reasons behind a person’s behaviour is not generally achievable. You can find commonalities between certain types of users, segment them and try to implement changes to encourage other users to behave in the same way. But understanding the drivers for users to act in a particular way is not something that I believe Analytics will answer.

You need to be talking to your users and understanding why they are doing what they are doing – whether that be through social media, email or surveys. You really need all the information to make decisions, and I believe that Analytics is only one part of it in this case.

What is Cohort Analytics?

Cohort Analytics in this context is the measurement of how often a user returns to a website over a given period of time. The better you understand how well you retain your users, the better you will understand how best to monetise them – which is what we are all chasing in digital publishing.

This is not the same as the standard vanity stats for return visits or visitor retention that you may find in Google Analytics or Adobe’s SiteCatalyst; cohort analysis will give you a more detailed understanding.

Also, if done correctly, you can use cohort analysis to measure not only general users of your website but also registered users, logged-in users, and users that purchase or convert.

Cohort analytics is quite a new concept for digital publishing. In the past, CPMs have been high enough for publishers to only have to worry about unique users, visits and pages. But now, with the advent of Google AdSense and Facebook Ads, advertisers can target audiences directly, so publishers need to focus on other ways to measure and monetise their audiences.

How to measure retention

Cohort Analytics is not something that is available out of the box with most standard web analytics tools.  Unfortunately it takes a bit of hacking to get it to work with Google Analytics – even then, it is quite limited as you can only measure over five units of time.  This will all become apparent shortly.

In addition, this explanation will only give you a basic overview for general tracking.

Google Analytics allows you to configure custom variables – of which there are five – and, when set with visitor scope, these can persist over a number of visits.

See Google’s documentation on custom variables here.

A custom variable should be set for each time period that you wish to track. In this instance, it will track user retention over a rolling five-month period.

Here is some example code for month one using the first of five custom variables:

*** CODE ***

pageTracker._setCustomVar(
      1,                   // Slot #1 of the five available, used for the first month.
      "Month",             // The name of the custom variable.  Required parameter.
      "January 2011",      // Sets the value of "January 2011" against "Month" for this visitor.  Required parameter.
      1                    // Sets the scope to visitor level so the value persists across visits.
 );
pageTracker._trackPageview();

*** END CODE ***

Once this code is in your site, it will need to change each month. The two parameters that need to change are the first and the third: when month 2 begins, the slot number increments and the value becomes the new month.

*** CODE ***

pageTracker._setCustomVar(
      2,                   // Slot #2 of the five available, used for the second month.
      "Month",             // The name of the custom variable.  Required parameter.
      "February 2011",     // Sets the value of "February 2011" against "Month" for this visitor.  Required parameter.
      1                    // Sets the scope to visitor level so the value persists across visits.
 );
pageTracker._trackPageview();

*** END CODE ***
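
If editing the tag by hand each month is impractical, something like the sketch below could work out the slot and month label automatically. This is a hypothetical helper rather than part of the standard setup – it assumes the same legacy ga.js pageTracker object and simply rotates through the five slots on a rolling basis.

*** CODE ***

// Hypothetical helper: works out the current month label and which of the
// five custom variable slots to use, so the tag does not need manual edits.
var now = new Date();
var monthNames = ["January", "February", "March", "April", "May", "June",
                  "July", "August", "September", "October", "November", "December"];
var monthLabel = monthNames[now.getMonth()] + " " + now.getFullYear();

// Rotate through slots 1-5 using a running month count, giving the rolling
// five-month window described above.
var monthsSinceYearZero = now.getFullYear() * 12 + now.getMonth();
var slot = (monthsSinceYearZero % 5) + 1;

pageTracker._setCustomVar(slot, "Month", monthLabel, 1);  // visitor-level scope
pageTracker._trackPageview();

*** END CODE ***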

Once this has been implemented correctly and has gathered enough data, you will hopefully see some nice results. In addition, if you are clever with your naming and strategy, you will be able to measure much more than this.

The results

Cohort Analysis

Hopefully, after a period of time, you will see something like the image above. The table shows how many users return in the months following their first visit.

The reason the Month 1 statistics are all 100% is that every user in Month 1 is new to the site. Month 2 shows how many of the Month 1 users returned in Month 2, and Month 3 shows how many of them returned in Month 3.
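
As a quick worked example (with made-up numbers), the percentages in the table are simply the number of returning users divided by the size of the original cohort:

*** CODE ***

// Hypothetical figures, purely for illustration of the calculation.
var januaryCohort = 10000;       // users whose first visit was in January
var returnedInFebruary = 2400;   // of those, how many came back in February
var month2Retention = (returnedInFebruary / januaryCohort) * 100;  // 24%

*** END CODE ***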

By fully understanding how long your users keep coming back to your site, you can really start to focus on some new metrics.

You will be able to work out the Lifetime Value (LTV) of your users, which helps you decide how much you may want to spend on marketing. By understanding this, you can ensure that your marketing stays profitable.
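
As a rough sketch of the sort of calculation this enables (every figure below is hypothetical), you could estimate LTV from the average revenue per user per month and the average lifetime suggested by the retention curve, then compare it against what you pay to acquire a user:

*** CODE ***

// Hypothetical LTV estimate - all of these numbers are made up for illustration.
var revenuePerUserPerMonth = 0.50;   // e.g. average ad revenue per active user
var averageLifetimeMonths = 3;       // estimated from the cohort retention curve
var lifetimeValue = revenuePerUserPerMonth * averageLifetimeMonths;  // 1.50

var costPerAcquiredUser = 1.00;      // e.g. average paid marketing cost per new user
var marketingProfitable = lifetimeValue > costPerAcquiredUser;  // true in this example

*** END CODE ***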

You can also start to focus your development attention on lengthening the lifetime value of your users.