Flurry "Return Rate" Finally Offers REAL Retention Data

On the Sourcebits blog today, my colleague Andrew Pearlman, @adrenalytics, wrote an article about mobile retention data and Flurry's new metric launched this week. It's an important topic in analytics, since Flurry's one of the most widely adopted tools by mobile developers. His post:

Flurry finally caught up with the rest of mobile development this week when it introduced “Return Rate.” This metric is new to Flurry, but it has long been a built-in basic with most other mobile analytics providers, and it matches the common definition of user retention: users are counted only on the days they actually come back.

Return Rate doesn’t completely replace Flurry’s misleading and potentially dangerous “Rolling Retention” report - previously the only retention metric the company offered. But at least Return Rate is more prominently featured on the platform, and it’s the metric developers should now use instead of Rolling Retention.

Flurry is rolling out the “Return Rate” metric to developers as a beta, just in time for the second annual Flurry Source14 conference later this month.

What’s the difference between “Return Rate” and “Rolling Retention”?

Rolling Retention counts a user as retained on every day prior to their last visit - even if they didn’t actually open your app on any of the days in between. Imagine a user who visits your app once, forgets about it for weeks, then tries it again. They will be counted as a daily active user (DAU) on every day between those two visits - which makes Rolling Retention an inaccurate measure of engagement. In short, your DAU numbers look inflated and you can’t see real user behavior.

What does this look like for your app data? The diagram below is an example of 5 users tracked over 10 days. Compare how Flurry’s rolling retention vs. return rate metrics would report DAU.
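The difference between the two counting methods can be sketched in a few lines of code. This is a minimal illustration with hypothetical visit data (the user names, visit days, and helper functions below are invented for the example, not Flurry's actual data model or API):

```python
# Hypothetical visit logs: for each user, the set of day numbers
# (out of a 10-day window) on which they actually opened the app.
visits = {
    "user_a": {1, 2, 3},       # engaged early, then churned
    "user_b": {1, 9},          # one visit, a long gap, then a return
    "user_c": {1, 4, 7, 10},   # steady returner
    "user_d": {1},             # visited once, never came back
    "user_e": {2, 5, 8},
}

def dau_return_rate(visits, day):
    """Return Rate style: count only users who actually opened the app that day."""
    return sum(1 for days in visits.values() if day in days)

def dau_rolling(visits, day):
    """Rolling Retention style: a user counts as 'active' on every day between
    their first and last visit - including days they never opened the app."""
    return sum(1 for days in visits.values() if min(days) <= day <= max(days))

for day in range(1, 11):
    print(f"day {day:2}: return-rate DAU = {dau_return_rate(visits, day)}, "
          f"rolling DAU = {dau_rolling(visits, day)}")
```

On day 5, for example, only one user actually opens the app, but the rolling-style count reports three "active" users, because users with later visits are back-filled as active. The rolling count can never be lower than the real one.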

Why is Rolling Retention dangerous?

Developers that use rolling retention can’t accurately compare daily usage over two time periods (aka user cohorts). The older time period will inevitably have a higher rate, because more time has passed for the users to return. Imagine pushing out an app redesign, and comparing retention rates from the current and previous versions. In a rolling retention report, it would look like daily retention has declined with the new design, when it actually hasn’t changed or could potentially be higher.
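The cohort bias described above can be demonstrated with a quick simulation. Here both cohorts have identical behavior (the same daily return probability); the only difference is how long each has been observed. The cohort sizes, probabilities, and function names are all assumptions made up for this sketch:

```python
import random

random.seed(0)

def simulate_cohort(n_users, observed_days, p_return=0.2):
    """Each user independently returns on each observed day with the same
    fixed probability - so both cohorts behave identically by construction."""
    return [{d for d in range(1, observed_days + 1) if random.random() < p_return}
            for _ in range(n_users)]

def rolling_day_n(cohort, n):
    # Rolling style: a user counts as retained on day n if they
    # return on day n OR on any later day.
    return sum(1 for days in cohort if any(d >= n for d in days)) / len(cohort)

def return_rate_day_n(cohort, n):
    # Return Rate style: a user counts only if they return on day n itself.
    return sum(1 for days in cohort if n in days) / len(cohort)

old = simulate_cohort(10_000, observed_days=30)  # installed 30 days ago
new = simulate_cohort(10_000, observed_days=7)   # installed 7 days ago

print("rolling day-7 retention:    old =", rolling_day_n(old, 7),
      " new =", rolling_day_n(new, 7))
print("return-rate day-7 retention: old =", return_rate_day_n(old, 7),
      " new =", return_rate_day_n(new, 7))
```

Even though the two cohorts are statistically identical, the older cohort's rolling day-7 retention comes out far higher, simply because its users had 23 extra days in which a later visit could back-fill day 7. The return-rate numbers, by contrast, agree up to sampling noise - which is why Return Rate is the metric to use when comparing an app redesign against the previous version.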

I previously told app developers to avoid Flurry because of its rolling-retention data gap. But offering “Return Rate” reaffirms that Flurry’s free analytics solution is still a viable cost-saving option for start-ups building their data programs. Plus, the improved UI looks great.

Here’s a video from Flurry about their new retention metric:

Andrew Pearlman is the head of user acquisition and analytics for Sourcebits. Check out his recent SlideShare on refining mobile app analytics. Sourcebits is a global leader in mobile app design and development. Sourcebits has created more than 500 products, including 30+ chart-topping apps. Clients include many top enterprises (SAP, Intel, Coca-Cola, P&G) and startups (Skyfire, Touch of Modern, Posterous, Twitpic).

Mona Mohamed M. ALI

Marketer, Communicator, Public Relations Professional, Independent Change Management Strategy Consultant, Poet


For app developers the scene is different than for app owners, and different again than for app users. For brands, the crucial questions are these: Which mobile devices support these applications (all types, or only certain types)? What is the geographic coverage by retail zones? Can we upload texts and videos directly to the application, or only link to social media first? How are mobile app updates and maintenance handled? What is the long-term validity of this mobile application?
