Thursday, November 20, 2014

Quantified and Qualified Data



I'm back on the data bandwagon, so please excuse my persistence. In my last post I made some statements about the importance of data and listed some ways we, as product managers, should assemble data to support our product decisions. I'm continuing here with some related thoughts I was remiss in pointing out last time. In that post I left some assumptions about the data itself implicit, so this is a bit of a clarification: what I called "Data" last time should really have been called "Quantified and Qualified Data" - I'll explain.

Monday night (2014.11.17) I attended the Atlanta Mobile Developer's Group meeting in Buckhead (held at Alliance One and hosted by eHire). The presentation, given by Jeff Steinke, was called "The Tau of Mau: How to turn meaningless app downloads into engaged users," and it was one of the better talks I've attended this year. In this case "MAU" is Monthly Acquired Users and refers to a trend in the mobile industry of measuring success by registrations. Jeff opened with a graph running from 0 to 700K that hockey-sticks over a period of several months and, with little explanation, asked whether - based on that bit of information alone, and assuming the room was full of potential investors - we would be willing to invest.

Without giving too much more of his presentation away, and to get to the point of today's post: several slides in, Jeff talked about how data should be both Quantified AND Qualified, and how that first exercise put all the reliance on the quantity, not on how qualified the data was. For mobile app users (and really for most B2C web users), downloads mean very little without engagement. For one of Jeff's companies (Less Meeting), he listed three things that gave them a better picture of success: the download (registration); completion of a short tutorial; and finally use of the app to schedule a meeting. The talk boiled down to engagement and how you define it (for most products that means conversion: if the user isn't actually using your product, then a free download has little meaning). Jeff had done enough analysis to determine that if a new user accomplished all three things on his list, there was a high degree of certainty that the newbie would become a real, paying customer. Back to the initial example Jeff used to illustrate unqualified data: that company had a lot of MAU but very little actual ongoing engagement. It's hard to monetize users if your application is a "one trick pony."
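To make that concrete, here's a minimal sketch of that kind of three-step qualification funnel. The event names, the sample data, and the "all three steps" rule are my own illustrative assumptions, not Jeff's actual implementation:

```python
# A hypothetical per-user event log: which of the three funnel steps
# each user has completed. Names and events are made up for the sketch.
users = {
    "alice": {"registered", "finished_tutorial", "scheduled_meeting"},
    "bob":   {"registered"},
    "carol": {"registered", "finished_tutorial"},
}

# The three steps that (per the assumption above) signal a likely
# paying customer: register, finish the tutorial, schedule a meeting.
QUALIFYING_EVENTS = {"registered", "finished_tutorial", "scheduled_meeting"}

def is_qualified(events):
    """A user counts as qualified only after completing all three steps."""
    return QUALIFYING_EVENTS <= events

qualified = [name for name, events in users.items() if is_qualified(events)]
print(f"{len(qualified)} of {len(users)} users qualified: {qualified}")
# -> 1 of 3 users qualified: ['alice']
```

The headline number (three registrations) and the qualified number (one engaged user) tell very different stories, which is exactly the gap Jeff's example exposed.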

From my own personal experience, I've worked on several applications where the project decision was made for me, and ultimately that decision was flawed. The problem is in looking at the raw data without applying sound reasoning to filter it into something that's qualified. In the earlier days of the web there was a lot of emphasis on getting application registrations. This was based on the old-school thought that when people buy software they have skin in the game and, as a result, become users. The issue with that assumption is that the web changed the paradigm: all those early companies (most now defunct) based their logic on sheer numbers, and it was relatively easy to get funding (everyone wanted to invest in the next new startup and become an internet millionaire!). Saying you had millions of "users" (meaning registrations) sounded awesome to investors who would hand over money just for the opportunity, without any sound reasoning behind what would power the monetization of those users. When the dot-bomb dropped and there was a rush to convert all those free registrants into paying customers, the companies fell like dominoes. The analysis was flawed.

The other metric often used to sell a company is the number of site visits. The argument is that if you have a lot of visitors, you can always build a revenue model on page views and click-throughs. As someone who has also worked in this type of environment, I can say this too can be a flawed statistic. When you look at the actual number of views you need to make any appreciable money from this model, you're not making much until you get into the hundreds of millions. The corresponding likelihood of a click-through is likewise easy to overstate: if the keywords driving those ads aren't relevant to the user (meaning things have to align just right - user type, application type, the paid-for words, and the gods!), the revenue gains may look significant as single instances but are flawed as a sum. These types of campaigns are also cyclical in nature, so they can rarely be relied upon (one exception is a "key accounts" model in which you have broad-spectrum advertisers with already established brands). One technique to help you qualify these numbers is to tag pages to ensure that the user stays on the page long enough to actually see the ads placed. Another is to use SEM to aid in placing inbound, specialty pages, which tends to have a synergistic effect on organic search.
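To see why the view counts have to be so large, here's a quick back-of-the-envelope sketch. The CPM, click-through rate, and cost-per-click figures are assumptions I've picked for illustration; real rates vary widely by market and keyword:

```python
def ad_revenue(page_views, cpm=2.00, ctr=0.001, cpc=0.50):
    """Rough ad revenue estimate under assumed rates:
    $2 per 1,000 impressions, 0.1% click-through, $0.50 per click."""
    impression_revenue = page_views / 1000 * cpm
    click_revenue = page_views * ctr * cpc
    return impression_revenue + click_revenue

for views in (1_000_000, 10_000_000, 100_000_000):
    print(f"{views:>11,} views -> ${ad_revenue(views):>10,.2f}")
# ->   1,000,000 views -> $  2,500.00
# ->  10,000,000 views -> $ 25,000.00
# -> 100,000,000 views -> $250,000.00
```

Under these assumed rates, a million views is worth only a few thousand dollars; the model doesn't produce serious revenue until traffic runs into the hundreds of millions of views.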

So what else can you do? I think that using experts to help you decide can go a long way toward qualifying the data (I'm a fan of Data Science). I also think it's very important to use both the experience of your team and the information you garner from existing customers to determine how that data rolls up into something usable. If you look at something and don't understand it, re-examine the data to see whether it fits some pattern or anti-pattern that makes sense. Another idea is to leverage your network of technical experts; I'm sure we all know and have worked with professionals who have "the eye" for gaining insight from data. Ask lots of questions, gather your data, and make sense of it. Put monetary figures against what's happening and compare them to what you know and possibly don't know. Strive for understanding, and make the data work for you.

More information regarding the Atlanta Mobile Developer's Group: http://www.meetup.com/Atlanta-Mobile-Developers-Group/

Jeff Steinke's blog: http://www.jeffsteinke.com/

(also published on LinkedIn)
