I've recently been in a few discussions with colleagues about software innovation and how it relates to product management, so I thought I would share a few thoughts. We talk a lot about opportunity, and frankly, we usually use the word to defend or soften a product decision to an end user (aka client). In this way we use "opportunity" to describe a problem that needs fixing - since many of us bill for the work, "opportunity" equates to making more money, at least for a consulting company, and the term is much more palatable than "problem." Instead, we should identify problems as problems and spend our time working through solutions - and really, isn't "solution" a better term than "opportunity"? In any case, I digress; the topic I want to explore is innovation.
What got me thinking about this is an encounter I had earlier this year with a process improvement group. This is the scenario: I'm working on a project to rewrite existing software using new technology and better design principles. Most of the requirements were derived from interviews with the management teams using the existing software (they, in turn, presumably worked with their teams to come up with innovations and improvements). Fairly straightforward, right? We then took those requirements and began exploring implementation, converting the requirements into epics and user stories to get the conversation going with our team, derive executable bits, and present them back to the business users as we moved from bits, to MVP, to production release and consumption. We take a very agile approach to software development, so the first hurdle was getting our business users to adapt to this practice - not an easy task. Our BUs tend to throw everything into the requirements, and their initial expectation was that everything listed would get done - which made it very difficult to get things through UAT, even though we constantly explained that development was a planned cycle moving from simple to complex. In their minds, "working software" meant that everything they wanted in the requirements was done. Once again I digress, but this time to provide some context.
In actuality, there's an inherent problem with this requirements-gathering method. First, the innovations asked for by the business users usually relate to the existing software platform and amount, for the most part, to tweaks or small process improvements. Second, when those requirements come through the management filter, they tend to be broad-stroke in some regards and myopic in others (depending on what the manager perceives as important). Third, the methods the business user defines in requirements usually mirror the original software, without considering how the entire application could be improved by changing the underlying approach - what I mean is that when staff are trained on a piece of software and are used to it, it's hard for them to think about an entirely new methodology outside of that experience and context. So to continue my story, some of the things we introduced in the rewrite involved using an off-the-shelf BPM engine. Our thinking was, "why reinvent the wheel when so many companies out there have already figured it out?" This became the first point of contention - even though a BPM package may satisfy the baseline requirements, the software itself usually imposes some fairly rigid rules within a fixed methodology. It's a tradeoff: you get better accuracy and many of the bells and whistles you're looking for, but to take advantage of them you need to conform to the methodology designed into the software. Once it was implemented, the next challenge was getting our business users to understand that the methods had changed - the outcomes are the same, but by using something more standardized, it becomes easier to use the same software for similar tasks. This change is rather disruptive; it produces a lot of user angst and confuses people who are still used to the original, legacy software.
So around the time our users had gotten used to the changes introduced by the BPM technology, we were introduced to a new process improvement group. Seeing this as an "opportunity," we brought the group in to look at the existing methods and suggest improvements that could be incorporated into the newly designed software. This included analysts actually sitting down with the business users and observing how they use the software. We welcomed the additional resources and hoped that a few "magic bullets" would be identified to make our newly rewritten software fantastic, saving money and time by applying technology to process improvements. So now we're getting to the crux of my article.
The report that was delivered was interesting, not for all the suggested changes and the aggregate analysis defending changes to the overall workflow, but because it copied, almost requirement by requirement, what we had already defined and planned in the rewrite. It validated what we already knew and made the same improvement suggestions we had already planned. What we received was evolution rather than the hoped-for revolution. And please don't think I'm diminishing the importance of what the process improvement group did - it's nice to have validation of what we are doing and have planned, and there's certainly value in that. I think that as a group we had much greater hopes and that our expectations were unrealistic. I also think that in our world of software development we spend a lot of time trying to improve things rather than searching for new solutions - that outside-the-box solution so new it can redefine what we are doing and how we are doing it. That's not to say there's no value in evolution. One of the easiest things to defend is an improvement that cuts the bottom line - even small process improvements can do this, and the accumulated savings can be quite significant.
So which is better? In my opinion, revolution is thinking about a problem and coming up with a solution so novel it doesn't fit within the original problem statement. The downside is that it can be very risky, and no, the ideas don't often happen overnight. Evolution, on the other hand, involves small, incremental changes and carries less risk, but if that's all you rely on, you can be trumped by a new market contender with revolutionary ideas. I really think we should use both methods but, in general, try to think in revolutionary terms - more of a mindset than a strict method. We should embrace the good ideas backed by real data, and defer the ideas that offer little to the bottom line and have no data to support them.
How I long for a time when I receive information that is filtered for relevancy (important to me and customized for my needs), from several sources, all in one place, through a single sortable and filterable delivery mechanism. There's been a metamorphosis in thinking about communication that started with email, moved through distribution lists, forums, communities, and wikis, and has now arrived at messaging aggregation with software like Slack. This last example is something transformative that's been happening for several years and is only now beginning to get attention outside of software development. You would think this is an example of evolution, but when it moves outside the development world it becomes revolution.