Software value verification is a new concept for me.
Is the agile community paying attention to verifying value in software? Not so much.
I have been trying to get my arms around measuring value -- having been inspired by what David Hussman and Jeff Patton have been calling Discovery in road-shows stretching from Minnesota to Mumbai.
In the post "We're Fast, We're Not Cheap, But Are We Good?" I paraphrase what David opens with in his Discovery & Delivery meme:
"We've gotten good at delivery at the expense of discovery."
Discovery means learning what your user community, or your investors, value about your software. Why does this matter? Because few teams have devised a verifiable method of tracking it.
Discovery is an equal partner to Delivery in our mission as programmers, since it doesn't matter how fast we deliver if we deliver junk.
Concise Definition of Value
The concept of value can be squishy. To be clear, my use of the word value is limited to
- Satisfaction in the user community that uses your software, or
- Satisfaction in the investment community staking your software.
High Value = High User Community or Investor Satisfaction.
If Discovery is what the agile community needs to improve upon, then verifying value is what we need to learn how to measure.
How do we know the software we're building provides the most value to our user community or our investors?
Direction Component in Velocity
When David questioned the utility of the iteration dashboards our community has become so enamored with, I realized that most teams assume they're tracking velocity, when at best they're tracking speed.
Do you think we measure direction as well as speed? Our backlog is supposed to supply direction (i.e., encompass vision and priorities), but I contend we measure only speed under the delusion that it is velocity.
Most of us, particularly product owners, don't have a clue what the right direction is because we haven't taken the time to understand the people using the software (i.e., the people supposedly deriving value).
We are not measuring where the proverbial rubber meets the road. Say, for example, our software is a social networking site; simple value measures might be:
- How many members did we enroll?
- How engaged are they on the site?
- How good are we at retaining members and fostering brand loyalty?
Criteria like these might give us more sensible and verifiable direction; a rough sketch of how they might be computed follows. I long to talk about these criteria in a retrospective, rather than pat ourselves on the back over how fast we completed stories.
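To make the criteria concrete, here is a minimal sketch computing all three from a raw event log. Everything in it -- the Event type, its fields, and the event kinds -- is an illustrative assumption, not any real analytics API:

```python
# Hypothetical sketch: the three value measures above, derived from raw events.
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List

@dataclass
class Event:
    member_id: str
    kind: str          # e.g., "enrolled", "posted", "commented", "visited"
    at: datetime

def value_measures(events: List[Event], start: datetime, end: datetime) -> Dict[str, float]:
    in_period = [e for e in events if start <= e.at < end]
    new_members = {e.member_id for e in in_period if e.kind == "enrolled"}
    engaged = {e.member_id for e in in_period if e.kind != "enrolled"}
    # Retention: of members enrolled before this period, how many came back?
    prior_members = {e.member_id for e in events if e.kind == "enrolled" and e.at < start}
    retained = prior_members & engaged
    return {
        "new_members": len(new_members),
        "engaged_members": len(engaged),
        "retention_rate": len(retained) / len(prior_members) if prior_members else 0.0,
    }
```

Numbers like these, reviewed iteration by iteration, would tell us far more about direction than a story-count burndown ever could.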
The Earth is Flat
If the well-intentioned people prioritizing the backlog had a verifiable grasp of the value delivered to the user community, THEN we might have the component of direction in our Velocity. It has been my experience that the people prioritizing backlogs are making wild-assed guesses based on scant data.
It's easy to say we're all paddling in the right direction, until we paddle ourselves off the edge of a flat earth.
We are good at patting ourselves on the back in retrospectives simply because we've completed things. We are not nearly as forthcoming, or introspective, about the value of what we've completed.
Raving Applause?
Do users or investors rave about our software? Sadly, none that I've built. I am either a crappy programmer or I am disconnected from whatever value the users or investors derive from the software I have helped to build.
Sidebar
The Google News team was nearing their debut release. Six engineers respectfully disagreed over which feature they had time to include in their first release. Three engineers vehemently supported Search-by-Date. Three engineers passionately supported Search-by-Location. Deadlock.
Google VP Marissa Mayer made the decision to polish up existing functionality and not add new functionality. So they released Google News without Search-by-Date and without Search-by-Location! Shortly after the roll-out, they were bombarded with emails asking "How come I can't Search-by-Date?"
Email requests were running about 100 to 1 for Search-by-Date over Search-by-Location.
Guess which functionality had top priority in the next iteration?
Wrap a Bow on It
The Google News story demonstrates verifiable direction change. Contrast this with the wild-assed guessing we've grown accustomed to, and complacent with. There's no substitute for butts-in-the-chair data when verifying team direction. This is the verifiable direction we need to evolve toward.
The Interaction Design community has much to teach about understanding users and providing trackable investor value. I want to learn more about persona development, A/B testing, eye-tracking studies, feedback experiments, and the like. Google runs from 50 to 200 experiments at any given time on their websites around the world (e.g., a test of a new Google News homepage design).
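For readers new to A/B testing, here is a hedged sketch of the arithmetic behind the simplest version: compare a control page's conversion rate against a variant's with a two-proportion z-test. The visitor and conversion counts are made up purely for illustration.

```python
from math import erf, sqrt

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """z statistic and two-sided p-value for rate B vs. rate A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error of the difference
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))    # two-sided, via the normal CDF
    return z, p_value

# e.g., 480 of 10,000 control visitors converted vs. 600 of 10,000 on the variant
z, p = two_proportion_z(480, 10_000, 600, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p says the variant's lift is probably real
```

A test like this turns "we think users prefer the new design" into a direction change we can actually verify.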
I would like to obtain and act on user feedback rather than being steered by the whims and notions of a user proxy (i.e., a Product Owner). I want to identify meaningful measures and avoid what Eric Ries calls vanity metrics.
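To illustrate the distinction, with numbers fabricated purely for the example: a cumulative total only ever climbs and can hide decline, while a per-cohort retention rate is something a team can actually act on.

```python
# Fabricated, illustrative data only.
weekly_signups = [1000, 1100, 1200, 1300]      # each week's new members
returned_next_week = [700, 600, 450, 300]      # how many of them came back

total_users = sum(weekly_signups)              # vanity: 4600 and always rising
retention = [r / s for r, s in zip(returned_next_week, weekly_signups)]

print(total_users)                              # 4600 -- looks like success
print([f"{rate:.0%}" for rate in retention])    # ['70%', '55%', '38%', '23%'] -- a crisis
```

The first number invites back-patting; the second would change the next iteration's direction.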