‘Analysis’ and ‘data’ are very broad terms. Often people say they have done the analysis, when really they have done an analysis. Different types of analysis give different perspectives on the same question.
I work for GDS assessing the benefits of new digital services. This involves a lot of financial modelling and working with number-crunching experts at HM Treasury.
Andrew Greenway and I had a great chat about his recent blog post, and he’s invited me to write a follow-up with a more financial twist.
Valuable users
Andrew asked whether putting search on the Government Service Design Manual was worthwhile. He looked at the effect of search on user experience and user journeys. But how did adding search stack up financially?
We’ve had 5,881 visits to the service design manual search over 6 months, and we reckon it took about £3,000 worth of internal developer time to build. The code should stay useful for about 3 years, so extrapolating that rate of traffic over its lifetime gives about 35,000 visits. In other words, we’ve invested £3,000 to facilitate about 35,000 visits - that’s 9 pence each.
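For the curious, here’s that back-of-an-envelope sum as a few lines of Python - just a sketch of the calculation above, using the figures quoted in this post:

```python
# Back-of-an-envelope cost per visit for the service manual search
build_cost_pounds = 3000     # estimated internal developer time
visits_in_6_months = 5881    # observed visits to search over 6 months
useful_life_years = 3        # how long we expect the code to stay useful

# Extrapolate 6 months of traffic across the code's useful life
lifetime_visits = visits_in_6_months * (useful_life_years * 12 / 6)
pence_per_visit = build_cost_pounds / lifetime_visits * 100

print(f"~{lifetime_visits:,.0f} visits over {useful_life_years} years, "
      f"~{pence_per_visit:.0f}p per visit")
# ~35,286 visits over 3 years, ~9p per visit
```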
Looking at a minimum viable option
But what about a “do minimum” comparison?
Instead of building an internal site search, we could have provided a link to a customised search engine along with an explanation - too techy for mainstream GOV.UK users, but perhaps OK (if sub-optimal on usability) for the geeks reading the manual!
Given the relatively low cost, the service manual search looks clearly worthwhile, but a “do minimum” comparison might have helped when prioritising it against other projects.
Fast sums
If we did deep financial analysis for every small code project, bureaucracy would choke creativity. But even with small spend, it’s worth taking a few minutes to compare “do nothing”, “do minimum” and “write code”. Quick calculations (using common sense where you don’t have all the data) mean that valuable developer time can be used more effectively.
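To make that concrete, here’s a minimal sketch of what those few minutes of sums might look like. Apart from the £3,000 and 35,000-visit figures from earlier, every number below is a made-up placeholder, not a real GDS estimate - plug in your own before drawing conclusions:

```python
# A quick "do nothing" / "do minimum" / "write code" comparison.
# Build costs and visit estimates (except the "write code" row,
# taken from the figures above) are illustrative placeholders.
options = {
    "do nothing": {"build_cost": 0, "lifetime_visits": 0},
    "do minimum (external search link)": {"build_cost": 500, "lifetime_visits": 20000},
    "write code (internal site search)": {"build_cost": 3000, "lifetime_visits": 35000},
}

for name, option in options.items():
    visits = option["lifetime_visits"]
    if visits == 0:
        print(f"{name}: no cost, but no visits facilitated")
        continue
    pence_per_visit = option["build_cost"] / visits * 100
    print(f"{name}: £{option['build_cost']} for {visits:,} visits "
          f"(~{pence_per_visit:.0f}p per visit)")
```

Even a rough table like this makes the trade-off visible: you can see at a glance whether the extra cost of writing code buys enough extra value over the “do minimum” option to be worth prioritising.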