We’re coming up to a year of live running of the digital service standard. Throughout the year the ‘pass rate’ for service standard assessments has been at about 70%.
I was asked recently whether a 70% pass rate is good. What’s the benchmark?
How many services should pass assessments?
The truth is that there isn’t a benchmark for how successful the development of new government digital services should be. In running assessments and publishing data about the pass rate and the criteria most often not passed we’re setting the benchmark. As far as we know there isn’t anyone else around the world assessing the development of digital services on this scale (and if there is then we’d love to hear from them).
To me, having about 70% of services pass their assessment seems about right. Not passing an assessment isn’t the end of a service – it’s a chance for a team to take on board the recommendations and use them to improve the service, making it better for users. In fact, maybe the most satisfying thing about working on the service standard is seeing that in action.
Delivering services people prefer to use
Because of their high numbers of users, a lot of the digital services being assessed at GDS come from HMRC. In the last year 19 of our assessments have been for HMRC services, and overall they have passed 74% of these.
Recently an HMRC team brought in the Inheritance Tax Online service for an alpha assessment. This assessment happens at the end of the alpha stage, and is an opportunity for an early review, before a team starts their beta development.
While there were lots of good things about the service, the GDS assessment panel were worried that the digital service was too closely modelled on the existing paper form, and that the opportunity to test more radical, user-focussed designs was being missed. On that basis the service didn’t pass its first assessment.
But the team considered and addressed the panel’s recommendations, simplifying the process and removing unnecessary fields. It led to a much improved service, and a pass when it was re-assessed.
Sharing what we’ve learned
Of course we’d love to get to a stage where everything meets the service standard first time, but it is rightly a high bar. We’ve run almost 100 assessments of new and redesigned digital services, ranging from Contracts Finder to Carer’s Allowance. We’ve also got a new dashboard on the performance platform, which shows how many services pass, and which points of the standard teams most commonly find challenging.
We’ve learned a lot, and we’ve published all of the reports from assessments so that people developing other services can learn too.
Comment by Stephen Edwards posted on
This is great and seeing the dashboard on the performance platform is really informative.
1) I was surprised to see on the dashboard so many service assessment reports that have not yet been published, can you reveal why they aren't available? There are unpublished reports for service assessments going back over the past year with a mixture of pass and not pass outcomes. Is there a reason why some reports are not published, even many months after the assessment?
2) The reports are published as pages linked from the GDS data blog, but this doesn't trigger an email alert for anyone subscribed to the blog. Given the high volume of assessments and how important the service assessments are, have you considered putting them on a separate blog or a CMS of some sort to make it easier to subscribe to updates, search through reports and see trends in the reports? For anyone trying to build new services, this is a valuable library of information on what is good and bad, and at the moment it feels a bit unloved, tucked away in the depths of the GDS data blog! A document system with a faceted search and email alerts would be really useful, something like the finders: https://www.gov.uk/cma-cases
Keep up the great work,