Service Manual - Service Assessment
The guidance in the service manual provides information to help teams meet the Digital Service Standard.
Core users of the service are:
- Teams delivering services in government (for example, developers, user researchers, service managers)
- People supporting teams to deliver services in government (for example, portfolio managers, people teams, procurement teams)
- Suppliers to government
The manual is also used by:
- People working in digital in other governments (for example, the US and Australia)
- People working in local government and Parliament
- People more widely than that, for example those working in digital outside of a government or public sector context
The new product will replace the guidance contained in the existing service manual.
Department / Agency:
CO / GDS
Date of Assessment:
Result of Assessment:
Outcome of service assessment
After consideration, the assessment panel has concluded that the Service Manual is on track to meet the Digital Service Standard at this early stage of development.
The service team have been conducting extensive user research, during both discovery and the alpha stage of the project. A variety of different research techniques have been used, and many prototypes for testing have been created. It was impressive that nearly 200 users, including those from beyond GDS and the immediate digital community, have provided input to this critical guidance resource during alpha. The team have also adapted the proposition of the service manual based on this feedback, including seeking practical examples to include from across government. Research will continue until the end of the financial year and is expected to include a diary study.
A full, empowered, multidisciplinary team is in place, with clear separation of roles. Five content designers are working under a managing editor. The team is co-located and working using agile techniques (Kanban wall, Trello boards, daily standups, retros) that they have adapted over time based on the needs of their project.
Prototyping was done in GitHub/Heroku during alpha, and the team evaluated their beta platform against the end user and publisher needs they had discovered. The team will be working on the core GOV.UK publishing platform, which initially concerned the panel until the team confirmed that they are capable of making their own amendments to the codebase should GOV.UK’s roadmap change. Security is being considered well for a project at this stage: compromising any bespoke tools would not also compromise GOV.UK. New code is being made open where appropriate. A proportionate beta support model is currently being considered.
GOV.UK design principles are being used well. The team have added new patterns where these were absent from the standard content toolkit, researching and iterating each as they go. Content designers are revising all guidance and there is sufficient commitment for the move to beta.
The team are supported by a part-time performance analyst who is helping with the design of custom metrics, such as successful searches and bounce rates. These metrics are being used to benchmark against the current service before features are launched during beta. The team has contacted the performance platform and will be working with them to identify which aspects are relevant to be made public.
The team were able to talk well about many aspects of their work, but will need to concentrate on bringing clearer examples of evidence to the beta assessment. While a user researcher has been committed for beta, no research plan was shown to the panel; it was only discussed ad hoc by the team in response to the panel’s questioning during the assessment.
The panel would like to see the team articulate the user needs for the service more succinctly, and demonstrate how their research generated insights that evolved the team’s understanding of those needs. The panel was not shown how separate research outputs led to similar needs that could be safely consolidated into an epic. During beta, difficult prioritisation choices will need to be made, and higher granularity may be needed - for example, clearly supporting and engaging users who might unwittingly hamper project delivery.
There has been no shortage of user research during alpha, and Kanban methods are appropriate during this exploratory phase. However, a more structured roadmap is likely to be needed to ensure all of the major strategic questions and risks are addressed in time for the beta assessment.
The team are making good progress in engaging with the rest of government and ensuring they have a stake in the new service standard. The team mentioned that they were aware from research that the Service Manual is also used in departments as a tool to communicate with wider stakeholders, for example traditional technology and governance functions. In beta it will be important for the team to have demonstrated clearly the research they have done with these users, and how the insight gained has influenced the service.
The team have benefitted from close association with senior leaders in GDS, and this has allowed them to proceed without a dedicated service manager. This should be kept under review as the organisation changes over the months towards beta.
The team have done well to recognise that the service manual has evolved into a document with a life across the whole of government. They should be commended for the volume of engagement and research they have undertaken on a resource so critical to the transformation of public services.
Continue to seek out best practice across digital teams within government, find the ad hoc tools departments have created to support the assessment process, and use them as indicators of unmet goals. Take the manual ever further across public service, and carry on bringing back insights to inform GOV.UK in turn.
Digital Service Standard criteria