https://dataingovernment.blog.gov.uk/universal-credit-service-assessment/
Universal Credit - Service Assessment
Universal Credit (UC) Digital provides a simple, personalised digital and non-digital service for claimants and staff. The service encourages individual responsibility, supporting people in finding work and finding better paid work. The service fully supports claimants through applying for, and providing evidence for, the range of benefits supported. UC Digital allows DWP agents to manage the benefit and to maintain claimant information accurately and in a timely way, while supporting the operations and delivery of the service.
There are 2 groups of users: the full range of recipients of working-age benefits (including, for example, those who would otherwise receive Jobseeker's Allowance, Housing Benefit, Employment and Support Allowance and tax credits), and the agents (service centre staff and work coaches) who will use the system to enable operational delivery of the service.
There is a substantial set of user needs that the service is being designed to meet, ranging from 'as a user I want to be able to log in' to 'as a member of a couple at risk of domestic violence, I want to be protected when I split my couple claim'. The current scope of the service is to support the full range of benefit types and complex circumstances.
Department / Agency:
DWP
Date of Assessment:
8/9/2014
Assessment stage:
Alpha review
Result of Assessment:
Pass
Lead Assessor:
R. Reynolds
Service Manager:
L. Sampson
Digital Leader:
K. Cunnington
Assessment Report
After consideration, the assessment panel have concluded that the Universal Credit Digital Service is on track to meet the Digital by Default Service Standard at this early stage of development.
This was an alpha assessment against the 26 points of the standard. On 23 of those points the panel are satisfied that the service team are already working along the right lines; the remaining 3 points will require further effort in the next phase of work, in preparation for a beta assessment.
Reasons
The assessment panel found that the service team is mostly working along the right lines. The team is building the service based on user needs, and has made changes to the alpha based on findings from user research conducted both with end users (claimants) and internal users (agents). The team is working in an agile way, able to adapt and improve its processes as it goes. A lot of thought has already been given to the security and privacy implications of the forthcoming beta, as well as to the importance of assisted digital provision. Although the panel have not yet had a complete end-to-end hands-on demo, engagement from ministers has obviously already been strong.
There are some areas that need work. The service is already impressively advanced in many respects, but the assessment panel considers that there is still more work to do on analytics, open source and building a multidisciplinary team in preparing for a beta phase.
Recommendations
The service is not yet on track to meet the criteria of the service standard on points 13, 15 and 18, and will need to take corrective action in order to pass a future beta assessment:
- Point 13 - There are already plans to strengthen certain aspects of the team. In these early stages of the service, content designers and user researchers should be working closely alongside your developers and designers to solve problems, conducting user research and testing the service with users in order to identify the best ways to meet users' needs. Good content design will help inform the right approach to your service, e.g. setting expectations for agents, returning claimants, and claimants coming to the service for the first time. This cannot be tacked on after the fact. The team will need more content designers to ensure knowledge is retained and shared, and so that content can be internally reviewed before it goes into beta.
- Point 15 - Unless there is a reason not to for a specific subset of the source code (for example a verified security risk), the service team should be making all of the source code open and reusable. Although the team has identified some reusable components, the lack of a plan to open up any source code at all is disappointing. At this stage the team should already be opening up the service's code, or be able to explain clearly the specific and compelling reasons why this cannot be done for a particular subset of the code. A working assumption that security prohibits the opening up of any code is not reasonable.
- Point 18 - In the proof of concept assessment in December 2013 the panel noted that web analytics should be built into the alpha as soon as possible. Although the service has now identified a preferred analytics package, it still needs to incorporate analytics capture into the service (for example: beyond pageviews and dwell time, what data specific to the service will be captured, and how end-to-end engagement will be tracked; see the illustrative sketch below) and to have a process for interpreting and acting on that data. This must be done before the service considers moving to a public beta.
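As an illustration only, the sketch below shows one way a front end might capture service-specific events rather than just pageviews, so that end-to-end engagement can later be stitched together and analysed. The endpoint, event names and fields are assumptions made for the sketch, not part of the Universal Credit service or of any particular analytics package.

```typescript
// Hypothetical sketch of capturing service-specific analytics events.
// The endpoint, event names and fields below are illustrative assumptions.

interface ServiceEvent {
  name: string;                       // e.g. "section-completed", "claim-submitted"
  journeyId: string;                  // anonymous ID used to stitch an end-to-end journey together
  stage: string;                      // which step of the claim the user has reached
  timestamp: string;                  // ISO 8601, so time spent per stage can be derived
  metadata?: Record<string, string>;  // any extra service-specific context
}

async function trackEvent(event: ServiceEvent): Promise<void> {
  // Post the event to the chosen analytics collector; fail silently so an
  // analytics problem never interrupts the claimant's journey.
  try {
    await fetch('/analytics/events', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(event),
      keepalive: true, // lets the request complete even if the page is unloading
    });
  } catch {
    // Deliberately ignore failures: analytics must not block the service.
  }
}

// Example: record that a claimant has completed the housing costs section.
void trackEvent({
  name: 'section-completed',
  journeyId: 'a1b2c3d4', // in practice this would come from the user's session
  stage: 'housing-costs',
  timestamp: new Date().toISOString(),
});
```

Whatever package the team settles on, the points that matter for the assessment are that the captured events carry enough structure to follow a claim from start to finish, and that there is a process for interpreting and acting on the resulting data.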
The assessment panel also have some other recommendations:
- There must be a description of how the journey to the Universal Credit Digital Service starts and ends on GOV.UK. With an existing Universal Credit Live Service running in parallel, along with several other benefits which will need to direct users towards the right Universal Credit service for them, the risk of users being confused is high, and a clear approach that helps users get to the right service must be agreed in plenty of time before a live beta. With both the Live Service and the Digital Service planning to roll out to increasingly large numbers of postcodes, people will need to know which service to use. We recommend that you come in for a meeting with a proposition manager and product manager from the GOV.UK team to agree a plan.
- The service must document the cookies it sets on a service-specific page, as described in the cookies guidance in the service manual. The relevant pages are www.gov.uk/service-manual/making-software/cookies and www.gov.uk/service-manual/operations/operating-servicegovuk-subdomains#cookies
- The panel have a few small suggestions about the visual design and front end, which will be passed on.
- GDS are happy to review the content of the service too.
- The panel look forward to hearing more about the plans to move from a self-hosted environment to cloud hosting in the future, and to seeing what is learned from this shared with others across government.
- In the beta assessment, the panel would also like to hear more about the plans for iterating the content and design of the beta service on a frequent basis. The expected zero downtime and ability to release as often as needed are both encouraging, but the panel were unclear on your plans for a 2 (or 3) week release cycle.
- For the beta assessment, the panel would expect to be able to access the service themselves beforehand.
- The panel hope that, after the pilot in one postcode area, the team will capitalise on the opportunities it affords for even more and better hands-on research with agents, and especially claimants.
- Longer term, despite the regulatory and legislative framework imposing some constraints and complexity on the service, the panel hope you will challenge those constraints that can be moved, iterating the service to increase simplicity as well as identifying testable reductions in administrative cost and fraud.
- The panel recognise that the service team has already made contact with the performance platform team. The planned conversation with that team will be vital in your decisions around benchmarking and publicly tracking your KPIs. The panel are especially keen, before the next assessment, that as a team you have a chance to consider an approach to benchmarking the performance of this new service against legacy benefits.
- In the beta assessment the panel will ask you to clarify what proportion of the team are involved in core service delivery, planning for business transformation, integrating with departmental systems, and assurance and programme governance overhead, and what proportion of the various roles are contractors. The panel are keen to see the work to embed long-term roles continue, reducing the risk of losing valuable knowledge when contractors leave.
- For assisted digital, the team will need to show initial thinking about the national rollout of assisted digital support, including more detail on delivery such as the number of expected transactions by channel and the associated costs. Similarly, the panel would like to know more about how you will approach digital take-up for the eventual rollout of this digital service.
Summary
In summary, the assessment panel were impressed with what the service team have done since the last assessment, and with how clearly and candidly they were able to share their progress. The panel look forward to seeing the team again soon for a beta assessment.
Digital by Default Service Standard criteria
| Criteria | Passed | Criteria | Passed |
| --- | --- | --- | --- |
| 1 | Yes | 2 | Yes |
| 3 | Yes | 4 | Yes |
| 5 | Yes | 6 | Yes |
| 7 | Yes | 8 | Yes |
| 9 | Yes | 10 | Yes |
| 11 | Yes | 12 | Yes |
| 13 | No | 14 | Yes |
| 15 | No | 16 | Yes |
| 17 | Yes | 18 | No |
| 19 | Yes | 20 | Yes |
| 21 | Yes | 22 | Yes |
| 23 | Yes | 24 | Yes |
| 25 | Yes | 26 | Yes |