https://dataingovernment.blog.gov.uk/tax-free-childcare-service-assessment/
Tax-Free Childcare - Service Assessment
The service team will use this report to inform continued development until the planned move to private beta development in autumn 2016, when a further assessment will take place.
Tax-Free Childcare is a new Government service, offering support to working parents with their childcare costs. The purpose of the service is either to help parents back into work or to increase their existing hours. The service is intended to be digital by default, with parents applying via an online tool; assisted digital support and an alternative channel will be provided for assisted digital and digitally excluded users. Once eligibility has been confirmed, a childcare account will be opened for each child, into which parents and third parties can pay money. For every £8 paid into the childcare account, the Government will contribute £2 (up to a limit of £500 per quarter); the money can then be used to pay registered childcare providers. The service is intended to be operational as a private beta in autumn 2016.
Department / Agency:
HMRC
Date of Assessment:
10/8/2015
Assessment stage:
Alpha
Result of Assessment:
Not Pass
Lead Assessor:
M. O'Neill
Service Manager:
C. Pike
Digital Leader:
M. Dearnley
Assessment Report
The Tax-Free Childcare service has been reviewed against the 18 points of the Digital Service Standard.
Outcome of Service Assessment
After consideration, the assessment panel has concluded that the Tax-Free Childcare service is not yet on track to meet the Digital Service Standard. However, the service is at an early stage of development and does not plan to begin private beta until autumn 2016. We recommend the service team take account of, and address, the feedback below before returning for a full alpha assessment by early summer 2016, when the service will be closer to beginning the next phase.
Reasons
The service team had done good work on starting to put together components of the eventual service and on engaging with user testing right from the start. The service manager was able to present a clear statement of their role and the team structure. It was clear that the service manager was committed to the successful delivery of the service.
The team were able to set out how they were tackling core challenges around delivering a secure and safe product, and how they would ensure the right mix of delivery skills. The team were working with colleagues across HMRC to build knowledge, skills, services, and to speed-up delivery.
The reasons for the not pass result fall into two parts at this stage.
First, the team did not present an end-to-end service. The team should be able to describe what the gaps are and how they plan to address them. The team will need to continue working on a start page: when applying for a service could mean users remove themselves from existing benefits (such as Universal Credit), a clear narrative from the start is key. The lack of a content editor, a user/customer experience lead, and a front-end developer are issues that need to be resolved urgently.
The user testing was well done, but the team need to be able to evidence that they are engaging with a representative sample of users, and that the issues around the start page do not distort the testing. Similarly, the team should be able to demonstrate how the telephone channel works and how this relates to the overall service operation, for example, when the digital service is down.
Second, further work is needed around the technology used to deliver the service. The technology architecture is highly complex and has many parts to it. The team need to be able to explain and understand the relationship between the back-end technology and the front-end, and how open standards will be used.
Point 1
Although the service team have conducted regular research, a broader spectrum of potential users should be covered to ensure the service meets their needs and to identify any potential pain points. Testing with users currently claiming Working Tax Credits or Universal Credit is also important as part of understanding the end-to-end journey. The team did not provide evidence of research with assisted digital users, and it was unclear whether the proposed telephone support would meet their needs. In addition, the team did not clearly distinguish between the need to provide appropriate support for users who are unable to use the service independently, and accessibility testing of the on-screen service.
Testing to date had been with sections of the front-end, and it was unclear how users would engage with the service as a whole. Each of the individual parts, application and account, was described individually but not as part of a whole user journey, which made the user testing process unclear.
Points 3 and 13
While the team has had input from content designers at GDS and HMRC, this input did not come from anyone embedded in the team. The team have not had involvement from either a designer or a front-end developer. There need to be people on the team responsible for the overall consistency of the service and its design and content. This will provide the ability to prove that the service works for users along the whole path of each journey. This is especially important in a service with many entry points and with journeys crossing multiple channels and involving many different areas of the business.
Without front-end developers, the team cannot have confidence that the code delivered to web browsers works for service users on the devices and browsers they use, or that it is robust and easy to change; nor can they be confident in deployments and quick iteration.
Points 5, 6, 9 and 10
The public-facing front-end service has important dependencies on the back-end. The service team are not able to make rapid or frequent changes to the back-end, and have not yet scoped and de-risked this.
The service as presented at alpha only covers the front-end. The choice of back-end technology and the overall architecture, both logical and technical, was not covered by the team. It was unclear how much of the service would be HMRC intellectual property.
Point 11
The service was presented as a series of components rather than as an end-to-end service, and disaster recovery was recognised to be at a very early stage of planning. Because the service is not end-to-end it is unclear what the service disruption risks are, and what the response needs to be.
Point 12
The service does not yet have a coherent user journey reflecting the impact that using the service could have on other benefits. For example, the service currently lacks a start page to provide user context. The team recognise that this is a core need, and it reflects the key importance of content and user experience designers.
Recommendations
Point 1
The team should undertake research with a range of users, including those on Universal Credit or Working Tax Credits. Ideally, not all of this would be scenario-based testing, to ensure the feedback received is robust.
The team must undertake research with assisted digital users to identify user needs for support. The team must explain how these needs have been considered in the design of the support, including why any support routes (e.g. face-to-face) have been discarded as an option.
Points 3 and 13
The team are recruiting a content designer and a user experience designer. These are key roles, and they need to be in place (and to have had a chance to iterate the service) before reassessment.
Points 5, 6, 9 and 10
The team need to be able to explain the service architecture choices in sufficient detail for alpha. This includes being able to set out how the technology and platform choices relate to HMRC and government-wide standards and approaches. Without understanding the end-to-end service, there is a substantial increase in the risk of commissioning a beta based on assumptions.
While the team may not be able to change the back-end, they should understand the risks and mitigations with the service architecture. The team should set out how they are using open standards.
Point 11
The team should be able to set out what Service Level Agreements already exist (for example, with NS&I) and how the service will provide for users during times of disruption. This is an alpha, so the approach should be descriptive at this stage, but backed up with evidence as appropriate.
Point 12
The team should test methods of including greater context for users, which will allow them to understand impacts on other benefits (for example, by building and testing a start page). This is an alpha, so it will change, but without a start page and more context it is hard to see this as a service rather than a set of service components.
Digital Service Standard criteria
| Criteria | Passed | Criteria | Passed |
| --- | --- | --- | --- |
| 1 | No | 2 | Yes |
| 3 | No | 4 | Yes |
| 5 | No | 6 | No |
| 7 | Yes | 8 | Yes |
| 9 | No | 10 | No |
| 11 | No | 12 | No |
| 13 | No | 14 | Yes |
| 15 | Yes | 16 | Yes |
| 17 | Yes | 18 | Yes |