https://dataingovernment.blog.gov.uk/all-service-assessments-and-self-certification/hmrc/digital-forms-beta/
Digital Forms - Beta Assessment
The Digital Forms Service (DFS) will provide a digital replacement for all appropriate HMRC print and post forms. The forms can be submitted online from the Multichannel Digital Tax Platform. The first form to use the service will be for claiming tax relief for expenses (the P87 form).
Department / Agency:
HMRC
Date of Assessment:
2 October 2014
Assessment stage:
Beta
Result of Assessment:
Not passed
Lead Assessor:
J. Barlow
Service Manager:
G. Brown
Digital Leader:
M. Dearnley
Assessment Report
Outcome of service assessment
The tax relief for expenses form is one of 600 forms which HMRC plan to develop using the DFS (known as iForms) service. This one form alone receives over 150,000 submissions every year, so this has the potential to be a very high volume service which could be used to replace the print and post versions of all other appropriate HMRC forms.
After consideration, the assessment panel have concluded that the iForms service should not be given approval to launch on the service.gov.uk domain as a public beta service.
Reasons
1. Understanding user needs
The service team does not have a broad or deep enough understanding of its users to adequately inform the design of the service.
The forms are used by a large and diverse population of users, but user research to date has been much too narrow and infrequent. The service needs to do research every sprint, and with a much broader range of users - including those with low digital skills, low literacy or no experience with tax, and those with motor, cognitive, visual or auditory impairments. There is also greater scope to use digital analytics to monitor how people interact with the form and whether there are points where they drop out.
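As an illustration of the kind of analytics monitoring described above, the sketch below (in TypeScript) shows one possible way of recording per-step progress events so drop-out points can be identified. The endpoint, event names and form identifier are hypothetical, not taken from the service.

```typescript
// Hypothetical sketch: record which step of a multi-page form a user
// reaches, so analytics can show where people drop out. The endpoint,
// event names and form identifier are illustrative only.

interface FormStepEvent {
  formId: string;                               // e.g. "p87-tax-relief"
  step: string;                                 // the question or page reached
  action: "viewed" | "completed" | "abandoned";
  timestamp: string;
}

function trackFormStep(event: FormStepEvent): void {
  // navigator.sendBeacon queues the request even during page unload,
  // so "abandoned" events still arrive when a user closes the tab.
  navigator.sendBeacon("/analytics/form-events", JSON.stringify(event));
}

// Example: fire an event as each step of the form is shown, then
// compare "viewed" counts per step to find where users stop.
trackFormStep({
  formId: "p87-tax-relief",
  step: "employment-details",
  action: "viewed",
  timestamp: new Date().toISOString(),
});
```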
2. The Team
The designers on the team are form designers, not interaction designers. They sit outside the main delivery team and are focussed only on a subset of the service. Likewise, the content designers are not part of the delivery team.
3. End-to-end user journey
The service has been very focussed on producing a digital version of a paper form. The team does not have an adequate understanding of how the service fits into a user's end-to-end journey, which includes the acknowledgement or refusal of the claim by post.
There was also not enough evidence to show how the form fits into the wider service. For example:
What aspect of a wider service tells a user they need to fill in a form?
How do they navigate from there to the form?
Once a user has submitted a form how is that visible in the wider service?
Given that users need to identify themselves, why do they have to repeat so much information?
With further user research there is the potential to make more fundamental changes and improvements to the end-to-end user experience - particularly as the form is not constrained by legislation.
4. Design
The form was not styled using the design patterns or style guide. A software upgrade may allow more control over the HTML output in the future; however, the team could not demonstrate this, and at present the design does not meet the criteria.
As the team do not have full control over the HTML output, it is unclear whether the code is accessible. It was not clear from the presentation that the form components and fragments align with the design patterns set out in the service manual.
5. Tools & systems
The team was unable to explain clearly how the platform was selected, or whether alternative options were explored prior to its selection. As the software auto-generates the digital forms, it's unclear how the team would be able to fix any usability or accessibility issues introduced by the software. It's also unclear whether useful analytics could be collected from the forms the software produces.
The panel have concerns that the output from the software platform requires JavaScript to work and does not work at all in Internet Explorer 6 or 7. This has the potential to exclude a proportion of potential users from accessing the service. Additionally, if JavaScript is unavailable the form does not display an error or any information telling the user how to proceed. This does not follow the practice of progressive enhancement as set out in the service manual. JavaScript availability can be affected by factors outside a user's control, so the service should provide a basic working experience to all users.
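As a minimal sketch of the progressive enhancement the service manual describes - assuming a standard HTML form that posts to the server and works with no script at all, which is not how the current platform's output behaves - JavaScript here only layers extra behaviour on top of a journey that already works. The "form#claim" selector and the feedback message are hypothetical.

```typescript
// Minimal sketch of progressive enhancement, assuming a standard HTML
// <form> that posts to the server and works with no script at all.
// JavaScript only *adds* behaviour: if it is blocked, fails to load or
// is unsupported, the user still gets the basic working form.
// The "form#claim" selector is hypothetical, not taken from the service.

function enhanceForm(form: HTMLFormElement): void {
  form.addEventListener("submit", (event: Event) => {
    // Intercept only when JavaScript is actually running.
    event.preventDefault();
    // Enhanced path: submit in the background and show inline feedback
    // instead of a full page reload.
    fetch(form.action, { method: form.method, body: new FormData(form) })
      .then((response) => {
        if (!response.ok) throw new Error(`HTTP ${response.status}`);
        form.insertAdjacentHTML("beforeend", "<p>Your claim has been submitted.</p>");
      })
      // On any failure, fall back to the browser's native submission.
      .catch(() => form.submit());
  });
}

const claimForm = document.querySelector<HTMLFormElement>("form#claim");
if (claimForm) {
  enhanceForm(claimForm);
}
```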
We are also concerned that reliance on the chosen platform to auto-generate the digital forms would make it difficult to replace the product in the future.
6. Assisted digital
There was no evidence that the assisted digital support meets the specific needs of assisted digital (AD) users, nor that the support channels are adequate in nature or availability.
Central HMRC support would be available to assisted digital users of this service (the HMRC call centre, and in-person support from third parties as part of HMRC's Network of Enquiry Centres). The telephone support channel would include a triage function. A central HMRC team is looking at assisted digital support for this service, but its pilot is not looking specifically at this service's users. The service team were therefore unable to demonstrate learning from research with their specific users, or how that research had informed iterations of the support design.
The service team did not have an estimate of how many support enquiries would be received, nor at what cost. The service team are talking to HMRC call centre staff about what assisted digital users require help with, but are not incorporating digital inclusion methods into any assisted digital support.
7. Digital Take-up
The service team have a broad view of potential long-term, HMRC-wide plans to wind down non-digital channels, but have no specific digital take-up plan in place for this service.
The service team said they wanted to prove the benefits of the digital service before undertaking digital take-up communications to users. As such, the team could not explain plans to increase take-up, nor demonstrate how they had improved messaging.
Recommendations
- Test the service with a larger and more diverse user group - making sure user needs are continuously used to improve the service.
- Test and improve the end-to-end user experience - including the offline elements of the service.
- Put in place digital analytics to monitor how people are using the form and to highlight any parts of the form where users drop out.
- Conduct an accessibility audit with an external agency to ensure the output from the software product is demonstrably usable, alongside meeting the WCAG 2.0 AA standards.
- Liaise with the senior designer at the Digital Delivery Centre in Newcastle around design and design patterns. Ensure the form components are in line with the design patterns as set out in the service manual.
- The service the panel was shown at the assessment was not styled using the design patterns. To pass a beta assessment, the team needs to demonstrate the service they intend to put into beta.
- Ensure the team has control over the HTML output. Without the ability to change or update the HTML, the team cannot control the accessibility or usability of the service.
- Conduct user research with assisted digital users of this specific service, to understand their needs and from where they are currently seeking assisted digital support (including non-departmental, third sector, friends and family).
- Use learnings from this research to design specific assisted digital support that is appropriate for the users of this service. This should include plans for testing; expected numbers of transactions and costs, by channel (during the beta and after going live); joined-up and consistent support across other central government transactions; and incorporation of digital inclusion.
- Plan for testing the assisted digital support for this service during the beta, including how user insights will be gathered and used to iterate the support during the beta.
- Bring full control of the assisted digital support for this service into the service team.
- Contact GDS's Digital Take-up team to put together a plan for phasing out any existing alternative channels, where appropriate. This should include evidence-based plans to increase digital take-up during the beta; regular analytics/metrics for usage volumes across channels; and evidence that users' perceived risks have been addressed.
- Test the service with the minister responsible for it.
Summary
While the service is not currently at a standard to move into public beta, there is a longer-term vision, a sustainable team and infrastructure in place to build quickly on the work done to date.
In particular, increasing the amount of user research undertaken will enable HMRC to further improve the end-to-end experience for its very large and diverse group of users.
This is the first form to be developed using iForms, so the recommendations above provide an opportunity to put the service on a good footing ahead of the approach being rolled out across all 600 HMRC forms.
Digital by Default Service Standard criteria
| Criteria | Passed | Criteria | Passed |
|----------|--------|----------|--------|
| 1 | No | 2 | No |
| 3 | Yes | 4 | Yes |
| 5 | No | 6 | No |
| 7 | Yes | 8 | No |
| 9 | No | 10 | No |
| 11 | No | 12 | No |
| 13 | No | 14 | Yes |
| 15 | No | 16 | Yes |
| 17 | Yes | 18 | No |
| 19 | Yes | 20 | No |
| 21 | Yes | 22 | Yes |
| 23 | Yes | 24 | Yes |
| 25 | Yes | 26 | No |