Pay HMRC – Beta Assessment
This service will allow users to make payments to HMRC online via credit and/or debit card.
Department / Agency:
HMRC
Date of Original Assessment:
2 December 2014
Date of Reassessment:
12 February 2015
Assessment stage:
Beta
Result of Original Assessment:
Not passed
Result of Reassessment:
Pass
Lead Assessor:
A. Lister
Service Manager:
O. McGuire
Digital Leader:
M. Dearnley
Reassessment Report
The HMRC Payment Service has been reviewed against the points of the Service Standard not passed at the original beta assessment (points 1, 8, 9, 13, 17 and 18).
Outcome of reassessment
After consideration the assessment panel have concluded the HMRC Payment Service has shown sufficient progress and evidence of meeting the Digital by Default Service Standard to pass its beta assessment.
Reasons
User needs (Point 1 of the service standard)
The service team demonstrated a comprehensive and in-depth knowledge of the service’s users. This understanding was gained from frequent and substantial lab and pop-up user research sessions. In addition, around 600,000 users have used the beta service, with 385,000 providing feedback. It was encouraging to see that the team is testing and iterating the service in response to a range of research inputs. Improvements and enhancements based on these inputs were well demonstrated at the assessment.
Forward planning for user research and iteration is both comprehensive and robust.
Use of analytics (Points 8 and 18 of the service standard)
Overall, the approach to analytics and generating actionable insight was well understood. However, there are two routes into the service, and a reliable method of measuring the service as a whole needs to be defined. Similarly, the cost per transaction needs to be calculated accurately.
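As a hedged illustration of both points (the route names, volumes and cost figure below are invented for the example, not taken from the assessment), a whole-service measure and a cost-per-transaction figure might be derived like this:

```python
# Illustrative sketch only: route names, volumes and costs are
# invented placeholders, not figures from the assessment.

# Completed transactions recorded separately for each route into the service
transactions_by_route = {
    "gov_uk_start_page": 450_000,   # hypothetical volume
    "direct_hmrc_link": 150_000,    # hypothetical volume
}

# Total cost of running the service over the same period (hypothetical)
total_service_cost_gbp = 1_200_000

# Measure the service as a whole by summing across both routes,
# then derive a single cost-per-transaction figure.
total_transactions = sum(transactions_by_route.values())
cost_per_transaction = total_service_cost_gbp / total_transactions

print(f"Total transactions: {total_transactions}")
print(f"Cost per transaction: £{cost_per_transaction:.2f}")
```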
Create a service that is simple and intuitive (Point 9 of the service standard)
During the peak self-assessment period in January 2015, over 600,000 people successfully used the service. This clearly demonstrates that the service is both useful and usable. The continuing research and analytics input will ensure the service continues to be simple and intuitive.
Consistency with GOV.UK (Point 13 of the service standard)
Research showed that the transition to pages that did not contain the familiar GOV.UK or HMRC branding caused users to question the legitimacy of the service. These pages have been reworked to reflect core GOV.UK branding. The team has dedicated content design resource and a front end developer. There are minor design issues on some pages, but these are easily rectified with support from the GOV.UK team.
End-to-end testing (Point 17 of the service standard)
The service team demonstrated the service processing real payments during the assessment. Large volumes of users have left feedback on the service, and end-to-end testing is undertaken frequently.
Recommendations
High volumes of feedback will prove difficult to analyse; the service team needs to define an approach, possibly through segmentation, that yields manageable amounts of feedback from reliably identified groups of users.
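One possible shape for that segmentation is sketched below; the segment labels and per-segment sample size are assumptions for illustration, not the team’s actual method:

```python
import random
from collections import defaultdict

# Sketch: group feedback by an identified user segment, then sample a
# manageable number of items per segment for manual analysis.
# The segment names below are invented for illustration.
feedback = [
    {"segment": "self_assessment", "comment": "Easy to pay on my phone"},
    {"segment": "employer_paye", "comment": "Could not find my reference"},
    # ... thousands more items in practice
]

SAMPLE_PER_SEGMENT = 50  # assumed manageable review size

by_segment = defaultdict(list)
for item in feedback:
    by_segment[item["segment"]].append(item)

samples = {
    segment: random.sample(items, min(SAMPLE_PER_SEGMENT, len(items)))
    for segment, items in by_segment.items()
}

for segment, items in samples.items():
    print(f"{segment}: reviewing {len(items)} of {len(by_segment[segment])} items")
```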
The service team should continue to work closely with the GOV.UK teams to ensure that minor design and content inconsistencies are addressed as soon as possible.
Summary
The service team presented a robust, coherent and comprehensive view of their service which demonstrated a thorough understanding of iterative delivery based on user needs.
Original Assessment Report
The HMRC Payment service is seeking permission to launch on a service.gov.uk domain as a Beta service.
Outcome of original service assessment
After consideration the assessment panel have concluded the HMRC Payment service should not be given approval to launch on the service.gov.uk domain as a Beta service.
Reasons
Although the assessment panel considers that the service isn’t ready for Beta status, we’d like to stress that all of the components for success are in place. The HMRC team showed capability and commitment, and for reassessment the panel would like the team to ensure the elements explained below are demonstrated as a coherent approach to the service’s development. The panel fully understand the complexity and constraints the team is working within and would highlight the outstanding work the Service Manager is doing to progress this delivery.
User needs (Point 1 of the service standard)
User needs are the foundation of the service standard. Although the HMRC team demonstrated that research was being undertaken frequently, there were noticeable gaps between research activity, the identification of actionable improvements, integration into production, and measurement. Specifically:
- the sample recruitment brief is extremely niche, describing users most likely to succeed with the service
- the methods used during research to identify user needs and actionable insights were not clear
- user stories were presented in business requirement language, not as service users would naturally express themselves
- convenient payment was highlighted as a key requirement for people choosing to pay by card, yet providing the correct payment amount as a default value has not been prioritised due to the complexity of some back end work. Doing the hard work to make it simple is a key design principle which this service is not currently demonstrating.
Testing the end-to-end service (Point 17 of the service standard)
A significant part of the service is operated by a third party. With a hybrid digital service such as this, end-to-end testing is of paramount importance. The assessment panel saw limited evidence of this. Specifically:
- the beta service standard requires that the service can be demonstrated end-to-end with all components in place. This wasn’t possible without skipping between environments and simulations.
- it was understood that an ‘available at the point of need’ service is required - paying by card is about user convenience, and current data shows that this means tablets and smartphones. However, the service’s design is not currently responsive and retrofitting this will be a significant challenge.
Design (Points 9 and 13 of the service standard)
The visual design was largely compliant with the GDS toolkit. However the challenges around comprehensive testing, described above, continue to hamper meaningful end-to-end testing of visual design, content and interactions with a broad spectrum of users and a range of devices. Specifically:
- evidencing that the service is ‘simple and intuitive’ requires objective research with users of many different abilities using many different devices. Beyond the lab research sessions, the service was tested with ‘ten tame users’ in Dorset House. This wouldn’t be considered sufficiently objective to gain meaningful, actionable insight
- although content and user experience expertise is available in the team, the components of the service shown evidenced a disconnect in terms of close collaboration and having ‘the right people in the same room’
Analysis and benchmarking (Points 8 and 18 of the service standard)
It’s accepted that the current service cannot be benchmarked as analytics are not installed. It’s also essential that the new service meets user needs. The assessment panel was concerned that the inability to test a complete service hinders this objective. Specifically:
- as referenced previously, there’s no end-to-end ‘live-like’ test environment
- the team had plans to integrate their analytics into the service’s dashboard but was unclear about which analytics account was in use
- HMRC’s decision not to customise the commercial product has limited the amount of web analytics available. This is a concern, as much of the complexity of the service happens within those components. The team should be able to demonstrate that their analytics tool allows them to gather actionable insight.
Recommendations
The assessment panel recognised that the HMRC team had done significant work to implement the recommendations of the Alpha assessment and this indicates a clear commitment to building a high quality service. To be ready to pass a beta assessment the team should address the concerns above and the recommendations below.
User needs and Design (Points 1, 9 and 13 of the service standard)
Although the panel understand that this service is being driven by commercial and technical imperatives, the focus on meeting user needs must remain at the forefront of the team’s thinking. The majority of the service standard points which were not passed concern:
- how user needs are identified
- how actionable insight is derived from that research
- the methods for progressing features into production and measuring against the service’s key performance indicators
We would therefore reiterate the recommendations from the alpha assessment. Specifically:
- stronger user stories are developed and expressed in natural language
- a ‘real’ fully coded end-to-end service is used for user research
- a broader, more representative user research sample is recruited
- user research is wholly objective rather than focussed on proving a predetermined outcome
- all paths through the service are evaluated using live or ‘live-like’ journeys - including the commercial components and journeys which include, or conclude with, error messages/pages
Analysis and benchmarking (Points 8 and 18 of the service standard)
A thorough grasp of the service’s approach to analytics is required to ensure that the correct measurements are baked into the service throughout development rather than being considered later in the process.
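A minimal sketch of what baking measurement in might look like, assuming a simple funnel of journey events (the step names and counts are invented placeholders, not the service’s real instrumentation):

```python
# Sketch: define the journey's measurement points up front so every build
# emits the same funnel events. Step names are invented placeholders.
PAYMENT_FUNNEL = [
    "start_page_viewed",
    "amount_entered",
    "card_details_entered",
    "payment_confirmed",
]

def completion_rate(event_counts: dict) -> float:
    """Key performance indicator: confirmed payments / journeys started."""
    started = event_counts.get("start_page_viewed", 0)
    completed = event_counts.get("payment_confirmed", 0)
    return completed / started if started else 0.0

# Hypothetical counts to show how the KPI would be reported
counts = {"start_page_viewed": 1000, "payment_confirmed": 740}
print(f"Completion rate: {completion_rate(counts):.0%}")
```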
Testing the end-to-end service (Point 17 of the service standard)
- For the reassessment, be able to demonstrate the end-to-end service with all components in place.
- Be able to demonstrate that the design is responsive and can be used on common browsers and devices.
Support and Operations (Points 5 and 25 of the service standard)
While it’s understood that much of the operational support for the service will be provided by a separate team, it’s critical that there are strong plans in place to deal with failure scenarios. Plans should be developed so that the team operating the service have the capability to resolve issues without relying on development staff who are not formally supporting the service - for example, the steps required to increase the number of instances of the application.
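As a loose illustration of the kind of runbook step meant here (a sketch only: the instance counts, get_running_instances and scale_to are hypothetical placeholders, not the service’s real platform API):

```python
# Hypothetical sketch of a support-team runbook step: compare running
# instances against a target and scale up without developer involvement.

TARGET_INSTANCES = 4  # assumed target for the failure scenario

def get_running_instances() -> int:
    # Placeholder: in practice this would query the hosting platform.
    return 2

def scale_to(count: int) -> None:
    # Placeholder: in practice this would call the platform's scaling API.
    print(f"Requesting scale to {count} instances")

running = get_running_instances()
if running < TARGET_INSTANCES:
    scale_to(TARGET_INSTANCES)
```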
In addition to the recommendations above, which should be addressed before reassessment, the following should also be considered.
The Team (Point 2 of the service standard)
The value of getting the team together in the same physical location shouldn’t be underestimated. The team should take every opportunity to work ‘in the same room’ across the lifecycle, from user research through story prioritisation to analytics and evaluation.
Assisted Digital (Point 10 of the service standard)
HMRC provides many different online and offline ways for users to pay tax liabilities, and each of the individual HMRC services which use this payment service is developing appropriate assisted digital support for its specific users. Therefore, in line with GDS policy, specific assisted digital support does not need to be provided for this service, but this should be reviewed if the range of HMRC payment options is reduced.
Testing with the Minister (Point 26 of the service standard)
For services processing or likely to process more than 100,000 transactions per year, the Minister should test the service before it goes live.
Digital by Default Service Standard criteria
Criteria | Passed | Criteria | Passed |
--- | --- | --- | --- |
1 | Yes | 2 | Yes |
3 | Yes | 4 | Yes |
5 | Yes | 6 | Yes |
7 | Yes | 8 | Yes |
9 | Yes | 10 | Yes |
11 | Yes | 12 | Yes |
13 | Yes | 14 | Yes |
15 | Yes | 16 | Yes |
17 | Yes | 18 | Yes |
19 | Yes | 20 | Yes |
21 | Yes | 22 | Yes |
23 | Yes | 24 | Yes |
25 | Yes | 26 | Yes |