
https://dataingovernment.blog.gov.uk/epayments-service-assessment/

ePayments - Service Assessment

ePayments allows users to repay a benefit overpayment online using a debit or credit card through a hosted card payment service.

Department / Agency:
DWP

Date of Assessment:
30/04/2015

Assessment stage:
Alpha Review

Result of Assessment:
Pass

Lead Assessor:
R. Grove

Service Manager:
K. Bruckshaw

Digital Leader:
K. Cunnington


Assessment Report

The ePayments service has been reviewed against the 26 points of the Service Standard at the end of the alpha stage.

Outcome of service assessment

After consideration, the assessment panel has concluded that the ePayments service is on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons

The service has successfully met the alpha standard for a Digital by Default service. Key areas of success were as follows.

Point 2 - Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

The service manager has a strong understanding of transformation, the importance of controlling the end-to-end service (i.e. the letter, not just the online journey), and the challenge of change. This perspective will be critical in the future.

The team described how they had challenged existing policy; this is good practice and should continue throughout the process. Simplifying policy will lead to improvements in the user journey.

Point 6 - Build the service using the agile, iterative and user-centred methods set out in the manual.

The team have a good approach to agile methodologies, running biweekly sprints and evaluating tools and techniques as the project matures. The team now have access to an agile coach, ensuring standards and disciplines are maintained.

Point 13 - Build a service consistent with the user experience of the rest of GOV.UK by using the design patterns and the style guide.

The team have made a good start on the design of the service and, by using the frontend toolkit, are on the right track. This work should continue. The panel would advise the team to join the GDS Hackpad and engage with GDS on any design principles they may have views on.

Point 16 - Use open standards and common government platforms (e.g. GOV.UK Verify) where available.

The team is working alongside HMRC to share understanding and lessons learned. It will be critical to continue this relationship in order to ensure users have a common experience between services, and to ensure efficient use of taxpayer money.

The service is working alongside the GDS Platforms team to understand what benefits can be delivered to DWP ePayments. In addition, the team is ensuring technology is loosely coupled, enabling a potential future switch to any common government technology. This could deliver future benefits for both parties.

Recommendations

Point 1 - Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for digital and assisted digital service design.

The team has made good use of existing channels of insight to validate the user needs (such as the customer satisfaction survey), and followed up by listening in to calls at the contact centre and with pop-up research. However, there was a missed opportunity to carry out discovery research with target users, as the user needs themselves were generated during a workshop rather than through research. The focus of research during discovery should be the identification of user needs through research with target users; the panel felt the leap to pop-up testing of a prototype happened before a full understanding of the target user group was reached.

This research could have included contextual research with target users to understand their experience of finding themselves in debt to the government, and the steps in their journey to repay this debt. That would have grounded the understanding of how other channels of communication, such as the letter, perform as a call to action, and given a richer service-level understanding of the user needs.

Point 3 - Evaluate what user data and information the service will be providing or storing, and address the security level, legal responsibilities, and risks associated with the service (consulting with experts where appropriate).

The team should ensure that a cookie policy is in place before the beta assessment.

Point 8 - Analyse the prototype service’s success, and translate user feedback into features and tasks for the next phase of development.

The customer feedback survey must be built and owned by the team. There should be no restriction on the questions asked, and the data should come directly to the product owner so that specific concerns can be understood.

Point 9 - Create a service that is simple and intuitive enough that users succeed first time, unaided.

The physical letter from the service needs extensive work to make it as simple as possible. It should be user tested and worked on by the content designer who also works on the digital element of the service. For beta, a much simpler, easier-to-follow letter should be shown, with evidence of user-centred design.

The service is currently only useful for users who know their exact balance from their letter and are able to pay the full balance. It is critical that the team conduct user research into other functionality users may require (e.g. account balance, partial payment and regular payment). These should then be implemented for beta, unless there are mitigating circumstances (e.g. prohibitive cost); if so, this evidence must be brought to the beta assessment.

Point 10 - Put appropriate assisted digital support in place that’s aimed towards those who genuinely need it.

The statistics provided by the team show the importance of assisted digital (AD) support to this service, so this will be a key focus of the beta stage. The team must engage users who have not directly approached the service (e.g. existing customers who have debt but have not phoned up), as these may be the most in need. This should be added to the team's overall understanding of user needs.

The overriding feeling from the team was that AD support would be delivered by phone. This may be true; however, the sample tested so far must be expanded to include more actual users of the service, and the team should hold no pre-existing ideas of a solution before they understand what users want help with and why. Evidence will be needed at beta to show why the service has chosen a particular solution and why it is the best one for its users.

Sustainability is a key concern. The team did not feel support was needed from third parties for AD services, but they need to understand what is already being provided. For example, Citizens Advice Bureaux (CAB) have an existing relationship: how much help are they offering to users, and if this is happening, how is it funded? This must be understood, and provision put in place for ongoing AD support based on the user needs the team define. This will be examined at the beta assessment.

Point 13 - Build a service consistent with the user experience of the rest of GOV.UK by using the design patterns and the style guide.

Although the design of the service met the standard, there must be an ongoing commitment to content, as this was an area of concern for the panel. Examples of where this needs to be addressed are the name of the service (when viewed out of context), and the requirement for both the National Insurance number and the reference number, which seemed unnecessary. A content designer will be able to help with both this and the letter redesign.

Point 14 - Make sure that you have the capacity and technical flexibility to update and improve the service on a very frequent basis.

Being able to improve the service on a very frequent basis will be an essential part of running the beta. The team should work to make the process of updating the service as simple as possible, and should look to automate the current manual deployment process, as this will reduce the possibility of error and make it quicker to roll out updates and improvements.

Point 15 - Make all new source code open and reusable, and publish it under appropriate licences (or give a convincing explanation as to why this can’t be done for specific subsets of the source code).

At the beta assessment, the panel will be looking to see that steps have been taken to publish code for the public to view and use.

Point 18 - Use analytics tools that collect performance data.

The team currently have an analyst drawn from a central team. This person will become increasingly important for interpreting user behaviour as the service progresses into beta and beyond, and should ideally be co-located with the team. It would be advisable to bring some analytical evidence to the beta assessment (e.g. goals, funnels and figures).

Point 20 - Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users.

For the beta stage, the plan for iterative research as an embedded part of the sprint cycle is extremely positive. However, it is important that the formative part of this research is with target users of the service so that there is a more robust understanding of their mental model and how they think about their debt. This will lend greater confidence to design decisions such as language, labelling and grouping of steps, and will also ensure a greater service level understanding of the end-to-end journey for users. Pop-up research would be more appropriate in the latter stages of the design process when refining interaction design detail.

Points 21, 22, 23 & 24 - Establish a benchmark for user satisfaction across the digital and assisted digital service; establish a benchmark for completion rates across the digital and assisted digital service; make a plan (with supporting evidence) to achieve a low cost per transaction across the digital and assisted digital service; and make a plan (with supporting evidence) to achieve a high digital take-up and assisted digital support for users who really need it. For each point, report performance data on the Performance Platform.

The team did not show a full set of benchmarked Key Performance Indicators (KPIs) and had not engaged with the Performance Platform team. This must be corrected before the team return for the beta assessment. Additionally, the team should look at how they can provide metrics and KPIs from non-digital channels to the Performance Platform during the beta stage.

Summary

The team have made a strong start in developing a solution and building an alpha service. That work must now be reviewed, with some areas developed further and others entirely reconsidered. This is an entirely appropriate approach moving into beta, and is how the best services develop.

The service manager will be crucial going forward, and they clearly understand the importance of a strong team and hold influence over the end-to-end service. This was excellent to see and should be recognised as a strength.

It was very positive to see engagement with both HMRC e-Payments and the GDS Platforms team; this must continue on both fronts to ensure the team provides the best possible, best value for money solution. As well as learning from both HMRC and GDS, the panel would expect the service to be feeding guidance and requirements into these teams too.


Digital by Default Service Standard criteria

Criteria  Passed    Criteria  Passed
1         Yes       2         Yes
3         Yes       4         Yes
5         Yes       6         Yes
7         No        8         Yes
9         No        10        No
11        Yes       12        Yes
13        Yes       14        Yes
15        Yes       16        Yes
17        Yes       18        Yes
19        Yes       20        Yes
21        No        22        No
23        No        24        No
25        Yes       26        Yes