
https://dataingovernment.blog.gov.uk/all-service-assessments-and-self-certification/pay-penalty-online-service-assessment/

Pay Penalty Online - Service Assessment

The purpose of the Online Enforcement Penalty Payment Service is to give customers who incur fines the ability to pay those fines online, in addition to the existing options of paying by credit or debit card via DVLA’s Contact Centre or posting a cheque to the Agency.

Department / Agency:
DfT / DVLA

Date of Assessment:
22/09/2015

Assessment Stage:
Alpha

Result of Assessment:
Not passed

Lead Assessor:
H. Garrett

Service Manager:
R. Gye

Digital Leader:
O. Morley


Assessment Report

Outcome of service assessment

After consideration, we have concluded that the Pay Penalty Online service is not yet on track to meet the Digital Service Standard at this early stage of development.

Reasons

The assessment team were impressed by the knowledge and enthusiasm of the service team who have achieved a lot in a short period of time.

The team are working in an agile way, have all disciplines represented in their team and the service has been iterated quickly at this early stage of development. The major technology decisions were appropriate and well explained during the assessment.

However, there was not enough evidence that the team had done the right types of research during discovery to understand user needs; in particular, the service team haven’t yet spoken to users with low or no digital skills. The research plan for the next few months is focussed on usability testing, which isn’t always the most suitable method, particularly at this early stage.

The DVLA payment platform is a significant part of the service, and research must show users completing the service end-to-end, including pages the team have less control over, and ‘unhappy’ paths through the payment platform.

Recommendations

User needs and user research

The team have so far seen all users fail to complete the ultimate user need (driving a vehicle legally), even when the penalty has been paid. The team should investigate easier integration with taxing a vehicle or making a SORN notification as part of the service.

The frequency of research and continued commitment to usability testing is not in question. The team have usability testing sessions booked in and budgeted for until December. They have also used pop-up research techniques to test very early ideas, and have designed and analysed the results of two surveys (one postal and one online) as part of discovery and alpha. However, we are less interested in the quantity of user research that has been undertaken than in its quality and coverage.

The over-reliance on usability testing at the expense of other research methods, particularly during discovery and alpha, should be addressed before the team comes back for a reassessment on points 1 (user needs) and 2 (commitment to continued research). We recommend arranging more contextual research to better understand user needs, and perhaps some of the bigger issues around compliance that might be involved. We recommend the service team talk to the user research community if they need any further advice. They may also find the guidance on the user research methods hackpad and the user research blog useful.

The team need to ensure they are speaking to a representative group of users of the service, including users with low or no digital skills, to understand their needs for support. The team had plans to visit the Citizens Advice Bureau to talk about how they might reach assisted digital users, but this visit hasn’t happened yet. We would expect the service team to have done research to understand the needs of assisted digital users at this stage of development.

The team

Currently, user research seems to be conducted by a separate (albeit connected) team, with reports written up and fed into the service team. While it was great to hear that the service team were observing usability testing, we recommend that the user researchers regularly sit with the team to get a better idea of what should be researched, and that research is analysed with the wider service team, led by the researcher.

Test the end-to-end service

The team must test (both as part of QA and user research) on a wide range of mobiles and tablets, and should consider designing mobile-first.
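
As an illustration of the QA side of this, the sketch below runs the same smoke test across a range of phone, tablet and desktop viewport sizes. It assumes Selenium and pytest, and the URL and CSS selector are hypothetical placeholders, not the service's real ones:

    # A minimal sketch: one smoke test run across several viewports.
    # The start-page URL and the CSS selector are hypothetical.
    import pytest
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    VIEWPORTS = {
        "small phone": (320, 568),
        "large phone": (414, 736),
        "tablet": (768, 1024),
        "desktop": (1366, 768),
    }

    @pytest.fixture
    def driver():
        d = webdriver.Chrome()
        yield d
        d.quit()

    @pytest.mark.parametrize("name,size", VIEWPORTS.items())
    def test_start_button_visible(driver, name, size):
        driver.set_window_size(*size)
        driver.get("https://pay-penalty.example.gov.uk/start")  # hypothetical
        # The primary call to action must remain visible at every size.
        button = driver.find_element(By.CSS_SELECTOR, "a.button-start")
        assert button.is_displayed(), f"start button hidden at {name}"

Designing mobile-first and then checking the larger sizes tends to surface layout problems earlier than the reverse.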

Patterns and style guide

The designers on the team should contribute to cross-government design discussions, publish changes to the current style on the Hackpad, and contribute new patterns.

DVLA should look at how to manage styles across services, both from a user interface and development standpoint.

Performance data

The team should work with the DVLA payment platform to understand how to track users through the complete service.
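
As a sketch of what this could look like once the payment platform shares journey-level data, the example below stitches events from both systems together using a shared journey ID and reports drop-off at each step, including the ‘unhappy’ exits inside the payment pages. The event names and log format are invented for illustration:

    # Sketch: stitch a user's journey across the service and the
    # payment platform by journey ID, then compute drop-off per step.
    # Event names and the input format are hypothetical.
    from collections import Counter

    FUNNEL = [
        "penalty-details",   # service-owned page
        "payment-card",      # payment-platform page
        "payment-confirm",   # payment-platform page
        "receipt",           # service-owned page
    ]

    def furthest_step(events):
        """Return the index of the furthest funnel step a journey reached."""
        reached = -1
        for step, name in enumerate(FUNNEL):
            if name in events:
                reached = step
        return reached

    def drop_off(journeys):
        """journeys: dict of journey_id -> set of event names seen."""
        counts = Counter(furthest_step(ev) for ev in journeys.values())
        total = len(journeys)
        for step, name in enumerate(FUNNEL):
            completed = sum(c for s, c in counts.items() if s >= step)
            print(f"{name}: {completed}/{total} journeys reached this step")

    drop_off({
        "j1": {"penalty-details", "payment-card", "payment-confirm", "receipt"},
        "j2": {"penalty-details", "payment-card"},  # abandoned on the platform
        "j3": {"penalty-details"},
    })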

The team need information on current cost per user for the existing cheque and phone channels.

Make all new source code open and reusable

The team have a plan to open source the subsets of the code they are able to. We encourage them to do this sooner rather than later, and expect to see the code in an open repository before their beta assessment.

Dependencies

The team is aware that dependencies on the payments platform and the legacy workflow solution are critical. Prioritisation of the changes required by Pay a Penalty Online to the shared DVLA Payments service may be at risk due to conflicts with other programmes. This is likely to be a significant schedule risk for the service.

Technology and change

The Digital Service Standard requires that teams are able to make changes quickly. If payment policy rules are to change often, the service must be tested to show it supports this need. Reliance on any particular technology as a proxy for this activity is not helpful; specifically calling out a rules engine product (Drools) as a solution to the user need to react to frequent, complex change is unhelpful.

Rules in rules engines are just programming languages, with no more ability to change safely than any other. If a change is trivial there are many ways to enact it (a spreadsheet, a properties file, a table); if it is complex, no technology makes it simple. Use whatever technology is appropriate (taking into account guidance about openness), but any particular service quality, especially the ability to change, has to be built into the service through design and test activity and cannot be acquired through any specific technology.
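
To make that concrete, here is a minimal sketch of the ‘rules as plain data’ alternative. The offence codes, amounts and grace periods are invented for illustration; the rule table could just as easily be a properties file or a spreadsheet export. The safety comes from the tests that pin the behaviour, not from the storage format:

    # Sketch: payment policy rules held as plain data rather than in
    # a rules engine. Offence codes and amounts are invented.
    from datetime import date

    PENALTY_RULES = {
        # offence_code: (base_penalty_gbp, late_surcharge_gbp, grace_days)
        "NO_TAX":  (80.0, 80.0, 28),
        "NO_SORN": (40.0, 40.0, 28),
    }

    def penalty_due(offence_code, issued, paying_on):
        base, surcharge, grace = PENALTY_RULES[offence_code]
        days_outstanding = (paying_on - issued).days
        return base + (surcharge if days_outstanding > grace else 0.0)

    # Tests pin the behaviour, so a rule change is a data edit plus an
    # updated expectation, with the same safety as any code change.
    assert penalty_due("NO_TAX", date(2015, 9, 1), date(2015, 9, 10)) == 80.0
    assert penalty_due("NO_TAX", date(2015, 9, 1), date(2015, 10, 15)) == 160.0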

Summary

The panel would like to thank the service team for their well-informed answers to our questions. We look forward to hearing what the service team learn from their research and to seeing how the service develops.


Digital Service Standard criteria

Criterion   Passed      Criterion   Passed
1           No          2           No
3           Yes         4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          Yes
11          Yes         12          Yes
13          Yes         14          Yes
15          Yes         16          Yes
17          Yes         18          Yes