MOT Testing - Service Assessment
The MOT Testing service will enable users to efficiently and effectively record and report the results of an MOT test in accordance with the MOT scheme rules. For example, it will enable:
- qualified and pre-authorised vehicle testers, operating at pre-authorised private garages, to electronically record and amend an MOT test result and print a certificate
- the private garages to pay a transaction (slot) fee to DVSA for the submission of the MOT test
- DVSA staff to record the outcome of a vehicle re-inspection
- the results of the MOT test to be shared with DVLA, and in turn support DVLA's online electronic vehicle licensing service.
Department / Agency:
DfT / DVSA
Date of Assessment:
Result of Assessment:
The MOT testing service is seeking permission to launch on a service.gov.uk domain as a beta service.
Outcome of Service Assessment
After consideration the assessment panel has concluded the MOT testing service has shown sufficient progress and evidence of meeting the Digital Service Standard criteria and should proceed to launch as a beta service on a service.gov.uk domain.
Given the particular time pressures regarding the rollout of the new service this pass is conditional on continuing to provide GDS with weekly updates on rollout status and service availability until that information is available via the Performance Platform.
The panel were impressed with the detailed understanding the team showed of their primary users (MOT garages) and the wider role of the MOT process in British driving. The team demonstrated a significant level of specialist knowledge and an understanding of how digital approaches could simplify the experience for garages and motorists.
The team had identified and spoken to a range of users who’ll need support to complete the service, and were confident that this range accurately represents all users needing support. The team has a model of support in place for the public beta that is intelligently shaped to meet user needs. The model includes phone and face-to-face support for those who need it, and provides alternative support options for those currently relying on colleagues and trade associations.
The team has ongoing plans in place to continue research with users of all levels of digital skills and confidence.
The team contains a good balance of skills and is working together effectively. While DVSA is heavily dependent on suppliers at present there is a clear focus and plan to ensure sufficient in-house capability to manage the service once it is live. The service manager is responsible for all channels including the existing support team.
The team are distributed across the country but have adapted their ways of working to support this, and are using it to their advantage in their ability to visit garages. The assessment panel recommends that all members of the team take part in observing user research, so that the whole team keeps users in mind throughout development.
The panel were pleased to hear how the service manager is empowered well beyond the parts of the service that exist on-screen, with a remit including the assisted digital support elements of the service and responsibility for full digital take up.
Security, Privacy, Tools and Standards
The team appear very familiar with the actions and conversations that need to happen when operating a digital service. They are able to talk clearly about the reasons for technology choices, how they will avoid lock-in and future steps they want to take to further improve the service.
The service is not using GOV.UK Verify but that is as a result of ongoing discussions with the Verify team and with their approval. They are aware of the opportunities for improving how they share the underlying data in the service (via APIs and more regular publication) and looking to develop that element of their offering once the core service is live.
The renewal of vehicle tax by DVLA is dependent on being able to obtain the MOT status of a vehicle and there is both a batch data exchange and a real-time API in place to support that. The team are clearly aware of that dependency and how it works. They explained that there have been a small number of data quality issues which they have been working to resolve and are including that aspect of the service in their testing.
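As an illustration only of the kind of check this dependency supports (the field names, values and rule below are hypothetical placeholders, not the actual DVSA/DVLA interface), a consumer such as DVLA's vehicle licensing service might decide eligibility from a real-time MOT status lookup along these lines:

```python
# Illustrative sketch only: the record shape, field names and rule below are
# hypothetical, not the actual DVSA/DVLA API contract.
from datetime import date

def mot_is_current(record: dict, on: date) -> bool:
    """Return True if the vehicle's latest MOT pass covers the given date.

    `record` stands in for a response from a real-time MOT status lookup,
    e.g. {"registration": "AB12CDE", "result": "PASS",
          "expiry": date(2026, 3, 1)}.
    """
    return record.get("result") == "PASS" and record.get("expiry", date.min) >= on

# A licensing check would refuse tax renewal once the MOT has lapsed:
record = {"registration": "AB12CDE", "result": "PASS", "expiry": date(2026, 3, 1)}
print(mot_is_current(record, date(2025, 6, 1)))  # True: MOT still valid
print(mot_is_current(record, date(2026, 6, 1)))  # False: MOT has expired
```

A batch data exchange would apply the same rule across a nightly file rather than per request; the data quality issues the team mentioned are exactly the kind of thing a check like this surfaces (for example, a pass record with a missing or stale expiry date).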
Significant amounts of performance and capacity testing have taken place on both the new and old infrastructure, but there is currently a gap in ongoing functional testing of full user journeys in the production environment. The team are aware of this and the panel was keen that they prioritise adding that testing.
There is an infrastructure migration scheduled within the next fortnight which is clearly planned but high risk. The team is fully aware of that risk, has contingency plans in place and is in regular contact with GDS to review progress.
Improving the service
The team are clearly making regular improvements to the service and were able to talk about their process in detail. There is a plan in place to continue being able to make improvements and the service manager and product managers are in a position to set priorities.
The team have tested the on-screen service with users with lower levels of skill and confidence, and made changes in response to their feedback (e.g. simpler language).
The team acknowledged that the assisted digital support for the service might struggle to cope with demand if it went live now, so they are increasing capacity while tackling the issues that are generating the majority of calls in the first place. During the public beta the team will be testing with the actual systems, staff and setup that will be in place when the service is live.
The team have clear plans to test all routes of support during beta, including those currently provided by third parties and the face-to-face support available through DVSA's 'Change agents'. It was good to hear how the team have pared down other activities to free up face-to-face support staff for when the service moves to public beta, to ensure capacity and that no user fails to get the support they need.
The team do not feel web chat is required to meet users’ support needs and so have not included it in their support model, although they will be looking into it more during beta.
The team are in the process of migrating to the GOV.UK front-end toolkit, the service having previously been built around Bootstrap due to technical constraints. Where patterns don't currently exist in the toolkit they have been creating their own. This is fine; however, any new patterns must be tested rigorously. They should also check the design hackpad to confirm that similar, appropriate patterns have not already been developed by other design teams.
The service must follow the pattern of all other digital services on service.gov.uk domains and have the GOV.UK logo rather than the MOT logo. The team stated that having the user information positioned prominently in the header tested well with users, so that can remain there alongside the GOV.UK logo.
The existing service is 100% digital and as the service team are adding no extra channels, the new service will also be fully digital.
Analysis and benchmarking
The team provided concrete examples of improvements made to the service based on analytics and call centre feedback. Takeup of the new service is being closely tracked and they were able to explain how they would be calculating cost per transaction. Conversations are under way with the Performance Platform team to prepare public dashboards.
The team’s focus over the coming weeks will be on the migration to new infrastructure and switching garages to the new system. Conversations are already taking place between DVSA and GDS to track that and they should continue with regular reporting of rollout progress, service availability, and ongoing test results.
Further efforts should be made to engage all members of the team in the research process, not just those in product and design roles, so that the whole team keeps users in mind.
The team should liaise with GDS on options for exposing further data and APIs from the service, and to discuss taking part in discovery work on notification platforms across government.
Full, routine end-to-end testing for common user journeys (including that resulting data is made available correctly for DVLA’s use) should be put in place to run in the production environment. The panel is satisfied that this testing is taking place in the pre-production environment and is sufficient to test service capacity but making it routine practice in the production environment would significantly increase confidence that changes are working and that issues will be found quickly.
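One lightweight way to make such checks routine is a scheduled smoke test that walks the core journey in production and reports the first failures for alerting. The sketch below is purely illustrative: the journey steps and the always-passing checks are hypothetical placeholders, not the team's actual test suite.

```python
# Sketch of a scheduled production smoke test. Every step name and check
# here is a hypothetical placeholder; a real suite would exercise the live
# service with a dedicated test vehicle record.

def run_smoke_test(steps):
    """Run each (name, check) pair in order, collecting failures.

    Returns (passed, failures) so a scheduler or alerting hook can act
    on the result; an exception in a check counts as a failure.
    """
    failures = []
    for name, check in steps:
        try:
            ok = check()
        except Exception:
            ok = False
        if not ok:
            failures.append(name)
    return (not failures, failures)

# Placeholder journey covering the MOT flow and the DVLA data dependency:
journey = [
    ("log in as test user", lambda: True),
    ("record MOT result", lambda: True),
    ("print certificate", lambda: True),
    ("result visible to DVLA lookup", lambda: True),
]
passed, failures = run_smoke_test(journey)
print(passed, failures)  # True []
```

Run on a timer (for example every few minutes), a harness like this gives the quick feedback the panel recommends: a deploy that breaks any step of the journey, including the data handoff to DVLA, is flagged within one cycle rather than when a garage or motorist reports it.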
The team should continue work to confirm that no users need to pay for assisted digital support, and that appropriate, free-to-use options that meet their needs are available.
The team should liaise with the GDS design team for more detailed recommendations.
This is a complicated and high profile service and the timetable for rolling it out to garages is extremely difficult, but the panel were encouraged to see that it is on track and that the team are responding rapidly to issues as they arise.
The panel was particularly pleased to hear the team’s broad knowledge of their area and that the service manager is genuinely responsible for the overall service.
Digital Service Standard criteria