
https://dataingovernment.blog.gov.uk/vehicle-management-service-assessment-2/

Vehicle Management - Service Assessment

Vehicle Management will provide an online facility for motorists to notify changes to their vehicle registration details. This will include notifying a change of keeper and a change of name and address, which will allow tens of millions of motorists each year to transact via digital channels rather than the existing paper-based channels.

The first service to be developed within Vehicle Management, and the one being assessed for Public Beta, is notifying DVLA that you have sold a vehicle to the motor trade. This is a real-time service enabling the vehicle keeper to notify the sale of their vehicle into the trade, speeding up any vehicle tax refund that is due. The service will be provided via motor dealers, fleets and auction houses. A service for individual users will be delivered later next year.

https://www.gov.uk/transformation/manage-vehicle.html

Department / Agency:
DfT / DVLA

Date of Assessment:
27/11/2014

Assessment stage:
Beta

Result of Assessment:
Pass

Lead Assessor:
M. O’Neill

Service Manager:
R. Gye

Digital Leader:
B. Etheridge


Assessment Report

The Vehicle Management service is seeking permission to launch on a service.gov.uk domain as a Beta service.

 Outcome of service assessment

After consideration, the assessment panel have concluded that the Vehicle Management service has shown sufficient progress and evidence of meeting the Digital by Default Service Standard criteria and should proceed to launch as a Beta service on a service.gov.uk domain.

 Reasons

 The service team clearly set out the user need the service is based on, and evidenced how it is meeting that need.

 The panel were impressed by the work the team have done to build an effective DevOps deployment process. Their use of RPMs and a YUM repository is a notable approach that will be of interest to others deploying services.
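
 To illustrate the approach for other teams, below is a minimal sketch of this kind of packaging step; the spec file, build directory and repository path are hypothetical placeholders rather than details of the DVLA service.

#!/usr/bin/env python3
"""Build an RPM and publish it to a YUM repository.

Illustrative sketch only: the spec file, build directory and repository
path below are hypothetical, not taken from the DVLA service.
"""
import shutil
import subprocess
from pathlib import Path

SPEC_FILE = Path("packaging/vehicle-service.spec")   # hypothetical spec file
BUILD_ROOT = Path.home() / "rpmbuild"                # standard rpmbuild working directory
YUM_REPO = Path("/srv/yum/vehicle-service")          # hypothetical repository path


def build_rpm() -> Path:
    """Build a binary RPM with rpmbuild and return the newest artefact."""
    subprocess.run(
        ["rpmbuild", "-bb", str(SPEC_FILE), "--define", f"_topdir {BUILD_ROOT}"],
        check=True,
    )
    rpms = sorted((BUILD_ROOT / "RPMS").rglob("*.rpm"), key=lambda p: p.stat().st_mtime)
    if not rpms:
        raise FileNotFoundError("rpmbuild produced no packages")
    return rpms[-1]


def publish(rpm: Path) -> None:
    """Copy the RPM into the repository and refresh the repodata metadata."""
    YUM_REPO.mkdir(parents=True, exist_ok=True)
    shutil.copy2(rpm, YUM_REPO)
    # createrepo --update regenerates the metadata that yum clients read.
    subprocess.run(["createrepo", "--update", str(YUM_REPO)], check=True)


if __name__ == "__main__":
    publish(build_rpm())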

 The team have an effective continuous deployment process in place, with clear ownership of key elements and a well-understood security model.

 The team have a well-maintained private GitHub repository in place and are making the bulk of their code available as open source. One recommendation for the future is to continue to clearly separate secure code and patterns from the wider code base, to allow more code to be open sourced and shared publicly.

 The team were also able to clearly set out how they worked with legacy systems to minimise any threat or impact to core services.

 The team were able to set out how resources across the delivery team are structured, but it would be useful to define more clearly how non-core resources are brought into the team as needed.

 The team were able to set out how they planned to tackle any remaining non-digital legacy aspects of the process, such as the follow-up letter.

 Though the work on analytics was at an early stage, the team had obviously given thought to the process and the approach.

 The team have considered assisted digital users and their needs, including using assisted digital personas to inform their thinking. The service team has worked with the DVLA call centre to design and put in place assisted digital phone support specifically for this service’s users. The team expects 80% of trade users to be trying to use this digital service within 3 years.

 Recommendations

 Narrative

  • The service should better articulate the statistics around the user population; the slide used in the presentation confused the assessment panel by quoting absolute percentages of time-based slices of the user population without providing context.
  • Demonstrate through evidence that the service’s working-hours availability meets user needs.

 Service name and start page

  • The team should work with content designers on GOV.UK to agree a name for the service that makes sense to the service’s users (i.e. dealers and people in the motor trade). The name should be written from their perspective to help differentiate this service from potential future services, including individual-to-individual sales.
  • The start page is currently far too long to be included on GOV.UK, and the team should work with GOV.UK content designers to decide what needs to be included on this page, and what should be included in the service.

 Service design

  • The team should make sure that they are taking the latest set of GOV.UK component designs from the service manual. Where possible they should be using the latest front-end toolkit supplied by GDS. A list of design suggestions will be supplied separately to this assessment.
  • Some pages are currently quite wordy and do not meet GDS content design principles.

 Team structure

  • The team must contain a named designer, content designer, user researcher and data analyst who sit with the team (even if not full-time on the project).

 Technical

  • The team’s use of continuous integration, package-based deployment, and automation of infrastructure as code is impressive. The team should continue to improve their deployment process, reducing the deployment window between preview and live environments (a sketch of one such promotion gate follows this list), and ensure they are not locked into their current hosting provider.
  • The team demonstrated a good reactive approach to security and monitoring. They should continue their good relationship with their SIRO and technical security team, and consider further ways of identifying any process-based attacks possibly introduced by digitising an existing paper-based service with limited hours of availability.
  • The team are periodically publishing the front-end code as open source, and should continue to identify other parts of the service which may be of use in other services, and look into collaborating on code from other organisations, for example audit code from HMRC and the address lookup service used by IER.
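
 As a minimal illustration of automation that can shorten the preview-to-live window, the sketch below gates promotion to live on a preview health check; the endpoint URL and promotion command are hypothetical placeholders, not the team’s actual tooling.

#!/usr/bin/env python3
"""Promote a release from preview to live only if preview is healthy.

Illustrative sketch only: the health-check URL and the promotion command
are hypothetical placeholders, not the DVLA team's actual tooling.
"""
import subprocess
import sys
import time
import urllib.request

PREVIEW_HEALTH_URL = "https://preview.example.service.gov.uk/healthcheck"  # hypothetical
PROMOTE_COMMAND = ["./deploy.sh", "live"]                                   # hypothetical


def healthy(url: str, attempts: int = 5, delay: float = 10.0) -> bool:
    """Return True if the endpoint answers HTTP 200 within the allowed attempts."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=5) as response:
                if response.status == 200:
                    return True
        except OSError:
            pass  # connection refused, timeout or DNS failure: retry after a pause
        time.sleep(delay)
    return False


if __name__ == "__main__":
    if not healthy(PREVIEW_HEALTH_URL):
        sys.exit("Preview environment unhealthy; promotion to live aborted.")
    subprocess.run(PROMOTE_COMMAND, check=True)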

 Assisted digital

  • The service team must ensure that research and user testing is being carried out with users who have low or no digital skills, as much of their work to date has been through digital channels and thus with users who are already online with some digital skills.
  • The team must contact and conduct research with users who responded to their 2014 survey saying they will need assisted digital support, to better understand their needs.
  • The team must revisit the 10% estimate for the number of users who will require assisted digital support, as this has largely been arrived at through online research, meaning respondents will likely have good digital skills and internet connectivity.
  • The team must use offline channels to find all assisted digital users of their service.
  • The team must monitor support across all channels (phone, face to face, or web chat), iterating in response to both user feedback and performance measurement.
  • The team must revisit the figure of 99% digital connectivity for users.
  • The team should plot users on the digital inclusion scale, rather than basing decisions on whether users’ workplaces have an internet connection.
  • The team must build digital inclusion strategies into their assisted digital support, even though users of this service have indicated that they will likely only need support once or twice.
  • The team must specifically allocate ownership for assisted digital within their team structure.

 Digital take up

  • The team must continue to work with the Digital Take Up team in GDS, to ensure an appropriate plan is in place for this service.
  • The team must explore non-digital channels to discover and engage with users at the lower end of the digital inclusion scale and understand if and how they can be encouraged to switch to the digital service.
  • The team must test digital take up messaging during beta and iterate it in response to user feedback.

Digital by Default Service Standard criteria

Criteria    Passed      Criteria    Passed
1           Yes         2           Yes
3           Yes         4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          Yes
11          Yes         12          Yes
13          Yes         14          Yes
15          Yes         16          Yes
17          Yes         18          Yes
19          Yes         20          Yes
21          Yes         22          Yes
23          Yes         24          Yes
25          Yes         26          Yes