DVLA Webchat - Service Assessment

The assessment looked at a web-chat tool that DVLA has developed for use in its online services. Web-chat gives the customer the option of direct, live online communication with a DVLA advisor, instead of having to switch channels to telephone the call centre or submit an email enquiry.

This alpha assessment was based on a prototype, and the service team demonstrated the tool attached to the ‘Take a registration off a vehicle’ service. This was simply to provide context on how the tool will be used; the web-chat tool was assessed, not the service itself.

Department / Agency:

Date of Assessment:

Assessment Stage:

Result of Assessment:

Lead Assessor:
J. Gould

Service Manager:
J. Hewson

Digital Leader:
O. Morley

Assessment Report

Outcome of service assessment

After consideration, the assessment panel has concluded that the DVLA webchat tool is on track to meet the Digital Service Standard at this early stage of development.


This is the first tool (rather than citizen facing service) to be assessed by Government Digital Service against the service standard. This made the assessment a bit unusual for both the panel and the service team - particularly as the team’s focus is on integrating a proprietary tool from Salesforce into their services. For that reason the panel was particularly interested in how supplier/tool agnostic the approach was - in other words how easy it would be to swap out one web chat tool supplier for another in the future if required. Whilst the panel still retains some concerns, the general approach is satisfactory.

A comprehensive list of recommendations is set out below to guide the team as it prepares for and develops the beta phase of the project. The service team must work to meet the recommendations set out below before the beta assessment.

User Research

The team demonstrated a good grasp of the context of the problem and of the volume of incoming calls they aim to address with the web-chat tool.

The team have used a range of research techniques so far, including focus groups, surveys, lab-based research and contextual research. They have successfully identified several key groups of users who the tool is aimed at, and particularly picked up on its benefits for deaf users. The panel was encouraged by the team’s identification of the contexts in which users will engage with the tool.

The survey itself presents some cause for concern and is by its nature leading. It offers only three options, and should be viewed as confirming a business need rather than establishing a user need. Although out of scope for the webchat tool, the real research question is why users cannot find what they need online and have to resort to further contact in the first place.

The team showed solid user personas including web usage and support needs and demonstrated an ability to separate true user needs from business needs.

The team presented a comprehensive plan for research up to April 2016 and has identified a need to research what answers customers expect and require.


Design

The team demonstrated how they had used paper prototypes to get an initial understanding of the problems and to explore what users expect. The team has tested a variety of approaches for the call-to-action to initiate the chat tool. However, these have all relied on an icon or imagery. Different designs have been mocked-up for the chat interface, although these haven’t been worked into the prototype or tested with users yet.

In beta the team needs to interrogate the chat interface in much more detail; relying on the default behaviour of the supplier’s solution may not be the best approach to meeting user needs.


The team

The current structure and remit of the team appear to be set up to deliver a programme of work that includes the webchat, rather than the tool specifically. The panel recommends splitting the larger team into teams focused on specific products. This will allow each team to develop its product at a cadence that suits the product rather than the wider programme. The panel does not believe that a common cadence will work in the beta phase.

Similarly, the current approach to sprints and ceremonies should be changed to support the development of the tool. Whilst the team is nominally working in four-week sprints (which the panel believes are too long for the current development phase), its two-weekly showcases and retrospectives suggest that two-week sprints would be more appropriate. Rationalising a smaller team to focus specifically on the integration of the tool will allow it to build cadence more appropriately.

Whilst the team were able to indicate that they had considered the possibility of changing to an alternative provider of webchat services in the future, the panel will want to be satisfied at beta assessment that this approach is truly platform-agnostic and capable of being easily adapted to an alternative solution provider. A workable migration strategy needs to be developed, demonstrating how platform agnostic the approach is, and how easy it would be to swap out one solution for another.

User Research

Whilst the team has demonstrated a solid basis of user needs, the panel recommends the team reworks these into recognisable user needs: people on the team (and wider stakeholders) need to be able to see what real people really say and why they say it.

Further to this, the panel would not recommend using the survey as proof of user need for reasons stated previously.

For the next phase the team is recommended to:

  • Engage with the internal users who will be answering webchat queries. This is a large-scale change for the people whose day job will be to answer the queries.
  • Research what happens when webchat is not available. It’s important to understand what users expect to happen if they are unable to use the tool.
  • Address what happens if a problem can’t be solved in chat, looking at how long this process takes before handoff. What is the customer’s reaction when this happens?


Design

Although the team had tested many variations of the call-to-action, none had been tested without an icon or imagery. It is recommended that the team test a text-only call-to-action, as this follows the advice in the service manual.

As variations on the chat interface have not been tested, it is recommended that the team test different ways of accessing and interacting with the chat agent.

Opening new windows to display the chat interface is not recommended, as this introduces usability issues, especially for users with low digital skills or users on mobile or assistive-technology devices. If the team decides to take this route, there must be strong evidence demonstrating that users of these devices succeed in using the tool.

The chat interface will need to be rigorously tested for accessibility issues across a variety of assistive technologies. The team has already worked with the Deaf Association and should continue to work with similar groups to observe users with different needs and abilities using the tool.

Only services hosted on GOV.UK are permitted to use the crown logo and the New Transport typeface. Users need to be able to verify that they are talking to government through trusted channels, so it is recommended that the service is accessed via a subdomain.


Technology

The webchat solution is based on a Salesforce product which has been acquired through Digital Marketplace after various options were considered during discovery. Salesforce is not open source and does not use open standards for data interchange. As such, suitable diligence during all phases of the project is recommended to avoid being trapped by vendor lock-in. DVLA needs to prove that business critical data can be exported without loss and in a useful way that could be used to transition customer contact histories to another solution before going into a large scale beta with real users. This recommendation is vital if other projects in the call centre also begin integrating with Salesforce as part of a unified Customer Relationship Management (CRM) effort within DVLA.
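As a sketch of what demonstrating such an export might involve: Salesforce exposes record data through its REST API via SOQL queries, and the snippet below builds a query URL and flattens returned records into portable CSV, the kind of open format a replacement supplier could import. The object and field names (`ChatTranscript`, `Id`, `Body`), the record values, and the instance hostname are illustrative assumptions, not DVLA’s actual schema.

```python
import csv
import io
from urllib.parse import quote


def build_export_url(instance, version, soql):
    """Build a Salesforce REST API query URL for a SOQL export."""
    return f"https://{instance}/services/data/v{version}/query?q={quote(soql)}"


def records_to_csv(records, fields):
    """Flatten API records (a list of dicts) into CSV, keeping only the
    named fields and ignoring API bookkeeping keys like 'attributes'."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for rec in records:
        writer.writerow(rec)
    return out.getvalue()


# Hypothetical object, fields, and instance name for illustration only.
url = build_export_url("example.my.salesforce.com", "57.0",
                       "SELECT Id, Body FROM ChatTranscript")

# A real export would page through the API response; one fake record here.
rows = records_to_csv(
    [{"Id": "570x0000000001", "Body": "Hello", "attributes": {}}],
    ["Id", "Body"],
)
```

A periodic export of this kind, run against the test environment, would give DVLA ongoing evidence that contact histories remain recoverable in a supplier-neutral format.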

Since the tool’s back-end is provided by Salesforce’s platform, minimal software development has been required in-house to integrate the tool into DVLA’s service flows. However, some customisation has been performed, largely by a Salesforce team working with DVLA. The DVLA team assures us they own the intellectual property and that some Salesforce development capability will be built up in house to take ownership of these changes. The panel expects this to happen during beta.

Customisations to Salesforce’s platform, and changes to the markup and styling of the chat window, are made via the Salesforce admin interface. This raised some concern about the level of version control available and the testing that could occur to ensure bugs or regressions aren’t introduced by any changes or by Salesforce’s regular release process. The team stated that a testing environment is available, and that manual testing takes place on each new release. The service team should look at approaches to testing the chat window functionality in an automated way.
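One low-cost form such automation might take, sketched below under the assumption that the chat window’s markup can be fetched from the test environment: parse the page and assert that the elements the chat depends on are still present after each Salesforce release. The element ids are hypothetical, and a static sample stands in for the fetched page.

```python
from html.parser import HTMLParser


class ChatWindowChecker(HTMLParser):
    """Collects element ids and tag names so a test can verify the chat UI
    survived a Salesforce release or an admin-interface change."""

    def __init__(self):
        super().__init__()
        self.ids = set()
        self.tags = set()

    def handle_starttag(self, tag, attrs):
        self.tags.add(tag)
        for name, value in attrs:
            if name == "id":
                self.ids.add(value)


def check_chat_window(html, required_ids, required_tags=("form", "textarea")):
    """Return a list of missing ids/tags; an empty list means the smoke
    test passed."""
    checker = ChatWindowChecker()
    checker.feed(html)
    missing = [i for i in required_ids if i not in checker.ids]
    missing += [t for t in required_tags if t not in checker.tags]
    return missing


# In a real run this markup would be fetched from the Salesforce test
# environment; a static sample with hypothetical ids stands in here.
SAMPLE = """
<div id="chat-window">
  <form>
    <textarea id="chat-input"></textarea>
    <button id="chat-send">Send</button>
  </form>
</div>
"""

missing = check_chat_window(SAMPLE, ["chat-window", "chat-input", "chat-send"])  # → []
```

A check like this could run against each new Salesforce release before the manual pass, catching regressions in the window’s structure early.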

The service team intends to publish Salesforce customisations, metadata, markup, and styles on GitHub where possible for other teams to re-use if they need a similar chat solution. The panel expects this to happen before beta assessment.

The panel noted that the popup chat window is currently not served by a GOV.UK service domain URL. This raised some concern around user trust, and whether the New Transport typeface and the crown logo could be used were this to remain the case. DVLA should approach Salesforce and investigate whether a custom domain ending in .gov.uk can be used to serve the chat window (such as via a CNAME in DNS) and, if not, confer with GDS about correct use of these assets and branding.
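For illustration, serving the chat window from a DVLA-controlled name via a CNAME might look like the following zone-file fragment. Both hostnames are hypothetical; the target would be whatever name Salesforce designates for the chat endpoint, and Salesforce would also need to present a valid certificate for the custom name.

```
; Hypothetical record in a DVLA-controlled zone (names illustrative only):
; webchat.dvla.gov.uk would resolve to the Salesforce-hosted chat window.
webchat   IN   CNAME   example.my.salesforce.com.
```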


The panel would like to thank the team for presenting a comprehensive view of the development of the tool and how they plan to deploy it across multiple DVLA services. The panel was impressed by the team’s knowledge and approach and looks forward to seeing them at a future beta assessment.

Digital Service Standard criteria

Criterion  Passed
1          Yes
2          Yes
3          Yes
4          Yes
5          Yes
6          Yes
7          Yes
8          Yes
9          Yes
10         Yes
11         Yes
12         Yes
13         Yes
14         Yes
15         Yes
16         Yes
17         Yes
18         Yes