Dart Charge - Service Assessment

Dart Charge is the ‘free-flow’ charging scheme to which government is committed as part of its strategic objective to manage congestion in the short, medium and longer term. It introduces a new operating model at the Dartford Crossing to ease congestion.

Users of the Dartford-Thurrock River Crossing of the Thames on the M25 are required to pay a charge to manage congestion of the bridge and tunnels. Dart Charge is a new remote payment system being introduced from October 2014 to reduce congestion and ease traffic flow at the crossing. Instead of stopping at barriers to pay to cross (as happens now), drivers will pay online, by phone, text, or at one of numerous retail outlets.

The service will be for all users of the Dartford-Thurrock River Crossing, covering a broad range of vehicle users: from private individuals to large corporate and government accounts, and from local residents who use the crossing regularly to occasional users.

The service will provide users with information to understand what they must do to comply with the charging regime and the consequences if they do not. It will help users to pay the charge in a number of different ways to best suit their individual needs.

Department / Agency:
DfT / HA

Date of Assessment:

Assessment Stage:
Alpha review

Result of Assessment:
Not passed

Lead Assessor:
S. Edwards

Service Manager:
L. Forsyth

Digital Leader:
B. Etheridge

Assessment report

After consideration the assessment panel have concluded the Dart Charge service is not yet on track to meet the Digital by Default Service Standard at this early stage of development.


The service was assessed at the alpha stage, although the service development is not generally following an agile development process through alpha/beta/live stages.

The project is being delivered for the Highways Agency by a supplier with experience of several similar schemes around the world. The representatives from the Highways Agency gave detailed and considered answers in the assessment and demonstrated considerable knowledge and experience that is of enormous value to the project. The assessment panel are also conscious of the framework within which the Highways Agency must operate and procure this project, and commend them on their work within these limitations. However, the service does not currently meet the Service Standard requirements, and the project will need to make some changes to satisfy the service standard.

The assessment panel’s key concerns are that the service:

  • does not meet all validated user needs
  • is not being delivered by a multidisciplinary team, and there is no service manager responsible for the service within the Highways Agency
  • is not using any agile or iterative processes and cannot be rapidly improved
  • has had no user research to inform the development

The Dart Charge service does not meet the Digital by Default service standard on the following criteria points:

  • Point 1 - Focus groups were consulted before development started, but these focused on testing the brand rather than the actual product. There was no evidence of any user research in the last 5 months, and no users have tried out the alpha service in the test environment. There was no evidence of any research to understand the user needs behind the project, nor that assisted digital service design had been sufficiently considered.
  • Point 2 - The Highways Agency team the panel met showed impressive knowledge of their particular areas of specialism, but this must be complemented by additional in-house members of the service team who bring additional skills. For example, the Highways Agency are currently relying on an external technical consultant to advise them and provide technical oversight. The technical oversight of a contractor should ideally be provided by the Highways Agency team, not by an additional contractor. There is currently no service manager in place. A service manager is due to join the service, but the role does not appear to be empowered to make any decisions about the service and appears to be largely a contract manager role. There was no evidence of a multidisciplinary team either within the supplier or within the Highways Agency. In particular, there was no evidence of a user researcher, a designer or user experience specialist, or of technical skills within the project team or available from the Highways Agency.
  • Point 6 - The service is being created subject to the requirements of the contractual arrangement between the Highways Agency and the supplier. The supplier does not appear to have any experience or capability in working in agile, iterative and user centred methods. The assessment panel recognise that the supplier has been making changes to their processes to aid remote working and collaboration but this is currently insufficient to meet this point of the service standard.
  • Point 7 - The assessment panel understand that some performance benchmarks are included in the contract with the supplier, although these have not been discussed with the performance platform team at GDS and details were not available at the assessment.
  • Point 8 - There was no evidence of user research informing the features and tasks for the next phase of development. The supplier appears to be reusing, from previous road projects in other countries, a set of user personas and a predefined tree of transaction paths through the system. There was no evidence that these have been analysed with user feedback in order to validate or amend these and inform further development.
  • Point 9 - The assessment panel saw no evidence to prove that the majority of users were able to succeed using the service first time. User research of the complete service, and not just the web based interface, would ensure that the whole service meets this requirement.
  • Point 10 - There was no evidence that research to identify users who will require assisted digital support with the digital service had taken place and estimated assisted digital transaction volumes and costs were not known. Assisted digital user personas (and their user journeys) had not been identified, meaning that there was a lack of knowledge around those users’ needs and barriers to access. There were no plans to test the service’s assisted digital support, and funding for that support was not identified.
  • Point 14 - The assessment panel saw no evidence that the service could be updated and improved on a very frequent basis. The service will be updated on a monthly release schedule and there are no automated deployments. It was also unclear what authority the service manager may have, if any, to make changes to the service.
  • Point 15 - The assessment panel were satisfied that the Highways Agency had given consideration to this requirement as part of the procurement process and had ensured that some IP would be owned by the Highways Agency. However, the lack of technical oversight in the Highways Agency means that there is limited scope for the Highways Agency to exercise their rights, for the benefit of Government, to open up parts of the codebase for reuse. For example, the .NET frontend toolkit could be developed and released to the open source community.
  • Point 19 - The assessment panel saw no sign that the service could be iterated on a frequent basis, due to the lack of a multidisciplinary team, no user research to inform the product backlog, no service manager empowered to make changes, and monthly release schedules making changes difficult.
  • Point 20 - User research had been limited to focus groups before the project development started. The service team intend to conduct a phase of user research in the summer rather than continuous user research throughout the life of the service. No user research has taken place in the last 6 months and no users have tried out the alpha service in the test environment. Any user research in the future needs to include a wide range of users such as EU drivers and HGV drivers. The service should not rely on industry groups and representative bodies to represent these user groups in lieu of user research by the service team with actual end users.
  • Points 21, 22, 23, 24 - No work has commenced on these points, although the assessment panel understand that there are KPIs in the contract with the supplier and that the Highways Agency intend to discuss these with the performance platform team.

Further Recommendations

The Highways Agency should review the Government Service Design Manual and consider the Dart Charge service in relation to each point raised above. In addition to this, the service team should consider the following recommendations:

  • Build the service consistent with the user experience of the rest of GOV.UK by using the design patterns from the Service Standard and the GOV.UK style guide. The service should have content design input from Highways Agency content designers to ensure the GOV.UK style guide is met.
  • The Highways Agency should ensure that they are granted access to any performance monitoring services used, and not rely on performance reporting from the supplier. For example, the Highways Agency should have access to the analytics for the online service.
  • The Highways Agency should consider what open standards might be used or created for this and future projects. Existing standards are not necessarily the most suitable standards to use and creating new open standards may be better for future road charging projects.
  • The absence of a technical specialist in the team that attended the assessment made it difficult to discuss some technical aspects in detail. The Highways Agency should ensure that the service is independently reviewed by a technical specialist. The assessment panel were particularly concerned with the responses to questions regarding planning for managing denial of service attacks.
  • Appropriate assisted digital support should be provided for Dart Charge users who need it, to ensure they are not excluded from the digital service. 21% of the UK adult population lacks ‘Basic Online Skills’ (Source: BBC Digital Capabilities Update, September 2013), although the exact number of Dart Charge users requiring assisted digital support may be very different. Assuming these users will be happy with offline channels is not sufficient.
  • The assessment panel would expect the Service Manager to be able to speak about the service’s assisted digital users’ personas, needs and volumes. The service would then need to evidence how those needs and volumes will be met by the assisted digital support put in place. Assisted digital user testing should confirm which channels of support are required to meet users’ needs (of which a telephone line will likely be just one).
  • The assessment panel also recommend contacting the Digital and Technology Lead for DfT who may have further advice on providing assisted digital support.

Digital by Default Service Standard criteria

Criterion  Passed    Criterion  Passed
1          No        2          No
3          Yes       4          Yes
5          Yes       6          No
7          No        8          No
9          No        10         No
11         No        12         Yes
13         Yes       14         No
15         No        16         Yes
17         Yes       18         Yes
19         No        20         No
21         No        22         No
23         No        24         No
25         Yes       26         Yes