
https://dataingovernment.blog.gov.uk/dart-charge-service-assessment-2/

Dart Charge - Service Assessment

Dart Charge is the new ‘free-flow’ charging scheme being developed for the Dartford Crossing. Users of the Dartford-Thurrock River Crossing, which carries the M25 over the Thames, are required to pay a charge to manage congestion of the bridge and tunnels. Dart Charge is a new remote payment system that will reduce congestion and ease traffic flow at the crossing.

Department / Agency:
DFT / HA

Date of Reassessment:
19/8/2014

Reassessment stage:
Alpha re-review

Result of Reassessment:
Not passed

Lead Assessor:
S. Edwards

Service Manager:
I. Lewis

Digital Leader:
B. Etheridge


Assessment Report

After reassessment of the service, the assessment panel have concluded that the Dart Charge service is not yet on track to meet the Digital by Default Service Standard at this stage of development.

Reasons

The service development is not following an agile development cycle through alpha/beta/live stages, but the service was nevertheless reassessed at the alpha stage. The project is being delivered for the Highways Agency by a supplier with experience of similar road charging schemes in several countries.

Unfortunately the service does not currently meet the Service Standard requirements, and significant changes to the project will be needed to satisfy the standard. The panel were disappointed that few of the recommendations from the previous assessment have been acted upon, and would urge the service team to review the previous report and its recommendations again.

The assessment panel’s key concerns are that the service:

  • does not understand the user needs that the service should meet, which has led to a confusing online experience for users;
  • has undertaken limited user research, with only 22 participants and no skilled user researcher involved. The user research has validated only parts of the user journey in isolation, and there is no detailed plan for further user research;
  • has provided no evidence that the service as a whole is simple and intuitive enough for users to succeed first time, unaided. For example, instead of doing the hard work to make things easy for users, the service has front-loaded the complexity of the charging scheme into the start of the user journey (account creation). The assessment panel were left confused by the 8 different user account options;
  • is not being delivered by a multidisciplinary team, and the service manager is responsible for covering multiple roles including user research, content design and product management. In our previous report we recommended that “It is critical that the current Highways Agency staff are complemented by further staff who can bring the skills necessary to create a multidisciplinary team”, but this has yet to be acted upon;
  • has shown some flexibility in changing and adapting its processes, but those processes appear to be ill-defined and may be unsuited to rapidly improving a service; and
  • appears to have insufficient understanding of the Digital by Default assessment process and of the significance of the Alpha and Beta reviews to the overall project milestones.

Each of the 16 points of the service standard against which Dart Charge was reassessed is detailed below.

Point 1 - Understand user needs. Research to develop a deep knowledge of who the service users are and what that means for digital and assisted digital service design.

Result: Not passed.

The service does not appear to have any defined user needs or user stories, and has an action/task-based backlog rather than a backlog based on user needs and user stories.

Since the previous alpha review the service team have conducted guerrilla user research with 22 participants. This research was conducted with passers-by in a shopping centre near the supplier’s call centre in Leeds. It tested small parts of the prototype user journey and did not appear to inform or shape the user needs for the service as a whole. The research provided some insights into user interface design elements but nothing broader, and so is insufficient evidence to pass service standard point 9 (see below). There appears to have been no user research with those living locally to the crossing, who appear to be one of the key user groups.

The assessment panel found the experience of using the service as a whole confusing, and saw no evidence that users would not face the same difficulties.

Point 2 - Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

Result: Not passed.

Since the first alpha assessment, a service manager has been appointed from the existing Highways Agency team. The service manager does not appear to be empowered to make decisions about the whole service, and it was unclear how much influence they can have within the current processes (see point 6 below).

Our comments from the first alpha review remain valid:

“There was limited evidence of a multidisciplinary team either within the supplier or within the Highways Agency. In particular there was no evidence of a user researcher, a designer or user experience specialist, and no evidence of technical skills within the project team, or available from the Highways Agency. The Highways Agency staff we met showed impressive knowledge of their particular areas of specialism but this must be complemented by additional in-house members of the service team who bring additional skills. For example, the Highways Agency are currently relying on an external technical consultant to advise them and provide technical oversight. The technical oversight of a contractor should ideally be provided by Highways Agency staff and not an additional contractor.”

Furthermore, the current project team is unsustainable, as it will change in October when the project team hands over to a business-as-usual team.

Point 6 - Build the service using the agile, iterative and user-centred methods set out in the manual

Result: Passed.

The service is being created within the framework of the contract between the Highways Agency and the supplier. The supplier does not appear to have any experience or capability in working in an agile, iterative and user-centred way, and stated that they follow a civil engineering/waterfall approach to delivery. Nevertheless, we recognise that the supplier and the Highways Agency have made changes to their processes, and we would urge them to continue these improvements. We would also urge the team to consider using online collaborative tools in addition to email and telephone communication.

Point 8 - Analyse the prototype service’s success, and translate user feedback into features and tasks for the next phase of development

Result: Not passed.

There was insufficient evidence of user research informing the features and tasks for the next phase of development. The user research conducted so far has resulted in a number of simple actions to improve user interface elements. The lack of resources for user research makes this unsustainable.

Point 9 - Create a service that is simple and intuitive enough that users succeed first time, unaided

Result: Not passed.

After the first alpha review the assessment panel recommended: “User research of the complete service, and not just the web based interface, would ensure that the whole service meets this requirement.”

The user research undertaken did not provide sufficient evidence that the majority of users were able to succeed at using the service first time. Instead of making things easy for users, the service has front-loaded the complexity of the charging scheme into the user journey for creating an account or paying for a journey. There is no clear route for a user to understand which of the accounts is right for them, and each account has an entirely separate journey. For example, a personal account can’t be created if a user does not own a car (although they will not know this until they’ve entered all of their personal details); instead they must choose the Pay As You Go or the SMS PAYG journey. In addition, if a payment method fails or a user is unable to pay at that time, no account is created and the user loses the personal and vehicle data they have entered into the service.

This problem is fundamental to the design of the service and cannot be mitigated by expecting a user to read guidance content on GOV.UK before using the service. Instead, the service must be designed to meet user needs simply and intuitively.

Point 10 - Put appropriate assisted digital support in place that’s aimed towards those who genuinely need it.

Result: Not passed.

The service team had not researched the needs of the service’s assisted digital users, so these could not inform the design of appropriate support. There were no plans to test any assisted digital support, and funding for that support had not been identified. The service manager could not describe the service’s assisted digital user personas, needs or volumes.

The service team should carry out research to understand their users’ assisted digital support needs, their numbers, and where they currently receive assisted digital support (for example from charities, friends or family). Research should confirm which channels of support are required to meet their needs (of which telephone support may be just one).

The service team should identify assisted digital user personas relevant to their service, and plot those personas’ end-to-end user journeys through the service. The service team should be able to explain how the needs and numbers of their assisted digital users will be met by the support they plan to put in place. The service team should consider how their assisted digital support would be user tested during the beta.

Point 14 - Make sure that you have the capacity and technical flexibility to update and improve the service on a very frequent basis

Result: Passed.

The service team are now making twice-weekly deployments, and the service manager gave some examples of service changes being made within 2 or 3 days. However, the service manager was a little unclear on the process for making changes, and we would urge the team to consider how to clarify and streamline this process.

Point 15 - Make all new source code open and reusable, and publish it under appropriate licences (or provide a convincing explanation as to why this cannot be done for specific subsets of the source code)

Result: Not passed.

Our view from the first alpha assessment was:

“We were satisfied that the Highways Agency had given consideration to this requirement as part of the procurement process and had ensured that some IP would be owned by the Highways Agency. However, the lack of technical oversight in the Highways Agency means that there is limited scope for the Highways Agency to exercise their rights, for the benefit of Government, to open up parts of the codebase for reuse. For example, the .NET frontend toolkit could be developed and released to the open source community.”

The assessment panel were disappointed that no progress has been made on this point, and that progress remains dependent upon the priority the supplier gives to it.

Point 19 - Build a service that can be iterated on a frequent basis and make sure resources are in place to do so

Result: Not passed.

We saw some examples of how the service had been iterated quickly, but due to the lack of a multidisciplinary team, and in particular the absence of a dedicated user researcher, this pace is unsustainable.

Point 20 - Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users.

Result: Not passed.

The service team has undertaken limited guerrilla user research without the input of a dedicated or skilled user researcher. There is no fixed plan for continuous user research, and no plans to recruit a dedicated user researcher for the project. There is an offer from the DVLA to provide some user research resource, but it is unclear when this will be available or how much user research it will cover. We would reiterate the recommendation we made in the first alpha review:

“Any user research in the future needs to include a wide range of users such as EU drivers and HGV drivers. The service should not rely on industry groups and representative bodies to represent these user groups in lieu of user research by the service team with actual end users.”

Points 7, 21, 22, 23, 24 - Establishing KPIs and reporting to the performance platform.

Result: Passed.

The service manager has met the GDS Performance Platform team, established KPIs, and made preliminary plans for reporting these to the Performance Platform.

Further recommendations

The Highways Agency should review the Government Service Design Manual and consider the Dart Charge service in relation to each point raised above. In addition to the further recommendations made in the first alpha review, the service team should also consider the following points:

  1. Deployment requires manually configuring services, and we would urge the development team to automate these configurations. The development team should also conduct load testing on the service rather than relying on performance figures carried over from previous experience and projects (a minimal sketch of such a test follows this list).
  2. Further planning should be given to the disaster recovery plan, and this should be completed before the service enters the beta phase. There is no provision for DDoS attacks, as we understand DDoS protection is not part of the contract. The service should consider whether it might be cost-effective to put appropriate DDoS protection in place rather than relying on taking down the service.
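
To illustrate the load testing recommendation above, here is a minimal sketch of a concurrent smoke test in Python. It is illustrative only: the endpoint URL, worker count and request count are hypothetical placeholders, not values taken from the Dart Charge service, and a dedicated load testing tool would be more appropriate for sustained testing.

    # Minimal load-test sketch (illustrative only): sends concurrent GET
    # requests to an endpoint and reports success rate and latency.
    # URL, WORKERS and REQUESTS are hypothetical placeholders.
    import statistics
    import time
    import urllib.error
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    URL = "https://staging.example.gov.uk/start"  # hypothetical endpoint
    WORKERS = 20    # concurrent users to simulate
    REQUESTS = 200  # total requests to send

    def timed_get(url):
        """Return the response time in seconds, or None on failure."""
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                response.read()
            return time.perf_counter() - start
        except (urllib.error.URLError, OSError):
            return None

    def main():
        with ThreadPoolExecutor(max_workers=WORKERS) as pool:
            results = list(pool.map(timed_get, [URL] * REQUESTS))
        ok = sorted(t for t in results if t is not None)
        print(f"{len(ok)}/{REQUESTS} requests succeeded")
        if ok:
            print(f"median latency: {statistics.median(ok):.3f}s")
            print(f"95th percentile: {ok[int(len(ok) * 0.95) - 1]:.3f}s")

    if __name__ == "__main__":
        main()

Even a simple baseline like this, run against a staging environment, gives the team measured figures to improve on instead of assumptions inherited from previous projects.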

Digital by Default Service Standard criteria

Criteria  Passed    Criteria  Passed
1         No        2         No
3         Yes       4         Yes
5         Yes       6         Yes
7         Yes       8         No
9         No        10        No
11        No        12        Yes
13        Yes       14        Yes
15        No        16        Yes
17        Yes       18        Yes
19        No        20        No
21        Yes       22        Yes
23        Yes       24        Yes
25        Yes       26        Yes