Performance Platform - Service Assessment

The Performance Platform is an enabling service that helps departments, and specifically the service managers within them, monitor the performance of the services they are responsible for. The platform provides dashboards that present data held by departments: data from legacy IT systems, manual data from non-digital channels, and online analytics. These dashboards bring together important information about a service so its performance can be viewed and assessed.

The platform provides details on the performance of a service, covering both digital and assisted digital channels. It breaks down how citizens use a transactional service so that problems can be spotted and improvements tracked, answering questions such as: did the changes I made result in an improvement in the performance of the service?

Department / Agency:

Date of Assessment:

Assessment stage:

Result of Assessment:

Lead Assessor:
J. Hughes

Service Manager:
W. Drummond

Digital Leader:
P. Maltby

Assessment Report

The Performance Platform is seeking permission to enter into its public beta phase on GOV.UK.

Outcome of service assessment

After consideration we have concluded that the Performance Platform has shown sufficient progress and evidence of meeting the Digital by Default Service Standard criteria, and should proceed to launch as a beta service.


The product meets the requirements of the standard for public beta.

Particular areas of strong performance against the standard included:

  • The service has a programme of user research which is being used to inform decisions about new and developed features
  • The platform is consistent with the GOV.UK design patterns
  • The product has been developed and tested for accessibility
  • The product works in different browsers and on different devices
  • The team can demonstrate some examples where service managers have used data from the platform to inform decisions about the development of their services


We recommend the following priority areas for development over the next 3 months, as the service scales from providing a small number of bespoke dashboards to rolling out the platform across many services.

  • The team should revisit its analysis of the user needs the platform is seeking to meet, based on the evidence it has gathered during the alpha and further user research if necessary, so it can validate and prioritise the range of user needs the team has identified.
  • The team should develop a clear and compelling product vision. This should go beyond rolling out existing products to many services and be based on a clear articulation of the most important user needs the team is aiming to meet.
  • The team should develop a clear, structured and transparent approach to prioritising user needs and resolving potential conflicts between the needs of the different users of the platform the team has identified.
  • The team should prioritise and develop a clear roadmap for its work to allow services to self-serve where they are using standard dashboard products. This will free up the team to work towards delivering its wider product vision.
  • The team should extend its research to cover users on its ‘secondary’ list more fully.
  • The team should further develop its approach to measuring the extent to which the product is demonstrably meeting real user needs. This should include more use of the available data about user engagement with the platform (eg analytics such as the percentage of users who drill down to the tabular content, or who download data, etc).
  • The assessment panel would encourage the team to share more of its thinking about priorities, user needs and design decisions as part of its new communications strategy.

Digital by Default Service Standard criteria

Criteria   Passed      Criteria   Passed
 1         Yes          2         Yes
 3         Yes          4         Yes
 5         Yes          6         Yes
 7         Yes          8         Yes
 9         Yes         10         Yes
11         Yes         12         Yes
13         Yes         14         Yes
15         Yes         16         Yes
17         Yes         18         Yes
19         Yes         20         Yes
21         Yes         22         Yes
23         Yes         24         Yes
25         Yes         26         Yes