Office for National Statistics Website - Self Certification
The Office for National Statistics (ONS) is the UK’s largest independent producer of official statistics and is the recognised national statistical institute for the UK. It is responsible for collecting and publishing statistics related to the economy, population and society at national, regional and local levels. It also conducts the census in England and Wales every ten years. The website is the primary channel for dissemination of these statistics.
Department / Agency:
ONS
Date of Assessment:
8 July 2015
Assessment stage:
beta
Result of Assessment:
Pass
Lead Assessor:
C. Foster
Service Manager:
M. Jukes
Digital Leader:
T. Makewell
Assessment Report
The ONS website has been reviewed against the 18 points of the Service Standard at the point of seeking to progress to a public beta of the service.
Outcome of service assessment
After consideration, the assessment panel has concluded that the ONS website demonstrates the level of progress and evidence expected against the Digital Service Standard criteria and should proceed to launch as a public beta service.
Reasons
The team provided evidence to demonstrate that the service meets all the points of the service standard for beta.
Particular areas of strength included:
- The service manager’s thorough understanding of the service and how it is meeting user needs.
- Clear evidence that the team is doing the hard work to make it simple, both with the website and also the internal publishing system.
- The service manager and team’s clear commitment to putting the user first and constantly improving the service.
- A strong commitment to open sourcing the code (with all code being placed on GitHub), as well as to open data, with every page able to be uniquely referenced and called through an API.
- The site has been developed with a strong focus on progressive enhancement and responsive design, and as such naturally performs well across browsers, settings and devices.
- The co-located team is working effectively and at pace, and using appropriate agile approaches to deliver value early and often.
- The site design is clear and well presented.
- Clear evidence of a strong relationship with, and support from, senior managers and boards.
- The team are clearly always trying to do the ‘right thing’, even if on some occasions that makes it harder for the team themselves.
Recommendations
The team has demonstrated a high level of commitment to user testing during the alpha, which is planned to continue into the beta. As well as focusing on the primary personas, it is recommended that testing also covers the other user types to a lesser extent (for example, the ‘Inquiring Citizen’). Opportunities should also be explored to do more user testing outside of lab conditions and constraints, including accessibility testing.
There have been issues with recruiting permanent staff into some key roles, but much effort and some imaginative approaches are being used to address this. It is recommended that plans for transferring skills and knowledge from the largely contractor-resourced development team to internal staff (once in place) are continued and strengthened. Clear, testable criteria should be put in place for determining the success of that skills and knowledge transfer.
As the beta will shortly be moving to a new cloud service provider, the planned full restore from backup should be treated as a priority.
The already well-understood challenges around ‘search’ and ‘9.30 publishing’ remain the largest factors that will determine the success of this project, so the assessment panel supports the plans in place for focusing on those elements during the beta phase.
There is evidence of a strong continuous delivery process in place, which (as planned) needs to be strengthened with the introduction of additional automated and manual code quality assessment approaches and tools.
The team has clearly felt that it has needed to work somewhat in isolation from other projects within the organisation in order to deliver at pace, avoiding the use of (or conversations about) tools that are not specific to the project’s needs. While this is understandable, it is recommended that time is set aside to step back slightly and see whether there are lessons that can be taken from other teams within ONS.
It is recommended that internal senior stakeholders consider whether they could do more to support the team by being more proactive and ‘going and seeing’ rather than ‘waiting and hearing’.
The decision on whether to report on the GDS Performance Platform or separately needs to be progressed and concluded, and metrics should be reported publicly as soon as possible after the beta goes live.
Digital Service Standard criteria:
| Criterion | Passed | Criterion | Passed |
| --- | --- | --- | --- |
| 1 | Yes | 2 | Yes |
| 3 | Yes | 4 | Yes |
| 5 | Yes | 6 | Yes |
| 7 | Yes | 8 | Yes |
| 9 | Yes | 10 | Yes |
| 11 | Yes | 12 | Yes |
| 13 | Yes | 14 | N/A |
| 15 | Yes | 16 | Yes |
| 17 | Yes | 18 | Yes |