https://dataingovernment.blog.gov.uk/all-service-assessments-and-self-certification/hmrc/employment-intermediaries-beta/
Employment Intermediaries - Beta Assessment
This is a new digital service that will enable intermediaries (or Agents acting on their behalf) to comply with legislation by uploading a report to HMRC giving details of the workers they supply, and the payments made to those workers, where they did not operate Pay As You Earn. The intermediary will have to upload and send this report to HMRC once every 3 months using the service.
Department / Agency:
HMRC
Date of Assessment:
24 March 2015
Assessment stage:
Beta
Result of Assessment:
Pass
Lead Assessor:
A. Keene
Service Manager:
A. Flynn
Digital Leader:
M. Dearnley
Assessment Report
Outcome of service assessment
After consideration, the assessment panel have concluded that the Employment Intermediaries service has shown sufficient progress and evidence of meeting the Digital by Default Service Standard criteria and should proceed to launch as a Beta service on a service.gov.uk domain.
Reasons
The panel felt that the service team demonstrated a commitment to placing continual user research at the heart of the service's development. The team had evolved their research methods from emailed online questionnaires, through Skype interviews and attending trade expos, to face-to-face user testing. The service team also showed an understanding of the advantages and disadvantages of the different methods of interacting with their users. They had conducted extensive and, in the panel's opinion, sensible research to uncover users with assisted digital (AD) needs. The panel agreed with the team that, currently, there is no identified need for AD support for this service.
Throughout the assessment, the panel found that the service team showed a depth of understanding of their users' needs, based on their research.
The panel also found the Employment Intermediaries team to be a strong multi-disciplinary team, headed up by an empowered Service Manager. The panel was impressed to see that the team coped well with the loss of their User Researcher, with the role taken on by their User Experience Designer. However, the panel do expect the team to have a replacement User Researcher before they return for a Live assessment.
The service team is well supported by the tax platform - they showed an understanding of how they use and interact with the platform and have also contributed back to it, in the form of PDF rendering.
The team have shown the panel how they have deployed frequently, at least weekly, since launching the private Beta. In the assessment they demonstrated the evolution of the service from the initial Alpha sketches, through the private Beta minimum viable product, to the current service.
The team presented a credible plan for the evolution of the service through public Beta through to Live.
The panel also found that the team demonstrated a sensible approach to the management of the service data, their key stakeholders and the attendant risks, in line with government practice. The panel were glad to see that improvements had been made since Alpha.
The team have immediate plans to open up the validation engine code to support the external software providers, who will in turn support the service's users, with further code to follow.
The team also demonstrated the use of open source tools and open standards to avoid vendor lock-in and to avoid forcing software choices on their users.
Recommendations
The assessment panel believe that the service team need to plan for what happens next with Government Gateway.
The service team need to ensure they have a User Researcher embedded in the team before they return for the Live assessment.
The assessment panel also advise that the team start releasing code into open public repositories for re-use, ideally starting with the validation engine code.
The panel believe that the service team should investigate whether there are alternative ways to satisfy the legislation. The current user journey places a non-trivial reporting burden on organisations. Are there alternative sources of this data, or could the data be collected and collated from several existing sources? If not, could this data be collected through another existing reporting system?
The team need to be able to provide a cost per transaction for the service. To do this, the service team need to work with, and encourage, colleagues in HMRC finance to develop a method of apportioning costs from shared services (e.g. the tax platform).
The assessment panel recommend HMRC move their pre-live service away from the internal VPN to behind a password. This would avoid the trouble the service team had with the demo during the assessment, and should also make testing with users in the field easier.
Summary
The panel want to take this opportunity to thank the service team for their hard work in developing the service and for attending the service assessment. It is clear that there is a passion for user research and continual improvement within the team, and the team are striving to learn more about their users and to keep developing their skills in service development.
Digital by Default Service Standard criteria
Criteria | Passed | Criteria | Passed |
1 | Yes | 2 | Yes |
3 | Yes | 4 | Yes |
5 | Yes | 6 | Yes |
7 | Yes | 8 | Yes |
9 | Yes | 10 | Yes |
11 | N/A | 12 | Yes |
13 | Yes | 14 | Yes |
15 | Yes | 16 | Yes |
17 | Yes | 18 | Yes |
19 | Yes | 20 | Yes |
21 | Yes | 22 | Yes |
23 | Yes | 24 | Yes |
25 | Yes | 26 | Yes |