
Employment Intermediaries - Alpha Assessment

The Government’s Autumn Statements of 2012 and 2013 announced a number of measures to close the tax gap in the temporary labour market.

This service will provide a standardised, digital way for intermediaries in the temporary labour market to report information to HMRC, meeting their legislative requirements and helping to close the tax gap.

Department / Agency:
HMRC

Date of Original Assessment:
2 September 2014

Date of Reassessment:
29 September 2014

Assessment stage:
Alpha review

Result of Original Assessment:
Not passed

Result of Reassessment:
Pass

Lead Assessor:
L. Scott

Service Manager:
A. Flynn

Digital Leader:
M. Dearnley


Reassessment Report

The assessment panel agreed that the Employment Intermediaries service is now in excellent shape to move into beta development. The service team put together a professional and considered reassessment, and the processes and practices they have put in place will be invaluable for other HMRC services to follow. Below is the assessment panel’s response to the steps the Employment Intermediaries service team has taken to address the 4 criteria not passed at the original assessment.

3. Evaluate what user data and information the service will be providing or storing, and address the security level, legal responsibilities, and risks associated with the service (consulting with experts where appropriate).

The panel were really pleased to see the team consulting with GDS on how to approach the security level of the service. They have addressed concerns about how such a high security rating will affect the operation of the service.

They have explained in further detail that while the data will carry a higher security rating, the service itself will carry a much lower one. This means the service can use the tax platform as long as the data is only held temporarily. Accreditors and data guardians are happy with this approach.

The service team explained that they would be using the existing bridge rather than building a new one: they have used data to determine the average file sizes (which are very small) and explained how they will chunk data from larger files.
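
As a rough sketch of what chunking might look like (this is illustrative only, not the team’s actual implementation; the 64 KB chunk size and all names here are assumptions), a larger file can be streamed in fixed-size pieces so it never has to be held in memory, or passed across the bridge, in full:

```scala
import java.io.{FileInputStream, InputStream}

object ChunkedReader {
  // Illustrative chunk size only; the service's real sizing is not stated.
  val ChunkSize: Int = 64 * 1024

  // Read the stream in fixed-size chunks, handing each chunk to a callback
  // so the whole file is never held in memory at once.
  def readChunks(in: InputStream)(handle: Array[Byte] => Unit): Unit = {
    val buffer = new Array[Byte](ChunkSize)
    var read = in.read(buffer)
    while (read != -1) {
      handle(buffer.take(read)) // pass only the bytes actually read
      read = in.read(buffer)
    }
  }

  def main(args: Array[String]): Unit = {
    val in = new FileInputStream(args(0))
    try readChunks(in)(chunk => println(s"sending chunk of ${chunk.length} bytes"))
    finally in.close()
  }
}
```

Streaming in this way keeps memory use bounded regardless of file size, which fits the team’s finding that average files are very small while the occasional larger file still needs to be handled safely.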

This is a good solution and the panel is confident that the service will be able to operate and release frequently given their description of how they will manage the higher security level data.

2. Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

14. Make sure that you have the capacity and technical flexibility to update and improve the service on a very frequent basis.

17. Be able to test the end-to-end service in an environment identical to that of the live version on all common browsers and devices. Use dummy accounts and a representative sample of users.


The service team explained in very useful detail how they will work with colleagues at Dorset House to operate the live service. The assessment panel were impressed with the effort they have gone to in setting up a clear process for their team to follow and for other teams to emulate.

They explained how they now have a fully functioning development environment with integrated testing. Their process for getting code deployed is clear and they have set up frequent discussions with colleagues in Dorset House to make sure this process is working, allowing opportunities for iteration and prioritisation.

Although the service team are currently relying on the webops team in Dorset House, they are training up their own people to understand and manage the process. While not the dedicated support team, they are providing support and capacity for the operation of the service, and are monitoring tests visually so the whole team can see the status of the service and infrastructure.

The service team articulated a clear division of responsibilities between the service team and the webops team in Dorset House, and provided evidence that everyone is aware of their role.

The assessment panel were very impressed by the high standard of work the service team achieved in such a short time to put an excellent operational model in place.


Original Assessment Report

The Employment Intermediaries service has been reviewed against the 26 points of the Service Standard at the end of Alpha development.

Outcome of service assessment

After consideration, the assessment panel have concluded that the Employment Intermediaries service is not yet on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons

The assessment panel was extremely impressed with many aspects of the service.

The assessment panel’s concerns around passing this service for the next stage of development are primarily around point 3 of the standard:

3. Evaluate what user data and information the service will be providing or storing, and address the security level, legal responsibilities, and risks associated with the service (consulting with experts where appropriate).

The statement that the data carries a significantly high security rating is extremely concerning and has implications for the entire project. The assessment panel’s concern is how this rating affects the rest of the project: developer hiring, device management, code deployment and so on.

The service team need to articulate exactly why the data has a high security rating, and how they view continued development under that regime. For example: can the team open the code? What machines would developers use? Do developers need security clearance? How would developers deploy code? While these questions can be fully addressed in beta, the assessment panel expects there will be a number of issues, and these would benefit from being identified and addressed at this stage.

The team need to demonstrate how they will be able to mitigate this risk, either by restating the risk or by ensuring that the nature of the risk doesn't adversely affect delivery of the service.

In addition, the assessment panel had some concerns around points 2, 14 and 17 of the standard:

2. Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

14. Make sure that you have the capacity and technical flexibility to update and improve the service on a very frequent basis.

17. Be able to test the end-to-end service in an environment identical to that of the live version on all common browsers and devices. Use dummy accounts and a representative sample of users.

The assessment panel were concerned about the lack of a clear deployment pipeline. Without this set-up in place and understood, beta development will be blocked.

The assessment panel were not yet clear on how development environments and continuous integration will be created and managed for beta development.

The assessment panel understand that the service team intend to use the HMRC set-up in Dorset House to create development and production environments and live support. We would like to see a clear statement that this is to be the case and how that will be managed.

The lack of a devops role (we understand recruitment is underway) is a blocker to the move to beta. As it stands, there isn’t a complete team able to deploy and manage this service.

Recommendations

1. Review the high security rating. GDS can assist with this.

2. Provide a statement on how the service team will integrate with the development and production environments in Dorset House, and explain how the service can be frequently deployed and managed via this set-up.

3. Demonstrate how the service will have a complete team who can take full responsibility for devops.

4. Demonstrate the steps that are underway to make source code publicly available (notwithstanding the high security rating comments above). It is much harder to open code after development than to build with this in mind. The service team have shown their intent to do this and are undergoing governance procedures.

Observations against other criteria

User needs and user research

The assessment panel noted how the whole service team had a deep understanding of their users, and of the top user needs that the service would address. The team’s user research and data gathering exercise to elicit these user needs was very impressive, particularly their determination (in spite of barriers) to access real users, and their passion in getting the whole team involved in research and analysis.

The assessment panel were impressed with how the team took findings from user research and used these to inform the service design. The panel noted several great examples of how this user-focused approach has produced an intuitive and well-received digital service.

The service team have a great plan in place for user research for the next stage of development, and research is embedded in their sprint cycle. The panel were pleased to hear about the team’s plans to reach all types of users, and to re-test with users from the early stages of discovery.

The team

The service team is an excellent example of a multi-disciplinary, co-located team working together in an agile way to design a service to meet user needs. The assessment panel were very impressed with how the team works, communicates and self-organises, along with how they work as a team whilst keeping a clear separation of roles.

The assessment panel really liked how the team has a clear idea of direction and focus (eg last sprint/next sprint, roadmap) as well as the clear idea they have of the value they are adding to the digital service. (Although the panel had reservations about the lack of a complete team, particularly devops, as discussed earlier).

Security, privacy and standards

The service team have engaged with all the right people and made all the necessary checks for compliance against the standards and codes outlined in the criteria. They are adopting existing HMRC technical approaches using the stack developed by teams working on the exemplars at Dorset House.

The assessment panel noted how the service team are sharing learning and knowledge between teams through regular show and tells, communication and code re-use. The panel loved the example of introductory Scala sessions for new developers, and would urge the service team to share this more widely.

The assessment panel saw how the team are complying, at least in spirit, with open document standards, and noted how findings from user research informed the decision not to mandate these.

The assessment panel were glad to see the team are planning for disaster recovery and business continuity, and that they plan to address this fully during beta. The panel really liked the team’s user-focused approach here, especially their knowledge around peak load times, crucial deadlines, and thinking about the messaging to users.

As noted earlier, the assessment panel were very concerned about the impact of the proposed high security level on the delivery of the service and the team who will run it. The assessment panel want to learn more about how the service team propose to integrate with the Operations Team in Dorset House.

Improving the service

The service team’s approach to updating the prototype in alpha was a good example of agile development. As discussed earlier, the panel were not clear how the service team plan to manage a full development environment for the beta.

Design

The assessment panel were really impressed with the user-focused design of the service. The service team have used the design patterns and style guides and are contributing back to the community with their findings from research and their design and content solutions.

The assessment panel really liked the service team’s approach to tackling the offline steps around penalties, and their vision around removing this in the future.

The assessment panel were struck by the excellent quality of the content design. This is a really good example of designers and content designers working alongside the development team to create a great experience for users.

Assisted Digital and Channel Shift

The service team have carried out considerable user research through known intermediaries and representative bodies. As they have a highly IT literate user base, they have not yet identified a need to provide assisted digital support.

They should continue to seek out potential assisted digital users throughout the beta phase.

Analysis

The service team have a great grasp of how they will measure success. They are engaging with all the right people and sharing and communicating with teams on similar services. They are recruiting a product analyst and the whole team is learning how to interpret data trends. The assessment panel were really pleased to see how well the whole team knew the data around the service, including projected usage.

Testing with the Minister

Preparations for this process are underway.

Next steps

The service should follow any recommendations made in this report and see the Government Service Design Manual for further guidance.

In order to proceed, the service will need to return to GDS for a reassessment of the criteria that were not passed.


Digital by Default Service Standard criteria

Criterion   Passed      Criterion   Passed
1           Yes         2           Yes
3           Yes         4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          Yes
11          N/A         12          Yes
13          Yes         14          Yes
15          Yes         16          Yes
17          Yes         18          Yes
19          Yes         20          Yes
21          Yes         22          Yes
23          Yes         24          Yes
25          Yes         26          Yes