
https://dataingovernment.blog.gov.uk/immigration-health-surcharge-service-assessment-2/

Immigration Health Surcharge - Service Assessment

The Immigration Health Surcharge is a key measure in the Immigration Act 2014. Temporary, non-EEA migrants coming to the UK for more than six months will be required to pay this fee in order to access the NHS before they are granted leave to enter/further leave to remain in the UK.

Department / Agency:

Home Office

Date of Assessment:

9/3/2015

Assessment stage:

Beta

Result of Assessment:

Not passed

Lead Assessor:

M. Harrington

Service Manager:

S. Cooper

Digital Leader:

M. Parsons

Assessment Report

After consideration, the assessment panel have concluded that the Immigration Health Surcharge service should not be given approval to launch on the service.gov.uk domain as a beta service.

The service team have clearly made significant progress since the alpha assessment; however, the panel were concerned that much remained to be done in the short time left before the planned beta release date.

Reasons

User needs

The latest iteration of the registration journey had only been tested in one round of research. The panel feels it should be tested with more users, so that the team can be confident the majority of users succeed first time.

The Team

The team would benefit from greater familiarity with the operations guidance in the service design manual. Some of the operations responsibility is being outsourced, but there was no clear sense of who owns responsibility for keeping the service available.

Security, Privacy, Tools, and Standards

Related to the gap in the team's operations knowledge, the service team did not seem familiar with the conversations that should take place when running a digital service. The panel saw no appreciation of what a modern monitoring and alerting solution should look like, and when tested with sample scenarios, the team could not give a convincing answer about who owned which part of the service.

The Home Office had apparently issued guidance, on the day of the assessment, about how Home Office projects can publish code. The panel welcomed this news and eagerly awaited a link from the service team to code on the internet. The panel would also like the published code to be the live repository, rather than a periodic dump of the code.

The service is designed as an API (though it has not yet been exposed as one). The assessment panel hope that publishing the source code will help other services learn how to expose similar functionality in the same way. It should also provide working code for ideas that could be taken to the Standards Hub; for example, JSON Web Tokens and a delegated authorisation model.
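To illustrate one of those ideas: a JSON Web Token is just two base64url-encoded JSON documents (header and claims) plus a signature over them. The sketch below, in Python using only the standard library, shows HS256 signing and verification; the claim names and secret are invented for illustration and are not taken from the service.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWT uses unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    header = {"alg": "HS256", "typ": "JWT"}
    signing_input = ".".join(
        b64url(json.dumps(part, separators=(",", ":")).encode())
        for part in (header, payload)
    )
    sig = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

def verify_jwt(token: str, secret: bytes) -> dict:
    signing_input, _, sig = token.rpartition(".")
    expected = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(b64url(expected), sig):
        raise ValueError("bad signature")
    payload_b64 = signing_input.split(".")[1]
    padded = payload_b64 + "=" * (-len(payload_b64) % 4)
    return json.loads(base64.urlsafe_b64decode(padded))

# Hypothetical claims for an applicant authorised to pay the surcharge
token = make_jwt({"sub": "applicant-123", "scope": "surcharge:pay"}, b"shared-secret")
claims = verify_jwt(token, b"shared-secret")
```

A production service would use a maintained library rather than hand-rolled crypto, but the structure above is all a JWT is.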

Improving the Service

A digital service designed for users around the globe should not require any downtime for deployments. The proposed mechanism did not appear well tested (custom scripts per environment), and no configuration management appears to be in use. Consequently, deployments would not be regular, automated, low-risk activities; they would require manual intervention and babysitting. Blue-green deployments are a standard, proven way of achieving zero-downtime releases across the industry.
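The essence of blue-green deployment is that two identical environments exist, the new release goes to the idle one, and traffic is only switched once it passes health checks. This minimal Python sketch (environment names and versions are illustrative, not the service's actual setup) captures that cut-over logic:

```python
# Two identical environments; 'active' is the one receiving live traffic.
environments = {
    "blue": {"version": "1.4.2", "healthy": True},
    "green": {"version": "1.4.2", "healthy": True},
}
active = "blue"

def deploy(new_version: str) -> str:
    """Deploy to the idle environment, then switch traffic atomically."""
    global active
    idle = "green" if active == "blue" else "blue"
    environments[idle]["version"] = new_version  # release to the idle side
    if not environments[idle]["healthy"]:        # smoke-test before switching
        raise RuntimeError("health check failed; live traffic untouched")
    active = idle                                # instant cut-over, no downtime
    return active

new_active = deploy("1.5.0")  # traffic now flows to the freshly deployed side
```

Because the previous environment is left intact, rollback is the same cheap pointer flip in reverse.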

Assisted Digital and Channel Shift

The panel found that there has been a good amount of progress on Assisted Digital (AD) since the alpha assessment and the team presented a much better understanding of this at the beta assessment. However, the panel had some concerns that the support is based around existing structures rather than meeting user needs.

Analysis and Benchmarking

The team have a lead in place for analytics; however, the service has not yet been instrumented, and more time needs to be spent understanding what good looks like and how it will be measured.

Recommendations

User needs

There has been a considerable amount of user research since the alpha assessment, and user needs are becoming better understood. The team should continue to test and iterate, especially the latest journey for the standalone service. While not a requirement to reach beta, the service may benefit from another user researcher to help support the team.

The Team

To meet point 2 of the Service Standard:

  • The service manager and team should ensure they are familiar with the operations section of the service manual, and should not underestimate the amount of work that is required.
  • The team should have a clear understanding of the monitoring requirements, taking the monitoring stories as a starting point, and be able to clearly articulate who is responsible for what and what actions would be taken if there is an issue with any part of the service.

Security, Privacy, Tools, and Standards

To meet point 5 of the Service Standard:

  • This point is closely linked to point 2. The team should have suitable tools in place to monitor the service, so they know when there is an issue and what it is. Google Analytics is not a suitable tool for monitoring the complete performance of the service.
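As a sketch of what such monitoring might look like, each dependency can carry a health probe and a named owner, so "who is responsible for what" is answered by the check itself. The component names and owners below are invented for illustration; they are not taken from the service.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Check:
    name: str
    owner: str                 # the team paged when this check fails
    probe: Callable[[], bool]  # returns True when the component is healthy

def run_checks(checks: list) -> dict:
    """Run every probe; the service is only 'up' if all dependencies are up."""
    results = {c.name: c.probe() for c in checks}
    failing = [c for c in checks if not results[c.name]]
    return {
        "healthy": not failing,
        "page": sorted({c.owner for c in failing}),  # unambiguous ownership
        "results": results,
    }

# Hypothetical components of a payment service
checks = [
    Check("payment-gateway", "supplier-ops", lambda: True),
    Check("application-db", "home-office-webops", lambda: False),
]
status = run_checks(checks)
```

In a real setup the probes would hit health endpoints and the result would feed an alerting system, but the principle is the same: every component has a probe, and every probe has an owner.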

Improving the Service

To meet point 14 of the Service Standard:

  • Be able to evidence that taking down a worldwide service for up to two hours to make a database change does not impact users.
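One common way to avoid that two-hour outage altogether is the expand/contract migration pattern: every step is backwards-compatible, so old and new code can run against the database while the change rolls out. The sketch below demonstrates it with Python's built-in sqlite3; the table and column names are hypothetical, not the service's actual schema.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE applications (id INTEGER PRIMARY KEY, fee_pounds INTEGER)")
db.execute("INSERT INTO applications (fee_pounds) VALUES (200)")

# 1. Expand: add the new column. Existing code ignores it, so no downtime.
db.execute("ALTER TABLE applications ADD COLUMN fee_pence INTEGER")

# 2. Backfill: migrate data in place while the service keeps running
#    (new code writes both columns during this window).
db.execute(
    "UPDATE applications SET fee_pence = fee_pounds * 100 "
    "WHERE fee_pence IS NULL"
)

# 3. Contract: once all readers use fee_pence, drop the old column in a
#    later, equally safe step (e.g. ALTER TABLE ... DROP COLUMN fee_pounds).

fee = db.execute("SELECT fee_pence FROM applications").fetchone()[0]
```

Each step can be deployed and verified independently, which is what makes the change low-risk rather than a scheduled outage.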

Once in beta the team should:

  • Continue to evaluate whether having a single database is the right solution for a service such as this.
  • Take the time to automate their deployment.

Assisted Digital and Channel Shift

To meet point 10 of the Service Standard the team should:

  • Carry out more research to understand the AD support users receive from providers other than the department’s own customer support services. The service has an understanding of friends and family support, but should focus on support groups, including those mentioned at the assessment that have not yet been included in research.
  • Design a model of support based on user needs rather than around existing channels.

Once in beta the team should:

  • Confirm that the AD support could be scaled to meet demand for live. This should be reviewed throughout beta to include an understanding of costs and whether all channels of support (via all providers) are sustainable.

Analysis and Benchmarking

To meet points 7, 18, 21, 22, 23 and 24 of the Service Standard the team should:

  • Install and configure analytics. This is on the backlog but not yet done.
  • Consider other KPIs which will give insight into how the service is performing, in addition to those mandated by the Service Standard.
  • For the beta, which is a standalone solution, put a done page in place to collect user satisfaction.
  • Have a Performance Platform dashboard.

Design

A list of design recommendations has been provided separately. The assessment panel recommend further testing on mobile devices, especially those popular in countries with high volumes of applications.

Summary

Though the service did not pass the assessment, there were many positives to take away, and the recommendations in this report are within the capability of the team to complete. The panel were pleased to see the focus on user needs and how this has continued to evolve since the alpha assessment. There has been a significant amount of research, and it is clear to see the benefit this is having on the service and the team.

The team structure and the agile way of working has been iterated and the panel were pleased to hear the three key learnings from the alpha were: co-location, user research standups and better sharing of stories, ideas, walls, etc. The panel were also pleased to hear how the team have challenged the accepted way of doing things, especially with regards to copy and terminology to provide a better service for the user. With regards to AD, the panel were pleased that scripts had been prepared for the call centres, that plans were in place to measure volumes and gain insights about the support provided and that the service were investigating how international call centres may be able to offer AD support. Ultimately, the panel felt the service came in for assessment a little too early but expect it to be able to meet the points of beta soon.

Note: subsequent to this assessment, the Home Office Digital Leader has confirmed that he, the programme SIRO and the SRO have approved the service, and that the Home Office takes full responsibility for launch.

The Home Office Digital Leader has also confirmed that the Immigration Health Surcharge project will:

  •  pass a service standard reassessment (including integrating the customer journey) by the end of July
  •  continue to iterate the service based on user research
  •  provide a formal monthly progress update, starting with a fully resourced plan and timeline to address all service assessment recommendations
  •  have alpha branding on the service until a reassessment is passed (at which point it will be branded a beta service)

The service has been allowed to be made public, as an alpha service.

Digital by Default Service Standard criteria

Criteria  Passed    Criteria  Passed
1         Yes       2         No
3         Yes       4         Yes
5         No        6         Yes
7         No        8         Yes
9         Yes       10        No
11        N/A       12        Yes
13        Yes       14        No
15        Yes       16        Yes
17        Yes       18        No
19        Yes       20        Yes
21        No        22        No
23        No        24        No
25        Yes       26        Yes