https://dataingovernment.blog.gov.uk/immigration-health-surcharge-service-assessment/

Immigration Health Surcharge - Service Assessment

The Immigration Health Surcharge is a key measure of the Immigration Act 2014. Temporary, non-EEA migrants coming to the UK for more than six months will be required to pay this fee, in order to access the NHS, before they are granted leave to enter or further leave to remain in the UK.

Department / Agency:
Home Office

Date of Assessment:
21/1/2015

Assessment stage:
Alpha

Result of Assessment:
Pass

Lead Assessor:
M. Harrington

Service Manager:
S. Cooper

Digital Leader:
M. Parsons


Assessment Report

After consideration, the assessment panel has concluded that the Immigration Health Surcharge service is on track to meet the Digital by Default Service Standard at this stage of development. The service team has a working prototype which is helping them learn about the user needs they are trying to meet. Different designs have been tried out, and the team has established a rhythm of iterating and learning.

The assessment panel is, however, concerned about the amount of work required to meet the standard at beta within the project's challenging timeframes, the handoffs between different channels and services that users will experience, and the state of assisted digital support.

Reasons
User needs
To date, the project has been driven by policy and technical needs rather than user needs. In recent months, the service team has invested considerable effort in investigating user needs, and it is evident that this work is paying off. It is also clear that much remains to be done, both in terms of iteration and, more significantly, in reaching a representative sample of an extremely diverse audience.

The assessment panel believes that to become user centred, the team must, by definition, reach out to its audience and elevate users' needs above technical and policy constraints.

User research
The disconnect between policy and implementation is a concern for the assessment panel. In particular, a perceived need for secrecy hampered the user research portion of the project and prevented months of potential learning. The waterfall nature of policy leading to implementation, with no earlier feedback loop or involvement, is a frequently seen problem.

The team
The assessment panel was pleased to see that there is a multidisciplinary team in place and that, while development is contracted out, there is a close working relationship between the Home Office and its contractors. A user researcher and a designer were added to the team after the project started, and the team now appears to be in a better place as a result. The team has three product owners, a product manager and a service manager, which could create confusion around ownership, but the team seems to be working well despite this.

The team is starting to work in an agile way, although it was hampered by the use of RUP (Rational Unified Process) at the start of the project.

Security, Privacy, Tools and Standards
The service team is developing at a low security level and will be deploying to a higher security level. The assessment panel was happy to hear this, and thought it would be good to see the team blog about it, to encourage other parts of government and their suppliers to discuss the feasibility of this approach with their accreditors and security teams.

Service design and GOV.UK style
The service team has taken the GOV.UK style guide and started to implement the visual style, but input from a content designer is needed. This will require several iterations, with user research, to ensure each page is understood. Some browser testing has been carried out on mobile, but this needs to be formalised. The team has not yet carried out work to make the service accessible, and the assessment panel would expect this to be done before the service enters public beta.

Assisted Digital and Channel Shift
The service team identified three or four assisted digital users among the 26 users they spoke to during their testing, which is positive at this stage of development. However, very little planning has been done on how assisted digital users will be tested outside the premium service, or on how support will be free at the point of use beyond reliance on family members.

There is no non-digital way to complete this transaction, so the service should expect 100% digital take-up.

Analysis and Benchmarking
The team has iterated on the service since the original alpha designs, and it is clear that user research has led this process. The team has met with the Performance Platform team and should now focus on collecting the KPI data and instrumenting their analytics solution.

Testing with the Minister
The team has already demoed to the Minister and will continue to show iterations of the product.

Recommendations

User needs
As identified above, the team should continue their user research and identify and test with a representative sample of users; testing at premium centres alone is not representative. End-to-end testing of how the service integrates with other Home Office services is also essential, as this could be a particularly difficult experience for the user.

The team
There is a multidisciplinary team in place but there is no one with clear responsibility for assisted digital or analytics. Having specialists in these areas could help the further iteration of the product and the provision of assisted digital support.

Security, Privacy, Tools and Standards
The team has procured a vendor with an existing platform in this space. As a result, little of the product will be open-sourced. This is troubling from a lock-in perspective. The team owns the IP, but a future migration to a new supplier would essentially mean starting from scratch, rather than having a code base that they own and can have someone else improve. The assessment panel recommends that the entire team (in particular, the service manager, policy people and procurement) be familiar with the guidance on choosing technology.

Improving the service
The team should quickly fix the deployment aspect of this project. The assessment panel found no clear plan for a deployment pipeline to take code developed in a low security environment and deploy it to a high security environment in a fast, repeatable manner. The assessors were also sceptical about the need for:

mandatory downtime for application deployments. Downtime of up to an hour was mentioned; the assessment panel believes that this should be closer to seconds, if any downtime is needed at all, and that zero-downtime deployments should be the goal, particularly for an always-on, global service (a sketch of this pattern appears at the end of this section)
a rules engine to host a calculator service. The assessment panel normally encounters this in situations where deploying software to production is viewed as a risky, complicated practice, while deploying updates to rules is seen as less risky. The assessment panel believes that this is a false dichotomy: deploying software to production should be an everyday occurrence (see the sketch immediately after this list)
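
To make the rules-engine point concrete, here is a minimal sketch (in Python) of calculator rules expressed as ordinary, version-controlled, unit-tested code. The rates and the rounding rule below are illustrative assumptions, not the actual surcharge policy:

    # Sketch: surcharge rules as plain, version-controlled code rather than
    # a separately deployed rules engine. All rates and the rounding rule
    # are illustrative assumptions, not the actual surcharge policy.

    ANNUAL_RATE_PENCE = 20_000          # illustrative: £200 per year
    STUDENT_ANNUAL_RATE_PENCE = 15_000  # illustrative: £150 per year

    def surcharge_pence(months_of_leave: int, is_student: bool = False) -> int:
        """Total surcharge for a period of leave, in pence.

        Part-years round up to whole years -- purely an assumption made
        for this illustration.
        """
        if months_of_leave <= 0:
            raise ValueError("months_of_leave must be positive")
        years = -(-months_of_leave // 12)  # ceiling division
        rate = STUDENT_ANNUAL_RATE_PENCE if is_student else ANNUAL_RATE_PENCE
        return years * rate

    # Because the rules are ordinary code, a rate change is a normal,
    # unit-tested deployment rather than a special rules-engine update.
    assert surcharge_pence(12) == 20_000
    assert surcharge_pence(18, is_student=True) == 30_000

Rules expressed this way are testable and reviewable, and changes to them travel through exactly the same pipeline as the rest of the service.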

Related to the deployment concerns, the assessment panel would like to be convinced that there is scope for continuous iteration and improvement of the service. There did not appear to be an analytical capability available to the team to enable learning and improvement to happen.
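
As a sketch of the zero-downtime point above, a blue-green style release switches traffic to a new version only once it has proven itself healthy. The port, health check path and traffic-switch hook below are all hypothetical:

    # Sketch of a blue-green release step: prove the new version healthy,
    # then switch traffic atomically, so users see no downtime. The port,
    # health check path and traffic-switch hook are all hypothetical.

    import urllib.request

    def healthy(port: int) -> bool:
        """True if the candidate instance on `port` answers its health check."""
        try:
            with urllib.request.urlopen(
                f"http://localhost:{port}/healthcheck", timeout=2
            ) as response:
                return response.getcode() == 200
        except OSError:
            return False

    def release(new_port: int, switch_traffic) -> None:
        """Cut traffic over only once the new instance is proven healthy."""
        if not healthy(new_port):
            raise RuntimeError("new version unhealthy; old version keeps serving")
        switch_traffic(new_port)  # e.g. atomically reload the load balancer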

Design
The service team should look at the design patterns hackpad to ensure the service conforms to the latest approaches to applying the GOV.UK style (for example, progress bars often confuse users more than they help, so GOV.UK services avoid them unless they are shown to improve the service).

A content designer needs to work through the service and make sure that the content is in line with the GOV.UK style.

The assessment panel recommends that the service team contacts a GOV.UK proposition manager to ensure correct start pages and information will be on GOV.UK. The idea of a ‘ready reckoner’ (which the panel did not see in the assessment) would normally fall under the remit of the GOV.UK team and not the service.

An accessibility audit of the service needs to take place, and changes implemented, before going to public beta.

The assessment panel expects to see iterations of the content and UI of the service before public beta, with evidence that it is getting easier for users to complete the transaction. There must be data analytics in place, and a plan to improve the service beyond public beta and into live running.

Assisted Digital and Channel Shift
The team needs to focus on the assisted digital support for this service, particularly as the service is wholly online, meaning that inadequate support would leave users excluded from the service. Currently the only support is a premium rate phone service, which is not suitable: support must be free at the point of use for the user. As well as the usual beta requirements, the team must explain at the beta assessment what else has been learnt about assisted digital users from further user research sessions (including with charities such as Citizens Advice Bureau) and demonstrate how support has been specifically tailored to meet their needs. The team should also clarify how much assisted digital support will be required, as some of the quoted figures conflicted. The service must confirm that users who currently rely on friends and family for their assisted digital support will have a sustainable and appropriate alternative available, and the team must clarify how assisted digital support for the end-to-end user journey will be tested, measured and iterated.

Analysis and Benchmarking
Following the prototype, the team should focus on the installation and configuration of analytics to ensure that useful data is collected once the service goes into beta. In addition to collecting the mandatory KPIs for the service assessment, the team should consider what the important metrics are for the service and how these will help them to continually iterate it.
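
As a minimal sketch of what that instrumentation might look like, the endpoint and payload shape below are hypothetical placeholders, not the Performance Platform's actual API:

    # Sketch of instrumenting a single analytics event. The endpoint and
    # payload shape are hypothetical placeholders, not the Performance
    # Platform's actual API.

    import json
    import time
    import urllib.request

    ANALYTICS_URL = "https://analytics.example.gov.uk/events"  # hypothetical

    def record_event(event_type: str, **fields) -> None:
        """Send one analytics event, e.g. a completed or abandoned transaction."""
        payload = dict(fields, event_type=event_type, timestamp=int(time.time()))
        request = urllib.request.Request(
            ANALYTICS_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        urllib.request.urlopen(request)

    # Events like these would feed completion rate and digital take-up:
    # record_event("transaction_completed", channel="online")
    # record_event("transaction_abandoned", step="payment")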

Testing with the Minister
Continue sharing iterations with the Minister.

Summary
The assessment panel thought the team presented well and has passed the alpha assessment. It is clear that the service team is working in a much better way than when the project started, and the assessment panel was pleased to hear examples of user research influencing the design and flow of the service. The assessment panel does, however, think there is much to do before the beta assessment, and while the remaining work is not insurmountable, the project's demanding schedule makes it difficult. The team should follow the recommendations above and take the time required to ensure they meet all the points of the standard for beta before coming in for another assessment.


Digital by Default Service Standard criteria

Criterion  Passed    Criterion  Passed
1          Yes       2          Yes
3          Yes       4          Yes
5          Yes       6          Yes
7          Yes       8          Yes
9          Yes       10         No
11         Yes       12         Yes
13         Yes       14         Yes
15         Yes       16         Yes
17         Yes       18         Yes
19         Yes       20         Yes
21         Yes       22         Yes
23         Yes       24         Yes
25         Yes       26         Yes