https://dataingovernment.blog.gov.uk/all-service-assessments-and-self-certification/hmrc/company-accounts-and-tax-online-alpha/

Company Accounts and Tax Online - Alpha Assessment

Company Accounts and Tax Online will allow the smallest companies with the simplest tax affairs who are unrepresented to file their Company Tax return, accounts and computations. It will be a digital online product which will be a quicker and easier service to use and will allow the user to file to HMRC and Companies House at the same time.

Department / Agency:
HMRC

Date of Assessment:
17/3/2015

Assessment stage:
Alpha Review

Result of Assessment:
Pass

Lead Assessor:
L.Scott

Service Manager:
M. Duffield

Digital Leader:
M. Dearnley


Assessment Report

The Company Accounts and Tax Online service has been reviewed against the 26 points of the Service Standard at the end of alpha development.

Outcome of service assessment
After careful consideration the panel has concluded that the Company Accounts and Tax Online service is on track to meet the Digital by Default Service Standard at this early stage of development. The assessment panel noted several areas where the service needs to demonstrate considerable improvement before coming in for a beta assessment. These are outlined in the recommendations below.

Reasons

User needs and user research
The service allows small companies to file accounts to HMRC and Companies House and aims to encourage far more companies to file jointly with both agencies. The team had insight into user needs here - they acknowledged that while joint filing was beneficial to both users and government, some users preferred to file separately, and had good reasons for doing so.

The needs that the service is currently addressing were identified from a mixture of legislative requirement, customer feedback, and business needs. The team showed some evidence of how they used customer feedback and survey methodology to determine that a web-based service was desired by users. They have gathered evidence of dissatisfaction and distrust with the current service (HMRC fields 4,844 calls a month). The team showed some knowledge of as yet unmet user needs, some of which they expected to address shortly, while others were awaiting prioritisation in the backlog.

The alpha service demonstrated accommodates only a small subset of users (micro-entities as defined by the EU). The service team have used targeted research to find users of this type, but there are very few, and only 2 users of this type have tested the service in its current form. The service team have expanded the remit of the service and expect more companies to be able to participate in testing from end-April. Meanwhile they have been using clickable wireframes to test sections of the service in lab-based research.

The team have carried out 56 lab tests on partial aspects of the service with real users over the last 15 months. Satisfaction is being benchmarked and is reported as improving.

The team showed how they have made some changes following evidence gathered from research.

The team

This is an unusual set-up with 2 multi-disciplinary teams working together, in 2 different organisations in 2 locations. The service team explained that they are working as one team, with one empowered lead service manager at HMRC.

The team is using agile, although it is suffering from the legacy of the waterfall structure previously used at HMRC. The team hopes to work around that. They are using scrum, working in sprints with a shared backlog, and have a common codebase. Agile ceremonies are in use, eg stand-ups and retrospectives. The team introduced kick-offs to give the whole team context behind the project.

The service has been in development for over a year, which is unusual - we’d expect an alpha assessment at a much earlier stage of the project. The user research has ramped up recently and the team seemed confident that the pace of delivery of iterative improvements would increase.

Security, privacy, tools and standards

The team have addressed the security and accreditation of the service and are working closely with 2 SIROs. There are no concerns, aside from the length of data retention, which the team are still working on. The service team are using the tax platform and HMRC is the data controller. The service doesn’t set any cookies other than the one covered by the tax platform. HMRC have produced a white paper called ‘Coding in the Open’. The service team is opening up some source code and are using various open standards.

The service can be run locally and there is also a QA environment which is used during testing with users. The service has re-used the existing Companies House API, which has been pen tested.

The service has adopted the tax platform’s disaster recovery process. They have planned for outages of core Head of Duty systems. They have considered the impact of downtime on users. There is a chance that users will incur penalties for late submissions. This will be handled by a wider HMRC recovery process, where affected users will have their accounts retrospectively corrected. There is also provision to defer filing deadlines in the scenario of an unplanned, lengthy downtime.

The team described a 48-hour deployment process, from ticket to live. The team are keen to make this faster - possibly by removing the dependency on the WebOps team on the tax platform.

The alpha period of development has been unusually long. The team expect the pace of delivery to quicken over the next phase. The capability to iterate the service during this early stage of development is there.

Design

The service team has struggled to find suitable users, despite targeted attempts, and so has not been able to show evidence for users completing the service end-to-end, unaided. However, the team showed how they have tested elements of the service, and how they have made the scenarios more realistic. For example, they originally relied on dummy data for users to populate the form with. They now invite users to bring their actual past tax accounts. This has uncovered another need that the service team are aware they need to address - users are bringing inaccurate data with them from actual accounts.

The assessment panel heard that the design and user experience of the service is still very much a work in progress - completely understandable at this alpha stage. The panel recommend the service team gets in touch with other HMRC services (e.g. Inheritance Tax) to see if they can use common patterns.

Assisted digital and channel shift

During the alpha stage, the team targeted 1,800 users thought to have assisted digital needs. They telephoned people who had never filed online and took them through a questionnaire to determine the level of assisted digital support they may require. From this, the team projected that 2% of their users would have potential assisted digital needs. This does not align with wider HMRC and Companies House research, which indicates ~30% of users have assisted digital needs. The team acknowledge this, and plan to carry out further targeted research during beta. They have identified 65,000 companies with potential assisted digital needs.

Proposed assisted digital provision is by telephone, drop-in to offices and bi-annual focus groups. The geographical spread of this face-to-face support wasn’t clear during the assessment. The assisted digital support is free at point of use.

This service will replace the previous Adobe product and supports mandatory online filing, which was introduced some years ago by HMRC. Filing online is not mandatory for Companies House.

Analysis and benchmarking

The team have tagged the service with Google Analytics. There is an aspiration to use analytics to verify user research - this is dependent on users accessing and using the service during beta.

The team have thought about how to measure success. In addition to the 4 mandated KPIs, they will be measuring drop-outs (although they need expert help to work out how to do this); how long people stay on certain pages; how long the submission takes; how often people are accessing help (to improve the design).

Recommendations

User needs and user research - point 1, point 2 and point 20

Concentrate on planned user research with actual users and ensure the service is regularly tested end-to-end.

Integrate new designs that have tested well into the service and test these with users.

Continue to involve the whole team in user research and help the team understand the user needs this service will be meeting.

There are 3 full-time user researchers on the team - we’d strongly encourage that other research methods are used as well as lab-based testing, to reach many more users.

Re-consider the journey for users who cannot use the service (currently identified in 4 stages), and hand them off to the most useful place to meet their need. The list of third parties for tasks not supported by the service should be considered within the scope of this service (even if the content lives on GOV.UK).

The team - point 2

The team should consider how they can operate effectively when split across two locations, in two organisations. They should be able to demonstrate how this does not impede delivery at their beta assessment.

Recruit a full-time content designer to work with the service team. This is a content-heavy service and users have complex tasks to complete. There is a high volume of content, instructional text and micro-copy. The content designer should be working alongside the team, getting involved in user research and feeding in to the design and flow of the service.

Recruit a full-time performance analyst to work with the service team. Due to the complexity of the service, the service team need a dedicated analytics person to get analytics into shape before public beta. The multiple possible user journeys mean that a complex set of goal filters will need to be built to capture drop-off and pain points.

Assisted digital - point 1 and point 10

Carry out further research to identify users with assisted digital needs and develop proposed support to meet user needs and the assisted digital standard.

Design and content design - point 9 and point 13

The service team should work with GOV.UK teams and HMRC content teams to ensure the user journeys around the service provide the best experience for users.

The content designer (when hired) should collaborate with the content community at GDS and across government to ensure that the service adopts the style, patterns and best practice proven in comparable, successful services.

During the next assessment, the service team should be prepared to show more examples of how evidence gathered from user research and testing has informed the service design.

Work with the Inheritance Tax team on the design and user experience of the form. They encountered similar challenges and have some well-researched solutions.

More detailed front end design recommendations and observations will be sent separately.

Analytics, benchmarking and reporting - points 7, 18 and 21-24

As well as recruiting a performance analyst, the team should work with HMRC exemplar services to get a better understanding of how the use of data can inform service development.

Work with the GDS performance platform team to have a dashboard measuring performance against KPIs publicly available when you are ready for public beta. The assessment panel recommend the team measure successful submissions in addition to the KPIs.

Open standards and common government platforms - point 16

The PDFs generated in the service should be PDF/A to comply with open standards. The team should familiarise themselves with the government standards hub.

The service will need IDA for Business. The team should engage with the GDS team working on this to feed in their needs.

Make source code open and reusable - point 15

The service team should continue the work to open their source code.

Testing the end to end service - Point 17

Ensure the service has been penetration tested.

Testing with the Minister - point 26

The team have demoed the service to the Chancellor. To fully pass this criterion, before the live assessment, the minister needs to complete the service themselves, as if they were a user.


Digital by Default Service Standard criteria

Criterion  Passed    Criterion  Passed
1          Yes       2          Yes
3          Yes       4          Yes
5          Yes       6          Yes
7          Yes       8          Yes
9          No        10         Yes
11         Yes       12         Yes
13         No        14         Yes
15         Yes       16         Yes
17         Yes       18         No
19         Yes       20         Yes
21         Yes       22         Yes
23         Yes       24         Yes
25         Yes       26         Yes