https://dataingovernment.blog.gov.uk/complete-the-deputy-report/

Complete the Deputy Report - Self Certification

When a person loses mental capacity and has no appointed attorney, the Court of Protection will appoint a deputy to manage the financial affairs, health and welfare of that person (the client). A deputy can be a:

  • lay deputy, i.e. a trusted family member or friend of the client
  • professional deputy, e.g. an accountant, solicitor or a specialist from an approved charity
  • representative of the local authority

Deputies are obliged to report annually to the Office of the Public Guardian (OPG) on how they have managed their client’s finances. This is a measure to prevent fraud. Currently, the only way for deputies to submit this annual report is via a paper form.

The first iteration of the Complete the Deputy Report service will allow lay deputies to compile and submit online to OPG their annual report of how they have managed their client’s finances. The scope of the service does not yet include:

  • the Court of Protection’s process to appoint a deputy
  • the initial assessment and guidance provided by OPG to a newly-appointed deputy
  • support for professional deputies
  • processes relating to a deputy’s management of their client’s health and welfare

Note that user research into the name of the service is ongoing.

Department / Agency:
MoJ

Date of original assessment:
09/04/2015

Date of Reassessment:
28/04/2015

Assessment Stage:
Alpha

Result of Assessment:
Pass

Lead Assessor:
J. Busuttil

Service Manager:
K. Collingwood-Richardson

Digital Leader:
M. Coats


Assessment Report

Outcome of service assessment

After consideration, the assessment panel has concluded that the Complete the Deputy Report service is on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons

The service currently meets the requirements of the standard for an alpha. Areas of good performance against the standard are as follows.

User needs

The user needs mapping from discovery is clear and pragmatic, with good user personas and journey maps. It is evident that discovery was very thorough and that the product owner was closely involved.

The team has worked collaboratively on user research and used a variety of quantitative and qualitative techniques, including surveys, interviews, lab-based testing, contextual research and guerrilla testing. This research has allowed the team to discover and understand user needs well, which in turn has informed the design of the service.

Among other findings, the results of ongoing research prompted the team to redesign the navigation and bookkeeping aspects of the prototype service during the alpha phase, making them measurably more intuitive for users.

The team

Despite being spread across several locations in the UK, the delivery team has been successful in working collaboratively and remotely.

It is also notable that a business stakeholder from OPG periodically co-locates with the delivery team and is expected to become more involved soon as a deputy product owner.

Recommendations

User needs

  • Continue to research and understand in more detail the needs of OPG staff users, who receive, review, reject or accept a report submitted by a deputy, then process that information in the case management system. Their interaction with the service has a knock-on effect on the deputies’ use of the service, and in turn on the ultimate beneficiary of the service, the client.
  • Once these needs are better understood, create personas for OPG staff users and use them to inform the design of the administrative functions of the service.
  • When researching the needs of administrative users, be aware that users commonly express what they want the service to do in terms of their current processes, rather than articulating the underlying problem the service needs to solve.
  • For the beta phase, bring more focus to research by stating up front, and more clearly, the questions the team needs to answer. Incorporate these research goals into the ongoing research plan and find ways to reduce the turnaround time from research to design to implementation. This will allow development in beta to continue at pace.
  • Define and select a private beta segment, and create a pilot plan based on the challenges the team anticipates, iterating the plan in stages.

Security, privacy, tools and standards

  • Speak to technical architects in MOJ Digital about the possible reuse of the existing postcode lookup API.
  • Separate the service’s user interface into two distinct parts on separate domain names: the citizen-facing service and the administration interface for OPG staff. This will simplify security and user privilege separation, and will allow the admin interface to support only the set of web browsers OPG staff use in practice.
  • Complete the Requirements for Secure Delivery of Online Public Services (RSDOPS) assessment.
  • Implement smoke tests as part of the deployment process to the production environment.
  • Implement health checks, accessible securely by external monitoring systems, which can determine the health of the overall service and of its key components. A sketch of such an endpoint, together with a deployment smoke test, follows this list.
  • Carry out capacity and load testing on the digital service, even though traffic patterns to the service are predictable and anticipated to be of low volume. Run Distributed Denial of Service (DDoS) stress tests to exercise the service to breaking point; a load-test sketch also follows this list.
  • If not already doing so, use BrowserStack or an equivalent service to test the service’s web browser compatibility.
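
By way of illustration, a minimal sketch in Python (using Flask) of what a health check endpoint might look like. The endpoint path and component checks are assumptions for illustration, not details of the actual service.

    # health.py - a minimal health check endpoint (sketch only; the
    # component checks below are hypothetical placeholders)
    from flask import Flask, jsonify

    app = Flask(__name__)

    def check_database():
        # Placeholder: in practice, run a cheap query such as SELECT 1
        # against the service's database and return False on failure.
        return True

    def check_case_management_feed():
        # Placeholder: verify the link to OPG's case management system.
        return True

    @app.route("/health")
    def health():
        # In production, protect this endpoint (e.g. a token or an IP
        # allow-list) so only monitoring systems can reach it.
        components = {
            "database": check_database(),
            "case_management_feed": check_case_management_feed(),
        }
        healthy = all(components.values())
        # Return 200 when every component is up and 503 otherwise, so an
        # external monitoring system can alert on the status code alone.
        return jsonify(status="ok" if healthy else "degraded",
                       components=components), (200 if healthy else 503)

The deployment process could then run a smoke test such as this against production after each release (the domain name is hypothetical):

    # smoke_test.py - run against production immediately after deployment
    import sys
    import requests

    BASE_URL = "https://example-deputy-report.service.gov.uk"  # hypothetical

    response = requests.get(BASE_URL + "/health", timeout=10)
    if response.status_code != 200:
        print("Smoke test failed: /health returned", response.status_code)
        sys.exit(1)
    print("Smoke test passed")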
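
For the capacity and load testing recommendation, a sketch using Locust, one of several suitable open-source load testing tools; the tool choice, URL paths and traffic weighting are assumptions:

    # locustfile.py - run with: locust -f locustfile.py --host <service URL>
    from locust import HttpUser, task, between

    class DeputyUser(HttpUser):
        # Simulate a deputy pausing for 1-5 seconds between pages.
        wait_time = between(1, 5)

        @task(3)
        def view_overview(self):
            self.client.get("/report/overview")  # hypothetical path

        @task(1)
        def view_accounts(self):
            self.client.get("/report/accounts")  # hypothetical path

Increasing the number of simulated users in stages, well past the anticipated peak, will reveal the service’s breaking point and how it degrades.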

Improving the service

  • Be careful not to overload the staging environment with performance testing when acceptance testing is underway. Consider a dedicated acceptance environment.
  • Ensure that any member of the delivery team, technical or otherwise, is able to perform a successful release of the service to the staging and production environments.

Design

  • Consider ways to reduce the amount of guidance text needed on the overview page, and elsewhere in the digital service, ideally by making the service itself more intuitive, or perhaps by using an alternative medium to show users what to do rather than having to tell them.
  • There still appears to be a lot of jargon in the microcopy, particularly around the bookkeeping information requested. Definitions such as ‘opening balance’ will need refinement in future. Look for ways to improve the experience for users by simplifying or rewording these prompts.
  • There are several recent commercial bookkeeping products that users feel have intuitive interfaces for explaining financial transactions, and for locking transactions partially and permanently. If helpful, consider reviewing their approach.

Assisted digital and digital take-up

  • Meet with GDS assisted digital (AD) leads regularly.
  • Seek best practice from GDS on surveying user satisfaction for AD channels and frequency of surveys, and describe how this will inform performance and channel shift plans.
  • Arrange for the contact centre to log as much AD information as possible to uncover where users are on the spectrum of digital inclusion (DI); seek guidance from GDS to ensure the right things are being measured.
  • Carry out a more precise analysis of the proportions of each type of user on the spectrum of digital inclusion, and of how these proportions will change over time.
  • Determine whether OPG staff users have any AD or digital inclusion needs.

Analysis and benchmarking

  • Look into measuring satisfaction in ways beyond the ‘done’ page, perhaps with additional satisfaction surveys in the header or a pane.
  • Survey existing users to gauge channel shift percentages and needs, e.g. ask ‘Why wouldn’t you use the online service?’ and provide several options to answer.
  • Establish what successful interactions with the service look like for users of all types performing different tasks. Break them down into specific tasks for success, e.g. successfully adding a new bank account, then assess satisfaction scores by task.
  • Be able to assess satisfaction across segments of users both with and without AD needs.
  • Define anticipated conversion funnels before moving into beta; best guesses will suffice if there is no initial data to define them. Ongoing analysis will test any assumptions made and highlight areas requiring qualitative testing to uncover why users drop out at a particular stage of the funnel. A short funnel-analysis sketch follows this list.
  • Based on the breakdown of specific user tasks and conversion funnels (see earlier points), define other operational and functional KPIs that allow the current success of the service to be assessed at a glance.
  • In addition to metrics that can only be measured infrequently (e.g. results from an annual survey), identify meaningful metrics for the service that can be monitored monthly, weekly or daily. A meaningful metric is one that prompts a change in behaviour in response.
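
To illustrate the funnel analysis suggested above, a short Python sketch that computes stage-to-stage conversion from event counts; the stage names and figures are invented for illustration only:

    # funnel.py - stage-to-stage conversion for a hypothetical report funnel
    funnel = [
        ("started report", 1000),            # counts invented for illustration
        ("added a bank account", 640),
        ("completed accounts section", 480),
        ("submitted report", 420),
    ]

    for (stage, count), (next_stage, next_count) in zip(funnel, funnel[1:]):
        rate = next_count / count
        print(f"{stage} -> {next_stage}: {rate:.0%} conversion, "
              f"{count - next_count} users dropped out")

A sharp fall between two adjacent stages, for example between starting the report and adding a bank account here, is the signal to target qualitative research at that step.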

Digital by Default Service Standard criteria

Criterion  Passed    Criterion  Passed
1          Yes       2          Yes
3          Yes       4          Yes
5          Yes       6          Yes
7          Yes       8          Yes
9          Yes       10         Yes
11         Yes       12         Yes
13         Yes       14         Yes
15         Yes       16         Yes
17         Yes       18         Yes
19         Yes       20         Yes
21         Yes       22         Yes
23         Yes       24         Yes
25         Yes       26         Yes