Lasting Power of Attorney – Service Assessment
The service aims to make it simpler and faster to apply for a lasting power of attorney by guiding users through the application process.
https://www.gov.uk/transformation/lasting-power-of-attorney
Department / Agency: MOJ / Office of the Public Guardian
Date of Assessment: 28/4/2014
Moving to: Live
Result: Pass
Lead Assessor: M. Sheldon
Service Manager: K. Collingwood-Richardson
Digital Leader: M. Coats
Assessment report
The lasting power of attorney service is seeking permission to be branded a live Digital by Default service on the service.gov.uk domain.
Outcome of service assessment
After completing our assessment, GDS can confirm that the service has shown sufficient evidence of meeting the Digital by Default Service Standard and can go live as a Digital by Default service on GOV.UK once the final preparations the lasting power of attorney team discussed with us are complete.
Reasons
The service was assessed against all 26 points of the Digital by Default Service Standard and has met every one.
The lasting power of attorney service team have shown that they are:
- researching and understanding users' needs, improving the service based on evidence, and making decisions informed by data.
- clearly and concisely capturing user needs and prioritising, developing and improving them based on feedback from user testing.
- a skilled multi-disciplinary team working in an agile way, using appropriate methodologies, tools, techniques and processes to deliver working software early and often.
- working with the CLAS (CESG Listed Adviser Scheme) consultant, Information Asset Owner (IAO) and Senior Information Risk Owner (SIRO) to assess the associated risks and legal responsibilities around the data held.
- applying proportionate security that does not degrade the service experience for its users.
- choosing the right tools and systems on which to host and operate the service.
- putting automated processes in place for scaling the service and responding to incidents.
- using multiple environments to allow full end-to-end testing of the service.
- planning so that improvements to the service can be released daily.
- making source code open and reusable, publishing, sharing and reusing it with other departments. The team provided convincing evidence on why some source code is not open; eg authentication libraries. They were also able to provide evidence of how they will continue to review that position.
- collecting, measuring and analysing performance data to help make improvements to the service.
- benchmarking and comparing the performance data against the paper-based service.
- concentrating on the KPIs of the standard and reporting these through the Performance Platform. The service has many states of "done", is completed over a longer than usual period of time, and requires "wet signatures". The team are working with GDS to measure the correct completion rate, and with wider government to change the legislation.
- working with 3 external partners to offer assisted digital support, either by phone or in person.
- planning for market and channel shift and working with intermediaries to promote the service.
GDS were particularly pleased to hear about the way the team works when not co-located: the Product Owner spends the majority of her time with the contact centre, collecting and collating feedback from users, and uses video chat, instant messaging and group collaboration tools to remain in contact with the wider team.
The Service Manager and Product Owner have the correct division of responsibilities. The Service Manager is senior, skilled and empowered to make decisions within the Office of the Public Guardian that allow the service and team to improve. She removes blockers where necessary and shields the team from outside influence. The Product Owner for the service has day-to-day responsibility for leading the team and can make product decisions without the need for approval by the Service Manager or senior management. This division gives the product team the autonomy and space to self-organise so that they can focus on delivering a service so good that users will prefer to use it.
Recommendations
GDS recommend that the service team:
- concentrate on filling the skills gaps identified in the team as soon as possible. These were:
  - Digital analyst
  - Designer
  - User researcher
- continue risk assessment of potential attack vectors and address them with appropriate security measures.
- continue improving tooling and automation once the planned move to the new hosting provider is complete, taking full advantage of the new provider's offering over the current provider's.
- work with GDS to help showcase their code across wider government.
- review the risks identified that informed the decisions not to release all source code.
- ensure that any publicly offered APIs are consistent with developing standards.
- improve the layout and presentation of the service emails to aid readability.
- continue reviewing and improving the design patterns used throughout the service. The lack of a designer in the team has introduced minor inconsistencies and design stagnation since the previous assessment.
- put a plan in place to allow A/B and multivariate testing using the govuk_frontend_toolkit (a sketch follows this list).
- upgrade to the Google Analytics Universal Analytics tracking code, enable IP anonymisation, and use goals and funnels to model user journeys.
- work with the Performance Platform team to collect user satisfaction and feedback at the right points throughout the service.
- work with the GOV.UK Content team to improve the current service start page on GOV.UK.
- test the live service with the current minister, Simon Hughes.
- ensure the assisted digital support is sustainable following the expiry of the agreed pilot with the 3 external partners.
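As a rough illustration of the two analytics recommendations above, here is a minimal sketch (in TypeScript, compiled to browser JavaScript) of how the Universal Analytics tracking code with IP anonymisation could sit alongside a multivariate test built on the govuk_frontend_toolkit's GOVUK.MultivariateTest module. The property ID, custom dimension slot, element selector, test name and cohort copy are all placeholders rather than anything from the LPA service, and the option names reflect our reading of the toolkit's multivariate-test module, not the team's actual setup.

```typescript
// Globals provided by analytics.js and the govuk_frontend_toolkit,
// declared here so the sketch compiles on its own.
declare function ga(...args: unknown[]): void;
declare const GOVUK: {
  MultivariateTest: new (options: {
    el: string;                                      // element the test swaps content in
    name: string;                                    // test name reported to analytics
    customDimensionIndex: number;                    // Universal Analytics custom dimension slot
    cohorts: { [cohort: string]: { html: string } };
  }) => unknown;
};

// Universal Analytics tracker with IP anonymisation switched on.
ga('create', 'UA-XXXXXXXX-Y', 'auto');               // placeholder property ID
ga('set', 'anonymizeIp', true);                      // drop the last octet of visitor IPs
ga('send', 'pageview');

// Split visitors into cohorts and record each visitor's cohort in a
// custom dimension, so goals and funnels can be segmented by variant.
new GOVUK.MultivariateTest({
  el: '.js-start-button',                            // hypothetical element under test
  name: 'lpa_start_button_copy',                     // hypothetical test name
  customDimensionIndex: 1,                           // placeholder dimension slot
  cohorts: {
    original: { html: 'Start your application' },
    variant: { html: 'Apply online' }
  }
});
```

Recording the cohort in a custom dimension is what would let the goals and funnels recommended above be compared per variant.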
Next Steps
This service has been given approval to go live as a Digital by Default service.
Summary
In summary, GDS are pleased to report that the service meets the Digital by Default Service Standard. The work carried out during the alpha and beta started with users' needs. The team already have a plan in place to run regular user research on the live service and measure its performance so that it can be improved further.