https://dataingovernment.blog.gov.uk/defence-solicitor-service-assessment/

Defence Solicitor - Service Assessment

The Defence Solicitor service provides custody teams with a one-click request for a duty defence solicitor. The product creates a two-way connection between custody software systems, the Legal Aid Agency's duty rota and contracted law firms.

Department / Agency:
MOJ

Date of Assessment:
27/2/2015

Assessment stage:
Alpha review

Result of Assessment:
Pass

Lead Assessor
N. Williams

Service Manager:
W. Rowan

Digital Leader:
M. Coats

Assessment Report

Outcome of service assessment
The Defence Solicitor service has been reviewed against the 26 points of the Service Standard at the end of Alpha development. After consideration, the assessment panel have concluded that the service is on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons
The assessment panel were convinced of the need for this valuable service, which will bring significant improvement to a 12-year-old phone-based chain of communication between the multiple parties involved in assigning a defence solicitor to a detainee. The service team explained clearly that it will bring direct benefits of: time and cost savings for custody officers and the defence solicitor call centre; a more convenient system for solicitors; the removal of a bottleneck in the form of busy phone lines; and a reduced risk of inaccuracies from re-keying information. The service team anticipate further, indirect benefits such as the ability for custody officers to make better informed judgments about waiting times, and consequently for more detainees to access legal help - but it was less clear to the panel that these are measurable hypotheses given the other variables involved.

The service team have achieved a lot in a short time, are empowered and well set up with the right mix of skills, and understand and are following the approach set out in the service manual.

The team’s technical design is strong - the panel found it refreshing to see a team conceiving of their service as primarily an invisible API, favouring partnership with the vendors of existing custody suites over building a new, separate interface. The panel was impressed that the service team have already secured commitment from the 3 vendors of custody software to integrate with the new service. The team’s answers to the questions about data protection and security were also reassuring to the panel. Additionally, the adoption of open source code from the outset was excellent.
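To make the "invisible API" idea concrete, the sketch below shows what a custody system's one-click request to the duty solicitor service might look like. This is purely illustrative: the field names, values and structure are assumptions for the sake of example, not the service's actual schema.

```python
import json

def build_duty_solicitor_request(custody_ref, station_id, offence_code):
    """Assemble a hypothetical JSON body that a custody suite might POST
    to the duty solicitor API when an officer makes a one-click request.
    All field names here are illustrative assumptions, not the real schema."""
    return json.dumps({
        "custodyReference": custody_ref,  # the custody system's own case ID
        "stationId": station_id,          # identifies the station and its rota area
        "offenceCode": offence_code,      # lets the rota match solicitor specialism
        "requestType": "duty",            # a duty rota request, not an 'own' solicitor
    })

payload = build_duty_solicitor_request("CR-2015-0042", "MET-017", "TH68")
print(json.loads(payload)["requestType"])
```

Because the custody system already holds the detainee's details, an integration of this shape removes the re-keying step entirely: the data travels once, from the custody record to the rota, without a phone call in between.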

Recommendations

User research
The assessment panel felt there were gaps in the service team’s understanding of:

  • The entirety of the user needs which the service must support across a complex service and communication chain.
  • How the new system will affect the amount and type of information which currently moves among participants via the phone system, and how the experience and behaviour of the participants may be affected as a result.
  • How the benefits which users like about the existing system might be replicated and retained in the new system.

To address these gaps, the panel believes that there should be a greater focus on uncovering - through research - clear, experience-driven user needs for each participant.

(A notional example of an experience-driven user need for a custody officer might be: “As a custody officer I like to maintain some rapport with the detainee and get them to the cell in a cooperative frame of mind, if I can. My information gathering therefore needs to be quick and easy, so that I can focus on them, and help them not to get too restless or agitated.”)

By beta assessment, the service team should make sure they are able to answer the following research questions:

  • How might the new system impact on the information-gathering moment when a detainee is standing in front of the custody desk, and how can the design accommodate the pressing needs of this moment?
  • What changes might there be to the information which flows as a result of shifting to an online system and how might they affect the ultimate behaviour of the participants, eg new or unanticipated delays?
  • How will you ensure that the online system fits into call centre workflow and that no new delays are introduced?
  • If improving the reciprocal information flow between custody officer and solicitor is a key objective, how will the system ensure this?
  • How will resistance to new technology among solicitors affect their interaction with the product and the process? How should the design respond to this?
  • What costs (time, money and otherwise) will adoption of the new system impose and how will they be offset against the costs and benefits of the existing system?
  • How will the team continue to explore and bottom out the issue of local variation?
  • How will the risks of a skewed sample be mitigated in the team’s recruitment of research respondents? (As we mentioned in the meeting, a reliance on undirected selections by police contacts may lead to unintentional bias)

In the Alpha, usability research did not result in any changes to the design due to time constraints. During the next phase, the service team should make sure they plan for taking a more structured approach to research, user needs development and design iteration. The team’s research plan needs to set out the different segments of the audience, the number of respondents of different types that will be drawn from each segment, the research methods that will be used, and any regional dimension. It should also lay out plans for getting the new system to a point where it can be tested by custody officers in real situations, in the presence of detainees.

Assisted digital
The panel clarified that the service team only need to provide assisted digital support for solicitors, as the end users of the only public-facing part of the service. Duty solicitors have contracted with the Legal Aid Agency to engage electronically, so there will be no user requirement to provide assisted digital support for this group, as they will already have online skills and access. The service team will develop their research into ‘own’ (ie non-duty) solicitors during beta development to identify user needs and likely demand for support. The team also need to develop a plan to test the proposed assisted digital support to show that it meets user needs.

Design and content
While the web-based user interface for custody officers is likely to be needed only in the short term, the interfaces for the call centre and solicitors are critical parts of the service. The panel saw numerous issues with the prototypes, and will send a snagging list separately. The panel accept that it’s early days for this service, but there were many avoidable deviations from GOV.UK style (like the use of camel case) and a lack of design thinking apparent in decisions such as the order and titles of columns. As the service moves forwards, the team will need to do much more work on the design and content of all three user interfaces, testing frequently with users. The assessment panel suggest the team conduct a design review with GDS designers before the beta assessment.

Measurement
At the next assessment the panel will expect clarity about the service team’s KPIs and metrics and, where possible, benchmark figures for comparison against the current service. The panel believes that it might be beneficial to appoint a dedicated performance analyst to the team - or at least, to make sure it is part of one team member’s role to proactively analyse performance metrics in every sprint, feeding the outcomes back into each iteration. The assessment panel will also expect the service team to have worked with GDS to have a performance platform dashboard available.


Digital by Default Service Standard criteria

Criterion   Passed      Criterion   Passed
1           Yes         2           Yes
3           Yes         4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          Yes
11          Yes         12          Yes
13          Yes         14          Yes
15          Yes         16          Yes
17          Yes         18          Yes
19          Yes         20          Yes
21          Yes         22          Yes
23          Yes         24          Yes
25          Yes         26          Yes