
CAP Information service - Service Assessment

The Common Agricultural Policy (CAP) delivers EU support to rural areas and aims to achieve environmental, social and economic benefits. The policy is updated roughly every 7 years; the current update is underway, with the new policy expected to be in place by 1 January 2015.

The CAP delivery programme has been set up by Defra to look at the best way to provide IT systems and processes that support delivery of the Pillar I schemes (agricultural subsidies to support production) and the Pillar II schemes (programmes to promote rural development). It will replace the existing internet and paper based application processes with a user-focused digital system in which customers will be able to apply for and manage their applications online, either themselves or via an appointed intermediary.

The system will be a long-term, efficient solution capable of supporting more than one cycle of CAP policy.

https://capreform.blog.gov.uk/

https://www.gov.uk/transformation/rural-support

Department / Agency:
Defra / RPA

Date of Assessment:
25/06/2014

Moving to:
Beta

Result of Assessment:
Pass

Lead Assessor:
M. O’Neill

Service Manager:
G. Portman

Digital Leader:
J. Pierce


Assessment Report

The CAP information service has been reviewed against the 26 points of the Service Standard at the end of the Alpha development.

Outcome of service assessment

After consideration, the assessment panel have concluded that the CAP information service is on track to meet the Digital by Default Service Standard at this early stage of development.

Reasons

The alpha service passes 24 of the 26 criteria at the level appropriate for an alpha service.

The service team set out how the alpha had been built around user feedback and how it reflected ongoing feedback from user testing.

The phase of work presented at the assessment is important to the wider programme, in particular because data quality will be a core dependency of that programme and is needed to ensure auditable compliance with EU regulation.

The service team were able to set out their approach to delivery and how it works across teams.

The service team spoke fluently about assisted digital principles and demonstrated that they have used many methods to understand the barriers faced by their assisted digital users, including engaging directly with farmers, agents, charities and other government agencies. The service team has an appropriate plan to test assisted digital support in beta and has submitted a request for funding which is now with their Finance Board for agreement.

The assessment panel had a number of recommendations for the next phase, which should be addressed before the beta service standard assessment.

Recommendations

Before this service can move beyond beta, there are a number of key points that must be addressed.

The team

  • The programme needs to better articulate how the teams are structured, how responsibilities are set and what the cadence of delivery is.
  • The programme will need to be able to set out how the components of the service fit together, what the licensing models are for each component and how they propose to make the code available.

Security, privacy, tools and standards

  • The programme needs to set out how the Identity Assurance (IDA) user journey will work, given that users are able to change personal identity data such as their National Insurance number.
  • The programme will need to provide a clear description of their development and operations (devops) model, in particular how incidents are managed, how issues are prioritised and how they maximise uptime.

Improving the service

  • The programme will need to be able to give clear details of how rapidly the service can be iterated and how the service team will capture user feedback and tackle priority issues.

Design

  • The programme must urgently address feedback given by GDS on the design and user interface to ensure it complies with the GOV.UK design patterns and the service manual. A content designer should be working with the designers to make sure the service is easy to understand.
  • The service team should urgently check that the service works on a variety of phones and tablets. There should be a plan for regular browser testing; a sketch of a cross-browser test matrix follows this list.
  • The service team should start discussions with the GOV.UK team about the start page required for the service, including the service name.
  • The service team should test with users the impact of the map on the user journey and adapt the service in line with feedback.
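
The report does not prescribe any tooling for browser and device testing. Purely as an illustrative sketch (Playwright is an assumption, not something named in the report, and the device choices are hypothetical), a cross-browser and cross-device test matrix might be configured like this:

```typescript
// playwright.config.ts — hypothetical test matrix covering desktop and mobile.
import { defineConfig, devices } from '@playwright/test';

export default defineConfig({
  projects: [
    { name: 'desktop-chrome',  use: { ...devices['Desktop Chrome'] } },
    { name: 'desktop-firefox', use: { ...devices['Desktop Firefox'] } },
    { name: 'iphone',          use: { ...devices['iPhone 13'] } },
    { name: 'android-tablet',  use: { ...devices['Galaxy Tab S4'] } },
  ],
});
```

With a matrix like this, every test in the suite runs once per project, so a single regression check exercises the service across each browser and form factor.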

Accessibility

  • The service team should carry out an accessibility audit, checking that the service works for users with a variety of different needs. There is more information on this in the service manual; a sketch of an automated check follows.
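
Automated checks are no substitute for an audit with real users, but they catch common failures early and can run on every build. As a purely illustrative sketch (the report names no tools; axe-core with Playwright is an assumption, and the URL is hypothetical), a WCAG 2.0 A/AA scan might look like:

```typescript
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('page has no detectable WCAG 2.0 A/AA violations', async ({ page }) => {
  await page.goto('https://cap-service.example/start'); // hypothetical URL
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa']) // limit the scan to WCAG 2.0 A and AA rules
    .analyze();
  expect(results.violations).toEqual([]); // fail the test on any violation
});
```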

User research

  • The service should give greater clarity on what users are able to do, and what is required of them. For example, if the purpose of the service is to get users to check their data and to gather good quality information about users' land, there should be some way for users to feed back to Defra that they have checked the information and that it is correct.

Analysis and benchmarking

  • The service should have a performance analyst to continuously assess the service against key indicators, and develop recommendations based on evidence gained via web analytics and user feedback.
  • The programme needs to clearly set out how KPIs such as completion rate and cost per transaction will be captured; the arithmetic behind these two KPIs is sketched below.
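
For illustration only (the report does not define these KPIs formally, and the event fields below are hypothetical), completion rate and cost per transaction reduce to simple ratios over transaction data:

```typescript
// One analytics event per user action; the shape is a hypothetical example.
interface TransactionEvent {
  sessionId: string;
  stage: 'started' | 'completed';
}

// Completion rate: sessions that finished divided by sessions that began.
function completionRate(events: TransactionEvent[]): number {
  const started = new Set(events.filter(e => e.stage === 'started').map(e => e.sessionId));
  const completed = new Set(events.filter(e => e.stage === 'completed').map(e => e.sessionId));
  return started.size === 0 ? 0 : completed.size / started.size;
}

// Cost per transaction: total running cost divided by completed transactions.
function costPerTransaction(totalCost: number, completedCount: number): number {
  return completedCount === 0 ? 0 : totalCost / completedCount;
}
```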

More broadly, the assessors had concerns about the wider programme

The service team is clearly pursuing an aggressive timescale to get appropriate assisted digital support in place. During the early stage of beta, the service team will need to pin down their assisted digital user journeys by different channels (including Identity Assurance) and demonstrate iteration of the digital service following assisted digital user feedback. Funding must be confirmed for assisted digital and the service team must demonstrate that the support is scalable to cope with high volumes in a tight timescale.

The reliance on a proprietary package for the core functionality of the wider system poses a number of challenges and it was unclear what the programme's fallback route would be if that package failed to meet their needs or if the supplier withdrew from the market. There are also clear questions to be answered around intellectual property (IP).

The delivery approach needs to be much better explained in terms of how user needs are addressed rather than in terms of story point velocity.

The actual scope of the full service needs to be much more clearly explained and related to user needs.

Further stage assessments will require clear evidence that the recommendations and concerns have been addressed.

Summary

The CAP delivery team have put a huge amount of work into getting the service to this stage, and this is reflected in the quality of the product and the clear commitment of the people the assessment panel met.

The recommendations set out a range of things which the service team will want to consider, but this should not take away from the work which has been done so far or from the ambition of the wider programme.

The strong commitment of the team to customer engagement makes this an opportunity to demonstrate the value that comes from putting the user at the heart of the service, and the assessment panel look forward to seeing the next stage.


Digital by Default Service Standard criteria

Criterion   Passed      Criterion   Passed
1           Yes         2           Yes
3           Yes         4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          Yes
11          Yes         12          Yes
13          No          14          No
15          Yes         16          Yes
17          Yes         18          Yes
19          Yes         20          Yes
21          Yes         22          Yes
23          Yes         24          Yes
25          Yes         26          Yes