Referral Capture - Service Assessment

The Referral Capture service is an online service that will allow citizens to report people who are potentially committing benefit fraud.

The department loses £3.6bn to fraud and citizen error each year, and so it has had to get smarter about preventing and detecting fraud. Although the department is proactively developing new ways of preventing fraud within the system, we still rely on citizens to tell us about instances of fraud that we are currently unable to detect. Whilst we will eventually be able to prevent some of this fraud through more sophisticated data matching and other means, there will always be situations we will never be able to identify without public assistance, e.g. cash-in-hand cases.

Department / Agency:

Date of Assessment:

Assessment Stage:

Result of Assessment:

Lead Assessor:
J. Gould

Service Manager:
B. Leggett

Digital Leader:
K. Cunnington

Assessment Report

Outcome of service assessment

After consideration, the assessment panel have concluded that the Referral Capture service is on track to meet the Digital Service Standard at this early stage of development.


The service team is well defined, has demonstrated a good understanding of user needs, is working in an agile, iterative manner, and clearly has strong ownership of the development of the service.

User needs

The team have used a number of different sources of insight to identify user needs and develop an understanding of the service’s users. These include desk research, talking to frontline staff on the telephony channel, a survey of users ringing the hotline, and pop-up and lab-based testing. The team were able to describe user needs with a clear understanding of the complex range of emotions and drivers underlying them. The team were also able to demonstrate an understanding of some of the contextual issues that influenced target users, including being part of a close-knit community, and lower literacy levels.

The team have begun to feed this understanding of service users into the design of the service, and are iterating regularly, with design iterations driven by feedback from both pop-up and lab-based user research.

The team had a good understanding of service users’ likely support needs, having conducted research with actual users of the service, including those with low confidence and low digital skill levels, and those seeking support from third parties. The team estimates that around 20% of users will need support of some kind. The team has identified that anonymity, privacy and flexibility are the key benefits of the on-screen service over alternative channels.

The team has identified that confidence is the key barrier both to users migrating to the on-screen service from alternative channels, and to independent service completion.

User research

User research has been successfully established as an integral part of the sprint cycle, with team members conducting and observing pop-up and lab-based research, taking part in analysis, and feeding findings from user research into the backlog. The team is aware of the need to recruit users for specific user journeys, and is looking at doing so through an external recruitment agency. Recruitment for user research is also targeted to include representative users, such as those on low incomes and those with lower digital literacy.

The team has ongoing research plans to recruit two users per lab session who need assisted digital support, to test the on-screen service.

Iterating and improving the service

The team has built a prototype during their alpha and is now looking to begin implementation of the real solution. This has been sensibly scoped not to include transformation of the legacy backend system, which is being dealt with in another project running across DWP.

Tools and systems

The technical architecture of the beta will be based on guidance from DWP’s Digital Blueprint. The team is empowered to change elements of this ‘standard stack’ when justified. A feedback loop is being honed to ensure the Digital Blueprint is updated with DWP’s evolving understanding of how services are best implemented.

Managing data

The application being developed by the team will not store any data except in-session; however, details of claimed benefit fraud are passed on to backend systems, which presents information security and fraud risks. The team has a good understanding of these backend systems and processes and of the risks involved, which are largely unchanged by the new digital project, and satisfied the assessors that the risk appetite is understood and accepted by stakeholders in DWP.

Open source

DWP is currently working on its open source policy. The team were keen to make code open source, and understood the benefits and challenges involved in working in the open.

Service availability

The team shows a clear understanding of the implications of the new service being taken offline and has ensured the capacity exists to handle users diverted to other channels.

Creating a simple and intuitive service

The team is iterating the on-screen service design in response to research and testing with low-skilled and low-confidence users. The team is also considering how to make sure the research and improvements made to the digital service can be used to benefit the telephone and paper versions.

Consistency with GOV.UK

The content design is high quality with excellent use of plain English. This represents a significant improvement over the old form. The panel identified some specific suggestions to improve the design during the assessment that will be forwarded separately.

Digital take-up

The team has communication plans in place to help users overcome barriers to adopting the digital service.

Recommendations

  • Whilst the service is being developed as part of DWP’s Fraud and Error strategy, the framing and messaging on the service is entirely focussed on reporting fraud. Given the emotive and sensitive nature of the service, it is recommended that the team explore the language used and whether it could be made more applicable to users wanting to report errors.
  • The team discussed plans to gather feedback from users at the end of the service. It is recommended that the methods used reflect what the team already understands about their users and their needs.
  • The team has currently worked on two user journeys; some of the remaining user journeys are complex, such as those where the user would be expected to provide details of the person they are reporting and their partner. It is recommended that the team tests these journeys with specifically recruited users that match these scenarios and that testing is based on user-generated information where possible rather than hypothetical scenarios provided to users.
  • The team must do more work to understand how users who are not online at all will find out about the service’s assisted digital support options.
  • The team must do more work to understand how users needing assisted digital support will receive an equally high quality service as those using the on-screen service independently.
  • The team should also be sure to avoid over-reliance on talking to users who are currently phoning the department to use this service, as these users are likely to be more confident and less isolated.
  • The team must ensure that ongoing research plans include working with users who will need support to fully understand their needs, and a model of support is designed to meet those needs during beta.
  • The team should help make the case to DWP for a positive outcome on their policy around open sourcing of software.
  • The service will ultimately form part of a wider transformation of DWP’s legacy estate when FRAIMS is replaced. The team should continue to be mindful of this when implementing the beta so their solution can flex when this work takes place.


The panel were impressed by the team’s positive commitment to building the service and their early integration of user research into their work. The panel looks forward to reviewing the service at a more mature stage in its development at a future assessment.

Digital Service Standard criteria

Criterion   Passed      Criterion   Passed
1           Yes         2           Yes
3           Yes         4           Yes
5           Yes         6           Yes
7           Yes         8           Yes
9           Yes         10          Yes
11          Yes         12          Yes
13          Yes         14          Yes
15          Yes         16          Yes
17          Yes         18          Yes