
Report a Food Problem - Service Assessment

This service allows consumers to report a food problem they have experienced or seen (e.g. rat in restaurant, dirty hands serving food, foreign object in food), and does the hard work of getting that report to the local food safety/environmental health team responsible for food businesses in the area.

Department / Agency:

Date of Assessment:

Assessment Stage:

Result of Assessment:

Lead Assessor:
D. Williams

Service Manager:
C. Hammond-Aziz

Digital Leader:
J. Pierce

Assessment Report

Outcome of service assessment

After consideration, the assessment panel have concluded that the Report a Food Problem service is on track to meet the Digital Service Standard at this early stage of development.


The panel was pleased to see that the service team have top level support and have been allocated the required budget and resources.

The team have recognised the importance of user research as part of the project and demonstrated a good understanding of the user needs. The panel were impressed with the finding that the public assumes the FSA is responsible for dealing with food problems - which is not in fact the case - and, with this in mind, it is commendable that the team are taking action to provide a service even though it falls outside the agency's formal remit.

The panel felt that there had been excellent discovery research and a clear definition of users. It was very interesting that the team want to encourage consumers to be empowered to report food problems directly to the business and are working on ways of building this. The team have also undertaken research with some assisted digital users, have identified that they may have a slightly different set of needs, and realised the importance of further research into different groups of assisted digital users in beta.

The panel were impressed that the team had gathered insights from social media and service contact.

The team have given thought to analytics, what to measure, and the tools required to do so. Although there appeared to be some confusion regarding the cost per transaction, there is a clear understanding of the other three Key Performance Indicators (KPIs). In the panel's experience, this is advanced thinking for an alpha stage.

The team have considered and mitigated a number of external and internal threats to the service, and minimised the data captured and retained on the service.


The panel do have some concerns as the team move into the beta stage, and their recommendations on how to address these are outlined below:

Point 2 - Put a plan in place for ongoing user research and usability testing to continuously seek feedback from users to improve the service.

The panel feel that having a user researcher on the team for two days a week is not enough, and suggest that one is allocated for a minimum of three days a week.

Point 3 - Put in place a sustainable multidisciplinary team that can design, build and operate the service, led by a suitably skilled and senior service manager with decision-making responsibility.

The team are lacking key roles. Existing team members have been covering multiple roles, but for a beta the team will need the expertise that a dedicated content designer and interaction designer can bring.

Point 4 - Build the service using the agile, iterative and user-centred methods set out in the manual.

The team should engage an agile coach to support the team as it grows into a truly agile, collaborative and co-located service development team. In addition, the team do not appear to have sprints, stand-ups, stories, retrospectives, or indeed a clear process in which research, design and product owners make decisions about what to change. The panel suggest that introducing these practices would help the team achieve its goals in the beta.

Point 5 - Build a service that can be iterated and improved on a frequent basis and make sure that you have the capacity, resources and technical flexibility to do so.

The ease of changing content via the admin interface presents some risk: major changes can be made to the service with little tracking or visibility to other members of the team. The team will need to mitigate this, either with technical measures (e.g. automated logging of changes) or with a well-defined process and convention.
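The automated-logging mitigation can be very lightweight. The sketch below is illustrative only - the service's actual admin tooling is not described in this report, and all names here (the `save_content` hook, the event fields) are assumptions - but it shows the principle: record who changed what, when, and the diff itself, in an append-only log the rest of the team can review.

```python
# Hypothetical sketch of automated change logging for an admin content
# interface. Names and fields are illustrative assumptions, not the
# service's real tooling.
import difflib
import json
from datetime import datetime, timezone

AUDIT_LOG = []  # in production this would be an append-only store


def save_content(key, old_text, new_text, editor):
    """Save new content while recording who changed what, when, and the diff."""
    diff = list(difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(), lineterm=""))
    AUDIT_LOG.append({
        "key": key,
        "editor": editor,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "diff": diff,
    })
    return new_text


# Example: an editor rewording a (hypothetical) start-page introduction.
save_content(
    "start-page-intro",
    "Report a problem with food.",
    "Report a food safety problem you have seen.",
    editor="example.editor",
)
print(json.dumps(AUDIT_LOG[0]["diff"], indent=2))
```

Keeping the diff alongside the editor and timestamp means any unexpected content change can be traced and reversed without relying on memory or convention alone.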

Point 7 - Evaluate what user data and information the digital service will be providing or storing, and address the security level, legal responsibilities, privacy issues and risks associated with the service (consulting with experts where appropriate).

Whilst the team have considered the risks to the service and the data collected, the panel suggest that the team assess report volumes and, if necessary, move to non-manual ways of dealing with fraud.

As the citizen-facing part of the service captures data and does not present it back to the citizen, the team should consider separating out the citizen-facing and agent-facing parts of the system, so that a compromise of the citizen-facing part of the service would not expose the stored data.

Point 10 - Be able to test the end-to-end service in an environment identical to that of the live version, including on all common browsers and devices, and using dummy accounts and a representative sample of users.

The panel suggest that the team undertake a true private beta with a narrow user group who can be researched closely; for example, a single local authority.

Point 12 - Create a service that is simple and intuitive enough that users succeed first time.

The service needs a start page that clearly states what this form is and isn't for, and suggests alternative services if appropriate.

The team need more research into what users regard as the end of the service, and to fully understand what outcome users really want - for example, do they need feedback, or simply assurance that they have reported the issue to the right body?

Watching users search for a service provides valuable insight into the context of the user needs.

Point 13 - Build a service consistent with the user experience of the rest of GOV.UK including using the design patterns and style guide.

Using the design patterns might help save time as these have been developed after extensive research. This is not about using GOV.UK styling, but more about using the interaction patterns, coded elements and form design guidance that are tried, tested and proven to work.

The panel encourage the team to engage with the cross government design community as there is much to learn in both directions.

The team should consider using the front-end toolkit where possible, as it could save time and offers a lot of accessibility advice for free.

Point 14 - Encourage all users to use the digital service (with assisted digital support if required), alongside an appropriate plan to phase out non-digital channels/services.

The service should provide clear guidance about which service to use and what users should do about their issues.

Point 15 - Use tools for analysis that collect performance data. Use this data to analyse the success of the service and to translate this into features and tasks for the next phase of development.

In addition to measuring and analysing completion rate on the service, the panel would encourage the team to find ways to measure the quality or utility of the reports downstream, at the local authority, and keep this metric in mind when iterating the service.
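Completion rate - one of the standard KPIs mentioned above - is straightforward to derive from raw analytics events. The sketch below is a minimal illustration; the event names and fields are assumptions, not the service's actual analytics schema.

```python
# Minimal sketch of computing the completion-rate KPI from analytics
# events. Event names ("report_started", "report_completed") are
# illustrative assumptions about the schema.
from collections import Counter


def completion_rate(events):
    """Fraction of started reports that reached completion."""
    counts = Counter(e["name"] for e in events)
    started = counts["report_started"]
    completed = counts["report_completed"]
    return completed / started if started else 0.0


# Example: two reports started, one completed -> rate of 0.5.
events = [
    {"name": "report_started", "session": "a"},
    {"name": "report_completed", "session": "a"},
    {"name": "report_started", "session": "b"},
]
print(completion_rate(events))  # 0.5
```

Measuring the downstream quality of reports at the local authority, as the panel recommends, would need a separate feedback channel - completion rate alone says nothing about whether a completed report was actually useful.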


The panel were impressed with the cohesion and knowledge within the team. The team demonstrated a passion and dedication to providing the best possible solution for users, and a commitment to continue with gaining further insight into user needs. This was one of the best alpha assessments that the panel has seen, and the panel look forward to seeing the team again at the next assessment stage.

Digital Service Standard criteria

Criterion  Passed     Criterion  Passed
1          Yes        2          Yes
3          No         4          No
5          Yes        6          Yes
7          Yes        8          Yes
9          Yes        10         Yes
11         Yes        12         Yes
13         No         14         Yes
15         Yes        16         Yes
17         Yes        18         Yes