https://dataingovernment.blog.gov.uk/blood-donation-service-voluntary-service-assessment/

Blood Donation Service - Voluntary Service Assessment

The online blood donation service (http://www.blood.co.uk) gives members of the public searching for blood donation sessions real-time booking and the ability to view their donation history. Bookings are based on the donor's eligibility (time since last donation and health) and preferences (location). The service provides information, access to support channels, simple registration for online access, and registration as a blood donor for new donors. Donors can also connect their account to Facebook and Twitter logins for ease of access.

The users are Whole Blood donors, members of the public and internal support teams. The service is currently live, having replaced a prior version of an online booking system when it went live in November 2013.

Department / Agency:
DH

Date of Assessment:
7/10/2014

Assessment stage:
Live - Voluntary

Result of Assessment:
Pass

Lead Assessor:
P. Ferris

Service Manager:
R. Creighton

Digital Leader:
W. Cavendish


Assessment Report

The Blood Donation service was assessed against the standard to conform with a Cabinet Office spend control requirement. This was a voluntary assessment, intended to help the service team assess performance and identify areas for improvement.

Summary

The assessment panel thought the presentation was strong and demonstrated a clear understanding of the business aim. The team understood the 26 points of the standard to a level that we would wish other assessments to aspire to. The panel was also very impressed with the passion across the whole team to deliver the new service, and with the depth of knowledge of the product and the thinking behind the decision making of all those attending.

There are a number of points where the panel would suggest the service team gives some additional thought, as follows:

User needs

The service has a number of mechanisms in place for gathering user data (surveys, focus groups, analytics, etc.). The service team has broken the experience down into 7 user journeys, tested through build and QA environments. These had been subject to sprint testing, built in an integration environment and subsequently promoted into UAT.

The panel noted the service had completed 2 user research sessions so far, 1 at wireframe stage. The service has a slight bias towards gaining user feedback via survey, although it was clear the team is highly responsive to the evidence that has come out of this methodology. The multiple-browser testing and the fact that the site is mobile friendly were good.

The panel welcomed the decision of the Blood Service to have an onsite Product Manager and a Behaviour Insight Team that led thinking. The team had a good understanding of the product and the needs it was intended to meet, and was planning iteratively to continue development towards this.

Recommendation: increase the face-to-face user feedback opportunities, and prioritise the feedback from these over high-level survey feedback.

The team

The service has a Service Manager who sits on the Project Board and is able to influence project direction and thinking. He demonstrated clear knowledge of his field and the potential challenges associated with delivery of this new channel, and we were impressed with the link to the Portal Manager role, which acted as the 'voice of the customer/donor'. A team was in place to deliver the project with a clear accountable executive role. However, the panel did note that the team was not fully cross-functional: different functions were not co-located and development was separate from the business.

The service team highlighted the work underway to respond to this, specifically recognising that design is a current gap in provision; this is dealt with further under the design area. The team described the move from Portal 1 to Portal 2 and the differing operating arrangements to be put in place (including moving to more agile approaches and blending current support provision with an in-house team), which the panel thought showed clear direction in terms of ownership of the service.

Recommendation: the approach of moving to a more in-house team is absolutely right and will benefit the success of the product in terms of future development.

Security, privacy, tools and standards

The assessment panel was convinced the service had clearly communicated with the right people early and had kept an engaged conversation going to inform development. Where challenged, the service had responded promptly to incorporate feedback and acted on recommendations. It was clear the service had integrated security into the development and testing process from the start; overall, the panel was convinced the approach exemplified exactly how this area should be done. The panel also thought the technical representative demonstrated strong knowledge and had clearly thought through the issues around information assessment, threat effectors and so on. The service team has also ensured the security policy was fully signed off by the relevant Project Board, and a full mitigation plan was in place for identified risk areas.

In terms of being able to test the end-to-end service in an environment identical to that of the live version on all common browsers and devices, it was clear that the service team had undertaken layered testing, with excellent browser and device testing capability.

The panel suggested that in future this may prove onerous in terms of development, but noted that the service team was thinking about opportunities to improve automated testing as a future response, which would reduce the potential impact on the agility of the release process.

Improving a service

The current approach would benefit from greater agility, particularly in relation to the release cycle. The service team stated that the latency between code being completed and deployed is measurable in days; however, the service currently achieves only 4 big releases per year. The panel would suggest that, wherever achievable, completed code should be deployed regularly and iterated as needed.

Recommendation: the future success of this service is in part reliant on the organisation adopting an agile methodology. The organisation should also be clear about responsibility for sign off, as the change management process seemed to require several steps and no single individual appeared to have complete risk management or response authority.

Design and content

The Blood Donation service is exempt from the GOV.UK look and feel as described in the Digital by Default service standard, but the panel have a few general recommendations to put in place for the future.

Recommendation: Currently, the design and content resources are provided by a third party. It would be beneficial to the overall consistency of the service to bring that in-house, ideally under the umbrella of a digital team who could maintain and own the style.

The confirmation emails and website copy would benefit from a light proofread. Avoid switching between first and third person, and avoid using 'click here' as link text. Make sure you follow Plain English (www.plainenglish.co.uk) standards throughout, and consider the GOV.UK style guide (https://www.gov.uk/guidance/style-guide) as its language has been tested with users of all digital capabilities.

Assisted digital

The service team demonstrated an emphasis on good customer service for donors, regardless of the channel used for registration or appointment booking. Currently the service provides high quality offline support for users through several channels, including talk-through by telephone. Support was easy to access and awareness was good, with the service being joined up with other areas of the NHS through GP surgeries. The service team is on the right track with its thinking behind assisted digital support, but needs to demonstrate how the support meets user needs and to be further ahead with testing and implementation.

Recommendation: that the service team undertakes research through their current support channels (face to face and telephone, rather than online) to understand the barriers and needs of the full range of their assisted digital users, including people who are unable to use or access the digital service independently.

Based on analysis of this research, the offline support should then be iterated and tested. Research and testing of the support should include other elements of the assisted digital standard which have not yet been fully explored (eg digital inclusion) to assess appropriateness, given the nature of this service.

Digital take-up

The panel considered that the service being provided was intended to meet the aims of digital take-up for the targeted demographic. The service is not altering current provision in terms of phone and face-to-face support, and is introducing this new digital channel to promote digital take-up amongst its altruistic customer base, providing an additional channel for bookings, customer information and so on.

The service team has seen significant digital take-up through self-selection since the digital service went live, and has plans in place for scaling up the digital service. Work proposed for the next phase (the ‘paperless donor journey’, plus enhancements to capability and information related to appointments) will continue to have a positive impact on digital take-up. There is also a plan to research why some users aren’t using the digital service, and to develop a plan to shift more users onto the digital service over the next five years. The service team should also consider that, over time, a greater shift to a paperless donor journey may create a need to assess future assisted digital and digital inclusion requirements as part of reaching the entire public demographic able to register as a blood donor, as well as reassessing the future reliance on more traditional offline contact channels.

Recommendation: the assessment panel thought the digital take-up approach was excellent and clearly provides an additional channel for customers, although it is important that research into why people aren’t using the digital service is considered as part of future thinking and development.

Analysis and benchmarking

The service team demonstrated a mature, well developed digital analytics system and a good data culture, using a mix of actionable analytics data along with user research to drive improvements.

The only element of the Service Standard the service would have been deemed not to meet is point 10 (assisted digital), as this had not formed part of the team's thinking to date and no processes to consider the implications of this workstream had been put in place.

However, overall the panel was very impressed with the online service and thought it was an excellent example of a new digital channel that should benefit its users.


Digital by Default Service Standard criteria

Criteria Passed Criteria Passed
1 Yes 2 Yes
3 Yes 4 Yes
5 Yes 6 Yes
7 Yes 8 Yes
9 Yes 10 No
11 Yes 12 Yes
13 Yes 14 Yes
15 Yes 16 Yes
17 Yes 18 Yes
19 Yes 20 Yes
21 Yes 22 Yes
23 Yes 24 Yes
25 Yes 26 Yes