https://dataingovernment.blog.gov.uk/centre-for-defence-enterprise-portal/

Centre for Defence Enterprise Portal - Self Certification

An online portal which allows organisations and individuals to submit science and technology innovations and proposals to MOD (in response to specific competitions or general invitations) and for these bids to be assessed by MOD subject matter experts.

Department / Agency:
MOD

Date of Assessment:
27/8/2015

Assessment Stage:
alpha

Result of Assessment:
Pass

Lead Assessor:
R. Riley

Service Manager:
D. Hazleton

Digital Leader:
M. Stone


Assessment Report

Outcome of service assessment

After consideration, the assessment panel has concluded that the Centre for Defence Enterprise Portal service is on track to meet the Digital Service Standard at this early stage of development.

Reasons

The service currently meets the requirements of the standard for an alpha service. Areas of good performance against the standard included:

User needs and assisted digital

  • Strong understanding of users and their needs, backed by an extensive programme of research including a significant face-to-face element.
  • Good understanding of user devices/behaviours and assisted digital needs.

Security, privacy, tools and standards

  • The team understood that applicants' intellectual property is the key asset that needs to be protected.

Design

  • The service follows GOV.UK design patterns.

Recommendations

User needs and assisted digital

  • Review what similar needs are being met on GOV.UK and learn from these.

The team

  • Need succession planning for key team roles.
  • Will likely need a dedicated sysadmin role as the project moves to live.

Security, privacy, tools and standards

  • Threat modelling needs to be more rigorous. May need more specialist knowledge on security threats.
  • Don’t be complacent about the choice of tools; consider alternative tools during the move to beta.
  • Need to look into static analysis and improved automated testing.

Improving the service

  • Need to flesh out the plan for handling outages before moving to the live phase.
  • Look at end-to-end testing (e.g. Cucumber).
  • Need to consider test coverage reporting (e.g. X% of the codebase is covered by tests) and static analysis / code-quality metrics (e.g. CodeClimate).
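The kind of automated test the panel recommends can be illustrated with a minimal sketch. The function, field names and limits below are hypothetical, not drawn from the CDE Portal's codebase; the point is that a suite of small assertions like this can be run under a coverage tool (such as coverage.py) to produce the "X% of the codebase is covered by tests" figure mentioned above.

```python
# Hypothetical illustration only: the validation rules and limits are
# assumptions for the sake of example, not the MOD service's actual logic.

def validate_proposal(title: str, summary: str) -> list[str]:
    """Return a list of validation errors for a draft proposal."""
    errors = []
    if not title.strip():
        errors.append("title is required")
    if len(summary) > 1000:
        errors.append("summary must be 1000 characters or fewer")
    return errors

# Unit tests like these are what a coverage report measures.
assert validate_proposal("Sensor innovation", "A short summary") == []
assert validate_proposal("", "A short summary") == ["title is required"]
```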

Open source

  • Project needs a clearer plan of what code will be released and when.

Design

  • Will need to review pinch points of failure as the prototype is developed.

Analysis and benchmarking

  • Need to stay on top of server metrics: the beta phase should include visibility of, and alerting on, server performance metrics as well as business key performance indicators (KPIs).
  • Must complete registration with the Performance Platform as soon as possible.
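Alerting on server metrics can be sketched as a simple threshold check. The metric names and limits below are assumptions for illustration; in practice a monitoring tool would evaluate rules like these continuously rather than in application code.

```python
# Illustrative only: thresholds and metric names are assumed, not taken
# from the CDE Portal's actual monitoring configuration.

def check_server_metrics(metrics: dict[str, float]) -> list[str]:
    """Return alert messages for any metric that breaches its threshold."""
    thresholds = {
        "cpu_percent": 90.0,     # sustained CPU usage
        "error_rate": 0.05,      # fraction of failed requests
        "p95_latency_ms": 800.0, # 95th-percentile response time
    }
    alerts = []
    for name, limit in thresholds.items():
        value = metrics.get(name)
        if value is not None and value > limit:
            alerts.append(f"{name}={value} exceeds {limit}")
    return alerts

assert check_server_metrics({"cpu_percent": 95.0, "error_rate": 0.01}) == [
    "cpu_percent=95.0 exceeds 90.0"
]
```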

Hosting

  • Need to resolve hosting provision.
  • Solicit opinions from other government teams on their proposed options for hosting.

Digital Service Standard criteria

Criteria   Passed     Criteria   Passed
1          Yes        2          Yes
3          Yes        4          Yes
5          Yes        6          Yes
7          Yes        8          Yes
9          Yes        10         Yes
11         Yes        12         Yes
13         Yes        14         Yes
15         Yes        16         Yes
17         Yes        18         Yes