Readiness Reporting and Deployability Discovery - Alpha Assessment Report

The report from the alpha assessment of MOD's Readiness Reporting and Deployability Discovery service on 24 September 2020

Service Standard assessment report

Readiness Reporting and Deployability Discovery

From: Government Digital Service
Assessment date: 24/09/20
Stage: Alpha
Result: Met
Service provider: MOD

Previous assessment reports

N/A

Service description

Readiness Reporting and Deployability Discovery (R2-D2) provides an accurate, trusted, shared and timely understanding of the British Armed Forces’ capacity to meet the commitments articulated in the Defence Plan.

Readiness reporting within Defence (e.g. for the deployability of assets to support a task) is currently carried out with a significant amount of manual effort, backed by inefficient tools (e.g. spreadsheets). Gathering the data that allows the analysis and reporting to be carried out is laborious, repetitive and inefficient.

In December 2019, the Defence Digital Service (DDS) – sponsored by Deputy Chief of the Defence Staff - Military Strategy and Operations (DCDS MSO) – carried out a 3-week discovery sprint to understand the accuracy and efficiency of readiness reporting, as well as the organisation’s understanding of military resource availability more widely. The discovery sprint report confirmed the issues originally outlined. In response, DCDS MSO sponsored DDS to take Readiness Reporting and Deployability Discovery (R2-D2) through to an alpha phase, which started work in early to mid May 2020 with the aim of delivering a data-driven view of resource demand and supply capable of informing efficient allocation and investment decisions.

Service users

This service is primarily for MOD staff and members of the respective armed forces. However, there is potential for it to be used by civilians working in this space.

Users fall into the following groups: Analysts/Reporters, Demanders, Receivers of Demand, Decision Makers.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has increased the number of user interviews and the amount of usability testing on the prototype. At the time of the assessment the team had conducted 9 contextual interviews with the RAF and Navy, and usability research with 37 participants from the Army, RAF and MOD. In addition, the team has held co-design workshops with the Navy. It was clear in the assessment that this has improved the team’s understanding of the users and increased their confidence in design decisions. The team found particular benefit from the co-design work as it enabled more of the team, and more users, to participate in research and be involved in decisions. The co-design work also enabled the team to understand where there was overlap between the Army and Navy
  • the team has continued to review the jargon in the service, and has a good understanding of how the use, or removal, of jargon can affect cognitive load
  • finding people with access needs has continued to prove difficult for the team; however, the team has made some good connections with accessibility communities within the armed forces which they hope will prove fruitful in Beta. Although the team has struggled to speak to users with access needs in Discovery and Alpha, they have mitigated this by using patterns from the GOV.UK Design System which are WCAG 2.1 compliant

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to engage/develop relationships with the accessibility community so that they can learn more about the needs of these users and the barriers they face
  • consider other approaches to ensure the service is accessible and the support methods are appropriate. The team spoke about using proxy users; however, given that these people will lack context of the service, this approach should ideally only be used to supplement research with real users

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were able to provide details of the alternative options for meeting the identified user needs and explained their rationale for proceeding with this option
  • the team were able to clarify that the purpose of this service is not to change or introduce a new command system, but to better empower participants within it to discharge their responsibilities by increasing evidence-based decision making
  • the team were able to confirm that the current focus of the project is on the workforce and equipment elements of the problem area, and specifically on the air, sea and land domains, due to their commonality and in order to maximise early benefit realisation. Other aspects of the problem area (training and sustainability) and domains (cyber and space) are captured in the roadmap as potential areas for future exploration
  • the team have secured the backing of the project sponsor and been able to engage with other teams who are working on related aspects of the problem space, across the MOD business and the wider defence sector

What the team needs to explore

Before their next assessment, the team needs to:

  • continue with the current roadmap and programme of cross team engagement

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • as stated in section 1, the increase in the volume of user research interviews and usability sessions, and the introduction of co-design workshops, has improved the team’s understanding of user needs across the Navy and RAF. This has resulted in increased confidence in the design of the service and helped to reduce risk when moving into Beta

What the team needs to explore

Before their next assessment, the team needs to:

  • continue research and usability testing with a cross-section of Navy and RAF users to ensure the consistency of user needs and the effectiveness of the service design

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was able to conduct research with more participants and provide clear examples of where they have iterated the design in response to feedback, leading to significant readability and understandability improvements. These changes include specific design elements mentioned in the recommendations from the previous assessment – eg. modal box, % achievable
  • the team demonstrated that it has identified, and is testing, its riskiest assumptions with the MVP. These are mainly tied to the groups already included in the MVP, but the team has also started conducting research with other user groups to ensure the solution is designed based on everyone’s needs
  • the team is striving to create connections in the communities of practice to learn from others and share knowledge
  • although, due to security constraints, service access is expected to be limited to laptop and desktop units for the foreseeable future, the team have used Bootstrap to ensure the product will be responsive should the need for mobile usage arise
  • the team has reconsidered the service name. Since this is an internal system with multiple uses rather than a service that needs to be discovered and understood by the general public, the team will not be using a verb-name as per the service manual guideline. Instead, a compromise has been achieved with the name ‘Ready to Deploy’, which the team will be testing with users during Beta

What the team needs to explore

Before their next assessment, the team needs to:

  • continue designing and building the service using an approach based on progressive enhancement, with iterations informed by user research and in keeping with the principles of the Service Standard

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is making efforts to identify and recruit users with access needs within MOD, recognising that many people will not disclose their disability. The team has identified 2 accessibility champions within the department and has been working with them
  • the team is identifying other avenues to explore what tech these users would be using – eg. investigating what is available on MOD net
  • the team has been taking WCAG requirements into account to iterate the prototype

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct research with users with access needs – as per previous recommendations, if it ultimately proves impossible to recruit service users with access needs, recruit external users
  • keep referring to the service manual guidance on accessibility as per the previous recommendations

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have stood up an appropriately sized multidisciplinary team, comprising the main relevant roles and suitably experienced team members, who will all remain in post for the next phase
  • the team are comfortable with their current levels of autonomy and empowerment, whilst retaining the ability to include input from project sponsors and other stakeholders
  • the team has appropriate governance routes, with fortnightly meetings with the senior stakeholders of the future service and monthly meetings with the overall project sponsor

What the team needs to explore

Before their next assessment, the team:

  • must consider how best to include content management resources within the team
  • must continue to work with project sponsors to ensure the current reliance on contingent labour for key team roles is mitigated, ideally by the creation and recruitment of full-time positions

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have adopted appropriate agile methods and working practices, including working in fortnightly sprints and observing key sprint ceremonies such as team retrospectives and show and tells
  • the team have addressed the challenge of remote working during lockdown by utilising a range of digital collaboration tools including Teams, Slack, Trello, Miro and GitHub
  • the team have identified the appropriate senior member of the organisation with whom to test the service before beginning their public beta

What the team needs to explore

Before their next assessment, the team:

  • must complete the remaining fresh research and prototyping in their alpha phase before beginning their beta

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team were able to provide clear examples of where the prototype design had been changed in response to user feedback
  • although this is a trail-blazing digital project of its type for the MOD, the team has engaged with and influenced senior stakeholders to secure their commitment to stand up an appropriately resourced and empowered business team. This team will provide the capability needed to ensure the continued improvement and iteration of the service design in line with the needs of its users

What the team needs to explore

Before their next assessment, the team:

  • must continue monitoring the wider plans to set up the new business delivery team and report any concerns to future GDS assessment or spend control panels
  • should spend more time iterating the design of the service in response to user research

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has started to consider the security requirements of the system as a whole and the data within it
  • the team has taken into account the GDPR requirements that may be applicable to the system
  • the team is looking to fill a gap in the MOD infrastructure regarding a modern authentication mechanism and use that as a blueprint for the future
  • the team is working with internal security teams to formulate best practice and dispel “urban myths” regarding where data can reside

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how the data and system layers should be segregated in AWS, for example by using separate VPCs rather than relying on IP filtering alone (an illustrative sketch follows this list)
  • consider a more secure “zero trust” model for user access, rather than relying on a VPN
  • ensure data is encrypted, in transit and at rest, to the appropriate level
  • have plans in place for appropriate IT Health Check activities, penetration testing and so on
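The following is a minimal illustrative sketch, not the team's actual design, of how the application and data layers could be segregated into separate VPCs and how encryption at rest could be enforced by default, using Python and boto3 (one of the team's stated development languages). The region, CIDR ranges, bucket name and key alias are hypothetical placeholders.

```python
# Illustrative sketch only (hypothetical names and CIDR ranges):
# segregate the application and data layers into separate VPCs and
# enforce encryption at rest by default on an S3 bucket.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")
s3 = boto3.client("s3", region_name="eu-west-2")

# Separate VPCs mean the data layer is not reachable simply by being
# on the same network as the application layer.
app_vpc_id = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]["VpcId"]
data_vpc_id = ec2.create_vpc(CidrBlock="10.1.0.0/16")["Vpc"]["VpcId"]

# Peer the two VPCs; route tables and security groups (not shown) then
# limit traffic to the specific ports the application actually needs.
peering_id = ec2.create_vpc_peering_connection(
    VpcId=app_vpc_id, PeerVpcId=data_vpc_id
)["VpcPeeringConnection"]["VpcPeeringConnectionId"]
ec2.accept_vpc_peering_connection(VpcPeeringConnectionId=peering_id)

# Default encryption at rest with a customer-managed KMS key
# (bucket name and key alias are placeholders).
s3.put_bucket_encryption(
    Bucket="example-readiness-data-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-data-key",
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)
```

In practice, route tables, security groups and network ACLs (not shown) would restrict traffic between the two VPCs to only the ports the application needs, and equivalent encryption settings would be applied to any database storage.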

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

Previous recommendations

Before their next assessment, the team:

  • must continue to develop plans for tracking how the service contributes to the wider business aim of increasing the availability of assets, and have a plan in place for how service performance will be measured

What the team has done well

The panel was impressed that:

  • the team were able to explain that the projections for potential performance and efficiency gains are largely based on qualitative evidence identified during the earlier phases of this project and that the nature of the problem means the business itself does not yet have meaningful MI data from which to produce baselines for tracking purposes. One of the early benefits of the initial release will be the generation of MI data from which such baselines can be produced. The team will then use this data to develop the benefit tracking model throughout private beta, with a view to being able to forecast the performance improvements and cashable savings going forward.

What the team needs to explore

Before their next assessment, the team needs to:

  • continue with their current plans to develop a benefit realisation model, in addition to defining and monitoring relevant performance indicators for the service itself.

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has selected appropriate development tools, namely JavaScript and Python
  • the team has adopted a cloud-first model (AWS), with some components remaining on premise, and has designed the service to allow easy migration between cloud providers
  • the team is using cloud tools that are portable across cloud providers and is looking to deploy using infrastructure as code
  • the team is deploying microservices to ease deployment, supportability and upgradeability (an illustrative sketch follows this list)
  • the tools selected are widely used, with a very large pool of practitioners the team can call on for help as required
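As an illustration of the microservices approach noted above, the following is a minimal sketch of a single Python service exposing a health-check endpoint. The framework (Flask), route name and version string are assumptions for illustration only; the report confirms only that the team is using JavaScript and Python and deploying microservices.

```python
# Illustrative sketch only: a minimal Python (Flask) microservice with a
# health-check endpoint. Framework, route and version are assumptions.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/healthcheck")
def healthcheck():
    # A lightweight endpoint like this lets each service be monitored,
    # deployed and rolled back independently of the others.
    return jsonify(status="ok", service="readiness-reporting-example", version="0.1.0")

if __name__ == "__main__":
    # Bind to all interfaces so the service can run inside a container.
    app.run(host="0.0.0.0", port=8080)
```

Small, independently deployable services of this kind can be built, tested and upgraded in isolation, which is what eases the deployment and supportability burden the team describes.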

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to base technology decisions on the needs of their users, rather than on what people have “used in the past”

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has selected well used and proven technologies that can be shared
  • the team intends to share some code with the MOD GitHub community

What the team needs to explore

Before their next assessment, the team needs to:

  • explore how source code can be made open to a wider audience – certainly within MOD and government, but also beyond

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has identified a number of components they are developing, for example the user authentication service, that could be reused within MOD, and is working with open standards and open source components
  • the team is using the GOV.UK Design System components where possible

What the team needs to explore

Before their next assessment, the team needs to:

  • develop the plan further to allow reuse of those additional elements and components that may be developed as part of this system, such as the user authentication service, and ensure other teams know that they are developing those components

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had started to consider the options for ensuring the roles and responsibilities required for supporting users and managing service performance in beta are captured and documented - for example, in the form of SLIs and SLOs agreed with the relevant team(s)
  • the team realises that 24x7x365 availability will be required for a future production system and is working towards this outside of Alpha
  • the team intends to use a CI/CD pipeline to ensure minimal downtime when deploying changes or enhancements to the system

What the team needs to explore

Before their next assessment, the team:

  • should continue developing the service performance and support plans in preparation for a future public beta
  • should continue to develop the thinking around AWS availability zones and how other AWS services can be used to ensure downtime is minimised, e.g. using CloudWatch to analyse metrics and gain insight into service issues (an illustrative sketch follows this list)
  • should ensure the performance requirements of the system are understood and delivered against user needs
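As an illustration of the CloudWatch point above, the following is a hypothetical sketch, using Python and boto3, of how an alarm on an error-count metric might be configured so the team gains early insight into service issues. The namespace, metric, dimensions, threshold and SNS topic ARN are all placeholder assumptions rather than details of the team's environment.

```python
# Illustrative sketch only: a CloudWatch alarm on an error-count metric.
# Namespace, metric, dimensions, threshold and topic ARN are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")

cloudwatch.put_metric_alarm(
    AlarmName="example-readiness-5xx-errors",
    Namespace="AWS/ApplicationELB",
    MetricName="HTTPCode_Target_5XX_Count",
    Dimensions=[
        {"Name": "LoadBalancer", "Value": "app/example-lb/0123456789abcdef"}
    ],
    Statistic="Sum",
    Period=300,                      # evaluate in 5-minute windows
    EvaluationPeriods=1,
    Threshold=10,                    # alarm if more than 10 errors in a window
    ComparisonOperator="GreaterThanThreshold",
    TreatMissingData="notBreaching",
    # Notify an (example) SNS topic so the support team can respond.
    AlarmActions=["arn:aws:sns:eu-west-2:123456789012:example-alerts-topic"],
)
```

A similar pattern could be applied to latency and availability metrics across availability zones as the team's thinking on resilience develops.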

Published 4 January 2021