Request A Historic Service Record

The report from the beta assessment of the Ministry of Defence's Request A Historic Service Record service on 5 March 2020

From: Central Digital and Data Office
Assessment date: 5 March 2020
Stage: Beta
Result: Not met
Service provider: Ministry of Defence

Previous assessment reports

  • Alpha assessment report: 7 August 2017, MOD internal assessment, Met

Service description

MOD holds approximately 10 million archived service records containing information on people who served in the armed forces from 1780 to 1960. The records are due to be transferred to The National Archives. Whilst held by MOD, public access to them is provided under Freedom of Information legislation. The current service provides a set of downloadable forms which can be completed and sent by post to the relevant disclosure branch (Army, Navy or RAF), with the £30 administration fee paid by cheque. The service currently receives around 30,000 requests per annum.

On receipt of a request, the disclosure branch logs it and searches for the record based on the details given. When the record is found, it is copied, redacted as required and sent to the requestor, completing the transaction. Private beta for the new service commenced in May 2019, with the key aim of improving user journeys by removing the need to post forms and payment.

Service users

Users fall into two key groups:

  • family members – users range from those just starting their journey in family history research, or making a one-off request (‘I want my grandfather’s service record’), to experienced and enthusiastic family history researchers familiar with the numerous official channels and processes involved, e.g. findmypast.com, ancestry.com, local record offices and The National Archives

  • genealogists – professional family history researchers and historians also use the service. They often work on behalf of others and request multiple records associated with a family or name. Military historians and historical researchers may be looking for details of members of a troop or regiment, and again may request multiple records at one time

1. Understand user needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team conducted user research in Discovery and Alpha to ascertain user needs which have been used during the design and iteration of the Beta

What the team needs to explore

Before their next assessment, the team needs to:

  • present how the current end-to-end service (including the online and offline parts) is meeting or not meeting user needs; this will help to identify service gaps and areas in need of improvement
  • present how the user needs have been further developed or prioritised in the Beta phase based on the additional research conducted
  • analyse web analytics to understand issues users have

2. Do ongoing user research

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team tested the beta at roadshows and conferences, e.g. The Genealogy Conference at the NEC in Birmingham, The Family History Show in York and the RootsTech Family History Conference at the ExCeL in London. This meant that an optimised version of the service was used by 180+ people
  • insights from the testing at the roadshows and conferences were used to iterate the beta and resolve some of the usability issues with the service
  • the team gathered insights about the user issues from the Disclosure Branches (Army, Navy and RAF). Disclosure Branches are the teams that deal with all requests relating to disclosure of information in MOD and provide a helpdesk facility as part of this service

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct usability testing of the end-to-end service in real conditions rather than optimised ones – for example, at the RootsTech conference the turnaround to receive a record was reduced from several months to 3 hours
  • conduct usability testing of the end-to-end service with all user types, i.e. family members and professional users (genealogists and historians)
  • test the service with people with access needs
  • present outputs from user research so the assessors can see how the user research has impacted the service
  • present how the team prioritised design changes based on user research
  • analyse web analytics to understand issues users have and investigate these problems in user research

3. Have a multidisciplinary team

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the project has a full multidisciplinary team
  • the team is distributed across the UK and appears to coordinate and collaborate well

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that they continue to transfer knowledge from contractors to permanent staff (FTEs)

4. Use agile methods

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team follows Scrum principles, is fully ‘agile’ and runs 2-week sprints
  • the team runs remote daily stand-ups to identify blockers/issues etc
  • communication methods include Trello

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to use agile tools and methods in their digital delivery

5. Iterate and improve frequently

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team uses semi-automated and fully automated testing processes to speed up development and deployment

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that dynamic error checking is in place to track service issues with real-time notifications (see the sketch after this list)
  • ensure that performance analytics and feedback are used to inform service improvement
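
The team already uses Sentry for monitoring (see point 6), so real-time error notification could be a small addition. The following is a minimal sketch only, assuming the official sentry/sdk Composer package and a hypothetical SENTRY_DSN environment variable; errors caught in the application are forwarded to Sentry, where alert rules can notify the team by email or chat:

```php
<?php
// Minimal sketch: forward unexpected errors to Sentry for real-time alerts.
// Assumes `composer require sentry/sdk`; SENTRY_DSN is a hypothetical
// environment variable holding the project's DSN.

require __DIR__ . '/vendor/autoload.php';

\Sentry\init([
    'dsn'         => getenv('SENTRY_DSN'),
    'environment' => 'production',
]);

try {
    // ... application code that may fail ...
    throw new \RuntimeException('example failure');
} catch (\Throwable $e) {
    \Sentry\captureException($e); // appears in Sentry within seconds
    // Re-throw or render an error page as appropriate.
}
```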

6. Evaluate tools and systems

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has chosen the GOV.UK PaaS cloud service and avoided on-premises hosting
  • the team chose GOV.UK Pay and GOV.UK Notify for payments and emails, avoiding lock-in contracts

What the team needs to explore

Before their next assessment, the team needs to:

  • review the remaining third-party services used (Sentry for monitoring and Travis for deployment) and the options available to the service: explore what to monitor and why before choosing tools and systems (see the Service Manual for details)
  • review the choice of web framework. While PHP and Laravel are an acceptable choice, they are less commonly used in government services than other languages and frameworks, and so carry some risk around maintenance and developer onboarding
  • review the support arrangements and what to do if a third-party service were to become unavailable or if the team had to change hosting platforms for any reason

7. Understand security and privacy issues

Decision

The service did not meet point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made the decision not to store any personal data on the service, by keeping session information in a browser cookie until the process is complete and the data is sent by email via the server, after which it is deleted immediately. Although forgoing ‘save and return’ in this way is unusual, the team has found no evidence of a user need for it, probably because the whole transaction is short enough that users are not expected to need multiple sessions to complete it
  • the team has evaluated security risks and threats that have led to the choice above

What the team needs to explore

Before their next assessment, the team needs to:

  • make users aware of the privacy drawbacks of storing session data in a cookie, and offer them a way to delete the data at the end of their interaction with the service (see the sketch after this list). A cookie set to expire at the end of the session will not expire when the user closes a tab or a window, but only when the whole browser is closed, so there is a risk that someone using the computer after the user will be able to retrieve their personal information
  • mitigate the risk that Google Tag Manager (GTM) opens the application to malicious activity: it allows anyone with admin access to the linked Google Analytics (GA) account to inject code into the service's pages (see point 3 in this article on GTM). The recommendation is to ensure that the SRO has signed off on using GTM, and that performance analysts with access to GA are not allowed to publish tags; final publishing should be done by the tech lead, a developer or someone similarly qualified
  • add a privacy policy page describing what happens to the user’s data and how they can request modifications under the terms of the Data Protection Act
  • add a cookie policy page explaining what cookies are used (both essential and optional)
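
As an illustration of offering users a way to delete their data, here is a minimal plain-PHP sketch of a ‘delete my data’ action, assuming the session data lives in a single cookie; the cookie name and attributes are illustrative assumptions, not the service's actual implementation:

```php
<?php
// Minimal sketch of a 'delete my data' action for the end of the journey.
// Assumes the session data lives in a single cookie; the cookie name
// ('service_record_request') is an illustrative assumption.

$cookieName = 'service_record_request';

// Overwrite the cookie with an empty value and an expiry in the past,
// so the browser removes it immediately instead of waiting until the
// whole browser is closed. (Options array requires PHP 7.3+.)
setcookie($cookieName, '', [
    'expires'  => time() - 3600,
    'path'     => '/',
    'secure'   => true,   // only send over HTTPS
    'httponly' => true,   // not readable from JavaScript
    'samesite' => 'Strict',
]);
unset($_COOKIE[$cookieName]);

echo 'Your information has been deleted from this browser.';
```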

8. Make all new source code open

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has published the service's source code in an open repository on GitHub

What the team needs to explore

Before their next assessment, the team needs to:

  • flesh out the documentation and explain the context in the repo, and link the repository back to the service page
  • publish a software licence and a copyright statement
  • make it easier for other services to reuse the code

9. Use open standards and common platforms

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team mostly uses the GOV.UK stack: PaaS, Notify and Pay
  • the code uses PHP and Laravel, both open source

What the team needs to explore

Before their next assessment, the team needs to:

  • consider using the country register (https://country.register.gov.uk/) and the GOV.UK country picker (https://designnotes.blog.gov.uk/2017/04/20/were-building-an-autocomplete/) to avoid having to maintain a list of countries, and to avoid a very long and unfriendly country dropdown

10. Test the end-to-end service

Decision

The service did not meet point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the code includes functional and unit tests
  • there are multiple environments for dev and testing
  • the continuous integration (CI) process runs the tests automatically
  • the team has tested major browsers and assistive technology (note that “??/??/??” doesn’t read well using text-to-speech; readable text like “not available” works better)

What the team needs to explore

Before their next assessment, the team needs to:

  • fix the continuous integration build – the GitHub repository currently shows CI as failing, and has done for more than a week
  • put significantly more work into testing, given the many crashes and errors that the panel experienced when testing the service

The panel recommends:

  1. creating automated smoke tests (see the sketch after these recommendations)

  2. making sure the application cannot be deployed if tests don’t pass

  3. setting up alerts properly so the team knows if tests fail or the application malfunctions, including making sure there is always someone to receive the alerts
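
A smoke test can be very small and still catch a broken deployment. As a sketch – assuming the service's Laravel stack, with illustrative route paths – something like the following could run in CI on every build; because the test runner (`vendor/bin/phpunit`) exits non-zero on failure, the existing Travis pipeline can be configured to deploy only when this step passes, covering recommendations 1 and 2:

```php
<?php
// Sketch of a minimal Laravel smoke test (the service is built on
// Laravel). Route paths are illustrative assumptions.

namespace Tests\Feature;

use Tests\TestCase;

class SmokeTest extends TestCase
{
    // The start page should respond successfully.
    public function test_start_page_responds(): void
    {
        $this->get('/')->assertStatus(200);
    }

    // An unknown path should return a tidy 404, not a crash.
    public function test_unknown_page_returns_not_found(): void
    {
        $this->get('/no-such-page')->assertStatus(404);
    }
}
```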

  • have a third party carry out a penetration test at the application level, and not just rely on the fact that the PaaS platform is already pen-tested
  • address the 2MB limit on uploaded files imposed by GOV.UK Notify, which is a significant limitation when it comes to uploading scanned documents. Although the team hasn't encountered the problem when testing the private beta, it is expected that during public beta users will find they can't upload photos of the required documents taken with their phones. The recommendation is to allow larger files to be uploaded, possibly by resizing them on the server before sending them to Notify (see the sketch after this list), or by storing them on a server that can be accessed by the email's recipient
  • make sure that service deployments result in no downtime by using blue-green deployment
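
On the file-size point, here is a sketch of the first option – resizing images server-side until they fit under the 2MB limit – using PHP's bundled GD extension. The file names, the JPEG assumption and the quality setting are all illustrative:

```php
<?php
// Sketch: shrink an uploaded photo until it fits under GOV.UK Notify's
// 2MB attachment limit, before handing the file to the Notify client.
// Uses PHP's bundled GD extension; assumes a JPEG upload.

const MAX_BYTES = 2 * 1024 * 1024;

function shrinkToFit(string $sourcePath, string $destPath): void
{
    $image = imagecreatefromjpeg($sourcePath);
    if ($image === false) {
        throw new RuntimeException('Could not read uploaded image');
    }

    $width = imagesx($image);
    imagejpeg($image, $destPath, 85); // first attempt at quality 85

    // Halve the width until the file is small enough (or too small to shrink).
    while ($width > 200) {
        clearstatcache(true, $destPath);
        if (filesize($destPath) <= MAX_BYTES) {
            break;
        }
        $width = intdiv($width, 2);
        $scaled = imagescale($image, $width); // keeps the aspect ratio
        imagejpeg($scaled, $destPath, 85);
        imagedestroy($scaled);
    }

    imagedestroy($image);
}

shrinkToFit('death-certificate.jpg', 'death-certificate-small.jpg');
```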

11. Make a plan for being offline

Decision

The service did not meet point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team explored how users would be affected if the service were unavailable
  • the team presented a strategy for dealing with outages

What the team needs to explore

Before their next assessment, the team needs to:

  • explore scenarios in which the application crashes to the point where it’s unable to serve error pages. There should be a static 500 page, hosted separately, to which users can be redirected so that they are not left confused and can be presented with an offline alternative (the previous paper process)
  • estimate the most likely causes for the service going offline and articulate a plan to deal with them
  • consolidate the strategy for dealing with outages so that it doesn’t rely on a single developer

12. Make sure users succeed the first time

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has iterated various elements of the service responding to findings from their user research
  • the team tested their online service with a wide range of users and showed that users are succeeding first time
  • the team considered the possibility of data integration with other parts of government, e.g. the General Register Office (GRO), to simplify the journey and reduce the burden on users, who currently have to provide a death certificate alongside their record request

What the team needs to explore

Before their next assessment, the team needs to:

  • test the end-to-end service (from finding the service to receiving the records) with people representing all user groups identified in the research. Whilst the team presented evidence that users are succeeding first time when completing an online request, the prototype was tested significantly more with one user group (family members) than others, and professional users (genealogists and historians) might have different needs and expectations
  • look more closely at the ‘offline’ parts of the end-to-end service – more specifically, what happens after a user completes the request online (keeping in mind that the user’s goal is met only when they receive the records they requested). The digital service presented at the assessment represents just a part of the user journey, whilst the other parts of the service are not being made digital. There is room to make the backstage processes more streamlined and joined up when the disclosure branches receive and process users’ requests, looking not only to reduce waiting times for users but also to reduce the cost to serve and to make the service consistent between the different providers
  • make the rules of the service clearer to users – e.g. the records will contain more or less detail depending on who requests them. Explore setting users’ expectations throughout the journey, providing relevant guidance at pivotal points rather than only on the landing page. Consider whether users would do things differently (e.g. have the next of kin make the request instead) if they knew the rules and their implications
  • monitor the upload of death certificates during public beta to understand whether the file size limitation presents an issue in the journey and, if so, aim to address it as soon as possible

13. Make the user experience consistent with GOV.UK

Decision

The service did not meet point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the service is integrated with GOV.UK Pay and Notify

What the team needs to explore

Before their next assessment, the team needs to:

  • fix the broken elements in the prototype provided for the assessment (e.g. the font defaulting to Arial and arrow images not loading on the start button) to make the style consistent with GOV.UK
  • review the tab pattern on the start page, which is not currently the recommended style. In any case, the start page should be created in conjunction with the GOV.UK team
  • review the pages for places where GOV.UK components and patterns are not behaving as per the Design System (e.g. the font on radio options being presented in bold)
  • review the copy throughout the service, as per the content review (sent separately to the team), ideally with the help of a content designer. In addition to the points in the content review, several pages unnecessarily repeat the question or call to action, which appears both as the page title and just before the form fields. Some pages also contain what seems to be page-filler copy along the lines of ‘fill in this information so we can complete your request’. Attention is also needed to content that can behave unpredictably in different screen readers – for example, the slash symbol (serviceman/woman). Prefer plain English over symbols (‘or’ instead of ‘/’)
  • add validation to required form fields, to keep them consistent with the rules of the service and to help prevent user errors – e.g. date of birth validation limiting it to the present date or a specific year (see the sketch after this list). User errors could also be prevented by keeping required fields consistent – e.g. city/town is not required on the contact details page but is required on the payment page
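
As a sketch of the validation point, Laravel's built-in validator (the service's framework) supports date bounds and required fields out of the box. The controller name, field names and the lower date bound below are illustrative assumptions:

```php
<?php
// Sketch of server-side validation using Laravel's validator.
// Controller and field names are illustrative assumptions.

namespace App\Http\Controllers;

use Illuminate\Http\Request;

class ContactDetailsController extends Controller
{
    public function store(Request $request)
    {
        $validated = $request->validate([
            // Date of birth can't be in the future; the lower bound
            // ('after:1750-01-01') is an illustrative cut-off.
            'date_of_birth' => ['required', 'date', 'after:1750-01-01', 'before_or_equal:today'],

            // Keep required fields consistent across pages: if city/town
            // is required on the payment page, require it here too.
            'town' => ['required', 'string', 'max:100'],
        ]);

        // On failure, Laravel redirects back with per-field error
        // messages, which can be rendered in the GOV.UK error pattern.

        // ... continue processing $validated ...
    }
}
```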

14. Encourage everyone to use the digital service

Decision

The service did not meet point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has considered multiple user types and has already engaged with external stakeholders to promote the service (e.g. Rootstech)
  • the team demonstrated that Google search terminology has been considered

What the team needs to explore

Before their next assessment, the team needs to:

  • address the issues in point 12 above in order to accommodate the needs of the user types highlighted by the team, e.g. genealogists
  • work with the GOV.UK content team to ensure that landing content is presented correctly, so that the user experience is simple after arriving at or searching for the service

15. Collect performance data

Decision

The service did not meet point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has begun work on calculating the cost per transaction for the service

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that the implementation of Google Analytics (GA) and Google Tag Manager is documented so that other developers can easily pick up the work
  • use GA to track specific events on the service form (e.g. validation errors, menu selections) – to provide more useful detail, these could be segmented by user type (e.g. genealogists or one-off users); see the sketch after this list
  • obtain and monitor data from offline sources such as the disclosure teams, the telephone support service or the external suppliers. This would help the team to understand how well users’ needs are being met: for example, how long users are waiting for documents, how much information they typically receive, how frequently no records can be found, and what problems they have using the online service. This could be segmented by military service, and then used to help users understand what they will get if they use the service. It could also be used to try to ensure consistency across the armed forces
  • compare the efficiency of using online and offline sources: for example, is there a difference in processing times, less need to contact users, or are there fewer errors as a result of using the online process? Such data might help to show that investment in the service results in greater efficiency, and could help to justify further improvements in the end-to-end process
  • make plans to measure user satisfaction for the end-to-end process – that is, after the service record has been received – not just after the request has been submitted
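
GA events are normally sent client-side (e.g. via Google Tag Manager), but server-detected events such as validation errors can also be recorded directly. Here is a sketch using Universal Analytics' Measurement Protocol; the tracking ID and the event names are placeholder assumptions:

```php
<?php
// Sketch: record a server-detected event (e.g. a validation error) in
// Google Analytics via the Universal Analytics Measurement Protocol.
// The tracking ID and event names are placeholder assumptions; events
// can equally be sent client-side through Google Tag Manager.

function trackEvent(string $category, string $action, string $label): void
{
    $payload = http_build_query([
        'v'   => 1,                          // protocol version
        'tid' => 'UA-XXXXXXX-1',             // placeholder tracking ID
        'cid' => bin2hex(random_bytes(16)),  // anonymous client ID
        't'   => 'event',
        'ec'  => $category,
        'ea'  => $action,
        'el'  => $label,
    ]);

    $context = stream_context_create(['http' => [
        'method'  => 'POST',
        'header'  => 'Content-Type: application/x-www-form-urlencoded',
        'content' => $payload,
    ]]);
    // Fire-and-forget POST to the GA collection endpoint.
    file_get_contents('https://www.google-analytics.com/collect', false, $context);
}

trackEvent('form', 'validation-error', 'date_of_birth');
```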

16. Identify performance indicators

Decision

The service did not meet point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • the team will be monitoring completion rate, drop-out points and user satisfaction to measure the success of the service

What the team needs to explore

Before their next assessment, the team needs to:

  • identify more service-specific measures for evaluating success – the current measures identified are the standard GDS KPIs which do not reflect all of the user needs for this service
  • run a performance framework exercise to help identify more metrics based on the specific goals of the service and the needs of its users

17. Report performance data on the Performance Platform

Decision

The service met point 17 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was proactive in contacting the performance platform team in advance about setting up a dashboard. There are significant delays in setting up dashboards, but this is through no fault of the service team

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that data for the 4 mandatory metrics is supplied so that the dashboard can be populated and published
  • put in place a process for ensuring the dashboard remains up-to-date

18. Test with the minister

Decision

The service met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had tested the service with the Chief of Defence Personnel, Lt General Richard Nugee – anecdotally, he requested and received his grandfather’s service record during the test!
Published 21 April 2020