Service Manual - Beta Assessment

The report from the beta assessment of GDS's Service Manual on 10 June 2016

Service provider: CO/GDS
Stage: Beta
Result: Met

Result of service assessment

The assessment panel has concluded the Service Manual has shown sufficient progress and evidence of meeting the Digital Service Standard criteria and should proceed to a public beta.

Detail of the assessment

Service Manager: Zuz Kopecka

Lead Assessor: Tom Dolan

User research

The team has spoken to more than 300 users, and reached another thousand through surveys.

The panel is pleased to see that since alpha the team has been actively asking their contacts to help them find the most sceptical users as further test subjects, to minimise any bias in the research findings; that they have contacted suppliers; and that users were still being interviewed around the time of the assessment to gain the most recent insight. As a result, the team was able to speak confidently, and with more nuance, about a far broader range of users than at alpha.

The team had identified the difference between new users and those who were already experts and used the manual as evidence of good practice. They also understood where people get guidance when they don't turn to the manual - for example from colleagues, local guidance within their departments and so on.

A wide range of research techniques has been used extensively to inform the site's navigation, a critical part of the service.

Needs identified at alpha have been acted upon and then iterated - for example, pages now show not only when they were updated but also the thinking behind the update. More subtle findings, such as some content coming across as slightly tone-deaf to specialists, have also been uncovered and acted upon to improve engagement.

Team

The team is clearly very passionate about the project and has all roles in place, except where it relies on GOV.UK infrastructure. The panel was pleased to see they had hired content designers from outside GDS to bring fresh perspectives to the Service Manual. A recurring team health check is in place and its findings were discussed with the panel.

The team could demonstrate they were working well together, showing examples of analytics and research insights moving through their Trello board - to which any team member feels able to contribute. They were also able to show their roadmap and talk about its relationship to the Kanban process.

While the panel was concerned that reviews of their PRs by the core GOV.UK team might slow development, it was good to see the Tech Lead’s collaborative attempts to mitigate this.

Resources are in place for the team to proceed throughout the public beta phase.

Technology

The technology approach is similar to that of various other applications on GOV.UK, and much re-use is made of existing components and infrastructure. This keeps the application relatively simple, which is a good thing. The GOV.UK team provides and manages the infrastructure and much of the deployment pipeline. The three components that are bespoke to the Service Manual are the Service Manual frontend, the Service Manual Publisher and a supporting Postgres database. The database will eventually migrate to a database server provided by GOV.UK as part of a rationalisation project.

The Service Manual frontend serves purely static content. Any active components (e.g. search, feedback, notifications) come from standard GOV.UK components. The GOV.UK Router is used to manage redirects from the old Service Manual app to the new one where content is available. The current Service Manual experience is a mix of old and new content presented by their respective apps.
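
To illustrate the routing behaviour (a sketch only, not the GOV.UK Router's actual implementation or route store - the paths below are invented):

    // Illustrative sketch: old Service Manual URLs are redirected to their
    // new equivalents once content has migrated; everything else falls
    // through to the old app, giving the mixed experience described above.

    const redirects = new Map<string, string>([
      ["/service-manual/agile", "/service-manual/agile-delivery"],
      ["/service-manual/user-research", "/service-manual/user-research/start-by-learning-user-needs"],
    ]);

    type Decision =
      | { kind: "redirect"; status: 301; location: string } // migrated content
      | { kind: "serve"; backend: "old-service-manual" };   // not yet migrated

    function route(path: string): Decision {
      const target = redirects.get(path);
      if (target !== undefined) {
        return { kind: "redirect", status: 301, location: target };
      }
      return { kind: "serve", backend: "old-service-manual" };
    }

    console.log(route("/service-manual/agile"));                // redirect
    console.log(route("/service-manual/some-unmigrated-page")); // old app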

Authentication is through GOV.UK Signon. The publisher application does not currently implement any role-based access control: any signed-in user can edit content. This is not thought to be an issue while the only users are members of the team during the beta, but it will need to be addressed as users from outside the team are brought on board. The team plans to do this in future.
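
A minimal sketch of the kind of control that will be needed (the roles, actions and user shape below are assumptions for illustration, not the publisher's actual data model):

    // Hypothetical role model for the publisher; names are invented.
    type Role = "viewer" | "editor" | "admin";

    interface User {
      email: string; // identity would come from GOV.UK Signon
      role: Role;
    }

    // Which roles may perform which actions on content.
    const permissions: Record<string, Role[]> = {
      "guide.view": ["viewer", "editor", "admin"],
      "guide.edit": ["editor", "admin"],
      "guide.publish": ["admin"],
    };

    function can(user: User, action: string): boolean {
      return (permissions[action] ?? []).includes(user.role);
    }

    // An outside contributor could then be limited to viewing drafts.
    const contributor: User = { email: "someone@example.gov.uk", role: "viewer" };
    console.log(can(contributor, "guide.edit")); // false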

CI/CD tools are used to automatically deploy a reference environment when updates are made to the code repository on GitHub. The standard offering from the GOV.UK team is used. Given the dependencies on some GOV.UK components, the Service Manual team has made pull requests against other teams' code to add features or fix bugs. Getting these changes adopted has sometimes been difficult, which has hampered progress in some areas.
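
The trigger amounts to something like the sketch below (the endpoint, branch name and deploy script are invented for illustration; the actual pipeline is the GOV.UK team's standard offering):

    // Minimal webhook receiver: redeploy the reference environment when
    // GitHub reports a push to the main branch. Purely illustrative.
    import { createServer } from "node:http";
    import { execFile } from "node:child_process";

    createServer((req, res) => {
      if (req.method !== "POST" || req.url !== "/webhook") {
        res.writeHead(404).end();
        return;
      }
      let body = "";
      req.on("data", (chunk) => (body += chunk));
      req.on("end", () => {
        const event = JSON.parse(body);
        if (event.ref === "refs/heads/main") {
          // Hypothetical deploy script for the reference environment.
          execFile("./deploy-reference-environment.sh", (err) => {
            if (err) console.error("deploy failed:", err);
          });
        }
        res.writeHead(202).end();
      });
    }).listen(8080);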

Security had been considered and the main risks are mitigated through controls within the GOV.UK platform (e.g. the CDN service mitigates attempted denial of service attacks against the frontend). A risk was noted that the publishing tools for GOV.UK are not protected from a denial of service attack; however, this was not thought to be an issue for the Service Manual, which will rarely, if ever, need to be updated in an emergency. The assessment team identified an additional risk to consider: the Service Manual would be an attractive 'watering hole' for adversaries wishing to compromise the devices used by the developers, designers and others working on government services.

No specific penetration testing of the publisher has been done so far, but it will be tested regularly as part of the periodic penetration tests of GOV.UK.

Design

The design approach is also very close to that of other GOV.UK content, and the team has made full use of the GDS patterns and the GOV.UK style guide. There are a few notable new patterns or variants, including a +/- accordion, a floating anchor link element and a page version area. These have all been implemented well, and the panel understands the patterns have already gone through several iterations. We would caution the team to keep an eye on the +/- accordion during testing, as accordions are not always fully understood by less tech-savvy users. That said, this implementation should make the behaviour fairly obvious to most.
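
For reference, the +/- accordion amounts to a toggle of roughly this shape (a simplified sketch with invented class names, not the team's implementation):

    // Each heading toggles its section body and flips the +/- indicator,
    // keeping the open/closed state visible - which is what should make
    // the pattern legible to less tech-savvy users.
    function initAccordion(container: HTMLElement): void {
      for (const section of container.querySelectorAll<HTMLElement>(".accordion-section")) {
        const heading = section.querySelector<HTMLElement>(".accordion-heading");
        const body = section.querySelector<HTMLElement>(".accordion-body");
        const indicator = section.querySelector<HTMLElement>(".accordion-indicator");
        if (!heading || !body || !indicator) continue;

        body.hidden = true; // collapsed by default; content stays in the page
        heading.setAttribute("aria-expanded", "false");
        indicator.textContent = "+";

        heading.addEventListener("click", () => {
          const opening = body.hidden;
          body.hidden = !opening;
          heading.setAttribute("aria-expanded", String(opening));
          indicator.textContent = opening ? "-" : "+";
        });
      }
    }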

We’re confident that the team will continue to iterate the design in line with GOV.UK styles and hope that they will also feed the new patterns into the central library. In particular, the page versioning pattern would be useful across government.

It is worth considering the use of infographics and other graphical elements which, used in the right way, would enhance the content. The agile methodology explainer is one area in particular that could benefit. We understand that the team already has plans in the roadmap to look at the use of graphics. In addition, simple downloadable presentations (for things like the agile explainer) would be valuable for those wanting to present to a wider group as a slideshow, instead of clicking through a website.

It was very encouraging to hear how many features are in the team’s future plans and most of the suggestions we proposed during the assessment are already on their roadmap for the product.

Analytics

The team had clearly thought about KPIs and what success looks like for the product, and had attempted to trace the impact of the manual on service assessments, though clear results were not yet forthcoming.

The team has also discovered that videos in the manual were rarely clicked on, and is attempting to find out why and whether that content is needed.
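
One way to gather that evidence is click-event tracking, sketched below on the assumption that Google Analytics' analytics.js is on the page (the ga() event call is that library's real signature; the selector and category names are invented):

    // Record a GA event whenever a video element in the manual is clicked;
    // persistently low counts would confirm the videos are rarely used.
    declare function ga(command: "send", hitType: "event",
                        category: string, action: string, label?: string): void;

    for (const video of document.querySelectorAll<HTMLElement>(".service-manual-video")) {
      video.addEventListener("click", () => {
        ga("send", "event", "video", "click", window.location.pathname);
      });
    }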

Recommendations

To pass the next assessment, the service team must:

  • Be in a position to turn off the last of the GitHub Pages content, if they have not already.
  • Have conducted further investigation into users with access needs and low digital skills, making appropriate adjustments based on those findings.
  • Be able to demonstrate that appropriately granular access controls (e.g. roles) are in place within the Service Manual Publisher application, if needed, given the wider community access.
  • Demonstrate a robust plan and confirmed resources to migrate content from their own Postgres database after going live, if this work has not already happened.

The service team should also:

  • Continue to develop their network of experts from communities across government, as well as those within GDS, and update the panel on progress at the next assessment.
  • Find a way of working with the GOV.UK teams to ensure that pull requests submitted as part of improving the Service Manual (as well as other products) are adopted in reasonable timescales. Establish an escalation path to resolve conflicts or ensure appropriate prioritisation is happening.
  • Continue to think about the relationship between disciplines and the service manual hierarchy - for example whether front-end developers or product managers should also be mentioned in the context of design.
  • Make language more consistent, as per the content review findings.

Digital Service Standard points

1. Understanding user needs - Met
2. Improving the service based on user research and usability testing - Met
3. Having a sustainable, multidisciplinary team in place - Met
4. Building using agile, iterative and user-centred methods - Met
5. Iterating and improving the service on a frequent basis - Met
6. Evaluating tools, systems, and ways of procuring them - Met
7. Managing data, security level, legal responsibilities, privacy issues and risks - Met
8. Making code available as open source - Met
9. Using open standards and common government platforms - Met
10. Testing the end-to-end service, and browser and device testing - Met
11. Planning for the service being taken temporarily offline - Met
12. Creating a simple and intuitive service - Met
13. Ensuring consistency with the design and style of GOV.UK - Met
14. Encouraging digital take-up - Met
15. Using analytics tools to collect and act on performance data - Met
16. Defining KPIs and establishing performance benchmarks - Met
17. Reporting performance data on the Performance Platform - Met
18. Testing the service with the minister responsible for it - Met
Published 22 December 2016