Street Manager alpha assessment report

The report from the alpha assessment for DfT's Street Manager service on 22 March 2018.

From: Central Digital and Data Office
Assessment date: 22 March 2018
Stage: Alpha
Result: Met
Service provider: Department for Transport

The service met the Standard because:

  • the team developed a very good understanding of user needs through the Alpha phase and ensured that these needs are reflected in the prototype
  • the team are working in an agile way and demonstrating practices that allow them to adapt to changing priorities and user needs
  • the team explained how they had made their technical choices for the prototype, including the languages and frameworks. The intention was to continue with these for beta, and the team were able to set out clearly how they would change part of the technical stack if it became necessary because of unforeseen difficulties

About the service

Description

Street Manager will be a new digital service that will transform the planning, management and communication of street and road works through open data and intelligent services to minimise disruption and improve journeys for the public. Street works are carried out by utility companies (water, gas, electricity and telecommunications) to install, repair and maintain their infrastructure. Road works are carried out by local authorities to maintain the highway or to install, e.g. cycle and bus lanes and traffic management equipment.

Street Manager is being designed for registered users from local authorities and utility companies. It will enable them to plan, coordinate, manage and communicate over 2 million street and road works that are carried out on the local road network each year. It will replace the current outdated systems, in use for over 15 years, with a modern service that has been designed to meet user needs, including the DfT’s own needs for data that will inform further policy development aimed at reducing the effects on congestion from street and road works.

It will also provide a subset of open data on live and planned works for data customers, including app developers, sat nav companies and network managers.
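To illustrate the kind of open data feed described above, the sketch below builds a single planned-works record of the sort a data customer (an app developer or sat nav company) might consume. This is purely illustrative: the field names and the helper function are invented for this example and are not the actual Street Manager schema or API.

```python
# Hypothetical sketch of an open-data record for one planned street work.
# Field names are invented for illustration; they are NOT the real
# Street Manager data model.
import json
from datetime import date

def make_works_record(reference, street, works_type, start, end, promoter):
    """Build a minimal open-data record for one planned work."""
    return {
        "reference": reference,            # promoter's works reference
        "street": street,                  # location of the works
        "works_type": works_type,          # e.g. "utility" or "highway"
        "start_date": start.isoformat(),   # ISO 8601 dates for consumers
        "end_date": end.isoformat(),
        "promoter": promoter,              # organisation doing the work
    }

record = make_works_record(
    "WORKS-0001", "High Street, Exeter", "utility",
    date(2018, 4, 2), date(2018, 4, 6), "Example Water Ltd",
)
print(json.dumps(record, indent=2))
```

Publishing records in a plain, machine-readable shape like this is what would let sat nav and mapping products flag live and planned works without screen-scraping the service.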

Service users

The users of the Street Manager service are from utility companies and the local highway authorities who are responsible for managing the local road network and carrying out their own highway works. Highways England, who manages the strategic road network, will also use Street Manager for any utility works on its network. The DfT will be a user of the data, as will the Welsh Government.

The following key user personas have been captured to date:

  • a promoter/planner of a street or road work. The user can be from either a utility company or a local authority
  • a highways authority officer. The user in the local authority who will assess and approve a permit application to do a street work
  • a network manager. The user in the local authority who manages traffic and traffic management equipment
  • a street works inspector, who will inspect the work on completion
  • a supervisor. The user who will manage or oversee a group of users applying for and issuing permits
  • a site manager. The user who will oversee work underway on site
  • DfT/Welsh Government users, including statisticians, policy makers and analysts

Detail

Overall the panel were impressed with the progress made through Alpha. The scope of the Alpha seemed to match what had been tested, and good progress has been made on the basic journey from planner to highways authority officer.

The panel did feel that more work needs to be done in Beta to explore the API elements of the service. The Service Manager explained that the real added value is for utility companies to share a knowledge base of all roadworks, so that work on the same stretch of road is not done multiple times, thereby minimising disruption.

The panel would therefore like to see the service team again just prior to Private Beta, to ensure that work on the APIs is progressing and that a good understanding of these users is being developed and tested. The panel recommend that this session takes the form of a workshop. In this workshop the panel would like to see more user journeys, covering the other user groups that were not covered in the Alpha demo. A demonstration of meeting user needs through the mapping tools is a good example of what should be covered in the session. The service team should be very clear on what they want to test in Private Beta.

The panel would also like to see recruitment to key roles within the team for Beta: specifically product owner, content designer, full time designer, performance analyst and front end developer. To ensure that there is no separation between the technical development of the service and the rest of the team, the panel are keen to see the developers integrated into a co-located team rather than developing in Ireland. This would bring a lot more cohesion to the team.

User needs

The team has identified various user groups within utility companies and local highway authorities (LHA). Eight personas have been created, although these are based on roles rather than behaviour. Research has been carried out in both rural areas and large conurbations (London and Manchester), which demonstrates a good understanding of the need to consider differences in user behaviour as the product moves forward.

The team hasn’t yet identified all of the user groups and plans to do more of this research during beta. Research is needed in particular to identify the user needs for using the API - for example, some LHAs will be building their own user interface, and the open API will enable app developers to innovate and build new services. Little is known so far about these users or their needs. The team also acknowledges there is significantly more work to do to understand deeper user needs relating to digital skills, and also the context of use - for example, teams working on site using mobile technology.

So far the research has identified more than 100 user needs, and no doubt more will emerge as the research progresses. Many of these user needs are very detailed, which can make them very difficult to work with. It’s not clear how the team are referring to and testing against these needs, but this practice should evolve with the product development during beta.

A significant amount of user research has been carried out which is to be commended. So far more than 200 hours of user research has been conducted, including 22 usability testing sessions, contextual research and surveys. Work has focused on the 2 main user journeys, but there is more research to be done on other user journeys prior to the private beta. The team has identified the dashboard and map-based tools as being the biggest challenge to resolve for users. These two areas need more research and design work, which will be ongoing.

Since the service is not public-facing, and training and support is already in place for the system to be replaced, there is no need for provision of Assisted Digital support. However the team does need to address accessibility needs. They plan to carry out a DAC review once the product is built, but there is also a need to test the product with real users with access needs (for example mild visual and cognitive disorders), especially around the mapping tools, and on mobile.

The private beta is not planned until April 2019. There is a significant amount of further research and design iteration to be done prior to that. The team plan to continue to carry out similar research activities to those so far, but there is no detailed plan. The team needs to provide a detailed research plan both for the phase up to private beta, and for private beta itself. This should include planned work to test with novice users and those with access needs, research with API users, mobile testing, more testing to ensure that all users can use the map-based tools, and end to end testing to include registration to use the service. The private beta research plan should also outline how feedback from real users will be collected and how KPIs will be measured.

Recommendations:

You should:

  • carry out research with API users to fully understand the user needs of LHAs who want to build their own user interface, and of app developers who may use the API to build products
  • test the product with novice users, those with lower digital skills, those with mild access needs, and mobile users. In particular the dashboard and map-based tools need further testing to ensure that all users can use the end to end service
  • produce a detailed research plan for the next phase during which the product will continue to evolve, and for the private beta.

Team

The team are working well and have adopted recognised agile practices. Although the team are not co-located, they have used ceremonies to ensure that during Alpha they work well as a cohesive unit. They have daily Skype calls to communicate and a 2-weekly cadence meeting. The team demonstrated a good pattern of 2-week sprints, Wednesday to Wednesday.

Currently the team is dispersed across a number of locations, with the development element based in Ireland. Whilst this is not such an issue during Alpha, once in Beta the panel would like to see the development element embedded and co-located with the rest of the team. This will support the rhythm of the sprint and prevent any loss of clarity about which user needs are being met by the product.

That said, the panel were impressed that the developers all attended user research sessions, which was encouraging. The panel were also encouraged that the whole team analyse feedback from the user research sessions.

There are some gaps in the team that the panel would like to see filled as a priority for Beta: product owner, content designer, designer, performance analyst and front end developer. The Service Manager advised that recruitment was underway for the product owner and content designer roles. The panel were also encouraged to see that plans are underway to integrate a business change element into the service team for beta. This will help when the team starts to engage more widely in Beta across departments, with utility companies, operations and policy.

The team are an empowered service team, and the Service Manager is empowered by a steering group and the SRO to prioritise. This gives the team the maximum amount of flexibility to iterate and meet user needs without impediments.

The team have few disputes, and when they occur they are resolved democratically, which is good to see; only occasionally has the Service Manager had to step in to resolve disagreements.

The service team advised that show and tells are well attended, including by the SRO (Senior Responsible Owner), which demonstrates senior level buy-in. They have also tested with the minister.

Technology

The service team explained how they had made their technical choices for the prototype, including the languages and frameworks. The intention was to continue with these for beta, and the team were able to set out clearly how they would change part of the technical stack if it became necessary because of unforeseen difficulties.

The Service Team talked through how they had used an approach for threat modelling similar to that used successfully at a number of other central government departments. Initial contact had been made with NCSC (National Cyber Security Centre) who had asked to be kept updated as the service is developed further.

All new source code that had been developed was open and reusable, with the intellectual property owned by Department for Transport. For the mapping aspects of the prototype, some code from Land Registry had been reused.

The Service Team had explored the extent of their opportunity for reusing existing code with a variety of other public sector organisations, including local authorities. Examples were given of how Verify and Notify could be used for parts of this service as development continues in the next phase. There had also been some early thought given to where new standards might arise from this service and how these might become more widely adopted.

During the pre-assessment technical call, some of the detail of the deployment environment used at this point was explained. At the assessment, some further explanation was given of how features were prioritised between the three developers, who were all based at a different location to the rest of the team. It came across that the back end development work was more advanced at this point than the front end development.

Recommendations:

You should:

  • examine the flow of how new design-related feature requests are prioritised and then delivered by the Development Team, and ensure there is at least one dedicated front end developer during the Beta phase, as the team scales up.

  • work closely with NCSC to ensure that all the threat vectors for this service are fully understood, including those which arise from the aggregation and more powerful search capability for data held by this service.

  • continue to explore whether there are better solutions to selecting locations where works are being carried out in addition to or instead of the current method of highlighting a particular area using a map. This should take account of users with a wider range of accessibility needs, and a full range of devices and types of browser.

Design

It was good to hear how the whole team, including stakeholders from utility companies and local highways authorities were involved in sketching and co-design sessions exploring solutions.

The team developed the service based on research and usability testing with users of the existing legacy system. We’d like to have seen more evidence that the team were doing the hard work to make the service intuitive to all users, including those who haven’t been trained on the legacy system.

More focus needs to be given to the content within the service; it lacked guidance and used inaccessible language. For example, one of the tasks users can perform from the homepage is titled ‘FPNs’. As per our style guide, acronyms should be spelled out the first time they’re used.

The team iterated on their designs, for example the homepage, but hadn’t shown that they’d tried any radically different approaches to problems.

In Beta it might be useful to explore a one-thing-per-page pattern for some of the multi-question forms. This may help focus users’ attention when you need to offer guidance, and will make it easier to validate as users go, helping them fail early and fix their mistakes.

It was good to hear the team were looking at work done by and talking to TfL (Transport for London) and ONS (Office for National Statistics) around some patterns around the map. In Beta the team should show they’ve explored ways to make sure the mapping patterns are as accessible as possible and explore alternative interactions for accessing the information.

When asked how they’d support users to succeed first time, the team said they planned for users to be able to get training and support from their colleagues. Peer support doesn’t seem scalable; the team should aspire to build an intuitive service that works for users without legacy system experience, and have a plan to make sure it works for users with access and assisted digital needs.

The current name of the service (’Street Manager’) doesn’t give users a clear idea of the types of things they can do in the service. When the team comes for the Beta assessment it would be good to see the team have explored more active names for the service, for example, ‘Plan and Manage Street and Road Works’.

Analytics

The team have a good understanding of what it is they are measuring. They are measuring the 4 mandatory KPIs: user satisfaction, completion rates, channel shift and cost per transaction. The panel advised the Service Team to get in touch with the Performance Platform team at GDS during Beta, who can assist with the performance analytics side of the service and advise on how to use analytics to continually improve the service.

The panel also advised the Service Team to get a full time Performance Analyst as part of the Beta Service Team.

Recommendations

To pass the reassessment, the Service Team must:

  • carry out research with API users to fully understand the user needs of LHAs (local highways authorities) who want to build their own user interface, and of app developers who may use the API to build products
  • test the product with novice users, those with lower digital skills, those with mild access needs, and mobile users. In particular the dashboard and map-based tools need further testing to ensure that all users can use the end to end service
  • produce a detailed research plan for the next phase, during which the product will continue to evolve, and for the private Beta
  • plug the gaps in the team for key roles: product owner, content designer, full time designer, performance analyst and front end developer
  • demonstrate that the development element of the team is co-located with the other elements of the team
  • examine the flow of how new design-related feature requests are prioritised and then delivered by the development team, and ensure there is at least one dedicated front end developer during the Beta phase, as the team scales up
  • work closely with NCSC to ensure that all the threat vectors for this service are fully understood, including those which arise from the aggregation and more powerful search capability for data held by this service
  • attend a workshop prior to private Beta with a clear plan of what is to be tested.

The Service Team should also:

  • continue to explore whether there are better solutions to selecting locations where works are being carried out in addition to or instead of the current method of highlighting a particular area using a map. This should take account of users with a wider range of accessibility needs, and a full range of devices and types of browser.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met

Updates to this page

Published 8 October 2019