Submit Data On Social Housing Sales And Lettings Alpha Assessment Report

The report for MHCLG's Submit Data On Social Housing Sales And Lettings alpha assessment on 8 July 2021

Service Standard assessment report

Submit data on social housing sales and lettings

From: CDDO
Assessment date: 08/07/2021
Stage: Alpha
Result: Not Met
Service provider: MHCLG

Service description

This is a digital service for Local Authorities and Housing Associations to submit data about new social housing sales and lettings to the Ministry of Housing, Communities and Local Government (MHCLG).

The new service aims to make the statutory obligation of data submission easier and, where possible, less manual for data-providing organisations, saving time for those organisations and for analysts, and giving MHCLG policy makers better quality data.

There are likely to be three main methods to submit data:

  1. Webform to submit an individual log
  2. Data upload to submit multiple logs at once
  3. API submission
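As a purely illustrative sketch (not part of the team's design), the third method might look something like the snippet below from a data provider's point of view: a system POSTs a single log as JSON to a submission endpoint. The endpoint URL, authentication and field names are hypothetical, since the real API is still to be designed.

```typescript
// Hypothetical example of an API submission (method 3). Endpoint, credential
// and field names are illustrative only – the real API does not exist yet.
interface LettingsLog {
  tenancyStartDate: string;   // ISO date, for example "2021-04-01"
  propertyPostcode: string;
  weeklyRent: number;
}

async function submitLog(log: LettingsLog): Promise<void> {
  // POST one log as JSON to a placeholder submission endpoint
  const response = await fetch('https://submit-social-housing-data.example/api/logs', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: 'Bearer <api-key>', // placeholder credential
    },
    body: JSON.stringify(log),
  });

  if (!response.ok) {
    throw new Error(`Submission failed with status ${response.status}`);
  }
}
```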

Service users

The primary users of the digital service will be data providers. Groups within this include:

  • front line housing officers in Local Authorities and Private Housing Associations. These users collate data from multiple sources and “create logs” of new social housing sales and lettings via webforms
  • middle managers of the above housing officer teams. These users are responsible for quality and timeliness of their organisation’s submission
  • business analysts within Local Authorities / Private Housing Associations. These users will upload data in bulk (multiple logs created at once) if the organisation chooses to do that as a method of submission

Data consumers include the following groups:

  • analysts and policy people within MHCLG who analyse the data to produce statistical releases, insights and policy
  • Local Authorities and Housing Associations, for example Housing Performance Managers (via data visualisations within the service)
  • external organisations, for example NGOs and academia

1. Understand users and their needs

Decision

The service did not meet point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the user researcher has made user research a team sport by including and involving the team in all research activities, which has clearly strengthened the team’s understanding of and empathy for users of the proposed service
  • the team has done research with an impressive number of data providers, which has helped them identify their pain points and needs
  • appropriate methods have been used to identify user groups and their needs

What the team needs to explore

Before their next assessment, the team needs to:

  • do more usability research looking at end-to-end journeys for different user types (or a chosen user type), and unhappy paths, across different devices. Research findings need to support design decisions and prototype iterations. It might help to focus on one user type to start with, as it was a little unclear from the prototype presented how journeys would work for all users and how design decisions were made based on research findings
  • do research with users with access needs and explore journeys for users with assisted digital needs. The team could test their service using persona profiles in the GDS empathy lab. For beta, the team could look at assistive technologies, get an expert to audit the accessibility of the service, use AssistivLabs, and think about other ways to recruit users with access needs. The team could also look at plotting the known digital confidence of their user types on a recognised digital inclusion scale
  • think about support staff as users and identify their needs when working with or using the service. This will help the service be more inclusive
  • do more research to understand the needs of users who will be using the API docs. Usability research, where a task is set, can be more helpful for understanding how well API documentation works for users than other methods that focus more on evaluation of content

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had a strong understanding of the journey the CORE data took from tenants through third parties and into MHCLG. Further, they had picked a part of that data journey to focus on that they hypothesised would deliver value across the whole chain
  • they were trying to solve the data submission problem in multiple ways - from the API for mature organisations through to the bulk upload and simple online form for those with less technical chops
  • the team was clear that, ultimately, they would look to encourage all the user organisations to work up to using the API - for greater simplicity and timeliness of getting data. However, they recognised that this might be a long journey and the online forms were still critical to prioritise

What the team needs to explore

Before their next assessment, the team needs to:

  • consider how to articulate the service for users. The service start page had no mention of what the service was, why it was the best way to submit data, or why it was replacing the old tool. A content designer will be able to help the team understand user needs better and answer questions such as how the service will help users choose the right submission option for their needs
  • consider the non-form related aspects of the whole problem. Some of their key users struggled with motivation to submit data, and with not inferring or making up some of the information. Some of those problems didn’t seem to be on the roadmap. It would be interesting to hear what the team could do to help with that
  • consider, sooner rather than later, doing some further high-level thinking about the potential areas of risk ahead on the roadmap. For example, the panel felt that there might well be plenty of complexity to unpick with accounts and collaboration

3. Provide a joined-up experience across all channels

Decision

The service did not meet point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had clearly been empowered to try lots of different approaches and prototypes to help users get case data from source through to MHCLG
  • there had been wide consultation with front line staff

What the team needs to explore

Before their next assessment, the team needs to:

  • work up plans for transitioning users and encouraging take-up of the new tool. The panel would have liked to hear more about how the team will signpost users towards the new tool and how they’ll position it against other methods of getting data to MHCLG, by having answers to questions such as: why is it important? What value does switching bring to the user?
  • consider how to use what they learn to make offline improvements. Get a content designer to look at all the journeys and see how they can improve the overall and offline service
  • think about designing for different devices. It would be good to hear more about the evidence and thinking for non-desktop use of the form tool, and to answer further questions on details such as: might users submit data via their mobile? How might that be designed for?
  • do further help desk analysis. The panel would have liked to hear more about what problems users come to the help desk with and how the team intends to tackle them. Also, while not needed to pass alpha, it would be good to have something on the backlog or roadmap about engaging with the help desk team to get them up to speed to support the service
  • plan for retiring CORE, which is another action that can be tackled during beta. It would be interesting to know what conditions need to be true for the team to be able to retire the existing outdated service

4. Make the service simple to use

Decision

The service did not meet point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made efforts to iterate on the design solution for this service, attempting to streamline information and improve usability
  • the team went into a lot of depth when presenting the data journey and showed a good understanding of the needs of their target user group, who are primarily data providers
  • the team has come up with a clear service name that indicates what the service is and who it is for

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct more usability testing with each actual user type, and with each potential user type including edge cases, and be able to demonstrate how you have iterated your design based on this feedback in a meaningful way. Visual designers, UX and data colleagues should interface here and use appropriate research techniques to assess whether data providers can successfully submit data first time with minimal assistance, and come away from the process understanding why it is important for them to submit this data and how it fits into the bigger picture
  • ensure that offline user journeys are tested as well as online user journeys: have a clear assisted digital route in place that is signposted clearly and is included in all rounds of usability testing. Apply the principles of best practice in content design to all the material touch points that your data providers have with you, including paper forms. Include stakeholders at all levels in usability testing, including frontline help desk staff, and be able to demonstrate implementation of their feedback, and how it has improved the usability and accessibility of your service
  • demonstrate clearly how they have ensured that the service works across a range of digital devices
  • recruit and retain a full-time content designer to work as part of the multidisciplinary service team who can contribute strong attention-to-detail, analytical and problem-solving skills. A primary task is to perform a content review of the service as it stands, simplifying and opening up language, framing clear questions, and providing a user-friendly and easy-to-understand information journey from end to end, using the GOV.UK style guide and Design System as references. Content across online and offline channels, including paper forms, should be reviewed and validated, using 2i where necessary. A good place to start with the content review of the online journey is the start page, ensuring consistency with the GOV.UK Design System, and rethinking the information priority, styling, and content here
  • perform a design audit across the service: ensure that it is aligned with the GOV.UK Design System, and where user evidence indicates deviating from it, contribute to the GitHub repo for that component and share your learnings so other service teams across government can benefit. A good place to start with a design audit would be to review your use of the Check answers component
  • have a robust migration plan in place for when it is time to switch over from CORE to the new service, with proper mitigation. Ensure that the start of the journey is clearly signposted on GOV.UK, and that any legacy links point the users to the right place and not to a dead end
  • ensure that at each step of the journey, the user knows what to expect next: for example, the confirmation screen at the end of the journey tells the user that they will be sent an email, but the purpose and destination of this email is unclear, and tests of the demo allowed the panel to proceed through the journey and reach this message without providing an email address

5. Make sure everyone can use the service

Decision

The service did not meet point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team profiled participants’ access needs in the usability testing that they conducted

What the team needs to explore

Before their next assessment, the team needs to:

  • consider recruiting user testing participants from within the civil service to ensure that you are carrying out user research with participants who represent the potential audience for the service, including people with access needs
  • ensure that a full accessibility audit of the service is carried out, using automated as well as manual testing tools, across offline and online parts of the user journey, including paper forms. Consider approaching an organisation like the Digital Accessibility Centre for this when relevant, but initially, ensure that you are doing hands-on testing of your service yourselves as you develop it, combining automated tools like Lighthouse with manual approaches (a small illustrative sketch of automating this kind of check is included after this list). Bear in mind that automated testing can only pick up around 30% of accessibility problems
  • ensure that you are not excluding any groups within your target audience such as frontline help desk staff, new starters who are new to the systems, or existing staff with access needs or low digital confidence. The principles of accessible service design exist to ensure that we are truly opening up our services to everyone, regardless of whether they are internal users or not, and whether their access needs are temporary, permanent, or situational
  • consider investing in internal accessibility and inclusivity training to improve awareness of the principles of accessible, inclusive service design and why it is so important to embed these right from discovery phase
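As a minimal sketch of the automated part of the testing mentioned above, the snippet below runs the open-source pa11y checker against a page and prints each issue it finds. The tool choice and the local URL are assumptions for illustration (Lighthouse, axe-core or similar would work equally well), and automated checks of this kind only catch a subset of problems, so manual and assistive-technology testing is still needed.

```typescript
// Illustrative only: run pa11y (an open-source accessibility test runner)
// against one page of the prototype and list the issues it reports.
import pa11y from 'pa11y';

async function checkPage(url: string): Promise<void> {
  // pa11y loads the page in a headless browser and applies its
  // default WCAG2AA ruleset
  const results = await pa11y(url);

  for (const issue of results.issues) {
    console.log(`${issue.code}: ${issue.message} (${issue.selector})`);
  }
}

// Hypothetical local prototype address – replace with the real one
checkPage('http://localhost:3000/start').catch(console.error);
```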

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • all the right disciplines were either in the team or being recruited
  • developers and designers were working well together
  • user research had become a team sport
  • the service owner was contributing to the team and involved in ceremonies

What the team needs to explore

Before their next assessment, the team needs to:

  • recruit a content designer - It would be great to hear how this recruitment is progressing and what their priorities will be when they join the team

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team was nimbly trying different ways of working to find what suited them best, switching from design sprints to scrum sprints. This was all the more impressive given they are part of an organisation which sounded like it had a preference for waterfall over agile
  • retrospectives were happening and were helping the team reflect and change and adapt how they were working
  • the team had used lots of awesome online collaboration tools to mitigate remote working during the pandemic and to be a really effective agile team
  • they had experimented with different prioritisation techniques to see what worked best for them
  • they utilised sprint goals

What the team needs to explore

Before their next assessment, the team needs to:

  • carry on trying to do things the right way
  • consider how they’ll celebrate and share progress across the organisation and perhaps use that as a way to shine a light on agile ways of working within MHCLG
  • figure out which very senior person they intend to test the service with at the end of beta

8. Iterate and improve frequently

Decision

The service did not meet point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team tried some ‘radical’ ideas and made strong calls to de-prioritise/stop those experiments when it became apparent that they were not the most valuable thing to progress
  • the team was keeping good documentation on what they’re doing and learning

What the team needs to explore

Before their next assessment, the team needs to:

  • do further usability testing, as the apparent lack of getting the core prototypes in front of real users is the main reason why this point is ‘not met’. This type of testing, which would help them start finding problems with the service before private beta users do, feels central to improving and iterating regularly
  • share more information on the iterations, as the panel needed to see iterations of the prototype and how research findings informed the design. It would be good if the team gave a clear explanation of how many rounds of usability testing they had done, with the number and type of users in each round. It would be good if they also specified hypotheses, the task(s) set, devices used and the issues and findings that informed design decisions

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the service team has started considering the privacy implications of gathering the data submitted
  • the team is aware that even though the data can arguably be considered free of personal information, there might be a possibility of re-identification using external datasets
  • the team is also aware of the possible data breaches that may happen and their relative risk

What the team needs to explore

Before their next assessment, the team needs to:

  • conduct a Privacy Impact Assessment on the data collected
  • reduce the data collected to the minimum needed by the service
  • prepare for and schedule a penetration test
  • write a formal threat model to establish the relative risk of possible data breaches

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had a clear overarching assumption to test: that by ‘improving the user experience for all data submission methods, while guiding organisations towards increased automation, we will reduce burden and improve data quality’
  • starting from that hypothesis, they used a performance framework to reach the metrics and KPIs they’ll track
  • they were trying to establish benchmarking where they could

What the team needs to explore

Before their next assessment, the team needs to:

  • consider qualitative evidence for KPIs. Consider whether there could be some benchmarking user research or survey to help track a before and after to prove improved user experience and reduced burden
  • publish performance data. It would be good to hear a bit more about what their beta plans are for this aspect of the standard
  • consider errors and alerting. It would be good to hear a bit about beta plans for what service level indicators (SLIs) the team will track when they release this for real users. Consider non-functional requirements of the service too

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the technology choices made so far are sound and the tools chosen are modern and well-suited for this kind of project. They include AWS, GitHub Actions and ECS
  • the technical team has held back from making choices that don’t need to be made at this point, such as what programming language or framework to adopt. These can be decided when beta starts, and even change as needs develop
  • the developers have not started building the service (as they shouldn’t) but have designed enough of the architecture to give the panel confidence that the team has the technical skills to start beta
  • the team has also demonstrated a broad knowledge of the web technologies that could be explored as potential candidates for the beta service (for example PWAs)

What the team needs to explore

Before their next assessment, the team needs to:

  • make sure that the tool and technology choices made at any point during beta are the best for the service (including CI/CD, logging and monitoring)
  • expect that those tools and technologies might change as a result of the whole team iterating the service
  • consider creating a log of decisions made regarding technology, in order to have a record that can be shared or passed on, and can even act as a reminder for the team of why things are the way they are (useful for future assessments)
  • bear in mind that even though using popular frameworks provides features like authentication and a user management interface for free, user-centred design principles still apply, which might still incur substantial additional work

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the alpha prototype code is available on GitHub
  • the repository is documented and therefore could be helpful to other teams building a new service
  • the repository shows that the team is familiar with useful GitHub features like pull requests and actions

What the team needs to explore

Before their next assessment, the team needs to:

  • keep the same level of openness for beta: working in the open from the start in order to promote government transparency and demonstrate that government services can be built by highly skilled software engineers using modern technology
  • separate out various parts of the service in different repositories, if and when this becomes necessary
  • separate out the infrastructure configuration (in this case Terraform files) from the service code, in order to make things easier should the infrastructure need to change (AWS to Azure, for instance)

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the tech stack built for alpha doesn’t use any proprietary technology. Everything uses open standards
  • the prototype kit was used to build the alpha service
  • the team is aware of cross-government work to design a tool to make it easier to create form-filling applications

What the team needs to explore

Before their next assessment, the team needs to:

  • work with the cross-government front-end community or the Design System group in order to find out what knowledge can be exchanged regarding the specific design of this service, specifically what decisions the team will make to address the sheer size of the data collection form, including huge drop-downs that will presumably intimidate new users or users with access needs
  • track the progress of, and possibly contribute to, the cross-government forms task force

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team hasn’t presented a technical solution for ensuring that the service is reliable, which is a good thing. The beta service will be created from scratch, therefore presenting a reliability infrastructure would be premature
  • however, the team’s familiarity with AWS, as well as some of the points it made regarding the possibility of working offline (for example progressive web apps), gives the panel confidence that a decent reliability infrastructure can be designed for beta

What the team needs to explore

Before their next assessment, the team needs to:

  • articulate what “appropriate monitoring [the team will have] in place, together with a proportionate, sustainable plan to respond to problems identified by monitoring (given the impact of problems on users and on government)”. A small illustrative sketch of one possible monitoring building block follows this list
  • articulate how they will “test the service in an environment that’s as similar to live as possible” - as the service standard states
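As a minimal sketch of one small building block of such monitoring, the snippet below exposes a health-check endpoint that an uptime monitor or load balancer could poll. It assumes a Node.js beta service using Express, which is purely an assumption since the beta technology choice is deliberately still open; the route name and port are illustrative.

```typescript
// Illustrative only: a basic liveness endpoint for uptime monitoring.
import express from 'express';

const app = express();

// Report basic liveness information; a real check might also verify
// database connectivity or other dependencies
app.get('/healthcheck', (_req, res) => {
  res.status(200).json({
    status: 'ok',
    uptimeSeconds: Math.round(process.uptime()),
  });
});

app.listen(3000, () => console.log('Service listening on port 3000'));
```

Monitoring proper would sit alongside this: logging, alerting and dashboards proportionate to the impact of problems on users.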

Next Steps


In order for the service to continue to the next phase of development, it must meet the Standard and get CDDO spend approvals. The service must be reassessed against the points of the Standard that are not met at this assessment.

The panel recommends the team sits a reassessment. Speak to your Digital Engagement Manager to arrange it as soon as possible.

Published 22 September 2021