Get help arranging child maintenance
DWP's Get help arranging child maintenance beta assessment report
| From: | Central Digital & Data Office (CDDO) |
| Assessment date: | 20 July 2022 |
| Stage: | Beta |
| Result: | Not Met |
| Service provider: | Department for Work and Pensions |
Assessment panel
Personal information on the assessors below will not be published on GOV.UK; it is included here for internal purposes only.
Lead assessor: Siju Salami
User research assessor: Andrew Clark
Design assessor: Amy Marie Phillips
Tech assessor: Mudjidat Sowunmi
Performance analyst assessor: Clifford Sheppard
Service Delivery Manager: George Blacklock
CDDO assurance lead: Simon Everest
Previous assessment reports
Service description
The service provides supportive information to help users understand their options when setting up a Child Maintenance arrangement.
It better enables them to decide how to initiate an effective arrangement with the other parent. It asks a set of eligibility questions which allows users to understand whether they can make an arrangement through Child Maintenance. At the end of the service the user can select the option that is right for them: to attempt or set up a Family-Based Arrangement (FBA), to consider more information before making a decision, or to start an application for a Child Maintenance Arrangement.
Service users
This service is for parents who want to set up, or get information on, a child maintenance arrangement. They fall into one of the following categories:
- Receiving Parents (RPs)
- Paying Parents (PPs)
- third parties wanting to know the options available when setting up Child Maintenance (for example, organisations such as Citizens Advice acting on someone else’s behalf)
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team built on the learning from discovery and alpha
- the team had a good view of the various dimensions users face, with a scale of positive and negative aspects at each point
- the team have a good grasp of the product and iterations made in response to user research
- the team have a lot of usage data from which to draw hypotheses
What the team needs to explore
Before their next assessment, the team needs to:
- research the digital service with more users across the digital inclusion scale. While it’s acknowledged the team iterated the call centre script in extensive rounds of research, which was applied to the digital service, the digital service itself hasn’t had enough user research attention in beta
- iterate personas to cover assisted digital needs too
- look after themselves when dealing with such a sensitive subject. Consider support for the whole team, not just user researchers
The team are blocked from recruiting users who are less confident or able to access digital services, because:
- the team are awaiting the outcome of a trial by the research operations team to determine whether it is possible to fund incentives for users to take part in research
- internal guidance post-Covid still prohibits pop-up research at walk-in centres
- the team do not have the budget or capacity within the existing central user research contract with a recruitment supplier to provide users
- growing call volumes put strain on the call centre to complete calls and move to the next call, which meant that call handlers were less able to recruit research participants

The team should conduct usability testing to provide qualitative insight behind quantitative patterns. This must be done with a wide range of users across the digital inclusion scale, building on work already done by the ‘Apply’ team.
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team were able to adapt to the termination of the call centre contract and get a digital service up and running that satisfies the policy intent to provide information before a user applies
- the team have ensured that users do not have to re-enter information and can opt to start the next part of the journey (application) with their information stored or start a fresh session
- the team have considered the sensitivity of domestic violence, which can affect many of their users, and have signposted advice as well as providing a button that switches windows so users can hide the page if someone comes into the room
What the team needs to explore
Before their next assessment, the team needs to:
- consider whether this is really a service in itself or an information page before applying
- consider where the service sits in the wider journey of child maintenance. For example, enforcement has been removed from the To-Be service map, but is still a relevant component for users.
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team have established a strong relationship with the contact centre and have built the learning from calls into the digital service. They have been involved in both the online and offline journeys
- the team designed the system to allow for online and offline starts and continuations to the journey
- because the system generates a reference number and some of the users are vulnerable, the team has taken care to provide options for how people receive the number (e.g. they might not want a text showing up on their phone)
What the team needs to explore
Before their next assessment, the team needs to:
- continue to refine the transition to the application service
- test the full end-to-end journey to ensure that users can complete the whole process, from information through to a decision in the application. While the project might own only the information part, to the user it is all one DWP transaction to accomplish their goal
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using GOV.UK Design System and patterns effectively
- they have solicited feedback on their content from Citizens Advice to ensure the language is simple and the guidance clear
- there is a content designer on the project
What the team needs to explore
Before their next assessment, the team needs to:
- ensure they have tested the content with users as well
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has had an accessibility audit as well as tested the service with colleagues who have various accessibility needs
- accessibility checks are built into their development UAT for each release
What the team needs to explore
Before their next assessment, the team needs to:
- recruit and test with users who have accessibility needs. An audit is not a substitute for speaking to real users. Where recruitment is a challenge, consider engaging a specialist recruiter
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the service had a well-formed multidisciplinary team, underpinned by permanent staff and augmented with contractors where necessary
- while there has been some churn of staff, there was evidence of knowledge transfer
What the team needs to explore
Before their next assessment, the team needs to:
- consider whether more structured knowledge management would be beneficial to reduce knowledge loss (for instance on personas) as digital teams across government experience an uptick in churn
- consider whether some further performance analysis bandwidth would be beneficial. While there is strong evidence of good work and support from performance analysis, the panel wondered if this support may be spread too thin
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the service had well-established ways of working, based on scrum, that support agile delivery, as evidenced by a regular delivery cadence
What the team needs to explore
Before their next assessment, the team needs to:
- consider how governance (beyond the team) can be leveraged to unblock critical issues such as limited access to users
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team showed that there were 6 major iterations (versions) of the prototype
- options are being considered to remove the need for a URN to ‘stitch’ steps of a user’s journey together
- the team had the ability to release frequently
What the team needs to explore
Before their next assessment, the team needs to:
- test prototypes with more users. While themes can start to emerge after a small number of usability tests, significantly more users are needed, cutting across user groups, to build confidence and avoid blind spots of unmet needs
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team have engaged with the security risk assessors, and no issues were raised
- internal security testing was conducted, including anti-DDoS testing, a DPIA and risk assessments; CMG followed the DWP Enterprise Security and Risk Management (ESRM) processes
- the GHACM cookie policy will follow the standard GOV.UK Design System pattern
- accessibility testing has been conducted through an external accessibility audit
What the team needs to explore
Before their next assessment, the team needs to:
- resolve API delivery from external teams
- plan to conduct continual security, microservices, API and performance testing
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that the team:
- had created a set of success metrics that were derived from the high-level goals of the service wherever possible
- had a good range of well-defined supporting metrics which related back to the needs of users and the business
- had a plan of other areas of analytics work to develop, including how they will join up data from different parts of the journey
- had good examples of how they had used data to help them improve the service
- were able to provide data as questions arose throughout the assessment; this helped the panel and demonstrated that the service team had easy access to relevant data
What the team needs to explore
A performance analyst currently works on the team on an ‘as required’ basis. The panel would like the team to consider having some dedicated performance analyst time to monitor data and ensure that insights are spotted quickly and brought to the team’s attention
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team have decided to use cloud technology with microservices and an event-driven architecture where possible
- the team have shown they considered their choice of technology, using AWS ECS as a tactical platform and building on a DWP common platform with CASA, an internal DWP framework
- the team have shown an effective approach to managing the legacy GOV.UK Notify technology
- the team have made technical choices aligned with DWP’s established technology
What the team needs to explore
Before their next assessment, the team needs to:
- make good on the plan to migrate the technology to containers when HCS is more robust
- consider adopting the common strategic service DWP Notifications when CMG collectively decides to do so, in order to align with that strategy
- consider further the Applications Reference Architecture (ARA), for example DWP Management Information and DWP Analytics
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- code is stored in GitHub
- the team have used open source software where possible, including CASA and Node.js
What the team needs to explore
Before their next assessment, the team needs to:
- explore an alternative to the bespoke ERMIS solution
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team made use of the open standard MDTP pattern for the web front end and public API
- there was iteration and enhancement of existing government service architecture
- the team has complied with the WCAG 2.1 standard
- the team follows the GDS compatibility standards
- the service was built using DWP’s open-sourced CASA framework, with reuse from the Apply service
- they adopted and adapted contributions from the GOV.UK Design System
- CMG services have incorporated cross-government standards, including Apply and Calculator
What the team needs to explore
Before their next assessment, the team needs to:
- continue exploring open standards in the GOV.UK Design System
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- CMG is working towards using DWP’s standard CI/CD
- the service will be able to deploy software changes regularly, without significant downtime; the system runs 24/7 with a 99.99% availability SLA
- CMG has a clear DR process
- the eligibility system had passed CDDO alpha in June 2021
- there is a clear CI/CD process with elements such as GitLab, Jenkins and Terraform to create infrastructure as code in AWS
- QA is tested on multiple devices
- Standards are based on DWP’s CI/CD pipeline
- there is appropriate monitoring in place, together with a proportionate, sustainable plan to respond to problems identified by monitoring, adhering to the DWP engineering and DevOps principles for alerting (monitoring is aligned with AWS CloudWatch)
What the team needs to explore
Before their next assessment, the team needs to:
- resolve the dilemma of whether this is an independent service or another branch of the Apply journey