Pay a late filing penalty alpha assessment

The report from the alpha assessment for BEIS's pay a late filing penalty service on 7 September 2017.

From: Central Digital and Data Office
Assessment date: 07 September 2017
Stage: Alpha
Result: Met
Service provider: Companies House

The service met the Standard because:

  • The service team has validated assumptions made through user-focused development and testing of a prototype
  • The service team is well structured, has good ways of working, and operates lightweight, appropriate governance
  • The team has clearly learnt from working with a third party supplier and has good engagement with stakeholders and other parties, such as HMRC, which has resulted in changes that benefit the service’s users
  • The team has designed a simple architecture in line with what is a reasonably simple service, with good integration to back-office finance systems
  • The team has clearly thought about appropriate metrics to help test a service that will best meet the needs of service users

About the service

Description

The service enables users to pay penalties and other debt recovery costs associated with the late filing of company accounts online.

Service users

Company directors, and accountants filing accounts on behalf of their clients.

Detail

User needs

The team made an assumption that users would want to pay late filing penalties online, and tested this via a survey of people who had filed late, plus interviews with accountants and company directors. Although the discovery was limited, and not quite as thorough as the panel would have expected, the panel feels the team made the correct assumption, and the discovery confirmed it.

The team has continued to conduct contextual research throughout the development of the alpha. This has involved further interviews with accountants, as well as speaking to support staff and spending time listening to calls at contact centres. From this research (and some secondary research) the team has developed 3 personas, plotted across key points on the digital inclusion scale, which they then used for recruitment throughout the alpha.

Research during the alpha has involved testing against these profiles, and has also included some testing with people with disabilities. Testing has been conducted at trade shows and in the lab. The team mentioned conducting some research in libraries to reach people with low digital skills; however, the panel was unsure whether they would find true users of the service there.

In testing, participants have been shown the letter, and then taken to the first screen of the alpha. The findings from this research have been helpful in developing the alpha so far. However, going forward, the panel agreed that the team needs to test the whole user journey - starting at the letter, and asking participants to complete the journey from there. This will involve users taking key information from the letter, for example the URL of the service or alpha, and using it to find the service. This will help identify any further issues with the journey, and build confidence that users can complete the whole journey first time.

Overall the team is user focused, with all members attending research sessions, and research being streamed back to Companies House so it can be observed by other stakeholders.

Team

The panel was delighted to see that the service team clearly works well together, and has a real desire to make the service as simple and as useful as possible for its users. The team is well balanced with all appropriate roles filled. There is an appropriate blend of permanent and contract staff, with key roles filled by permanent Companies House staff, and plans in place to replace the contract Content Designer with a permanent member of staff.

Whilst, due to capacity limits, the service was originally built by the external supplier, Advanced Systems, with oversight from Companies House, the panel recognised the efforts to bring development back in house, and was pleased to hear that the service team had learnt much from working with that external supplier. The plan to take development of the core service in house, with integration to the finance system developed in conjunction with Advanced Systems, is welcomed.

Ways of working are good, with daily communication across the team and regular, effective communication with appropriate stakeholders, including lightweight, appropriate governance, for example via the monthly Change Approvals Board. The panel was particularly pleased to note the engagement with finance - clearly evidencing how the service team had demonstrated the effect of using complex finance language on users to finance colleagues. Such engagement is exemplary. Other engagement, for example with call centre trainers, is also commended.

Technology

The service team presented a one page diagram showing the architecture behind the digital service. This was well designed, avoiding complexity, and the service team clearly understood how it works. The service team had considered possible threats to the service and how they would deal with them. The common government platform GOV.UK Pay is integrated with the service.

The approach so far has been to use a third party for development, for expediency. This has meant that the intellectual property is not currently owned by the department. While the code developed so far uses an open source solution, it is not published. The service team has recognised this, and advised that the next phase of development will instead be carried out in house, as this will give more flexibility.

Design

We were pleased to see the team had followed an iterative design approach. The team was able to show the different design directions it explored. During the Alpha phase, it investigated and tested nine different versions. At the beginning, the team considered a free-text field to let users pay part of the penalty and also choose their own payment date. Through testing, the team learned these options don't work for users, and simplified the payment flow.

The service was developed using the GOV.UK prototype kit, the ‘Check before you start’ pattern and the GOV.UK Pay component, so it applied common design patterns throughout the Alpha phase. During these iterations, the team made changes to the content order, such as placing information above the start button, which made it easier for users to see critical content. The team also changed the name of the service during the Alpha phase, as users were confusing it with paying penalties to HMRC.

As the GOV.UK Pay component was one of the central parts of their rather simple service, we were pleased to hear the team had regular interactions with the GOV.UK Pay team, including sharing lessons with its designers and user researchers. For Beta, the team considers using GOV.UK Notify for sending the confirmation email.

The team consulted a team at the DVLA that also deals with paying penalties. They are considering using similar key performance indicators in the Beta version of the service, but decided on a different interaction design flow as it works better for their users. The team demonstrated edge cases where users had to pay increased fees, and how these are displayed.

The team demonstrated, with a video recording from a user test session, how users succeeded first time. An offline version of the service will continue to be offered via phone.

Only two users with particular accessibility needs have tested the prototypes so far. For the Beta phase, the service team needs to develop a clear plan to include users with accessibility needs in all future tests. The team discussed working with the Digital Accessibility Centre (DAC) in Beta, and should do so early in the phase. They have reached out to assisted communities and expert groups in Wales and LA.

The journey starts with a letter which is developed by another team in the communication department. The service team should work closely with them to make sure the language in the letter and the digital service is aligned. The letter should point to a short address instead of Companies House’s general presence on GOV.UK, as it does now – particularly since the service would not be listed directly on the Companies House landing page. The team talked about testing the end-to-end journey with users. We were not sure if this was the case in all of the tests, as no short URL to the prototype seemed to exist. This needs to be the usual test setup in Beta. Also, users should be allowed to opt out of the paper letter and receive it electronically instead.

During the assessment, the team was not able to describe the alternative journey in which a user has lost the letter and penalty reference number, or to explain how the user would recover the reference number, for example via the phone channel. They need to clarify this in the Beta phase.

Analytics

Although the team is currently prototyping a small service, the panel was really pleased to see the thought and engagement that has already gone into data. For example, the service team plans to engage with the DVLA, which runs similar penalty services, to understand how to measure user satisfaction with a service where people are paying fines.

The service team is also engaged directly with the Performance Platform team at the Government Digital Service, and has agreed to prototype different versions of the Performance Platform.

The team is looking at a range of KPIs beyond the 4 mandatory indicators, for example: visitors paying more than one penalty fine in one session, monitoring telephone calls for any reduction in volume, and trying to identify whether people pay sooner.

Recommendations

To pass the next assessment, the service team must:

  • Broaden the research conducted, focusing on the following:

    • conducting contextual interviews to develop the personas further
    • further understanding the more detailed needs of users, for example by user type, disability and so on
    • ensuring research is conducted with people with disabilities - ideally in each sprint, and not limited to the DAC audit the team spoke about
    • ensuring testing replicates the full journey users will experience, starting only with the letter containing an appropriate URL pointing them to the digital service
    • conducting contextual research and user testing with people for whom English is not their first language

  • Undertake the planned accessibility audit
  • Work with the Communications team in the iteration and testing of the letter that goes out to users
  • Understand how users might access the service if they have lost the letter

  • Ensure that intellectual property for this service is owned by the department
  • Give further consideration in the next phase as to how others could reuse the code they develop

The service team should also:

  • Clarify an alternative journey for when a user has lost the letter and penalty reference number, for example via the phone channel
  • Consider where the service could provide nudges to sign up to filing email reminders, for example, on payment confirmation pages
  • Engage with HMRC regarding any work that they have undertaken to improve filing compliance, for example, around email reminder sign-up rates
  • Continue to explore areas closely associated with the payment service, for example, those highlighted in the Service Briefing such as:

    • integration into the Companies House search and filing platform
    • payment by instalments

Next Steps

You should follow the recommendations made in this report before arranging your next assessment.

Get advice and guidance

The team can get advice and guidance on the next stage of development by:

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Not met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 19 November 2020