Sign up to report your income and expenses quarterly - beta assessment

The report from the beta assessment for HMRC’s Sign up to report your income and expenses quarterly service on 7 November 2017.

From: Government Digital Service
Assessment date: 07 November 2017
Stage: Beta
Result: Met
Service provider: HMRC

The service met the Standard because:

  • The team has a thorough grasp of who its users are and a way of working that continually identifies their needs and uses them to inform change.
  • The team iteratively evolves the service with a good choice of technology, tools and ways of working.
  • The service team has a well-defined plan for Public Beta, backed by the right team, technology and uptake strategy.

About the service

Description

The service lets eligible sole traders and landlords learn about and voluntarily sign up to Making Tax Digital for Business (MTDfB) quarterly reporting, and then submit reports about their income and expenses via approved third-party software or a mobile app.

The service is a single-transaction service: once complete, it creates certain obligations for the user, such as filing reports quarterly and going paperless. Currently, the service is only available to users who have previously registered with HMRC for Self Assessment and fall into one of two categories:

  • Sole traders (self-employed individuals with a single source of income), excluding partnerships
  • Individuals in receipt of rental income

The service also allows Authorised Agents to sign up eligible users.

Users will then need to download compliant third-party software, which will send their quarterly updates to the MTDfB platform via an API.
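As a rough illustration of this flow, a third-party app would assemble the user's figures for the quarter and POST them to the platform. The endpoint, field names and token handling below are illustrative assumptions, not the real HMRC API:

```python
import json
from urllib import request

def build_quarterly_update(nino, period_start, period_end, income, expenses):
    """Assemble a quarterly update payload (field names are illustrative)."""
    return {
        "nino": nino,
        "periodStart": period_start,
        "periodEnd": period_end,
        "income": round(income, 2),
        "expenses": round(expenses, 2),
        "profit": round(income - expenses, 2),
    }

def submit_update(payload, token, base_url="https://api.example.invalid/mtdfb"):
    """POST the update with a bearer token (hypothetical endpoint, not called here)."""
    req = request.Request(
        f"{base_url}/quarterly-updates",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )
    return request.urlopen(req)

payload = build_quarterly_update("QQ123456C", "2017-04-06", "2017-07-05",
                                 income=8200.00, expenses=1450.50)
print(payload["profit"])
```

The point is only that the user's software, not the user, talks to the platform; the real API contract is defined by HMRC and the approved vendors.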

Users with more complex tax affairs will be included in later iterations of the service.

Service users

The users of this service are anyone who is a:

  • Self Assessment-registered sole trader with a single source of income (excluding partnerships)
  • Self Assessment-registered individual receiving rental income (even via a letting agency)
  • HMRC-authorised agent (accountant)

Detail

The MTDfB programme is an important HMRC initiative to reduce the tax burden on small businesses, increase visibility of tax due, allow tax to be reported and paid more frequently, and move to paperless accounting and tax affairs.

This service is an enabler for selected early adopters to start using the MTDfB platform before it becomes mandatory in 2020. One of the biggest benefits for eligible citizens is fully paperless income and expense reporting; the service should also incentivise third-party vendors to develop suitable bookkeeping applications, easing the reporting process for business people and allowing them to focus more of their effort on their business. While most approved software products are likely to be paid for, the collaboration agreement between independent vendors and HMRC requires free licensing for certain types of user, most likely those under a specific yearly income threshold.

The scope of this assessment is to allow the service team to expand the service's coverage in Public Beta and to increase its credibility by adding it to the GOV.UK platform. The intended target is to reach 50,000 signed-up users within the next 12 months.

This service is designed as a single short transaction with simple and distinctive user journeys.

User needs

The team has a thorough grasp of its users, understanding them not just through categorisations like ‘landlord’ or ‘sole trader’ but by their levels of motivation to sign up, their beliefs about income tax reporting, their levels of digital experience and confidence, and the sort of help they are likely to need. The team provided specific examples of how the needs of these users informed change within the service as well as within policy, content and support teams whose work influences the service user’s experience.

Although subscription to the service is currently voluntary, the team is taking measures to support users who may choose to sign up but lack the skills or confidence to do so. By observing the questions that users ask, where they look for help, and what language they use when they do, the team is able to label guidance and position support links in a way that users are likely to find. Beyond this, users who are unable to succeed using the digital channel have recourse to service-specific support via phone or a face-to-face visit with a specialised contact centre.

The team’s research to date anticipates and addresses many of the difficulties that live users would otherwise encounter, like specifying an accounting method or describing income sources. For other potential issues that wouldn’t surface in a lab setting with a prototype, the team intends to conduct research with volunteer subscribers as they actually sign up.

The panel’s recommendations align with the team’s plans, namely:

  • To research whether users of assistive technology are able to succeed in their own environments using their own tools. (Must)
  • To research whether users who lack digital skills or confidence are able to succeed, how they interact with support channels, and what adjustments to the digital interaction or support model promote success. (Must)
  • Research whether users, when exposed to the Tax Reference Number (that lets them continue to provide updates when Government Gateway credentials aren’t available), retain this number and what may ensure that they are able to retrieve it when they need it. (Should)
  • The ‘Go paperless’ step in the journey uses pages from another service, and the team continues to see users having difficulty with this step, as it presents a false choice to the user. The team should continue to relay findings from research to the appropriate product owner.

The panel also recommends that the team differentiate between observed user needs, like ‘to understand what I need to do with accounting software’, and supposed benefits, like efficiency or productivity. Needs which the team observes should form the basis of design changes. Supposed benefits may form the basis of research questions like ‘How do users improve productivity?’ and ‘How might this service enhance their productivity?’

Team

The team is currently made up of 2 developers, 2 test analysts, a user researcher, a technical architect, a product manager, a service manager, a content designer, a business analyst and a data analyst. One of the developers acts as technical lead and also as scrum master for the team.

The team operates agile Scrum methods and provided evidence of iterative design and development. This was demonstrated to good effect in the assessment with an example of how the design evolved following user research.

The team know their SIRO and SRO, feel well supported, and have a certain level of autonomy.

Technology

The team have chosen a modern and common development stack, heavily influenced by the larger HMRC Digital ecosystem. The stack includes open source languages and platforms such as Scala, Play, MongoDB, GitHub and Kibana. The service also reuses some common HMRC functionality packaged into libraries. The release process is semi-automated but well scripted, based on popular tools such as GitHub and Jenkins. The testing stack includes tools like Gatling, ZAP and WAVE and is partly manual, due to the test data initialisation process. The service can be tested end to end when required, independently of the availability of integration systems.

The release pipeline includes the usual DEV, QA, STAGING and PROD environments and, although not fully automated, releases are done with minimal manual effort; service code can be updated 2-3 times a day when required. The panel recommends automating testing and deployment as much as possible and looking for an automated process for anonymising live data to synthesise adequate test data.
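A minimal sketch of the anonymisation approach the panel recommends: deterministically masking identifying fields in live records, so synthesised test data keeps its shape and referential integrity without exposing real taxpayers. The field names and salt are illustrative assumptions, not the team's actual data model.

```python
import hashlib

SALT = b"test-data-salt"  # assumption: a per-environment secret, rotated regularly
IDENTIFYING_FIELDS = {"nino", "name", "utr"}

def anonymise(record):
    """Replace identifying values with short, stable salted hashes;
    non-identifying fields pass through unchanged."""
    out = {}
    for key, value in record.items():
        if key in IDENTIFYING_FIELDS:
            digest = hashlib.sha256(SALT + str(value).encode()).hexdigest()
            out[key] = digest[:12]  # same input always maps to the same pseudonym
        else:
            out[key] = value
    return out

live = {"nino": "QQ123456C", "name": "A Taxpayer", "income": 8200.00}
test_record = anonymise(live)
print(test_record["income"], test_record["nino"] != live["nino"])
```

Deterministic hashing is one common choice here because records that referenced the same taxpayer in live data still reference the same pseudonym in test data.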

The service integrates mainly with other HMRC services and the Government Gateway IDAM service. Close collaboration with dependent teams across HMRC and stakeholders was demonstrated and the team is independent enough to make design changes.

The team are Coding In The Open where they can and demonstrated their public GitHub repositories with an active commit history.

The panel was pleased to learn that the service reuses a lot of common functionality from the HMRC Digital estate; however, no applicable GaaP shared services have been reused, such as Verify, PaaS or Notify. The service team have benefited from HMRC Digital and MTD in-house components and have no control over the choice of integration points for IDAM or messaging. We would still advise them to explore and consider Verify as an additional IDAM option, and Notify for messaging if such functionality is introduced.

The team have considered data privacy and capture as little information as possible, only for the duration of the transaction. HMRC MTD has a central data protection team who will further check this service for GDPR compliance.

The technical team is knowledgeable about the main security threats and is regularly supported by a central HMRC security team who perform threat modelling, assessments and penetration tests. The service's SIRO is also known.

Regarding service resilience, the team explained the criticality of the service and the impact of it going offline, which is limited while the service remains voluntary. The deployment model includes multiple AWS Availability Zones and load balancers to guarantee resilience to performance hits and outages. The operational model appears a little ad hoc at the moment, with people on the team providing voluntary PagerDuty development support; however, the service benefits from a central HMRC Digital web operations team providing 24/7 cover, equipped with detailed service runbooks.

Design

The team demonstrated a good understanding of design patterns and the service was consistent with GOV.UK. They worked closely with the GOV.UK content team to design their guidance and continue to feed back relevant research findings to the GOV.UK team.

The team had iterated and changed the journey based on research with users. In one example, they discovered that some users don’t always know whether they are landlords or self-employed, and the team changed the content several times based on users’ own understanding of the subject. When testing the guidance they found they needed to offer alternative methods of support for some users who struggled with the existing web chat, so a support phone number was introduced.

It’s clear they have spent time making the language and content easier to understand. In one example the team talked about how they reduced a 6-page non-contractual terms and participation document to a shorter bulleted list, which the team wants to continue to improve based on research. It would be good to see this happen.

It’s good to hear how closely the team has worked with other service teams and departments to help improve the end-to-end user experience. They have done a great job sharing research findings with the HMRC teams that own parts of the journey, contact centre staff and other stakeholders to help them understand the value of designing based on users’ needs.

The assessment team raised concern about a part of the journey where users are presented with a false choice about going paperless when using the service; this component is shared with other services. The service team acknowledges this and has taken steps to own and iterate this part of the service, to create a pattern that works for mandated-paperless services like theirs.

On the confirmation page the user is presented with an Income Tax reference number, and it was unclear why the user was being given this. The service team mentioned that it is a useful identifier if the user forgets their Government Gateway login details. It’s not clear how this relates to the user journey; if there is no evidence for it, we would recommend removing it.

Analytics

The team employs technologies such as Google Analytics, Kibana and Grafana to monitor and capture performance data from the service and users’ interactions. They also run ad hoc customer surveys at the end of the service journey.

The team have had their performance dashboard allocated on the government Performance Platform (https://www.gov.uk/performance/report-income-expenses-quarterly); however, they are not currently publishing the four mandatory KPIs, only the total number of transactions.

The team should work out how to measure the four mandatory KPIs and report them regularly to the performance platform in beta.
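The four mandatory KPIs are cost per transaction, user satisfaction, completion rate and digital take-up. Two of them can be derived directly from counts the team already collects; a minimal sketch, with illustrative numbers:

```python
def completion_rate(completed, started):
    """Proportion of started digital journeys that were completed."""
    return completed / started

def digital_take_up(digital, phone, face_to_face):
    """Share of all transactions handled through the digital channel."""
    total = digital + phone + face_to_face
    return digital / total

# Illustrative figures only, not real service data
print(round(completion_rate(1800, 2400), 2))
print(round(digital_take_up(1800, 150, 50), 2))
```

Cost per transaction and user satisfaction need extra inputs (running costs and the end-of-journey survey the team already runs), but follow the same simple ratio pattern.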

Recommendations

To pass the next assessment, the service team must:

  • Research whether users of assistive technology are able to succeed in their own environments using their own tools.
  • Research whether users who lack digital skills or confidence are able to succeed, how they interact with support channels, and what adjustments to the digital interaction or support model promote success.
  • Capture the four mandatory KPIs and regularly report them to the Performance Platform.

The service team should also:

  • Research whether users, when exposed to the Tax Reference Number (that lets them continue to provide updates when Government Gateway credentials aren’t available), retain this number and what may ensure that they are able to retrieve it when they need it.
  • Continue to relay research findings about the ‘Go paperless’ step to the appropriate product owner. This step uses pages from another service and presents a false choice to the user, and the team continues to see users having difficulty with it.
  • Automate testing and deployment as much as possible and establish an operational support model as required by business needs.
  • Consider the use of Verify as an option for identity assurance for the appropriate types of users. Contact the GDS Engagement team and discuss if and how Verify can be introduced to the service.

Digital Service Standard points

Point Description Result
1 Understanding user needs Met
2 Improving the service based on user research and usability testing Met
3 Having a sustainable, multidisciplinary team in place Met
4 Building using agile, iterative and user-centred methods Met
5 Iterating and improving the service on a frequent basis Met
6 Evaluating tools, systems, and ways of procuring them Met
7 Managing data, security level, legal responsibilities, privacy issues and risks Met
8 Making code available as open source Met
9 Using open standards and common government platforms Met
10 Testing the end-to-end service, and browser and device testing Met
11 Planning for the service being taken temporarily offline Met
12 Creating a simple and intuitive service Met
13 Ensuring consistency with the design and style of GOV.UK Met
14 Encouraging digital take-up Met
15 Using analytics tools to collect and act on performance data Met
16 Defining KPIs and establishing performance benchmarks Met
17 Reporting performance data on the Performance Platform Met
18 Testing the service with the minister responsible for it Met
Published 30 July 2018