Digital Trade Finance Service Alpha Assessment Report

From: Central Digital and Data Office
Assessment date: 25 August 2020
Stage: Alpha
Result: Met
Service provider: UK Export Finance (UKEF)

Previous assessment reports

N/A

Service description

The service allows banks to provide financial support to UK exporters who need to fulfil a contract by submitting an application on their behalf to UKEF, whose advisors process those applications and deals.

The service has been developed in 2 parts: the public-facing portal, an online form that allows partner banks to submit Export Working Capital and Bond Support transactions to UKEF; and the back-end workflow case management system, an internal system for managing the processing of new deals, facilities and amendments to deals that have already been processed.

The public-facing portal has previously passed an internal alpha service assessment undertaken by the Department for Business, Energy and Industrial Strategy (BEIS). This alpha assessment report is based on assessing the workflow case management system. The feedback takes the end-to-end service - the front-end and the case management system - into consideration, as the team has been continuing to work on both elements simultaneously.

Service users

The public-facing portal is used by relationship managers, makers, checkers and the operations team within banks.

The case management system is used by underwriters, credit risk analysts, export finance managers, post-issue management officers, inputters, approvers, and due diligence officers within UKEF.

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made good use of qualitative research (e.g. interviews, observations, workshops)
  • the team has developed a strong understanding of user needs (for example, developing detailed personas and user need statements), successfully identified pain points for each user group, and prioritised them effectively
  • user research and the identification of user needs are at the heart of product development and are thoroughly understood by all team members
  • the team has created a service blueprint, user journey maps, and user experience maps
  • the team has made good use of usability testing and moderated testing to test new concepts and ideas in Alpha

What the team needs to explore

The team does not have a wide user base to test with and has identified a risk of research fatigue developing amongst their usability testers in Private Beta, as users feed into multiple research sessions. The team has also struggled to identify users with low digital confidence (all users they have tested with have scored 6 or above on the Digital Inclusion Scale) or accessibility needs.

Before their next assessment, the team needs to:

  • widen their user base, including identifying users with assisted digital needs and low-confidence users. The team should focus on seeking additional users from DIT. An alternative way of expanding the user testing base is to seek help through social media channels (for example, the UKEF Twitter account has over 7,000 followers)
  • develop a detailed user research plan for Private Beta, to include:
      • research methods (primary and secondary) and the proposed approach to iterative user-centred design
      • an accessibility plan, including testing with users with low confidence and accessibility needs
      • mapping success criteria to users' needs, integrating quantitative analysis with qualitative analysis (for example, tracking a user's journey to understand friction and pain points and comparing this with user feedback)

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team understands the constraints in delivering this service (shifting users from the existing service, technologies prescribed by the UKEF IT strategy, and the service being underpinned by a legal document, the 'Master Guarantee Agreement') and has made progress in working around some of them
  • the team has used service blueprints to map out the end-to-end service and to identify what terminology is used in every task
  • the scope of the service has been revalidated based on research with a wide range of users and potential users including exporters and legal and compliance teams
  • the team has spoken to various international government ECAs (Export Credit Agencies) to understand how they’re delivering digital services in this space
  • the team is sharing knowledge and discussing their design challenges with teams in DIT, Home Office, Homes England and HMCTS working on case management
  • the Service Owner is a member of the cross-government import/export service community
  • the engagement across the agency ensures all UKEF teams are moving along on the journey together

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to build users' trust in the data in the service as the single source of truth. The team mentioned this is one of their biggest challenges and is mitigating it through user research and co-design with users; this work should continue into private beta to ensure successful adoption of the service
  • continue to re-assess whether the scope of the public-facing service can be extended to banks that do not have delegated authority, and include this on the roadmap. Applications from these banks currently arrive by email and on paper, meaning UKEF users need to re-enter the information into the old workflow tool; extending the scope would significantly improve the end-to-end service

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has worked hard to find a balance between the underwriters and credit analysts, who, even after multiple workshops, have conflicting expectations about the priority of the information that needs to be passed through the service for each case. The team will be testing further iterations of these screens and will continue to iterate based on the feedback from those sessions
  • the team has co-designed, and will continue to co-design, the service with its users, including underwriters, credit risk analysts, export finance managers, post-issue management officers, inputters, approvers and due diligence officers within UKEF. From this, the team is aware of the operational impact the service will have
  • users who need assisted digital support will be able to contact specific members of the service team by phone, MS Teams or email, where they can be guided through the digital journey

What the team needs to explore

Before their next assessment, the team needs to:

  • share more about the 'portal V2' work planned for the next phase: how might it affect the scope of the team's work, and will additional design or research support be needed to manage the extra workload?
  • consider whether a dedicated interaction designer should join the team to work with a front-end developer on the portal's front-end accessibility improvements
  • monitor whether users' reliance on offline channels reduces over time, especially as the team starts to explore how to build more trust in the tool

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • it is clear that the design disciplines are working together to make the service simple to use. The work on the service dictionary was particularly impressive in helping the team to understand the language used across the service
  • the team is using the GOV.UK design system as a starting point for design patterns and referring to other departments' design systems for case working design patterns not found in the GOV.UK design system. If there are any new research insights around specific design patterns, the team should contribute back to the design systems
  • the team has highlighted areas for improvement from the user research they have conducted and understands what to do next: specifically, exploring how to build users' trust and confidence in data metrics, and simplifying the overwhelming interactions of the credit analysis form
  • the team has been iterating their designs based on user research, and design work is a sprint or two ahead of development

What the team needs to explore

Before their next assessment, the team needs to:

  • the team should continue to iterate and improve the service as they learn more from users, especially around designing for trust and managing the cognitive load of pages like the credit analysis form. It would be good to see examples of how the team explores these areas of the design in the next assessment

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is working with the accessibility network within DIT
  • the team is using design patterns from other design systems that are accessible by default
  • the team is examining the content of the service to make sure the language is more consistent for users

What the team needs to explore

Before their next assessment, the team needs to:

  • explore whether specific tracking metrics could help the team learn more from their users. These could be quantitative metrics or qualitative feedback loops built into the tool, and could help alleviate some of the pressure on the team's small pool of user research participants
  • to cater for future users of the case working tool who may have accessibility needs, undergo an accessibility audit to understand how well the service performs for users with different accessibility needs
  • to cater for future users who may have low digital skills (lower than 5 on the Digital Inclusion Scale), do user research with users who have low digital skills

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a very capable multidisciplinary team in place consisting of permanent civil servants and contractors who have adapted well to full-time remote working
  • the team is the first fully multidisciplinary team in UKEF, and by showing how service delivery can work with this team in place, is leading the culture change in UKEF from traditional to agile ways of working
  • the Service Owner is empowered and has introduced DDaT roles which are new to UKEF, including User Researcher, Service Designer and Technical Architect
  • the Delivery Manager has worked with the PMO function in UKEF to create and test a new agile reporting template. This aims to streamline the monthly budget reporting process by reducing the duplication of information and reducing the time it takes to do so
  • the same team will move into private beta and plans are in place to adapt the time commitment for each role when needed

What the team needs to explore

Before their next assessment, the team needs to:

  • the team understands that a live service needs to be continuously iterated based on user needs, and should continue to prove the value of UKEF funding service teams rather than projects to ensure that happens
  • continue to balance the end users’ needs with the business needs to ensure the service is successful

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using various tools to communicate with each other and with other teams in UKEF, such as Slack, Miro, JIRA and Confluence. The Delivery Manager has helped other non-agile teams adopt some of these tools after they learnt about them at the team's UKEF-wide weekly drop-in sessions
  • the team is working in sprints and using agile ceremonies such as show and tells to further share their work with a wider UKEF audience
  • the team has blogged about their work on the GDS blog and regularly shares internal blog posts to help increase UKEF teams' knowledge of agile service delivery and the experience of people in DDaT roles

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to take the organisation on this transformation journey. The team acknowledged that stakeholder management can be time consuming but recognises it is worthwhile to help promote the culture change within UKEF
  • on occasion, the team is expected to have further meetings with teams that are outside the formal governance process. The team should continue to challenge these if they do not provide value to the service team or if they repeat information the team has already shared

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team reviews the scope of the MVP against findings from each round of research and testing, and uses this to inform future iterations and designs of the service
  • the team is empowered to make decisions on their service
  • the team is showing how to iterate designs and prototypes to improve the service by hosting department-wide show and tells at the end of each sprint, which is helping to change the culture in UKEF

What the team needs to explore

Before their next assessment, the team needs to:

  • continue to use the same user-centred approach when the team begins to expand the service to include GEF (General Export Facility), amendment requests and other casework

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • data is encrypted in transit and at rest, and DDoS and web application firewall (WAF) protection is in place
  • the team has been able to externalise significant risk by using SaaS and serverless services
  • the team is managing user accounts and credentials via OAuth, delegating authentication rather than storing passwords itself (see the sketch after this list)
  • the team is using vulnerability alerting and penetration testing to manage application risk
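
Because authentication is delegated to an OAuth provider, the service itself never needs to hold passwords. The sketch below is a generic illustration of an OAuth 2.0 client credentials exchange in Node.js (TypeScript), not the team's actual configuration: the token endpoint, scope and environment variable names are placeholders.

```typescript
// Generic sketch of an OAuth 2.0 client credentials exchange.
// The token endpoint, scope and environment variable names are placeholders.
interface TokenResponse {
  access_token: string;
  token_type: string;
  expires_in: number;
}

async function getAccessToken(): Promise<TokenResponse> {
  const body = new URLSearchParams({
    grant_type: "client_credentials",
    client_id: process.env.OAUTH_CLIENT_ID ?? "",
    client_secret: process.env.OAUTH_CLIENT_SECRET ?? "",
    scope: "api://case-management/.default", // hypothetical scope
  });

  // Node 18+ provides fetch globally
  const res = await fetch("https://login.example.com/oauth2/v2.0/token", {
    method: "POST",
    headers: { "content-type": "application/x-www-form-urlencoded" },
    body,
  });
  if (!res.ok) {
    throw new Error(`Token request failed with status ${res.status}`);
  }
  return (await res.json()) as TokenResponse;
}
```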

What the team needs to explore

Before their next assessment, the team needs to:

  • plan to complete a full Data Protection Impact Assessment (DPIA), a security risk assessment and penetration tests

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • there is a Performance Analyst from the Business Insights Centre (BIC) embedded on this service and working closely with the User Researcher, Service Designer and Business Analyst
  • the team has used a GDS template to create a performance framework which maps the users’ needs to the benefits, hypotheses and KPIs
  • the team will only measure metrics that will give actionable insights to improve the service and have pushed back on stakeholders to maintain that approach
  • in addition to reporting on the 4 mandatory KPIs using Power BI and Qlik, the team is aggregating them to create an overall Customer Satisfaction Index score, which has been included in the business plan and will be a key indicator for the organisation

What the team needs to explore

Before their next assessment, the team needs to:

  • use the data collected from the private beta to inform improvements to the service
  • continue to explore if there is a need to implement Google Analytics for the service

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has used Node.js to reduce technical diversity, replacing a Drupal CMS used as a transactional system, file shares and XML documents used as a messaging interface, and a K2 Forms workflow
  • the team is planning to use MuleSoft as middleware for data translation to integrate with downstream systems (see the sketch after this list)
  • the team is using the GOV.UK Design System externally and internally (reusing MoJ additions), progressive enhancement, microservices, serverless and fully-managed services, an API and integration-first approach, government APIs, and continuous integration and deployment using GitHub Actions
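
To illustrate what a data translation middleware layer does (the record shapes and field names below are invented for illustration; the real mappings would live in MuleSoft rather than application code), a translation between an internal deal record and a downstream payload looks something like this:

```typescript
// Hypothetical record shapes; real field names and mappings would be
// defined in the MuleSoft middleware, not in application code.
interface Deal {
  dealId: string;
  exporterName: string;
  facilityValue: number; // stored in pence
  currency: string;      // ISO 4217 code, e.g. "GBP"
}

interface DownstreamDeal {
  reference: string;
  counterparty: string;
  amount: string; // formatted decimal, e.g. "1250.00"
  ccy: string;
}

// Pure translation function: maps internal fields onto the names and
// formats a downstream system expects.
function translateDeal(deal: Deal): DownstreamDeal {
  return {
    reference: deal.dealId,
    counterparty: deal.exporterName,
    amount: (deal.facilityValue / 100).toFixed(2),
    ccy: deal.currency,
  };
}
```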

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has made good use of appropriate open source and open standards technologies
  • the team informed us that they are using the cloud, coding in the open on GitHub, and using GitHub Actions for continuous integration and continuous delivery (a minimal workflow sketch follows this list)
  • the team is using OAuth for authentication and MuleSoft as middleware for data translation
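
As a rough illustration of the kind of pipeline GitHub Actions enables for a Node.js service (a generic configuration sketch, not the team's actual workflow):

```yaml
# .github/workflows/ci.yml - generic CI sketch for a Node.js service,
# not the team's actual workflow
name: CI
on:
  push:
    branches: [main]
  pull_request:
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 18
      - run: npm ci    # install exact locked dependencies
      - run: npm test  # run the project's test suite
```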

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using the GOV.UK Design System externally and internally (with MoJ patterns) and progressive enhancement
  • the team is taking an API and integration-first approach, using government APIs including Companies House APIs and GOV.UK Notify (see the sketch after this list)
  • the team has chosen a cloud solution on Azure, using Azure App Service as a PaaS for containers, with stateless compute for ephemeral containers
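
GOV.UK Notify is typically called through its notifications-node-client library. The sketch below shows the general shape of sending a templated email; the template ID, personalisation fields and helper function are placeholders, not the service's real templates:

```typescript
// Sketch of sending a templated email via GOV.UK Notify using the
// notifications-node-client package. Template ID and personalisation
// fields are placeholders.
import { NotifyClient } from "notifications-node-client";

const notify = new NotifyClient(process.env.NOTIFY_API_KEY ?? "");

// Hypothetical helper: email a bank contact when a deal decision is made.
async function notifyBankOfDecision(email: string, dealRef: string) {
  await notify.sendEmail(
    "11111111-2222-3333-4444-555555555555", // placeholder template ID
    email,
    {
      personalisation: { deal_reference: dealRef },
      reference: dealRef, // caller-supplied identifier for tracing
    }
  );
}
```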

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team has considered both short-term and long-term outages
  • for short-term outages, the team has planned to redirect to a static site displaying an error page with contact details (a minimal sketch of this fallback follows this list)
  • for long-term outages, the team has planned to redirect to a static site with explanatory content and contact details, including alternative submission options for external users and a potential fallback to the current paper and document process for internal users
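
One common way to implement the short-term fallback is a maintenance flag checked ahead of normal routing, so the whole service can switch to a static error page without a redeploy. A minimal Express sketch, assuming a MAINTENANCE_MODE environment variable (an invented name for illustration):

```typescript
// Minimal Express sketch of a maintenance-mode fallback. The environment
// variable name and contact wording are placeholders.
import express from "express";

const app = express();

// When maintenance mode is on, every request receives a 503 error page
// with contact details instead of the normal service.
app.use((req, res, next) => {
  if (process.env.MAINTENANCE_MODE === "true") {
    res
      .status(503)
      .send(
        "<h1>Sorry, the service is unavailable</h1>" +
          "<p>Contact the service team by phone or email while we fix this.</p>"
      );
    return;
  }
  next();
});

app.get("/", (req, res) => res.send("Service running"));

app.listen(3000);
```
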
Published 25 September 2020