The report from the alpha assessment for DVLA's Webchat service on 4 November 2015.
Department / Agency: DVLA
Date of Assessment: 4 November 2015
Result of Assessment:
About the service
The webchat tool will allow users to interact online with a DVLA advisor, either for help completing an online transaction or for information to answer an enquiry. The tool will offer live, real-time responses.
Outcome of service assessment
After consideration, the assessment panel has concluded that the DVLA Webchat tool is on track to meet the Digital Service Standard at this early stage of development.
DVLA webchat is the first tool (rather than citizen-facing service) to be assessed by Government Digital Service using a service standard assessment. This made the assessment somewhat unusual for both the assessment panel and the service team - particularly as the team’s focus is on integrating a proprietary tool from the supplier into DVLA services. For this reason the panel was particularly interested in how supplier/tool-agnostic the approach was - in other words, how easy it would be to swap out one webchat tool supplier for another in the future if required. Whilst the panel retains some concerns, the general approach is satisfactory.
A comprehensive list of recommendations is set out below to guide the team as it prepares for and develops the beta phase of the tool’s development. The service team must work to meet these recommendations before the beta assessment.
The team demonstrated a good grasp of the context of the problem, and of the volume of incoming calls they aim to address with the webchat tool.
The team have used a range of research techniques so far, including focus groups, surveys, lab-based research and contextual research. They have successfully identified several key groups of users the tool is aimed at, and have particularly picked up on its benefits for deaf users. The panel was encouraged by the team’s identification of the contexts in which users will engage with the tool.
The survey itself gives some cause for concern and is by its nature leading. It offers only three options, and should be viewed as confirming a business need rather than establishing a user need. Although out of scope for the webchat tool, the real research question is why users can’t find what they need online and have to resort to further contact in the first place.
The team showed solid user personas including web usage and support needs and demonstrated an ability to separate true user needs from business needs.
The team presented a comprehensive plan for research up to April 2016 and has identified a need to research what answers customers expect and require.
The team demonstrated how paper prototypes were used to get an initial understanding of the problems and to explore what users expect. The team has tested a variety of approaches for the call-to-action to initiate the chat tool. However, these have all relied on an icon or imagery. Different designs have been mocked-up for the chat interface, although these haven’t been worked into the prototype or tested with users yet.
In beta the team need to interrogate the chat interface in much more detail; relying on the default behaviour of the supplier’s solution may not be the best approach to meet user needs.
The current structure and remit of the team appears to be set up to deliver a programme of work, including the webchat, rather than the tool specifically. The panel would recommend splitting the larger team into teams focused on specific products. This will allow each team to develop its product at a cadence that suits the product rather than the wider programme. The panel does not believe that a common cadence will work in the beta phase.
Similarly, the current approach to sprints and ceremonies should be changed to support the development of the tool. Whilst the team is nominally working in four-week sprints (which the panel believes are too long for the current development phase), its fortnightly showcases and retrospectives suggest that two-week sprints would be more appropriate. Rationalising to a smaller team focused specifically on integrating the tool will allow it to build cadence more appropriately.
Whilst the team was able to indicate that it had considered the possibility of changing to an alternative provider of webchat services in the future, the panel will want to be satisfied at beta assessment that the approach is truly platform-agnostic and capable of being easily adapted to an alternative solution provider. A workable migration strategy needs to be developed, demonstrating how platform-agnostic the approach is and how easy it would be to swap out one solution for another.
Whilst the team has demonstrated a solid basis of user needs, the panel recommends the team rework these into recognisable user needs - the point being that people on the team (and wider stakeholders) need to be able to see what real people actually say and why they say it.
Further to this, the panel would not recommend using the survey as proof of user need for reasons stated previously.
For the next phase the team is recommended to:
- Engage with the internal users who will be answering webchat queries. This is a large-scale change for the people whose day job will be to answer those queries.
- Research what happens when webchat is not available. It’s important to gain an understanding of what users expect to happen if they are unable to use the tool.
- Address what happens if a problem can’t be solved in chat, looking at how long this process takes before handoff, and what the user reaction is when this happens.
Although the team had tested many variations of the call-to-action, it had not been tested without an icon or imagery. It is recommended that the team test a text-only call-to-action as this follows the advice in the service manual.
As variations on the chat interface have not been tested it is recommended the team test different ways of accessing and interacting with the chat agent.
Opening new windows to display the chat interface is not recommended, as this introduces usability issues, especially for users with low digital skills and users on mobile devices or assistive technology. If the team decides to use this route, there must be strong evidence demonstrating that these users succeed in using the tool.
The chat interface will need to be rigorously tested for accessibility issues across a variety of assistive technologies. The team has already worked with the Deaf Association and should continue to work with similar groups to observe users with different needs and abilities using the tool.
Only services hosted on GOV.UK are permitted to use the crown and typeface. Users need to be able to verify that they are talking to government through trusted channels, so it is recommended that the service is accessed via a service.gov.uk subdomain.
The webchat solution is based on a product that has been acquired through the Digital Marketplace after various options were considered during discovery. The product is not open source and does not use open standards for data interchange. As such, suitable diligence during all phases of the project is recommended to avoid being trapped by vendor lock-in. DVLA needs to prove that business critical data can be exported without loss and in a useful way that could be used to transition customer contact histories to another solution before going into a large scale beta with real users. This recommendation is vital if other projects in the call centre also begin integrating with the supplier as part of a unified Customer Relationship Management (CRM) effort within DVLA.
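One concrete way to demonstrate this before a large-scale beta (a sketch only - the record shape and field names below are illustrative, not the supplier’s actual export schema) is a round-trip check: export contact histories to an open format such as JSON, then prove the exported data re-imports without loss.

```python
import json

# Hypothetical shape for an exported chat transcript; the field names
# are invented for this example, not the supplier's schema.
transcripts = [
    {"case_id": "C-1001",
     "started": "2015-11-04T09:15:00Z",
     "messages": [
         {"from": "user", "text": "How do I renew my vehicle tax?"},
         {"from": "advisor", "text": "You can do that online at GOV.UK."},
     ]},
]

# Export to an open, self-describing format ...
exported = json.dumps(transcripts, indent=2, sort_keys=True)

# ... then prove the round trip is lossless before relying on it
# as a migration path to another solution.
assert json.loads(exported) == transcripts
```

A check like this, run against a realistic volume of real export data in the testing environment, would give evidence that customer contact histories could actually be moved to another solution.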
Since the tool’s back-end is provided by the supplier’s platform, minimal software development has been required in-house to integrate the tool into DVLA’s service flows. However, some customisation has been performed, largely by the supplier’s team working with DVLA. The DVLA team assures us that it owns the intellectual property, and that capability to develop on the supplier’s platform will be built up in-house to take ownership of these changes. The panel expects this to happen during beta.
Customisations to the platform, and changes to markup and styling of the chat window, are made via the supplier’s admin interface. This raised some concern about the level of version control available and the testing that could occur, to ensure bugs or regressions aren’t introduced by any changes or by the supplier’s regular release process. The team stated that a testing environment is available, and that manual testing takes place on each new release. We recommend the service team look at approaches to test the chat window functionality in an automated way.
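As a minimal illustration of what such automation could look like (assuming nothing about the supplier’s tooling - the markup and function names here are invented for the example), the team could snapshot the chat window markup after each supplier release and flag any unexpected change for review.

```python
import difflib
import hashlib

def snapshot_digest(markup: str) -> str:
    """Return a stable digest of the chat window markup, normalising
    whitespace so cosmetic reflows don't trigger false failures."""
    normalised = " ".join(markup.split())
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

def diff_markup(baseline: str, current: str) -> list:
    """Produce a unified diff between baseline and current markup,
    for inspection when the snapshot check fails."""
    return list(difflib.unified_diff(
        baseline.splitlines(), current.splitlines(),
        fromfile="baseline", tofile="current", lineterm=""))

# Example: a supplier release quietly changes the call-to-action label.
baseline = "<button class='chat-cta'>Start web chat</button>"
current = "<button class='chat-cta'>Chat now</button>"

if snapshot_digest(baseline) != snapshot_digest(current):
    for line in diff_markup(baseline, current):
        print(line)
```

Run in the testing environment on each release, a check like this catches unintended markup or styling regressions without waiting for manual testing, and the stored baselines double as a lightweight form of version control for the customisations.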
The service team intends to publish supplier customisations, metadata, markup, and styles on GitHub where possible for other teams to re-use if they need a similar chat solution. The panel expects this to happen before beta assessment.
The assessors noted that the pop-up chat window is currently not served by a GOV.UK service domain URL. This raised some concern around user trust, and whether the New Transport typeface and the crown logo could be used were this to remain the case. DVLA should approach the supplier and investigate if a custom domain ending in service.gov.uk can be used to serve the chat window (such as via a CNAME in DNS) and if not, confer with GDS around correct use of these assets and branding.
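If the supplier can serve the window from a customer-controlled hostname, the DNS change itself is small. A hypothetical zone fragment (all hostnames below are illustrative - the actual names would be agreed with GDS and the supplier):

```
; Illustrative only: point a service.gov.uk subdomain at the supplier's
; chat host so the pop-up window is served from a trusted government domain.
webchat.dvla.service.gov.uk.  3600  IN  CNAME  chat.supplier.example.com.
```

Note that serving the window this way also requires the supplier to present a valid TLS certificate for the government hostname, which is worth raising in the same conversation.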
Digital Service Standard criteria
Published: 27 January 2017
Assessment date: 4 November 2015