Guidance

Quality assurance maturity model

Updated 25 September 2025

Applies to England

This quality assurance (QA) maturity model supports government departments and relevant arm’s length bodies to assess and improve QA practices in line with the Aqua Book guidance.

Who this publication is for

The model applies to all government analysis carried out by anyone within or on behalf of government, whether they are an analyst or not. It applies to:

  • departments and arm’s length bodies
  • decisions on policies, project delivery and operational services
  • internal or external publications and analysis used as an integral part of internal decision making
  • any analytical methodology or technique used

The purpose of the model

The maturity model is designed to support continuous improvement within and across government organisations.

It supports the Department for Education (DfE) to:

  • assess its relative strengths and weaknesses in the management of quality assurance in relation to analysis
  • set direction and priorities within the constraints of limited resources

The model:

  • is designed with reference to the Aqua Book guidance and the framework for assessment against the Analysis Functional Standard
  • sets out different levels of maturity against the most important aspects of QA in relation to analysis so that senior leaders can understand the overall performance across QA in their organisation
  • highlights a suggested minimum level of good practice and encourages improvement beyond that baseline

The model does not provide specific actions for people to carry out or say what is ‘good’ or ‘bad’ within a given context.

Figure 1: assessment framework

From the 'Guide to continuous improvement against functional standards' published by Cabinet Office.

The structure of the model

The assessment framework is designed to indicate how well an organisation is doing, using assessment criteria organised under themes and practice areas:

  • themes - the overall topic being addressed in that section of the assessment framework
  • practice areas - each theme has 3 practice areas with an overall statement of what is expected
  • assessment criteria - each practice area is supported by a number of criteria that denote developing, good, better or best practice

Criteria help to define what is happening in an organisation and:

  • should be observable in practice and backed up by evidence
  • are written to make intuitive sense to readers and may be adapted to suit the context of the organisation

Each practice area is supported by benchmark measures to use alongside the assessment criteria to support objectivity.

Assessing an organisation

The role of the chief analyst

We recommend that the chief analyst decides who should be involved in assessing the organisation and how often assessments take place. Involving several individuals or teams, including non-analytical stakeholders and policy colleagues, helps keep the assessment objective and useful.

We recommend annual assessments, but organisations can adapt the frequency to their own needs.

Maturity levels

Sometimes, an organisation may fall between 2 maturity levels. If it meets some but not all criteria for a higher level, assessors should take a cautious approach and record the lower level.
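
As an illustration only, this cautious approach can be expressed as a simple rule: record the highest maturity level for which all criteria, at that level and below, are fully met. The following minimal Python sketch shows this rule; the function and data shapes are illustrative assumptions, not part of the model.

    # A minimal sketch of the cautious rating rule: the recorded level is
    # the highest level whose criteria, and those of every level below it,
    # are all fully met. The data shape is an illustrative assumption.
    LEVELS = ["developing", "good", "better", "best"]

    def recorded_level(all_criteria_met: dict) -> str:
        """all_criteria_met maps each level above 'developing' to True
        only when every criterion at that level is met."""
        recorded = "developing"
        for level in LEVELS[1:]:
            if all_criteria_met.get(level, False):
                recorded = level
            else:
                break  # some criteria unmet: record the cautious, lower level
        return recorded

    # Meets all 'good' criteria but only some 'better' criteria,
    # so the recorded level is 'good'.
    print(recorded_level({"good": True, "better": False}))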

Assessors should aim to gather diverse views and remain impartial. The maturity model is a tool for continuous improvement, not for appearances. The goal is to ‘be good’ rather than just ‘look good’.

Assessors should decide with the chief analyst which maturity level is the ‘target’ for each practice area.

All organisations should aim to meet ‘good’ at a minimum. Beyond this level, meeting the criteria may not be necessary or realistic for the organisation, as it could require a level of resource that is disproportionate to the assessed risk.

Summary table examples

The following example tables show current and target ratings for the practice areas under each theme of the maturity model assessment framework.

Table 1: knowledge and skills

Practice area                     Developing   Good      Better    Best
Formal training for analysts                   Current   Target
Formal training for non-analysts                         Current   Target
Skills networks                                Current   Target

Table 2: guidance and tools

Practice area                     Developing   Good      Better    Best
Quality assurance framework                    Current   Target
Quality assurance tools                        Current   Target
Central reporting                                                  Current

Table 3: practice and assurance

Practice area                     Developing   Good      Better    Best
Quality assurance in practice                                      Current
Organisational governance                                Current   Target
Local governance                               Current   Target
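
To act on a completed summary, an organisation could hold the current and target ratings as data and list the practice areas with a gap to close. The following minimal Python sketch uses the example ratings from Table 1; the data structure is an illustrative assumption, not a required format.

    # A minimal sketch, using the example ratings from Table 1, that
    # flags practice areas where the current rating falls short of the
    # agreed target. The structure is illustrative only.
    LEVELS = ["developing", "good", "better", "best"]

    table_1 = {
        "formal training for analysts":     {"current": "good",   "target": "better"},
        "formal training for non-analysts": {"current": "better", "target": "best"},
        "skills networks":                  {"current": "good",   "target": "better"},
    }

    for area, rating in table_1.items():
        gap = LEVELS.index(rating["target"]) - LEVELS.index(rating["current"])
        if gap > 0:
            print(f"{area}: {rating['current']} -> {rating['target']} "
                  f"({gap} level(s) to close)")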

Recording the assessment, evidence and benchmark measures

Record the ratings alongside evidence to support each rating. For a template to record evidence and assessments, contact the DfE QA team at education.qateam@education.gov.uk.

Use existing internal data where possible. If this data isn’t available, consider using alternative evidence or give a ‘developing’ rating to reflect the evidence gap.

Use benchmark measures to support objectivity

These can help assessors quickly narrow down ratings. Benchmark measures might be yes or no questions or simple quantitative measures. For example:

  • Is QA training available?
  • What proportion of analysts complete QA training in their first year?

Each practice area includes a suggested benchmark, and organisations can add their own to support the assessment.
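
As an illustration of how a benchmark can support a rating, the following minimal Python sketch maps the proportion of analysts completing QA training in their first year to a rating band, using the completion thresholds suggested for the analyst training benchmark later in this guidance. The function name and the handling of missing data are illustrative assumptions; rating ‘developing’ when no data exists reflects the evidence-gap approach described above.

    # A minimal, illustrative sketch: map the first-year QA training
    # completion rate to a rating band, using the completion thresholds
    # suggested for the analyst training benchmark in this guidance.
    # The function name and data handling are assumptions, not part of
    # the maturity model.
    from typing import Optional

    def training_benchmark_rating(completion_rate: Optional[float]) -> str:
        """completion_rate is the proportion (0.0 to 1.0) of analysts
        completing QA training in their first year; None means no data."""
        if completion_rate is None:
            return "developing"  # evidence gap: record a cautious rating
        if completion_rate > 0.9:
            return "better"      # 'best' also requires regular refresher training
        if completion_rate > 0:
            # covers both the under-60% band and the 60% to 90% band:
            # below the 90% bar for 'better', record the lower level
            return "good"
        return "developing"      # no analysts undertake QA training

    print(training_benchmark_rating(0.75))  # good
    print(training_benchmark_rating(None))  # developing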

Using the output of an assessment

A completed assessment will:

  • set direction and priorities to address the perceived weaknesses in the quality assurance of the organisation
  • help identify and share good practice and input to continuous improvement activity

It is designed for internal use within government departments and arm’s length bodies to prompt an open discussion around QA in relation to analysis. Individual completed assessments are not intended for publication, but summary information from the assessment should be returned to the chief analyst and other relevant senior leaders.

The assessment will produce a rating of developing, good, better or best against each of the 9 practice areas.

The ratings indicate the following:

Developing

The requirements are recognised but not fully implemented. Initial processes are being established to address basic needs, but there is limited consistency and structure.

Good

The requirements are implemented, though with less emphasis on best practice. Basic processes are in place to acknowledge requirements and achieve essential outcomes. The aim is to achieve minimum requirements.

Better

The requirements are fully implemented. The activities and outcomes are clearly defined and clearly owned. Standard processes are in place and are improved through internal review. Limited processes to enable continuous improvement against external best practice are in place.

Best

There is an advanced and forward-looking implementation of the requirements. Processes are cutting-edge and based on external best practices. External experts are used to leverage expertise. Regular monitoring and self-assessment are in place, along with processes to adapt quickly to new practices and requirements.

Theme 1: knowledge and skills

Do all staff have the required awareness and capability to carry out effective QA?

This theme focuses on the practice areas of:

  • formal training for analysts
  • formal training for non-analysts
  • skills networks

Practice area: formal training for analysts

This area focuses on ensuring that those undertaking analysis are provided with the necessary knowledge around QA and the skills to carry it out effectively.

Developing

There is no QA training offer available for analysts or signposting to supporting guidance. Analysts remain unaware of what makes analysis business critical and of their responsibilities regarding the identification, QA and governance of such analysis.

Good

A QA training offer is available and is undertaken by some analysts. Some analysts are aware of the department’s definition of business critical analysis and their responsibilities regarding the identification, QA and governance of such analysis. Relevant guidance exists but is not easily accessible to analysts.

Better

A QA training offer is available, and most analysts complete the training within 12 months of joining the department. The training covers the department’s basic QA requirements and elements of QA in practice including how to work with non-analysts. There is training for senior analysts to understand their role in ensuring effective QA. Relevant guidance is easily accessible to analysts.

Best

A QA training offer is available and updated regularly. The department actively develops and improves its training offer based on feedback and collaboration with other government departments to exchange ideas and best practice. Analysts refresh their knowledge of QA on a regular basis. Relevant guidance is easily accessible to analysts and is updated in line with changing standards.

Suggested benchmark measure:

How often analysts undertake QA training:

  • developing: never
  • good: some analysts (less than 60%) complete QA training when joining the department
  • better: more than 90% of analysts complete QA training when they join the department. Some analysts complete refresher training on QA
  • best: more than 90% of analysts complete some form of QA training at least once every 24 months

Practice area: formal training for non-analysts

This area focuses on ensuring that all non-analysts understand the importance of analytical QA. It aims to educate them on how they can and should support QA processes and how to make best use of the information that effective QA provides.

Developing

There is no QA training offer available for non-analysts or signposting to supporting guidance. Many non-analysts remain unaware of the department’s definition of business critical analysis or when to draw on analytical support.

Good

A QA training offer or guidance is available for non-analysts. Some staff who work closely with analysts are aware of the department’s definition of business critical analysis and their own responsibilities regarding the identification and governance of such analysis, including when to draw on analytical support.

Better

A QA training offer or guidance is available for non-analysts. Some staff who work closely with analysts are aware of the department’s definition of business critical analysis, and their own responsibilities regarding the identification and governance of such analysis. This includes when to draw on analytical support. The training also helps staff understand their role in proportionate QA including how to quality assure simple analysis effectively. Specific training is available for staff in relevant roles, for example commissioners of business critical analysis.

Best

A QA training offer and guidance is available for non-analysts and updated regularly. All staff who work closely with analysts are aware of the department’s definition of business critical analysis and their own responsibilities regarding the identification, governance and QA of such analysis, including when to draw on analytical support. The training also helps staff understand their role in proportionate QA including how to quality assure simple analysis effectively. Specific training is available for staff in relevant roles and all staff in these roles have completed this training within 6 months of being in post.

Suggested benchmark measure:

A training offer aimed at non-analysts working closely with analytical work is available:

  • developing: no training offer available
  • good: yes, but take up is low and less than 40% of staff in relevant roles have undertaken training
  • better: yes, some staff in relevant roles have undertaken training (above 40%)
  • best: yes, all staff in relevant roles have undertaken training within 6 months of being in post

Practice area: skills networks

This area focuses on communicating knowledge, raising awareness and enhancing skills at the grassroots level and throughout government. It involves providing sufficient support for all staff and creating forums to address any issues that arise or to share developments in best practice.

Developing

There are no internal networks for maintaining and improving QA. Collaboration on QA across business areas is informal and ad hoc. The department is not represented in cross-government QA networks.

Good

The department has a network or forum with membership from across analytical areas. The network is used to share essential information on QA, provide feedback to the central QA team, and help staff address local QA needs. Members of the central QA team regularly attend cross-government QA network meetings.

Better

The department has at least one network, or forum, with membership from across analytical areas. The networks are used to share essential information on QA, provide feedback to the central QA team, and help staff address local QA needs. They can also provide support and develop skills and best practice within the department. Regular activities are carried out to ensure skills are maintained and championed. Members of the central QA team regularly attend and contribute to cross-government QA network meetings.

Best

The department has many networks, or forums, with membership from across analytical areas. The networks are used to share essential information on QA, provide feedback to the central QA team, help staff address local QA needs, and support and develop skills and best practice within the department.

Networks are self-sustaining, meeting regularly to identify and resolve issues, and there is a mechanism to get direct QA support to carry out independent verification of analysis when needed.

Members of the central QA team regularly showcase the department’s work at cross-government network meetings. They connect internal teams to external expertise from the cross-government networks, actively supporting the development of best practice across government.

Suggested benchmark measure:

A QA-specific network or group forum that meets at regular intervals to maintain and improve QA is available:

  • developing: no, there is no QA-specific network or group forum
  • good: yes, there is a QA-specific network or group forum, but it does not meet at regular intervals
  • better: yes, there is a QA-specific network or group forum, and it meets at regular intervals
  • best: yes, there is a QA-specific network or group forum that meets at regular intervals and connects colleagues for QA support

Theme 2: guidance and tools

Are the necessary information and means to measure effective QA available?

This theme focuses on the practice areas of:

  • QA framework for analysis and supporting guidance
  • tools for performing and documenting QA
  • central reporting on QA

Practice area: QA framework for analysis and supporting guidance

This area focuses on establishing comprehensive and clear guidance on QA requirements, ensuring that anyone conducting any form of analysis understands the set standards.

A QA framework sets out the responsibilities and expectations around quality assuring analytical work in the context of the department.

Developing

There is no mandated QA framework in place.

Good

A QA framework is in place which takes cross-government guidance from the Aqua Book and applies it to the context of the department. Senior leaders have approved the QA framework. Using the QA framework is mandatory.

Better

A QA framework is in place which takes cross-government guidance from the Aqua Book and applies it to the context of the department. The QA framework also includes a definition of business critical analysis and signposts to supplementary guidance on the requirements for different types of analysis. Senior leaders have approved the QA framework. Using the QA framework is mandatory.

Best

A QA framework is in place which takes cross-government guidance from the Aqua Book and applies it to the context of the department. The QA framework also includes a definition of business critical analysis and signposts to supplementary guidance on the requirements for different types of analysis.

The QA framework has been approved by senior leaders and is regularly reviewed to ensure it is fit for purpose. Using the QA framework is mandatory. There is a feedback mechanism in place for the framework. There is guidance on how to manage risks associated with artificial intelligence (AI) in analysis and QA.

Suggested benchmark measures:

A department-mandated QA framework for analysis is in place:

  • developing: no QA framework is in place
  • good: yes, a department-mandated QA framework is in place which has been approved by senior leaders
  • better: yes, a department-mandated QA framework is in place which covers different types of analysis and has been approved by senior leaders
  • best: yes, a department-mandated QA framework is in place which covers different types of analysis. It has been approved by senior leaders and is regularly reviewed

The department regularly signposts QA-related training or guidance to help staff fulfil the responsibilities outlined in the QA framework:

  • developing: there is no training or guidance available
  • good: yes, guidance and training are available but not signposted in the QA framework
  • better: yes, guidance and training are available and signposted in the QA framework
  • best: yes, guidance and training are available, are signposted in the QA framework and regularly signposted in other communication channels

Practice area: tools for performing and documenting QA

This area focuses on equipping staff with the tools needed to conduct and document their QA activities effectively.

Developing

There are no tools for analysts in the department to record QA activity in high impact or complex pieces of analysis.

Good

A standard record for QA activity is in place, indicating which QA checks are mandatory for all analysis. QA guidance describes other documentation to record and manage governance of the QA process, including having a QA plan to ensure relevant activities are conducted and appropriate levels of sign-off are obtained.

Better

A standard record for QA is in place, indicating which QA checks are mandatory for all analysis. Examples or templates for mandatory documentation for business critical analysis or models are available, including a template for a QA plan to ensure relevant activities are conducted and appropriate levels of sign-off are obtained. QA guidance also describes how teams can document QA proportionately for lower impact, ad hoc or urgent pieces of analysis.

Best

A standard method for recording QA is in place, indicating which QA checks are mandatory for all analysis. Examples or templates for mandatory documentation for business critical analysis or models are available, including a template for a QA plan to ensure relevant activities are conducted. Other tools are available for use on lower impact, ad hoc or urgent pieces of analysis. Tools guide a range of aspects of QA across the analytical cycle from commissioning and designing the analysis to sign-off and delivery or publication.

Suggested benchmark measure:

A departmental standardised method for recording QA evidence is available:

  • developing: there is no standardised method for recording QA evidence
  • good: yes, there is a standardised method for recording QA evidence
  • better: yes, there is a standardised method for recording QA evidence including templates for other forms of documentation
  • best: yes, there is a standardised method for recording QA evidence, including templates and examples of other forms of documentation

Practice area: central reporting on QA

This area focuses on collating information centrally to support strategic decision-making on the department’s QA programme.

Developing

There is no established method for reporting on QA activity across the department.

Good

Important information on significant QA-related activities is summarised, including data on business critical analysis or business critical models (BCMs). The data is reported and used by senior leadership to set priorities around QA.

Better

Important information on significant QA-related activities is summarised, including data on BCMs. This QA information is accessible at various levels within the department, allowing targeted efforts to improve QA or share best practices. Senior leaders effectively use reports on QA and BCMs to hold discussions with their peers and colleagues to set priorities around QA.

Best

Important information on significant QA-related activities is summarised into a variety of customer-focused reports. These reports are accessible, available for staff at different levels and roles, and allow teams to identify issues and drive improvement across the department. Senior leaders effectively use reports on QA and BCMs to hold discussions with their peers and colleagues to set priorities around QA. The department references important outputs within its annual accounts statement and uses these reports to demonstrate best practices across government. The reports provide reassurance to ministers and external users, while reminding them that some risks will remain. BCMs are regularly assessed for compliance and the effectiveness of their QA.

Suggested benchmark measure:

A well-maintained internal list of BCMs, or evidence confirming their absence, is available:

  • developing: no, there is no list
  • good: yes, there is a list
  • better: yes, the list is regularly maintained
  • best: yes, the list is regularly maintained and used to identify high risk areas for QA in the department

Theme 3: practice and assurance

Are the right business structures and working practices in place to ensure effective QA?

This theme focuses on the practice areas of:

  • using and recording QA
  • organisational governance
  • local governance

Practice area: using and recording QA

This area focuses on ensuring QA guidance is followed and that staff are supported to achieve this.

Developing

QA activities are incidental and defined on an ad hoc basis. Some evidence of activities is recorded as recommended by the department’s QA framework. Business critical analysis undergoes some QA, but this is inconsistent.

Good

A proportionate level of QA is conducted and evidence of this is recorded as recommended by the department’s QA framework. QA activities extend beyond the verification of formulae, code and calculations to include supporting documentation and ensuring the validity of the analytical approach. Business critical analysis undergoes a higher standard of QA, including a QA plan which is agreed upon in the early stages of the analytical cycle.

Better

A proportionate level of QA is conducted, striking an appropriate balance between efficiency and quality while maintaining high standards. QA activities extend beyond the verification of formulae, code and calculations to include supporting documentation and ensuring the validity of the analytical approach. Interested parties in the analysis have a comprehensive understanding of QA requirements, with clearly defined ownership of activities and outcomes. Evidence of this is recorded as recommended by the department’s QA framework, and analysts can clearly communicate the balance between efficiency and quality to other analysts and non-analysts, actively seeking feedback. Standard processes for business critical analysis are established and regularly reviewed. Business critical analysis undergoes a higher standard of QA, including a QA plan which is agreed upon in the early stages of the analytical cycle.

Best

A proportionate level of QA is conducted, striking an appropriate balance between efficiency and quality while maintaining high standards. QA activities extend beyond the verification of formulae, code and calculations to include supporting documentation and ensuring the validity of the analytical approach. Interested parties in the analysis have a comprehensive understanding of QA requirements, with clearly defined ownership of activities and outcomes. Evidence of this is recorded as recommended by the department’s QA framework. Analysts can clearly communicate the balance between efficiency and quality to other analysts and non-analysts, actively seeking feedback. All residual risks are discussed and documented. There is either implicit or explicit sign-off by an approver and owner of the plan and the outputs.

Standard processes for business critical analysis are established and regularly reviewed. Business critical analysis undergoes a higher standard of QA, including a QA plan which is agreed upon in the early stages of the analytical cycle. Those working on business critical analysis actively share best practice in QA and proactively seek feedback on their work.

Suggested benchmark measure:

QA activities are recorded in proportion to the complexity and risk associated with the analysis:

  • developing: no, QA activities are rarely recorded
  • good: yes, QA activities are recorded, but are rarely proportionate to the complexity and risk associated with the analysis
  • better: yes, QA activities are recorded, and are sometimes proportionate to the complexity and risk associated with the analysis
  • best: yes, QA activities are recorded, and are often proportionate to the complexity and risk associated with the analysis

Practice area: organisational governance

How the wider department, for example senior and analytical leadership, is set up to support QA best practice.

Developing

There is no or limited departmental oversight of QA, with QA considered only within the broader context of risk management. There is minimal or no oversight of business critical analysis, and analytical leadership and senior civil servants do not regularly engage in discussions about QA.

Good

Senior leaders occasionally discuss QA and coordinate and define improvements in QA activities across the department. Clear governance structures are established for business critical analysis. These structures align with cross-government guidance such as the Orange Book and Aqua Book. These include discussions about the QA status of the work, any remaining risks, and documented evidence of sign-off by the agreed commissioner of the work.

Better

There is a dedicated forum, such as a steering group, for senior leaders to regularly discuss QA and to coordinate and define improvements in QA activities across the department. These structures align with cross-government guidance such as the Orange Book and Aqua Book. Senior leaders feed back into their areas from the forums, taking clear, appropriate steps to ensure compliance and best practice within their areas. Senior leaders enquire about QA as a matter of course with their staff.

Best

Forums for discussing QA are not limited to a steering group but include governance structures across all analytical work. These governance structures align with, and improve upon, cross-government guidance such as the Orange Book and Aqua Book. Processes designed to ensure high-quality QA, such as steering groups, BCM boards and deep dives, are used to identify common risks and best practice across the department, and there are mechanisms to allow these to be shared. Senior leaders take appropriate steps from these to ensure compliance and best practice within their areas. Senior leaders enquire about QA as a matter of course with their staff to coordinate and define improvements and to ensure these processes are well used.

Suggested benchmark measure:

A forum where senior leaders provide direction to the wider departmental QA strategy is available:

  • developing: no, a forum for senior leaders is not available
  • good: yes, with limited engagement from senior leaders
  • better: yes, with some engagement from senior leaders
  • best: yes, with strong engagement from senior leaders

Practice area: local governance

How local teams are set up to support best QA practice, using central QA guidance and tools by default and adapting them to their local risks as required.

Developing

Local teams have minimal or no oversight of QA in their areas, and QA standards are not aligned with the department’s QA framework. The central team has little oversight or understanding of business critical analysis in local areas.

Good

Local teams use central QA guidance by default. In some cases, local governance processes are established to meet the needs of the team. The central team has oversight of business critical analysis in the area as its QA status is reported to the central QA team.

Better

Local teams use central guidance and governance systems by default and sometimes offer feedback on centralised guidance. Occasionally, teams establish their own governance structures in a formalised local QA framework, designing approaches to QA that are proportionate and meet the needs of their team as well as the minimum standards set by the department. The central team has oversight of business critical analysis in the area as its QA status is reported to the central QA team.

Best

Local teams use central guidance and governance systems by default. They can feed back into centralised guidance and governance procedures, helping the QA team maintain the quality and relevance of their products. Teams feel empowered to establish their own governance structures, designing approaches to QA that are proportionate and meet the needs of their team. They do this in consultation with the central QA team to help balance specialised guidance with consistency across the department. Teams also commonly share knowledge and best practice with other local areas in the department, for example across common types of analysis, common inputs or common risks. The central team has oversight of business critical analysis in the area as its QA status is reported to the central QA team.

Suggested benchmark measure:

Local teams develop their own QA guidance:

  • developing: no, all teams use central guidance
  • good: yes, teams use central guidance by default, but some develop their own QA guidance where appropriate
  • better: yes, teams use central guidance by default, but some develop their own QA processes and formalise them in guidance where appropriate
  • best: yes, teams use central guidance by default, but some develop their own QA processes where appropriate and feed back into centralised guidance

Glossary

Analysis

The collection, manipulation and interpretation of information and data for use in decision making.

Assessment criteria

Each practice area is supported by a number of criteria. Criteria help to:

  • define what is happening in an organisation – observable in practice, backed up by evidence
  • denote developing, good, better or best practice

Business critical analysis

Business critical analysis:

  • influences significant financial and funding decisions
  • is necessary to the achievement of a departmental business plan, or where an error could have a significant reputational, economic or legal implication

Business critical models

These are quantitative or qualitative models that play a crucial role in decision-making processes, significantly influencing financial and funding decisions. Any errors in the models could lead to substantial reputational, economic or legal consequences. As a subset of business critical analysis, business critical models provide the structured framework and calculations necessary to support analysis.

Functional standard

Functional standards are government standards designed to provide a:

  • coherent, effective and mutually understood way of doing business across organisational boundaries
  • stable basis for assurance, risk management and capability improvement

Governance

Relationships and the distribution of rights and responsibilities among those who work with and in the organisation. Governance:

  • determines the rules and procedures by which organisational objectives are set
  • provides the means of attaining organisational objectives and monitoring performance
  • defines where accountability lies throughout the organisation

Individual assessor

Assessors are responsible for independently evaluating or reviewing specific criteria, processes or performance.

Organisation

In the context of government functional standards, ‘organisation’ is the generic term used to describe a government department, arm’s length body, or any other entity, which is identified as being within the scope of the functional standard.

Practice areas

Each theme has 3 practice areas. Each practice area has an overall statement about what is expected. A practice area might relate to one or more clauses in the functional standard.

Themes

A theme is the overall topic being addressed in that section of the assessment framework. The context and more information about the themes addressed can be found in the functional standard.