Quality management approach
Updated 28 April 2026
Introduction
Quality is one of the core principles of the Code of Practice for Statistics. The Code states that producers of official statistics should:
- prioritise quality
- be rigorous
- be open about quality
Statistics producers in government need to ensure they manage the quality of their outputs effectively. This document sets out our approach to quality management (QM) in relation to the production of official statistics in DESNZ. A QM approach sets out how quality is built into processes from the outset, rather than checked only at the end. It focuses on creating the right environment, behaviours and controls so outputs are reliable, fit for purpose, and continuously improved.
QM is fundamental to the production of official statistics. Good QM reduces the risk of errors in statistical products, which improves efficiency, credibility and value. However, it is not a purely mechanistic approach: it relies on a “culture of curiosity”.
A culture of curiosity as part of QM
At DESNZ we promote an overarching culture of curiosity, encouraging individuals to go beyond routine quality checks and truly engage with their data. When statisticians are interested in understanding the stories their data tell and the context in which they are used, they are more likely to spot anomalies, question unexpected results, and seek out the root causes of those results rather than accepting figures at face value.
Curiosity throughout the data cycle
To embed this culture, statistics producers must consider three core questions throughout the data cycle:
- Does the data make sense?
- Does the process make sense?
- Does the story make sense?
Five pillars
Our QM approach is supported by five pillars:
- capability
- automation
- documentation
- communicating quality
- governance
Capability
We ensure that teams producing official statistics are resourced with appropriately skilled and experienced staff, so that quality assurance (QA) is planned, carried out and evidenced throughout the production cycle. Line managers ensure QA responsibilities are clear within each team, and that staff have the time and support needed to complete independent checks to the required standard.
New team members are inducted into the team’s specific QA procedures and tools early, so they can contribute safely and consistently from the outset. We actively maintain capability through multiple training routes, including central Analysis Function quality training, DESNZ statistics community sessions focused on quality, and our in-house Energy Statistics Improvement Programme (ESIP). ESIP provides structured learning and tailored technical support to help teams introduce Reproducible Analytical Pipelines (RAPs), strengthen reproducibility, and reduce reliance on manual processes. We also promote knowledge-sharing and handover within teams to protect continuity and maintain standards.
Automation
We build automation and reproducibility into our statistical processes to reduce the risk of human error and to make QA easier to run, repeat and evidence. Teams are encouraged to use scripted workflows, automated validation and sense checks, and version control so that problems are flagged early, changes are traceable, and outputs can be reliably recreated from source data. Automation is applied proportionately to the size, complexity and risk of the production process, but teams are expected to move away from manual steps over time and adopt robust automated approaches where feasible.
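As a purely illustrative sketch of the kind of automated sense check described above (the thresholds, field names and function are hypothetical, not a DESNZ standard), a scripted workflow might flag implausible values in a time series for investigation rather than relying on manual eyeballing:

```python
# Minimal sketch of an automated sense check on a monthly time series.
# The 25% threshold and the data below are illustrative assumptions only.

def sense_check(series, max_pct_change=25.0):
    """Flag missing, negative or implausibly volatile values."""
    issues = []
    previous = None
    for period, value in series:
        if value is None:
            issues.append(f"{period}: value missing")
        elif value < 0:
            issues.append(f"{period}: negative value {value}")
        elif previous not in (None, 0):
            change = abs(value - previous) / previous * 100
            if change > max_pct_change:
                issues.append(f"{period}: {change:.1f}% change vs previous period")
        if value is not None:
            previous = value
    return issues

# A suspicious jump is flagged for a human to investigate, not auto-corrected.
data = [("2025-01", 100.0), ("2025-02", 103.0), ("2025-03", 160.0)]
print(sense_check(data))
```

Checks like this can run automatically on every production cycle, producing an evidenced, repeatable QA record rather than a one-off manual review.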
Some DESNZ official statistics include long time series and, as a result, some energy data are held in legacy systems, including spreadsheets. We actively manage this risk through ESIP, which prioritises and supports the transformation of existing processes into RAPs where this is feasible and appropriate. For new statistical series, RAP principles are applied by default from the outset, so that reproducibility, automated QA and clear audit trails are designed in rather than retrofitted later.
Documentation
Good documentation underpins transparency, reproducibility, and effective peer review. Clear documentation also supports handover, continuity, and trust in the outputs.
Our QM approach is supported by the DESNZ protocol on ‘Documenting the quality assurance (QA) of official statistics’. QA covers all procedures that aim to ensure quality requirements are met and that problems are anticipated. It does not mean checking only the final outputs. It should apply at all stages in the process of producing statistics for publication or internal use. This means rigorous QA relevant to the team should be in place for:
- any specification sent to data suppliers
- any raw data received
- any code or formulae used to process the data
- any datasets produced by transforming or linking raw data
- the statistics produced for publication
- the interpretation and presentation of results
The protocol mandates that every statistical publication in DESNZ has the following documentation:
- a documented QA process
- a clearance statement signed off by a senior analyst before publication
- a QA feedback process to record and learn from mistakes and near misses
Given the range of data and outputs published by DESNZ, we do not specify a list of QA steps that all teams should perform. However, we do mandate that all figures published as official statistics have been independently checked by at least one other person who was not responsible for producing them.
Any QA steps should be designed by teams and be proportionate to the size, complexity, risk and impact of the publication or analysis that is being undertaken.
We also maintain a central error log to record any errors in published statistics and ensure mitigations are put in place and relevant lessons are shared across the statistics community.
Communicating quality
We are open and transparent about the quality of our outputs, so users can understand the strengths, limitations and appropriate uses of the statistics. Every official statistics release includes technical information on the data sources and methods, and clearly sets out any known quality issues, limitations or changes that users need to consider. The level of detail is proportionate to the complexity and risk of the series: for simpler outputs (for example, some statistics on government energy efficiency schemes) quality information may be provided within a dedicated tab alongside the data, while more complex outputs (including greenhouse gas emissions statistics, fuel poverty statistics and the Public Attitudes Tracker) are supported by detailed Technical Reports. Where appropriate, we also publish estimates of uncertainty (for example for greenhouse gas emissions statistics) to support correct interpretation and use.
We actively signpost quality considerations where users will see them, for example using a prominent ‘Things you need to know’ box at the start of releases; clear notes alongside tables; and explicit flags where findings rely on small samples or are subject to greater uncertainty. Where errors are identified after publication, we correct them promptly and make the nature and impact of the correction clear in line with our Revisions Policy.
Governance
DESNZ official statistics are overseen through a clear governance structure with defined responsibilities and sign-off routes. All releases are independently verified, and a senior statistician signs off publications to confirm that QA has been completed and that the statistics are ready for release. The Senior Statistics Board regularly reviews the error log to identify recurring issues or emerging risks and agree actions to strengthen controls and prevent recurrence. The Board also shares good practice on quality and cross-team working to reinforce consistent standards and embed this QM approach across the statistics community.
Ongoing review
Ongoing review is an important element in driving quality. Curiosity is also supported when teams have regular, safe forums to challenge results, share anomalies, and learn from issues identified. At DESNZ, this includes:
- ‘lessons learned’ sessions after key publications, where teams are encouraged to reflect on what went well and identify areas for improvement
- knowledge sharing through various forums, including our regular statistics community meetings and statistics producer meetings
- ‘curiosity’ sessions, particularly for new or complex statistics, including subject matter experts to inform understanding
- peer review
In addition, we recognise that the environment within which we operate is not static. We are mindful of the risks that arise from change (to people, systems, data, methods and so on) and the need to have increased scrutiny in place to address changing risks.
Our review processes also incorporate input from our users to ensure our statistics remain relevant (see our Public Involvement and Engagement Strategy).