Official Statistics

Section 70 Weights and Measures Methodology and Quality report – updated September 2023

Updated 21 September 2023

1) Background to the statistics

Under Section 70 of the Weights and Measures Act 1985, Local Weights and Measures Authorities (LWMAs) in Great Britain have a statutory duty to report to the Secretary of State the level of local Weights and Measures inspection and enforcement work conducted over a twelve-month period. LWMAs submit this as an annual return to the Office for Product Safety and Standards (OPSS), part of the Department for Business and Trade (DBT).

Full details of the questions and categories used in the recording of activity carried out by LWMAs are available in the Returns Template.

Access the Returns Template

The data can be used to give a picture of inspection and enforcement work in Great Britain.

The data cannot be used to compare different LWMAs due to differing resourcing and prioritisation policies between LWMAs.

1.1 Publications

The data is used to produce an annual report which is published on the GOV.UK website. Collection of the data is a statutory requirement under the existing regulatory framework.

Access the annual reports

The report has been designated as Official Statistics.

1.2 Quality Assurance (QA) Process

Data is collected annually from LWMAs by operational delivery teams in OPSS. The collection follows the ONS data management principles, for example by following best practice, auditing changes and reviewing quality.

Access the ONS data principles

The quality assurance processes in place are focused on the accurate capture of data, consistency of recording, and the accurate transfer of processed data into a range of publications and published tables. If the underlying data is inaccurate, this has a significant impact on all its potential uses. Data that are widely used and that inform important and high-profile decisions receive the highest level of QA. Other data undergo a more limited but proportionate level of QA. This ensures the data is fit for purpose in terms of the individual uses of the dataset.

In 2015 the Office for Statistics Regulation (OSR) published a regulatory standard for the quality assurance of administrative data.

Read the regulatory standard – OSR website

To assess the quality of the data provided for this release, OPSS has followed that standard. The standard is supported by an Administrative Data Quality Assurance Toolkit, which provides useful guidance on the practices that producers can adopt to assure the quality of the data they use.

Access the Toolkit – OSR website

This section draws on that guidance, recognising that quality assurance of administrative data is more than just checking that the figures add up. It is an ongoing, iterative process to assess the data's fitness for purpose. It covers the entire statistical production process and involves monitoring data quality over time and reporting on variations in that quality.

An assessment of the level of risk, based on the Quality Assurance Toolkit's Risk/Profile Matrix, is as follows:

  • Statistical Series: LWMA activity statistics
  • Administrative Source: Excel template
  • Data Quality Concern: Low
  • Public Interest: Low
  • Matrix Classification: Low Risk [A1]

The publication of weights and measures inspection and enforcement activity statistics can be considered low profile, as the scope is limited to the enforcement work of LWMAs. There was more mainstream media interest in the subject in 2021-22 due to the OPSS consultation on units of measurement. Data quality is considered a low concern given that the data are checked by providers and then quality assured in detail by the statisticians responsible for the publication, who perform further validation and checks, spotting and correcting any errors. These checks involve comparisons with the data provided and with published or historical data.

Overall, the weights and measures inspection and enforcement activity statistics have been assessed as A1: Low Risk. This is mainly driven by the low-profile nature of the figures. This is not a reflection of the importance OPSS places on the inspection and enforcement work undertaken by LWMAs; rather, as a factual statistical report, it is regarded as a politically neutral subject with interest limited to a niche user base in the legal metrology field.

The Weights and Measures Act sets out the terms and purpose of what data are provided, and additional guidance is provided by OPSS.

Quality assurance by LWMAs

The Chief Weights and Measures Inspector (CWMI) is the senior responsible person for the accuracy of data returns. This responsibility includes checking the validity and accuracy of the data and amending any errors or omissions on the Section 70 returns template before it is sent to OPSS.

Quality assurance by OPSS

Data received by OPSS undergo a quality assurance process to ensure the data is fit for purpose and published to the highest possible standard. Any data quality issues are flagged and subsequently resolved with LWMAs.

As the returns are received, operational delivery teams in OPSS check them for missing data and confirm that entries are of the correct type, e.g. that text or numbers have been entered into the correct cells. OPSS statisticians then look for data gaps and run variance checks to identify figures that seem unusually large or small compared with the previous year, or unusual patterns in the data. A large increase or decrease is flagged to the LWMA, which is asked for an explanation, for example if it carried out no inspections in a particular year.
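As an illustration, a variance check of this kind can be sketched in a few lines of R, the language used for the publication pipeline. This is a minimal sketch only, with hypothetical column names and an assumed flagging threshold; it is not the OPSS production code.

```r
# Minimal sketch of a year-on-year variance check (hypothetical column
# names and threshold). Flags any LWMA whose reported figure changed by
# more than +/- 50% on the previous year, or that reported zero activity.

returns <- data.frame(
  lwma      = c("Authority A", "Authority B", "Authority C"),
  this_year = c(120, 0, 95),
  last_year = c(110, 42, 90)
)

threshold <- 0.5  # assumed flagging threshold for this example

returns$change <- (returns$this_year - returns$last_year) / returns$last_year
flagged <- returns[abs(returns$change) > threshold | returns$this_year == 0, ]

# Flagged authorities would be asked for an explanation before publication.
print(flagged)
```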

OPSS statisticians collate the returns and produce the HTML commentary, charts and analysis using a Reproducible Analytical Pipeline (RAP) built in the software package 'R'. Using a RAP not only greatly reduces production time, but also reduces the human interaction needed and therefore minimises the risk of errors. Any errors that do occur are likely to be systematic and therefore only need to be fixed once. The R code is often complex, so it needs to be well documented, with considerable attention to detail.
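By way of illustration, the overall shape of such a pipeline might look like the sketch below. The file names and parameters are hypothetical assumptions for this example; the actual OPSS pipeline code is not published here.

```r
# Minimal sketch of a RAP driver script (hypothetical file names). A single
# script reads the collated returns and renders the HTML commentary and
# charts from a parameterised R Markdown document, so the publication can
# be rebuilt from the source data in one step.

library(rmarkdown)

returns <- read.csv("s70_returns_collated.csv")  # hypothetical input file

render(
  input       = "s70_report.Rmd",    # hypothetical report source
  output_file = "s70_report.html",
  params      = list(year = "2022-23")
)
```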

Once the commentary and charts are produced for the publication, they are independently checked against the raw data by a second member of the Analysis team. The checker also works through a clear checklist for each table, covering, for example: totals that should be consistent within and between tables; totals, percentages etc. within tables being calculated correctly; hyperlinks working; and formulae and drop-down lists updating correctly. Each check is systematically signed off when it has been completed.
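Checks of this kind can also be automated within the pipeline. The sketch below uses a hypothetical table structure to verify that component columns sum to the published total in every row; it illustrates the principle rather than the actual checklist.

```r
# Minimal sketch of a totals-consistency check (hypothetical table
# structure): the component columns of each row should sum to the total.

tbl <- data.frame(
  lwma        = c("Authority A", "Authority B"),
  inspections = c(80, 25),
  enforcement = c(40, 17),
  total       = c(120, 42)
)

mismatch <- tbl$inspections + tbl$enforcement != tbl$total
stopifnot(!any(mismatch))  # halt if any row fails, so the error is fixed at source
```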

OPSS statisticians are responsible for checking that the commentary appropriately describes the trends seen in the data and is not biased, and the commentary is checked against the data for accuracy. The report and data tables are checked for accessibility using tailored guidelines and a checklist drawn up from the Web Content Accessibility Guidelines.

Access the Accessibility Guidelines

Reports are peer reviewed and signed off at a senior level in OPSS prior to publication.

The data underpinning publications is held in snapshot form so that the content of publications can be replicated, and so that the option remains for additional historical scrutiny and analysis if required.
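As a simple illustration of the snapshot approach (with hypothetical file paths), the collated data used for a publication can be written out with the date in the file name, so the exact inputs can be retrieved later:

```r
# Minimal sketch of snapshotting the publication data (hypothetical paths).
returns <- read.csv("s70_returns_collated.csv")        # hypothetical input
dir.create("snapshots", showWarnings = FALSE)
snapshot_file <- sprintf("snapshots/s70_collated_%s.csv", Sys.Date())
write.csv(returns, snapshot_file, row.names = FALSE)   # frozen copy for replication
```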

Feedback received may include queries on the data itself, or requests for the reasons behind levels or changes in the data in order to understand trends or unusual figures. It is rare for errors to be made in publications. When they are made, they are corrected either immediately or in the next release (depending on severity and frequency), in line with the revisions policy.

1.3 Revisions policy

OPSS will notify users when they plan to publish new editions and if any revisions are required, in line with T3.9 in the Code of Practice for Statistics. A revision is defined as any change to official statistics in the public domain. Revisions can be pre-announced, or unannounced, such as when data updates are published without prior notice.

Read the Orderly release section of the Code of Practice for Statistics – UK Statistics Authority website

Non-Scheduled revisions

Where a substantial error has occurred as a result of the compilation, imputation or dissemination process, the statistical release, tables and other accompanying documents will be updated with a correction notice as soon as is practical.

Scheduled revisions

Changes to the data sources used in the releases are incorporated in the next scheduled release of data.

2) Quality summary

Quality is defined in terms of how well outputs meet user needs. The Code of Practice for Statistics states that quality means that statistics fit their intended uses, are based on appropriate data and methods, and are not misleading. This definition is a relative one allowing for various perspectives on what constitutes quality depending on the intended use.

Access the Code of Practice for Statistics – UK Statistics Authority website

To determine whether outputs meet users' needs, quality is measured in terms of the quality dimensions of the European Statistical System.

Read about the European Statistical System – Europa website

2.1 Relevance

The degree to which the statistical product meets user needs in both coverage and content.

The statistics provide information on weights and measures inspection and enforcement activity carried out by LWMAs in Great Britain. There is some information available and published from April 2008, although consistent and comparable information is not available for many variables until April 2017.

In addition to the annual report, the underlying data returns from 2019-20 onwards are published in OpenDocument Spreadsheet (ODS) format on GOV.UK.

Access the data returns

The data is used by OPSS policy colleagues to inform decisions, and by operational delivery teams to advise LWMAs. The reports have been used, for example, to identify themes for national projects. In addition, LWMAs use the data to monitor and benchmark performance, to make strategic decisions, for planning and risk assessment and to identify the most important areas to target for inspection activity. Other interest and uses of this data are outlined in the section on Uses and users.

We review our data collections and outputs periodically to ensure that they are relevant, collect reliable data and meet user needs. More details are in section 2.7 on Assessment of user needs and perceptions.

The content of the return outputs was reviewed in 2019, and a few fields were removed due to data completion issues. The 2020-21 report was peer reviewed in summer 2023 by statisticians from DBT.

Uses and users

The statistics produced in the series are used by a range of users to monitor trends in the weights and measures inspection and enforcement activity of LWMAs in Great Britain.

We believe the uses of the weights and measures inspection and enforcement activity statistics are:

  • Informing the general public – the statistics may be used by both national and local media, which in turn informs the public about trends in compliance rates; information on the statistics can also be requested through Parliamentary Questions and Freedom of Information requests.
  • Policy making and monitoring – the statistics are used by policy areas to monitor the state of the LWMAs and to provide context and evidence for policies; the data is also used to provide advice to Ministers.
  • Drawing comparisons – e.g. year-on-year, or by equipment type.
  • Highlighting risks.
  • Comparisons and benchmarking – enabling LWMAs to identify the areas towards which their inspection resources should be prioritised.
  • Third parties – the statistics may be used by a range of third parties, e.g. businesses manufacturing or using measuring instruments.

We believe the users of weights and measures inspection activity statistics may include:

  • Ministers
  • Members of Parliament
  • LWMAs
  • other colleagues within OPSS
  • other government departments, for example DLUHC
  • trade unions
  • journalists
  • Chartered Institute of Public Finance and Accountancy
  • Local Government Association
  • individual citizens
  • private companies and trade associations
  • students, academics and universities
  • charities

2.2 Accuracy and reliability

The proximity between an estimate and the unknown true value.

Weights and measures inspection and enforcement activity data is collected by sending a message to all CWMIs and Heads of Services, along with an Excel template for collating their responses. All LWMAs enter data into the template and send their completed returns to the OPSS Local Authority Unit (LAU) enquiries mailbox.

The Operational Delivery Team follow a five-step process to prepare the returns for the Analysis team:

  • Open the email with the returned template.
  • Check the data entry – ensure text/numbers are in the correct fields.
  • Input the date the return was received into an Excel tracking spreadsheet.
  • Save the template as S70 returns [year] – [name of LA].
  • Upload the saved Excel template file to the SharePoint folder.

The two final actions can also be completed from the mailbox.
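For illustration, the tracking and file-naming steps could in principle be scripted as below. The values and tracking structure are hypothetical; in practice these steps are completed manually from the mailbox or template.

```r
# Minimal sketch of the tracking and file-naming steps (example values only).

year    <- "2022-23"
la_name <- "Example Authority"

# Record the date the return was received in a tracking table.
tracking <- data.frame(lwma = la_name, received = Sys.Date())

# Build the standard file name: "S70 returns [year] – [name of LA]".
file_name <- sprintf("S70 returns %s – %s.xlsx", year, la_name)
print(file_name)
```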

The template has 60 fields and no in-built validation rules to prevent basic data entry errors. There are therefore likely to be some inaccuracies in the data due to reporting or keying errors, such as misclassification or missing cases. The level of missing data on fields is very low; missing data is reported as blank, and no grossing, imputation or other estimation methods are used.
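As an illustration of the kind of check applied to compensate for the lack of in-built validation, the sketch below (with hypothetical field names) tests that numeric fields coerce cleanly to numbers after a return is received:

```r
# Minimal sketch of field-level validation after receipt (hypothetical
# field names). An NA after numeric coercion indicates a keying error,
# such as stray text in a numeric cell.

return_row <- list(
  lwma              = "Authority A",
  inspections_total = "120",   # values arrive as text from the template
  officer_fte       = "3.5"
)

numeric_fields <- c("inspections_total", "officer_fte")
values <- suppressWarnings(as.numeric(unlist(return_row[numeric_fields])))

if (any(is.na(values))) {
  warning("Non-numeric entry in a numeric field; query with the LWMA")
}
```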

OPSS provides support through the LAU enquiries email inbox and a general telephone helpline to LWMAs that experience any difficulties in completing the template or are unable to complete their returns within the deadline.

As discussed in the section on quality assurance, the Analysis team run some specific checks before publishing the data.

Accuracy can be broken down into sampling and non-sampling error. The data requested and provided by LWMAs are required under legislation, and we aim to achieve a 100 per cent response rate for the collections, reducing sampling error to the minimum. Occasionally LWMAs cannot complete the return in the required timescale, or cannot complete it due to technical difficulties with their recording systems; any missing returns are noted in the published report.

The published report is produced via a RAP, as discussed in the section on quality assurance, and these statistics are therefore considered to be robust and reliable.

Non-sampling error includes areas such as coverage error, non-response error, measurement error and processing error.

We aim to reduce non-sampling error through the provision of guidance about the data collections and the definitions of the data items.

2.3 Timeliness and punctuality

Timeliness refers to the time gap between publication and the reference period. Punctuality refers to the gap between planned and actual publication dates.

There is a trade-off between timeliness and the other quality dimensions, in particular accuracy, accessibility and clarity. It is important that the release has adequate processes in place to ensure the accuracy of the dataset and to produce clear publication outputs.

To provide timely data to users, we aim to publish figures annually, in as timely a manner as possible given resource constraints. OPSS publishes the report in early autumn to help inform LWMA business planning decisions for the following year. The April 2021 to March 2022 data were published on 31 October 2022.

LWMAs are given around eleven weeks to complete the Section 70 return. The data is then quality assured by OPSS statisticians; charts and commentary are prepared and are themselves quality assured before being prepared for publication.

The data production and publication schedules are kept under review and take into account user needs when considering the timeliness of future data releases.

2.4 Accessibility and clarity

Accessibility is the ease with which users are able to access the data, also reflecting the format in which the data is available and the availability of supporting information. Clarity refers to the quality and sufficiency of the metadata, illustrations and accompanying advice.

The Weights and Measures statistics webpages are accessible from the OPSS GOV.UK pages. An RSS feed alerts registered users to this publication. All releases are available to download for free.

The outputs aim to provide a balance of commentary and charts accompanied by ODS files of the data returns. The aim is to ‘tell the story’ in the output, without the output becoming overly long and complicated.

The publication is available in HTML format and includes email contact details for sending queries and feedback to the production team.

The format used is in line with GOV.UK guidance. It aims to make outputs clear for the audience, and all outputs adhere to the accessibility policy. Key users of the publication are informed of the statistics on the day of their release. Further information regarding the statistics can be obtained by contacting opssanalysis@businessandtrade.gov.uk.

The data published on the Weights and Measures collection page of GOV.UK are subject to rights detailed in the Open Government Licence v3.0: ‘All content is available under the Open Government Licence version 3, except where otherwise stated’.

View the Open Government Licence – National Archives website

The statistics are taken directly from the source data that is collected for administrative purposes with little manipulation between source and publication.

2.5 Coherence and comparability

Coherence is the degree to which data that are derived from different sources or methods, but refer to the same topic, are similar.

Many of the statistics covered by this report are the only source of official data on their subject. The data is collected from all providers on the same form, with accompanying guidance and definitions. This ensures consistency across the different types of data provider.

Comparability is the degree to which data can be compared over time and domain.

The statistics are taken directly from 190 LWMAs in Great Britain (some of these are combined returns from more than one LWMA). All LWMAs use the same question set and guidance documents.

Comparison over time

The OPSS published reports cover the period from 2017 onwards. Prior to this, reports were published by the National Measurement and Regulation Office and the data was presented in a different format. Statisticians are examining the source information to see if it can be included in future publications.

Devolved administration data sources

Wales and Scotland use the same data collection tool. However, strategies, policies and approaches may differ between countries, and this will affect the comparability of statistical outputs in some cases.

Northern Ireland statistics are published by the Northern Ireland Department for the Economy.

Geographies below England level

The data is published at LWMA level. The provision of clear guidance and definitions and the extensive validation carried out help to ensure that the data is consistent across LWMAs. There is no data collected at a lower geographical level.

Harmonisation of statistics inputs and outputs

A cross-government programme of work is currently underway looking into standardising inputs and outputs for use in National Statistics. This is known as harmonisation. The Government Statistical Service published a Harmonisation Strategy in 2019. Its aim is to make it easier for users to draw clearer and more robust comparisons between data sources. OPSS adopts harmonised questions where possible, and harmonisation will be part of any ongoing changes to the system.

Access the Harmonisation Strategy – Government Analysis Function website

2.6 Trade-off between output quality components

Trade-offs are the extent to which different aspects of quality are balanced against each other.

The small cost to local authorities of providing the return is far outweighed by the savings from authorities being able to target their resources effectively.

As discussed previously, the timetable for publication is designed to provide users with the best balance of timeliness and accuracy. At the time of production of the publication, some LWMAs' data may still be missing – the extent of this is noted in the publications.

2.7 Assessment of user needs and perceptions

The processes for finding out about users and uses, and their views on the statistical products.

OPSS teams work closely with key customers and stakeholders in both OPSS and LWMAs to keep track of developments, and to continuously review the coverage and content of publications to ensure that they meet the needs of users.

We also consult our users on data collection issues in order to better understand user requirements and priorities for the future. As part of this, OPSS policy colleagues, LWMAs and others have provided information on how they use statistics as discussed in the earlier section on Uses and users. We encourage feedback on all our outputs and data collections. Contact details are available at the bottom of each publication page for users to get in touch if they have a query.

Stakeholder engagement

OPSS statisticians have worked closely with stakeholders to ensure that as much data as possible is shared, upholding the principle of 'collect once, use many times'.

We use networks such as LWMA legal metrology groups at national and regional level, as well as LWMAs themselves, to get feedback on the data collection, and engage where necessary to answer queries and provide advice. In addition, the Local Authority Reference Panel, which meets quarterly, sometimes discusses weights and measures inspection and enforcement activity data collection requirements, including identifying new priorities and areas for improvement.

The national Legal Metrology Expert Panel discusses legal metrology technical and policy issues of relevance to LWMAs. It meets twice a year, consists of senior weights and measures inspectors from around the UK, and is attended by OPSS.

User consultations

Broader consultations are conducted when appropriate, for example when significantly changing the provision or coverage of the published data, or revising the methodology used. This is done via a sub-group of the national Legal Metrology Expert Panel, including representatives of the professional organisations the Association of Chief Trading Standards Officers (ACTSO) and the Chartered Trading Standards Institute (CTSI).