Official Statistics

Planning Inspectorate Ministerial Statistics Background Quality Report 23 May 2024

Published 23 May 2024

Applies to England

1. Introduction

This background quality report assesses the quality of experimental statistics for the Planning Inspectorate using the European Statistical System (ESS) Quality Assurance Framework (QAF). This is the method recommended by the Government Statistical Service (GSS) Quality Strategy. Statistics are of good quality when they are fit for their intended use.

The ESS QAF measures the quality of statistical outputs against the dimensions of:

  • relevance
  • accuracy and reliability
  • timeliness
  • accessibility and clarity
  • comparability and coherence

The GSS also recommends assessment against 3 other principles in the ESS QAF. These are:

  • trade-offs between output quality components
  • confidentiality and transparency
  • balance between performance, cost and respondent burden

These dimensions and principles cut across the three pillars of trustworthiness, quality and value in the Code of Practice for Statistics.

This quality assessment covers the experimental statistics produced to allow anyone to see how the Planning Inspectorate is performing against a set of Ministerial measures introduced in January 2022.

2. Background and Context

The Planning Inspectorate’s job is to make decisions and provide recommendations and advice on a range of land use planning-related issues across England. This is done in a fair, open and timely way.

The Planning Inspectorate deals with planning appeals, national infrastructure planning applications, examinations of local plans and other planning-related and specialist casework in England.

The Planning Inspectorate is an executive agency, sponsored by the Department for Levelling Up, Housing and Communities.

3. Methodology and Production

The casework statistics provided in this publication are produced using data from:

  • The casework management systems used for processing appeals casework, Horizon and Picaso. These have been used to produce the statistics on our casework.
  • Spreadsheets – some of the casework data, for Tree Preservation Orders, High Hedges appeals and Hedgerow appeals, is extracted from source MS Excel spreadsheets. This data is used in conjunction with Horizon data to calculate performance statistics.

The data on appeals valid on first submission (Section A) and casework timeliness (Section C) were extracted on 1 May.
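
As an illustration only, the combination of these sources might look like the following sketch. All file, table and field names here are hypothetical, and this is not the Inspectorate's actual production code.

```python
import pandas as pd

# Illustrative only: combine a Horizon extract with specialist casework
# held in spreadsheets (Tree Preservation Orders, High Hedges, Hedgerows).
# File and column names are hypothetical.
horizon = pd.read_csv("horizon_extract.csv", parse_dates=["received_date"])
specialist = pd.read_excel("specialist_casework.xlsx", parse_dates=["received_date"])

# Align on a common set of columns and stack into one casework table.
columns = ["case_ref", "case_type", "received_date"]
casework = pd.concat([horizon[columns], specialist[columns]], ignore_index=True)

print(casework["case_type"].value_counts())
```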

4. Relevance

The Planning Inspectorate has proactively decided to produce these statistics quarterly to better meet user needs. We welcome feedback and will continue to develop the statistics over time to ensure we continue to meet user needs.

As these are experimental statistics, we are particularly keen to receive feedback via statistics@planninginspectorate.gov.uk on whether readers find them relevant.

The release may also be used to answer press queries, parliamentary questions and Freedom of Information requests. The report is also useful for internal customers to support evidence-based decisions and to support discussions with external stakeholders.

5. Accuracy and Reliability

The Planning Inspectorate use administrative data from operational delivery systems to compile these statistics. As these data come from live systems, there are occasions when this data changes. Data used in the publication is based on data recorded in these systems at the time of extraction.

We are using administrative data for measuring performance for the first time in some cases. We are working with operational colleagues to ensure that data collected is complete and fit for purpose going forwards and our analysts will continue to develop our quality assurance processes with regard to both the data and our subsequent analysis.

This information and the associated data collection methods will be quality assured, to develop a longer-term solution to collecting these statistics. Whilst this work is in progress, these numbers should be treated as experimental.

The possible changes that could occur in these statistics include:

  • Data entry error – some data may be entered in a form that is incomplete or in a format that cannot be processed. For example, there are occasionally errors in date fields; these are highlighted in internal data quality reports and the Inspectorate is working to improve the quality of data that supports this publication.
  • Recategorisation – on occasion the categorisation of cases may change (for example, the procedure type), and this will be recorded differently in the latest publication compared with previous versions.
  • Late updates – delays in updating records on operational systems mean that changes may apply to data older than the latest month released.

There are instances where case records indicate a case has been closed and a decision (such as whether the appeal has been dismissed or allowed) has been recorded, but no date has been entered. It is not clear whether the decision has been added in error or the date omitted in error. Any such case record is excluded from the counts of the number of decisions (which use the month of the decision), which may give an under-estimate. This applies to fewer than 100 cases received in a year, in the context of over 17,000 decisions a year. Further work is required to automatically identify these cases and have any errors amended.
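
A minimal sketch of how such records might be excluded (field names are hypothetical; the actual extraction logic may differ):

```python
import pandas as pd

# Illustrative extract; in practice this data comes from Horizon/Picaso.
cases = pd.DataFrame({
    "case_ref": ["A1", "A2", "A3"],
    "decision": ["Allowed", "Dismissed", "Allowed"],
    "decision_date": [pd.Timestamp("2024-03-04"), pd.NaT, pd.Timestamp("2024-03-18")],
})

# Exclude records with a decision recorded but no decision date, since the
# month of decision cannot be determined; this may slightly under-count.
decided = cases[cases["decision"].notna() & cases["decision_date"].notna()]
monthly_decisions = decided.groupby(decided["decision_date"].dt.to_period("M")).size()

print(monthly_decisions)
```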

One of the measures in the report is the number of decisions in a given time period. This is not the same as the number of closed cases, which is considerably higher as it includes cases where an appeal is withdrawn, notice is withdrawn, or the appeal is turned away.

The section on Coherence and Comparability gives details of issues identified in relation to data quality and accuracy.

6. Timeliness and Punctuality

These statistics are intended to be published quarterly (every three months) within two months of the end of the reporting period. This is to allow time to produce the statistics while ensuring they are timely for users.

7. Accessibility and Clarity

These are new statistics and we are keen that they are as clear and accessible as possible.

In some areas, such as where we provide information on a selection of case types, we are keen to strike the right balance between providing too little information and providing too much, or overly complex, information.

We would be very grateful for any feedback on whether we have got this balance right, as outlined in the main document.

The statistics are published on the GOV.UK website. The publication is available from 09:30 hours on the day of release.

All tables and most figures from the statistics are separately available in MS Excel format for users to download. This allows for use in individual research and reports.

8. Coherence and Comparability

There are some small discrepancies between timeliness statistics presented in this publication and equivalent measures in the April 2024 monthly official statistics publication, which covers the same period. This is due to a one-month gap between the two sets of data being extracted from our administrative systems, during which time updates can be made to case records.

The publication includes trends over an 18-month period to allow comparisons over time. Where significant changes are observed in the statistics, these have been explained.

These are official statistics under development and as such we may change them as a result of feedback received. This means that it may not be possible to ensure comparability of future statistics with those in this release – for example if a different measure is used, or if quality assurance results in changes to data production methods.

Issues with the data identified in this statistical bulletin are as follows:

A. Appeals Valid on First Submission

Calculation (and delays in validation)

The proportion of appeals valid on first submission is calculated as the number of appeals valid on first submission, as a proportion of all appeals received in a given period. In making this calculation, however, appeals that have not yet been validated are excluded. There are two issues in relation to this:

  1. Some cases experience extensive delays in being recorded as validated.
  2. There seem to be some case types that are not recorded on the main case management system, where validation is not being recorded in a way that enables statistical analysis.
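
Returning to the calculation itself, it can be expressed as a formula as follows (a restatement of the definition above, with not-yet-validated appeals removed from the denominator):

```latex
\text{Valid on first submission (\%)} =
  \frac{\text{appeals valid on first submission}}
       {\text{appeals received} - \text{appeals not yet validated}}
  \times 100
```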

Calculation (and use of proxy measure)

Our data systems do not enable us to know accurately whether an appeal is valid the first time it is received. We therefore use a proxy measure: the date on which an appeal is validly received is compared with the date on which it is received, and if the two are the same, the appeal is counted as “valid first time”.

This means that an appeal which is not valid first time, but has any issues quickly resolved and is resubmitted on the same day, will be counted as valid first time. The measure may therefore slightly over-estimate the number of cases valid first time.
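
A minimal sketch of the proxy calculation (field names are hypothetical, and this is illustrative rather than the actual production code):

```python
import pandas as pd

# Illustrative extract; field names are hypothetical.
appeals = pd.DataFrame({
    "received_date": pd.to_datetime(["2024-01-10", "2024-01-10", "2024-01-12"]),
    "valid_date": pd.to_datetime(["2024-01-10", "2024-01-15", None]),
})

# Exclude appeals not yet validated (no valid date recorded).
validated = appeals[appeals["valid_date"].notna()]

# Proxy: an appeal counts as valid first time if the date it was validly
# received equals the date it was received.
valid_first_time = validated["valid_date"] == validated["received_date"]

print(f"Valid on first submission: {valid_first_time.mean() * 100:.1f}%")
```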

It should be noted that the date for ‘validly received’ is the date on which the information was received, even if it is assessed as being valid on a later date. This avoids a situation where appeals are judged not to have been valid, due to any delay in their being assessed.

The calculation method for producing the proportion of cases valid first time has changed for this edition, following a further review. This change relates to the way that cases without a recorded valid date are treated. The complete five-quarter time series presented in this bulletin has been updated with the new method. The observed impact of this change on the series presented in this edition is that the proportion of cases valid first time has increased by between approximately 2 and 3.5 percentage points.

B. How Long Appeals Take

The data used cover decisions made in the 12 months from January 2023 to December 2023. These were downloaded on 4 January 2024 and are consistent with other Official Statistics published with data for the same period.

A small number of cases were found to have missing information on time from received to decision, or a negative time. These cases have been excluded from the analysis. The reasons for the missing or invalid times are not clear.
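
As an illustration, durations might be derived and cleaned along these lines (field names are hypothetical):

```python
import pandas as pd

# Illustrative extract; field names are hypothetical.
decisions = pd.DataFrame({
    "received_date": pd.to_datetime(["2023-02-01", "2023-05-10", None]),
    "decision_date": pd.to_datetime(["2023-06-15", "2023-04-01", "2023-08-20"]),
})

# Time from receipt to decision, in weeks.
weeks = (decisions["decision_date"] - decisions["received_date"]).dt.days / 7

# Exclude missing or negative durations, which indicate a data quality issue.
valid_weeks = weeks[weeks.notna() & (weeks >= 0)]

print(f"Median weeks to decision: {valid_weeks.median():.1f}")
```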

C. Customer Satisfaction

The Planning Inspectorate have worked with the Institute of Customer Service (ICS) to conduct a satisfaction survey. The data capture phase was carried out in April and early May 2023.

The response rate for this survey was 16 per cent – this represents 635 responses. As with all surveys it is important to note that those who do not respond to the survey may have different views to those who do respond.
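
For context, and assuming the response rate is calculated as responses divided by surveys issued, these figures imply that roughly 4,000 customers were invited to take part:

```latex
\text{surveys issued} \approx \frac{635}{0.16} \approx 3{,}969
```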

A stratified sample was used to select customers to be contacted with the survey. This sample covered eight types of casework (planning appeals; householder appeals; commercial appeals; advertisement appeals; planning listed building and conservation area appeals; enforcement notice appeals; enforcement listed building and conservation area appeals; and lawful development certificate appeals), all three categories of procedure (written representations; hearing; and inquiry), and both agent-represented and unrepresented appeals. This sampling method differs from that of the previous ICS survey, which hinders our ability to compare results over time.
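
A minimal sketch of this kind of stratified selection (the population frame, strata values and sampling fraction are all hypothetical):

```python
import pandas as pd

# Hypothetical population frame of appeal customers.
frame = pd.DataFrame({
    "case_type": ["planning", "householder", "planning", "householder"] * 250,
    "procedure": ["written", "hearing", "inquiry", "written"] * 250,
    "represented": [True, False, True, False] * 250,
})

# Stratified sample: draw the same fraction from every combination of
# case type, procedure and representation status.
sample = (
    frame.groupby(["case_type", "procedure", "represented"])
         .sample(frac=0.2, random_state=1)
)

print(sample.groupby(["case_type", "procedure"]).size())
```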

D. Number of Cases Quality Assured

The number of decisions quality assured may be an under-estimate. This is because some decisions are recorded without a grade for the person making them. If some of these decisions were in fact made by Appeals Planning Officers, then they will have been omitted from the count of APO decisions, and thus from the total count of decisions quality assured.

It has been established that some case reference numbers are being recorded inconsistently, which means that they are being excluded from the report used to count cases quality assured. This is a result of a recent change to casework management systems in the Inspectorate. The scope of this issue is limited to specialist cases such as Tree Preservation Orders, which make up a small percentage of all appeals casework. The impact of the inconsistent reference numbers is an under-reporting of cases quality assured.

9. Trade-offs between Output Quality Components

Where possible, the cost to Government of producing these statistics has been minimised by using data already collated for operational delivery purposes. The main sources of data used for compiling these statistics are the casework management systems, Horizon and Picaso. These are large administrative databases and, as such, the quality and completeness of data vary across fields.

10. Quality Assurance

These are new statistics, and the quality assurance processes around them, put in place by members of the Data and Performance Team, are also new; while based on good practice, the processes are not yet well established. They will be reviewed and developed for future publications.

Data feeding the publications undergoes quality checks to ensure the correct data has been extracted and the appropriate filters have been applied. Subsequently, the layout and presentation of the data in the statistical release is reviewed by multiple members of the Data and Performance Team to ensure that the data is presented appropriately, to aid correct interpretation by the user.

11. Assessment of User Needs and Perceptions

Publication of this report is proactive in anticipation of user interest in the Planning Inspectorate’s performance against Ministerial Measures published in January 2022.

The Experimental Statistics are intended to pre-empt questions from the media and the general public about the Planning Inspectorate’s performance. This report also contributes to the Planning Inspectorate’s commitment to release information where possible.

The Planning Inspectorate invite users to provide feedback on any of their publications or reports using the contact information within the publication.

12. Performance, Cost and Respondent Burden

The production of these Experimental Statistics requires less than one full-time equivalent (FTE) per annum.

The report at present uses administrative data sources already collected by the Planning Inspectorate. For these, there is no respondent burden, and the main cost is the production of the statistics, including quality assurance and data interpretation. The customer satisfaction measure draws on a survey, which does involve a respondent burden.

13. Confidentiality, Transparency and Security

The Data and Performance Team involved in the production of these Experimental Statistics have completed the government-wide Responsible for Information training, and they understand their responsibilities under the Data Protection Act and the Code of Practice for Statistics.

The Data and Performance Team adhere to the principles and protocols laid out in the Code of Practice for Statistics and comply with pre-release access arrangements. The pre-release access lists for our publications are available on the GOV.UK website.

14. Contact Details

The Planning Inspectorate welcome feedback on our statistical products. If you have any comments or questions about this publication or about our statistics in general, you can contact us as follows:

Media enquiries: 0303 444 5004, email press.office@planninginspectorate.gov.uk

Public enquiries: email statistics@planninginspectorate.gov.uk

Please note we are currently reviewing our statistics with a view to making them as clear and helpful as possible for users. We would be delighted if you could contact us via the address below with any views on this approach, particularly on what content would be most useful and why.

email statistics@planninginspectorate.gov.uk

15. Official Statistics Designation

The Planning Inspectorate Ministerial Measures bulletin is designated as Official Statistics in Development. The bulletin, and this associated Background Quality Report, are produced according to the principles of Trustworthiness, Quality and Value. The statistics are undergoing development; in particular, we are assessing aspects of data quality, coherence with other statistics that we produce, and the clarity of charts and other elements of presentation. If you would like to provide feedback to contribute to this, please contact: statistics@planninginspectorate.gov.uk

Our statistical practice is regulated by the Office for Statistics Regulation (OSR).

OSR sets the standards of trustworthiness, quality and value in the Code of Practice for Statistics that all producers of official statistics should adhere to.

You are welcome to contact us directly with any comments about how we meet these standards.

Alternatively, you can contact OSR by emailing regulation@statistics.gov.uk or via the OSR website.