Official Statistics

Adult Oral Health Survey 2021: technical report

Updated 5 April 2023

Applies to England

Introduction and background

The 2021 Adult Oral Health Survey (AOHS) was carried out in February and March 2021 with a representative sample of adults in England aged 16 and over. Data was collected using self-completion questionnaires, completed in web and paper form. A total of 6,343 responses were received.

The survey was commissioned by Public Health England (PHE), now the Office for Health Improvement and Disparities. The survey was carried out by a consortium led by the National Centre for Social Research (NatCen), which included the University of Birmingham, King’s College London, the School of Dental Sciences at Newcastle University, the Dental Public Health Group and Department of Epidemiology and Public Health at University College London (UCL), and the Office for National Statistics (ONS). The University of Leeds and the School of Clinical Dentistry, University of Sheffield also provided guidance and support to the survey and its design.

The AOHS is the latest in a series of nationally representative surveys of adults’ oral and dental health in England. The first Adult Dental Health (ADH) surveys took place in 1968 and 1978 in England and Wales. These surveys were carried out by consortia of academic dental centres, led by the Government Social Surveys Unit in 1968 and the Office for Population, Censuses and Surveys thereafter. Similar surveys were carried out in Scotland in 1972 and in Northern Ireland in 1979.

The coverage of the surveys in 1988 and 1998 was extended to Scotland and Northern Ireland. In 2009, the survey covered adults in England, Wales and Northern Ireland and was carried out by a consortium led by ONS, and including the Universities of Birmingham, Dundee, Cardiff and Newcastle, UCL, NatCen and the Northern Ireland Social Research Agency (NISRA).

All these surveys collected data through face-to-face interviews with adults in their own homes, followed by an oral examination by a qualified dental practitioner. The content of questionnaires and examinations included dental and oral health and their impact, service use, and attitudes and behaviours. While reflecting changing priorities, many measures remained unchanged across survey waves and have enabled detailed tracking of trends over time.

Development work for the 2021 survey was commissioned in early 2020 and included the proposal to rename the survey as the Adult Oral Health Survey, to reflect the scope of its concerns. At that time, it was expected that the survey would follow closely the design of previous ADH surveys, including a face-to-face interview and an oral examination. By late 2020, when data collection was commissioned, the progress of the coronavirus (COVID-19) pandemic meant that this was not feasible at that time. Consequently the 2021 AOHS was carried out as a web and paper survey with no oral examination, thereby reducing comparability with previous surveys in the series.

Ethical clearance for the survey was provided by NatCen’s Research Ethics Committee.

Survey development

In early 2020, PHE commissioned 3 strands of development work for a survey of adult dental health in England to be carried out later in 2020:

  • a stakeholder consultation
  • the survey protocol, including the sample design
  • data collection instruments: an interview questionnaire and the protocol for an oral examination

These 3 strands were undertaken by a consortium comprising NatCen, Birmingham University, King’s College London, Newcastle University, UCL and the ONS. While all consortium members were actively involved in all 3 strands, UCL was the lead organisation for the stakeholder consultation and ONS for the survey protocol and the sample design. NatCen was the lead organisation for the design of the data collection instruments and also provided overall project co-ordination.

The consultation ran from February to April 2020 and consisted of the following elements.

  • 3 in-person meetings in London, Newcastle and Birmingham
  • 1 virtual meeting
  • individual conversations by telephone and email
  • an online questionnaire

The consultation attracted responses from:

  • PHE (central team, consultants, trainees)
  • NHS (NHS England commissioners, consultants, specialists)
  • British Dental Association
  • Faculty of Dental Surgery
  • Faculty of General Dental Practice
  • Health Education England
  • General Dental Services
  • Community Dental Services
  • third sector organisations
  • academia (in the UK and internationally)
  • scientific societies

Overall, 89 individuals and organisations contributed their views to the stakeholder consultation for the next ADH survey:

  • 52 took part in the 4 stakeholder consultation events
  • 8 individuals contributed through individual remote meetings and email communication
  • 29 individuals or organisations provided input through the online survey questionnaire

The consultation emphasised the value of the survey and its findings, which were seen as the only source of representative oral health data for adults, needed to inform health policy and workforce planning. The findings also identified priorities for existing and new areas of coverage. In addition, the methodological approach was considered and suggestions were made for improvement and innovation.

Final reports were presented to PHE in June 2020. A change to the survey name was recommended. The survey protocols included updating the sampling frame to reduce the proportion of ineligible addresses in the sample, and a new approach to the recruitment of dental examiners. The draft questionnaire included a broader focus on general health and lifestyle and more focused questions about past treatment, as well as retaining questions on the impact of oral health problems and barriers to accessing care (anxiety and cost).

New areas for inclusion were outlined, including the impact of COVID-19 (this was in the first few months of the pandemic) and the prevalence of dry mouth. The examination protocol was designed as far as possible to be comparable with past surveys, with the inclusion of enamel caries within the criteria for coronal surfaces and some revisions to the periodontal examination.

By this time the COVID-19 pandemic had seriously affected everyday life within England, including the practicality of carrying out face-to-face survey research. Large-scale government-funded surveys, such as the National Travel Survey and the Family Resources Survey, were suspended and, where possible, data collection was moved to other modes, chiefly via the web, telephone or postal questionnaires. The challenges to health-related surveys, such as the Health Survey for England and the National Diet and Nutrition Survey, were particularly acute, given that these surveys collect physical measurements and biological samples. The original design of the AOHS, which included a dental examination, was affected by similar considerations.

In late 2020, PHE commissioned the same consortium that had produced the development work to carry out the data collection for a 2021 AOHS. There was no certainty about when face-to-face interviewing or dental examinations would be possible. A ‘push-to-web’ design was agreed, using online questionnaires, complemented by paper questionnaires for those who were unwilling or unable to complete the survey online. The original sample design was kept, but the issued sample was increased to reflect the lower response rate to be expected from an online rather than an interviewer-administered survey (see the ‘Sample’ section below). The ‘Questionnaire’ section of this report details the changes made to the questionnaire content to accommodate the change in mode.

Sample

Overview of the sample design

The sample was designed to be representative of adults aged 16 and over living in private households in England. In common with previous ADH surveys, the 2021 AOHS was based on a multi-stage stratified probability sample design. The sample was provided by the ONS and was drawn from the AddressBase Premium database.

This database combines data from local authorities, Royal Mail and Ordnance Survey, and enables more accurate identification of residential units than the Postcode Address File used in previous ADH surveys.

Several changes had occurred since 2009 that influenced the survey design for the 2021 survey:

  • the 10 strategic health authorities were superseded by 7 public health areas (or regions)
  • primary care trusts were replaced by clinical commissioning groups (CCGs)
  • the change to an online and paper self-completion approach necessitated the drawing of a larger sample than in 2009 in order to compensate for the lower response expected by using these modes instead of face-to-face interviewing

The sample was designed to cover the 7 PHE regions, with a minimum of 1,500 addresses sampled per region. (The boundaries of these regions remained the same; the reduction from 10 to 7 was achieved by combining the North East with Yorkshire and the Humber, the East Midlands with the West Midlands, and the South East Coast with the South Central regions.)

The aim was to interview a target of at least 500 adults aged 16 and over in each region (see the ‘Response’ section below for more on response). A reserve sample was selected to ensure that this target would be met. In the event, the fieldwork period was too short to allow for a review of response before releasing the reserve sample and it was issued at the same time as the main sample.

Although face-to-face fieldwork was not possible at the time the survey was carried out, a clustered sample design, similar to that used in previous surveys, was used. This was to allow for a clinical follow-up should that become feasible within a reasonable period after the initial data collection.

The sample design of the 2021 AOHS was defined as follows:

  • a clustered probability sample of 23 primary sampling units (PSUs) per region
  • PSUs formed by pairing neighbouring postcode sectors within each region
  • 38 addresses per postcode sector selected at random for the main sample and 22 addresses for the reserve sample
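
As a quick check, the design figures above reproduce the issued sample sizes reported later in this section (a minimal arithmetic sketch using only numbers quoted in the text):

```python
# Sample-size arithmetic implied by the 2021 AOHS design figures quoted above.
regions = 7            # PHE regions
psus_per_region = 23   # primary sampling units (PSUs) per region
sectors_per_psu = 2    # each PSU pairs 2 neighbouring postcode sectors

sectors = regions * psus_per_region * sectors_per_psu  # 322 postcode sectors

main_sample = sectors * 38     # 38 addresses per sector (main sample)
reserve_sample = sectors * 22  # 22 addresses per sector (reserve sample)

print(main_sample, reserve_sample)  # 12236 7084
```

These totals match the 12,236 main and 7,084 reserve addresses reported in the ‘Selection of addresses’ section, before the small number of unmatched UPRNs was removed.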

The sample was designed to be representative of the adult population of England and large enough to facilitate analysis by key demographic indicators and geographical characteristics.

Sampling method

Forming the clusters and selecting the PSUs

ONS used a dataset containing the distance between every pair of postcode sectors in England to run an algorithm that obtained the best pairing of postcode sectors in each region. Nine regional files were produced, and a representative sample for each of the 7 PHE regions was drawn from these files.
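
The report does not specify the pairing algorithm ONS used; a simple greedy nearest-neighbour pass gives the flavour of distance-based pairing (an illustrative sketch only; the function name and distance interface are invented for this example):

```python
def pair_sectors(sectors, distance):
    """Greedy pairing of postcode sectors by distance (illustrative only;
    not ONS's actual pairing algorithm, which is not described in the report).

    sectors:  list of sector identifiers within one region.
    distance: function (a, b) -> distance between two sectors.
    Returns a list of (sector, nearest available sector) pairs.
    """
    unpaired = set(sectors)
    pairs = []
    for s in sectors:
        if s not in unpaired:
            continue               # already paired with an earlier sector
        unpaired.discard(s)
        if not unpaired:
            break                  # odd sector left over; in practice it
                                   # would be combined with a neighbour
        nearest = min(unpaired, key=lambda t: distance(s, t))
        unpaired.discard(nearest)
        pairs.append((s, nearest))
    return pairs
```

A production implementation would more likely solve a global minimum-weight matching rather than pair greedily, but the input (a pairwise distance table) and output (paired sectors forming PSUs) are the same.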

The PSUs were selected from a list sorted by 2 census-based factors (also used in 2009):

  • the proportion of households in a postcode sector where the household reference person is in National Statistics Socio-economic Classification groups 1 to 3 (referred to as HNSSEC3)
  • the proportion of people aged 65 or over (P65plus)

Pairs of matched postcode sectors that included a postcode sector not available on the PSU table were removed from the sampling frame. Very small postcode sectors were combined with other postcode sectors. This gave a final total of 3,899 PSUs, compared with 4,122 PSUs in the full list of matched postcode sectors.

The PSUs on the sampling frame were split into 2 HNSSEC3 bands and 3 P65plus bands. The bands were defined in such a way that the total number of delivery points was approximately equal across bands within each category.

Systematic probability-proportional-to-size (PPS) sampling, where size is given by the number of delivery points in a cluster, was used within each region to select 23 PSUs. HNSSEC3_2bands and P65plus_3bands defined the minor strata and, within each minor stratum, the PSUs were sorted by CCG.
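
The selection step described above can be sketched as follows (a minimal illustration, not ONS's actual implementation; the function name and interface are invented for this example):

```python
import random

def systematic_pps(units, sizes, n, rng=random.random):
    """Systematic probability-proportional-to-size (PPS) selection.

    units: PSU identifiers, pre-sorted by the minor strata (the HNSSEC3
           and P65plus bands) and, within each stratum, by CCG.
    sizes: number of delivery points in each PSU (the size measure).
    n:     number of PSUs to select (23 per region in the 2021 AOHS).
    """
    total = sum(sizes)
    interval = total / n          # sampling interval in delivery points
    start = rng() * interval      # random start in [0, interval)
    points = [start + i * interval for i in range(n)]
    selected, cumulative, j = [], 0.0, 0
    for unit, size in zip(units, sizes):
        cumulative += size
        # Select this PSU for every selection point that falls within its
        # cumulative size range.
        while j < n and points[j] < cumulative:
            selected.append(unit)
            j += 1
    return selected
```

Because the list is sorted by the stratifiers before selection, a single systematic pass spreads the 23 selections across the minor strata implicitly. A PSU larger than the sampling interval could be selected more than once; in this design PSUs are small relative to the interval.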

To assess the representativity of the sample by CCG, the percentage of postcodes in each CCG in the selected sample was plotted against the equivalent measure in the full list of postcode sectors.

Selection of addresses

In each postcode sector, the aim was to select 38 addresses for the main sample and 22 for the reserve sample. To achieve this:

  • 60 addresses were first selected from each postcode sector included in the selected PSUs using systematic random sampling from AddressBase. Addresses that had recently been used in ONS surveys were excluded
  • 38 addresses were drawn from the initial list of selected addresses (see above) in each postcode sector to form the main sample, with a total sample size of 12,236 addresses. The remainder of addresses formed the reserve sample of 7,084 addresses
  • the unique property reference numbers (UPRNs) of the selected addresses were then matched to the address register to obtain the full addresses; 48 of the UPRNs were not matched successfully. As a result, the main sample had 12,202 addresses and the reserve sample 7,070 addresses
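
The within-sector steps above can be sketched as follows (illustrative only; the function and variable names are invented, and a real AddressBase extract would carry UPRNs rather than simple list entries):

```python
import random

def select_addresses(sector_addresses, n_total=60, n_main=38, seed=None):
    """Illustrative address selection within one postcode sector.

    Draws n_total addresses by systematic random sampling from the
    sector's AddressBase list, then splits them into a main sample
    (n_main addresses) and a reserve sample (the remainder).
    Assumes the sector contains at least n_total addresses.
    """
    rng = random.Random(seed)
    interval = len(sector_addresses) / n_total
    start = rng.random() * interval        # random start in [0, interval)
    picked = [sector_addresses[int(start + i * interval)]
              for i in range(n_total)]
    main = picked[:n_main]                 # 38 addresses for the main sample
    reserve = picked[n_main:]              # 22 addresses for the reserve
    return main, reserve
```

In the survey itself the split was 38 main and 22 reserve addresses per sector, with selected UPRNs then matched back to the address register to obtain full addresses.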

Questionnaire

Background

The survey protocols and data collection instruments designed by the consortium for PHE in early 2020 were produced with the expectation of undertaking data collection face to face, as was the case for previous national surveys of adult dental health. By late 2020, the circumstances of the pandemic made it clear that face-to-face interviewing in individuals’ homes would not be possible when the survey fieldwork was due to be completed (in early 2021). As a result, the survey design was adapted to a web-based self-completion approach, supplemented by paper questionnaires.

This had a number of implications for the questionnaire design.

The original face-to-face interview was estimated at 30 minutes in length. A length of no more than 20 minutes is recommended for web-based questionnaires (and other self-completion modes) because respondents are more likely to lose interest and abandon the survey than when they are interviewed. Consequently, the overall amount of content had to be reduced.

The content and presentation of questions had to be adapted to a self-completion format (web or paper). This included some simple changes (for example, removing interviewer instructions, changing response options to the first person). Crucially, the language, content and presentation of the questions needed to be accessible to enable unsupported completion by as wide a range of people as possible.

In order to maximise response, the questionnaire was also made available in a paper self-completion format. For reasons of space and to avoid confusing respondents with an over-complex structure, the questionnaire structure needed to be mainly linear, with minimal routing or questions directed at subsamples.

A key element of the original survey design was a dental examination in participants’ homes. This was no longer feasible, so it was necessary to include questions that would collect reliable self-reported information on the health of respondents’ teeth and mouths.

One consequence of COVID-19 restrictions was that the public’s access to dental treatment was restricted after March 2020. The survey needed to reflect the impact of this.

Comparability with previous national surveys of adult dental health was desirable, but it was recognised that the change of mode would limit this.

The time available to develop the questionnaire was limited by the deadlines imposed in order to program and test the questionnaire for fieldwork starting no later than mid-February 2021.

The development process

The starting point was the questionnaire produced by the consortium for PHE in the summer of 2020. The questionnaire was developed to tight deadlines necessitated by the need to program, test and launch the questionnaire in mid-February 2021 in order to complete fieldwork by the end of March 2021.

The first step was to identify areas to retain from the original questionnaire as well as new areas to include.

For existing questionnaire content, PHE and academic members of the consortium carried out a ‘traffic light’ exercise to identify content to drop (red), include if possible (amber) and keep as a priority (green). This was informed by a range of considerations, including the consultation exercise in spring 2020, PHE’s current planning and information needs, and past use of survey data. The results of this exercise were collated by NatCen and UCL and shared with PHE.

The priorities for new content, as agreed by PHE and the consortium, were self-reported condition of mouths and teeth, and the impact of COVID-19 on access to dental advice and treatment since March 2020. In an iterative process, members of the consortium agreed on the precise information that should be captured, and then explored existing instruments to identify suitable questions. Where possible, validated questions were used; where no suitable questions were available, new questions were drafted. All questions were reviewed and refined where necessary, with final decisions on wording and content based on the need for simple, accessible questions.

NatCen compiled a draft questionnaire, using survey design expertise to adapt question formats where necessary and monitor the overall length. This was reviewed by the academic consortium members and revised before presentation to PHE in mid-December 2020.

The questionnaire was reviewed by PHE, who suggested further revisions, which were reviewed in turn by the consortium. The revised questionnaire was reviewed again by PHE and a final version was agreed by PHE and NatCen in the first week of January 2021.

In the absence of sufficient time to test and pilot the questionnaire, it was reviewed by a specialist from NatCen’s questionnaire design and testing hub and tested by a convenience sample of individuals who were neither researchers nor dental health experts, recruited through the NatCen research team.

2021 questionnaire content

This section presents an outline of the final questionnaire content, including the amendments made from the questionnaire that was originally developed for the AOHS (‘the 2020 questionnaire’). A copy of the full 2021 questionnaire is available on request from dentalphintelligence@dhsc.gov.uk.

Household questions

This section captured household composition, household income, and the respondent’s age, sex, ethnicity and employment status, using validated questions adapted from the Health Survey for England (HSE) 2020 push-to-web feasibility study.

Health and lifestyle

This section covered self-assessed general health and oral health, tobacco use (smoked and non-smoked), vaping and alcohol consumption. It also included questions about whether the participant had ever received advice about these and other behaviours from a dentist or a member of the dental team.

The questions were drawn from the following sections in the 2020 questionnaire:

  • general and oral health
  • lifestyle
  • lifetime treatment history

Questions on non-smoked tobacco derived from the HSE were added, with the format of some questions adapted to be suitable for a self-completion mode.

Detailed questions on general health and specific conditions were dropped for space reasons. Questions on sugar consumption were not asked because no suitable questions could be found.

Your teeth and mouth

This section was designed to capture data on the current state of the respondent’s mouth, including limited information on lifetime experience. This was included as a proxy for the dental examinations that were not possible at the time.

The choice of content was informed by the 2020 consultation and by the need for simple, unambiguous questions that could be reliably answered by any member of the general public. Questions were derived from several sources, including the 2020 questionnaire and other existing surveys, and were adapted in an iterative process to ensure they met the accessibility threshold.

This section replaced the sections ‘Natural teeth’ and ‘Lifetime treatment history’ in the 2020 questionnaire.

The questions covered number of natural teeth, filled and crowned teeth, fixed bridges and dentures, root canal fillings and implants, current pain or damage to teeth or mouth, posterior contacts, symptoms of periodontal disease and dry mouth.

Oral health impact profile (OHIP-14)

The oral health impact profile (OHIP-14) is a validated measure of the frequency with which issues with the teeth, mouth or dentures are experienced. It covers the frequency with which such issues impact on areas of life such as speaking, diet and eating, pain and aching, emotional and mental health, social interactions and employment.

This section was retained and adapted for self-completion.

Current oral health behaviour

This section covered tooth cleaning including frequency, aids used to clean teeth, the use of toothpaste and prescribed high-fluoride toothpaste. Questions about cleaning dentures were dropped for reasons of space.

The 2020 questionnaire asked interviewers to code the fluoride content of the participant’s usual toothpaste. This was replaced by a question that asked the respondent directly whether their usual toothpaste contained fluoride.

Need and access to treatment during the pandemic

This was a new section designed to assess the impact of the pandemic on access to services. The questions were derived from a questionnaire developed by UCL for patients accessing care during 2020. These were reviewed and refined by the consortium and PHE to be concise and relevant to a general population sample. The questions covered the need for dental advice and treatment since March 2020, whether and how this was accessed, the outcomes, and – if relevant – why treatment was not sought.

Usual pattern of dental attendance

This section replaced sections in the 2020 questionnaire on the pattern of dental attendance, accessing treatment and treatment received, which were agreed to be of comparatively low priority in the traffic light assessment (see earlier). The reliability and applicability of responses about usual behaviour and most recent visits to a dentist were also unclear, given that fieldwork took place in February and March 2021, almost a year after the first lockdown came into force.

The usual pattern of dental attendance section included questions about the usual (pre-COVID) pattern of dental attendance, reasons for not attending regularly (where that was the case), whether treatment at the respondent’s most recent visit was provided by the NHS or privately and, if the latter, reasons why private treatment had been used.

Questions dropped from the 2020 questionnaire included whether the last dental visit was for a check-up or because of a problem, reasons for choosing NHS care, what would encourage private care users to use the NHS, whether lifestyle advice was received, how treatment was paid for, and how long respondents had been attending the practice they visited most recently.

Lifetime treatment history

The section on lifetime treatment history was dropped, although some content was included elsewhere in the questionnaire.

Attitudes and barriers

This section was retained in full and covered choices of treatment and the impact of cost.

Modified dental anxiety scale (MDAS)

This 5-item scale covers anxiety in different treatment situations and was retained in full. It was presented for self-completion in a grid format.

Impact of oral health problems

This 9-item scale covers the severity of impacts of oral and dental problems on different aspects of daily life. It was retained in full and presented for self-completion in a grid format. In addition, a follow-up question which asked about the nature of the dental or oral condition that caused any of these problems was also included.

Data linkage and re-contact for future research

Towards the end of the questionnaire, respondents were asked for consent to linkage to NHS patient data and, separately, for willingness to be re-contacted for research purposes. Both requests were based on NatCen standard questions.

Images of teeth and mouth

Respondents who completed the survey on a smartphone were asked if they would be willing to take pictures of the inside of their mouths and upload them, an experimental measure in the absence of in-person dental examinations. Detailed instructions and example photographs were provided for those willing to do so.

This innovation was piloted in order to explore whether images taken by respondents could be used to validate self-assessed oral health data from the survey in the absence of a clinical examination. An evaluation of this innovation will be published in 2023.

Fieldwork

Approach

Previous adult oral health surveys were undertaken using face-to-face interviews and clinical examinations. However, given the ongoing COVID-19 restrictions, the 2021 survey required a push-to-web approach. Potential respondents were contacted by letter and invited to take part in an online survey in a single-stage opt-in process. Willing participants were provided with access codes and the survey link within their invitations so that they could go straight to the survey rather than needing to register interest first. Respondents were provided with a freephone number that they could call with any queries.

The initial contact was supplemented by 2 further reminders to take part. With the second reminder, a paper questionnaire was offered as an alternative for potential participants who were unable or unwilling to complete the questionnaire online. Evidence from the Health Survey for England (HSE) 2020 push-to-web feasibility study suggests that this approach not only improves overall response but also encourages responses from different types of individuals than the web questionnaire alone.

The paper questionnaire included the same questions as the online version in a visually appealing and accessible format. In the online survey, the survey routing was an automatic part of the program; within the paper questionnaires respondents had to choose the correct route themselves. This had informed the overall questionnaire design, and simple clear routing instructions were an integral part of the design.

Survey recruitment

The random probability sample (see the ‘Sample’ section) was provided to NatCen via a secure file transfer portal by ONS. A main sample of around 12,000 addresses and a reserve sample of around 7,000 addresses were drawn in accordance with the specifications outlined in the ‘Survey development’ section of this report. The aim was to achieve a minimum of 500 completed interviews with adults aged 16 and over in each of the 7 PHE regions. After reviewing the experiences of similar surveys and taking into account the restricted time available for fieldwork, it was agreed that all the sampled households would be contacted at the same time (around 19,000 addresses in total).

Each address in the sample was sent a letter inviting up to 3 adults aged 16 and over living in the household to take part in an online survey about oral health. The aim of the letter was to encourage participation, with the following information provided:

  • an explanation of the importance of the survey
  • a broad overview of the survey content
  • clear instructions on how to opt in – with opt-in provision made for up to 3 adults per household (additional paper questionnaires were available on request)
  • NatCen contact details (freephone number and email address) to find out more or opt out
  • ‘frequently asked questions’ covering:
    • how the address was selected
    • who was eligible to take part
    • data protection assurances (including a link to the survey privacy notice)
    • who was carrying out the research
    • survey content
    • length and incentives

The initial invitation letter was accompanied by a survey leaflet that provided further details about the survey, what it involved, how the data would be used and treated and who would be using the results. It also included details of where to find the relevant privacy notice and contact details for NatCen should potential respondents wish to find out more.

Two reminder letters were issued to non-responding households. Due to the tight fieldwork timescales, these were sent within weeks of the original advance letter (see Timings below) and included text recognising that surveys may have been completed after the letters had been issued but before they were received. The second reminder included 2 paper questionnaires and freepost return envelopes (one per questionnaire to preserve respondent confidentiality).

Incentives

Respondents received a £10 Love2Shop gift voucher on completion. The majority of incentives were issued as e-vouchers. The option of receiving incentives via post was available for those without an email address and those who preferred to receive their voucher in this way.

Timings

The fieldwork timings were as follows:

  • invitation letters were sent on 18 February 2021
  • reminder 1 was sent on 1 March 2021
  • reminder 2 was sent on 10 and 11 March 2021
  • the online survey closed on 21 March 2021
  • the cut-off date for paper questionnaires was 26 March 2021

Response

Overall response

Response is calculated at the household and individual level. The household response rate is the percentage of households contacted as part of the survey in which at least one interview was completed. The individual response rate is the estimated response rate among all adults who were eligible to complete the survey.

In total, 19,286 addresses were sampled, from which 6,254 fully productive responses were achieved.[footnote 1] In addition, 67 incomplete responses were received with sufficient data for inclusion in the final data set. For the purposes of this analysis, the incomplete responses have been included, giving a total of 6,321 productive individuals.

Household response

At least one interview was completed with 4,429 households, an unadjusted household-level response rate of 23%. In an online survey of this nature, no information is known about the reason for non-response in each individual household. As described in the ‘Sample’ section of this report, it is estimated that around 4% of residential addresses in the AddressBase Premium database from which the sample was drawn were ineligible to complete the survey. Once the sample size was adjusted to take this into account, the household-level response rate was estimated to be 24%.

Individual response

An average of 1.88 adults aged 16 and over lived in responding households. This was used to estimate the number of adults potentially eligible within the whole sample.

The unadjusted response rate among individuals was 17.5%, adjusted to 18.2% to take the likely number of ineligible addresses into account.

Within productive households, 76.1% of eligible adults provided productive responses.

Overall, 71.4% of responses were collected via the online questionnaire, with the remaining 28.6% completed on paper.

Table 1: survey outcomes

Outcome Value
Addresses issued 19,286
Addresses assumed eligible 18,515
Productive addresses 4,429
Adjusted household response rate 23.9%
Adults assumed eligible 34,722
Fully productive individual responses 6,254
Partially productive individual responses 67
Total productive responses 6,321
Adjusted individual response rate 18.2%
Questionnaires completed online 4,515 (71.4%)
Questionnaires completed on paper 1,806 (28.6%)
Eligible adults in productive households 8,306
Individual response rate within productive households 76.1%
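
The adjusted rates in Table 1 can be reproduced from the reported figures (a minimal arithmetic sketch using only numbers quoted in this section):

```python
# Response-rate arithmetic reproduced from Table 1 (all figures as reported).
addresses_issued = 19286
ineligible_rate = 0.04                    # estimated share of ineligible addresses
addresses_eligible = round(addresses_issued * (1 - ineligible_rate))

productive_addresses = 4429
household_rate = productive_addresses / addresses_eligible  # adjusted household rate

adults_eligible = 34722                   # as reported (about 1.88 adults per address)
productive_responses = 6254 + 67          # fully plus partially productive
individual_rate = productive_responses / adults_eligible    # adjusted individual rate

adults_in_productive_households = 8306
within_household_rate = productive_responses / adults_in_productive_households

print(addresses_eligible,                       # 18515
      round(household_rate * 100, 1),           # 23.9
      round(individual_rate * 100, 1),          # 18.2
      round(within_household_rate * 100, 1))    # 76.1
```

The unadjusted household rate quoted in the text (23%) is simply 4,429 divided by all 19,286 issued addresses, before removing the estimated 4% of ineligible addresses.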

Profile of achieved sample

Tables 2 to 4 compare the response, in total and by mode, with the population of England aged 16 and over (source: 2019 ONS mid-year population estimates). Not all categories sum to 6,321 due to missing data (not answered or ‘prefer not to say’). A small number of participants who answered ‘other’ have been excluded from the analysis of gender.

Response was higher among women than men, and this was true for both modes.

The overall response was higher among older adults (aged 55 and over) and correspondingly lower among younger adults (aged under 35). Additionally, the pattern of response by mode varied strikingly by age.

Web responses were more likely to have come from younger participants. For example, 7.6% of web responses came from young adults aged 16 to 24 and 14.9% came from adults aged between 25 and 34, compared with 2.8% and 5.9% of paper responses respectively. By contrast, more than half of paper responses came from adults aged 65 or over: 26.3% from adults aged between 65 and 74 and 25.0% from those aged 75 or over, compared with less than a quarter of online responses from those aged 65 and over.

The population profile by ethnicity shown in Table 4 is based on the all-ages population of England and Wales in the 2011 Census. Although this does not provide an exact comparison, it is likely that the AOHS sample under-represented people from non-white ethnic backgrounds. Minority ethnic participants were more likely to respond to the web questionnaire than to the paper version; this may have been a function of age.

Table 2: profile of achieved sample (unweighted) by gender

Gender Population estimate for England (%) Online (%) Paper (%) Total (%)
Male 49.0 42.4 41.7 42.2
Female 51.0 57.6 58.3 57.8
Base (number) 45,470,282 4,501 1,791 6,292

Table 3: profile of achieved sample (unweighted) by age

Age Population estimate for England (%) Online (%) Paper (%) Total (%)
16 to 24 13.1 7.6 2.8 6.3
25 to 34 16.7 14.9 5.9 12.4
35 to 44 15.7 18.7 9.4 16.0
45 to 54 16.8 19.7 11.7 17.4
55 to 64 14.9 17.3 18.9 17.8
65 to 74 12.3 15.6 26.3 18.7
75+ 10.5 6.1 25.0 11.5
Base (number) 45,470,282 4,515 1,804 6,319

Table 4: profile of achieved sample (unweighted) by ethnicity

Ethnicity Population estimate for England (%) Online (%) Paper (%) Total (%)
White British 80.5 83.3 88.9 84.9
White (other) 5.6 6.1 3.9 5.5
Mixed 2.2 1.9 1.0 1.7
Asian 7.5 6.3 4.0 5.7
Black 3.3 1.8 2.0 1.8
Other 1.0 0.6 0.2 0.5
Base (number) 42,989,620 4,510 1,755 6,265

Equal numbers of addresses were sampled for each PHE region. Response was highest in the South West and South East, and lowest in London and the North West (Table 5).

Table 5: achieved sample by region (unweighted)

PHE region Issued sample (%) Online (%) Paper (%) Total (%)
North East and Yorkshire 14.3 14.4 14.1 14.3
North West 14.3 12.3 14.0 12.8
Midlands 14.3 13.6 13.6 13.6
East of England 14.3 14.6 13.1 14.2
London 14.3 12.1 10.2 11.6
South East 14.3 16.0 15.4 15.9
South West 14.3 16.8 19.5 17.6
Base (number) 45,470,282 4,515 1,806 6,321

Data processing

Data editing and coding

The questionnaire was designed to require minimal editing.

The program for the web questionnaire had the necessary consistency and plausibility checks built in. For example, participants’ given age and date of birth had to be consistent, and they could not enter an age greater than 120 or report having more than 32 teeth.

The paper questionnaires were scanned. Unlike the web questionnaires, the paper questionnaires could include errors where respondents did not follow instructions, so the data was manually edited. Edits included checking that questions requiring a single answer had only one option ticked, that all questions answered were relevant to the respondent, that questionnaire routing was followed, and that answers were realistic and consistent (for example, that a participant’s date of birth and age corresponded, that a respondent reported having no more than 32 teeth, and that the total number of teeth was not less than the number of teeth reported as filled). Rules were developed and applied to ensure that all such anomalies were dealt with consistently.
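Consistency checks of the kind described above can be illustrated with a short sketch; the rule thresholds and field names here are assumptions for illustration, not the survey’s actual editing code:

```python
def consistency_errors(record: dict) -> list:
    """Return a list of plausibility problems found in a response record."""
    errors = []
    if not 0 <= record["age"] <= 120:
        errors.append("implausible age")
    if not 0 <= record["teeth_total"] <= 32:
        errors.append("implausible number of teeth")
    elif record["teeth_filled"] > record["teeth_total"]:
        errors.append("more filled teeth than total teeth")
    return errors

# A record reporting 30 filled teeth out of 28 in total fails the final check:
consistency_errors({"age": 45, "teeth_total": 28, "teeth_filled": 30})
```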

The questionnaire included a small number of questions that allowed the participants to write in a response if none of the existing answer categories applied to them. These were back-coded into existing categories or remained coded as ‘other’ (that is, no new coding frames were developed).

Data validation

Up to 5 responses could be received from any household: 3 web responses and 2 paper questionnaires. Post-fieldwork validation was carried out to identify any duplicate responses. This included cross-mode checks to see whether a participant had completed the survey both online and on paper. Matches were checked using the following criteria:

  • household serial number
  • full name
  • date of birth and age
  • sex

Where more than one case had the same full name, the cases were manually reviewed to determine whether they were a cause for concern. Full names were checked against other demographic data, such as date of birth and sex, to establish whether the cases were genuine duplicates.

Where duplicates were identified, no more than one was included in the final data set, according to a set of consistent rules (for example, web responses were preferred to paper questionnaires).

Additionally, a similar set of rules was applied to ensure that information from respondents within the same household about the household and its members was consistent (for example, information about the numbers of adults and children, their sex and age, and the household income).

To determine which household data to use for the whole household (where there were inconsistencies) the following steps were applied to prioritise the responses:

  • online responses were prioritised over paper responses
  • if mode of completion was the same, then responses from the oldest participant were prioritised
  • if participants’ dates of birth were identical, then household-level responses were taken from the first serial number identified in the household
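The three prioritisation rules above map naturally onto a single sort key. A sketch under assumed field names (ISO-format dates sort chronologically, so the earliest date of birth identifies the oldest participant):

```python
def household_data_priority(resp: dict):
    """Sort key implementing the three prioritisation rules, in order."""
    return (
        0 if resp["mode"] == "web" else 1,  # 1. online over paper
        resp["dob"],                         # 2. oldest participant (earliest date of birth)
        resp["person_serial"],               # 3. first serial number in the household
    )

household = [
    {"mode": "paper", "dob": "1950-05-01", "person_serial": 1},
    {"mode": "web",   "dob": "1985-02-03", "person_serial": 2},
]
# The web response is authoritative despite coming from a younger participant:
authoritative = min(household, key=household_data_priority)
```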

Quality assurance

Several steps were used to quality assure the data processing:

  • there was a single survey weight, which was created and checked by NatCen specialist statisticians
  • tables were run using NatCen’s standard tables syntax that has been developed and checked over several years. This standard syntax required a survey weight and this was duly applied
  • the Excel files were run twice and cross-checked. Additional checks were made to ensure that bases were correct
  • the report text was checked against the final tables

Data disclosure

Disclosure control was applied to the data and therefore no results are shown for bases when the number of participants was below 50.

Weighting

The achieved survey sample was weighted to adjust for:

  • the probability of each address being selected for the survey sample
  • differential non-response at the address (household) level
  • differential non-response within addresses (households)
  • the overall demographic profile of the responding sample (calibration weighting)

Addresses and households were treated as identical, although some addresses contained more than one household. However, it was not possible to record this reliably using the push-to-web methodology.

Weighting was conducted in the following 4 corresponding steps.

Step 1: selection weights

In each of the 7 PHE regions, 23 PSUs were selected to form the basis of the survey sample (see the ‘Sample’ section). The same number of addresses was sampled in each of the 7 regions (2,760). As not all regions have the same total number of addresses, this meant that addresses in some parts of England had a higher probability of being selected for the survey than those in other regions. For example, addresses in the South West of England had approximately a 1 in 869 chance of being selected. This compares to a 1 in 1,602 chance for addresses located in the Midlands.

To help ensure that respondents in regions with fewer addresses (such as the South West) were not over-represented in the achieved survey sample, the probability of selection for the survey was factored into the weights. A selection weight (W1) was calculated for each address selected for the survey sample as the inverse of the probability of selection (W1 = 1 / P1, where P1 is the probability of the address having been selected for the survey sample).
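The W1 calculation can be sketched as follows; the regional address totals below are illustrative back-calculations from the stated selection odds, not figures from the sampling frame:

```python
SAMPLED_PER_REGION = 2_760  # addresses sampled in each PHE region

def selection_weight(region_address_total: int) -> float:
    """W1 = 1 / P1, the inverse of the address's selection probability."""
    p1 = SAMPLED_PER_REGION / region_address_total
    return 1 / p1

selection_weight(2_760 * 869)    # South West: a 1-in-869 chance gives W1 of about 869
selection_weight(2_760 * 1_602)  # Midlands: a 1-in-1,602 chance gives W1 of about 1,602
```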

A minor adjustment was then made to the selection weights to account for some addresses that were selected for the survey sample but not issued. This was because the sample selection was based on unique delivery point reference numbers (UDPRNs), with full address data being subsequently matched on. Full address data could not be matched on to 48 of the UDPRNs and these were not replaced. This meant that there were some minor discrepancies in the sample sizes for each of the 7 PHE regions. This was corrected for through a simple scaling adjustment.

Step 2: address (household) non-response

The selection weights were applied to a non-response weighting model which was used to estimate the likelihood of receiving at least one survey response from an address. This helped to minimise bias arising from any systematic differences between the addresses that participated and those that did not. (For the purposes of the address non-response model, participation is defined as at least one survey response having been received.)

The probability that at least one valid survey response was received from a given address was estimated using a logistic regression model. Each address was characterised by the known profile of the surrounding geographic area and this was used to predict the probability of participation by individual addresses. The variables specified in the final model were determined using a stepwise procedure. The variables used in the final model were:

The non-response weights (W2) were calculated as the inverse of these probabilities. A combined weight (W3) was then calculated as the product of the selection and non-response weights, that is W3 = W1 x W2.
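Putting steps 1 and 2 together for a single address gives the combined weight; the modelled probability used here is purely illustrative, not an output of the survey’s actual model:

```python
w1 = 869.0         # selection weight from step 1 (e.g. an address in a smaller region)
p_response = 0.25  # modelled probability of at least one response (illustrative value)
w2 = 1 / p_response  # non-response weight: W2 = 1 / estimated response probability
w3 = w1 * w2         # combined weight: W3 = W1 x W2
```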

Step 3: non-response within households

At each address, it was possible for up to 5 surveys to be completed. (Up to 3 survey responses could be submitted by web; up to 2 paper survey responses could also be returned. Only 1 address submitted 5 surveys; all other responding addresses submitted between 1 and 3.) The second stage of weighting was therefore designed to reduce bias caused by systematic differences between the households and individuals who completed multiple surveys compared with those completing only one.

The expected number of completed surveys at each responding address was estimated using multinomial regression. (This model was weighted by the combined weight from steps 1 and 2 (W3) and, as such, accounts for address or household selection and participation.) Addresses containing only one adult were excluded from this analysis, as a maximum of one survey could be returned, and the value for the expected number of responses at these addresses was set to one. Separate models were then run for addresses containing 2 adults and for those containing 3 or more adults. As in step 2 of weighting, the explanatory variables used in these models were determined using a stepwise procedure. The explanatory variables used in the final model were:

  • PHE region
  • IMD rank (quintiles)
  • population density (quintiles)
  • household income
  • whether any paper surveys were returned from the address (binary variable)

The output from the final models was then used to fit values for the expected number of survey responses from each address containing 2 or more adults. The inverse of this value was then used to create an expected number of responses weight (W4). This weight was then combined with W3 to create a pre-calibration weight (W5), that is W5 = W3 x W4.
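For a multi-adult address, the step 3 adjustment is again an inverse; both values below are illustrative, not fitted outputs:

```python
w3 = 3_476.0              # combined weight from step 2 (illustrative)
expected_responses = 2.0  # fitted expected number of responses for this address (illustrative)
w4 = 1 / expected_responses  # expected number of responses weight
w5 = w3 * w4                 # pre-calibration weight: W5 = W3 x W4
```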

Step 4: final adjustment of the profile by age and sex

A final calibration stage was then used to adjust the weights so that the characteristics of the weighted achieved sample match known population estimates for region, age and sex, ethnicity and tenure. ONS mid-year population estimates were used as benchmarks for region, and age and sex, while data from the February to April 2021 wave of the Labour Force Survey were used to estimate the composition of the population in terms of ethnicity and tenure.

Once the calibration adjustment had been made, the weights were then scaled to the responding sample size for the survey to create the final weighting variable (aohs_weight).
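Calibration to known population margins is commonly implemented by raking (iterative proportional fitting). The following is a minimal sketch of that general technique, not the survey’s actual implementation; the tiny sample, the two-category margins and the target percentages are all invented for the example:

```python
# Five illustrative respondents, each classified on two dimensions.
people = [
    {"sex": "M", "age": "<55"},
    {"sex": "M", "age": "55+"},
    {"sex": "F", "age": "<55"},
    {"sex": "F", "age": "55+"},
    {"sex": "F", "age": "55+"},
]
weights = [20.0] * len(people)  # pre-calibration weights (W5), scaled to sum to 100

# Invented population margins (percentages of the total) for the illustration:
margins = {
    "sex": {"M": 49.0, "F": 51.0},
    "age": {"<55": 62.3, "55+": 37.7},
}

for _ in range(100):  # iterate until the weighted margins converge
    for var, targets in margins.items():
        for category, target in targets.items():
            idx = [i for i, p in enumerate(people) if p[var] == category]
            total = sum(weights[i] for i in idx)
            for i in idx:
                weights[i] *= target / total  # scale the group to hit its margin

# The weighted sample now matches both sets of population margins.
```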

Survey limitations

The 2021 AOHS was intended to be the latest in the series of ADH surveys carried out every decade since 1968. Because of COVID-19 restrictions, the survey differed from previous ADH surveys in a number of important respects, and these differences mean that, where time series data are available, comparisons between the findings from 2021 and those from earlier surveys should be treated with caution.

The change in mode of the 2021 survey had a number of effects, already discussed throughout this report.

Data collection was via a self-completion mode (online or paper). This had the following implications:

  • the clinical examination of participants’ mouths could not be carried out, so all information on the condition of participants’ teeth and mouths was based on information reported by the individuals themselves
  • participants were recruited to previous ADH surveys by interviewers who made a number of calls to selected addresses, in order to explain the survey, encourage participation and include hard-to-reach participants. In common with other surveys where selected participants opt in, the 2021 AOHS experienced lower response rates across the sample. While weighting the data accounted for under-represented socio-demographic groups, there may have been other relevant characteristics that affected participation that were not possible to identify and correct for
  • the change from face-to-face interview to self-completion mode may have affected responses in a number of ways, for example by encouraging participants to be more honest in reporting sensitive information, but losing the support provided by the rapport created by an experienced survey interviewer
  • the self-completion mode made it necessary to reduce the content of the questionnaire by about half, so fewer topics could be covered

The smaller sample size means that confidence intervals around survey estimates are wider than in previous surveys.

In addition, the 2021 survey was carried out approximately a year after the start of the pandemic. The resultant restrictions will have led to some changes from participants’ usual behaviour, for example in their dental attendance patterns and health-related behaviours such as smoking and drinking.

Revisions

Any revisions to past publications will be in line with DHSC’s revision policy. Any unscheduled or substantial revisions that do not fit into the scheduled revisions criteria will be highlighted accordingly.

  1. This excludes 22 cases that were removed following validation checks.