Official Statistics

Armed Forces Sexualised Behaviours and Sexual Harassment Survey: background quality report

Published 13 November 2025

1. Contact

The Responsible Statistician for the Armed Forces Sexualised Behaviours and Sexual Harassment Survey (SBSHS) is the Head of the Analysis and Performance Directorate Surveys Team, email: Analysis-Surveys-Enquiries@mod.gov.uk.

2. Introduction

This is the first publication of the Tri-Service Armed Forces Sexualised Behaviours and Sexual Harassment Survey (SBSHS) 2025. This survey has been created in accordance with a recommendation by the House of Commons Defence Committee (HCDC) in 2021, following the Wigston review (2019), to centralise research into sexual harassment within the Armed Forces.

The SBSHS aims to support the investigation of sexualised behaviour and perceived sexual harassment within the Armed Forces over the past 12 months, supporting the assessment of the overall organisational climate across Defence. SBSHS reports on both Regular and Reserve service personnel.

3. Statistical Processing

SBSHS has seven main development stages. Each of these stages is summarised below.

3.1 Stage 1: Questionnaire design

A literature review and initial survey design were produced through an independent review of existing legal and psychological research by Loughborough University. A working group of internal Ministry of Defence staff from Defence People, single Service Occupational Psychology, and Analysis Surveys teams agreed and finalised the SBSHS questionnaire.

The 19 sexualised behaviours used in the SBSHS (listed in table 1 of the main report) were curated following an independent review of existing legal and psychological research by Loughborough University, and categorised into four behaviour classifications. Across research and legislation, behaviours that constitute sexual harassment fall into three categories: ‘Verbal’, ‘Non-verbal’, and ‘Physical behaviours’ (Zhu et al., 2018; Rubino et al., 2017; European Parliament 2004).

In addition to these three categories, the Government Equalities Office (2019) and Loughborough University’s report identify and recommend the inclusion of an additional category for sexual harassment behaviours: ‘Cyber harassment’. In accordance with these recommendations, the sexualised behaviours investigated by the SBSHS have been categorised into one of the four classifications, and responses were analysed based on these groupings. This list of sexualised behaviours is not exhaustive, and a free-text response question has been included in the survey allowing personnel to inform the researchers of any behaviours which should be included in future iterations of the survey.

3.2 Stage 2: Sample design, selection and cleaning

The Armed Forces SBSHS 2025 was open to all currently serving Regular, Reserve and Trainee personnel, across all Services (Royal Navy, Royal Marines, Army, and Royal Air Force). When references are made to Royal Navy (RN) Regular personnel, this population includes both Royal Navy and Royal Marines personnel. When references are made to Maritime Reserve personnel, this population includes Reserves of both the Royal Navy and the Royal Marines.

Though open to all current Service personnel, a sample of 94,275 personnel (65,795 Regular and 28,480 Reserve personnel) was selected under a disproportionate stratified simple random sampling process and sent an invitation to complete the survey directly via their MOD email (see details in stage 3).

Samples were designed to provide sufficient responses to yield estimates with a margin of error of plus or minus 3% for the main comparison groups of Sex, Service and Officer/Rank. Due to the smaller size of groups when broken down by the three variables of interest, a census was taken of all Servicewomen, Reserves, Trainees, under 18s, and Royal Marines (these census populations received a direct email invitation, see stage 3).
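The sample design above can be sketched in code: census the small groups, then take a simple random sample of the remainder. This is a minimal illustration only; the frame, stratum labels and sampling fraction below are hypothetical, not the survey's actual sampling frame or rates.

```python
import random

# Hypothetical sampling frame: one record per person with stratum labels.
# The frame size and which groups are censused are illustrative assumptions.
frame = [
    {"id": i,
     "sex": "F" if i % 10 == 0 else "M",
     "reserve": i % 4 == 0}
    for i in range(1000)
]

def select_sample(frame, fraction=0.5, seed=42):
    """Disproportionate stratified selection: census the small groups
    (here, women and Reserves), simple random sample from the rest."""
    rng = random.Random(seed)
    census = [p for p in frame if p["sex"] == "F" or p["reserve"]]
    census_ids = {p["id"] for p in census}
    remainder = [p for p in frame if p["id"] not in census_ids]
    return census + rng.sample(remainder, int(len(remainder) * fraction))

sample = select_sample(frame)
```

Taking a census of the smallest comparison groups, rather than sampling them, is what guarantees enough responses for separate estimates of those groups; the disproportionate selection is then corrected by weighting (see stage 5).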

The report has been split by population type, where Regulars and Reserves are reported separately due to the differences in their commitment types (for example, how often personnel interact together and their living environment). This survey was open to all Regular and Reserve Armed Forces Personnel, including phase 1 and phase 2 Trainees. However, the Trainee population has been removed from analysis due to a low uptake of responses, which fell short of the threshold for precise analysis. This population is known to be difficult to reach, and special considerations will be implemented in planned future editions of this survey to ensure a sufficient response rate for robust analysis.

‘Other’ population groups which were excluded from analysis include: University Training Corps Officer Cadets, Sponsored Reserves, Military Provost Guard Service, and the Strategic Reserve. These are groups of personnel who have distinct working and/or living environments which would not be comparable to a Regular or Reserve population and have too few responses to analyse as their own population. We will look to incorporate these groups where appropriate in future versions. We also exclude deployed personnel, and those on training courses, because of the low response rates typically achieved from these personnel.

A review of free-text responses to the question “What is your commitment type? – Other, please specify” identified a number of respondents who did not meet the eligibility criteria; these were subsequently removed from analysis. They were Civil Servants, RFA (Royal Fleet Auxiliary) personnel and UAS students.

3.3 Stage 3: Survey distribution and communications

Those in the sample were sent the questionnaire directly through an emailed invitation link. In addition, any Service personnel (not in the sample) could access the survey through a ‘general’ link on internal Defence communications. The general link was made available to ensure personnel did not feel as though they were excluded from participating. Analysis was conducted on the two data sources to check for any bias, and it was deemed appropriate to combine them.

Personnel were offered two options to complete the survey online, either via the MOD Intranet (internal) or the Internet (external), in order to maximise participation. Single Service Occupational Psychology and Defence People teams helped to publicise the survey and provided support on communications by disseminating them through the Chains of Command.

3.4 Stage 4: Data input and validation

Online survey responses are held securely on protected MOD servers (named access only). Online datasets are available to download once the survey has closed. Regular and Reserve personnel’s responses are split and validated separately. Validation procedures at this stage include checking for expected response options and missing question items, identifying contradictory responses (e.g. impossible age/length of service combinations), and analysing ‘repeated pattern’ responses (e.g. first answer option selected for all questions). The datasets are then combined to form a single dataset. Any invalid responses (e.g. completely blank responses) are removed and do not contribute to the response rate.
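The validation checks above can be sketched as simple predicates applied to each response record. This is an illustrative sketch only; the field names and the minimum joining age used in the contradiction rule are assumptions, not the survey's exact validation rules.

```python
# Hedged sketch of the stage 4 validation checks. Field names ("age",
# "length_of_service", "answers") and the minimum joining age are
# illustrative assumptions.
def is_blank(answers):
    """A completely blank response: removed and excluded from response rates."""
    return all(a is None for a in answers)

def is_repeated_pattern(answers):
    """Flag 'repeated pattern' responses, e.g. the first answer option
    selected for every question."""
    filled = [a for a in answers if a is not None]
    return len(filled) > 1 and len(set(filled)) == 1

def is_contradictory(record, min_joining_age=16):
    """Flag impossible age / length-of-service combinations: service
    cannot exceed age minus the minimum joining age."""
    return record["length_of_service"] > record["age"] - min_joining_age

suspect = {"age": 25, "length_of_service": 12, "answers": [1, 1, 1, 1]}
```

In practice such flags would feed a manual review step rather than automatic deletion, since a repeated pattern can occasionally be a genuine set of answers.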

3.5 Stage 5: Data compilation and results production

Where required, questions are recoded to simplify the output. For example, all 5-point Likert scale responses are recoded into a 3-point positive, neutral, negative scale. Responses are weighted by Sex, Service and Rank. This accounts for bias caused by disproportionate stratified sampling and differing levels of response. Full details of the weighting plan are available in the supplementary tables. Finally, the data is transferred into RStudio.
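The recoding and weighting steps above can be illustrated as follows. This is a minimal sketch, not the survey's production code (which runs in R): the Likert labels are standard wording, and the weighting-class names and counts are hypothetical.

```python
# Sketch of stage 5 data compilation. The weighting classes and counts
# below are hypothetical; the real survey weights by Sex, Service and Rank.
LIKERT_TO_3 = {
    "Strongly agree": "positive", "Agree": "positive",
    "Neither agree nor disagree": "neutral",
    "Disagree": "negative", "Strongly disagree": "negative",
}

def post_stratification_weights(pop_counts, resp_counts):
    """Weight for class c = population share / respondent share, so the
    weighted respondents match the population's mix of classes."""
    N, n = sum(pop_counts.values()), sum(resp_counts.values())
    return {c: (pop_counts[c] / N) / (resp_counts[c] / n) for c in pop_counts}

w = post_stratification_weights(
    {"F_Officer": 1000, "F_Other": 4000, "M_Officer": 5000},  # population
    {"F_Officer": 200, "F_Other": 400, "M_Officer": 400},     # respondents
)
```

Note how an oversampled class (F_Officer above) receives a weight below 1, and an under-responding class a weight above 1; the weighted respondent total still sums to the achieved sample size.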

Tables of results are produced using RStudio to ensure estimates and their corresponding standard errors are correctly weighted. Each estimate carries a margin of error to enable users to observe the level of uncertainty in the estimate. In-year tests between Service/Rank/Sex groups are also carried out at a 5% significance level. Non-significant changes are not described as changes in the report. A statistically significant difference means that there is enough evidence that the change observed is unlikely to be due to chance variation (less than a 5% probability that the difference is the result of chance alone).
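The two uncertainty calculations above can be sketched as follows. This is a simplified illustration, not the production method: using an effective sample size in place of full survey-weighted variance estimation is an assumption made for brevity.

```python
import math

# Hedged sketch of the stage 5 uncertainty calculations: a 95% margin
# of error for a proportion, and a two-proportion z-test at the 5% level.
def margin_of_error(p, n_eff, z=1.96):
    """95% margin of error for an estimated proportion p, given an
    effective sample size n_eff (a simplification of survey variance)."""
    return z * math.sqrt(p * (1 - p) / n_eff)

def significantly_different(p1, n1, p2, n2, z=1.96):
    """Two-sample z-test for proportions with a pooled standard error;
    True means the difference is significant at the 5% level."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return abs(p1 - p2) / se > z

moe = margin_of_error(0.5, 1067)  # roughly 0.03, i.e. +/- 3 points
```

This also shows why a target margin of error of plus or minus three percentage points implies roughly a thousand effective responses per comparison group at the worst case (p = 0.5).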

3.6 Stage 6: Further Quality Assurance

There are several stages of both automated and manual validation built into the data cleaning process. Each section of tables, along with the content of the report, undergoes several layers of quality assurance. These include cross-checking by members of the Surveys team and by nominated members of the Analysis Directorate who are independent of the Surveys team, as well as automated data checks. Quality assurance checks at this stage include verifying that unweighted counts are plausible, that significant differences are correctly reported, and that the report commentary aligns with the tables.

3.7 Stage 7: Publication

SBSHS is published through a headline report, supplementary tables and this Background Quality Report on the SBSHS Gov.uk landing page.

4. Quality Management

4.1 Quality Assurance

The MOD’s quality management process for Official Statistics consists of three elements which the SBSHS will adhere to: (1) Regularly monitoring and assessing quality risk via an annual assessment; (2) Providing a mechanism for reporting and reviewing revisions/corrections to Official Statistics; (3) Ensuring Background Quality Reports (BQRs) are published alongside reports and are updated regularly.

4.2 Relevance

The survey asks about experiences of sexualised behaviours in the last 12 months (the survey was in field May – July 2025), therefore the statistics are relevant only to that timeframe. For perception questions, the statistics reflect perceptions held at the time of data collection, and not those of personnel who have more recently joined the Armed Forces. Please see the ‘Using and interpreting the statistics in this report’ section of the main report to understand what you can and cannot do with this publication.

Although the SBSHS is a new statistic, we know some of the main stakeholders/users will be in Defence People Strategy and from the Conduct Equity and Justice team (including the Raising our Standards Directorate), as well as single Service policy makers. However, we also anticipate external interest (e.g. the Defence Committee) which we will track and promote. The question set will be reviewed on a regular basis to ensure SBSHS reflects policy user requirements and the priorities of the department, the research team will also conduct free-text analysis to identify potential changes to future iterations.

5. Accuracy & Reliability

5.1 Overall Accuracy

SBSHS collects data from a disproportionate stratified random sample of approximately 94,000 Regular, Reserve, and untrained Armed Forces personnel. The sample size is designed to achieve a margin of error of plus or minus three percentage points for each of the estimates. A number of questions are only asked of a subset of respondents, and they typically carry a larger margin of error. For example, questions on how a participant responded to experiencing a sexualised behaviour are only asked of those who reported experiencing one. Margin of error information for each question is published in the Supplementary Tables.

The SBSHS is designed to give a recent understanding of the attitudes and perceptions of our Armed Forces regarding sexualised behaviour and sexual harassment. However, it should be considered that these attitudes and perceptions may be liable to change within the calendar year, for example, as a result of events, interventions, or even due to the time of the year that the responses were collected (a seasonality effect).

The raw data is passed through a range of automatic and manual validation and editing routines. This helps to minimise the risk of error and improves timeliness. Where in-year comparisons are possible, 95% confidence level statistical tests are carried out. This level is used to minimise the possibility of finding false positive differences that can be expected when performing a large number of significance tests.

The Analysis Surveys team do not present any results where the responding group size is less than 30 as results for groups of this size are considered too unreliable, yielding margins of error far outside the target range of plus or minus three per cent.
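The suppression rule above can be sketched as a simple filter over the results cells. This is a minimal sketch: the group names, estimates and counts are hypothetical, and it shows only primary suppression (the secondary suppression mentioned in section 9.1 would require further cross-cell checks).

```python
# Hedged sketch of the minimum group-size rule: estimates from responding
# groups under 30 are withheld. Cell names and values are illustrative.
def publishable(cells, min_group=30):
    """Keep only cells whose responding group meets the minimum size;
    cells maps group name -> (estimate, responding group size)."""
    return {group: est for group, (est, n) in cells.items() if n >= min_group}

cells = {"RN Officers": (0.12, 150), "Small subgroup": (0.20, 12)}
released = publishable(cells)
```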

5.2 Sampling/Non-sampling errors

As the SBSHS does not achieve 100% response rates, there is always the risk that those who returned questionnaires have differing views from those who did not. We assume that all non-response is Missing At Random (MAR) within each weighting class. This means we have assumed that, on average, those people who did not return their questionnaires do not differ from those who did respond in their perceptions and attitudes, within each weighting class. There is a small risk that this assumption is violated by self-selection bias. This is where participants are more likely to partake in a survey which impacts them. Due to the targeted and sensitive nature of this survey, those who have been impacted by, or have strong opinions on, sexualised behaviours and sexual harassment in the Armed Forces may be more likely to take part – leading to a less representative sample. To limit the impact of this bias it was made clear, through an active communications strategy and in the email invitations, that this survey wishes to hear from all military personnel. Messaging around this was extensive and made as clear as possible.

One area to note is low response rates among certain groups. Response rates tend to be lower for junior ranks and Reserve personnel. This may be partly due to distribution issues; some invitation emails to this group ‘bounce’ suggesting they do not access the Intranet or electronic systems. This group is oversampled to compensate for expected low response rates.

The number of responses yields estimates with a good level of precision. When no subsets are applied, estimates for Regular personnel at a Tri-Service level and for Royal Navy, Army and RAF personnel are within the +/- three percentage point margins of error. Margins of error are slightly higher for the Tri-Service estimates for Reserves and for Rank and Sex groups within the Services, but most have a margin of error within +/- five percentage points. Breaking down Reserve personnel estimates at a Service level (or comparing Rank and Sex differences within Services) yields margins of error within +/- ten percentage points; here there is less precision around the estimates and the likelihood of detecting a statistically significant difference decreases (only significant differences are stated in the report). Tables of corresponding margins of error for each estimate are published in the Supplementary Tables.

The results are weighted to account for the differing response rates observed in SBSHS. This ensures that the results reflect the distribution of Service, rank and sex within the population of Armed Forces personnel. A lower response rate means that those at the lowest ranks had relatively high weights when compared to other rank groups. Despite these caveats, weighting the data has still enabled the analysis to produce more reliably representative estimates for the perceptions, attitudes and experiences of each weight class, and therefore each Service, rank and sex. Further details on the weightings used can be found in the Methodology section of the main report.

6. Timeliness and Punctuality

6.1 Timeline

The survey was in field for six weeks, with an additional two-week extension (from May to July 2025), to improve response rates. During the two-week extension, additional physical communications (such as posters) were installed in areas of high foot traffic where difficult-to-reach populations might see them. The analysis window has taken three to four months. Given the large and complex nature of the survey, this means that there is only a small gap between the end of fieldwork and the publication of the report. The timing of data collection also has the presentational benefit of allowing results to be published within the same calendar year as the data was collected.

6.2 Punctuality

The publication date is pre-announced on the GOV.UK Official Statistics Release calendar and confirmed over four weeks before publication. All pre-announced publication deadlines have been met.

7. Coherence and Comparability

SBSHS is a definitive source of data about Service personnel’s experiences and perceptions of sexualised behaviours and sexual harassment in the Armed Forces. The SBSHS is a new and standalone survey and allows the department to make robust comparisons between Service and Rank groups, as well as the potential to compare by other demographics. Results are not meant to be directly compared to existing research due to differences in questionnaire design, sample frame, and points of investigation. There are no other tri-Service data sources that we are aware of that collect the same attitudinal information with a similar level of coherence.

Similar data which should be highlighted comes from the Armed Forces Continuous Attitude Survey (AFCAS), which estimates Service personnel’s self-reported experience of ‘sexual harassment’ by asking the question: “Do you believe you have been subjected to sexual harassment in a Service environment in the last 12 months?”. SBSHS asks no such question and should not be directly compared, because the SBSHS focuses on sexualised behaviours and not on a legal definition of sexual harassment. A self-reporting approach, where individuals identify whether they have experienced sexual harassment, may be open to subjective variability, which is reduced when using a set of behavioural indicators. Estimates of perceived levels of experienced sexual harassment are calculated based on the alignment of behaviours that Service personnel have experienced with whether they identified the same behaviour as sexual harassment in the corresponding question. Estimates should also not be compared because the SBSHS is anonymous whereas AFCAS is confidential; Service personnel may be more likely to share their views on a sensitive subject when the survey is anonymous. Lastly, the questions are asked of different populations: due to inclusion based on lived experience, SBSHS includes Full Time Reserve Service (FTRS) personnel in the Regular statistics, whereas AFCAS does not include this group.

8. Accessibility and Clarity

8.1 Accessibility

The SBSHS report and this BQR are published in HTML format and available on the SBSHS landing page.

Supplementary tables are saved in both Excel and ODS format, and accessibility guidance from central Government and the Analytical professions was consulted during their formation.

8.2 Clarity

In addition to this BQR, the SBSHS report gives a summary of the main findings on the front page, contains a narrative section which aids users’ interpretation of the data and a methodology section including target population, information on the sample, respondents, weighting, statistical tests used, and notations and definitions. Contact details for further questions are also included in the Main Report.

Margin of Error tables are also included in the Supplementary tables for every question to indicate precision of the estimates and relevant footnotes are shown below tables to indicate any filters that have been applied to the data and data quality issues.

8.3 Trade-offs between Output Quality Components

The main trade-off is between timeliness and quality. The tables are broken down by Service, rank, and sex. This is so that the basic statistical information can be made available to policy users and the public as soon as possible in a clear and accessible format. Additional analysis for internal use will be discussed, and external use can be requested, such as through the Freedom of Information process.

8.4 Cost and Respondent Burden

Costs for production including resourcing are closely monitored, and the Surveys team and the working group strive to balance quality and timeliness against them. The sample size is calculated to be the most efficient in order to meet the levels of precision outlined before.

Response to SBSHS is voluntary. Participant information is provided within the questionnaire to encourage informed consent. Most respondents complete the survey within 25 minutes.

9. Confidentiality and Security

9.1 Confidentiality

SBSHS is an anonymous survey; personnel access the survey through a link provided, and no personal information is required for access. Personal data is used to gather email addresses to send invitations to the selected sample; where appropriate, Data Protection Impact Assessments and Data Access Agreements are in place to minimise risk to confidentiality, in accordance with the Data Protection Act. Only members of the research team have access to this personal data, and records are destroyed after use.

Results where the responding group size is less than 30 are not presented, one reason being that they may be disclosive. Appropriate secondary suppression is also conducted before analysis is presented in the Supplementary tables.

9.2 Security

All staff involved in the SBSHS production process adhere to the MOD and Civil Service data protection regulations. In addition, all members of the working group have to follow the relevant codes of practice for their professional groups; the Government Statistical Service (GSS) and the Government Social Research (GSR) Service. All data is stored, accessed, and analysed using the MOD’s internal, secure IT system.