Research and analysis

Review of English language assessment methods

Published 29 January 2026

Introduction

About the review

English language is a cornerstone of the immigration system and proficiency is vital for international students to successfully follow and complete their course of study. Students are more likely to successfully graduate if they have sufficient grasp of the language in which their course is delivered.

Currently, higher education providers (HEPs) with a track record of compliance can, using any method they choose, self-assess the English proficiency of their sponsored students studying at degree level or above.

In May 2024, the previous Government announced a series of measures to reduce the potential for abuse on the Student and Graduate routes. One of these measures is to review methods of English language assessment with the sector, to understand how assessments are currently carried out and whether the process can be improved. On 30 July 2024, the Government confirmed it would continue the previous Government’s announced measures to prevent misuse of the Student and Graduate visa routes.

This survey was undertaken as part of the ongoing review into the English language assessment methods used by HEPs. It is intended to improve understanding of the processes that are currently being followed by HEPs in ensuring international students are meeting the required standard of English language proficiency set by the Home Office.

About the research

Prior to beginning the fieldwork process, higher education (HE) sector representatives were consulted and provided with the opportunity to share their views on survey content and development. Sector representatives were also consulted in helping to promote the survey to increase engagement from HEPs.

The survey was conducted online via Qualtrics and was open for completion from 10 March to 14 April 2025. In total, 343 contacts across more than 200 HEPs were invited to take part, with only one response collected per institution. 144 fully completed responses were received and are included in this report, equating to a response rate of 42%.

The survey included questions on participant demographics, including institution size, type, location, and nationalities most frequently encountered when assessing for English language proficiency. The survey also included questions on:

  • accepted methods of assessment
  • test providers and English language tests
  • details of the assessment process, including important criteria, challenges, and barriers in assessing English language proficiency
  • in-house English language testing
  • fraud, compliance and monitoring, including information on document verification processes
  • use of third-party services

Reporting considerations

  • ‘HEPs’ refers to higher education providers
  • ‘SELT’ stands for ‘secure English language test’
  • ‘SELT providers’ are Pearson PTE, Trinity College London, IELTS, Skills for English (PSI Services UK Ltd) and LanguageCert
  • all other providers listed throughout this report are ‘Non-SELT Providers’ that have not had their testing services verified by UKVI
  • ‘Unprompted response’ refers to responses which are coded from open text questions (that is, no response options were provided for selection)
  • quotes used throughout the report have been checked to ensure they contain no information which could lead to an individual or institution being identified

Executive summary

Accepted formats, providers and tests

SELTs were an accepted method of assessment for almost all HEPs (99%), with the vast majority also accepting Non-SELTs (92%), international English subject qualifications (93%) and degrees conducted in English (92%). HEPs were somewhat less likely to accept Local English subject qualifications (83%) and Pre-sessional English courses / pathway programmes (78%), while only a third (33%) accepted high-school / college level diplomas conducted in English. Over one-third (35%) provided in-house English language testing.

Tests from SELT providers were generally more likely to be accepted than tests from Non-SELT providers. The vast majority accepted tests from IELTS (99%) and Pearson PTE (94%); around three-quarters accepted tests from LanguageCert (78%) and Trinity College London (78%), while Skills for English was accepted by just over half (53%). TOEFL (89%) and Cambridge English (86%) were the only Non-SELT providers accepted by more than half of all HEPs.

While many tests were listed as acceptable in evidencing English language ability, most were listed by only one or two HEPs; 21 specific English language tests were accepted by at least a quarter of all HEPs, with 11 of these administered by SELT providers. IELTS Academic and Pearson Test of English (PTE) Academic UKVI were accepted by the vast majority of those surveyed (97% and 94% respectively). TOEFL iBT (Test Centre (89%) / Home (65%)) and Cambridge English (C1 (84%) and C2 (76%)) tests were accepted by more HEPs than not.

Assessment processes

Test quality, reliability, security, and the reputation of the provider were cited by the vast majority as essential in deciding which tests to accept.

Almost all HEPs (97%) reported reviewing the validity of Non-SELT assessments to ensure the reliability of these tests, while 92% said they reviewed test security.

Top reasons for providing in-house English language testing were:

  • to provide an accurate assessment of English language ability (63%)
  • to provide an alternative for students unable to access other forms of assessment (61%)
  • to gain greater control of the assessment process (47%)

Almost all (96%) accepted passports as a valid form of ID. Over one-third (37%) accepted another (non-specified) form of official identification.

Around one-fifth (18%) of those providing in-house testing used a third-party service to support this process, with provision of testing platforms the most commonly used service. Most used these services for cost or convenience reasons, for both the HEP and the student. However, these findings should be treated with caution due to the low number of HEPs using in-house testing.

The 2 main barriers around English language testing were limited accessibility to approved testing centres (56%) and difficulty in ensuring consistency across providers (51%). Difficulty with document verification (31%) and clarity/consistency of government policy guidance (28%) were also noted as challenges.

Fraud, compliance and monitoring

Over one-third (35%) reported conducting internal checks to assess English language qualifications. Many HEPs (59%) spent 30 minutes or less reviewing English language documentation for a single application, with around one-third (34%) spending a few minutes or less. Roughly two-thirds (65%) of HEPs reported using an ‘external organisation’ to help verify applicant documentation, whilst in open text comments four-fifths (80%) referenced using a provider website to verify English language documents (suggesting that at least some HEPs interpret the use of test providers’ online verification services as distinct from using an ‘external organisation’ to help verify English language documentation). Under one-fifth (17%) of HEPs used third-party services for English language proficiency checks; nearly all of these used the services for document verification (92%), with around one-third (32%) using them for compliance support.

Around four-fifths (79%) evaluated test security and fraud as part of due diligence, with half of HEPs (50%) having a written policy to detect and/or prevent instances of fraud. Around one-tenth (12%) of HEPs had not encountered any fraud; most had experienced fraud rarely, very rarely, or almost never (71%); around one-sixth (16%) had experienced it sometimes, and a small minority (2%) reported that fraud occurred often. Internal review (82%) and student withdrawal (87%) were the most common ways to deal with fraud. Most HEPs mentioning fraud in open text responses expressed a strict approach in dealing with its presence. The vast majority reported offering fraud prevention training (92%) or providing staff the opportunity to participate in UKVI document verification training (94%).

Roughly four-fifths (79%) had a process in place for dealing with students displaying an inadequate level of English language ability. Three-quarters (76%) offered English language support to struggling students, while two-fifths (40%) offered academic support. Some HEPs mentioned that they would also undertake compliance procedures (for example, document checks (44%)) where fraud had been identified post-enrolment.

The nationalities most often encountered when assessing for English language proficiency included those from India (92%), China (76%), Nigeria (65%) and Pakistan (65%). Students from Ghana (43%) and Bangladesh (38%) were also frequently encountered. All other national groups were mentioned by less than one-third of HEPs.

1. Accepted formats, providers and tests

While SELTs were accepted by almost all HEPs, a very high proportion also accepted Non-SELT tests (92%), international English subject qualifications (93%) and degrees conducted in English (92%) as evidence of English language proficiency. A majority also accepted Local English subject qualifications (83%) and Pre-sessional English courses / pathway programmes (78%). High-school / college level diplomas conducted in English were, by comparison, only accepted by a third (33%), with a similar proportion of HEPs providing their own in-house testing (35%). This suggests that HEPs would prefer to review certified evidence of English language proficiency, rather than test from scratch.

Figure 1: Accepted methods of assessment

Base: “Which of the following does your institution accept as evidence of English language proficiency for prospective international students applying to courses at degree level or higher?”, All respondents (144).

Tests from SELT providers were generally more likely to be accepted than tests from Non-SELT providers. Nearly all HEPs accepted tests from IELTS (99%) and Pearson PTE (94%); over three-quarters accepted tests from LanguageCert (78%) and Trinity College London (78%), while Skills for English was accepted by just over half (53%). TOEFL (89%) and Cambridge English (86%) were the only Non-SELT providers accepted by more than half of all HEPs.

Figure 2: Accepted providers

Base: “…Please select the providers that your institution accepts tests or certifications from as evidence of English language proficiency for prospective students applying to courses at degree level or higher.”, All respondents (144).

Although a number of different tests were listed as acceptable in evidencing English language ability, 21 specific English language tests were accepted by at least a quarter of all HEPs. Of these, IELTS Academic and Pearson Test of English (PTE) Academic UKVI were the most likely to be accepted, with all but a few HEPs accepting these specific tests. TOEFL iBT (Test Centre / Home) and Cambridge English (C1 and C2) tests were accepted by more institutions than not.

Figure 3: Accepted tests (most frequently accepted)

Base: “Which of the following tests or certifications does your institution accept as evidence of English language proficiency for prospective students applying to courses at degree level or higher?”, All respondents (144). Responses below 25% not shown.

2. Assessment processes

2.1 External English language testing

When asked which criteria were most important in selecting which tests to accept, almost all HEPs reported that ‘test quality/reliability’ (97%) and ‘security’ (95%) were essential. A majority (84%) reported ‘reputation of the provider’ as essential to the decision-making process. All other factors were more likely to be viewed as desirable rather than essential. Some factors were considered irrelevant by many, most notably ‘cost to the university’ (40%).

Figure 4: Test selection criteria

Base: “What criteria are most important for your institution when deciding which tests to use for assessing English language proficiency?”, All respondents (144).

When asked what steps were taken to ensure the reliability of Non-SELTs, almost all HEPs (97%) reported reviewing the validity of Non-SELT assessments, with the majority (92%) also reviewing test security. Many reviewed marker training and standardisation (63%), with ‘other’ responses showing additional consideration for benchmarking against widely accepted criteria (for example, CEFR, 4 language skills). Applicant experience and accessibility considerations were selected by less than half (41%) and one-third (32%) respectively.

Figure 5: Steps taken to ensure reliability of non-SELTs

Base: “What steps does your institution take to ensure that non-SELT tests are reliable in determining the English language ability of prospective students?”, All accepting ‘Other (non-SELT) English language tests or certifications (for example, TOEFL, TOEIC, Duolingo)’ (133).

“We investigate whether the test covers all 4 language components and also consider assessment rigour (for example, length of the written test). We also look at how the test provider has benchmarked their test results against CEFR. We also ask for information about how the test provider deals with test taker breaches (and potential breaches). All tests are examined by the Head of Centre for International English.”

2.2 In-house English language testing

Of the roughly one-third (35%) of HEPs that reported providing in-house English language testing, many did so because they felt it provided an accurate assessment of English language ability (63%) or provided an alternative for students unable to access other forms of assessment (61%). Just under half used this method to give them greater control of the assessment process (47%) and to improve accessibility for the student (41%). For around a quarter of respondents, it was also seen to be quicker (27%) and more secure (22%).

Figure 6: Top reasons for developing and using in-house English language testing

Base: “What are the top 3 reasons for developing and using your own in-house English language testing as an alternative to other methods of assessment?”, All providing ‘In-house English language testing’ (51).

Almost all HEPs (96%) accepted passports as proof of identification for in-house testing. Over one-third (37%) accepted another official form of identification, with UK driving licences, international driving licences, and other (non-specified) national ID cards each accepted by less than one-fifth of all HEPs. Several other forms of identification were mentioned, although not more than once or twice. Roughly a quarter (27%) collected applicants’ personal details.

Figure 7: Accepted methods for verifying applicant identity for in-house tests

Base: “Please select the methods that your institution accepts for verifying the identity of prospective students taking in-house English language tests”, All providing ‘In-house English language testing’ (51).

Around one-fifth (18%) of those providing in-house testing used a third-party service to support this process. Services provided to HEPs by third parties included test platforms and materials, invigilation, test scoring and data management, fraud detection, and logistical support. Most used these services for reasons of cost-effectiveness, convenience / accessibility for students, and a lack of internal resources (although these findings are not conclusive due to low base sizes).

Figure 8: Use of third-party services for in-house English language testing

Base: “Does your institution use third-party services to administer in-house English language testing?”, All providing ‘In-house English language testing’ (51).

2.3 Challenges for HEPs

The 2 main barriers, mentioned by more than half of HEPs, were limited accessibility to approved testing centres (56%) and difficulty in ensuring consistency across providers (51%). Difficulty with document verification (31%) and clarity / consistency of government policy guidance (28%) were noted as challenges by more than a quarter of HEPs. Only a small number of respondents (12%) did not indicate facing any challenges or barriers, selecting ‘None of the above’.

Figure 9: Challenges and barriers in assessing English language ability

Base: “Which of the following do you recognise as challenges or barriers currently facing your institution in assessing the English language ability of prospective international students”, All respondents (144).

3. Fraud, compliance and monitoring

3.1 Document verification process

When asked to outline, without prompting, the steps taken to verify English language documentation, the majority of HEPs (80%) reported using test providers’ online websites / portals to verify English language qualifications. Some HEPs (16%) reported that they would contact the issuing institution directly, and others (12%) reported that they would use another third-party software (for example, ‘Qualification Check’). When prompted, around two-thirds (65%) of respondents reported their institution using an ‘external organisation’ to help verify applicant documentation.

Figure 10: External verification (steps taken to verify qualification documents)

Base: “Please outline the steps that your institution takes to verify the authenticity of qualification documents used to evidence English language ability”, All respondents (144). Unprompted response (derived through coding of open text data). Responses below 2% not shown.

“For the language tests (IELTS/TOEFL/ PTE/ IB, for instance) we will use the online verification services from the test provider to ensure the certificates provided match what is on the test provider’s database. The work is carried out in-house as a standard process during the application assessment.”

Without prompting, roughly one-third (35%) of HEPs reported conducting some form of internal verification when assessing English language documents. Some HEPs (15%) would conduct an interview or in-house test/assessment as part of this process, and some (12%) would cross-check with another form of photo ID. Most HEPs (59%) took 30 minutes or less to review the English language documentation for a single application, whilst roughly one-third (34%) would spend only a few minutes or less. Some institutions stated that the time taken to review documentation largely depended on the nature of the application and the type of evidence provided; verification of lesser-used English language tests or certificates could take more time.

Figure 11: Internal verification (steps taken to verify qualification documents)

Base: “Please outline the steps that your institution takes to verify the authenticity of qualification documents used to evidence English language ability”, All respondents (144). Unprompted response (derived through coding of open text data). Responses below 2% not shown.

Figure 12: Time taken to review English language documentation

Base: “Please give us your best estimate of the average time taken to review the documentation relating to English language proficiency for a single (non-domestic) application”, All respondents (144).

“Where available, all English language test results are verified via the online test verification portal. At this stage, student photos are compared to passport photos, if applicable. We work in partnership with Qualification Check (external company) to directly verify qualifications where we have concerns, where possible. With all documentation assessed, the admissions team carry out a series of checks on the document to ensure it appears genuine, for example, alignment, font, marks and inconsistencies. They are all trained regularly on identification of fraud, engage with UKVI webinars on this topic, and there are a range of resources available to support this practice.”

Under one-fifth (17%) of HEPs reported using a third-party service for English language proficiency checks and document verification. Nearly all of these (92%) used the services for document checks, with roughly one-third (32%) using them for compliance support. Third parties were used for a few reasons, including greater reassurance in meeting compliance standards (60%), access to external expertise and specialised tools/data (56%), reduced workload (52%) and improved accountability (48%).

Figure 13: Third parties used for English language proficiency checks and document verification

Base: “Does your institution use third-party services to undertake English language proficiency checks and document verification?”, All respondents (144).

Figure 14: Services that third parties offer HEPs

Base: “What services do these third-party providers offer to your institution to support with English language proficiency checks and document verification?”, All using a third-party service (26).

Figure 15: Reasons for using third-party services

Base: “What are the main reasons your institution opts for third-party services to support English language proficiency checks and document verification?”, All using a third-party service (26).

Although training was less often mentioned when discussing the steps taken to verify English language qualification documents, when prompted the vast majority of HEPs reported offering some form of internal fraud prevention training (92%) and providing the opportunity for staff to participate in UKVI document verification training (94%). Most of those mentioning training in open text responses reported providing access to internal, UKVI, and ECCTIS training.

Figure 16: Access to training

Base: “Please read the below statements and select all that apply to your institution?”, All respondents (144).

“When assessing English language proficiency through a high school qualification, our admissions staff conduct a thorough certificate verification check. They are trained to recognize indicators of fraud, regularly attending UKVI and ECCTIS training sessions to stay informed on best practices.”

Several different teams were cited as being involved in the process of verifying qualification documents used to evidence English language ability. Admissions teams were cited by almost half (46%), indicating that admissions staff most commonly handle the verification process. Immigration compliance / CAS / international student services staff were the next most likely to be involved, referenced by 13% of respondents in open text responses.

Figure 17: Teams involved

Base: “Please outline the steps that your institution takes to verify the authenticity of qualification documents used to evidence English language ability”, All respondents (144). Unprompted response (derived through coding of open text data). Responses below 2% not shown.

“All documentation is assessed and verified internally via our Admissions team and then rechecked via our international student services team prior to CAS issuance. Where possible, online verification of documentation is used. If not, we check that documentation looks authentic, and follows the standard format and structure of previous documentation we have seen…”

3.2 Fraud

Although fraud was rarely explicitly mentioned when discussing the steps taken to verify English language qualifications, when prompted most (79%) respondents reported that their institution evaluates test security / fraud as part of due diligence, with half (50%) of HEPs having put in place a written policy to help detect and prevent instances of fraud. Most who explicitly mentioned fraud in open text responses expressed a strict approach to dealing with its presence.

Figure 18: Fraud prevention

Base: “Please read the below statements and select all that apply to your institution?”, All respondents (144).

“…Where an English language document is proven or suspected to be fraudulent, we present our findings to the applicant in writing and give them an opportunity to respond and provide further information or an explanation. If the validity of the documents cannot be confirmed, the explanation provided is insufficient, or a response is not received within 2 weeks, we withdraw all current offers and applications; withdraw assigned CAS; withdraw sponsorship; refuse to assess any future applications”

A small proportion of HEPs (12%) reported not encountering any instances of fraud, while most were likely to have experienced fraud either rarely, very rarely, or almost never (71%). Around one-sixth (16%) had experienced fraud sometimes, with a minority (2%) saying fraud occurred often at their institution. Internal review (82%) and student withdrawal (87%) were the most common ways to deal with fraud, whilst over one-quarter (28%) would pursue a more formal / legal investigation.

Figure 19: Prevalence of fraud

Base: “Where concerning proof of English language proficiency, roughly how often do instances of forgery, fraud, or impersonation occur at your institution?”, All respondents (144).

Figure 20: How fraudulent behaviours are dealt with

Base: “How are fraudulent behaviours related to English language testing by applicants dealt with by your institution?”, All respondents (144).

3.3 Post-enrolment

Most (79%) HEPs had a process in place for providing feedback on students displaying an inadequate level of English language proficiency. Around three-quarters (76%) offered support to students struggling with English language, and two-fifths (40%) offered academic support. Respondents also noted that they would undertake compliance procedures in such instances (for example, checking documentation (44%), reviewing the test used (26%), raising with admissions (15%) and retesting (13%)).

Figure 21: Process in place for feeding back on English language ability of enrolled students

Base: “Does your institution have processes in place for sharing information/providing feedback to the admissions team if a student displays an inadequate level of English language proficiency during their course?”, All respondents (144).

“If an academic department identify that an international student is struggling due to English language ability, the department will contact Admissions and ask them to review the application documentation to ensure that there was no fraud at point of admission. If Admissions are concerned following this review, they will make contact with the test provider and ask them to review the English language test for the student.”

Figure 22: Steps taken if student seen to be struggling due to low English language ability

Base: “Please describe the steps that would be taken by your institution if an international student was seen to be struggling academically due to a lack of English language proficiency?”, All respondents (144). Unprompted response (derived through coding of open text data). Responses below 3% not shown.

“Academic units will monitor a student’s academic performance and will ensure a student is accessing the most appropriate English language support for their circumstances. Ultimately if academic performance does not improve a student will be dealt with by our standard exam board or unsatisfactory progress processes which can ultimately lead to a student being prevented from progressing on their programme and being excluded from further study.”

Annex 1: Institution profile

The tables in this annex detail the institutional profile of the 144 HEPs surveyed.

Table A1

Institution size Number %
Large (over 15,000) 68 47%
Medium (5,000 to 15,000) 44 31%
Small (less than 5,000) 32 22%

Table A2

Institution type Number %
Traditional university (established prior to 1992) 55 38%
Modern university (established post-1992) 52 36%
Specialist institution (focus on arts, business, music, for instance) 30 21%
Other / Prefer not to say 7 5%

Table A3

Institution location Number %
East of England 8 6%
East Midlands 5 3%
London or Greater London 33 23%
North East 6 4%
North West 10 7%
South East (excluding London) 19 13%
South West 12 8%
West Midlands 11 8%
Yorkshire and the Humber 13 9%
Scotland 17 12%
Wales 8 6%
Northern Ireland 2 1%

Table A4

Share of international students Number %
0 to 10% 31 22%
11 to 20% 35 24%
21 to 30% 40 28%
31 to 40% 15 10%
41 to 50% 14 10%
51% or more 8 6%
Don’t know 1 1%

Annex 2: Nationality profile

This annex details the nationalities most often encountered when assessing for English language proficiency.

Students from India were the nationality group most often encountered when assessing for English language proficiency, being listed by the majority (92%) of HEPs. Around three-quarters listed students from China (76%), and roughly two-thirds listed students from Nigeria (65%) and Pakistan (65%). Students from Ghana (43%) and Bangladesh (38%) were also frequently encountered when assessing for English language proficiency.

Table A5

Nationality profile Number %
India 132 92%
China 110 76%
Nigeria 93 65%
Pakistan 93 65%
Ghana 62 43%
Bangladesh 55 38%
Nepal 37 26%
Hong Kong 34 24%
Malaysia 33 23%
Saudi Arabia 30 21%
Sri Lanka 30 21%
France 29 20%
Thailand 27 19%
Italy 25 17%
Germany 23 16%
Turkey 23 16%
Kenya 22 15%
Singapore 20 14%
Taiwan 18 13%
Vietnam 18 13%
Japan 17 12%
Kuwait 17 12%
Iran 16 11%
Spain 16 11%
South Korea 13 9%
UAE 12 8%
Indonesia 10 7%
USA 9 6%
Afghanistan 8 6%
Morocco 5 3%
Netherlands 5 3%
Norway 5 3%
Russia 5 3%
Brazil 4 3%
Myanmar 4 3%
Poland 4 3%
Portugal 4 3%
South Africa 4 3%
Zimbabwe 4 3%
Canada 3 2%
Egypt 3 2%
Uganda 3 2%
Uzbekistan 3 2%

Notes:

  1. Responses below 2% not included

Figure 23: Nationality profile

Base: “Please list the countries/nationalities most frequently encountered when assessing for English language proficiency. You may list up to 10.”, All respondents (144). Students from countries highlighted in grey are still liable to be assessed for English language proficiency.