Guidance

Reception baseline assessment framework

Updated 1 April 2025

1. Purposes of statutory assessment

The main intended uses of the outcomes of statutory primary assessment are set out in the Bew Report and the Government’s consultation document on primary assessment and accountability.

The reception baseline assessment (RBA) is an on-entry assessment which forms part of the statutory assessment arrangements for pupils in the reception year. The purpose of the RBA is to create a baseline measure from which cohort-level progress to the end of key stage 2 (KS2) can be calculated.

The RBA is not intended to:

  • be used in any way to measure performance in the early years, evaluate or compare pre-school settings or hold early years practitioners to account
  • provide detailed diagnostic information about pupils’ areas for development
  • provide ongoing formative information for practitioners or replace any on-entry baseline work schools already undertake

The assessment falls under Ofqual’s regulatory framework for national assessments.

2. Purpose of this framework

The purpose of an assessment framework is to guide the development of an assessment. As such, this framework is written primarily for those who write assessment materials and to guide subsequent assessment development and construction. It is available to a wider audience for reasons of openness and transparency.

The framework outlines the areas of early childhood development that will be covered in the assessment (the content domain) as well as the cognitive processes associated with the question types (the cognitive domain).

The assessment framework also includes a specification from which a valid and reliable assessment can be constructed. This includes specifics about assessment format, question types, response types, scoring and a clear assessment-level reporting strategy.

By providing all this information in a single document, the assessment framework answers questions about what the assessment will cover, and how, in a clear and concise manner. The framework does not provide information on how teachers should approach teaching in the early years.

The assessment development process used by the Standards and Testing Agency (STA) embeds within it the generation of validity and reliability evidence through expert review and trialling. The assessment framework does not provide detail of the validity and reliability of individual assessments. That information will be provided in the RBA validity framework, which will be published separately following the administration of the assessment.

This assessment framework should be used in conjunction with the annual RBA assessment and reporting arrangements (ARA).

3. Nature of the assessment

The RBA is an on-entry assessment which forms part of the statutory assessment arrangements for pupils in the reception year.

It is distinct from the early years foundation stage (EYFS) profile which summarises and describes children’s attainment at the end of the reception year.

The RBA is a short assessment designed to capture the wide range of attainment in literacy, communication and language (LCL) and mathematics that is common in reception classes at the start of the year. It also has clear links to the key stage 1 and key stage 2 (KS2) national curriculum (2014) against which progress will be measured when pupils reach the end of KS2.

3.1 Population to be assessed

The RBA is designed as a suitable assessment for pupils to take during their first half term in reception. All pupils who have not previously been assessed should be assessed within the first 6 weeks of joining reception, regardless of when they join the class.

A small minority of pupils will be exempt from the assessment. Further details are available in the RBA ARA.

3.2 Assessment format and routing

The RBA comprises 2 separate components:

  • mathematics
  • LCL

The questions in each component are, where appropriate, ordered according to developmental progression.

The assessment usually takes less than 20 minutes for each pupil. Routing is used to help prevent pupils from being presented with too many questions with which they are unlikely to be successful. Routing means that a more difficult question may not be presented to a pupil if their previous answers suggest they are unlikely to get it right. It also helps to reduce the time required for the assessment and could reduce the possible discomfort a pupil may feel if they are unable to answer a question.

Within each component, there are several places where routing rules may be applied based on how a pupil responds to each question they are presented with. The routing rules have been developed based on data from trialling. Questions that the pupil answers incorrectly and questions that are not presented to the pupil due to routing will score 0 marks. The system will automatically route the pupil through the assessment.
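As an illustration only, the routing just described can be pictured as a simple loop that presents questions in order and jumps ahead when a rule fires. The sketch below is a hypothetical Python rendering: the function, the shape of the rules and the thresholds are invented for the example, since the actual routing rules are data-driven and are not published in this framework.

```python
# Hypothetical sketch of routing: the real rules are derived from trialling
# data and are not published here. All names, rule shapes and thresholds are
# invented for illustration.

def run_component(question_ids, is_correct, skip_rules):
    """Present questions in developmental order, skipping ahead when a rule fires.

    question_ids: ordered list of question identifiers (1 mark each, for simplicity).
    is_correct:   callback that presents a question and returns True or False.
    skip_rules:   {question_id: (marks_needed_so_far, resume_at_question_id)}.
    """
    scores = {qid: 0 for qid in question_ids}    # questions never presented score 0
    i = 0
    while i < len(question_ids):
        qid = question_ids[i]
        if is_correct(qid):                      # auto-marked, or recorded by the practitioner
            scores[qid] = 1
        if qid in skip_rules:
            marks_needed, resume_at = skip_rules[qid]
            marks_so_far = sum(scores[q] for q in question_ids[:i + 1])
            if marks_so_far < marks_needed:      # unlikely to succeed on the harder questions
                i = question_ids.index(resume_at)
                continue                         # skipped questions keep a score of 0
        i += 1
    return scores
```

In the live assessment the system applies the routing rules automatically, so the practitioner does not make any routing decisions.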

3.3 Assessment administration

The assessment must be administered by a reception teacher, reception teaching assistant or suitably qualified practitioner such as an early years lead or special educational needs coordinator. Those conducting the assessment should be familiar to the pupil, fully trained and familiar with the assessment materials.

The practitioner must administer the assessment to each pupil on a one-to-one basis, using practical resources and 2 digital devices:

  • one for the practitioner administering the assessment
  • one for the pupil

The device used by the pupil must have a touchscreen measuring at least 9.7 inches diagonally. The device used by the practitioner does not require a touchscreen and can be any suitable device such as a tablet, desktop or laptop. STA will not provide schools with these devices.

The practitioner will use their device to control the pupil’s journey through the assessment. For some question types (for example, where a verbal response is required) the practitioner will be required to record the pupil’s response. For other question types (for example, multiple choice response) the response will be automatically marked.

The RBA is not timed and the pupil can complete the assessment at their own pace. The assessment is unlikely to take longer than 20 minutes for a pupil to complete. The assessment can also be paused and returned to at a later stage. Schools can choose the order in which they administer the 2 assessment components.

4. Content domain

The RBA is an age-appropriate assessment of mathematics and LCL that will be delivered in English. It is linked to the learning and development requirements of the EYFS. However, due to the length and nature of the RBA, not all areas of learning and development in the EYFS are assessed. Table 1 shows the content domain, which sets out the components assessed in the RBA.

Table 1: Content domain

Each entry lists the content domain strand and substrand, the content domain reference, and the content domain reference label.

  • Mathematics, Number – M1: Select a number of objects from a group
  • Mathematics, Number – M2: Count a number of objects
  • Mathematics, Number – M3: Identify numerals
  • Mathematics, Number – M4: Order numerals
  • Mathematics, Calculation – M5: Solve number stories involving partitioning and combining quantities
  • Mathematics, Calculation – M6: Solve simple addition and subtraction equations
  • Mathematics, Pattern – M7: Identify and continue patterns
  • Mathematics, Measure – M8: Order and compare items by size
  • LCL, Vocabulary – L1: Name and identify pictures
  • LCL, Language skills – L2: Repeat sentences containing different grammatical structures
  • LCL, Phonological awareness – L3: Identify words with the same initial sound
  • LCL, Phonological awareness – L4: Isolate phonemes from words
  • LCL, Comprehension – L5: Understand a story
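For readers who work with the framework programmatically, the content domain can also be held as a simple lookup. The entries below are copied from Table 1; the Python structure itself is just one convenient, illustrative choice.

```python
# The content domain from Table 1, held as (reference, strand, substrand, label)
# records. The data is from the framework; the structure is illustrative.
CONTENT_DOMAIN = [
    ("M1", "Mathematics", "Number", "Select a number of objects from a group"),
    ("M2", "Mathematics", "Number", "Count a number of objects"),
    ("M3", "Mathematics", "Number", "Identify numerals"),
    ("M4", "Mathematics", "Number", "Order numerals"),
    ("M5", "Mathematics", "Calculation",
     "Solve number stories involving partitioning and combining quantities"),
    ("M6", "Mathematics", "Calculation", "Solve simple addition and subtraction equations"),
    ("M7", "Mathematics", "Pattern", "Identify and continue patterns"),
    ("M8", "Mathematics", "Measure", "Order and compare items by size"),
    ("L1", "LCL", "Vocabulary", "Name and identify pictures"),
    ("L2", "LCL", "Language skills", "Repeat sentences containing different grammatical structures"),
    ("L3", "LCL", "Phonological awareness", "Identify words with the same initial sound"),
    ("L4", "LCL", "Phonological awareness", "Isolate phonemes from words"),
    ("L5", "LCL", "Comprehension", "Understand a story"),
]
```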

5. Cognitive domain

The cognitive domain seeks to make the thinking skills and intellectual processes required for the RBA explicit. Each question will be rated against the strands of the cognitive domain listed in the tables below to provide an indication of the cognitive demand. Information on how the questions are rated is also included in the tables.

The cognitive domain will be used during assessment development to ensure the demand and difficulty of new questions is appropriate.

The following tables provide descriptors for the rating scale. Each of the strands below must be considered in the early years context.

5.1 Short-term memory

This strand is used to assess the demand associated with the temporary cognitive storage of the information necessary to answer questions.

You can think of it as: ‘What is the length and structure of the information provided that needs to be remembered to answer the question?’

Table 2: Short-term memory rating scale

  • 1 (Low): one element to be remembered; cueing, the structure of the question and/or its presentation supports encoding and recall
  • 2: 2 elements to be remembered; cueing, the structure of the question and/or its presentation gives some support with encoding and recall
  • 3 (High): 3 or more elements to be remembered; no memory cues and/or the structure of the question does not support encoding and recall

5.2 Operational complexity

This strand evaluates the complexity of the sequenced steps that are required to complete the question.

You can think of it as: ‘How many and how complex are the mental steps for question completion?’

Table 3: Operational complexity rating scale

  • 1 (Low): one simple mental step required to complete the question
  • 2: 2 simple mental steps required to complete the question
  • 3 (High): 2 or more mental steps required to complete the question – one or more of these steps is complex due to the level of comprehension, skill or specific knowledge required

5.3 Abstractness

This strand evaluates the degree of abstraction from real life of the question to be completed.

You can think of it as: ‘To what extent does this question reflect a familiar, real-life experience for the child?’

Table 4: Abstractness rating scale

  • 1 (Low): activity based on concrete objects utilised in a familiar question setting – demand may also be low in a digital design that imitates real-life objects
  • 2: question relies on pictorial stimuli or prompts – these approximate real-life experience in a moderately familiar and intuitive way
  • 3 (High): question is abstract with little relation to the real-life experience of the child

5.4 Motor demand

This strand evaluates the demand associated with the production of a motor response. The greater the length and precision of a motor response, the higher the rated demand. In most cases, it is the motor demand of a speech or touch response that is evaluated.

You can think of it as: ‘How long and how precise does a response need to be?’

Table 5: Motor demand rating scale

  • 1 (Low): response generated is short; required precision of the motor response is low
  • 2: response generated is of moderate length or comprises more than one part; required precision of the motor response is moderate
  • 3 (High): response generated is long or comprises many parts; required precision of the motor response is high
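During development, a question’s position on these 4 strands could be captured as a small structured record. The sketch below is a hypothetical Python illustration: the 1 to 3 scale and the strand names come from the tables above, while the class, its fields and the example values are invented.

```python
# Hypothetical record of a question's cognitive-demand ratings. The 4 strands
# and the 1-3 scale are from section 5; everything else is illustrative.
from dataclasses import dataclass

RATING_SCALE = (1, 2, 3)   # 1 = low demand, 3 = high demand

@dataclass
class CognitiveRating:
    question_id: str
    short_term_memory: int        # Table 2
    operational_complexity: int   # Table 3
    abstractness: int             # Table 4
    motor_demand: int             # Table 5

    def __post_init__(self):
        for strand in ("short_term_memory", "operational_complexity",
                       "abstractness", "motor_demand"):
            if getattr(self, strand) not in RATING_SCALE:
                raise ValueError(f"{strand} must be rated 1, 2 or 3")

# Invented example: a question rated as low demand on every strand.
example = CognitiveRating("example_question", short_term_memory=1,
                          operational_complexity=1, abstractness=1, motor_demand=1)
```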

6. Assessment specification

This section provides details of each assessment component.

6.1 Breadth and emphasis

The content and cognitive domains for the RBA are specified in sections 4 and 5. Not every element of the content domain may be included within each pupil’s assessment if the routing rules are applied. The questions are placed in order of difficulty, where possible, within each substrand.

6.2 Profile of content domain

The total number of marks available is 40. As the assessment will include carefully designed routing, the number of marks presented will vary from pupil to pupil. Each pupil will be presented with questions which are worth a total of at least 31 marks.

The content within the assessment is specified in the content domain in section 4. The proportion of marks assessing each area of the content domain is included in Table 6 below.

Table 6: Profile of marks by content area (percentage of total mark)

  • Mathematics: 40% to 60%
  • LCL: 40% to 60%
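One way to picture the constraint in Table 6 is as a check applied to a constructed form. In the sketch below, the 40-mark total and the 40% to 60% split are taken from the framework; the function and the example allocation are hypothetical.

```python
# Illustrative check that a constructed form meets the profile in Table 6.
# The 40-mark total and the 40%-60% range are from the framework; the function
# name and the example allocation are hypothetical.

def check_content_profile(marks_by_component):
    """marks_by_component: e.g. {"mathematics": 20, "LCL": 20}."""
    total = sum(marks_by_component.values())
    assert total == 40, "the assessment carries 40 marks in total"
    for component, marks in marks_by_component.items():
        share = marks / total
        assert 0.40 <= share <= 0.60, (
            f"{component} carries {share:.0%} of the marks; "
            "Table 6 requires between 40% and 60%")
    return True

check_content_profile({"mathematics": 20, "LCL": 20})   # an allocation that fits the profile
```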

6.3 Format of questions and responses

The question types in the assessment will be distributed in the proportions indicated in Table 7 below. The range of question types is exemplified by, but not limited to, those listed.

Table 7: Profile of marks by question type (proportion of marks)

  • Multiple choice: 20% to 30%
  • Short answer: 45% to 65%
  • Ordering: 0% to 20%
  • Pick multiple (selecting more than one object from a group): 0% to 20%

The response formats for the questions are included in Table 8 below.

Table 8: Profile of marks by response format (proportion of marks)

  • Verbal response: 40% to 60%
  • Drag and drop: 5% to 25%
  • Touch: 20% to 30%
  • Practical resources: 0% to 15%
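Tables 7 and 8 can be read as range constraints on how the marks are spread across question types and response formats. The sketch below expresses those ranges as data and checks a hypothetical form against them; only the published ranges are taken from the framework.

```python
# The ranges below are from Tables 7 and 8; the checking function and the
# example form are hypothetical.

QUESTION_TYPE_RANGES = {          # Table 7
    "multiple choice": (0.20, 0.30),
    "short answer": (0.45, 0.65),
    "ordering": (0.00, 0.20),
    "pick multiple": (0.00, 0.20),
}

RESPONSE_FORMAT_RANGES = {        # Table 8
    "verbal response": (0.40, 0.60),
    "drag and drop": (0.05, 0.25),
    "touch": (0.20, 0.30),
    "practical resources": (0.00, 0.15),
}

def outside_ranges(marks_by_category, ranges, total_marks=40):
    """Return the categories whose share of the marks falls outside its range."""
    problems = []
    for category, (low, high) in ranges.items():
        share = marks_by_category.get(category, 0) / total_marks
        if not low <= share <= high:
            problems.append((category, share))
    return problems

# Invented example form: an empty list means every category fits its range.
form = {"multiple choice": 10, "short answer": 22, "ordering": 4, "pick multiple": 4}
print(outside_ranges(form, QUESTION_TYPE_RANGES))   # []
```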

6.4 Scoring

Digital technology is used to maximise the manageability of the assessment, enabling quick, easy and automated recording of the pupil’s answers.

Answers for some question types will be recorded automatically through the pupil’s interaction with the screen. However, scoring is not reliant on the pupil’s ability to engage with the device: the practitioner can confirm or input the answers on their own device if the pupil does not wish to interact with the screen.

Some question types (for example, those requiring verbal response) will require the practitioner to record whether the pupil’s response is correct or incorrect. Where this is the case, the embedded mark scheme on the practitioner’s screen will give specific guidance for the scoring of each question. Where multiple correct answers are possible, examples of different variations of the correct answer will be given in the mark scheme. Where applicable, additional guidance will indicate minimally acceptable responses and unacceptable responses.

The practitioner screen will contain:

  • a view of the pupil’s screen
  • the question and any instructions required
  • bullets indicating the required responses, acceptable answers and unacceptable responses (where necessary)
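The two marking routes described above can be pictured as a single scoring step per question. The sketch below is illustrative only: it does not describe the actual digital service, and all names and structures are invented.

```python
# Hypothetical sketch of the two scoring routes: automatic marking from the
# pupil's interaction with the screen, or a practitioner judgement recorded
# against the embedded mark scheme.

def score_question(question, pupil_response=None, practitioner_judgement=None):
    """Return the mark (0 or 1) for a single presented question."""
    if question["marking"] == "automatic":
        # e.g. a touch or drag-and-drop response compared with the keyed answer
        return 1 if pupil_response == question["correct_answer"] else 0
    # e.g. a verbal response: the practitioner records correct or incorrect,
    # guided by the acceptable answers shown on their screen
    return 1 if practitioner_judgement else 0
```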

6.5 Reporting

At the end of the assessment, a single raw score out of 40 will be recorded for each pupil. This score will not be made available to schools or parents. Raw scores will be recorded in the national pupil database and used to create a cohort-level progress measure for schools at the end of KS2.

At the end of the assessment, a series of narrative statements will be generated and provided to schools to describe how each pupil performed in the different content domains presented in the assessment. There is no legal requirement for schools to report RBA narrative statements to parents. However, schools must share a pupil’s RBA narrative statements with the child’s parents if they request them.
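The framework does not define how the cohort-level progress measure will be calculated. Purely as an illustration, the sketch below assumes a value-added style approach in which each pupil’s KS2 outcome is compared with the national average outcome of pupils who had the same RBA raw score; the lookup table, names and method are all assumptions, not the published methodology.

```python
# Illustrative only: the progress methodology is not defined in this framework.
# This sketch assumes a value-added style calculation.
from statistics import mean

def school_progress(pupils, national_average_ks2_by_baseline):
    """pupils: list of (rba_raw_score, ks2_outcome) pairs for one school's cohort.
    national_average_ks2_by_baseline: hypothetical lookup, e.g. {31: 104.2, 32: 104.9}.
    """
    differences = [ks2 - national_average_ks2_by_baseline[rba] for rba, ks2 in pupils]
    return mean(differences)   # positive = the cohort progressed more than average
```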

6.6 Desired psychometric properties

The aim is to develop an assessment with robust psychometric properties to support its intended use as a baseline for a progress measure. Multiple forms have been constructed according to the specification outlined in this framework.

The selection and balance of questions to assess the content domains is evidence based, taking account of known associations with later performance in mathematics and literacy.

To support reliable inferences about progress, the baseline needs to measure across the ability spectrum. To facilitate this, a pupil’s pathway through the assessment will be adapted according to the answers they give. The use of routing rules will personalise a pupil’s assessment journey, allowing greater precision of measurement whilst minimising the conditional standard error of measurement across the range.

The assessment procedure has been designed, through clear guidance, to provide a consistent measure regardless of who administers it. This has been tested through studies of inter-rater reliability. There will be appropriate levels of internal consistency as a measure of reliability for the assessment as a whole and for each component. As the purpose of this assessment is to form a baseline for measured progress at a cohort level, individual scores will not be reported. Instead, qualitative descriptors of pupil performance will be generated when the assessment has been completed. However, any summative progress reporting for accountability purposes will be at school level.
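The framework does not state which reliability statistics will be reported. As one common example of an internal-consistency measure, the sketch below computes Cronbach’s alpha from a matrix of item-level marks; it is offered for illustration only.

```python
# Illustrative only: Cronbach's alpha is one widely used internal-consistency
# statistic; the framework does not specify which statistics will be used.
from statistics import variance

def cronbach_alpha(item_scores):
    """item_scores: one inner list per pupil, one 0/1 mark per question."""
    n_items = len(item_scores[0])
    per_item_columns = list(zip(*item_scores))      # transpose to per-question columns
    item_variances = [variance(column) for column in per_item_columns]
    total_variance = variance([sum(pupil) for pupil in item_scores])
    return (n_items / (n_items - 1)) * (1 - sum(item_variances) / total_variance)
```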

7. Diversity and inclusion

The Equality Act 2010 sets out the principles by which the national curriculum assessments and associated development activities are conducted. During the development of the RBA, STA will make provision to overcome barriers to fair assessment for individuals and groups wherever possible.

Assessments are required to meet Ofqual’s regulatory framework which states that “an assessment should minimise bias, differentiating only on the basis of each pupil’s level of attainment. A pupil should not be disadvantaged by factors that do not relate to what is being tested, for example, the way a particular question may be worded.”

The RBA should:

  • use appropriate means to allow all pupils to demonstrate their knowledge and skills, including pupils with special educational needs and disabilities (SEND)
  • provide opportunities for all pupils to demonstrate their ability, irrespective of gender, race or social and cultural background
  • not be detrimental to pupils’ self-esteem or confidence
  • be free from stereotyping and discrimination in any form

The RBA has been designed to be an inclusive assessment, accessible to the majority of pupils on entry to school. This includes applying the principles of plain English and universal design wherever possible. It aims to assess pupils in a fair and comparable way, with as many pupils as possible able to access the questions. It has been designed so that pupils with SEND and those learning English as an additional language (EAL) can participate in the standard assessment format and has been subject to a SEND and cultural review.

For each assessment in development, expert feedback is gathered on specific questions – for example, during inclusion panel meetings and reviews, which include experts and practitioners from across the fields of inclusion, disabilities and special educational needs. This provides an opportunity for some questions to be amended or removed in response to concerns raised.

Issues likely to be encountered by pupils with specific learning difficulties have been considered in detail. Features of questions that lead to construct irrelevant variance have been considered and questions have been presented in line with best practice for a range of specific learning difficulties.

A range of accessibility settings – for example, changing background colour and colour contrast – have been built into the digital assessment service and can be selected by practitioners for specific pupils, where appropriate.

For pupils with a hearing impairment or who use sign language, the assessment can be conducted in British Sign Language (BSL) or any sign-supported English, using signs familiar to the pupil.

Further information about how the assessment materials should be administered is provided in the RBA modified guidance, training modules and administration guidance.

Questions are presented based on those the pupil has already completed within the assessment. Pupils may find some questions challenging, but the range of questions will allow them to demonstrate the breadth of their abilities and routing rules will help to ensure that pupils are not faced with a significant number of questions that are too difficult for them.

All pupils will be presented with at least 31 marks’ worth of questions across a broad range of areas in mathematics and LCL.

7.1 Access arrangements

The full range of access arrangements is set out in the RBA ARA. These include accessibility settings that change the way the assessment looks on screen, and information about how to access a paper version of the assessment for pupils with a specific need.

7.2 Pupils with English as an additional language

Pupils with EAL should be registered for the RBA. If a pupil’s limited ability to communicate in English means they are unable to access the assessment, then they will be working below the standard of the assessment and should not take it, as set out in the RBA ARA.

8. Glossary

Assessment framework: A document that sets out the principles, rationale and key information about the assessment and contains an assessment specification.
Assessment specification: A detailed specification of what to include in an assessment in any single cycle of development.
Cognitive domain: Cognitive processes are the thinking skills and intellectual processes that occur in response to a stimulus. The cognitive domain makes explicit the thinking skills associated with an assessment.
Component: A section of an assessment, presented to pupils as an assessment paper or assessment booklet. Some assessments have 2 or more components that each pupil needs to take to complete the assessment.
Construct irrelevant variance: The variation in pupils’ test scores that does not come from their knowledge of the content domain. It can result in pupils gaining fewer marks than their knowledge would suggest, or being awarded more marks than their knowledge alone would deserve.
Content domain: The body of subject knowledge to be assessed by the assessment.
Domain: The codified definition of a body of skills and knowledge.
Form: The selection of questions that make up a component.
Mark scheme: The creditworthy response, or the criteria that must be applied, to award the mark for a question in the assessment.
Raw score: The unmodified score achieved on an assessment, following marking. In the case of these assessments, it is the total marks achieved – for example, if a pupil scores 27 out of 60 possible marks, the raw score is 27.
Routing: Moving a pupil past a particular question that they are likely to be unsuccessful with, based on their responses to previous questions.