Research and analysis

Observations on the consistency of moderator judgements

Observational study of local authority moderation of key stage 2 writing assessments in 2017.


Key stage 2 writing moderation: Observations on the consistency of moderator judgements (Cuff, Howard, Mead and Newton, 2018)



This was specific and focused research, observing a small proportion of moderation to provide detailed insights into aspects of the validity of assessment arrangements.

Our main purpose was to identify potential risks to the consistency of moderation judgements and to feed back relevant information to help the Standards and Testing Agency mitigate any such risks in future years. Our observations do not provide a definitive judgement on the quality of moderation, nor a broad representation of national practice.

While our research did not compare 2017 with earlier years, many participants in the study commented that they thought the Interim Teacher Assessment Framework was better understood in 2017 than it had been in 2016. This is supported by data provided to us by the Standards and Testing Agency, which also suggests an improvement in consistency of key stage 2 writing assessment outcomes in 2017 compared to 2016, based on an analysis of the correlation between writing teacher assessment and reading test outcomes.

We identified variations in approaches taken to moderation in 2017, including different logistical arrangements, practices and understandings of Interim Teacher Assessment Framework-referenced moderation. On this basis, we concluded that moderators’ judgements were likely more inconsistent during 2017 than they could have been, that some of this variation may have operated between local authorities, and that it should be possible to reduce inconsistency in future years.

We therefore recommended that the Standards and Testing Agency take steps to reduce the risk of inconsistency in future years, informed by the analysis within this report as well as by its own evidence gathering. We also recommended that it revisit the design of the standardisation test, in light of concerns expressed about its authenticity.

More broadly, our observations, including the finding that some teachers, moderators and moderation managers had not interpreted the Interim Teacher Assessment Framework standards as intended, suggested that it would be appropriate to keep the approach to the assessment of writing under review.

Published 29 March 2018