The Evidence About What Works in Education: Graphs to Illustrate External Validity and Construct Validity

Insights from the Research on Improving Systems of Education (RISE) programme

Abstract

Currently, the bulk of new empirical work estimating the impact of various education projects/programmes/policies on learning, while based on sound principles of estimating causal impacts, is too inadequately theorised and specified to be of much immediate and direct use in formulating effective action to accelerate learning. Simply producing “more of the same” empirical research is therefore unlikely to be of much help, or to add up to a coherent action or research agenda, as it faces massive challenges of external and construct validity.

The RISE research agenda is moving forward by:

  • embedding research in a prior diagnostic of the overall system, which allows a more precise characterisation of what “context” might mean;

  • evaluating ongoing attempts at education reform at scale (rather than isolated field experiments);

  • being specific about the details of programme/project/policy design;

  • acknowledging that policy-relevant learning is itself part of the system, not a one-off exercise.

This work is part of the Department for International Development’s ‘Research on Improving Systems of Education’ (RISE) Programme.

Citation

Pritchett, L. 2017. The Evidence About What Works in Education: Graphs to Illustrate External Validity and Construct Validity. Research on Improving Systems of Education (RISE) Insights.


Published 1 June 2017