Mental Health and Employment Partnership: final evaluation report (summary)
Published 29 August 2025
Applies to England
This is the third and final report of a 5-year evaluation of the Mental Health and Employment Partnership (MHEP). The evaluation has assessed the effectiveness of social outcomes partnerships (SOPs - also known as social impact bonds) as a commissioning tool to improve social outcomes for citizens.[footnote 1]
The MHEP SOPs were 5 projects supported by the Life Chances Fund (LCF). The LCF was a £70 million programme funded by the Department for Culture, Media and Sport (DCMS), supporting 29 locally-commissioned SOP projects across England. It ran between 2016 and 2025 and is the largest outcomes fund launched to date in the UK. The LCF was designed to tackle complex social problems across policy areas including child and family welfare, homelessness, health and wellbeing, employment and training, and more.
The Mental Health and Employment Partnership was established in 2015 by Social Finance, and was backed by social investment from Big Issue Invest for a total of £1.2m across 5 projects in Haringey and Barnet, Shropshire, Enfield, and Tower Hamlets (2 projects - mental health and learning disabilities). MHEP supported the delivery of an intervention known as ‘Individual Placement and Support’ (IPS) to help people experiencing mental health issues or learning disabilities to find and remain in competitive, paid work.
The report asks 2 primary questions: whether the MHEP SOP made a difference to the social outcomes achieved, compared with alternative commissioning approaches, and through which mechanisms it contributed to improved services and positive social outcomes.
Key findings
Outcomes performance
For the 4 mental health projects, average project performance against best-case scenario targets was as follows:
- 68% for service user engagement, ranging from 47% in Enfield to 105% in Shropshire (in total these projects engaged 2,524 service users)
- 55% for job starts, ranging from 49% in Enfield to 75% in Shropshire (in total these projects recorded 806 job starts)
- 30% for job sustainment for part-time work (<16 hours) over 13 weeks, ranging from 24% in Shropshire to 38% in Haringey and Barnet (in total these projects recorded 143 part-time job sustainments)
- 55% for job sustainment for full-time work (>16 hours) over 13 weeks, ranging from 42% in Tower Hamlets to 91% in Shropshire (in total these projects recorded 279 full-time job sustainments)
The average job outcomes rate (the proportion of engaged service users moving into work) was 33% across the mental health projects, with the final rate for each project as follows:
- Enfield: 35%
- Shropshire: 34%
- Haringey and Barnet: 32%
- Tower Hamlets: 31%
SOP project costs
For all MHEP SOPs, management costs were higher than anticipated and investment costs lower than anticipated.
- The key factors leading to higher than anticipated costs were COVID-19 disruption, a greater need for performance support, service underperformance against targets, and the delayed start of delivery.
- The key factors leading to lower than anticipated costs were early repayments of loans and the resulting reductions in total interest. Higher than expected achievement of engagements and sustainments also increased income in Shropshire and Tower Hamlets (Learning Disabilities).
Performance incentives
The “dose-response” analysis (see methodology below) found a significant link between performance incentives and job outcomes. However, the analysis was constrained by limited data, and these findings should be treated with caution as unmeasured factors will also have influenced results. The analysis found:
- a £1,000 increment in performance incentive is associated with a 21% increase in job start achievement
- a £1,000 increment in performance incentive is associated with an approximately 17% increase in the speed of job start achievement
- no statistically significant relationship was found between the duration of a project’s exposure to the MHEP performance management regime and job start achievement
In the MHEP SOPs, the strongest performance incentives sat with the intermediary, Social Finance, whose payment was entirely outcomes-based. Financial incentives were more muted for providers who received a significant proportion of their payment through “block” rather than outcomes payments. The intermediary translated the financial incentives it received into softer incentives which respected the intrinsic motivation of the providers. However, the effectiveness of these incentives could be weakened by practical barriers like contract complexity and resource constraints.
Background
The Mental Health and Employment Partnership (MHEP) was established in 2015 to drive the expansion of Individual Placement and Support (IPS) for people with severe mental illness, addictions, and/or learning disabilities. IPS is a rigorously tested employment support intervention that follows a ‘place, then train’ model, where employment specialists support service users to secure employment quickly before providing them with ongoing support to help them stay in work.
MHEP was set up as a special purpose vehicle (SPV), i.e., a separate legal entity created and managed by intermediary Social Finance. MHEP initially partnered with 3 areas in 2016 (Haringey, Tower Hamlets, and Staffordshire) to secure £1.3m of ‘top-up’ funding from the Commissioning Better Outcomes (CBO) Fund and the government’s Social Outcomes Fund [footnote 2]. Additional top-up funding from the Life Chances Fund extended the service to cover 5 projects: Haringey and Barnet, Shropshire, Enfield, Tower Hamlets Mental Health, and Tower Hamlets Learning Disabilities.
Figure 1. MHEP SOPs supported by the LCF
Figure 1 shows the 5 MHEP SOP projects supported through the LCF, showing the role of the MHEP special purpose vehicle overseeing the 5 projects and their providers, and coordinating between investors and commissioners.
This report
This is the third and final report of the MHEP evaluation (a summary of findings from the previous reports can be found in the box below). The following findings are framed through 4 questions:
- Were the MHEP SOPs effective in achieving their outcomes targets and how does their effectiveness compare with that of traditionally commissioned IPS contracts?
- Did the SOP have higher costs than expected and if so, why?
- Was the SOPs’ outcomes achievement (the SOP effect) related to the intensity of the performance management or performance incentive?
- How were different actors incentivised for performance?
MHEP interim report findings summary
The first report [footnote 3] focused on theories of change, explored the distinctive contribution of MHEP SOPs, and analysed performance data on the key outcomes metrics through time and across different sites and providers. The first report found that MHEP SOPs provided additional value compared to traditional commissioning via:
- a dedicated performance management function that was seen to drive additional focus on achieving outcomes
- a more effective working culture within each local partnership
- identifying and successfully unlocking the LCF funding, which was understood to bring additional financial and human resources to projects
The second report [footnote 4] focused on the implementation experience of the MHEP SOP, including whether the MHEP SOPs affected service quality, provider incentives and legacy for providers and commissioners. The second report found that:
- there was improved accountability and commissioning practice under SOPs
- MHEP contributed to the national scaling of IPS via Social Finance’s advocacy
- MHEP SOPs’ contractual and payment structures were unnecessarily complex and could be simplified via earlier buy-in for design principles, annual caps, and more realistic expectations on forecasting outcomes performance
- MHEP brought enhanced capacity to providers by building an IT data system, talent pipelines for staff, and efficient data routines, and enhanced capacity to commissioners via experience in partnership working and the creation of a new baseline for expected IPS outcomes
- MHEP’s incentives on providers were more muted than expected
Methodology and limitations
The primary research questions for the 3 MHEP evaluation reports are:
- Did the MHEP SOPs make a difference to the social outcomes achieved, compared with alternative commissioning approaches?
- Through which mechanisms did specific aspects of the MHEP SOP arrangement contribute to these impacts?
However, due to data limitations it was not possible to make a direct comparison between MHEP (IPS delivered through a SOP) and similar IPS services delivered through traditional contracting. Evidence from the evaluation found that the intensity of the application of the SOP model varied significantly over time and between sites. In lieu of comparator data, the analysis therefore tested the intensity of the SOP approach within MHEP and its effect on: 1) outcomes, 2) perceived costs and 3) incentives (and performance management).
The intensity of the MHEP SOP approach has been investigated by means of:
- quantitative analysis through a dose-response analysis
- comparison of actual SOP costs with forecasts
- comparative analysis based on in-depth interviews and a survey
Dose-response analysis
A dose-response analysis examines how different levels of exposure to a policy, programme or intervention affect outcomes. This analysis had 4 main elements:
- logistic regression to assess the effect of performance incentive on job start achievement
- survival analysis to explore the time to job start achievement for different performance incentives
- logistic regression to assess the effect of performance management on job start achievement
- a two-proportion z-test comparing site association with job start achievement
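The last of these elements, the two-proportion z-test, can be sketched in a few lines. The sketch below is purely illustrative: it pairs the Enfield and Tower Hamlets Mental Health engagement and job start counts from Table 1 of this summary as an example comparison, and is not the evaluation's actual site analysis.

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: do two sites differ in their job start rates?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)  # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Illustrative pairing from Table 1: Enfield 88 job starts from 254 engagements
# versus Tower Hamlets Mental Health 358 from 1,171.
z, p = two_proportion_z(88, 254, 358, 1171)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these counts the observed difference (roughly 35% versus 31%) is not statistically significant at conventional thresholds, which is why site-level association was tested alongside, rather than instead of, the individual-level regressions.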
A logistic regression is a statistical technique used to predict the probability of a binary (dependent) variable, such as being in or out of work, from a range of independent variables.
A survival analysis is a statistical method used to predict the timing of an event, such as how long people remain unemployed.
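The mechanics of the first technique can be illustrated with a minimal, self-contained sketch. Everything below is synthetic: the data, model and coefficients are not the evaluation's, and serve only to show how a single-predictor logistic regression is fitted and how its slope is read as an odds ratio.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Fit a one-predictor logistic regression by batch gradient descent."""
    b0, b1 = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(b0 + b1 * x)  # predicted probability of the outcome
            g0 += (p - y)
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Synthetic illustration: incentive in £1,000s vs. a binary job-start outcome,
# generated with a true slope of 0.5 on the log-odds scale.
random.seed(1)
xs = [random.uniform(0, 4) for _ in range(400)]
ys = [1 if random.random() < sigmoid(-1.0 + 0.5 * x) else 0 for x in xs]

b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)  # multiplicative change in the odds per £1,000
print(f"intercept={b0:.2f}, slope={b1:.2f}, odds ratio per £1,000={odds_ratio:.2f}")
```

The fitted slope sits on the log-odds scale; exponentiating it gives the multiplicative change in the odds of a job start per unit of the predictor, which is how effects of the kind reported later in this summary are usually expressed.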
It is worth noting some limitations to this report. The lack of direct comparator data from non-SOP sites made it impossible to answer the original research question quantitatively; instead, the evaluation relies heavily on qualitative methods and a dose-response analysis. Other limitations included sample attrition over the course of the study (e.g., as a result of provider staff turnover), resulting in an interview sample mostly of Social Finance staff. The fact that only 4 SOP sites were included in the dose analysis limits project-level conclusions, though a large service-user dataset strengthens analysis at the individual level. Finally, the report relies on provider-supplied data (with potential unobserved biases) about service delivery, which was complicated by the impact of COVID-19 on service delivery and the difficulty of isolating the ‘SOP effect’ from the broader influence of MHEP and IPS.
Were the MHEP SOPs effective in achieving their outcomes targets?
Achievement is captured by 3 main metrics:
- success rates (the percentage of outcomes achieved by the end of the project compared to the best-case scenario)
- conversion rates (the rate of progress from one outcome to the next, e.g., job start to job sustainment)
- IPS fidelity score comparisons (this score measures the quality of IPS service delivery)
This section only includes MHEP SOPs supporting service users with mental health disorders. An analysis of the performance of MHEP Tower Hamlets Learning Disabilities (which is not directly comparable to the other projects) can be found in Appendix C of the full report.
The mean success rates across the 4 MHEP mental health SOPs were:
- 68% for ‘Engagement with IPS service’ (the service user attends at least 3 appointments with an IPS employment specialist and a vocational profile is completed)
- 55% for ‘Job start: individual gains competitive employment’ (a service user gets a job)
- 30% for ‘Individual sustains job for less than 16 hours per week for 13 weeks’
- 55% for ‘Individual sustains job for more than 16 hours per week for 13 weeks’
Table 1. Performance across MHEP mental health projects compared to best-case scenario.

| Site | Metric | Engagement | Job start | Job sustainment (<16 hours) | Job sustainment (>16 hours) |
|---|---|---|---|---|---|
| Enfield | Recorded outcomes | 254 | 88 | 16 | 42 |
| Enfield | % Success rate | 47% | 49% | 29% | 76% |
| Haringey and Barnet | Recorded outcomes | 660 | 212 | 43 | 60 |
| Haringey and Barnet | % Success rate | 83% | 56% | 38% | 65% |
| Shropshire | Recorded outcomes | 439 | 148 | 16 | 51 |
| Shropshire | % Success rate | 105% | 75% | 24% | 91% |
| Tower Hamlets Mental Health | Recorded outcomes | 1,171 | 358 | 68 | 126 |
| Tower Hamlets Mental Health | % Success rate | 60% | 50% | 27% | 42% |
| All projects | Recorded outcomes (total) | 2,524 | 806 | 143 | 279 |
| All projects | % Success rate (mean) | 68% | 55% | 30% | 55% |
The outcomes conversion rate is the rate at which one type of outcome transitions into the next successive outcome in a causal chain, e.g., engagement to job start. Figure 2 depicts the overall conversion rates across the sites. The average conversion rate for engagement to job start (alternatively known as the “job outcome rate”) was 33% for all MHEP mental health projects ranging from 31% in Tower Hamlets to 35% in Enfield.
Figure 2. Outcomes conversion rates across MHEP mental health projects.

| Site | Proportion of engaged participants securing a job start |
|---|---|
| Enfield | 35% |
| Shropshire | 34% |
| Haringey and Barnet | 32% |
| Tower Hamlets | 31% |

| Site | Proportion of job starts achieving job sustainment |
|---|---|
| Enfield | 66% |
| Shropshire | 49% |
| Haringey and Barnet | 45% |
| Tower Hamlets | 54% |
The job outcome rate is broadly in line with the IPS literature and NHS expectations. Two systematic reviews find employment rates above 40% for IPS programmes targeting severe mental illness.[footnote 5] Given the COVID restrictions and substantial disruption to the job market, a direct comparison with trial IPS interventions, which did not operate under such restrictions, is not possible. More recently, NHS England (2023) suggests that a new IPS service should support a minimum of 30%–40% of clients into employment or help them retain their existing employment.[footnote 6]
IPS fidelity scores are measurements of the service delivery quality. The higher the fidelity score (out of 125 points), the greater the quality of the IPS service and the more closely delivery adheres to the IPS model.
MHEP SOPs’ service providers achieved the following fidelity scores:
- 82/125 (66%) in Haringey (2023)
- 97/125 (78%) in Barnet (2023)
- 100/125 (80%) in Enfield (2022)
- 110/125 (88%) in Shropshire (2023)
- 101/125 (81%) in Tower Hamlets Mental Health (2023)
The latest literature reports a mean UK fidelity score of 102 on the 25-item IPS scale.[footnote 7]
Did the SOP have higher costs than expected and if so, why?
This section explores the costs associated with the MHEP SOP across all 5 sites. SOPs differ from standard contracting and grants processes because they have more intensive setup, monitoring and evaluation costs. These are ‘transaction costs’: the costs incurred in delivering the service beyond the cost of the service itself.
MHEP SOPs’ “monetised” costs were estimated to be £6,160,973, inclusive of costs for investment, management, evaluation and learning, and delivery as defined above. Investment, delivery, and evaluation and learning costs were lower than expected, but management costs were 25.8% higher than expected.
Table 2. Project costs, and difference to original baseline expectations
A number of factors explain the differences in costs between sites and from initial expectations, including:
- Higher performance support demands: 2 sites (Shropshire and Enfield) were subject to a formal performance improvement plan.
- Lower investment costs: investment costs were 32.9% lower than expected due to earlier than predicted loan repayments and a resulting reduction in interest costs.
- COVID-19: initial revenue was lost due to the suspension of outcomes payments during April–October 2020 and job market disruption.
- Forecasting uncertainty: the preliminary forecasts did not account for differences between the severe mental illness and learning disability cohorts.
- Delayed start of delivery: Tower Hamlets delivery was delayed by 3 months due to delays in contract negotiations.
- Annual cap challenges: the rigidity of annual caps (maximum payment levels) made it difficult for projects to recoup revenue across periods of fluctuating performance.
Big Issue Invest (BII), the social investor for the MHEP SOPs, invested the same amount as intended for all MHEP projects, apart from Tower Hamlets Learning Disabilities, where delays to launch led to a lower investment. The return on investment varied significantly between projects, from -6.8% for Tower Hamlets Mental Health to 25.5% in Haringey and Barnet. The agreements with BII allowed MHEP to pool gains and losses across projects (including MHEP projects not supported by the LCF), creating a risk-pooling effect at the portfolio level. Overall, the social investor realised a gain in line with expectations.
Table 3. Investment costs and returns

| | Enfield | Haringey & Barnet | Shropshire | Tower Hamlets (MH) | Tower Hamlets (LD) | Total |
|---|---|---|---|---|---|---|
| Forecast loan from investor | £126,000 | £227,000 | £204,000 | £300,000 | £414,000 | £1,271,000 |
| Actual loan from investor | £126,000 | £227,000 | £204,000 | £300,000 | £328,000 | £1,185,000 |
| Investment cost | £55,306 | £201,208 | £60,995 | £113,559 | £76,293 | £507,361 |
| Forecast ROI | 8.30% | 18.50% | 9.80% | 18.10% | 11.49% | |
| Actual ROI | -1.30% | 25.51% | 8.80% | -6.80% | 7.40% | |
Transaction costs are defined as the expenses incurred when buying or selling a service in addition to the cost of the service itself. In a SOP, these include the additional monitoring, searching, negotiating and enforcement activities required on top of a traditional IPS service delivery contract.
The key factors that reduced transaction costs across the MHEP SOPs compared to traditional IPS were:
- the efficiency of the outcomes monitoring process, which was streamlined and in part automated
- the standardisation of processes using a special purpose vehicle, which allowed the pooling of resources, smoothed cash flows and facilitated the inclusion of additional commissioners or geographical areas
- the standardisation of contracts across sites, which provided a clear understanding of responsibilities across all projects, allowing teams to focus on execution rather than reinterpreting terms
- the SOP readiness of commissioners and providers
The key factor that increased the transaction costs of the MHEP SOPs compared to traditional IPS was their perceived complexity, particularly in terms of new contractual clauses and new stakeholders.
Overall, transaction costs were high during SOP setup, at completion, and during periods of uncertainty over underperformance.
Was the SOPs’ outcomes achievement (the SOP effect) related to the intensity of the performance management or performance incentive?
The dose-response analysis (see methodology section above and in the full report) uses 2 components of the SOP “dose” (i.e., the intensity with which the SOP model is applied), namely:
- Performance management: this variable captures the length of time (in months) between the site’s SOP launch (under CBO or LCF) and the date of a service user’s referral. It is a proxy for a site’s exposure to the intense, data-led performance management routines of IPS and mechanisms of a SOP.
- Performance incentive: this variable captures the price of outcomes (engagement and job outcomes) across the MHEP sites when the service user was recruited to the project. The outcomes price varied by site due to contract revisions made through the projects’ lifespans.
Regression analysis
A regression analysis was conducted to assess the association between job start outcomes (the ‘dependent variable’) and the performance incentive or performance management variable (the ‘independent variable’), while holding constant variables such as delivery site, local economic conditions and national unemployment rates.
The regression model for performance incentives suggests a positive, statistically significant relationship: for every £1,000 increase in incentives, the probability of a job start outcome increases by 21%. This is approximately equivalent to a 2.1% increase in the probability of a job start outcome for every £100 increase in incentives (logistic regression output tables can be found in the full report).
The regression model for performance management did not provide clear results because the relationship between performance management and job starts was not consistent in one direction, and the regression is not reliable for variables with this kind of non-monotonic relationship. The relationship between time and dose appears more complex. For instance, the data suggests a potential wind-down effect, with service users referred later in the programme having worse than predicted job start outcomes, possibly because the project was approaching closure.
Results of the regression analysis should be treated with caution, however. Data limitations mean the models do not comprehensively capture all the factors that could influence employment outcomes (for more details on the methodology and data limitations please see Appendix A of the full report).
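The reported effect sizes can be reconstructed from a log-odds coefficient. The coefficient below is hypothetical, back-derived from the reported 21% figure rather than taken from the evaluation's output tables; the sketch only shows why rescaling a logistic effect from £1,000 down to £100 is an approximation rather than an exact equivalence.

```python
import math

# Hypothetical log-odds coefficient per £1,000 of incentive, chosen so that
# exp(beta) = 1.21, i.e. a 21% increase in the odds of a job start.
beta_per_1000 = math.log(1.21)

odds_ratio_1000 = math.exp(beta_per_1000)           # 1.21 per £1,000
odds_ratio_100 = math.exp(beta_per_1000 / 10)       # exact per-£100 effect
linear_approx_100 = 1 + (odds_ratio_1000 - 1) / 10  # the "2.1% per £100" reading

print(f"per £1,000:        {odds_ratio_1000:.3f}")
print(f"per £100 (exact):  {odds_ratio_100:.4f}")
print(f"per £100 (approx): {linear_approx_100:.4f}")
```

The exact per-£100 effect (about 1.9%) and the linear reading (2.1%) diverge only slightly at this coefficient size, which is why the simpler linear statement is a reasonable summary here.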
Survival analysis
To provide a more nuanced understanding of the payment incentive dose, survival analysis was used to assess the relationship between incentives and length of time between referral and job start outcomes. Survival analysis is a field of statistical tools used to assess the time until an event occurs. The analysis indicates that a £1,000 increase in payment incentive dose is associated with approximately a 17% increase in speed of job start outcomes. Figure 3 shows the difference in duration between referral and job outcomes between low (<£2,000 for job starts) and high (>£2,000) incentive sites/periods (on the left) and by £1,000 increments of performance incentive (on the right).
Figure 3. Length of time to achieve job outcomes by performance incentives
How were different actors incentivised for performance?
MHEP had financial and non-financial incentives within its SOP structure. Those at risk of gaining or losing payments based on outcomes performance were the intermediary (Social Finance via the MHEP SPV), the investor and, to a limited degree, providers. The intermediary was paid 100% on outcome achievement, while investors would start to recover the capital provided (and interest) only once the outcome payments covered the operational costs of the contract. Providers were mostly paid through block payments that shielded them from a significant portion of the risk, but in some cases they were paid in part on outcome achievement (up to 30% of the contract value in the case of Enfield). Local commissioners and the LCF were both outcome funders who paid according to outcomes achievement, reducing the financial risk of underperformance. Nevertheless, all stakeholders in the SOP had varying non-financial incentives for the MHEP SOPs to be a success.
The qualitative data in the evaluation suggest that the most effective financial incentive in MHEP SOPs sat with the performance management team in Social Finance, the intermediary. This incentive is multifaceted and derived from several sources:
- outcomes payments incentive
- investor scrutiny (oversight and pressure)
- role as a learning tool/system leader
- aligning stakeholders and relational trust
- impact-driven mission/reputational success
The analysis shows that several conditions needed to be met for the incentives on the intermediary to be effective, including: a clear role separation between the social impact investor and the MHEP performance managers, transparency, a governance board that was capable of holding people accountable, a robust reporting and contractual framework, joint goal alignment with commissioners, and centralised oversight.
The incentive structure of the SOP could, however, be weakened by time constraints and resource limitations, outcomes variability in local pathways, fragmentation/complexity of contracts and transaction cost trade-offs.
‘Outcomes payment puts pressure on the investors. They put pressure on us as performance management. And we have to find a way to filter that into good performance [by the provider]. I think that DOES work.’
MHEP
‘It was MHEP manager at the time, they did a good job kind of putting this to us, in a very nice way, but in quite an assertive way that, “I’m in the middle now, I’ve got people above me who say that we’ve got to start turning things around, I’m here to support, but we are going put performance management in place.”’
Provider
Policy recommendations
1. Recognise the significant relationship between financial incentives and outcomes achievement.
The research found that for every £1,000 increase in incentives, the likelihood of outcomes achievement increased by about 20%. The reverse also applies. However, this finding does not simply mean that greater financial input automatically results in more outcomes from service providers. Although the effect was statistically significant for the MHEP SOPs, the analysis also indicated that more data is needed before it can be used to make predictions.
The qualitative analysis revealed that the important factor was the intermediary, which translated the hard incentives (more outcome payments) into softer incentives for service providers, such as more relational and operational support. Thus, the incentives work less through hard-edged financial pressure on service providers and more through the motivation and accountability coming from the intermediary. The effectiveness of these incentives at the intermediary level relied on clear role separation between the investor and the performance managers; transparency; a governance board capable of holding people accountable; and a robust reporting and contractual framework.
Recommendation: Design financial incentives with a clear understanding of how different actors respond: intermediaries are more likely to be influenced by financial incentives, while service providers may be more motivated by relational or mission-driven factors.
2. Anticipate setup and wind-down effort as a foundation for adaptive delivery.
The MHEP SOPs required appreciably more time and effort during setup and completion phases than initially anticipated due to complex negotiations, outcomes modelling, and end-of-grant reconciliation. However, these investments laid the groundwork for more adaptive, data-driven delivery during the contract. Stakeholders noted that while the transaction costs were high up front, they enabled robust structures, trust and shared understanding, which ultimately supported better performance management and problem-solving throughout delivery.
Recommendation: Build in adequate time and resourcing for SOP setup and closure phases, recognising them as critical foundations for continuous improvement and collaborative service delivery.
3. Enable responsive problem-solving via continuous monitoring, frequent engagement, and bespoke data analytics.
A key strength of the MHEP SOP model was its ability to enable responsive problem-solving through continuous monitoring, frequent engagement and bespoke data analytics. The intermediary played a central role in identifying underperformance early and working closely with providers and commissioners to adapt strategies in real time. Regular data reviews, site visits and tailored performance improvement plans allowed challenges – such as staffing gaps, referral delays or outcome dips – to be addressed proactively. This dynamic, data-informed approach contrasted with more static traditional contracts and was widely credited by stakeholders with improving both service quality and outcomes achievement across the SOPs.
Recommendation: Programmes should incorporate continuous monitoring, frequent engagement and tailored data analytics to enable responsive problem-solving and improve service outcomes.
Acknowledgements
The authors gratefully acknowledge the support of colleagues at the Department for Culture, Media and Sport. The authors also acknowledge the National Lottery Community Fund, which administered the Life Chances Fund. We extend thanks to Social Finance for their support in compiling relevant MHEP documentation required for our research activities over the multi-year evaluation period. We are very grateful to Isaac Grennan for supporting the quantitative analysis. We are particularly grateful to the staff at each MHEP delivery site who participated in interviews and shared relevant data. The research would not have been possible without the participation of all MHEP delivery sites.
Contribution statement
- Emily Hulse authored this report. She contributed to the research design, developed the conceptual report design and led the data collection and analysis. Emily reviewed and co-edited the report.
- Eve Grennan co-authored this report. She contributed to data analysis and drafting of the report.
- Dr Mara Airoldi co-authored this report. She reviewed and co-edited the report. Mara also provided final quality assurance for the report.
- Dr Eleanor Carter co-authored this report. She designed the overall research strategy and supported data collection and analysis. Eleanor is responsible for overall research quality.
- Jessica Reedy authored the Executive Summary and co-edited the report. She also supported final reviews, formatting, and quality assurance processes.
- Michael Gibson co-edited the report.
Footnotes

1. You can learn more about Social Outcomes Partnerships and the Life Chances Fund on the SOPs page on GOV.UK or on the Government Outcomes Lab website. ↩
2. See the INDIGO Outcomes Fund Directory for more information on these funds. ↩
3. Mental Health and Employment Partnership Evaluation: First interim report, 2023. ↩
4. Mental Health and Employment Partnership Evaluation: Second interim report, 2024. ↩
5. Richter, D., & Hoffmann, H. (2019). Effectiveness of supported employment in nontrial routine implementation: systematic review and meta-analysis. Social Psychiatry and Psychiatric Epidemiology, 54(5), 525-531; Bond, G. R., Drake, R. E., & Becker, D. R. (2012). Generalizability of the Individual Placement and Support (IPS) model of supported employment outside the US. World Psychiatry, 11(1), 32-39. ↩
6. NHS England (2023). Individual placement and support for severe mental illness. NHS England. ↩
7. Waghorn, G., van Veggel, R., Chant, D., & Lockett, H. (2018). The utility of item level fidelity scores for developing evidence based practices in supported employment. Journal of Vocational Rehabilitation, 48(3), 387-391. doi:10.3233/JVR-180946 ↩