
Use your results: evaluating digital health products

How to write up and share your findings

This page is part of a collection of guidance on evaluating digital health products.

Write up your results

Once you have carried out an evaluation, finish writing up what you have done, including:

  • how you carried out the evaluation
  • what happened in the evaluation
  • what decisions you made about analysis
  • what results you found
  • what conclusions you have drawn

Structure your report

There are various reporting guidelines. The methods library (see ‘Methods library’ in Evaluating digital health products) includes reporting guidelines and examples of written-up studies for specific evaluation methods.

A standard reporting format is IMRaD: Introduction, Methods, Results, and Discussion.

1. Introduction

What was the background to the study? Why is it important? What was the research question?

2. Methods

What, when, where, who and how. What was your design? Who were your participants? Where, when and how was the study carried out?

3. Results

What data did you collect? What were the findings? Avoid discussing the implications of what you have found; just report it.

This section should match your methods section. Any part of the study you mentioned in the methods section should have a corresponding part in the results section. The results section should not introduce new parts of the design.

4. Discussion

What did the evaluation show? How do the findings fit in with what else you know? What recommendations should you make? This is where you give an interpretation of the meaning of your findings and how they add to what you or others knew about the product before you did the evaluation.

An evaluation report can have an hourglass form.

  1. Start broad: what is the general background to what you are doing and what gaps are you addressing?
  2. Get more specific: what are the implications of this for your particular product or service? What did you do, at what time, with which participants? The middle of the report is the most specific.
  3. Broaden out again: what are the general implications of the findings from your study?

Be honest

Be honest and transparent in your evaluation report. Discuss what went right, but also what went wrong. Your findings may have shown what you hoped they would, or they may have shown the opposite. The point of doing the evaluation is to find out. You may want to shape the story for different external audiences, but be open for internal purposes.

Some information may need to be kept confidential. Information which can identify participants should usually be removed. Your report could group results together (aggregated data) rather than using results from individuals. Or you could use non-identifiable individual results, such as short quotations from interviews. If you are using quotes or case studies, you may need to check participants are happy for you to do this.
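
As an illustration, here is a minimal sketch in Python of reporting grouped results rather than individual rows. It assumes the results sit in a pandas DataFrame; the column names, identifiers and scores are invented for illustration.

  import pandas as pd

  # Hypothetical individual-level results: one row per participant.
  results = pd.DataFrame({
      "participant_id": ["P01", "P02", "P03", "P04"],
      "age_group": ["18-34", "18-34", "35-54", "35-54"],
      "outcome_score": [62, 70, 55, 61],
  })

  # Drop the direct identifier and report aggregated figures per group,
  # not individual rows. Check each group is large enough that no one
  # could be identified from the summary.
  aggregated = (
      results.drop(columns=["participant_id"])
      .groupby("age_group")["outcome_score"]
      .agg(["count", "mean"])
  )
  print(aggregated)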

More guidance on reporting

For health economic studies – Consolidated Health Economic Evaluation Reporting Standards (CHEERS).

A minimum set of recommendations for reporting randomised trials – Consolidated Standards of Reporting Trials (CONSORT).

An extension of the CONSORT statement providing guidance for reporting digital health interventions – CONSORT-EHEALTH: Improving and Standardizing Evaluation Reports of Web-based and Mobile Health Interventions.

For qualitative studies – Consolidated Criteria for Reporting Qualitative Research (COREQ).

A framework for reporting new system-level knowledge about how to improve healthcare – Standards for Quality Improvement Reporting Excellence (SQUIRE).

Recommendations for reporting observational research in epidemiology, including checklists of items that should be included – Strengthening the Reporting of Observational Studies in Epidemiology (STROBE).

Guidelines for reporting non-randomised evaluations of behavioural and public health interventions – Transparent Reporting of Evaluations with Nonrandomized Designs (TREND).

General guidance on writing evaluation reports – How to write an evaluation report on the NCVO website.

A library of reporting guidelines and other resources, including many of the resources mentioned in this section – The Equator Network.

Share your results

You may want to tailor your write-up for different audiences. For example, you might write it for:

  • your own use, as a record and to focus your thoughts
  • internal and external stakeholders
  • the wider evaluation community
  • current users
  • future users
  • patients or the public
  • current or future commissioners
  • investors
  • trustees

Consider what your audience needs. What details do they care about? What language, tone and level of technical detail should you use?

Consider disseminating your results. How you do this will depend on the context and the sort of evaluation you have carried out. How widely applicable your research findings are will also vary.

When to share your results

It is important to share your findings so that the digital health community can build a more comprehensive picture of what might be effective in digital health, for what types of users and in what contexts.

If you are in the public sector, or your evaluation was funded by public money, you have a duty to share your results openly. Your evaluation will probably come under Freedom of Information rules.

If you are in the private sector, you may want to keep your research confidential. For example, your evaluation may contain commercially sensitive information. You should still share results internally and in some situations you might want to share them externally. Summative evaluations can demonstrate the value of your product or service. Regulatory processes may require you to make some evaluations public. If you received public funds for research, you may also be required to make results publicly available.

Sometimes, you might want to share evaluation results with participants or other people directly involved in the study. Make sure what you do here is consistent with what you said you would do beforehand.

Check with your collaborators what they expect. They may have requirements of their own.

How to share your results

There are several ways to share results.

You can make a report available on your own website.

There may be ways to share experiences locally, for example, through an Academic Health Science Network (AHSN). You could submit your report to AHSN Atlas, an online resource that shares examples of innovation and good practice in healthcare.

You could try different formats for communicating your results, for example:

  • one-page summaries
  • infographics
  • video outputs
  • presentations at expos, conferences and other events

You can submit your research for publication in an academic journal. There are several journals specialising in digital health and its evaluation, and there will also be journals specific to the problem area or disease you are researching. Each journal will have a webpage with details of what format to use and other guidelines. Get help from academic collaborators.

Make recommendations

Once you have done the evaluation, consider what recommendations follow on from the work. This may depend on whether you’re evaluating the product as you’re developing it, to work out how to make it better (formative evaluation), or at launch or soon after, to find out whether it achieves its aims (summative evaluation). Recommendations usually fall into 4 broad categories.

Recommendations for future evaluation studies

Evaluation is often part of an iterative process with more evaluations happening afterwards. Your evaluation may have raised new questions that should be answered later. You may have started with a simpler evaluation. If the results look promising, this might justify committing resources to a more complex evaluation.

Recommendations for how to make the product better

Particularly with formative evaluations, but also with summative evaluations, there will be implications for how to make your product better. Feed these back to the development team.

Recommendations for how the product should be implemented

Summative evaluations are often used to justify commissioning and purchasing decisions, or to obtain regulatory approval.

Recommendations for how to do evaluations differently

Your experience of carrying out an evaluation may be useful for people doing other evaluations. Evaluations in digital health are relatively new and methods are still developing. Consider sharing your experiences within a community of practice.

Say what evidence you have

Be honest about the weight of evidence. Some evaluations provide more robust evidence for conclusions than others. Some conclusions are more directly related to the evidence than others. Be clear about what evidence you have for each recommendation.

Make sure you have turned your conclusions into actionable recommendations. You could use the SMART criteria – recommendations should be:

  • specific
  • measurable
  • achievable
  • relevant
  • time-bound

Consider making separate recommendations for different stakeholders. You may need to present your findings and recommendations differently. The Health Foundation website has a toolkit, ‘Communicating your research’. It includes templates and guidance on how to plan communication of your findings to different audiences.

Published 30 January 2020