
<jats:sec id="abs1-1"><jats:title>Background</jats:title><jats:p>Many of the problems confronting policy- and decision-makers, evaluators and researchers today are complex, as are the interventions designed to tackle them. Their success depends both on individuals’ responses and on the wider context of people’s lives. Realist evaluation tries to make sense of these complex interventions. It is a form of theory-driven evaluation, based on realist philosophy, that aims to understand why these complex interventions work, how, for whom, in what context and to what extent.</jats:p></jats:sec><jats:sec id="abs1-2"><jats:title>Objectives</jats:title><jats:p>Our objectives were to develop (a) quality standards, (b) reporting standards, (c) resources and training materials and (d) information and resources for patients and other lay participants, and (e) to build research capacity among those interested in realist evaluation.</jats:p></jats:sec><jats:sec id="abs1-3"><jats:title>Methods</jats:title><jats:p>To develop the quality and reporting standards, we undertook a thematic review of the literature, supplemented by our content expertise and feedback from presentations and workshops. We synthesised the findings into briefing materials on realist evaluation for the Delphi panel (a structured method that uses experts to develop consensus). To develop our resources and training materials, we drew on our experience in developing and delivering education materials, feedback from the Delphi panel, the RAMESES JISCMail e-mail list, training workshops and feedback from training sessions. To develop information and resources for patients and other lay participants in realist evaluation, we convened a group consisting of patients and members of the public.
We built research capacity by running workshops and training sessions.</jats:p></jats:sec><jats:sec id="abs1-4"><jats:title>Results</jats:title><jats:p>Our literature review identified 152 realist evaluations; once 37 of these had been analysed, we were able to develop our briefing materials for the Delphi panel. The Delphi panel comprised 35 members from 27 organisations across six countries and five disciplines. Within three rounds, the panel had reached consensus on 20 key reporting standards. The quality standards consist of eight criteria for realist evaluations. We developed resources and training materials for 15 theoretical and methodological topics. All resources are available online (<jats:uri xmlns:xlink="http://www.w3.org/1999/xlink" xlink:href="http://www.ramesesproject.org" xlink:role="web">www.ramesesproject.org</jats:uri>). We provided methodological support to 17 projects, and delivered presentations or workshops to 29 organisations to help build research capacity in realist evaluation. Finally, we produced a generic patient information leaflet for lay participants in realist evaluations.</jats:p></jats:sec><jats:sec id="abs1-5"><jats:title>Limitations</jats:title><jats:p>Our project had ambitious goals that created a substantial workload, leading to the need to prioritise objectives. For example, we truncated the literature review and focused on standards and training material development.</jats:p></jats:sec><jats:sec id="abs1-6"><jats:title>Conclusions</jats:title><jats:p>Although realist evaluation holds much promise, misunderstandings and misapplications of it are common. We hope that our project’s outputs and activities will help to address these problems. Our resources are the start of an iterative journey of refinement and development of better resources for realist evaluations. The RAMESES II project seeks not to produce the last word on these issues, but to capture current expertise and establish an agreed state of the science.
Much methodological development is still needed in realist evaluation, but this can take place only if there is a sufficient pool of highly skilled realist evaluators. Capacity building is the next key step in realist evaluation.</jats:p></jats:sec><jats:sec id="abs1-7"><jats:title>Funding</jats:title><jats:p>The National Institute for Health Research Health Services and Delivery Research programme.</jats:p></jats:sec>

Original publication

DOI: 10.3310/hsdr05280

Type: Journal article

Journal: Health Services and Delivery Research

Publisher: National Institute for Health Research

Publication Date: 10/2017

Volume: 5

Pages: 1–108