Researching Complex Interventions in Health: The State of the Art
Related papers
Research made simple: developing complex interventions
Evidence Based Nursing, 2020
Table 1. Framework for developing a complex intervention [8], applied to EoLC interventions and MORECare guidelines [11]

Activity: Planning
Considerations:
• Identify the problem, why a change is required, its importance, and what the benefits of a new intervention would be.
• Identify current interventions: what does and does not work.
• Identify the resources required, both in terms of time (be realistic) and funding.
• Understand the context (population and individuals; location or geographical setting; social, economic, cultural and political influences) where the intervention will be implemented.
• Understand the facilitators of, and barriers to, reaching the relevant population.
Implementation Science, 2014
Background: Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in 'real world' organisational settings, where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies.

Methods: We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed.

Discussion: This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. Using 'insider' consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants' willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.
BMJ
The UK Medical Research Council's widely used guidance for developing and evaluating complex interventions has been replaced by a new framework, commissioned jointly by the Medical Research Council and the National Institute for Health Research, which takes account of recent developments in theory and methods and the need to maximise the efficiency, use, and impact of research. Complex interventions are commonly used in health and social care services, public health practice, and other areas of social and economic policy that have consequences for health. Such interventions are delivered and evaluated at different levels, from the individual to the societal. Examples include a new surgical procedure, the redesign of a healthcare programme, and a change in welfare policy.

The UK Medical Research Council (MRC) published a framework for researchers and research funders on developing and evaluating complex interventions in 2000 and revised guidance in 2006. 1-3 Although these documents continue to be widely used and are now accompanied by a range of more detailed guidance on specific aspects of the research process, 4-8 several important conceptual, methodological and theoretical developments have taken place since 2006. These developments have been incorporated into a new framework commissioned by the National Institute for Health Research (NIHR) and the MRC. 9 The framework aims to help researchers work with other stakeholders to identify the key questions about complex interventions, and to design and conduct research with a diversity of perspectives and an appropriate choice of methods.
International Journal of Nursing Studies, 2018
Background: In recent years there has been much emphasis on 'research waste' caused by poor question selection, insufficient attention to previous research results, and avoidable weaknesses in research design, conduct and analysis. Little attention has been paid to the effect of inadequate development of interventions before proceeding to a full clinical trial.

Objective: We therefore propose to enrich the development phase of the MRC Framework by adding crucial elements to improve the likelihood of success and enhance the fit with clinical practice.

Methods: Based on existing intervention development guidance and synthesis, a comprehensive iterative intervention development approach is proposed. Examples from published reports are presented to illustrate the methodology that can be applied within each element to enhance the intervention design.

Results: A comprehensive iterative approach is presented by combining the elements of the MRC Framework development phase with essential elements from existing guidance, including: problem identification, systematic identification of evidence, identification or development of theory, determination of needs, examination of current practice and context, and modelling of the process and expected outcomes, leading to the final element: the intervention design. All elements are drawn from existing models to give intervention developers a greater chance of producing an intervention that is well adopted, effective and fitted to the context.

Conclusion: This comprehensive approach to developing interventions will strengthen internal and external validity, minimize research waste and add value to health care research. In complex interventions in health care research, flaws in the development process immediately affect the chances of success. Knowledge of the causal mechanisms and interactions within the intended clinical context is needed to develop interventions that fit daily practice and benefit the end-user.

What is already known about the topic?
• In efforts to prevent research waste, little attention has been paid to the effect of inadequate development of interventions before proceeding to a full clinical trial.
• The MRC Framework for developing and evaluating complex interventions is the most cited guidance using an iterative approach.
• A misfit between the intervention and the implementation context often hampers implementation, resulting in suboptimal treatment success.
Qualitative Health Research
Qualitative methods are underutilized in health intervention evaluation and overshadowed by the importance placed on randomized controlled trials (RCTs). This Commentary describes how innovative qualitative methods are being used as part of RCTs, drawing on articles included in a special issue of Qualitative Health Research on this topic. The articles' insights, together with a review of innovative qualitative methods described in trial protocols, highlight a lack of attention to structural inequalities as a causal mechanism for understanding human behavior. We situate this gap within some well-known constraints of RCT methodologies, and discuss alternative RCT approaches that hold promise for bringing qualitative methods center stage in intervention evaluation, including adaptive designs, pragmatic trials, and realist RCTs. To address the power hierarchies of health evaluation research, however, we argue that a fundamental shift needs to take place away from a focus on RCTs and toward…
Health Technology Assessment, 2021
Background: The Medical Research Council published the second edition of its framework on developing and evaluating complex interventions in 2006. Since then, there have been considerable developments in the field of complex intervention research. The objective of this project was to update the framework in the light of these developments. The framework aims to help research teams prioritise research questions, and design and conduct research with an appropriate choice of methods, rather than to provide detailed guidance on the use of specific methods.

Methods: There were four stages to the update: (1) a gap analysis to identify developments in methods and practice since the previous framework was published; (2) an expert workshop of 36 participants to discuss the topics identified in the gap analysis; (3) an open consultation process to seek comments on a first draft of the new framework; and (4) findings from the previous stages were used to redraft the framework, followed by final expert review.
Framework for design and evaluation of complex interventions to improve health
British Medical Journal, 2000
Randomised controlled trials are widely accepted as the most reliable method of determining effectiveness, but most trials have evaluated the effects of a single intervention such as a drug. Recognition is increasing that other, non-pharmacological interventions should also be rigorously evaluated. 1-3 This paper examines the design and execution of the research required to address the additional problems resulting from evaluating complex interventions.