Decision-based models of the implementation of interventions in systems of healthcare: Implementation outcomes and intervention effectiveness in complex service environments
Authors:
Arno Parolini aff001; Wei Wu Tan aff001; Aron Shlonsky aff002
Author affiliations:
aff001: Department of Social Work, University of Melbourne, Carlton, Victoria, Australia
aff002: Department of Social Work, Monash University, Caulfield East, Victoria, Australia
Published in:
PLoS ONE 14(10)
Category:
Research Article
DOI:
https://doi.org/10.1371/journal.pone.0223129
Abstract
Implementation is crucial to the success of interventions in health service systems, as poor implementation can undermine the effectiveness of evidence-based practices. Evaluations conducted in real-world contexts should therefore consider how interventions are implemented and sustained. However, the complexity of healthcare environments poses considerable challenges to evaluating interventions and the impact of implementation efforts on the effectiveness of evidence-based practices. As a consequence, implementation and intervention effectiveness are often assessed separately in health services research, which prevents direct investigation of the relationships between implementation components and intervention effectiveness. This article describes multilevel decision juncture models, based on advances in implementation research and causal inference, for studying implementation in health service systems. The multilevel decision juncture model is a theory-driven systems approach that integrates structural causal models with implementation frameworks. This integration enables interventions and their implementation to be investigated within a single model that accounts for the causal links between levels of the system. Using a hypothetical youth mental health intervention inspired by published studies from the health services research and implementation literature, we demonstrate that such theory-based systems models enable investigation of the causal pathways between implementation outcomes as well as their links to patient outcomes. Results from Monte Carlo simulations also highlight the benefits of structural causal models for covariate selection, as consistent estimation requires the inclusion of only a minimal set of covariates. Such models are applicable to real-world contexts using different study designs, including longitudinal analyses that facilitate the investigation of intervention sustainment.
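The covariate-selection point can be illustrated with a small Monte Carlo simulation. The sketch below is a minimal toy example, not the authors' model: all variable names, structural equations, and parameter values are hypothetical. It encodes a simple structural causal model in which an organisational-level confounder influences both implementation fidelity and the patient outcome. Adjusting for the minimal covariate set containing only that confounder yields a consistent estimate of the intervention effect, while the unadjusted estimate is biased.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(n=5000, effect=2.0):
    # Hypothetical structural causal model:
    #   organisational climate Z  -> implementation fidelity A
    #   organisational climate Z  -> patient outcome Y
    #   implementation fidelity A -> patient outcome Y (true effect = 2.0)
    z = rng.normal(size=n)                           # unmeasured-unless-adjusted confounder
    a = (z + rng.normal(size=n) > 0).astype(float)   # binary fidelity indicator
    y = effect * a + 1.5 * z + rng.normal(size=n)    # patient outcome
    return z, a, y

def ols(columns, y):
    # Ordinary least squares with an intercept; returns coefficient vector.
    X = np.column_stack([np.ones(len(y))] + list(columns))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

naive, adjusted = [], []
for _ in range(200):                     # Monte Carlo replications
    z, a, y = simulate()
    naive.append(ols([a], y)[1])         # omits the confounder -> biased upward
    adjusted.append(ols([a, z], y)[1])   # minimal adjustment set {Z} -> consistent

# The naive estimate drifts well above the true effect of 2.0,
# while the adjusted estimate centres on it.
print(round(float(np.mean(naive)), 2), round(float(np.mean(adjusted)), 2))
```

Only the confounder Z needs to enter the adjustment set; conditioning on variables downstream of the treatment would, by contrast, risk blocking part of the causal pathway of interest.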
Keywords:
Mental health and psychiatry – Statistical data – Decision making – Economic models – Health services research – Psychotherapy – Complex systems