

Implementation Science Supports Core Clinical Competencies: An Overview and Clinical Example

JoAnn E. Kirchner, MD; Eva N. Woodward, PhD; Jeffrey L. Smith, BS; Geoffrey M. Curran, PhD; Amy M. Kilbourne, PhD, MPH; Richard R. Owen, MD; and Mark S. Bauer, MD

Published: December 8, 2016


ABSTRACT

Objective: Instead of asking clinicians to work faster or longer to improve quality of care, implementation science provides another option. Implementation science is an emerging interdisciplinary field dedicated to studying how evidence-based practice can be adopted into routine clinical care. This article summarizes principles and methods of implementation science, illustrates how they can be applied in a routine clinical setting, and highlights their importance to practicing clinicians as well as clinical trainees.

Method: A hypothetical clinical case scenario is presented that explains how implementation science improves clinical practice. The case scenario is also embedded within a real-world implementation study to improve metabolic monitoring for individuals prescribed antipsychotics.

Results: Context, recipient, and innovation (ie, the evidence-based practice) factors affected improvement of metabolic monitoring. To address these factors, an external facilitator and a local quality improvement team developed an implementation plan involving a multicomponent implementation strategy that included education, performance reports, and clinician follow-up. The clinic remained compliant with recommended metabolic monitoring at 1-year follow-up.

Conclusions: Implementation science improves clinical practice by identifying context, recipient, and innovation factors and using this information to develop and deploy specific implementation strategies. It also enriches clinical training, aligning with core competencies defined by the Accreditation Council for Graduate Medical Education and the American Board of Medical Specialties. By learning how to change clinical practice through implementation strategies, clinicians are better able to adapt within complex systems of practice.

Prim Care Companion CNS Disord 2016;18(6):doi:10.4088/PCC.16m02004

a. VA Quality Enhancement Research Initiative (QUERI) for Team-Based Behavioral Health, Central Arkansas Veterans Healthcare System, North Little Rock, Arkansas

b. Department of Psychiatry, University of Arkansas for Medical Sciences, Little Rock, Arkansas

c. South Central Mental Illness Research Education and Clinical Centers (MIRECC), Central Arkansas Veterans Healthcare System, North Little Rock, Arkansas

d. Center for Mental Healthcare & Outcomes Research, Central Arkansas Veterans Healthcare System, North Little Rock, Arkansas

e. Center for Implementation Research, University of Arkansas for Medical Sciences, Little Rock, Arkansas

f. Department of Pharmacy Practice and Psychiatry, University of Arkansas for Medical Sciences, Little Rock, Arkansas

g. Quality Enhancement Research Initiative (QUERI), US Department of Veterans Affairs, Ann Arbor, Michigan

h. Department of Psychiatry, University of Michigan Medical School, Ann Arbor, Michigan

i. Department of Epidemiology, College of Public Health, University of Arkansas for Medical Sciences, Little Rock, Arkansas

j. Department of Psychiatry, Harvard Medical School, Cambridge, Massachusetts

k. Center for Healthcare Organization and Implementation Research (CHOIR), US Department of Veterans Affairs, Boston, Massachusetts

*Corresponding author: JoAnn E. Kirchner, MD, Central Arkansas Veterans Healthcare System, QUERI for Team-Based Behavioral Healthcare, 2200 Fort Roots Drive, Bldg 58, 16MIR/NLR, North Little Rock, AR 72114 ([email protected]).

Instead of asking clinicians to work faster or longer to improve quality of care, implementation science provides another option. Implementation science is an emerging interdisciplinary field dedicated to studying how evidence-based practice (EBP) can be adopted into routine clinical care. Here, we provide a hypothetical clinical case scenario that explains how implementation science improves clinical practice. The case scenario is also embedded within a real-world implementation study to improve metabolic monitoring for individuals prescribed antipsychotics.

CLINICAL CASE SCENARIO

Dr K is a psychiatrist practicing within a large integrated health care system that provides primary and specialty care. She prides herself on being current with recommended standards of care and EBP. For patients started on a new antipsychotic medication, clinical practice guidelines recommend baseline and follow-up monitoring for metabolic side effects such as weight gain, diabetes, and dyslipidemia, which are associated with these medications. To address this important patient safety concern, leadership of Dr K’s health care system established performance standards requiring that, for patients started on new antipsychotics, baseline monitoring be completed for weight, glucose or hemoglobin A1c, and lipids. The performance standards specified minimum compliance as 90% for weight monitoring, 80% for glucose/hemoglobin A1c monitoring, and 60% for lipid monitoring, assessed on a monthly basis. Although the performance standards have been in place for over a year, Dr K’s clinic has yet to meet the minimum thresholds for compliance in any monthly report.
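
To make the performance standards concrete, the sketch below computes whether a clinic met each monthly threshold. It is illustrative only: the threshold values come from the scenario above, but the record structure, field names, and example numbers are hypothetical.

```python
# Minimal sketch, assuming hypothetical record fields: did this clinic
# meet each monthly performance threshold for baseline monitoring?
from dataclasses import dataclass

# Minimum compliance thresholds stated in the performance standards.
THRESHOLDS = {"weight": 0.90, "glucose_or_a1c": 0.80, "lipids": 0.60}

@dataclass
class PatientMonth:
    """Baseline monitoring completed for one patient newly started on an antipsychotic."""
    weight: bool
    glucose_or_a1c: bool
    lipids: bool

def monthly_compliance(patients: list[PatientMonth]) -> dict[str, bool]:
    """Per measure, True if the clinic's monitoring rate met its minimum."""
    if not patients:
        return {measure: False for measure in THRESHOLDS}
    results = {}
    for measure, minimum in THRESHOLDS.items():
        rate = sum(getattr(p, measure) for p in patients) / len(patients)
        results[measure] = rate >= minimum
    return results

# Example month: 10 new starts; all weighed, 8 with glucose/A1c, 5 with lipids.
cohort = [PatientMonth(True, i < 8, i < 5) for i in range(10)]
print(monthly_compliance(cohort))
# {'weight': True, 'glucose_or_a1c': True, 'lipids': False}  (50% < 60%)
```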

Mr A is a 33-year-old single man with a 10-year history of schizophrenia who presents as a new patient with no prior medical records. He reports that he has done well in the past when treated with olanzapine but has not been taking any medication for 6 months. He exhibits mild psychotic symptoms, including occasional noncommand auditory hallucinations, confused thinking, and social isolation. After a thorough evaluation, Dr K confirms the diagnosis of schizophrenia, finds no history of diabetes, restarts Mr A on olanzapine 15 mg daily, and asks him to see the receptionist to be weighed and to visit the laboratory for testing before he leaves. A return appointment is made for 4 weeks later.

Mr A does not show up on the day of his scheduled return appointment; he is contacted and rescheduled for 2 weeks later. When Mr A returns, he displays no confusion and reports almost complete resolution of the auditory hallucinations but continued social isolation. He states that he left immediately after his first appointment and did not get weighed or go to the laboratory. By this appointment, Dr K has been briefed at a staff meeting on the full clinical recommendations for metabolic monitoring. She continues olanzapine at the current dose, directs Mr A to the receptionist for a weigh-in, and asks the receptionist to send him to the laboratory. The receptionist weighs Mr A as instructed but, not being accustomed to sending patients to the laboratory, only hands him written directions for finding it.

Four weeks later, Dr K receives a phone call from an intensive care nurse informing her that Mr A presented to the emergency room in diabetic ketoacidosis with a glucose level of 600 mg/dL and is hospitalized in intensive care. Reviewing his medical record, Dr K finds a weight from his prior appointment but no laboratory values. Mr A’s emergency room weight reflects a 10-lb increase since he was last seen. Dr K is frustrated and demoralized.

IMPLEMENTATION SCIENCE

Dr K’s experience is common in clinical care. Despite knowledge that EBPs can improve patient outcomes, many clinicians struggle to implement EBPs in routine clinical care. In fact, some researchers estimate a 17-year gap between the time a treatment is proven effective and the time it is provided routinely to patients.1 A growing body of evidence-based implementation strategies can move EBPs into clinical practice more rapidly, effectively, and sustainably.2 The National Institutes of Health defines implementation as the “use of strategies to adopt and integrate evidence-based health interventions and change practice patterns within specific systems.”3 The National Academy of Medicine4 has incorporated ongoing self-assessment and improvement of clinical practices into its conceptualization of a Learning Healthcare System as the ideal for clinical health care delivery.

The importance of assessing and improving one’s practice has also long been recognized by the Accreditation Council for Graduate Medical Education (ACGME) in conjunction with the American Board of Medical Specialties. The 6 ACGME core competencies include practice-based learning and improvement and systems-based practice,5 both central components of implementation and heavily represented in the new ACGME Psychiatry Milestone assessment process.6 In addition, support for implementing EBP in routine clinical care is rapidly gaining momentum through major professional organization activities. For example, through an initiative funded by the Centers for Medicare and Medicaid Services, the American Psychiatric Association will train more than 3,500 psychiatrists over the next 4 years to implement the collaborative care model in primary care. This initiative will apply many of the strategies evaluated in implementation science studies, such as peer-based learning networks and implementation technical support.7

Thus, familiarity with implementation strategies supports clinicians in adopting EBPs in practice8 and also enriches the development of trainees. The following is a broad narrative review of implementation science, with a focus on clinicians and use of the clinical case scenario to highlight the rationale for clinical application of implementation science.

What Is Implementation Science?

Implementation science is an emerging interdisciplinary field dedicated to studying how research findings or EBPs are best adopted into routine care, with the goal of improving the uptake of effective clinical innovations (ie, the EBPs), especially ones that rely on more than a single provider or team of providers. Effective innovations that have benefitted from implementation science range from genomics to cancer control to mental health treatments and include the collaborative care model used to integrate mental health services into primary care settings, psychotherapies for posttraumatic stress disorder, and operationalization of protocols for using telemedicine across rural practices.9–11 Just as there is a science that supports rigorous testing of new clinical interventions, implementation science offers a growing portfolio of rigorously tested implementation strategies: highly specified activities that seek to improve uptake of EBPs by targeting barriers at the provider or health care organization level, usually through partnerships between clinical operations and research.12 Specifically, implementation science aims to:

  1. Develop effective strategies for improving health-related processes and outcomes.
  2. Produce generalizable knowledge regarding these strategies by understanding implementation processes, barriers, and facilitators.
  3. Develop, test, and refine the theories, frameworks, strategies, and measures that inform implementation strategies.13

Clinicians are vital to the process of building local support and tailoring implementation strategies and clinical interventions already established as efficacious in the scientific literature.14–16 Individual clinicians are not expected to assume roles as implementation scientists; rather, they are implementation practitioners who can increase the uptake of an EBP into clinical practice by applying tools developed in implementation research and working with those with implementation expertise to contextualize evidence-based implementation strategies and improve clinical processes. It is important to emphasize that as implementation practitioners, clinicians serve as experts in the local context and organizational practices as opposed to passive recipients of process changes. In addressing the first 2 aims of implementation science noted previously—developing effective strategies to improve processes and outcomes and better understanding implementation processes, barriers, and facilitators—researchers can help implementation practitioners improve their own clinical practices through broad dissemination of tools and processes developed within research activities as well as key lessons learned.

The Society for Implementation Research Collaboration (SIRC)17 encourages participation of implementation practitioners, including clinicians, managers, and policy makers who are involved in implementation activities. Within SIRC, there is a stakeholder subgroup called the EBP Champion Group; presentations from this subgroup are highly encouraged and preferentially accepted at SIRC meetings. In addition, SIRC offers resources beyond the EBP Champion Group to inform and educate implementation practitioners.17

For those who want more focused training in implementation science, a recent review18 identified 11 training opportunities for developing implementation science skills, ranging from 1-day workshops to certificate courses. Support for these training opportunities comes from a broad range of funders, including the Agency for Healthcare Research and Quality (AHRQ), the National Institutes of Health, and the US Department of Veterans Affairs, as well as institutional training programs. The breadth of funders reflects the importance of this emerging field and its potential to enhance professional development and add clinical value.19 The success of both evidence-based implementation practice at a local site and implementation science research is highly dependent on clinician involvement, along with that of other key stakeholders at the site.

The Method of Implementation Science

Although the case scenario provided earlier depicts a single patient experience, regional-level mental health leadership in that health care system had identified the need for improved metabolic side effect monitoring as a system-wide problem. Mental health leadership discussed the specific case and used electronic health record data to confirm that the problem was systemic and required a more tailored strategy than simply establishing a side effect monitoring performance standard. Dr K’s site was identified as a pilot site for improving this aspect of clinical care. Regional and local leadership worked with a team of implementation scientists to assess, develop, and evaluate a strategy to improve metabolic side effect monitoring.

Because health care is a complex system, implementation scientists rely on theoretical approaches to understand, intervene on, and study its interacting components. One clinically and empirically supported approach is the Integrated Promoting Action on Research Implementation in Health Services (i-PARIHS) framework.15 The i-PARIHS approach conceptualizes practice improvement through distinct but overlapping constructs: the context (ie, the clinical setting), the recipients (eg, patients, providers), and the innovation (ie, the EBP or practice to be implemented).15 Factors at any of these levels can be barriers to or facilitators of EBP implementation. Here, we define these constructs and apply them to the case scenario.

At the contextual level, factors include characteristics of the setting where the innovation is to be implemented. Examples include leadership support for implementation and the degree to which the innovation is consistent with organizational priorities and culture. At Dr K’s hospital, the chief of psychiatry conveyed the importance of metabolic monitoring in several clinician staff meetings after the new performance standard was introduced. However, the infrastructure to fully support metabolic monitoring (eg, the ability to track patient weights and laboratory findings between visits) was not in place, and thus there had been no change in clinical care.

At the recipient level, factors include providers, patients, and other key stakeholders potentially adopting the innovation; they are the “intended target groups of the implementation.”15(p33) Recipient factors include motivation, beliefs, skills, knowledge, time, and resources. Clinicians in Dr K’s hospital were quite open to the evidence that antipsychotic monitoring was clinically valuable. However, 2 clinicians who were well-respected among the staff (“opinion leaders”) regularly complained about any new performance measures and believed that, because standards changed so often, none were valid or important.

At the innovation level, factors pertain to characteristics of the EBP itself—in the case presented here, metabolic side effect monitoring of antipsychotics. For instance, characteristics of the innovation that may influence implementation are how much change the innovation will entail, perceived advantages or benefits it may have, its usability to clinical staff, and knowledge that staff have of the innovation.20 In Dr K’s clinic, because antipsychotic side effect monitoring was an EBP that could help ensure patient safety from potential metabolic complications of treatment, clinic staff generally believed it was clinically valuable.21 However, staff members were not accustomed to prioritizing antipsychotic side effect monitoring. Additionally, a computerized clinical reminder for metabolic side effect monitoring was not perceived as helpful by clinicians.
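
One way to make such an i-PARIHS-style assessment tangible is to record each identified factor under its construct and tally barriers by level, as in the illustrative sketch below. The construct names follow the framework, and the example entries paraphrase Dr K’s clinic as described above; all class and field names are hypothetical.

```python
# Illustrative sketch only: recording an i-PARIHS-style site assessment
# as data so barriers can be tallied by construct. All identifiers are
# hypothetical; the entries paraphrase Dr K's clinic.
from collections import Counter
from dataclasses import dataclass
from enum import Enum

class Construct(Enum):
    CONTEXT = "context"        # the clinical setting
    RECIPIENTS = "recipients"  # providers, patients, other stakeholders
    INNOVATION = "innovation"  # the EBP being implemented

@dataclass
class Factor:
    construct: Construct
    description: str
    is_barrier: bool  # False = facilitator

assessment = [
    Factor(Construct.CONTEXT, "Chief of psychiatry endorses monitoring", False),
    Factor(Construct.CONTEXT, "No infrastructure to track weights/labs between visits", True),
    Factor(Construct.RECIPIENTS, "Clinicians open to evidence for monitoring", False),
    Factor(Construct.RECIPIENTS, "Opinion leaders dismiss new performance measures", True),
    Factor(Construct.INNOVATION, "Clinical reminder not perceived as helpful", True),
]

# Tally barriers per construct to show where facilitation effort is needed.
barrier_counts = Counter(f.construct.value for f in assessment if f.is_barrier)
print(dict(barrier_counts))
# {'context': 1, 'recipients': 1, 'innovation': 1}
```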

Intervening at the Level of Contextual, Recipient, and Innovation Factors

As with most clinical settings, Dr K’s clinic presented both facilitators of and barriers to metabolic monitoring. Implementation scientists assume that providers are neither inherently “bad” nor “good.” Rather, context, recipient, and innovation factors all influence use of EBPs, and these factors can be influenced through facilitation to enhance implementation. Facilitation is an evidence-based method to support change in an organization22–26 and involves an integrated set of strategies.24,26,27 Facilitators work with clinicians, leadership, and staff, enabling and supporting them as they implement an EBP or other clinical innovation. The specific roles facilitators take on, and when they assume them, depend on a given facility’s needs over the course of implementation.24,26,27

Facilitator activities occur at each of the 3 levels pertinent to implementation. Facilitation at the contextual level may include identifying and engaging key stakeholders at all organizational levels, identifying and addressing barriers related to organizational culture or readiness, engaging in marketing and promotion of the innovation, keeping leadership engaged and informed, and leveraging any financial incentives or policy changes available to support innovation adoption. For recipients, facilitator activities may include assessing and addressing staff training needs, assisting with local goal setting, assisting in developing and monitoring local implementation plans, and regularly auditing and feeding back clinical data to improve implementation. Facilitation at the innovation level might include gathering, appraising, and synthesizing evidence for the innovation, assessing the degree of fit between the innovation and the local setting, adapting the innovation to promote an optimal fit while maintaining fidelity, and assessing potential for pilot testing of the innovation on a smaller scale within the clinical setting. For those who are interested in becoming better acquainted with facilitation as an implementation strategy, a manual written for clinical managers is available.28 Here, we provide an example of an actual implementation research trial that showcases how principles and methods of implementation science can improve clinical practice; the trial resulted from clinician experiences like Dr K’s.

IMPLEMENTATION TRIAL EXAMPLE: A STUDY OF STRATEGIES TO IMPROVE SCHIZOPHRENIA

A Study of Strategies to Improve Schizophrenia (ASSIST) tested a facilitation implementation strategy to improve metabolic side effect monitoring for US Department of Veterans Affairs (VA) patients with schizophrenia who were prescribed antipsychotics in Dr K’s outpatient clinic.29 Research activities were approved and overseen by the site’s institutional review board. The project utilized a quality improvement (QI) team comprising local opinion leaders involved in medication management of patients with schizophrenia, supported by an external facilitator—an expert in implementation science.15,22

Through interviews with clinical staff, the external facilitator identified local barriers to recommended metabolic side effect monitoring at the levels of context, recipients, and innovation:

  1. Context: lack of resources or staff to ensure monitoring was completed and a culture that was generally resistant to change.
  2. Recipients: lack of training on use of the clinical reminder, lack of clinician awareness of clinical recommendations for metabolic monitoring, low perceived need among clinicians for QI, and patient nonadherence to side effect monitoring (eg, “no shows” for scheduled visits, transportation barriers).
  3. Innovation: lack of a mechanism to track monitoring and perceived limitations of an existing computerized clinical reminder.

To address these barriers, the external facilitator assisted the QI team in developing a local implementation plan, maintaining regular contact over a 6-month period to help the team identify barriers, monitor progress, problem-solve, and adapt strategies to achieve local QI goals. The QI team was also provided educational materials and information technology tools (eg, monthly performance reports on metabolic side effect monitoring rates within the clinic), which were developed and tailored to local preferences based on input from clinic leadership and staff.

Initially, the QI team focused their efforts solely on disseminating educational materials to clinicians, increasing awareness, making minor revisions to the computerized clinical reminder, and distributing monthly performance reports at staff meetings. These initial efforts were weighted heavily toward promoting staff education and awareness, producing only modest (10%–15%) improvements in metabolic monitoring rates, which were still not compliant with their health care network’s performance standards. By the third month of implementation, these modest improvements in monitoring rates had almost returned to baseline levels (Figure 1).

At that time, the external facilitator reengaged with the local QI team to elicit their ideas for strategies that could produce sustainable improvements in metabolic monitoring rates. Dr K suggested that, although the monthly performance reports were helpful for monitoring overall progress, they did not offer timely, actionable data that could be used to identify patients who had not been monitored in compliance with the performance standards. The external facilitator then worked with information technology staff to develop a computerized report, e-mailed to Dr K weekly, identifying patients due for metabolic monitoring.

During the first month the weekly report was in place, Dr K used the information to contact individual clinicians and encourage them to complete metabolic monitoring for their patients (a clinician follow-up activity). Monitoring rates increased substantially (see Figure 1). After that first month, the external facilitator encouraged Dr K to delegate the clinician follow-up activity to another clinical staff member (a nurse) to make continued follow-up with clinicians feasible and the improvements in monitoring rates more sustainable. At the end of the 6-month implementation period, the proportion of patients whose weight was monitored as recommended had increased from 70% to 93%, with dramatic increases in glucose and lipid monitoring rates (53% to 80% and 29% to 67%, respectively). For the first time, the clinic was compliant with the performance standards for metabolic side effect monitoring, and it remained in compliance at 1-year follow-up.

The ASSIST project implementation strategies were then included in guidance provided to other facilities in the health care system as part of a subsequent project to disseminate best practices in metabolic side effect monitoring for patients taking antipsychotics. Also, although the ASSIST project used an external facilitator, other studies have shown that individuals internal to the clinic or system can be trained to apply facilitation strategies to support local implementation or QI efforts.2
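
The logic behind the weekly actionable report can be sketched as follows. This is an assumption-laden illustration, not the ASSIST implementation: the patient fields, the 14-day grace window, and the roster are hypothetical stand-ins for whatever the clinic’s electronic health record actually provided.

```python
# A minimal sketch (not the ASSIST implementation) of the weekly
# actionable report: flag patients started on an antipsychotic whose
# baseline monitoring is still incomplete after a grace period. Field
# names and the 14-day window are hypothetical assumptions.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Patient:
    name: str
    antipsychotic_start: date
    weight_done: bool
    glucose_done: bool
    lipids_done: bool

def due_for_monitoring(patients: list[Patient], today: date,
                       grace_days: int = 14) -> list[Patient]:
    """Patients past the grace period with any baseline monitoring incomplete."""
    cutoff = today - timedelta(days=grace_days)
    return [
        p for p in patients
        if p.antipsychotic_start <= cutoff
        and not (p.weight_done and p.glucose_done and p.lipids_done)
    ]

roster = [
    Patient("Mr A", date(2016, 1, 4), weight_done=True, glucose_done=False, lipids_done=False),
    Patient("Ms C", date(2016, 1, 18), weight_done=True, glucose_done=True, lipids_done=True),
]
# Each week, the flagged list goes to the staff member handling clinician follow-up.
for p in due_for_monitoring(roster, today=date(2016, 2, 1)):
    print(f"{p.name}: baseline monitoring incomplete; follow up with clinician")
```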

The Case of Dr K Revisited

Following the ASSIST project, Ms B presents to Dr K at the same clinic. Dr K starts her on olanzapine 15 mg daily, orders baseline laboratory tests, and asks her to stop by the receptionist desk to be weighed. Prior to the return appointment, Dr K receives a computerized report noting that Ms B has not completed metabolic monitoring following the new olanzapine prescription. She contacts Ms B and directs her to the laboratory for a fasting glucose level and the originally ordered laboratory workup before the return appointment. Immediately prior to Ms B’s appointment, Dr K receives a laboratory alert noting that Ms B’s fasting glucose level is 180 mg/dL. During her appointment, Ms B denies a history of diabetes or elevated blood glucose. Dr K tapers the olanzapine and initiates aripiprazole, to which Ms B responds well, with no exacerbation of her psychotic symptoms and no abnormal glucose levels, lipid levels, or weight changes.

DISCUSSION

Despite rapid advancement in the establishment of EBPs, systematic implementation of many practices has not occurred. Providers are frequently left in isolation, attempting to “try harder” and “do the right thing” without the infrastructure or assistance to support their efforts. Dr K’s case scenario in this article, embedded within a real implementation study, illustrates how implementation science applies interdisciplinary knowledge, skills, and strategies to support systematic use of EBP in routine clinical care and maximize treatment benefit for patients.20–22,26,27 Of particular importance, implementation science encourages us to view changes in provider behavior through a larger lens encompassing the organizational context within which the provider works, a broad spectrum of recipients of the EBP (including patients), and factors associated with the innovation itself.

Implementation science also strives to provide evidence-based strategies to identify and address barriers at multiple levels. Although implementation science may use components of traditional QI (working with a local QI team) and dissemination (distributing educational materials), it entails a focus on acquiring new knowledge, use of strategies that address barriers within and across a system of care, active efforts to improve the quality of care, and spread of knowledge acquired during implementation to other settings. Whereas QI can address some local barriers to implementing practice guidelines, implementation science is tasked with generalizing local findings to other settings via lessons learned, disseminating implementation products and processes, informing refinements to implementation theories, and, eventually, using controlled trials to test implementation strategies. QI projects are local and typically utilize staff as experts in site-specific issues and solutions. Implementation research should involve such components and also draw from nonlocal implementation scientists who can offer implementation best practices that are derived from research12 and theory.30

Although we presented a case scenario set in a VA clinic, authors of this article (J.E.K., G.M.C., A.M.K., R.R.O., and M.S.B.) have worked extensively with Federally Qualified Health Centers (FQHCs), community mental health center networks, and academically affiliated clinical networks and believe the same principles and methods apply in settings outside the VA. In addition, strong implementation science efforts are embedded within other health organizations such as health maintenance organizations and mental health research networks.31,32 Finally, the Institute for Healthcare Improvement, a not-for-profit organization, promotes implementation of innovations that improve health care within and outside the United States.33

A range of resources is available to support implementation practitioners’ efforts to implement improvement or local change, including web-based information, interest groups, national meetings and discussion forums oriented to implementation practitioners, formal fellowships, and certificate programs.8,34 Tools and products such as implementation guides2,35 and processes15,36 can be applied by clinical providers within their own settings. Exposure to implementation science ensures state-of-the-art training for developing clinicians by aligning with the ACGME core competencies of practice-based learning and systems-based practice. This alignment is also consistent with the evolution of a Learning Health Care System, in which clinicians engage in continuous QI by using data to identify gaps in care processes and, in turn, applying implementation strategies to mitigate them.37 The incorporation of implementation science principles and perspectives within the rubric of these competencies holds tremendous promise for ensuring that patients receive the highest quality of care available, with commensurate improvements in clinical outcomes.

Instead of asking clinicians to work faster or longer to improve the quality of care they deliver, implementation science provides a third and valuable option. By understanding implementation science principles, addressing barriers to evidence-based practices through a systems approach, and incorporating an ongoing learning health care culture within their practice,35 clinicians and clinical managers can more efficiently improve quality of care and may more easily adapt in their ever-changing and complex systems of practice.

Submitted: June 15, 2016; accepted October 10, 2016.

Published online: December 8, 2016.

Drug names: aripiprazole (Abilify), olanzapine (Zyprexa and others).

Potential conflicts of interest: None.

Funding/support: This article was supported by the Team-Based Behavioral Health Quality Enhancement Research Initiative (QUERI) grant QUE 15-289 through the Department of Veterans Affairs (Drs Kirchner and Bauer and Mr Smith); Center for Healthcare Organization and Implementation Research (CHOIR) grant CIN 13-403 (Dr Bauer); the Department of Veterans Affairs Office of Academic Affiliations Advanced Fellowship Program in Mental Illness Research and Treatment (Dr Woodward); and the Department of Veterans Affairs South Central Mental Illness Research, Education, and Clinical Center (Dr Woodward).

Role of the sponsor: The supporters had no role in the design, analysis, interpretation, or publication of this study.

Disclaimer: The views expressed in this report are those of the authors and do not necessarily represent the views of the US Department of Veterans Affairs.

Previous presentations: Poster presentations at the 2006 Veterans Health Administration QUERI National Meeting (March 3–5, 2006; Washington, DC); the 2006 VA HSR&D National Meeting (February 16–17, 2006; Arlington, Virginia); the 2006 AcademyHealth Annual Research Meeting (June 25–27, 2006; Seattle, Washington); the 2007 AcademyHealth Annual Research Meeting (June 4, 2007; Orlando, Florida); and the 2007 VA National Mental Health Meeting, Transforming Mental Health Care: Promoting Recovery and Integrated Care (July 17, 2007; Alexandria, Virginia).

REFERENCES

1. Balas E, Boren S. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray AT, eds. Yearbook of Medical Informatics 2000: Patient-Centered Systems. Stuttgart, Germany: Schattauer Verlagsgesellschaft mbH; 2000:65–70.

2. Kirchner JE, Ritchie MJ, Pitcock JA, et al. Outcomes of a partnered facilitation strategy to implement primary care-mental health. J Gen Intern Med. 2014;29(suppl 4):904–912. PubMed doi:10.1007/s11606-014-3027-2

3. NIH PAR-13-055: Dissemination and Implementation Research in Health (R01), 2013. National Institutes of Health Web site. http://grants.nih.gov/grants/guide/pa-files/PAR-13-055.html.

4. The Learning Healthcare System. Workshop Summary (IOM Roundtable on Evidence-Based Medicine). Washington, DC: National Academies Press; 2007.

5. Kavic MS. Competency and the six core competencies. JSLS. 2002;6(2):95–97. PubMed

6. Thomas CR, Psychiatry Milestone Group. The Psychiatry Milestone Project. 2013. The Accreditation Council for Graduate Medical Education and The American Board of Psychiatry and Neurology Web site. http://www.acgme.org/acgmeweb/Portals/0/PDFs/Milestones/PsychiatryMilestones.pdf.

7. American Psychiatric Association. Transforming Clinical Practice Initiative Support and Alignment Network. 2016. APA Web site. https://www.psychiatry.org/psychiatrists/practice/professional-interests/integrated-care/transforming-clinical-practice-initiative. Accessed September 2, 2016.

8. Tabak RG, Khoong EC, Chambers DA, et al. Bridging research and practice: models for dissemination and implementation research. Am J Prev Med. 2012;43(3):337–350. PubMed doi:10.1016/j.amepre.2012.05.024

9. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1(1):1–3. doi:10.1186/1748-5908-1-1

10. National Academies of Sciences, Engineering, and Medicine. Applying an Implementation Science Approach to Genomic Medicine: Workshop Summary. Washington, DC: The National Academies Press; 2016.

11. Neta G, Sanchez MA, Chambers DA, et al. Implementation science in cancer prevention and control: a decade of grant funding by the National Cancer Institute and future directions. Implement Sci. 2015;10:4. PubMed doi:10.1186/s13012-014-0200-2

12. Bauer MS, Damschroder L, Hagedorn H, et al. An introduction to implementation science for the non-specialist. BMC Psychol. 2015;3(1):32. PubMed doi:10.1186/s40359-015-0089-9

13. Mittman BS. Accelerating adoption of genomic medicine innovations: tools and guidance from implementation science. National Academies Web site. http://iom.nationalacademies.org/~/media/Files/Activity%20Files/Research/GenomicBasedResearch/2015-NOV-19/Mittman.pdf. Updated November 19, 2015.

14. Rycroft-Malone J, Seers K, Titchen A, et al. What counts as evidence in evidence-based practice? J Adv Nurs. 2004;47(1):81–90. PubMed doi:10.1111/j.1365-2648.2004.03068.x

15. Harvey G, Kitson A. Implementing Evidence-Based Practice in Healthcare: A Facilitation Guide. New York, NY: Routledge; 2015.

16. Fortney J, Enderle M, McDougall S, et al. Implementation outcomes of evidence-based quality improvement for depression in VA community based outpatient clinics. Implement Sci. 2012;7:30. PubMed doi:10.1186/1748-5908-7-30

17. Society for Implementation Research Collaboration Membership. SIRC Web site. http://societyforimplementationresearchcollaboration.net/sirc-membership/. Accessed September 2, 2016.

18. Proctor EK, Chambers DA. Training in dissemination and implementation research: a field-wide perspective [published online May 3, 2016]. Transl Behav Med. 2016:1–12. PubMed doi:10.1007/s13142-016-0406-8

19. Moses H 3rd, Matheson DHM, Cairns-Smith S, et al. The anatomy of medical research: US and international comparisons. JAMA. 2015;313(2):174–189. PubMed doi:10.1001/jama.2014.15939

20. Rogers EM. Diffusion of Innovations. 5th ed. New York, NY: Free Press; 2003.

21. American Diabetes Association; American Psychiatric Association; American Association of Clinical Endocrinologists; North American Association for the Study of Obesity. Consensus development conference on antipsychotic drugs and obesity and diabetes. Diabetes Care. 2004;27(2):596–601. PubMed doi:10.2337/diacare.27.2.596

22. Stetler CB, Legro MW, Rycroft-Malone J, et al. Role of “external facilitation” in implementation of research findings: a qualitative evaluation of facilitation experiences in the Veterans Health Administration. Implement Sci. 2006;1:23. PubMed doi:10.1186/1748-5908-1-23

23. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7(3):149–158. PubMed doi:10.1136/qshc.7.3.149

24. Hayden P, Frederick L, Smith BJ, et al. Developmental facilitation: helping teams promote systems change: collaborative planning project for planning comprehensive early childhood systems. Institute of Education Science Web site. http://eric.ed.gov/?q=Developmental+Facilitation%3a+Helping+Teams+Promote+Systems+Change&id=ED455628. Updated 2001. Accessed December 9, 2015.

25. Nagykaldi Z, Mold JW, Robinson A, et al. Practice facilitators and practice-based research networks. J Am Board Fam Med. 2006;19(5):506–510. PubMed doi:10.3122/jabfm.19.5.506

26. Harvey G, Loftus-Hills A, Rycroft-Malone J, et al. Getting evidence into practice: the role and function of facilitation. J Adv Nurs. 2002;37(6):577–588. PubMed doi:10.1046/j.1365-2648.2002.02126.x

27. Thompson GN, Estabrooks CA, Degner LF. Clarifying the concepts in knowledge transfer: a literature review. J Adv Nurs. 2006;53(6):691–701. PubMed doi:10.1111/j.1365-2648.2006.03775.x

28. Kirchner J, Ritchie M, Dollar KM, et al. Implementation Facilitation Training Manual: Using External and Internal Facilitation to Improve Care in the Veterans Health Administration. http://www.queri.research.va.gov/tools/implementation/Facilitation-Manual.pdf. Updated 2010. Accessed January 6, 2016.

29. Owen RR, Smith J, Hudson T, et al. Comparison of strategies to improve antipsychotic monitoring and management for schizophrenia. Presented at the VA HSR&D/QUERI National Meeting; February 2008; Baltimore, Maryland. Health Services Research & Development Web site. http://www.hsrd.research.va.gov/meetings/2008/display_abstract.cfm?RecordID=444. Accessed November 4, 2016.

30. Sales A, Smith J, Curran G, et al. Models, strategies, and tools: theory in implementing evidence-based findings into health care practice. J Gen Intern Med. 2006;21(suppl 2):S43–S49. PubMed doi:10.1111/j.1525-1497.2006.00362.x

31. Rossom RC, Simon GE, Beck A, et al. Facilitating action for suicide prevention by learning health care systems. Psychiatr Serv. 2016;67(8):830–832. PubMed doi:10.1176/appi.ps.201600068

32. Newton KM, Larson EB. Learning health care systems: leading through research: the 18th Annual HMO Research Network Conference, April 29–May 2, 2012, Seattle, Washington. Clin Med Res. 2012;10(3):140–142. PubMed doi:10.3121/cmr.2012.1099

33. Institute for Healthcare Improvement home page. IHI Web site. http://www.ihi.org/Pages/default.aspx. Accessed September 2, 2016.

34. Society for Implementation Research Collaboration: dissemination and implementation training opportunities. SIRC Web site. http://societyforimplementationresearchcollaboration.net/dissemination-and-implementation-training-opportunities/. Accessed September 2, 2016.

35. Kirchner JE, Parker LE, Bonner LM, et al. Roles of managers, frontline staff and local champions, in implementing quality improvement: stakeholders’ perspectives. J Eval Clin Pract. 2012;18(1):63–69. PubMed doi:10.1111/j.1365-2753.2010.01518.x.

36. Fortney JC, Pyne JM, Smith JL, et al. Steps for implementing collaborative care programs for depression. Popul Health Manag. 2009;12(2):69–79. PubMed doi:10.1089/pop.2008.0023

37. Chambers DA, Feero WG, Khoury MJ. Convergence of implementation science, precision medicine, and the learning health care system: a new model for biomedical research. JAMA. 2016;315(18):1941–1942. PubMed doi:10.1001/jama.2016.3867
