October 31, 2017

SIRC 4th Biennial Conference: We opened Pandora’s Box of implementation questions


Dr. Cara Lewis and Callie Walsh-Bailey describe a meeting that generated as many questions as answers — and why that’s a good thing.

by Cara Lewis, PhD, HSPP, associate scientific investigator, and Callie Walsh-Bailey, MPH, research support specialist, Kaiser Permanente Washington Health Research Institute

The United States spends billions of dollars each year on research. Although this investment has funded incredible discoveries, innovations, and evidence-based treatments to improve health, the process of incorporating research evidence into routine practice to improve the quality of care is painstakingly slow. The National Academy of Medicine suggests that it can take almost two decades for only a portion of research to reach the consumers it is meant to help, a problem known as the “quality chasm.” Implementation science is a new field designed to bridge this research-to-practice gap, with the ultimate goal of improving the quality and effectiveness of health care.

The field of implementation science has spent the last two decades describing what gets in the way of integrating evidence-based treatments, characterizing barriers to implementation such as negative attitudes, provider habit, poor leadership, and limited resources. That descriptive work was important. More recently, researchers have begun running head-to-head comparisons of implementation strategies, such as training plus consultation versus a web-based platform plus a learning collaborative. The problem with these newer efforts is that they are not linked to the decades of descriptive work: strategies have been developed and tested without an understanding of the processes, or mechanisms, through which they produce change. Perhaps because of this, “best practices” for implementing an evidence-based treatment are becoming increasingly complex and costly without added benefit. Fortunately, international initiatives and societies are seeking to address this very issue.

Making waves to find out what works and why

The Society for Implementation Research Collaboration (SIRC), led by KPWHRI associate scientific investigator Dr. Cara Lewis, is one organization making waves in implementation science. SIRC is a self-funded society committed to facilitating collaboration among implementation researchers, practitioners, and other stakeholders to enhance the quality, rigor, and relevance of the methods used to undertake and evaluate implementation efforts. A recent publication ranked SIRC’s conference series among the 15 most highly attended conferences for people interested in implementation science. This September, SIRC held its fourth biennial conference in Seattle, with the theme “Mechanisms of Implementation: What Works and Why?” Dr. Lewis was inspired to take on this theme because studies have shown that two-thirds of implementation efforts fail and half don’t achieve the intended outcome.

“Implementation science is facing an ironic situation — although the field came about to bridge the gap between research and practice, a gap is emerging within the field itself,” says Dr. Lewis. “Methods are proposed that health care systems cannot afford. Measures are developed that don’t have predictive utility in diverse settings. Strategies are designed that don’t acknowledge the dynamism and complexity of routine care settings. Without mechanisms-focused research, scientists will be spinning their wheels testing strategies without any idea of what they target and why they don’t yield the desired outcome.”

Stripping away the scientific jargon, our colleague at the University of Washington, Dr. Bryan Weiner, provides this everyday example of a mechanism. Imagine someone wants to hang a painting. She has a nail (which we can think of as our evidence-based treatment for anchoring paintings to walls). However, she is unable to push the nail into the wall with her bare hands deeply enough to secure the painting (the barrier to implementation is that the depth is insufficient to put the painting into “practice”). She needs a more effective tool (implementation strategy A) than her hand (implementation strategy B) to push the nail into the wall at a depth that will hold the painting. She may have many tools in her toolbox from which to choose.

Ultimately she selects a hammer over a screwdriver because a hammer applies blunt force (mechanism) sufficient to secure the nail to the wall whereas the screwdriver is not as effective for that purpose. Additional customization of the strategy can make it an even better fit for the context. For instance, a smaller hammer might be used in a tight space where a larger hammer would be too big to use.

A field poised for its crucial role

In our example, our subject identifies a specific strategy before taking action. This thoughtful, planned selection of strategies helps ensure that the effort to implement an evidence-based treatment fits the context, which makes successful implementation more likely. SIRC believes the field is poised to do this work, and its priority is to identify the mechanisms through which implementation strategies operate so that effective, feasible ways of integrating evidence-based interventions into health care delivery systems can be put in place more quickly.

Over the two days of the conference, participants opened what some referred to as “Pandora’s box,” generating as many challenging questions as answers, but they left motivated to carry forward this critical research agenda. When we meet again in 2019, we will recount the great progress the field has made. In the meantime, we will be hard at work establishing mechanisms and increasing the efficiency and impact of our implementation efforts. If you are interested in joining us or learning more, visit the SIRC website.
