by Greg Simon, MD, MPH, Group Health psychiatrist and Group Health Research Institute senior investigator
During a recent project meeting, I suddenly thought of The Runaway Bunny, the classic children’s book by Margaret Wise Brown and Clement Hurd. Remember the story? Over and over, the little bunny declares his determination to run away. Each time, his mother’s responses are patient, compassionate, flexible, and incredibly persistent: “If you become a bird and fly away, then I will be a tree that you come home to.”
The meeting that made me think of the bunny was for our Mental Health Research Network Suicide Prevention Outreach Trial. The trial is testing whether persistent outreach programs reduce the risk of suicide attempt in people who report frequent thoughts of death or self-harm. In one of the outreach programs, care managers are supported by decision-support software that helps them watch over thousands of at-risk patients. The software uses data from electronic health records to identify patients at risk of suicide who appear to have disengaged from outpatient care, and it generates patient-specific outreach and follow-up recommendations for the care managers.
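The general shape of such a decision-support rule might be sketched as follows. This is a hypothetical illustration only, not the trial's actual software: the field names, the risk-score scale, and the thresholds are all invented for the example.

```python
# Hypothetical sketch of a care-management decision-support rule.
# All names, thresholds, and the risk-score scale are invented for
# illustration; they are not the trial's actual logic or data model.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class PatientRecord:
    patient_id: str
    risk_score: float              # assumed 0.0-1.0, from an EHR-based model
    last_visit: date               # most recent outpatient contact
    has_upcoming_appointment: bool

def outreach_recommendation(p: PatientRecord, today: date,
                            risk_threshold: float = 0.7,
                            disengaged_after_days: int = 30) -> Optional[str]:
    """Return an outreach recommendation, or None if no action is due."""
    # "Disengaged" here means: no recent visit and nothing scheduled.
    disengaged = (today - p.last_visit > timedelta(days=disengaged_after_days)
                  and not p.has_upcoming_appointment)
    if p.risk_score >= risk_threshold and disengaged:
        return f"Reach out to patient {p.patient_id}: high risk, no recent contact"
    return None
```

The point of such a rule is not to decide what to say to a patient, only to keep surfacing the patients whose follow-up has lapsed; the outreach itself remains entirely human.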
The software is a tool that allows each care manager to support many patients. However, algorithms certainly don't replace human caring. Care managers' actual outreach to patients, via online messaging and telephone, is completely personalized. Each outreach message is shaped by everything the care manager knows about the patient's history, current situation, and preferences. We have no illusions that a chatbot or some other software tool will replace the human part of this work. Instead, we count on software tools to do what they do best—consistently and persistently remind us when follow-up care has fallen off the recommended path.
It was our lead care manager, Deborah King, MSW, who made me see the parallels between our research and a 74-year-old picture book. Deborah noted that the decision support program resolutely reminds us about guideline-based follow-up care. Care managers are rarely prompted to reach out to patients who are doing well—who recover or stay engaged in care. Instead, care managers are constantly directed back to situations where caring can be stressful or difficult. The care manager’s daily “to do” list focuses attention on patients at highest suicide risk, typically those who are most hopeless and alienated.
Deborah’s comments made me realize that while algorithms alone can’t produce caring, neither is compassionate caring alone sufficient to restore hope and connection with the most alienated. Left to our own devices, we tend to direct our caring to the places where we are most comfortable or where our outreach is the most gratefully received. Algorithms are important because they push us back toward the places where caring is most necessary—even if those places are the least comfortable.
When I re-read my family’s well-used copy of The Runaway Bunny, I noticed something I hadn’t seen when reading it to my children. The mother’s patient responses are actually written as a series of “If-Then” statements. They read almost like the conditional expressions in our decision support software. Perhaps Margaret Wise Brown was really coding the first care management algorithm: “If you cancel all of your appointments with your therapist, then I’ll keep trying to reach you—to see if you are safe.”