
The American Journal of Managed Care

November 2013
Volume 19, Issue 11

Collecting Mortality Data to Drive Real-Time Improvement in Suicide Prevention

The Perfect Depression Care initiative serves as an example of how suicide prevention programs can collect real-time mortality data internally to drive rapid quality improvement.

Objectives:

To compare internally collected suicide mortality data with official government mortality records as sources for driving quality improvement in suicide prevention programs, using data from the Perfect Depression Care initiative.

Methods:

Perfect Depression Care (PDC) is a quality improvement suicide prevention initiative within the Behavioral Health Services (BHS) division of the Henry Ford Health System. Eligible subjects were all patients who received services from BHS, were members of the health maintenance organization, and had a medical group physician during the 11-year study period. Mortality data were collected internally and from government-collected death records, and were linked to treatment utilization data from the medical record.

Results:

The mean suicide rate was 96.6 per 100,000 during the 2-year baseline period (1999-2000) and declined to 19.1 per 100,000 during the initiative (2002-2009) using both sources of data combined. We observed a similar statistically significant (P <.001) reduction in the suicide death rate using both the internal and government data sources. There were no significant differences between the 2 sources of data in the mean suicide rates for the baseline and intervention periods (P >.05). The data sources did differ in the capture of unique suicide deaths.

Conclusions:

Internally collected data were an effective measure of suicide deaths in the PDC initiative. A combination of internal and government-collected records may be most effective for future suicide prevention programs.

Am J Manag Care. 2013;19(11):e386-e390

  • Health systems trying to implement suicide prevention programs are limited by their inability to access real-time suicide mortality data to inform rapid-cycle quality improvement.

  • As demonstrated by the Perfect Depression Care initiative, suicide data can be collected internally to drive rapid quality improvement.

  • Internal suicide data should be compared with official government records once those records are released, approximately 2 years later.

Suicide is a major public health concern and it remains a leading cause of death in the United States.1-7 Population-level estimates of suicide have not improved over time,2,8,9 leading the Institute of Medicine to declare suicide prevention “a national imperative.”10 While some suicide prevention strategies exist,11 they are difficult to evaluate due to the rarity of suicide deaths and the inefficiency in the current methods for counting and measuring suicide deaths.

One suicide prevention initiative that has received much attention, including receipt of the 2006 Gold Achievement Award from the American Psychiatric Association and the 2006 Ernest Amory Codman Award from the Joint Commission, is the Perfect Depression Care (PDC) initiative12,13 at Henry Ford Health System (HFHS). Perfect Depression Care is a health system quality improvement initiative associated with reductions in suicide deaths among patients in its health maintenance organization (HMO) network.14

As described previously, the PDC initiative is unique in its use of internally collected suicide data for rolling program evaluation rather than government records,13 which are traditionally used for population mortality research.15 This internal surveillance method of suicide ascertainment was chosen to drive real-time process improvement, such as informing protocol changes as well as provider- and system-level changes. Such rapid-cycle improvement would not be feasible with traditional government records, because they can take up to 2 years to become available. The aim of this study was to assess the agreement between our internally collected suicide data and official government mortality records.

METHODS

Setting and Population

The Henry Ford Health System is an integrated healthcare organization serving southeastern Michigan. It consists of multiple hospitals, a large medical group, and a large HMO. The Behavioral Health Services (BHS) division offers specialty mental health services within HFHS. Eligible subjects included all patients who made any service contact at BHS between January 1, 1999, and December 31, 2009, and were enrolled in the HMO network during the year in which they received care. The total sample size of all eligible individuals varied each year, ranging from 6168 to 10,831 patients (Figure). The HMO population, from which the sample for this study was derived, is broadly representative of southeast Michigan with respect to age, sex, race/ethnicity, and socioeconomic status. This project was approved by institutional review boards at Henry Ford Hospital (IRB# 6821) and the state of Michigan (IRB# 953-PHALHAS).

Perfect Depression Care Initiative

The concept and components of the PDC initiative are described in more detail elsewhere.12,13 The PDC initiative began in response to the Institute of Medicine report titled Crossing the Quality Chasm: A New Health System for the 21st Century, which identified depression as a key condition of importance.16 We named our initiative Perfect Depression Care to reflect the Institute of Medicine’s audacious call to pursue a system of perfect healthcare. In such a system, persons receiving treatment for mental disorders would not die from suicide—thus, we adopted the elimination of suicide as one of PDC’s primary aims. To pursue this aim, we developed a bundle of core interventions organized into 3 main categories, which have been described elsewhere.12 Briefly, these categories include (1) a distinct effort to establish a “just culture” in which departmental operations and clinical care are aligned with the aims of the Institute of Medicine report, (2) comprehensive evaluations for every patient using a multilevel suicide risk assessment, and (3) a redesigned care delivery system based on the Chronic Care Model.17

Mortality Data

Date and cause of death were determined from 2 sources of data: internally from a clinical suicide surveillance system and externally from the state of Michigan (SoM) mortality records. For the remainder of this study, the term “suicide” refers to death by suicide. Internal data were collected as they were reported to BHS from any source, generally by a family member, care provider, or HMO representative. In the latter scenario, BHS collected information from the HMO on the reason for death when an individual was no longer enrolled in the health plan and was listed as deceased. De-enrollment from the health plan occurred immediately during the month of death. Finally, sources of suicide information also included information from media outlets or any other means of identification in which a provider or the department identified the suicide victim as being a patient. Internally collected suicides were investigated using TapRooT, a root cause analysis system allowing the PDC team to investigate each incident by reviewing prior treatment history, involvement in care, access to means, and system-level errors.

External suicide data came from the SoM mortality records,18 where cause of death is described by a medical examiner and coded by government officials using the International Classification of Diseases, 10th Revision (ICD-10). After initial coding by the state, the preliminary mortality information is sent to the Centers for Disease Control and Prevention’s (CDC's) National Center for Health Statistics. There, the National Vital Statistics System processes and corrects all mortality records before sending the final corrected version back to each state and region. Approximately 2 years after the end of each calendar year, HFHS receives the final corrected SoM data set.

In the current study, all deaths coded as X60-X84 or Y87.0 were identified as suicides. This coding scheme is recommended by the CDC.19 These data were linked to HFHS records using a 2-step process. First, social security numbers were matched between the SoM data set and the health system’s administrative data from the medical record. Then name, date of birth, address, and sex were used to identify additional matches. The matching process also complies with the CDC protocol for mortality death linkage.20
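
To make the case-identification and linkage logic concrete, the following is a minimal sketch in Python (the study itself used administrative files, SAS, and the CDC linkage protocol cited above); all field names such as ssn, dob, address, and the record layouts are hypothetical placeholders rather than the actual SoM or HFHS variable names.

```python
import re

# Illustrative sketch only; field names (ssn, dob, address, sex) are
# hypothetical placeholders, not the actual SoM or HFHS variable names.
SUICIDE_CODE = re.compile(r"^(X6\d|X7\d|X8[0-4])|^Y87\.?0")


def is_suicide(icd10_cause: str) -> bool:
    """True if the ICD-10 underlying-cause code falls in X60-X84 or is Y87.0."""
    return bool(SUICIDE_CODE.match(icd10_cause.strip().upper()))


def demographic_key(record: dict) -> tuple:
    """Secondary match key: name, date of birth, address, and sex."""
    return (record["last_name"].lower(), record["first_name"].lower(),
            record["dob"], record["address"].lower(), record["sex"])


def link_records(som_deaths: list, hfhs_patients: list) -> list:
    """Two-step linkage: exact social security number first, then demographics."""
    by_ssn = {p["ssn"]: p for p in hfhs_patients if p.get("ssn")}
    by_demo = {demographic_key(p): p for p in hfhs_patients}
    matches, unmatched = [], []
    for death in som_deaths:                       # step 1: SSN match
        patient = by_ssn.get(death.get("ssn"))
        if patient is not None:
            matches.append((death, patient))
        else:
            unmatched.append(death)
    for death in unmatched:                        # step 2: demographic match
        patient = by_demo.get(demographic_key(death))
        if patient is not None:
            matches.append((death, patient))
    return matches
```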

Analysis

Both data sources were collected independently and matched to medical record data to develop a single data set using SAS version 9.2.21 A trained psychiatrist confirmed the accuracy of the data by conducting medical chart reviews of each case. The research team then reviewed each case to ensure agreement. Using the total number of suicides each year divided by the total number of eligible subjects receiving care that year, we calculated 4 distinct suicide rates: (1) internal data only, (2) SoM data only, (3) either internal or SoM data (“combined”), and (4) both internal and SoM data (“matched”). The suicide rates were adjusted to provide a rate per 100,000 persons for each year, as shown in the Figure.
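
As an illustration of the 4 rate definitions, the brief Python sketch below computes each rate per 100,000 from hypothetical sets of suicide cases identified by each source in a single year; the identifiers and counts are invented for illustration and are not study data.

```python
def rate_per_100k(n_suicides: int, n_eligible: int) -> float:
    """Crude annual suicide rate per 100,000 eligible patients."""
    return 100_000 * n_suicides / n_eligible

# Hypothetical patient IDs flagged as suicides in one year by each source,
# and a hypothetical eligible population for that year (not study data).
internal_ids = {"A", "B", "C"}
som_ids = {"B", "C", "D"}
n_eligible = 8_000

rates = {
    "internal": rate_per_100k(len(internal_ids), n_eligible),
    "SoM": rate_per_100k(len(som_ids), n_eligible),
    "combined": rate_per_100k(len(internal_ids | som_ids), n_eligible),  # either source
    "matched": rate_per_100k(len(internal_ids & som_ids), n_eligible),   # both sources
}
print(rates)  # {'internal': 37.5, 'SoM': 37.5, 'combined': 50.0, 'matched': 25.0}
```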

Poisson regression models were used to compare the change in mean suicide rate with the PDC initiative (implemented in 2001) from baseline (1999-2000) to intervention (2002-2009). Paired-samples t tests were used to assess the degree of agreement between data sources. We then compared and contrasted the matched and mismatched cases from the 2 data sources using comprehensive medical chart review. The main analyses were conducted using Stata release 11.2.22
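
A minimal sketch of this analytic approach is shown below, written in Python (statsmodels and SciPy) rather than the Stata commands actually used, and with entirely illustrative yearly counts and rates; the key elements are the log of the eligible population entered as the Poisson offset and a paired comparison of the 2 sources' yearly rates.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

# Illustrative yearly counts only (2001, the rollout year, is excluded);
# the actual counts come from the linked data set.
df = pd.DataFrame({
    "year":     [1999, 2000, 2002, 2003, 2004, 2005, 2006, 2007, 2008, 2009],
    "suicides": [   7,    6,    2,    1,    2,    1,    1,     2,     1,     1],
    "eligible": [7000, 7200, 8000, 8500, 9000, 9500, 10000, 10300, 10600, 10800],
})
df["period"] = np.where(df["year"] <= 2000, "baseline", "intervention")

# Poisson model of yearly counts with log(eligible population) as the offset;
# "baseline" is the reference level, so the period coefficient is the log
# rate ratio for the intervention years relative to 1999-2000.
model = smf.glm(
    "suicides ~ C(period)",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["eligible"]),
).fit()
print(model.summary())

# Paired t test comparing the two sources' yearly rates (again illustrative).
internal_rate = np.array([97.0, 83.0, 25.0, 12.0, 22.0, 11.0, 10.0, 19.0, 9.0, 9.0])
som_rate      = np.array([90.0, 83.0, 25.0, 14.0, 20.0, 10.0, 12.0, 19.0, 9.0, 11.0])
print(stats.ttest_rel(internal_rate, som_rate))
```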

RESULTS

As shown in the Figure, the mean suicide rate for patients in the HMO network dropped from the baseline period (1999-2000) to the intervention period (2002-2009). The suicide rate declined by 82% (from 89 per 100,000 to 16 per 100,000; P <.001) using the SoM data and by a similar 86% (from 97 per 100,000 to 14 per 100,000; P <.001) using the internal data (Table 1). There were no statistical differences between the mean suicide rates for each observation period (baseline/intervention) as estimated by the 2 data sources (Table 2). Also, as shown in the Figure, the suicide rates from the 2 sources were identical for 5 of the 11 years. For the 6 years in which they differed, the SoM rates were slightly higher than the internal rates in 3 years and slightly lower in the other 3 years.
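
As an arithmetic check, the reported relative declines follow directly from the baseline and intervention rates:

\[
\frac{89 - 16}{89} \approx 0.82 \ \text{(SoM data)}, \qquad \frac{97 - 14}{97} \approx 0.86 \ \text{(internal data)}.
\]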

Although they yielded similar rates, the 2 data sources differed in actual cases captured (Table 3). The combined 31 cases of suicide included 23 matched cases (74%). For each of these cases, the 2 data sources also agreed on the manner of suicide. Of the 8 mismatched cases, the SoM data identified 4 cases that were missed by the internal data. In each case, these patients were lost to follow-up and the medical record contained no reference to their death. Similarly, the SoM data missed 4 cases identified by the internal data: 2 cases were nonresidents of Michigan and therefore not captured by SoM records (1 Florida resident, 1 Canada resident), and the 2 other cases had their cause of death listed as unintentional self-poisoning, even though our internal system clearly identified them as suicides.

Overall, both data sources missed cases of suicide and, as such, appeared to underestimate the actual rate. Thus, the combined total (n = 31) appeared to be a better estimate of the suicide rate. Using the combined data, the mean suicide rate was 97 per 100,000 for the baseline years and 19 per 100,000 for the intervention period (P <.001) (Table 1).

DISCUSSION

We found that our internally collected data on suicides agreed closely with official government-collected mortality data, although neither source captured all identified cases. This agreement in suicide rates between the 2 data sources suggests that internally collected mortality data are sufficient for driving real-time quality improvement work in suicide prevention. Official US mortality data from the National Center for Health Statistics are considered the gold standard for research,23 a status that may rest largely on the absence of a better alternative.15,24 On one hand, the findings from this study seem to affirm this conclusion. Both the internal and external methods identified, and missed, the same number of suicides. This finding suggests that while both methods are imperfect, neither is decidedly better than the other. On the other hand, quality improvement interventions are driven by small, rapid cycles of change that rely on real-time measurement of outcomes. Because official mortality records may not be readily available, they are less useful for driving quality improvement. Thus, an effective overall strategy for suicide prevention efforts might be to use an internal real-time surveillance system for rapid cycles of learning while also using government-collected data, once they are available, to confirm or enhance that learning.

More specifically, the real-time collection of suicide data allows the health system to review each case and make improvements almost immediately. For instance, in the PDC initiative, when a suicide occurred by firearm, the system immediately reviewed the protocol for means restriction and determined whether any changes were appropriate. Similarly, when an intentional drug overdose death occurred, the team reviewed the medication policies and the drug treatment provided to that patient. As such, any deficiency in system policies was corrected immediately, and treatment plans for patients with analogous symptoms were reviewed more closely. If the PDC program had relied solely on government-collected data, those protocol review processes and changes could not have begun until at least 2 years after the calendar year in which the deaths occurred, a waiting period during which earlier protocol improvements might have prevented additional suicides. However, the internal method is clearly not perfect; also using government data allows the health system to review all available information and consider all possible improvements to the suicide prevention program. The system is currently reviewing the suicide cases identified in this study only through government-collected data to determine whether additional modifications to the protocol are warranted.

The main limitation of this study is the possible underreporting of suicides. It is not surprising that neither data source captured every suicide, given that medical examiners tend to determine suicide conservatively25,26 and that considerable social stigma still surrounds suicide.27 Nonetheless, one core feature of PDC is the creation of a culture in which clinicians and family members are encouraged to report suspected suicides without fear of system-level punishment or remediation. A second limitation was related to location of residence. Although 98.8% of BHS patients were Michigan residents, the SoM method missed 2 nonresidents, an error that might not have occurred using national data. However, a portion of the health system's service area extends into Canada, which is not captured by national suicide data. Third, there was limited power to make individual-level inferences with these data. Future research may investigate this type of initiative across multiple sites, which would increase the sample size.

We conclude that suicide prevention efforts can be guided by and evaluated using multiple data sources, including real-time, internally collected suicide data. We hope the results encourage other organizations to consider building their own internal systems so that future work can evaluate their utility. Finally, this study calls attention to the need for large-scale, real-time suicide surveillance systems.

Author Affiliations: From Center for Health Policy and Health Services Research (BKA) and Department of Psychiatry (BKA, MJC, CEC), Henry Ford Health System, Detroit, MI.

Funding Source: This research was supported by the Fund for Henry Ford Hospital and internal funds from Behavioral Health Services, Henry Ford Health System.

Author Disclosures: The authors (BKA, MJC, CEC) report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (BKA, MJC, CEC); acquisition of data (BKA, MJC, CEC); analysis and interpretation of data (BKA, MJC, CEC); drafting of the manuscript (BKA, MJC, CEC); critical revision of the manuscript for important intellectual content (BKA, MJC, CEC); statistical analysis (BKA, CEC); provision of study materials or patients (MJC, CEC); obtaining funding (CEC); administrative, technical, or logistic support (CEC); and supervision (CEC).

Address correspondence to: Brian K. Ahmedani, PhD, Center for Health Policy and Health Services Research, Henry Ford Health System, 1 Ford Place, 3A, Detroit, MI 48202. E-mail: bahmeda1@hfhs.org.

REFERENCES

1. Baca-Garcia E, Perez-Rodriguez MM, Keyes KM, et al. Suicidal ideation and suicide attempts among Hispanic subgroups in the United States: 1991-1992 and 2001-2002. J Psychiatr Res. 2011;45(4):512-518.

2. Baca-Garcia E, Perez-Rodriguez MM, Keyes KM, et al. Suicidal ideation and suicide attempts in the United States: 1991-1992 and 2001-2002. Mol Psychiatry. 2010;15(3):250-259.

3. Kessler RC, Berglund P, Borges G, Nock M, Wang PS. Trends in suicide ideation, plans, gestures, and attempts in the United States, 1990-1992 to 2001-2003. JAMA. 2005;293(20):2487-2495.

4. Crosby AE, Ortega L, Stevens MR. Suicides—United States, 1999-2007. MMWR Surveill Summ. 2011;60(suppl):56-59.

5. Gunnell D, Middleton N. National suicide rates as an indicator of the effect of suicide on premature mortality. Lancet. 2003;362(9388):961-962.

6. Xu J, Kochanek KD, Murphy SL, Tejada-Vera B. Deaths: final data for 2007. Natl Vital Stat Rep. 2010;58(19):1-136.

7. Ahmedani BK, Perron B, Ilgen M, Abdon A, Vaughn M, Epperson M. Suicide thoughts and attempts and psychiatric treatment utilization: informing prevention strategies. Psychiatr Serv. 2012;63(2):186-189.

8. Holinger PC, Klemen EH. Violent deaths in the United States, 1900-1975. Relationships between suicide, homicide and accidental deaths. Soc Sci Med. 1982;16(22):1929-1938.

9. Rockett IR, Hobbs G, De Leo D, et al. Suicide and unintentional poisoning mortality trends in the United States, 1987-2006: two unrelated phenomena? BMC Public Health. 2010;10:705.

10. Institute of Medicine. Reducing Suicide: A National Imperative. Washington, DC: National Academies Press; 2002.

11. Mann JJ, Apter A, Bertolote J, et al. Suicide prevention strategies: a systematic review. JAMA. 2005;294(16):2064-2074.

12. Coffey CE. Building a system of perfect depression care in behavioral health. Jt Comm J Qual Patient Saf. 2007;33(4):193-199.

13. Coffey CE. Pursuing perfect depression care. Psychiatr Serv. 2006;57(10):1524-1526.

14. Hampton T. Depression care effort brings dramatic drop in large HMO population’s suicide rate. JAMA. 2010;303(19):1903-1905.

15. Cowper DC, Kubal JD, Maynard C, Hynes DM. A primer and comparative review of major US mortality databases. Ann Epidemiol. 2002;12(7):462-468.

16. Committee on Quality of Health Care in America, Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press; 2001.

17. Wagner EH, Glasgow RE, Davis C, et al. Quality improvement in chronic illness care: a collaborative approach. Jt Comm J Qual Improv. 2001;27(2):63-80.

18. Michigan Department of Community Health. Michigan Mortality. http://www.mdch.state.mi.us/pha/osr/index.asp?Id=4. Updated June 29, 2012.

19. Centers for Disease Control and Prevention. ICD framework: external cause of injury mortality matrix. http://www.cdc.gov/nchs/injury/ice/matrix10.htm. Published June 2010. Accessed April 2, 2013.

20. Centers for Disease Control and Prevention. National Death Index. http://www.cdc.gov/nchs/ndi.htm#matching. Published February 2013. Accessed April 12, 2013.

21. SAS Institute Inc. SAS/STAT User's Guide. Version 9.2. Cary, NC: SAS Institute Inc; 2008.

22. StataCorp. Stata Statistical Software [computer program]. Release 11.2. College Station, TX: StataCorp LP; 2009.

23. Hermansen SW, Leitzmann MF, Schatzkin A. The impact on National Death Index ascertainment of limiting submissions to Social Security Administration Death Master File matches in epidemiologic studies of mortality. Am J Epidemiol. 2009;169(7):901-908.

24. Claassen CA, Yip PS, Corcoran P, Bossarte RM, Lawrence BA, Currier GW. National suicide rates a century after Durkheim: do we know enough to estimate error? Suicide Life Threat Behav. 2010;40(3):193-223.

25. Rockett IR. Counting suicides and making suicide count as a public health problem. Crisis. 2010;31(5):227-230.

26. Timmermans S. Suicide determination and the professional authority of medical examiners. Am Sociol Rev. 2005;70:311-333.

27. Tadros G, Jolley D. The stigma of suicide. Br J Psychiatry. 2001;179:178.
