Supplement: The Aligning Forces for Quality Initiative: Summative Findings and Lessons Learned From Efforts to Improve Healthcare Quality at the Community Level (Volume 22, Issue 12)

From Rhetoric to Reality: Consumer Engagement in 16 Multi-Stakeholder Alliances

Objective: A key component of the Aligning Forces for Quality (AF4Q) program was engaging consumers in their health and healthcare. We examined the extent to which the program’s 16 multi-stakeholder alliances embraced 4 areas of consumer engagement: self-management, the consumer friendliness of public reports of healthcare provider quality, the involvement of consumers in alliance governance, and the integration of consumers into quality improvement teams.

Methods: We used a largely qualitative approach. The evaluation team conducted 1100 in-depth interviews with alliance stakeholders. Two authors reviewed the consumer engagement data for each alliance to assess its level of embrace in the 4 consumer engagement areas. For consumer friendliness of public reporting websites, we also assessed alliance public reports for reading level, technical language, and evaluable displays. Population-level effects were also examined for self-management and public reporting.

Results: Consumer engagement was new to most alliances, and few had staff with consumer engagement expertise or existing consumer constituencies. For each area of consumer engagement, some alliances enthusiastically embraced the work, other alliances made a concerted but limited effort to develop programs, and a third group of alliances did the minimum work required. Integrating consumers into governance was the area most often embraced, followed by making public reports consumer friendly. Two alliances strongly embraced both self-management and integrating patients into quality improvement efforts. The AF4Q program’s self-management and public reporting efforts did not have greater population-level effects than those observed in a national comparison sample.

Conclusion: The AF4Q program sparked a few alliances to develop robust consumer engagement programming, while most alliances tried consumer engagement efforts for the first time and developed an appreciation for integrating consumer perspectives into their work.

Am J Manag Care. 2016;22:S403-S412

The concept of consumer engagement has received considerable attention in recent years, being described as “the blockbuster drug of the century” and a key factor for achieving the Triple Aim.1,2 The idea that consumers can help improve the efficiency of healthcare delivery by being more involved in their health and making more informed choices about treatments and providers has widespread, bipartisan appeal.3,4 However, what exactly is meant by the term has been questioned by many, including Munro, who wrote: “How we define it will determine whether it’s truly a miracle drug—or just another variant of age-old snake oil.”5-7

Because of the ambiguity in definition, 2 recent papers have sought to capture consumer engagement’s dimensions in a conceptual framework.8,9 Carman and colleagues, who served as technical assistance providers for the Aligning Forces for Quality (AF4Q) program, described 3 levels at which patients can be engaged: the direct care level, in which individuals become engaged in their own health and healthcare; the level of organizational design and governance for healthcare organizations; and finally, the policy-making level, when consumers are involved with federal, state, and local healthcare policy.8 At each level, the consumer role is viewed on a continuum, from consultation with consumers at the low end to partnership and shared leadership at the high end.

Mittler and colleagues, who were part of the AF4Q program evaluation team, further differentiated engagement activities at the direct-care level into 4 components.9 Two relate to health behaviors: self-management behaviors, which focus on managing chronic disease, and healthy behaviors, which are general health-promoting behaviors like healthy eating and engaging in physical activity. The other 2 relate to managing healthcare: healthcare encounter behaviors involve communicating effectively with healthcare providers, and shopping behaviors involve making informed choices to select high-quality (or high-value) healthcare providers and treatments.

This article examines how consumer engagement was implemented in the AF4Q program’s 16 multi-stakeholder coalitions (alliances). Specifically, we examine the approaches to implementing efforts in the 4 key areas of consumer engagement in the AF4Q program, which include examples of consumer engagement at the direct-care and healthcare organization governance levels, but not at the policy-making level. At the direct-care level, we examined efforts to improve (1) self-management and (2) shopping; at the healthcare organizational level, we examined efforts to involve individual consumers in (3) alliance governance and (4) healthcare quality improvement teams. The first 3 areas were required as part of the AF4Q program, while the fourth was voluntarily developed by several alliances. We examined how and to what degree the alliances embraced each of these 4 areas of consumer engagement and the factors related to their level of embrace; for the direct-care efforts, we also examined the extent to which the interventions had population-level effects.

Background

With the AF4Q initiative, the Robert Wood Johnson Foundation (RWJF) was early to embrace the concept of consumer engagement. The AF4Q program was premised on the idea that to improve the quality of patient care, consumers needed to be involved with providers and purchasers. Consumer engagement was seen as one of the key drivers for improving healthcare quality in the United States, along with performance measurement, public reporting, and quality improvement.10,11

Although RWJF viewed consumer engagement as essential to the AF4Q program, what exactly consumer engagement would look like within the initiative was less clear.12 The initial call for proposals required “substantial and credible” consumer representation in the alliance leadership.10 It also emphasized consumers’ use of information to make care decisions. To clarify how that would be operationalized, early in the AF4Q program, RWJF created a consumer engagement learning community in which alliances met, received technical assistance, and shared ideas. As part of the learning community, each alliance was expected to develop plans in 2 areas of direct-care consumer engagement: self-management and consumer use of public reports of provider quality.7 RWJF staff later wrote, “We let a thousand flowers bloom, believing that there was too little evidence about what works in consumer engagement to dictate one approach.”13

The AF4Q alliances’ early experience with consumer engagement was challenging. According to early evaluation findings, “Developing a concerted, coherent consumer engagement strategy has taken much more time than originally anticipated in virtually all of the sites.”7 The challenges were attributed to differing levels of enthusiasm for the work within alliances, a lack of existing evidence-based strategies, and difficulty reaching consensus on strategies within alliances.

In response to the challenges, RWJF began reworking its approach to consumer engagement in late 2008. Instead of supporting a “wide breadth of activities,” the Foundation decided to provide alliances with more structure and narrower consumer engagement expectations.14 In March 2009, the Foundation sent the alliances a memo clarifying that the principal goal for consumer engagement was raising consumers’ awareness and use of comparative healthcare performance information.14 Specifically, alliances were tasked with providing consumer-friendly public reports of healthcare provider performance and with using a range of strategies to help consumers access and use that information when shopping for care. The memo also reiterated the Foundation’s desire for consumer involvement in alliance governance. It further explained that the new consumer engagement expectations could be viewed as a “floor of activities” and not the entirety of what alliances could do. Thus, alliances could continue to implement programs in self-management or other areas of consumer engagement as long as they also disseminated consumer-friendly public reports on provider quality and involved consumers in governance.

Soon after the memo’s release, the alliances were required to report specifically on metrics related to the 2 main consumer engagement goals. For shopping, alliances had to report on the number of visits and page views for their online public reports of ambulatory provider quality performance. For integrating consumers into alliance governance, alliances had to document that at least 1 consumer (not a consumer advocate) was represented on the alliance’s leadership team or workgroup. As the grant reporting requirements became more process-oriented, the alliances were asked to describe major activities (eg, “to promote consumers’ use of health and comparative performance information in healthcare decision making” and “engaging individual consumers and consumer advocates in the work of [the] AF4Q [program]”).

Methods

We used a qualitative approach to examine the alliances’ experiences implementing programs in the 4 areas of consumer engagement. In addition, for consumer shopping for high-quality healthcare providers, we assessed the consumer friendliness of the alliances’ public reporting websites.

For the direct-care levels of consumer engagement, self-management and shopping, we also examined population-level changes over the early years of the AF4Q program using 2 rounds of the AF4Q Consumer Survey (2007-2008 and 2011-2012). (For New Mexico and Boston, which joined the AF4Q program later, the round 1 survey was conducted in 2010 and the round 2 survey in 2013/2014.) Details of the data, analysis, and results are presented in the online eAppendix. Notably, a third round of the consumer survey had been part of the evaluation team’s research design, but it was ultimately not funded. Without a third round, the analysis of population-level effects was limited to approximately the first half of the AF4Q initiative. However, our qualitative findings on the AF4Q program’s self-management efforts do not suggest that a greater impact on patient activation emerged later in the program. Similarly, we did not observe a substantial change in the consumer friendliness of the AF4Q public reporting websites after 2012, as detailed in the results section later in this paper.

Qualitative Component

As part of the AF4Q program evaluation, the evaluation team conducted 1100 qualitative interviews with alliance stakeholders over the program’s almost 10-year history. The qualitative interviews were conducted through a combination of periodic in-person site visits and annual or semi-annual telephone interviews. In each round of interviews, alliance stakeholders were asked about their consumer engagement work, and there were 2 rounds of interviews that largely focused on consumer engagement.

The interviews were audio-recorded, transcribed, and coded in Atlas.ti for the key AF4Q programmatic areas (eg, consumer engagement, quality improvement, and public reporting). For each alliance, data related to consumer engagement were retrieved and reviewed by 2 authors, and a detailed summary of how the alliance approached the 4 areas of consumer engagement (and any additional areas) was developed. This summary included qualitative data on alliance stakeholders’ attitudes and perceptions of the various areas of consumer engagement. The 2 authors then characterized each alliance’s level of embrace for each of the 4 areas of consumer engagement, as: (1) high embrace, in which the alliance made the work a key priority; (2) medium embrace, in which alliances made a concerted effort to develop programming, but in a more limited way; or (3) low embrace, in which alliances did the minimum required or “checked the box” in that area. We also identified cross-cutting themes, spanning all 4 areas, related to developing consumer engagement programming within the AF4Q initiative.

Assessment of the Consumer Friendliness of Public Reports

To assess the consumer friendliness of the alliances’ public reporting websites, we examined the diabetes quality metrics for ambulatory care at 3 time points: (1) summer 2010; (2) summer 2013; and (3) fall 2015. Diabetes metrics were selected because they were the most consistently publicly reported by alliances.15 We included the 14 alliances that reported ambulatory diabetes metrics in each of the 3 time periods (excluding West Michigan and Western New York, which last reported in 2011 and 2014, respectively). We analyzed Wisconsin’s consumer-oriented public report rather than the alliance’s main reporting website.

We used 3 measures of consumer friendliness related to providing simpler and more evaluable information to consumers, which were derived from the research literature available early in the AF4Q program.16-18 Specifically, we examined the reading level of the description of the diabetes indicators using the Flesch-Kincaid readability test.19,20 We evaluated the description of the diabetes indicators for technical language, categorizing them into technical (eg, “A1C test”), mixed (eg, “blood sugar [A1C] less than 7%”), and plain English (eg, “blood sugar control for adults with diabetes”). (A1C indicates glycated hemoglobin.) We also assessed whether performance was displayed using an evaluable approach (eg, stars or word icons) or if bar charts with percentages were used, which require greater cognitive processing.21
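
The Flesch-Kincaid grade level is a standard formula based on average sentence length and average syllables per word: 0.39 × (words/sentences) + 11.8 × (syllables/word) − 15.59. The sketch below is a minimal illustration of that computation; the naive vowel-group syllable counter is an assumption for demonstration, as production readability tools use dictionary-based syllable counts.

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count vowel groups, with a common
    adjustment for a silent trailing 'e' (approximation only)."""
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and not word.endswith("le") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/word) - 15.59"""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

# A plain-English indicator description scores near the grade levels
# reported for the alliance websites:
print(round(flesch_kincaid_grade(
    "Blood sugar control for adults with diabetes."), 1))
```

Short, common words and short sentences drive the score down, which is why plain-English descriptions such as “blood sugar control” score at a lower grade level than technical phrasing such as “A1C test.”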

We examined whether there was change in each consumer friendliness metric across the 14 alliances over the 3 time periods. Using 2015 data, we categorized the alliances based on how many of the following consumer friendliness thresholds they met: a reading level below ninth grade, use of plain English (rather than technical or mixed language), and use of evaluable icons.

Results

AF4Q Program Consumer Engagement Context

Consumer engagement was new to almost all of the AF4Q alliances, and few alliance staff had background or expertise in the area. Alliance staff explained: “It’s not our expertise,” “This is all new stuff that we’re kind of playing with because of the Robert Wood Johnson initiative,” and “I don’t think I know much about it, and I don’t think it’s very easy.” This was largely due to the AF4Q request for proposals’ program description and alliance selection criteria, which disproportionately focused on performance measurement and public reporting.10 Not only did the selection criteria require applicants to demonstrate familiarity with nationally recognized performance measures and have the capacity (or progress toward capacity) to collect performance information, but applicants also had to commit to publicly reporting performance information for half of their community’s primary care providers within 3 years. In contrast, the only specific requirement for consumer engagement was to have a leadership team with substantial consumer representation.10

Selecting alliances based on their commitment to public reporting meant that the organizations rarely had consumer engagement background or expertise, and few had existing consumer constituencies. For example, leaders of 2 alliances shared: “We are a physician-led, really not direct-to-consumer kind of an organization” and “We’re not looking to have a consumer constituency of our own.” Some alliances joined the AF4Q program very enthusiastic about developing programming in consumer engagement. One leader said, “When the grant came out we all said, ‘Oh, we need to do more of that consumer engagement thing, whatever that is.’” Others, however, described coming “kicking and screaming,” saying, “My god, how can we make a dent?”

These contextual factors made the development of consumer engagement within the AF4Q program challenging. The challenges were further exacerbated by having clearly defined public reporting goals from the program’s start without having similarly clear requirements for consumer engagement. One alliance leader described the dynamic: “What I was told, literally, is, ‘You need to bring up a public report within a year, and if you don’t, you’re not in the game anymore.’ So even though they were telling us to do consumer engagement…we knew that if we wanted to stay in, we had to focus on public reporting.” In contrast, alliance directors described the consumer engagement expectations as “unfocused,” “inconsistent,” and “not defined.” One alliance leader summed up the situation saying, “If we looked at what…the RWJF wanted us to achieve [in consumer engagement], it was not really clear. Plus, it was unfamiliar territory for us.”

Areas of Consumer Engagement

In the following sections, we describe how the alliances implemented each of the 4 key areas of consumer engagement. Specifically, we detail an illustrative alliance’s approach at each level of embrace (high, medium, and low), describe the factors related to the level of embrace, and, for the direct-care efforts, examine the population-level impact of the work.

Self-Management Education. Two alliances (Humboldt County, California, and south central Pennsylvania [SCPA]) strongly embraced chronic disease self-management programming throughout the AF4Q initiative. A third alliance (Western New York) also initially embraced self-management, but due to a change in strategic direction, it stopped self-management work at the end of 2013. The Humboldt County, California alliance, which was the exemplar in this area, decided early in the AF4Q program that “patient self-management is the main part of consumer engagement.” The alliance launched the Stanford Chronic Disease Self-Management Program in over 25 locations throughout the county.22 In this evidence-based program, patients with chronic conditions met 2.5 hours a week for 6 weeks with lay facilitators to interactively learn about topics such as handling frustration, decision making, and how to evaluate a new treatment. By the end of the AF4Q program, 1047 adults, or 1 out of every 103 adults in Humboldt County, graduated from the program (another 508 participated but did not complete the full program).23 A leader attributed this wide reach to building a “grass roots political campaign” to spread the program, including working closely with primary care physicians who would make referrals to the program like they would for a specialist.

These alliances were primed to develop programming in the self-management area. A Humboldt County leader described previously trying to run the Stanford program, but having “failed”: “You learn from the first failure about what works.” Both the Humboldt County and SCPA alliances had “very influential physician champions” supporting consumer engagement work. Humboldt County also had a volunteer consumer “thought leader” champion and the SCPA alliance hired senior staff with consumer engagement expertise. Both alliances credit the AF4Q initiative with enabling them to develop this programming: “I can’t thank them [RWJF] enough. It would not have happened without that.”

These 2 alliances were best positioned to make a population-level impact on patient activation, a term that refers to having the skills, knowledge, and confidence to manage one’s health and healthcare. We examined the population-level impact using 2 different analytical approaches (eAppendix). In the difference-in-differences regression model (eAppendix, Table 2), there was a small and significant improvement in activation in Humboldt County compared with what occurred in the comparison sample (an increase of 4.3 points vs 3.8 points, respectively, on a theoretical 100-point scale). (Fowles and colleagues have found that a 5-point difference in activation is the difference between those who engage in a healthy behavior and those who do not, so the overall increase from round 1 to round 2 is likely meaningful.24) However, in the fixed-effects model, no significant difference was detected (eAppendix, Table 1). Notably, no other alliances had significantly more improvement in activation than the comparison sample.
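
At its core, a difference-in-differences estimate is the pre-to-post change in the intervention community minus the pre-to-post change in the comparison sample. The sketch below illustrates only that arithmetic, using the Humboldt County figures reported above; the actual regression model in the eAppendix also adjusts for respondent characteristics, which this sketch omits.

```python
def diff_in_diff(treated_change: float, comparison_change: float) -> float:
    """Unadjusted difference-in-differences estimate: the change
    observed in the treated group minus the change observed in the
    comparison group over the same period."""
    return treated_change - comparison_change

# Humboldt County activation rose 4.3 points while the comparison
# sample rose 3.8 points (theoretical 100-point scale), yielding a
# 0.5-point unadjusted difference-in-differences estimate.
estimate = diff_in_diff(4.3, 3.8)
print(f"{estimate:.1f}")
```

The logic is that the comparison sample's 3.8-point gain captures secular trends (such as national attention to patient engagement), so only the 0.5-point excess gain is attributable to the local intervention.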

Most alliances (10) made concerted efforts to develop educational programs in self-management that were typically limited in their reach and/or intensity. The Memphis alliance, for example, published a weekly article in the local newspaper from 2007 to 2014 that focused on health and healthcare-related topics, including managing chronic conditions. For a number of years, the alliance also convened a Neighborhood Outreach Workgroup (NOW) that brought together a handful of community leaders in 4 lower- and middle-income neighborhoods. One leader explained: “NOW was much more door-to-door, neighbor-to-neighbor, sitting in somebody’s house talking about 10 people at a time—what you need to do about your health.” An alliance leader described the strengths and weaknesses of what they accomplished: “We’ve created a lot of awareness, but in terms of creating activation, there’s still a lot of work.”

Other medium-embrace alliances worked through employers to reach consumers, “partner[ed] with organizations that are more consumer facing than we are,” and used their websites to reach consumers. Among the approaches taken were creating pilot educational projects, conducting patient empowerment training, and e-mailing health-related communications to consumers. The medium-embrace alliances took the work seriously, but it was often aimed at a narrow group of consumers (“scale is an issue”), not always a good organizational fit (“it’s sort of been an add-on more than the core of what we do”), and often not sustained during the entire AF4Q program. In fact, 4 alliances abandoned their self-management educational efforts after RWJF’s 2009 memo clarified its interest in efforts to encourage consumer use of public reports. One leader explained that self-management was new to their organization, and because their self-management work didn’t “seem to be what the Foundation’s interested in…we shifted gears.” This same alliance, however, reported that the organization was fundamentally “transformed” by the consumer engagement work: “If we looked back before our work with Aligning Forces…it was just assumed that you were meeting the needs of patients. But I would say now there’s much more direct conversation around what is it that the patients need, how do we ensure that we understand it...”

Finally, 3 alliances minimally embraced the area of self-management. One alliance made posters about diabetes and distributed them to physician practices and another tried several approaches, including scheduling classes on taking control of your healthcare, but “didn’t have much uptake.” One of these alliances stopped its consumer engagement work altogether in the final phase of the AF4Q program, explaining, “I think what we struggled with, as we did with all of these consumer pieces, [was] where we were doing almost—I would say—one-shot type of things that were good. But the long-term sustainability of pulling it together under a consumer rubric was what we found challenging.”

Consumer Involvement in Alliance Governance. This area was most commonly embraced by AF4Q alliances. Half of the alliances enthusiastically incorporated individual consumers in alliance governance, 4 of which changed their bylaws to require 1 or more consumers on the lead organization’s board. The Boston alliance was an exemplar in this area. In November 2011, in response to AF4Q program expectations and the strong voice of some existing involved consumers, the alliance established a board-level consumer council on par with its existing physician and health plan councils. Further, they amended their bylaws so that 2 consumer council members could serve on the organization’s board. An alliance leader described these accomplishments as the alliance’s most successful element of consumer engagement: “I think that’s huge in setting the stage for [the organization] going into the future, that all of our work [is] not only collaborative, but it absolutely includes the patient’s voice.”

Three alliances moderately embraced this area, making a serious effort to involve consumers in governance, but the concept did not permeate the organization. One alliance, for example, convened a consumer workgroup that met 3 times a year and worked on specific deliverables, such as providing feedback on public reports. While the alliance had a consumer join the board, it did not happen until quite late in the AF4Q program.

Five alliances minimally embraced inclusion of consumers in governance, doing what was required by the AF4Q program, but not necessarily buying into the concept. One leader described putting a consumer on the leadership team when it was required in 2010: “We’ve got a terrific person. And she’s very dedicated to the program. I’m not sure how much she gains or we gain from it, but we’re happy to have her.” Others described having purchasers or employers serve as their consumers (“they [the consumers] tend to kind of look and feel more like a purchaser in terms of their level of understanding”). Another alliance in this group, however, described gaining an appreciation for the consumer perspective from trying to have a consumer on the board. Although they ultimately decided not to have a consumer serve on the board (related to challenges of empowering consumers and getting them up to speed on the issues), the alliance’s staff interviewed consumers to bring consumer perspectives to board meetings when needed.

Consumer Involvement in Healthcare Providers’ Quality Improvement Teams. The 2 alliances that strongly embraced self-management also strongly embraced Patient Partner programs, which integrate consumers into ambulatory care quality improvement teams. Although this area of consumer engagement was not required by the AF4Q program, both alliances, interestingly, credited their efforts in this area as being their biggest consumer engagement success. One leader said, “It was a huge game changer to get the patients into the room during the [practice’s quality improvement] collaborative,” and another explained, “It has made, I think, a philosophical change for our community and our healthcare system.”

The SCPA alliance learned about the Humboldt County alliance’s Patient Partner program in 2010, and the leadership was struck by the concept of including patients in physician practices’ quality improvement efforts. A leader explained, “It was sort of, wow, if we want patient-centered medical home[s], they better be patient-centered, and how could they ever be patient-centered if you’re not having patients being involved in the development of the work and the changes?” In 2011, the SCPA alliance began requiring practices participating in their quality improvement program to bring 2 patient partners onto the quality improvement team. This effort grew from 14 patient partners in 7 primary and specialty care practices in 2011 to 70 partners in 58 practices in June 2016 (more than a year after the AF4Q program ended). The SCPA alliance staff and steering committee trained the patient partners, held monthly webinars or face-to-face meetings to support them, and troubleshot with practices when problems arose.

Three other alliances also created programs similar to Humboldt County’s and SCPA’s Patient Partner programs. One required patient-centered medical home pilot practices to include 2 or more patient or family members in their redesign efforts, although the alliance did not work directly with the consumer members. Another alliance had a short-term program, and a third started a small effort in the final year of the AF4Q program. The remaining 11 alliances chose not to integrate patients into quality improvement efforts.

Shopping: Consumer Friendliness of Public Reports. The Table shows the metrics for consumer friendliness of the alliances’ public reporting websites from 2010 to 2015. There was no substantial improvement in consumer friendliness over the time period. While the number of alliances that used technical language to describe diabetes metrics decreased from 4 to 2, the average reading level increased slightly from a 7.6 to an 8.1 grade level, and only 1 alliance switched from using percentages to evaluable icons.

In 2015, only 2 alliances, Maine and Oregon, met all 3 criteria for consumer friendliness (a reading level below ninth grade, plain English descriptions of quality indicators, and evaluable icons) (eAppendix, Table 3). Another 5 alliances (Boston, Minnesota, Washington, Kansas City, and Humboldt County) met 2 of the 3 criteria. Of these 7 alliances, 5 were publicly reporting ambulatory care quality or were in the process of doing so when the AF4Q program began. One alliance leader explained that when their existing site fared poorly in RWJF testing with consumers, they were motivated “to totally revamp the website.” Another described the AF4Q initiative as pushing them “to put equal weight into getting information to consumers…so this is huge, and I don’t think it would have happened if it hadn’t been for Aligning Forces.”

The consumer friendliness of these alliances’ websites, however, did not result in significantly larger increases in consumer use of doctor or hospital public reports in these communities compared with the national comparison sample (Greene J, et al; unpublished manuscript). There was, notably, an increase in awareness of physician quality reports in the AF4Q communities (from 13.3% to 16.8%, 3.5 percentage points); however, it was no larger than the gain in the comparison sample (from 11.5% to 17.2%, a 5.7-percentage point gain). No AF4Q community increased consumer use of public reports significantly more than what was observed in the comparison sample over the first half of the AF4Q initiative, the period covered by the consumer survey data.

Six alliances met one of the consumer friendliness criteria and 1 alliance did not meet any of the criteria (2 alliances were not reporting in 2015). Among these alliances with less consumer friendly websites, some leaders described facing barriers from physicians: “So, for example, we know that best practices are not just simply to give data to consumers and have them draw their own conclusions. Well, on our website, we give data to consumers and they have to draw their own conclusions unfortunately…we have not been able to get stakeholder agreement around that [labeling providers as above average, average, or below average].” Other alliances cited the high cost of redesigning their website as a barrier to making it more consumer friendly. One leader said, “I’d like to have us move toward that direction [ie, consumer friendly design], but we didn’t budget for those kinds of updates and changes, so we’re continuing to use what we’ve got.”

Finally, the Wisconsin alliance met only 1 of the consumer friendliness criteria; however, it created a separate consumer-oriented website with cartoon stories about the importance of quality measures and taking charge of one’s healthcare.25 This alliance, which was publicly reporting quality prior to the AF4Q program, decided it “needed to do something different that really met people where they were at.” Despite developing a consumer-focused website that was generally very accessible to consumers, the alliance utilized some display approaches, like bar charts, that have been shown to be challenging for consumers to process, particularly those with low numeracy skills.26,27

Discussion

Consumer engagement in the AF4Q program was challenging because few alliances had consumer engagement expertise or existing consumer constituencies, the Foundation’s consumer engagement expectations were not always specific, and there was a limited evidence base on effective interventions. Despite these obstacles, there were a number of successes in the AF4Q program’s consumer engagement efforts. Perhaps most notable is that the majority of the alliances described the AF4Q initiative as sparking a cultural shift within their organizations, in which the consumer perspective became more important and integrated into many aspects of the alliances’ work. Instead of assuming that the alliance was meeting the needs of patients, alliances came to recognize the importance of seeking out individual consumers and learning from their points of view. This shift occurred within alliances that strongly embraced multiple elements of consumer engagement, but it also emerged within alliances that did not feel it was their role to run self-management programs or have consumers on their organization’s board. This shift has positioned the alliances to integrate consumer perspectives into their future work.

Another key success was spurring 2 alliances, Humboldt County and SCPA, to embrace self-management programming and develop Patient Partner programs that incorporated consumers into quality improvement efforts in ambulatory care settings. The AF4Q program’s support enabled these alliances to develop robust consumer engagement efforts, learn from each other, and inspire other AF4Q alliances and other organizations nationally. These alliances were primed to develop consumer engagement programming: they had strong leadership support and prior expertise in, or attempts at, consumer engagement. The resources from the AF4Q program enabled them to dramatically grow their programming and commitment to consumers. In Humboldt County, the alliance’s efforts may have made a population-level impact: because the county has a relatively small population, the 6-session self-management program was completed by 1 of every 103 adults.

The AF4Q initiative prompted some success in creating consumer friendly public reports of provider quality performance. Five of the 6 alliances that were reporting ambulatory care quality, or en route to doing so, prior to the AF4Q program created websites that adhered to consumer friendly design principles, and the sixth alliance created an innovative cartoon guide helping consumers understand the importance of quality measures. Alliances new to public reporting described being very focused on the challenges of simply producing a public report, without the bandwidth, resources, or stakeholder commitment (and, in some cases, interest) to make it consumer friendly. Unfortunately, however, the consumer friendliness of the websites did not translate to increased consumer use of comparative quality information—at least over the time period covered by our data. Many alliance leaders noted that public reporting did not follow the maxim from the movie Field of Dreams: “If you build it, [they] will come.” Few alliances had consumer constituencies that they could draw to the website, and efforts with Consumer Reports in 3 communities suggested that working with established consumer organizations was a more effective way to reach consumers (Greene J, et al; unpublished manuscript).

The AF4Q program also helped advance the literature on consumer engagement. Technical assistance providers, the evaluation team, the national program office staff, alliance staff, and Foundation staff have all published articles that have been important in defining and describing exactly what is meant by “consumer engagement.”8,9,28-32 These publications range from academic papers, including Carman and colleagues’ article that had 155 citations 3 years after publication, according to Google Scholar, to practitioner-focused “toolkits” to help other organizations learn from the experiences of those participating in the AF4Q program.8

Limitations

The findings reported here should be interpreted in light of the study’s limitations. A key limitation is that we do not have outcome data on the consumer engagement efforts at the organizational and governance levels, which integrated consumers into alliance governance and healthcare quality improvement teams. Although we found that integrating consumers into alliance governance was the most commonly embraced area of consumer engagement, we were not able to objectively measure its impact on alliance programming. Another limitation (previously mentioned) is that our population-level outcome assessments for self-management and public reporting covered only the first half of the AF4Q program. Additionally, the results published in this article focus exclusively on the 4 key areas of consumer engagement, so we do not capture efforts by individual alliances that fell outside of these 4 areas (eg, the Humboldt County, California, alliance created a patient committee to study surgical rate variation in the community).

Conclusion

The AF4Q initiative pushed the field of consumer engagement and spurred alliances, many of which would not otherwise have done so, to try implementing consumer engagement programs. A few alliances excelled in this area, and others developed a new awareness about the consumer role and, in some cases, a new expertise that they likely would not have developed without the AF4Q program (“It’s been helpful, but I don’t think that we would have on our own headed down that path.”). Still, other alliances were only minimally touched by consumer engagement. Given that the alliances were largely inexperienced in consumer engagement, and that consumer engagement work was not a natural organizational fit for many alliances, it is notable that there were important accomplishments in this area. The shortcomings were due to the specific context of the AF4Q program and should not be viewed as a test of whether consumer engagement efforts can have a widespread impact.

To effectively build consumer engagement efforts, particularly direct-care level efforts like self-management, this study suggests that consumer-facing organizations with existing expertise and commitment to consumer engagement should be tapped. This is the approach that the federal Administration on Aging has taken, awarding grants to community agencies and state health departments to implement the Stanford Chronic Disease Self-Management Program.33 In contrast, the AF4Q initiative sought to build consumer engagement capacity in organizations that often had no background in this area.

The underlying programmatic theory was that to improve the quality of care, AF4Q alliances needed to work in multiple sectors, including consumer engagement, quality improvement, and public reporting of provider performance. Whereas involving all sectors may be necessary for improving quality of care communitywide, this evaluation suggests that it may not be reasonable to expect organizations to lead efforts in programmatic areas that are far outside their key areas of expertise, even with generous technical assistance and financial support. Instead, future funders may want to consider helping organizations and alliances across multiple sectors to advance their unique contributions and capitalize on their strengths, as they work together to improve the quality of care for the whole community.

Acknowledgments

We would like to thank the Robert Wood Johnson Foundation for their generous support of the AF4Q program evaluation. Additionally, we would like to thank all of the AF4Q program stakeholders for sharing their experiences and thoughts about the AF4Q initiative with the evaluation team.

Author affiliations: School of Public Health, University of Minnesota, Minneapolis, MN (JBC); Center for Health Care and Policy Research, Penn State University, University Park, PA (DCF, DPS); School of Nursing, George Washington University, Washington, DC (JG); Health Policy and Administration, Penn State University, University Park, PA (DPS, YS).

Funding source: This supplement was supported by the Robert Wood Johnson Foundation (RWJF). The Aligning Forces for Quality evaluation is funded by a grant from the RWJF.

Author disclosures: Dr Christianson, Ms Farley, Dr Greene, Dr Scanlon, and Dr Shi report receipt of grants from RWJF. Dr Greene reports meeting or conference attendance on behalf of Insignia Health. Dr Scanlon reports meeting or conference attendance on behalf of RWJF.

Authorship information: Concept and design (JBC, JG, DPS); acquisition of data (JBC, JG, DPS, YS); analysis and interpretation of data (JBC, DCF, JG, DPS, YS); drafting of the manuscript (JG, DPS); critical revision of the manuscript for important intellectual content (JBC, DCF, JG, DPS, YS); statistical analysis (YS); obtaining funding (DPS); administrative, technical, or logistic support (DCF, JG); and supervision (JG).

Address correspondence to: jessgreene@gwu.edu.

REFERENCES

1. Dentzer S. Rx for the ‘blockbuster drug’ of patient engagement. Health Aff (Millwood). 2013;32(2):202. doi:10.1377/hlthaff.2013.0037.

2. Chase D. Patient engagement is the blockbuster drug of the century. Forbes website. www.forbes.com/sites/davechase/2012/09/09/patient-engagement-is-the-blockbuster-drug-of-the-century/. Published September 9, 2012. Accessed May 10, 2016.

3. The White House. Fact sheet: health care transparency: empowering consumers to save on quality care. White House website. http://georgewbush-whitehouse.archives.gov/news/releases/2006/08/20060822.html. Published August 22, 2006. Accessed May 11, 2016.

4. United States Government. The Patient Protection and Affordable Care Act. H.R.3590. Section 1311. US Government Publishing Office. www.gpo.gov/fdsys/pkg/BILLS-111hr3590enr/pdf/BILLS-111hr3590enr.pdf. Published January 5, 2010. Accessed May 11, 2016.

5. Munro D. Patient engagement: blockbuster drug or snake oil? Forbes website. www.forbes.com/sites/danmunro/2013/08/17/patient-engagement-blockbuster-drug-or-snake-oil/. Published August 17, 2013. Accessed May 11, 2016.

6. Barello S, Graffigna G, Vegni E, Bosio AC. The challenges of conceptualizing patient engagement in health care: a lexicographic literature review. Journal of Participatory Medicine website. www.jopm.org/evidence/reviews/2014/06/11/the-challenges-of-conceptualizing-patient-engagement-in-health-care-a-lexicographic-literature-review/. Published June 11, 2014. Accessed May 11, 2016.

7. Hurley RE, Keenan PS, Martsolf GR, Maeng DD, Scanlon DP. Early experiences with consumer engagement initiatives to improve chronic care. Health Aff (Millwood). 2009;28(1):277-283. doi:10.1377/hlthaff.28.1.277.

8. Carman KL, Dardess P, Maurer M, et al. Patient and family engagement: a framework for understanding the elements and developing interventions and policies. Health Aff (Millwood). 2013;32(2):223-231. doi:10.1377/hlthaff.2012.1133.

9. Mittler JN, Martsolf GR, Telenko SJ, Scanlon DP. Making sense of “consumer engagement” initiatives to improve health and health care: a conceptual framework to guide policy and practice. Milbank Q. 2013;91(1):37-77. doi:10.1111/milq.12002.

10. Robert Wood Johnson Foundation. Aligning Forces for Quality: The Regional Market Project. Request for Proposals. 2006.

11. Scanlon DP, Beich J, Alexander JA, et al. The Aligning Forces for Quality initiative: background and evolution from 2005 to 2012. Am J Manag Care. 2012;18(suppl 6):S115-S125.

12. Robert Wood Johnson Foundation. Frequently Asked Questions. RMP Teleconferences. 2006.

13. Gibbons CB, Weiss AF. Creating and sustaining change: early insights from Aligning Forces. Am J Manag Care. 2012;18(suppl 6):S96-S98.

14. Robert Wood Johnson Foundation. Consumer Engagement Activities (letter to AF4Q Alliances). 2009.

15. Christianson JB, Volmar KM, Shaw BW, Scanlon DP. Producing public reports of physician quality at the community level: the Aligning Forces for Quality initiative experience. Am J Manag Care. 2012;18(suppl 6):S133-S140.

16. Peters E, Dieckmann N, Dixon A, Hibbard JH, Mertz CK. Less is more in presenting quality information to consumers. Med Care Res Rev. 2007;64(2):169-190.

17. Hibbard JH, Greene J, Daniel D. What is quality anyway? Performance reports that clearly communicate to consumers the meaning of quality of care. Med Care Res Rev. 2010;67(3):275-293. doi:10.1177/1077558709356300.

18. Hibbard JH, Peters E. Supporting informed consumer health care decisions: data presentation approaches that facilitate the use of information in choice. Annu Rev Public Health. 2003;24:413-433.

19. Cotugna N, Vickery CE, Carpenter-Haefele KM. Evaluation of literacy level of patient education pages in health-related journals. J Community Health. 2005;30(3):213-219. doi:10.1007/s10900-004-1959-x.

20. Eltorai AE, Sharma P, Wang J, Daniels AH. Most American Academy of Orthopaedic Surgeons’ online patient education material exceeds average patient reading level. Clin Orthop Relat Res. 2015;473(4):1181-1186. doi:10.1007/s11999-014-4071-2.

21. Kurtzman ET, Greene J. Effective presentation of health care performance information for consumer decision making: a systematic review. Patient Educ Couns. 2016;99(1):36-43. doi:10.1016/j.pec.2015.07.030.

22. Chronic Disease Self-Management Program (Better Choices, Better Health Workshop). Stanford Medicine website. http://patienteducation.stanford.edu/programs/cdsmp.html. Accessed May 11, 2016.

23. Population of Humboldt County, California: Census 2010 and 2000 interactive map, demographics, statistics, graphs, quick facts. CensusViewer website. http://censusviewer.com/county/CA/Humboldt. Accessed May 11, 2016.

24. Fowles JB, Terry P, Xi M, Hibbard J, Bloom CT, Harvey L. Measuring self-management of patients’ and employees’ health: further validation of the Patient Activation Measure (PAM) based on its relation to employee characteristics. Patient Educ Couns. 2009;77(1):116-122. doi:10.1016/j.pec.2009.02.018.

25. MyHealthWisconsin website. http://myhealthwi.org/Resources/GettingGoodCare.aspx. Accessed May 11, 2016.

26. Gerteis M, Gerteis JS, Newman D, Koepke C. Testing consumers’ comprehension of quality measures using alternative reporting formats. Health Care Financ Rev. 2007;28(3):31-45.

27. Mason D, Boase S, Marteau T, et al. One-week recall of health risk information and individual differences in attention to bar charts. Health Risk Soc. 2014;16(2):136-153. doi:10.1080/13698575.2014.884544.

28. Osborne-Stafsnes J. 4 pillars of patient engagement. Becker’s Healthcare website. www.beckershospitalreview.com/quality/4-pillars-of-patient-engagement.html. Published June 12, 2014. Accessed May 11, 2016.

29. Mende S, Roseman D. The Aligning Forces for Quality experience: lessons on getting consumers involved in health care improvements. Health Aff (Millwood). 2013;32(6):1092-1100. doi:10.1377/hlthaff.2012.1079.

30. Roseman D, Osborne-Stafsnes J, Amy CH, Boslaugh S, Slate-Miller K. Early lessons from four “aligning forces for quality” communities bolster the case for patient-centered care. Health Aff (Millwood). 2013;32(2):232-241. doi:10.1377/hlthaff.2012.1085.

31. Robert Wood Johnson Foundation. Engaging patients in improving ambulatory care. Aligning Forces for Quality website. http://forces4quality.org/af4q/download-document/6251/Resource-engaging_patients_in_aqi_compendium_final.pdf. Published February 2013. Accessed May 11, 2016.

32. American Institutes for Research. Best practices for engaging consumers in organization leadership: lessons learned from Aligning Forces for Quality communities. Aligning Forces for Quality website. http://forces4quality.org/af4q/download-document/11185/Resource-0-028-af4q_best_practices_for_engaging_consumers_in_organizational_leadership_final.pdf. Published January 2015. Accessed May 11, 2016.

33. Administration on Aging (AoA). Chronic Disease Self-Management Education Programs. Administration for Community Living website. www.aoa.gov/AoA_Programs/HPW/ARRA/PPHF.aspx#purpose. Accessed May 11, 2016.
