The American Journal of Managed Care
This study examines the impact of geographically limited disasters on health care quality performance scores of Medicare Advantage contracts, finding limited impact on performance scores.
ABSTRACT
Objective: To describe the effect of geographically limited disasters on health plan (ie, contract) quality performance scores using a broad set of clinical quality and patient experience measures.
Study Design: Retrospective analyses to assess the impact of disasters on Medicare Advantage contracts’ quality-of-care performance scores in 2017 and 2018 for 11 Part C clinical quality and patient experience measures used in the Medicare Advantage Star Ratings.
Methods: We calculated each Medicare Advantage contract’s disaster exposure using the percentage of the contract’s beneficiaries residing in a Federal Emergency Management Agency–designated disaster area during the measurement period. Using linear mixed models, we estimated the association between contract-level disaster exposures and performance scores during the performance period measured, with random effects for contract and fixed effects for year, contract characteristics, and the disaster exposure, using repeated cross-sectional data on contracts from 2016 to 2018.
Results: We found no evidence that geographically limited disasters meaningfully affected contract quality performance scores. The disasters studied were associated with statistically significant but small changes in performance scores for 1 of 11 measures in both years.
Conclusions: The lack of evidence that being in a disaster-affected area had a meaningful negative impact on quality measure performance suggests that performance measurement programs are robust to the impact of short-term localized disasters and continue to function as intended.
Am J Manag Care. 2025;31(2):In Press
Disasters, whether natural or man-made,1 may impact care delivery and, in turn, the quality of care received by individuals. Disasters could impact health care delivery in multiple ways: They may lead to more people seeking urgent or emergent care, straining the health care system’s ability to provide routine ambulatory chronic disease management or preventive care services; they may displace individuals, rendering them unable to receive care from their usual community providers and requiring them to establish new relationships with providers; individuals may leave needed medications behind while evacuating, which can exacerbate chronic diseases; disasters can cause power outages, limiting the ability of hospitals and medical facilities to operate at full capacity or provide their typical level of care; and entire medical facilities that provide ambulatory care services may temporarily close, requiring individuals to find new service locations.2
Natural disasters are increasing in frequency and intensity, partially as a result of climate change.3,4 Disasters such as hurricanes, earthquakes, floods, and fires could impact the quality of care delivered in multiple ways. This is particularly true for ambulatory care providers, which have less infrastructure and surge capacity than hospitals. If individuals are displaced or unable to remain in their homes, they may be less likely to get preventive care services or fill their medications on time. Strain on the health care system and damages to infrastructure may force patients to wait longer than usual to receive routine services, impacting both their utilization and experiences of health care. This in turn can impact performance scores because quality measures used in performance accountability initiatives include patient experience and clinical process and outcome measures.5 The impact disasters might have on ambulatory care delivered and on quality measure scores used in performance accountability initiatives (which predominantly measure ambulatory care) is not well understood. A better understanding of the effects of disasters on clinical care delivery has implications for value-based payment systems and other pay-for-performance programs that measure, hold accountable, and reward or penalize providers for their quality performance.
Even short-term disasters could have long-lasting impacts on the availability of health care in the affected communities. The 2017 Tubbs Fire shut down every Sutter Health facility in Sonoma County, California, and although most facilities had reopened within 2 weeks, the largest Sutter Health medical office building in the county, seeing more than 700 patients per day, was badly damaged and remained closed for months following the fire.6 A full 4 years after the Camp Fire in Paradise, California, the town still lacked a fully staffed hospital.7,8 A review of the common themes from natural disasters (mostly hurricanes and earthquakes) that directly affect hospitals and health systems via damage to physical infrastructure or degradation of resources found that common barriers to successful disaster response include loss of power, water, heating and ventilation, communications, health information technology, staffing, supplies, and safety and security as well as structural and nonstructural damage.9 Analyses focused on the long-term impacts of climate-related weather disasters found that severe disasters lead to a loss of health care facilities.10 Sponsors of performance measurement programs have taken actions in recent years to avoid penalizing plans and providers affected by major disasters. CMS’ Medicare Advantage (MA) and Part D Star Ratings program scoring methodology provides a hold-harmless rule for contracts affected by major disasters; contracts with service areas covering counties that were both Federal Emergency Management Agency (FEMA)–declared major disaster Individual Assistance counties11 and where the HHS secretary declared a public health emergency12 are given the higher of the prior- or current-year measure star as the final star and score for the current year for most measures. 
This hold-harmless rule is applied when contracts have at least 25% of beneficiaries residing in the identified Individual Assistance counties at the time of a qualifying disaster, hereafter identified as disaster-affected areas.
Some studies have examined emergency department visits for disaster-induced injuries and the immediate surge capacity that accompanies a disaster.13,14 However, few studies have examined the broader impact of disasters on an array of quality-of-care measures in an ambulatory care setting following a disaster. Geographically limited disasters are common and increasing in frequency; in this article, we explore the impact of geographically limited disasters on quality performance scores and the implications for performance accountability programs.
Having a better understanding of how disasters affect performance scores could help sponsors of performance measurement and accountability programs improve whether and how these programs account for disasters when measuring provider performance during the disaster-affected measurement period. To that end, we explore the impact of geographically limited disasters on the annual performance scores on a broad set of ambulatory quality-of-care measures used to assess MA contract performance to better understand which areas of quality performance may be meaningfully affected by disasters. Such information is vital to policy makers and the sponsors of performance measurement programs—who need to understand the implications of holding providers accountable for achieving high performance (eg, through value-based payment and public reporting) during disruptions that may affect health care delivery—and for measuring performance accurately.
To evaluate the impact of disasters on quality performance, we examined changes in performance scores that are measured and reported annually for consumers and plan sponsors and are used to determine Quality Bonus Program payments and rebates to MA contracts. We assessed the effects as a function of the percentage of health plan enrollees residing in disaster-affected areas. We used publicly available national performance data from MA contracts (ie, health plans) from 2016 to 2018 (a pre–COVID-19 pandemic period during which disaster policies were in effect for MA), examining performance on a set of widely used, nationally endorsed quality-of-care measures (process, patient experience, outcome, and intermediate outcome measures).
METHODS
We used MA quality measure scores representing care delivered in 2016 through 2018, using 2016 as the baseline year (ie, comparison year, prior to the implementation of the disaster policy in the MA program, which began in 2017). Specifically, we estimated the relationship between contract-level disaster exposure and quality measure performance scores in 2017 (the year of implementation of the disaster policy in the MA program), with 2016 as a baseline or comparison year, and repeated the analysis for 2018 performance scores (with 2017 as the baseline year) to draw conclusions about the impact of disasters on quality measure performance scores.
We used CMS-supplied information on disaster exposure to conform with CMS’ disaster-designation definition for the MA and Part D Star Ratings program.15 CMS calculates each contract’s disaster exposure as the percentage of MA contract enrollees who resided in a disaster-affected area, specifically a FEMA-designated Individual Assistance county11 where the HHS secretary declared a public health emergency and issued a Section 1135 waiver12 (waivers are issued to ensure that during an emergency, sufficient health care items and services are available)16 during the performance year.15,17 Calculating the percentage of MA contract enrollees who resided in disaster-affected service areas during the measurement year required a list of disaster-affected counties, the time frame for each disaster (eg, start and end dates of the incident), and enrollment totals by county within the service areas covered by the contract and each disaster.15 Enrollment counts associated with each disaster come from the MA monthly enrollment file that most closely matches the disaster period (information obtained from Identification of Contracts Affected by Disaster in Star Ratings Technical Notes).15 If a contract was affected by multiple disasters with different time frames in the performance year, the mean of each time frame’s enrollment counts was calculated. The number of enrollees in disaster-affected service areas was divided by the total number of enrollees to produce the percentage of enrollees residing in a disaster-affected service area for each contract.
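The exposure calculation described above can be sketched in code. This is an illustrative reconstruction, not CMS' implementation; the function name and data structure are hypothetical, but the logic follows the description: average affected-enrollment counts across disasters with different time frames, then divide by total enrollment.

```python
# Hypothetical sketch of the contract-level disaster-exposure calculation
# described above; field names and structures are illustrative, not CMS'.

def disaster_exposure(disasters, total_enrollment):
    """Percentage of a contract's enrollees residing in disaster-affected counties.

    `disasters` maps each disaster's time frame to the contract's enrollment
    in that disaster's Individual Assistance counties, taken from the monthly
    enrollment file closest to each disaster period.
    """
    if not disasters:
        return 0.0
    # Multiple disasters with different time frames in the performance year:
    # take the mean of each time frame's affected-enrollment counts.
    mean_affected = sum(disasters.values()) / len(disasters)
    return 100.0 * mean_affected / total_enrollment

# Example: a 10,000-member contract affected by two disasters in one year.
exposure = disaster_exposure(
    {"2017-08-25/2017-09-15": 4000, "2017-10-08/2017-10-31": 2000},
    total_enrollment=10_000,
)
print(round(exposure, 1))  # mean(4000, 2000) / 10,000 -> 30.0
```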
Our analysis thus covers the 3 most recent years of prepandemic data. The use of annual performance scores is inherent to the policy question we are asking, as scores used in accountability initiatives such as the MA Star Ratings, Hospital Value-Based Purchasing Program,18 and the National Committee for Quality Assurance’s Quality Compass19 routinely are annual calculations from patient data covering periods of 1 or more years.
Outcome Measures
Thirty-four Medicare Part C quality-of-care measures were used to measure contract performance in at least 1 of the measurement years 2016 through 2018 in MA Star Ratings. From this set of 34 measures, we selected a broad range of ambulatory care services and measures of access to care and care coordination for analysis. Measures were initially screened for inclusion based on factors described in eAppendix Table 1 (eAppendix available at ajmc.com). We focused on measures that correspond to care delivered by ambulatory care providers or care providers in the disaster-affected communities. Because financial incentives are tied to performance on measures included in Star Ratings, we examined the effects on measures that were in the Star Ratings for all 3 years in the measurement period. This excluded 3 measures that transitioned into or out of the Star Ratings over the 3-year measurement period. We also excluded 2 Medicare Health Outcomes Survey measures that required data collected across 2 years, 2 measures that had a look-back period exceeding 12 months (colorectal cancer and breast cancer screening), and 4 measures that applied only to Special Needs Plans. Seven additional measures were excluded because they focused on non–care delivery administrative functions of the plan, such as back-office operations and customer service; MA contracts typically provide these administrative services from a centralized location outside the disaster-affected area. Of the 16 remaining measures, 2 were excluded because they were topped out, with average performance at or above 95% of the maximum score for all 3 years; 1 was excluded because fewer than half of MA contracts eligible for Star Ratings had measure data across all 3 years; and 2 were excluded because they were similar to other included measures.
Table 1 shows the final set of 11 measures and the number of MA contracts included in the analyses. Measurement periods ranged from 5 months to 12 months, as shown in eAppendix Table 1. We hypothesized that disasters would most affect performance on measures that require in-person clinical or preventive care (such as Diabetes Care–Eye Exam) or are time-sensitive (such as Getting Care Quickly), where temporary disruptions in care delivery would affect receipt of recommended care.
Statistical Methods
We estimated a linear mixed model with an explanatory variable representing the proportion of beneficiaries affected by a disaster in 2017, an indicator variable for measurement year 2017, other contract characteristics as covariates, and random intercepts for contract, taking the form:
y_it = α + γ_i + β_1·1(t = 2017) + β_2·p_it·1(t = 2017) + β_3·X_it + ε_it
where y_it is the outcome (ie, 1 of the 11 measures of health care quality) observed for contract i in measurement year t; p_it is the proportion of beneficiaries affected by disasters in contract i and year t; and γ_i ~ N(0, σ²) is a random intercept for contract i that accounts for contract-level differences in performance and the correlation across years within contracts. The indicator function 1(t = 2017) equals 1 if t = 2017 and 0 otherwise. The coefficient β_1 is the expected change in the outcome across all contracts from 2016 to 2017 in the absence of a disaster. The coefficient β_2 is the disaster effect of interest, capturing the impact of the disaster on performance (β_2 is the expected impact on performance for a contract that is 100% affected, relative to that contract being unaffected). We used the same model to test the hypotheses that 2018 disasters affected performance on these measures, using 2017 performance as the baseline for comparison. The model also accounts for other contract characteristics that may affect performance, which are encoded in X_it and include contract size (ie, number of enrolled beneficiaries) and contract type (eg, Coordinated Care Plan, Medical Savings Account, Private Fee for Service, Special Needs Plan, 1876 Cost Plan).
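The identification of the disaster coefficient can be illustrated on simulated data. The analysis fits a linear mixed model with a random intercept per contract; as a minimal self-contained sketch (not the authors' code), the example below instead uses within-contract demeaning (a fixed-effects approximation to the random intercept) and ordinary least squares. All quantities (200 contracts, a true disaster effect of −3 points) are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_contracts, beta1, beta2 = 200, 0.5, -3.0  # simulated "true" effects

contract = np.repeat(np.arange(n_contracts), 2)   # each contract observed in 2 years
year2017 = np.tile([0.0, 1.0], n_contracts)       # indicator 1(t = 2017)
# Disaster exposure p_it: nonzero only in 2017, varying across contracts.
exposure = np.where(year2017 == 1, rng.uniform(0, 1, 2 * n_contracts), 0.0)
alpha_i = rng.normal(0, 2, n_contracts)[contract]  # contract-level intercepts
y = (70 + alpha_i + beta1 * year2017
     + beta2 * exposure * year2017
     + rng.normal(0, 0.5, 2 * n_contracts))

# Fixed-effects (within-contract) approximation to the random-intercept model:
# demean the outcome and regressors within contract, then fit least squares.
def demean(v):
    means = np.bincount(contract, v) / np.bincount(contract)
    return v - means[contract]

X = np.column_stack([demean(year2017), demean(exposure * year2017)])
b1_hat, b2_hat = np.linalg.lstsq(X, demean(y), rcond=None)[0]
print(round(b2_hat, 1))  # recovers a value close to the simulated effect of -3.0
```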
We used the results from this model to identify measures with statistically significant and clinically meaningful disaster effects. Differences of 1 point on a scale of 0 to 100 are generally considered small, and differences of 3 points are considered moderate.20,21 We considered differences of 3 points or greater to be clinically meaningful. A sensitivity analysis was also run for this model that removed contracts that had any beneficiaries in disaster-affected service areas in both 2017 and 2018 (between 9% and 10% of contracts across measures).
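The magnitude thresholds above can be expressed as a small helper. The function and the "minimal" label for sub-1-point differences are illustrative choices, not part of the Star Ratings methodology; only the 1-point (small), 3-point (moderate), and ≥3-point (clinically meaningful) cutoffs come from the text.

```python
# Illustrative classifier for effect magnitudes on a 0-100 score scale:
# ~1 point is small, ~3 points moderate; >=3 points was treated as
# clinically meaningful in this analysis.
def classify_effect(points):
    magnitude = abs(points)
    if magnitude < 1:
        return "minimal"       # label assumed for sub-small differences
    if magnitude < 3:
        return "small"
    return "meaningful (moderate or larger)"

print(classify_effect(-1.3))  # largest disaster effect observed -> small
```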
RESULTS
There were no Section 1135 waivers issued in 2016 in response to public health emergencies, so no contracts were disaster affected per the disaster exposure definition.12 In 2017, Section 1135 waivers were issued for the California wildfires, Hurricane Nate, Hurricane Maria, Hurricane Irma, and Hurricane Harvey; these disasters had time frames of 5 to 24 days (see eAppendix Table 2) and affected MA contracts with service areas covering the Individual Assistance counties. A total of 115 MA contracts (20%) were disaster affected in 2017, with the percentage of enrollees who were disaster affected in these contracts ranging from 1% to 100%, with a mean of 53%. In 2018, Hurricane Florence, Hurricane Michael, Typhoon Yutu, California wildfires, and an earthquake in Alaska had time frames of 1 to 23 days (eAppendix Table 2) and affected 59 MA contracts (9%), with the percentage of enrollees disaster affected in these contracts ranging from 1% to 100%, with a mean of 27%.
None of the 11 measures we examined had an estimated disaster impact in 2017 that was both statistically significant and of a clinically meaningful (≥3 points)20 magnitude (Table 2). Of the 11 measures examined, only 2 had disaster effects that were statistically significant (Annual Flu Vaccine and Getting Care Quickly). For both, the magnitude of the effect was small.20 We estimated that, compared with a contract with no enrollees residing in a disaster-affected area, a contract with all enrollees residing in a disaster-affected area would see an impact of –1.3 percentage points on the percentage of enrollees getting the annual flu vaccine and –0.6 percentage points for the percentage of enrollees reporting getting care quickly.
Analyses of the effects of 2018 disasters confirmed the initial findings that geographically limited disasters had no meaningful impact on performance scores. Specifically, only 2 measures had statistically significant differences in performance (Getting Care Quickly and Monitoring Physical Activity), but the estimated average impact was 1.5 points or less in magnitude across both measures. Results were similar in the sensitivity analysis that excluded a small number of contracts that were disaster affected in both 2017 and 2018. eAppendix Table 3 includes estimates for regression coefficients for other contract characteristics included as control variables in the models.
DISCUSSION
In our examination of quality performance among MA contracts, we did not find evidence that geographically limited disasters of short duration meaningfully reduced annual quality performance scores. The geographically and time-limited disasters studied were associated with statistically significant but small changes in performance in both years for only 1 measure. Given the lack of meaningful impact on performance scores, our findings suggest that program sponsors may continue to use these measures to evaluate and compare provider performance in the presence of short-term localized disasters like those that occurred during the years of our analysis without a negative impact on those being measured. Our results also suggest that quality-of-care measure performance data from a measurement period that includes shorter-term impacts on health care during a major disaster can continue to be used for public reporting or payment purposes. Even though the results suggest that the effect of localized disasters on contracts is small on average, program sponsors may still choose to make adjustments or special rules to ensure that no contracts are penalized, given that contracts may be differentially affected.
Limitations
A limitation of our analysis is the range and number of years analyzed and the particular disasters that occurred during them. Some natural or man-made disasters, particularly those of longer duration such as Hurricane Katrina, may have a more significant and sustained impact on the delivery of services and quality of care than disasters that are more limited in time and geographic scale; our study period did not include such disasters. Additionally, although disasters were declared at the county level, the truly affected area may have been smaller than the entire county. Our results therefore cannot be generalized to disasters that are large in scale or duration (eg, months). Ascertaining the impact of more severe disasters, or the relationship between disaster severity and impact, would require a larger number of distinct disasters (our analysis included 10 disaster events).
It is also possible that disasters could differentially affect certain contracts, such that some are impacted more than others. Our analyses focused on the impact of disasters on MA contracts, but these results may not be generalizable to other types of health plans. Certain types of plans may be more susceptible: for example, plans that contract with providers in the disaster area but lack the capacity to redirect patients to facilities outside it may be less able to withstand shocks to infrastructure, staff, and patients. By contrast, larger health systems with multiple care delivery sites located near one another may be able to triage patients to unaffected facilities and providers within their system during a disaster. Larger health systems may also have the capability and resources to redeploy personnel between facilities to create temporary surge capacity, and they may be better equipped to use telehealth services to keep meeting patients' care needs, as small practices are less likely to use telemedicine than large practices, and hospitals affiliated with health systems are more likely to use telehealth than unaffiliated ones.22,23
Because disasters are somewhat infrequent events and the data are observational, it is difficult to confidently ascertain the characteristics of contracts that are more affected by a given disaster. Disasters may also impact aspects of performance that are not addressed by the set of measures used in our analysis; however, the measures we included address a broad set of ambulatory care services from cancer screening to vaccination and patient experience to gauge potential differential impacts on a variety of types of care.
Localized disasters are more common than widespread disasters; although they have been occurring more frequently, localized disasters have a more limited impact on care delivery in terms of duration and number of providers affected. Our analyses indicate that geographically limited disasters do not appear to adversely affect the quality of care that patients receive across a 5- to 12-month measurement period, although a disaster that is especially severe or long in duration may have greater effects than those we observed in this study.
CONCLUSIONS
There are several possible explanations for our findings regarding localized disasters. Health care systems may be able to cope with localized disasters because many of the services addressed by the quality measures evaluated are not time sensitive and can be performed after the disaster. Many measures such as screening and vaccination measures (which are included in our analysis) allow for care to be received at any point in the year or over many months. Although even localized disasters can substantially displace patients and providers and may directly impede the delivery of care at one or more facilities, they generally do not impact the long-term provision of medical care across a large geographic area. It is therefore possible that the quality measures we studied are affected by disasters in the short term, but because performance scores for contracts and other health care entities are generally measured on an annual basis, impacts of short-term disasters may have a limited impact on scores. There is some related research to support these findings. An analysis of hospitalization rates before and after Hurricane Katrina found that hospitalization rates peaked 6 days after landfall in Orleans Parish but returned to prelandfall levels within 2 months,24 and analyses including 7 hurricanes between 2005 and 2016 found that the overall rate of weekly emergency department visits for counties affected by the hurricanes did not increase significantly from the prehurricane period to 4 weeks post hurricane.25
Although not the focus of this study, the COVID-19 pandemic was a sustained international disaster that had a direct and widespread impact on the ability of the US health system to deliver care, with providers pausing the delivery of some services altogether and limiting in-person interactions. In the first few months of the pandemic, health care systems stopped providing nonurgent care.26,27 For example, it was difficult to get a routine in-person eye exam for many months during the pandemic.28,29 Several studies described the impact of both these initial and acute changes in access to care and longer-term changes in care during the pandemic on service delivery, patient experiences, and quality of care.30-34 Although some research suggests that after the initial impact of the COVID-19 pandemic there was a return to prepandemic levels of outpatient visits within 6 months to a year,35-37 staffing impacts were long-lasting and some aspects of quality of care were reduced for more than 1 year.33 Our study findings add to building a deeper understanding of the impact of disasters on health care and indicate that the health care system may be robust to many kinds of localized disasters.
Author Affiliations: RAND Corporation, Santa Monica, CA (MD, MM, CMR, AB, MNE, CLD), and Boston, MA (ROR).
Source of Funding: CMS, contract 75FCMC19F0076.
Author Disclosures: The authors report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.
Authorship Information: Concept and design (MD, CMR, ROR, MNE, CLD); acquisition of data (MM, CLD); analysis and interpretation of data (MD, MM, CMR, AB, ROR, MNE, CLD); drafting of the manuscript (MD, MM, CMR, AB, CLD); critical revision of the manuscript for important intellectual content (MD, CMR, ROR, MNE, CLD); statistical analysis (MD, MM, AB); and obtaining funding (CLD).
Address Correspondence to: Maria DeYoreo, PhD, RAND Corporation, 1776 Main St, Santa Monica, CA 90401. Email: mdeyoreo@rand.org.
REFERENCES
1. What is a disaster? International Federation of Red Cross and Red Crescent Societies. Accessed March 21, 2022. https://www.ifrc.org/what-disaster
2. Bell SA, Abir M, Choi H, Cooke C, Iwashyna T. All-cause hospital admissions among older adults after a natural disaster. Ann Emerg Med. 2018;71(6):746-754.e2. doi:10.1016/j.annemergmed.2017.06.042
3. Ghazali DA, Guericolas M, Thys F, Sarasin F, Arcos González P, Casalino E. Climate change impacts on disaster and emergency medicine focusing on mitigation disruptive effects: an international perspective. Int J Environ Res Public Health. 2018;15(7):1379. doi:10.3390/ijerph15071379
4. van Aalst MK. The impacts of climate change on the risk of natural disasters. Disasters. 2006;30(1):5-18. doi:10.1111/j.1467-9523.2006.00303.x
5. Types of health care quality measures. Agency for Healthcare Research and Quality. February 2015. Updated July 2015. Accessed June 26, 2024. https://www.ahrq.gov/talkingquality/measures/types.html
6. Sutter Health medical offices in North Santa Rosa opens after fire. Sonoma County Gazette. December 15, 2017. Accessed June 26, 2024. https://www.sonomacountygazette.com/sonoma-county-news/sutter-health-medical-offices-in-north-santa-rosa-opens-after-fire/
7. Stetson A. Paradise still without a full hospital over four years after the Camp Fire. KRCR. March 28, 2023. Updated March 30, 2023. Accessed June 26, 2024. https://krcrtv.com/news/local/paradise-still-without-a-full-hospital-five-years-after-the-camp-fire
8. Colliver V. Rebuilding Paradise: finding health care after total destruction. Politico. Updated June 2, 2019. Accessed June 26, 2024. https://www.politico.com/story/2019/06/02/paradise-california-wildfire-health-care-1349430
9. Melnychuk E, Sallade TD, Kraus CK. Hospitals as disaster victims: lessons not learned? J Am Coll Emerg Physicians Open. 2022;3(1):e12632. doi:10.1002/emp2.12632
10. Chang K, Smiley K, Hirsch J, Clay L, Michael Y. Climate-related disaster impact on health care infrastructure in the USA. Ann Epidemiol. 2023;85:133. doi:10.1016/j.annepidem.2023.06.020
11. Disaster information. Federal Emergency Management Agency. Updated May 15, 2023. Accessed January 5, 2024. https://www.fema.gov/disaster
12. Waiver or modification of requirements under Section 1135 of the Social Security Act. Administration for Strategic Preparedness and Response. Accessed January 4, 2024. https://aspr.hhs.gov/legal/1135-Waivers/Pages/default.aspx
13. Abir M, Choi H, Cooke CR, Wang SC, Davis MM. Effect of a mass casualty incident: clinical outcomes and hospital charges for casualty patients versus concurrent inpatients. Acad Emerg Med. 2012;19(3):280-286. doi:10.1111/j.1553-2712.2011.01278.x
14. Kim H, Schwartz RM, Hirsch J, Silverman R, Liu B, Taioli E. Effect of Hurricane Sandy on Long Island emergency departments visits. Disaster Med Public Health Prep. 2016;10(3):344-350. doi:10.1017/dmp.2015.189
15. Attachment Q. In: Medicare 2019 Part C & D Star Ratings Technical Notes. CMS; March 21, 2019:137-140. Accessed January 12, 2024. https://www.cms.gov/Medicare/Prescription-Drug-Coverage/PrescriptionDrugCovGenIn/Downloads/2019-Technical-Notes.pdf
16. Section 1135 waivers: issuing waivers/modifications. Administration for Strategic Preparedness and Response. Accessed January 4, 2024. https://aspr.hhs.gov/legal/1135-Waivers/Pages/1135-Waivers.aspx
17. CMS, HHS. Medicare and Medicaid Programs; policy and technical changes to the Medicare Advantage, Medicare Prescription Drug Benefit, Programs of All-Inclusive Care for the Elderly (PACE), Medicaid Fee-for-Service, and Medicaid Managed Care programs for years 2020 and 2021. Fed Regist. 2019;84(73):15680-15844.
18. The Hospital Value-Based Purchasing (VBP) Program. CMS. Updated September 10, 2024. Accessed January 5, 2024. https://www.cms.gov/medicare/quality/value-based-programs/hospital-purchasing
19. Quality Compass 2023. National Committee for Quality Assurance. Accessed January 5, 2024. https://store.ncqa.org/data-and-reports/quality-compass-2023-my-2022.html
20. Quigley DD, Elliott MN, Setodji CM, Hays RD. Quantifying magnitude of group-level differences in patient experiences with health care. Health Serv Res. 2018;53(suppl 1):3027-3051. doi:10.1111/1475-6773.12828
21. Paddison CA, Elliott MN, Haviland AM, et al. Experiences of care among Medicare beneficiaries with ESRD: Medicare Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey results. Am J Kidney Dis. 2013;61(3):440-449. doi:10.1053/j.ajkd.2012.10.009
22. Pylypchuk Y, Barker W. Use of telemedicine among physicians and development of telemedicine apps. Assistant Secretary for Technology Policy/Office of the National Coordinator for Health IT. March 9, 2023. Accessed June 26, 2024. https://www.healthit.gov/buzz-blog/health-data/use-of-telemedicine-among-physicians-and-development-of-telemedicine-apps
23. Gaziel-Yablowitz M, Bates DW, Levine DM. Telehealth in US hospitals: state-level reimbursement policies no longer influence adoption rates. Int J Med Inform. 2021;153:104540. doi:10.1016/j.ijmedinf.2021.104540
24. Becquart NA, Naumova EN, Singh G, Chui KKH. Cardiovascular disease hospitalizations in Louisiana parishes’ elderly before, during and after Hurricane Katrina. Int J Environ Res Public Health. 2018;16(1):74. doi:10.3390/ijerph16010074
25. Heslin KC, Barrett ML, Hensche M, et al. Effects of hurricanes on emergency department utilization: an analysis across 7 US storms. Disaster Med Public Health Prep. 2021;15(6):762-769. doi:10.1017/dmp.2020.281
26. CMS releases recommendations on adult elective surgeries, non-essential medical, surgical, and dental procedures during COVID-19 response. News release. CMS. March 18, 2020. Accessed January 12, 2024. https://www.cms.gov/newsroom/press-releases/cms-releases-recommendations-adult-elective-surgeries-non-essential-medical-surgical-and-dental
27. Helping private practices navigate non-essential care during COVID-19. American Medical Association. Updated March 22, 2022. Accessed March 21, 2022. https://www.ama-assn.org/delivering-care/public-health/helping-private-practices-navigate-non-essential-care-during-covid-19
28. Chou B. How COVID-19 is reshaping optometry. Review of Optometry. October 15, 2020. Accessed January 12, 2024. https://www.reviewofoptometry.com/article/how-covid19-is-reshaping-optometry
29. Nagra M, Allen PM, Norgett Y, Beukes E, Bowen M, Vianya-Estopa M. The effect of the COVID-19 pandemic on working practices of UK primary care optometrists. Ophthalmic Physiol Opt. 2021;41(2):378-392. doi:10.1111/opo.12786
30. Mehrotra A, Chernew ME, Linetsky D, Hatch H, Cutler DM, Schneider EC. The impact of the COVID-19 pandemic on outpatient care: visits return to prepandemic levels, but not for all providers and patients. The Commonwealth Fund. October 15, 2020. Accessed January 12, 2024. https://www.commonwealthfund.org/publications/2020/oct/impact-covid-19-pandemic-outpatient-care-visits-return-prepandemic-levels
31. Moynihan R, Sanders S, Michaleff ZA, et al. Impact of COVID-19 pandemic on utilisation of healthcare services: a systematic review. BMJ Open. 2021;11(3):e045343. doi:10.1136/bmjopen-2020-045343
32. Lawrence E. Nearly half of Americans delayed medical care due to pandemic. KFF Health News. May 27, 2020. Accessed March 21, 2022. https://kffhealthnews.org/news/nearly-half-of-americans-delayed-medical-care-due-to-pandemic/
33. Elliott MN, Beckett MK, Cohea CW, et al. Changes in patient experiences of hospital care during the COVID-19 pandemic. JAMA Health Forum. 2023;4(8):e232766. doi:10.1001/jamahealthforum.2023.2766
34. DeYoreo M, Anhang Price R, Haas A, Tolpadi A, Teno JM, Elliott MN. Changes in hospice care experiences during the COVID-19 pandemic. J Am Geriatr Soc. 2024;72(1):300-302. doi:10.1111/jgs.18598
35. Mafi JN, Craff M, Vangala S, et al. Trends in US ambulatory care patterns during the COVID-19 pandemic, 2019-2021. JAMA. 2022;327(3):237-247. doi:10.1001/jama.2021.24294
36. Chatterji P, Li Y. Effects of the COVID-19 pandemic on outpatient providers in the United States. Med Care. 2021;59(1):58-61. doi:10.1097/MLR.0000000000001448
37. Waitzberg R, Quentin W, Webb E, Glied S. The structure and financing of health care systems affected how providers coped with COVID-19. Milbank Q. 2021;99(2):542-564. doi:10.1111/1468-0009.12530