The American Journal of Managed Care

April 2004
Volume 10
Issue 4

Primary Care Practice and Facility Quality Orientation: Influence on Breast and Cervical Cancer Screening Rates

Background: Despite the importance of early cancer detection, variation in screening rates among physicians is high. Insights into factors influencing variation can guide efforts to decrease variation and increase screening rates.

Objectives: To explore the association of primary care practice features and a facility’s quality orientation with breast and cervical cancer screening rates.

Study Design: Cross-sectional study of breast and cervical cancer screening rates for a national sample of women at 144 Department of Veterans Affairs (VA) medical centers.

Methods: We linked practice structure and quality improvement characteristics of individual VA medical centers from 2 national surveys (1 to primary care directors and 1 to a stratified random sample of employees) to breast and cervical cancer screening rates determined from a review of random medical records. We conducted bivariate analyses and multivariate logistic regression of primary care practice and facility features on cancer screening rates, above and below the median.

Results: While the national screening rates were high for breast (87%) and cervical cancer (90%), higher screening rates were more likely when primary care providers were consistently notified of specialty visits and when staff perceived a greater organizational commitment to quality and anticipated rewards and recognition for better performance.

Conclusions: Organization and quality orientation of the primary care practice and its facility can enhance breast and cervical cancer screening rates. Internal recognition of quality performance and an overall commitment to quality improvement may foster improved prevention performance, with impact varying by clinical service.

(Am J Manag Care. 2004;10:265-272)

In response to accreditation demands on managed care organizations, integrated systems and their affiliated providers are placing increased emphasis on delivering preventive care services to their enrolled populations.1 Despite these pressures and the importance of early detection of disease, variation in rates of test performance among primary care physicians is high.2-4 Research to date has shown that many patient- and provider-level factors influence whether indicated screenings are performed. In the case of sex-specific screening for breast and cervical cancer, examples of these factors include insurance status, access to a regular source of care, and sex of the provider, among others.5-12 Primary care practice structure as well as organizational approaches to prevention and quality may also influence screening, yet little is known about how these factors act to foster or hinder performance.13 Because some of these organizational features may be mutable,14,15 insights into this area can provide direction to persons working to decrease variation and increase screening rates.

Evaluation of how organizational interventions influence the provision of preventive care has been limited. Hulscher and colleagues16 found that systematic and well-planned changes to the process of care (teamwork, delegation of tasks), the structure of care (development of follow-up procedures or patient reminders), and the content of care (using health charts or flow sheets) improved screening performance. Additionally, performance was enhanced by using "learning through social influence," which includes utilization of small-group quality improvement (QI) activities.

A recent meta-analysis found that organizational change, when compared with other types of interventions, was among the most effective in improving immunization and cancer screening services.17 Although the design and approach of these interventions varied, the common thread among them was an emphasis on system-level changes to foster screening, including delegation of prevention screening responsibility to nonphysician staff and, to some extent, reliance on techniques consistent with the tenets of continuous QI (eg, incremental changes through iterative cycles of improvement). Additionally, collaboration and teamwork were notably effective features for improving the use of sex-specific screening services, particularly cervical cancer screening.

Our goal in this study was to explore how primary care practice features and organizational characteristics similar to those described above are associated with sex-specific screening performance in Department of Veterans Affairs (VA) medical centers. The VA is the largest integrated healthcare system in the United States, providing care to more than 4.6 million patients a year, with a growth rate of 33% during the past 5 years.18 Since the mid-1990s, the VA has invested significant resources in improving its quality of care19; VA performance in the areas of preventive care and management of chronic diseases rivals or exceeds that of the private sector or other government health systems.20 The VA also has made substantial investments in data systems, performance measurement, and management-oriented health services research, thereby allowing a comprehensive analysis of the environmental, organizational, and primary care practice features that may be associated with cancer screening performance rates. A better understanding of factors that enhance performance in the VA may provide insights to private sector health organizations working to improve quality of care.21

METHODS

Model of Organizational Influences on Cancer Screening

Our model for organizational predictors of cancer screening is displayed in Figure 1. Facility or site-based predictors include academic affiliation, quality orientation, and size and complexity. Primary care practice features include continuity of care, coordination processes, accessibility of services, and QI involvement. These aspects of the delivery of care occur within a context of patient, physician, and environment factors, including geographic region and urban/rural designation. In this study, we were unable to control for individual patient or physician characteristics because our data were aggregated at the facility level.

Data Sources

We merged VA facility-level information from 4 data sources that provided the opportunity to link practice structure, organizational attributes, and facility performance data: (1) the 1996 VHA Primary Care Delivery Models Survey, (2) the 1997 National VA QI Survey, (3) the 1996 VA External Peer Review Program (EPRP) sponsored by the VA Office of Quality and Performance, and (4) the 1996 Outpatient Clinic file (OPC) from the VA Austin Automation Center. All sources identified sites in a common fashion, making possible their merger. This project was reviewed and approved by the Institutional Review Board of the VA Greater Los Angeles Healthcare System.

Measures

The 1996 VHA Primary Care Delivery Models Survey was mailed to the primary care directors at all VA medical centers in the United States (N = 160, including integrated systems) and assessed information on the organization of primary care services (100% response rate). Survey measures included questions gauging the presence of discrete primary care practice features (eg, assignment of patients to specific providers, processes used to enhance screening, presence of a primary care-based QI program) and levels of independence (eg, level of primary care program implementation and level of practice coordination). Table 1 displays descriptive statistics for each variable.

The 1997 National VA QI Survey was administered by the VA Management Decision and Research Center to medical center staff (71% response rate) who rated service quality and aspects of facility culture (eg, extent to which their facility promoted innovation and risk taking, teamwork, and cooperation)22 as well as commitment to service quality goals and the level of rewards and recognition supporting improved service quality.23 We utilized 3 variables from this survey. The "Quality System Survey Scale" assessed whether problem-solving practices, data availability, and management practices reflect a genuine commitment to continuous QI. "Performance Goals" assessed whether individual performance goals emphasize service QI. "Reward and Recognition" assessed whether staff efforts to improve service quality are recognized and rewarded. We rescaled facility scores (×100) to facilitate interpretation in the logistic model (ie, a change of 1 unit increases the odds of being above the median screening rate) and examined both the continuous measures and dichotomized (high/low) versions cut at the median value for each scale.15
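To make the rescaling and median split concrete, the following is a minimal sketch, assuming a small facility-level table with hypothetical column names; it is illustrative only and is not the survey team's actual processing code.

```python
import pandas as pd

# Hypothetical facility-level scale scores (column names are assumptions).
facilities = pd.DataFrame({
    "station_id": [501, 502, 503, 504],
    "quality_system_scale": [0.42, 0.55, 0.61, 0.48],
})

# Rescale by 100 so that a 1-unit change is interpretable in the logistic model.
facilities["quality_system_x100"] = facilities["quality_system_scale"] * 100

# Dichotomize at the median value of the scale (high = above the median).
median_score = facilities["quality_system_x100"].median()
facilities["quality_system_high"] = (
    facilities["quality_system_x100"] > median_score
).astype(int)
```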

Breast and cervical cancer screening rates were assessed by the 1997 VA EPRP, in which personnel conduct periodic chart reviews of a random, national sample of patients from primary care and subspecialty clinics. At the time of the study, trained nurse abstractors evaluated whether patients had received appropriate breast and cervical cancer screening using guidance from VA policy that was based on the 1996 US Preventive Services Task Force Guide to Clinical Preventive Services.24 The guideline for breast cancer screening advised a screening mammogram every 2 years for women between the ages of 50 and 69 years who had not undergone bilateral mastectomy. The guideline for cervical cancer screening advised a Pap smear every 3 years for women between the ages of 18 and 65 years who had not had a total hysterectomy. We obtained site-specific results for 144 medical centers that indicated their percentage of reviewed charts that were guideline-adherent (theoretical range from 0% to 100%).
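As an illustration only, the screening guidance described above could be encoded roughly as follows; the record fields are hypothetical, and the sketch omits details an abstractor would apply (eg, exact timing windows relative to the chart-review date).

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WomanRecord:
    # Hypothetical chart-abstraction fields for one reviewed record.
    age: int
    bilateral_mastectomy: bool
    total_hysterectomy: bool
    months_since_mammogram: Optional[float]  # None if never documented
    months_since_pap: Optional[float]

def mammogram_guideline_met(record: WomanRecord) -> Optional[bool]:
    """Mammogram every 2 years, ages 50-69, no bilateral mastectomy;
    returns None when the guideline does not apply."""
    if not (50 <= record.age <= 69) or record.bilateral_mastectomy:
        return None
    return (record.months_since_mammogram is not None
            and record.months_since_mammogram <= 24)

def pap_guideline_met(record: WomanRecord) -> Optional[bool]:
    """Pap smear every 3 years, ages 18-65, no total hysterectomy;
    returns None when the guideline does not apply."""
    if not (18 <= record.age <= 65) or record.total_hysterectomy:
        return None
    return (record.months_since_pap is not None
            and record.months_since_pap <= 36)
```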

We used the 1996 OPC, a national dataset comprising all VA outpatient encounters, to identify the proportion of women veterans served at each VA facility in fiscal year 1996 (October 1, 1995, through September 30, 1996): unique women veterans with 1 or more outpatient clinic visits divided by the total number of veteran users in the year.
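A minimal sketch of that calculation, assuming a hypothetical encounter-level table with one row per outpatient visit:

```python
import pandas as pd

# Hypothetical FY1996 outpatient encounters (one row per visit).
opc = pd.DataFrame({
    "station_id": [501, 501, 501, 502, 502],
    "patient_id": ["a", "b", "b", "c", "d"],
    "female":     [True, False, False, True, True],
})

# Unique women with >=1 visit divided by all unique users, per facility.
women = opc[opc["female"]].groupby("station_id")["patient_id"].nunique()
all_users = opc.groupby("station_id")["patient_id"].nunique()
prop_women = (women / all_users).fillna(0)
print(prop_women)  # eg, station 501: 1 woman of 2 unique users = 0.5
```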

Statistical Analyses

We merged variables from the 4 data sources using numeric station identifiers, resulting in a uniform database comprising facility, practice, and screening data for 144 medical centers. We focused on 2 dependent variables: percent of reviewed charts that were guideline-adherent for (1) mammogram and (2) Pap smear performance. Our independent variables included facility and primary care practice features as shown in Figure 1 and Table 1.
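A sketch of that merge, assuming each source has already been reduced to one row per medical center keyed by a numeric station identifier (file and column names are hypothetical):

```python
import pandas as pd
from functools import reduce

# Hypothetical facility-level extracts, one row per medical center.
sources = [
    pd.read_csv("primary_care_survey_1996.csv"),   # practice structure
    pd.read_csv("national_qi_survey_1997.csv"),    # quality orientation scales
    pd.read_csv("eprp_screening_rates.csv"),       # chart-review screening rates
    pd.read_csv("opc_women_proportion_1996.csv"),  # proportion of women veterans
]

# Inner join on the common station identifier keeps only centers
# present in all sources (144 in the study).
analytic = reduce(
    lambda left, right: left.merge(right, on="station_id", how="inner"),
    sources,
)
```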

First, we examined the distribution of breast and cervical cancer screening rates (mean and range) among the national sample of VAs. We used correlation coefficients, analysis of variance, and ordinary least squares regression to assess the association between continuous and discrete independent variables hypothesized to have an influence on sex-specific screening practices. We subsequently categorized selected continuous variables with marked skew (eg, number of general internal medicine physicians), transforming them by setting cut points for high and low scores at the median value. We ran a logistic regression model to examine the independent predictors of high versus low performance rates of breast and cervical cancer screening. Variables were included if bivariate analysis indicated their significance at P < .20 with either screening test. We conducted iterative model analyses to confirm our findings using SPSS/PC (version 11.5, SPSS Inc, Chicago, Ill). We tested the models using the Hosmer-Lemeshow test, and both models met criteria for goodness of fit.
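Because the original models were fit in SPSS, the following Python sketch is purely illustrative of the analytic sequence (median split of the outcome, a bivariate screen at P < .20, multivariate logistic regression, and a Hosmer-Lemeshow check); all variable names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2

# `analytic` is the merged facility-level table sketched earlier (hypothetical columns).
analytic["mammo_high"] = (
    analytic["mammo_rate"] > analytic["mammo_rate"].median()
).astype(int)

candidates = ["reward_recognition_x100", "housestaff_high",
              "consult_notify_always", "strict_pcp_assignment", "qi_program_full"]

# Bivariate screen: retain predictors with P < .20 against the outcome.
def univariate_p(y, x):
    return sm.Logit(y, sm.add_constant(x)).fit(disp=0).pvalues.iloc[1]

keep = [v for v in candidates
        if univariate_p(analytic["mammo_high"], analytic[v]) < 0.20]

# Multivariate logistic regression on the retained predictors.
X = sm.add_constant(analytic[keep])
fit = sm.Logit(analytic["mammo_high"], X).fit(disp=0)
print(np.exp(fit.params))  # coefficients exponentiated to odds ratios

# Hosmer-Lemeshow goodness of fit: group facilities by deciles of predicted risk.
def hosmer_lemeshow_p(y, p, groups=10):
    df = pd.DataFrame({"y": y, "p": p})
    df["decile"] = pd.qcut(df["p"], groups, duplicates="drop")
    obs = df.groupby("decile")["y"].sum()
    exp = df.groupby("decile")["p"].sum()
    n = df.groupby("decile")["y"].count()
    stat = (((obs - exp) ** 2) / (exp * (1 - exp / n))).sum()
    return chi2.sf(stat, df["decile"].nunique() - 2)

print(hosmer_lemeshow_p(analytic["mammo_high"], fit.predict(X)))
```

The same sequence would be repeated with the Pap smear outcome in place of the mammography outcome.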

RESULTS

Variation in Adherence to Screening Guidelines

While mean rates of adherence to guidelines were high, we still noted substantial variations in rates of breast cancer screening (mammogram) and cervical cancer screening (Pap smear) (Figure 2). The mean rate for mammogram performance was 87%, with a range of 33% to 100%, and for Pap smear performance, 90%, with a range of 60% to 100%. The median cut point was 89% (n = 72 at or below the median and n = 72 above the median) for breast cancer screening and 91% (n = 72 at or below the median and n = 73 above the median) for cervical cancer screening.

Organizational Variations in Screening

The percentages of medical centers with above-median breast and cervical cancer screening rates by facility and primary care features are shown in Tables 2 and 3. A significant association was noted between the quality orientation variables and better breast cancer screening performance, with the exception of organizational commitment to QI. In the case of cervical cancer screening, higher rewards and recognition scores were significantly associated with better performance, with a trend toward better performance for facilities with greater commitment to QI and higher ratings for performance goals being linked to quality. Overall, academic VAs performed no better than nonacademic sites, although large academic programs (ie, greater than median numbers of internal medicine house officers) did outperform small ones.

Of the primary care features, higher generalist staffing was associated with better breast cancer screening performance, and a nonsignificant trend was noted for better performance in sites with greater primary care-specialist coordination. Sites with less formal assignment of patients to primary care providers were more likely to have high breast cancer screening rates. For cervical cancer screening, the only primary care feature associated with better performance was having a fully implemented QI program.

Sites with larger women veteran caseloads were marginally more likely to be high performers of cervical cancer screening (55.6%) than sites with fewer women patients (45.2%) (Table 2). We found no differences in screening performance by region or urban/rural designation.

Predictors of Breast and Cervical Cancer Screening Rates

The results of the logistic regression for breast cancer (mammogram) screening rates and for cervical cancer (Pap smear) screening rates among VA medical centers are displayed in Table 4. Variables that predicted whether a site was more likely to be a high performer were different for the 2 screening procedures. In multivariate analyses, large academic facilities with greater numbers of house staff were more likely to have high rates for breast cancer screening (odds ratio [OR] = 3.49, 95% confidence interval [CI] 1.27-9.58). The wide confidence intervals likely relate to variability in house staff volumes. High staff perceptions that the facility rewards and recognizes service quality also predicted that a site had higher mammography rates (OR = 1.04, 95% CI 1.01-1.08). Two measures of primary care practice were independently associated with mammography performance. First, high levels of primary care-subspecialist coordination (ie, primary care providers are always notified of subspecialty consults) were associated with above-median performance of mammograms for breast cancer screening (OR = 6.24, 95% CI 1.26-30.77). In contrast, stringent patient assignment to primary care providers (defined through self-report as all/almost all patients being formally assigned to a primary care provider) was associated with below-median mammography screening rates (OR = 0.36, 95% CI 0.14-0.93). Sites with such stringent primary care provider-patient assignment tended to be in small cities or more rural areas (62.5% small city, semirural, or rural vs 45.1% large urban, P = .021, data not shown) and were less likely to be academic (73.3% vs 50.4%, P = .018, data not shown).
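As a rough illustration of the reward-and-recognition effect size: because the survey scores were rescaled (×100), the OR of 1.04 applies to each 1-point difference on the rescaled scale, so a facility scoring 10 points higher than an otherwise similar facility would have odds of above-median mammography performance approximately 1.04^10 ≈ 1.48 times as great.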

Medical centers where staff perceived greater commitment to service quality were more likely to have above-median rates for cervical cancer screening (Pap smear) (OR = 1.06, 95% CI 1.01-1.12). Sites with at least 1 primary care-based QI program were more likely to demonstrate above-median cervical cancer screening performance (OR = 4.00, 95% CI 1.10-14.54). The wide confidence intervals for some predictor variables illustrate the significant structural variability in how primary care is organized in the VA. Finally, the proportion of women veterans served at a given medical center was a significant predictor of higher Pap smear screening rates (OR = 1.32, 95% CI 1.07-1.64).

DISCUSSION

Our study demonstrates that certain specified primary care practice features as well as characteristics of an organization's quality orientation can influence the provision of breast and cervical cancer screening. The VA, a large, integrated medical system, provides a unique opportunity to evaluate the effect of nonpatient factors on preventive services because income and insurance coverage are not significant barriers to care. Furthermore, services are provided throughout the United States and across age-eligible adult patients. The many facilities providing primary care services permit a study of differences among facilities.

While this study assesses services provided to female patients only, the size of the VA population assures an adequate sample from which to generalize. Women veterans currently account for approximately 6% of the VA user population, representing about 322 000 unique patients. More than half of the women veterans are adults younger than 45 years,25 most of whom require cervical cancer screening and many of whom are eligible for breast cancer screening.

Breast Cancer Screening

Factors predicting the likelihood that sites have higher rates of breast cancer screening were different from those for cervical cancer screening. Whereas both screening tests are often evaluated in concert, they require different levels of involvement from primary care providers, and different staff training and equipment. Breast cancer screening generally requires a referral for mammography; performance can be based on ease of patient access to the appropriate radiology center and how well care is coordinated. Coordination characteristics of the primary care practice, such as group practice, scheduling processes, and use of flow sheets and reminders, have been shown to be associated with mammography utilization,26 as has overall coordination among providers, especially for older low-income women.27 We found that VA sites where primary care providers were always notified of subspecialty consultation results were significantly more likely to be "high" performers of mammography. As a proxy for ease of care coordination, consult notification may be particularly important for breast cancer screening, where primary providers require the review and feedback from mammography specialists.

Cervical Cancer Screening

In contrast to breast cancer screening, cervical cancer screening requires interaction with a provider whose office is set up to perform Pap smears and who has skill and experience in this procedure. We found that sites were more likely to be high performers of Pap smears when the female proportion of patients was higher and when sites had a primary care-based QI program. Because most women's health clinics within the VA are not sufficiently large to warrant a separate QI program, this finding suggests that a system may need to make special efforts to meet the medical screening needs of its minority populations.

Organizational Influences

Academic medical centers with large internal medicine training programs (ie, above-median number of internal medicine house staff) performed better than centers with small programs in the area of breast cancer screening. Subsequent research has demonstrated that such programs reside in the more complex VA medical centers (ie, large tertiary care centers), which, in turn, are significantly more likely to provide on-site access to both basic and specialized women's health services.28 Stronger linkages to university affiliates in these large programs may also facilitate access to mammography services.

We found evidence for a positive relationship between characteristics of a QI program and screening performance in several QI-related measures.28-30 Organizations whose staff had above-median expectations that higher quality performance was recognized and rewarded were more likely to be high performers of breast cancer screening. Similarly, sites where staff perceived greater commitment to higher service quality were more likely to have above-median cervical cancer screening rates. These relationships suggest the value of translating performance goals into cogent messages throughout the organization.

Limitations

This cross-sectional assessment of variations in sex-specific preventive screenings has a number of limitations. While we found a number of independent predictors of screening performance at the level of the medical center and primary care practice, our study findings may nonetheless have been limited by insufficient variation in rates of screening across VA medical centers. The VA's performance measures focus on accountability for the population of patients served by each medical center, and the chart-based review allows for screening to have occurred in non-VA settings. For example, if a chart indicated that the Pap smear or mammogram had been carried out elsewhere, credit for performance was still given. Differences in screening rates among sites were therefore not simply a reflection of characteristics of the particular sites but also of other sources of care and insurance coverage in the area. We were unable to assess the impact of individual patient and provider characteristics because data were aggregated by facility and not assessed at the individual level.

CONCLUSIONS

Specified primary care practice features as well as characteristics of an organization's quality orientation were found to influence the provision of cancer screening. Furthermore, factors predicting the likelihood that sites have higher rates of breast cancer screening were different from those for cervical cancer screening. Sites with above-median rates of breast cancer screening were more likely to notify primary care providers of subspecialty consultation results, were more likely to have staff who perceived that higher-quality performance is recognized and rewarded, and were more likely to have larger numbers of internal medicine house staff. Sites with above-median rates of cervical cancer screening were more likely to have a primary care-based QI program, staff who perceived a greater commitment to higher service quality, and more women patients.

This study demonstrated that both practice features and facility attitudes about quality affect the delivery of preventive services. However, individual services may be affected differentially. A focus on quality may lead to improved care, but it is necessary to assess its direct effect by the clinical service rendered.

Acknowledgments

Special thanks go to Gary Young, JD, PhD, for providing access to facility-level scores representing employee ratings of organizational culture and QI, and to Paul Shekelle, MD, PhD, for his thoughtful review of earlier drafts of the manuscript. We also acknowledge the contributions toward survey design and administration of Barbara F. Simon, MA, HSR&D survey director; Alissa Simon, MA, production designer; and Ismelda Canelo, BA, project director. A previous version of this work was presented at a poster session at the Annual Meeting of the VA HSR&D Service on February 19, 2001, in Washington, DC. We are appreciative of reviewers who enabled a clearer manuscript focus.

From the Division of General Internal Medicine, Veterans Affairs Greater Los Angeles Healthcare System, Los Angeles, Calif (CLG, DLW); the Veterans Affairs Greater Los Angeles Health Services Research & Development Center of Excellence, Los Angeles, Calif (CLG, DLW, ABL, EMY); the Department of Medicine, David Geffen School of Medicine at UCLA, Los Angeles, Calif (CLG, DLW); and the Department of Health Services, UCLA School of Public Health, Los Angeles, Calif (PHP, EMY).

This study was funded by the Department of Veterans Affairs, VA HSR&D Service (Project # MPC-97012), and the Veterans Affairs Greater Los Angeles HSR&D Center of Excellence (Project # HFP-084) for the Managed Care Performance of VHA Primary Care Delivery Systems Study. The views expressed in this article are those of the authors and do not necessarily represent the views of the Department of Veterans Affairs.

Address correspondence to: Caroline Lubick Goldzweig, MD, MSHS, VA Greater Los Angeles Healthcare System, 11301 Wilshire Boulevard (111-G), Los Angeles, CA 90073. E-mail: caroline.goldzweig@med.va.gov.

1. National Committee on Quality Assurance (NCQA). HEDIS 2000 List of Measures. March 21, 2000, last updated October 10, 2001. Available at: http://www.ncqa.org/programs/hedis/h00meas.htm. Accessed February 23, 2004.
2. Gandhi TK, Francis EC, Puopolo AL, Burstin HR, Haas JS, Brennan TA. Inconsistent report cards: assessing the comparability of various measures of the quality of ambulatory care. Med Care. 2002;40:155-165.
3. Greenfield S, Kaplan SH, Kahn R, Ninomiya J, Griffith JL. Profiling care provided by different groups of physicians: effects of patient case-mix (bias) and physician-level clustering on quality assessment results. Ann Intern Med. 2002;136:111-121.
4. Parkerton PH, Smith DG, Belin TA, Feldbau GA. Physician performance assessment: nonequivalence of primary care measures. Med Care. 2003;41:1034-1047.
5. Franks P, Clancy C. Physician gender bias in clinical decisionmaking: screening for cancer in primary care. Med Care. 1993;31:213-218.
6. Hayward RA, Shapiro MF, Freeman HE, Corey CR. Who gets screened for cervical and breast cancer? Results from a new national survey. Arch Intern Med. 1988;148:1177-1181.
7. Hsia J, Kemper E, Kiefe C, et al. The importance of health insurance as a determinant of cancer screening: evidence from the Women's Health Initiative. Prev Med. 2000;31:261-270.
8. Kreuter MW, Strecher VJ, Harris R, Kobrin SC, Skinner CS. Are patients of women physicians screened more aggressively? A prospective study of physician gender and screening. J Gen Intern Med. 1995;10:119-125.
9. Lurie N, Slate J, McGovern P, Ekstrum J, Quam L, Margolis K. Preventive care for women. Does the sex of the physician matter? N Engl J Med. 1993;329:478-482.
10. O'Malley AS, Mandelblatt J, Gold K, Cagney KA, Kerner J. Continuity of care and the use of breast and cervical cancer screening services in a multiethnic community. Arch Intern Med. 1997;157:1462-1470.
11. Osborn EH, Bird JA, McPhee SJ, Rodnick JE, Fordham D. Cancer screening by primary care physicians. Can we explain the differences? J Fam Pract. 1991;32(5):465-471.
12. Wilcox LS, Mosher WD. Factors associated with obtaining health screening among women of reproductive age. Public Health Rep. 1993;108:76-86.
13. Phillips KA, Morrison KR, Andersen R, Aday LA. Understanding the context of healthcare utilization: assessing environmental and provider-related variables in the behavioral model of utilization. Health Serv Res. 1998;33:571-596.
14. Hargraves JL, Palmer RH, Orav EJ, Wright EA. Practice characteristics and performance of primary care practitioners. Med Care. 1996;34(suppl):SS67-SS76.
15. Yano E, Sherman S, Lanto A, Lee M, Rubenstein L. Organizational precursors of high quality preventive care. Paper presented at: 21st Annual Meeting of the National Society for Medical Decision Making (SMDM); Reno, Nev; October 3, 1999.
16. Hulscher ME, Wensing M, Grol RP, van der Weijden T, van Weel C. Interventions to improve the delivery of preventive services in primary care. Am J Public Health. 1999;89:737-746.
17. Stone EG, Morton SC, Hulscher ME, et al. Interventions that increase use of adult immunization and cancer screening services: a meta-analysis. Ann Intern Med. 2002;136:641-651.
18. Department of Veterans Affairs, Veterans Health Administration. Veteran Data & Information: Program Statistics. Available at: http://www.va.gov/vetdata/ProgramStatics/index.htm. Accessed February 19, 2004.
19. Jha AK, Perlin JB, Kizer KW, Dudley RA. Effect of the transformation of the Veterans Affairs Health Care System on the quality of care. N Engl J Med. 2003;348:2218-2227.
20. Vaughn TE, McCoy KD, BootsMiller BJ, et al. Organizational predictors of adherence to ambulatory care screening guidelines. Med Care. 2002;40:1172-1185.
21. Coordinating the Roles of the Federal Government to Enhance Quality of Care. In: Committee on Enhancing Federal Healthcare Quality Programs; Corrigan JM, Eden J, Smith BM, eds. Leadership by Example: Coordinating Government Roles in Improving Health Care Quality. Washington, DC: National Academy Press; 2002:56-78.
22. Zammuto RF, Krakower JY. Quantitative and qualitative studies of organizational culture. Res Organization Change and Development. 1991;5:83-114.
23. Parker VA, Wubbenhorst WH, Young GJ, Desai KR, Charns MP. Implementing quality improvement in hospitals: the role of leadership and culture. Am J Med Qual. 1999;14:64-69.
24. US Preventive Services Task Force. Guide to Clinical Preventive Services, 2nd ed. Washington, DC: US Department of Health and Human Services, Office of Disease Prevention and Health Promotion; 1996.
25. Department of Veterans Affairs, Veterans Health Administration. 2001 National Survey of Veterans (NSV). Updated March 28, 2003. Available at: http://www.va.gov/vetdata/SurveyResults/method.htm. Accessed January 6, 2004.
26. Gann P, Melville SK, Luckmann R. Characteristics of primary care office systems as predictors of mammography utilization. Ann Intern Med. 1993;118:893-898.
27. Agency for Health Care Policy and Research. Cancer Prevention for Minority Women in a Medicaid HMO. Final Report. Springfield, Va: National Technical Information Service; 1997. NTIS publication PB97-134449.
28. Washington DL, Caffrey C, Goldzweig C, Simon B, Yano EM. Availability of comprehensive women's health care through Department of Veterans Affairs Medical Center. Womens Health Issues. 2003;13(2):50-54.
29. Shortell S, Bennett CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what it will take to accelerate progress. Milbank Q. 1998;76:593-624.
30. Solberg LI, Kottke TE, Brekke ML, Magnan S. Improving prevention is difficult. Eff Clin Pract. 2000;3:153-155. Available at: http://www.acponline.org/journals/ecp/mayjun00/solberg_2.htm. Accessed January 6, 2004.
