The American Journal of Managed Care, May 2012, Volume 18, Issue 5

CAH Staff Perceptions of a Clinical Information System Implementation

This study examines staff perceptions of patient care quality and processes before and after implementation of a comprehensive clinical information system in 7 critical access hospitals.

Objectives: This study examines staff perceptions of patient care quality and processes before and after implementation of a comprehensive clinical information system (CIS) in critical access hospitals (CAHs).

Study Design: A prospective, nonexperimental evaluation study.

Methods: A modified version of the Information Systems Expectations and Experiences (I-SEE) survey instrument was administered to staff in 7 CAHs annually over 3 years to capture baseline, readiness, and postimplementation perceptions.

Results: Descriptive analyses examined 840 survey responses across 3 survey administrations and job categories (registered nurses [RNs], providers, and other clinical staff). Analysis of variance compared responses for main effects (ie, administration, staff position, hospital, and cohort) and interactions between groups over time. Correlations examined the relationships between variables. In general, the responses indicate a high level of positive perceptions regarding the processes and quality of care in these hospitals. For most items, responses were quite consistent across the 3 survey administrations. Significant changes occurred for 5 items: 4 reflecting improved information flow and communication, and 1 reflecting a decline in a patient care item. Overall, providers had lower mean responses than nurses and other clinical staff. Significant interactions between administration and job category were found for 4 items.

Conclusions: Even though staff had overwhelmingly positive perceptions of patient care quality and processes, significant differences between providers, RNs, and other clinical staff were observed. Variability was also found across CAHs. Research on CIS implementation in small hospitals is rare and needed to guide the identification of factors and strategies related to success.

(Am J Manag Care. 2012;18(5):244-252)

The study results indicate that even though overall responses show a high level of positive perceptions of the processes and quality of care in these hospitals, significant variability across job positions and critical access hospitals is observed.

  • Hospital employees’ perceptions of how new information technology will affect their work flow and the actual quality of care they deliver are very important.

  • Our study stands apart because little research has focused on information systems in small hospitals; with the move to meaningful use requirements in all hospitals, research is sorely needed to guide implementation efforts.

A recent survey by the American Hospital Association found that small, rural, and critical access hospitals (CAHs) had consistently lower rates of adoption of electronic health records (EHRs) than their larger or urban counterparts.1 The major models of information technology (IT) use indicate that perceptions of the impact on work and outcomes are significant determinants of technology use and adoption.2 Research has shown that users’ attitudes regarding risks to service quality and disruptions in work flow play a large role in the use of health information technology (HIT).3-5 Research on the evaluation of IT systems in healthcare organizations is quite limited, with the available studies differing in organizational settings, IT approach, and evaluation techniques.6 In addition, there is a shortage of studies on measurement tools for IT evaluation.7-9

Research on the effect of IT implementation in small, rural, and critical access hospitals is limited and is needed to explore healthcare users’ role and viewpoint related to the lagging EHR adoption in these hospitals. For small hospitals, IT implementation represents a significant investment, and evaluation is therefore critical to ensure its success. This study examines staff perceptions of patient care quality and processes in 7 CAHs before and after implementation of a comprehensive clinical information system (CIS). It follows Kaplan and Shaw’s recommendations for IT evaluation and examines how well a system works with affected users in a particular setting.10

METHODS

Study Hospitals

Mercy Health Network—North Iowa consists of Mercy Medical Center—North Iowa (MMC-NI), 9 CAHs, and a primary physician network. MMC-NI is a rural referral hospital owned by Trinity Health in Novi, Michigan, and it in turn owns 1 of the CAHs and manages the others. Seven of the 9 network CAHs collaborated in a comprehensive EHR and computerized provider order entry (CPOE) system implementation (termed the EHR10 project) as part of Trinity Health’s extensive CIS initiative.11 As shown in Table 1, the 7 study CAHs have 25 or fewer acute care beds (1 includes a 10-bed psychiatric unit) and 2 have attached nursing homes.12 Full-time inpatient personnel range from 75 to 180 employees; all 7 hospitals provide surgical services, and all but 2 offer obstetrics services.

CIS Implementation and Survey Timing

The EHR10 implementation process extended over several years of planning and execution.11 A well-formulated readiness process documented the progress through project milestones. The CAHs, along with MMC-NI, worked together to define the structure for communication and the decision making that would enable effective change management across the 7 CAHs. To meet the implementation goals, the CAHs worked collaboratively to create standardized processes and system designs. Major activities involved setting the stage for network collaboration, which included identifying both local and network-wide structures for communication and decision making. CAH staff identified to fill key project roles were freed from their regular duties and trained in the use of the readiness plan and project tools. Communication was ongoing, largely electronic, and supplemented by monthly in-person meetings of each task-defined affinity group and the overall leadership team.

The study survey was administered 3 times at annual intervals. The first survey (administration 1) was timed to precede major changes related to the EHR implementation and captured the steady state baseline (March 2007).

The second survey (administration 2) occurred a year later (March 2008) after phase 1 of readiness had occurred. At this point, CAH personnel had become used to the read-only electronic capacity (eg, online laboratory reports), work flow processes had been redesigned, hardware had been acquired and tested, and “super users”—staff who had earlier and more hours of training and practice—were being trained. The administration 2 survey was distributed a few months before the “Go-Live” date of phase 2—the specific date when the EHR/CPOE system was activated. At this point most of the CAH personnel had not yet undergone training for full EHR/CPOE implementation, but were generally aware of the planned changes in work flow, communication, and care processes.

The CAHs followed the Trinity Health readiness process, which pays particular attention to end-user training.11 Training was conducted over a 3-month period immediately preceding Go-Live for all employees who would use the system. Each CAH identified trainers and super users.13 The CAHs varied somewhat in how they managed work schedules during training; most used weekly 4-hour formal training sessions supplemented by ongoing practice sessions. For implementation purposes the 7 CAHs were divided into 2 cohorts; the first activated Go-Live in July 2008, and the second in September 2008.

The third survey (administration 3) occurred 1 year later (March 2009), 6 to 8 months after Go-Live and just after automated medication dispensing cabinets and bar-code medication administration were implemented.

Survey Design and Administration

A previously validated version of the instrument (the Information Systems Expectations and Experiences [I-SEE] survey) was designed to assess expectations and experiences regarding the impact that CISs have on work processes and outcomes.13,14 For the previous version, the instructions asked respondents to indicate how each item would change (or had changed) as a result of the new CIS, with response options ranging from “much worse” to “no change” to “much improved.”

As HIT is implemented within the context of specific clinical and administrative work processes, the resulting information flow and work flow become much more closely bundled and integrated and the research focus shifts to how communications and work processes are changed. Thus, for the current study, the I-SEE survey item content was retained, but the instructions were modified to remove reference to the CIS, thus creating a “current perception” survey focused on patient care quality and processes. In this way the focus shifted to being able to assess perceptions of the underlying care processes regardless of the presence or stage of an HIT implementation. This modification also facilitated direct comparison of perceptions across 3 survey administrations each conducted 1 year apart. Specifically, respondents were asked to indicate the degree to which they agreed with statements related to work flow, information flow, and selected care processes in their hospital at a given point in time. The survey items were measured on a 6-point Likert scale (ie, strongly disagree, moderately disagree, mildly disagree, mildly agree, moderately agree, or strongly agree) with an option to indicate “don’t know” or “not applicable.”
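
For readers working with similar survey data, the sketch below shows one way the 6-point agreement scale could be coded numerically, with “don’t know” and “not applicable” responses treated as missing. This is not the study’s code; the response labels and column handling are assumptions for illustration only.

```python
# Illustrative sketch (not the study's actual code): numeric coding of the
# 6-point agreement scale, with "don't know" / "not applicable" set to missing.
import pandas as pd

SCALE = {
    "strongly disagree": 1, "moderately disagree": 2, "mildly disagree": 3,
    "mildly agree": 4, "moderately agree": 5, "strongly agree": 6,
}

def code_item(responses: pd.Series) -> pd.Series:
    """Map response text to 1-6; unmapped values (don't know, N/A) become NaN."""
    return responses.str.strip().str.lower().map(SCALE).astype(float)

# Hypothetical example
raw = pd.Series(["Strongly agree", "don't know", "Mildly disagree"])
print(code_item(raw).tolist())  # [6.0, nan, 3.0]
```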

Approximately 700 surveys were mailed each year. The Institutional Review Board—approved survey packets were mailed to the Human Resources director at each CAH, who distributed them to all hospital personnel except facility support and service support employees who had no interaction with the CIS. No identifying information was collected except for hospital name, years of healthcare experience, and the work position category. For each administration, a follow-up survey was distributed to increase the response rate.

Data Processing and Statistical Analysis

Surveys were entered by 2 individuals independently into a Microsoft Access template, data sets were compared, discrepancies were corrected, and a final data set was created. Out of the 1201 surveys returned, 37 lacked hospital identification, 16 lacked an identifiable work position, and 14 survey respondents were facility/service support employees. These surveys were deleted, leaving 1132 usable surveys.
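
A minimal sketch of the double-entry comparison step is shown below, assuming hypothetical file and column names; the study itself used a Microsoft Access template rather than the tooling shown here.

```python
# Illustrative sketch only (hypothetical file and column names): flag cells
# where the two independently keyed data sets disagree so they can be
# checked against the paper surveys and corrected.
import pandas as pd

entry_a = pd.read_csv("survey_entry_a.csv", index_col="survey_id").sort_index()
entry_b = pd.read_csv("survey_entry_b.csv", index_col="survey_id").sort_index()

# DataFrame.compare returns only the cells that differ between the two copies
# (assumes both files share the same survey IDs and column layout)
discrepancies = entry_a.compare(entry_b)
discrepancies.to_csv("discrepancies_to_resolve.csv")
print(f"{discrepancies.shape[0]} surveys contain at least one mismatched field")
```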

To facilitate comparison of responses according to staff positions, the 15 position options on the survey were combined into 4 groups: providers (physicians and mid-level providers), registered nurses (RNs), other clinical, and nonclinical. “Not applicable” responses were relatively common among the nonclinical group (more than 11% overall and 55% for some items), reflecting employees who did not have clinical duties. This group was deleted, leaving 840 relevant surveys for subsequent analyses. Of the 840 surveys, 48 (5.7%) were completed by providers, 341 (40.6%) by RNs, and 451 (53.7%) by other clinical personnel. We do not have specific counts of the personnel who actually received the surveys, but of the 221 RNs at these 7 CAHs,12 135 (62% response rate) completed the first survey, 106 (48% response rate) completed the second survey, and 96 (43% response rate) completed the third survey. Of the 46 physicians and 25 mid-level providers affiliated with these hospitals, 48 surveys were completed across the 3 administrations, for an estimated response rate of 26% among providers.

Analyses were conducted using SAS version 9.1 (SAS Institute Inc, Cary, North Carolina). Analysis of variance compared responses for main effects (ie, administration, staff position, hospital, and cohort) and interactions between groups over time with post-hoc t tests used to examine significant findings. Correlations examined relationships between variables.
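
The analyses were run in SAS 9.1; the Python sketch below is only a rough analogue of the modeling approach described here (a factorial analysis of variance with an administration-by-staff-position interaction, plus an item-to-item correlation), assuming a hypothetical long-format data layout and column names.

```python
# Rough Python analogue of the reported analyses (the study used SAS 9.1).
# Data layout and column names are hypothetical: one row per respondent,
# with numeric item scores and categorical design factors.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols
from scipy import stats

df = pd.read_csv("survey_long.csv")

# Main effects plus the administration x staff-position interaction;
# cohort is nested within hospital, so it would be tested in a separate model.
model = ols(
    "item_score ~ C(administration) * C(staff_position) + C(hospital)",
    data=df,
).fit()
print(sm.stats.anova_lm(model, typ=2))  # F tests for main effects and interaction

# Pearson correlation between two item scores (pairwise complete cases)
pair = df[["item_13", "item_14"]].dropna()
r, p = stats.pearsonr(pair["item_13"], pair["item_14"])
print(f"item 13 vs item 14: r = {r:.2f}, p = {p:.3f}")
```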

RESULTS

Mean Item Responses

The mean response was calculated for each survey item at each administration by averaging responses across all 3 staff positions (providers, RNs, and other clinical staff). In general, the responses indicated a high level of agreement and thus a positive perception of the processes and quality of care in these hospitals. As shown in Table 2, the items with mean ratings above 5 on the 6-point scale (approaching “strongly agree”) at all administrations were as follows: “Access to information to make good patient care decisions is available,” “Communications ensuring high-quality and safe patient care routinely occur when patients are transferred to other facilities,” “I get a great deal of professional satisfaction from my job,” “I enjoy my job,” “Overall patient care is safe in the areas I work,” and “Patient care is consistently given according to [the 9 Rights]”15 (items 31a through 31i). The 2 items that received an average rating below 4, indicating that respondents disagreed with the statement, were “Too many verbal orders are made on my unit” and “Patients are rarely asked the same questions by the staff.” Low scores (indicating disagreement) on the first item are clearly favorable, and given Joint Commission and other recommendations to confirm patient identity, asking patients the same questions is also favorable in some situations.

Changes Across Administrations

Analyses of mean item responses indicated significant differences across survey administrations for 4 information flow and communication items and for 1 patient care item. Significant improvement across time was seen for 4 items: “I can quickly access information that I need to share with patients and families,” “Patient-related clinical data are available to decision makers in a timely manner,” “Patient care orders are consistently legible and clear,” and “Too many verbal orders are made on my unit” (this item decreased, which indicated improvement after implementation). One particular item, “Staff are alerted to potential patient care errors before they occur,” showed significant decreases across administrations. It should be noted that despite the major changes associated with the implementation of EHR and CPOE systems, staff perceptions were not adversely affected for 34 of the 39 items.

Differences Among Staff Categories

To explore whether the 3 staff position categories (providers, RNs, and other clinical staff) differed, analyses compared their mean responses, averaged across the 3 administrations. As shown in Table 3, significant main effects for staff position were found for 22 of the 39 items. In 15 of these, providers had lower responses than the other 2 staff categories. In 11 of these, other clinical staff had higher responses than the other 2 categories. Significant staff position × administration interactions were found for items 10, 11, 19, and 26. As shown in Figure 1, for the work flow and work life items, RNs and other clinical staff showed relatively consistent responses across administrations, while the provider group showed significant declines in their responses from administration 2 to administration 3. An opposite pattern was found for the fourth item, with providers showing greater improvement. To further explore the pattern across job positions, analyses compared the responses of physicians and mid-level providers (averaged across time because of the small number of provider respondents). They showed significant differences on 9 items (3, 4, 5, 12, 14, 19, 21, 22, 27), with physicians having lower overall responses than mid-level providers.

Differences Across Hospitals

All but 3 items (3, 9, 10) showed significant differences across the 7 CAHs. Significant hospital × administration interactions were found for all but 4 items (1, 9, 10, 23). Significant cohort × administration interactions were found for 14 items (3, 8, 14, 15, 17, 20, 22, 26, 27, 31b, 31c, 31d, 31e, 31f). These results were due to employees at 1 particular hospital showing significant decreases in their responses from administration 2 to administration 3, while the other 6 hospitals showed relatively consistent response patterns across time.

Relationship of Survey Responses to Subsequent CPOE Adherence

As shown in Figure 2, CPOE adherence rates increased over the first few months and then leveled off. Survey responses for each of the 48 providers who completed the survey at some point were correlated with CPOE adherence rates averaged across the first 8 months of use within each hospital (1 rate per hospital). This exploratory analysis suggested that providers with higher survey responses on 12 items (13, 14, 17, 22, 27, 28, 31c, 31d, 31e, 31f, 31g, 31i) also had higher hospital-level average CPOE adherence rates.
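
A minimal sketch of this exploratory correlation is shown below, assuming hypothetical file and column names: each provider's item response is paired with the single mean CPOE adherence rate of that provider's hospital.

```python
# Illustrative sketch (hypothetical column names): correlate individual
# provider item responses with the mean CPOE adherence rate of the
# provider's hospital (one adherence rate per hospital, as in the text).
import pandas as pd
from scipy import stats

providers = pd.read_csv("provider_surveys.csv")  # item scores + hospital_id
adherence = pd.read_csv("cpoe_adherence.csv")    # hospital_id, mean_adherence

merged = providers.merge(adherence, on="hospital_id", how="left")
valid = merged[["item_13", "mean_adherence"]].dropna()
r, p = stats.pearsonr(valid["item_13"], valid["mean_adherence"])
print(f"item 13 vs hospital CPOE adherence: r = {r:.2f}, p = {p:.3f}")
```

Because the adherence rate varies only at the hospital level while item responses vary by individual provider, such correlations are exploratory, consistent with how the analysis is characterized above.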

DISCUSSION

Perceptions of Care Processes and Quality

The needs to improve clinical processes and work flow efficiencies, to share patient information, and to improve healthcare quality have been identified as the primary factors driving the implementation of CISs.16 A systematic review6 of factors related to the success of inpatient CISs identified system quality, information quality, usage, user satisfaction, individual impact, organizational impact, and organizational culture. The survey instrument used in the present study was specifically designed to assess these dimensions. As shown in Table 2, the staff of the 7 participating hospitals reported consistently high ratings for most items, reflecting considerable satisfaction both before and after EHR/CPOE implementation with the quality of patient care, clinical processes, communication, work flow efficiency, and flow of information. In addition, exploratory analyses suggested that survey responses may be related to CPOE adherence rates among providers.

Differences Across Professions

In addition to examining changes in perceptions of care quality and processes over time, the current methodology permitted us to explore differences among employee groups. In the current analyses, significant staff position × survey administration interactions were found for 4 items. All 3 groups showed a significant increase after full implementation for “Patient care orders are consistently legible and clear,” with the provider group showing the largest increase. In contrast, the provider group showed significant declines in their responses after full implementation for “I spend about the right amount of time recording diagnoses and symptoms,” “I spend about the right amount of time preparing discharge documents,” and “The work processes I commonly use are efficient.” RNs and other clinical staff showed relatively consistent responses across the 3 administrations. These particular items tap into the impact of the EHR/CPOE system on perceptions of time efficiency. Poissant and colleagues17 conducted a systematic review of studies on this topic and found that nurses are more likely to gain time efficiencies through the use of CIS than physicians. In particular, nurses saved documentation time once an EHR was implemented, but physicians spent more time after CPOE was introduced. To explain these findings, it was speculated that nurses and physicians engage in different types of documentation and work processes. Thus, the current findings on the differences between provider and nurse perceptions are consistent with the systematic review17 of actual time efficiencies. A qualitative study in community hospitals18 found that nurses perceived that EHRs helped them deliver safer patient care through increased information access, improved organization and efficiency, and helpful screen alerts. However, those nurses also perceived that EHRs reduced the quality of care through decreased interdisciplinary communication, impaired critical thinking, and increased demands on work time.18

Besides perceptions of time efficiencies, the current survey included items related more generally to perceptions of healthcare quality. Previous studies have examined safety climate perceptions among staff groups and generally found that physicians expressed more positive perceptions of safety climate than nurses and other clinical personnel.19,20 These previous studies were generally conducted in large hospitals. Our findings may be among the first to examine this pattern among staff groups in CAHs. In contrast with previous research, multiple items on the current perception survey showed significant differences between job categories, with providers, and especially physicians, often registering the lowest scores and other clinical personnel often registering the highest. However, in the current study, few items showed significant differences across job categories in the pattern across time. Thus providers expressed more negative perceptions for most items at all administration times, with only a few work flow items sensitive to the CIS implementation. It should be noted that a limitation of the current study is that the response rate is not known for the other clinical personnel and is estimated to be 48% for nurses and 26% for providers. Given that a small percentage of the surveys were completed by physicians and mid-level providers, differences between job categories must be interpreted cautiously, and future research is warranted to further explore these patterns.

The Importance of What Did Not Change

While the current survey tool was helpful in identifying specific positive and negative changes, it was equally useful for identifying what did not change. Of necessity, hospitals implementing CISs must take on the tasks of planning, work flow redesign, staff training, and organizational readiness with existing resources. Such implementations are carried out simultaneously with the ongoing provision of patient care services. With additional workloads, changes in work processes, and the stress of learning new ways of documenting care, the potential for major disruptions to existing care processes, communication patterns, and staff satisfaction arises. The survey results strongly suggest that, despite the potential for such disruptions, the 7 study hospitals were very successful in avoiding disruptions to care during these complex implementations. We speculate that the comprehensive readiness, planning, and systematic implementation supported by network resources in these hospitals minimized disruption. Thus, the survey instrument is useful in identifying not only what improved or deteriorated, but also what did not change.

Comparison of the Current Perception Survey With Previous Versions

The current survey was developed from a previous version, the I-SEE survey, which was validated across samples,13 and for which employee responses were shown to be correlated with super user availability.14 In addition, the I-SEE survey items showed significant changes in a group of nurses at a rural referral hospital undergoing the same CIS implementation as in the present study.21 The current study furthers exploration of the use of employee ratings of patient care quality and processes, using a variation of the previous instrument that focuses on current perceptions.

While the content of most survey items is similar, the current survey and the previous version differ in terms of specific survey item wording, instructions, response options, and response scaling. Therefore, direct comparisons of responses to the 2 surveys cannot be made. However, the pattern of differences does help to illuminate how the 2 survey versions differ. In particular, the sample of nurses using the previous version showed significant changes across administrations on many more items21 than in the current project. It appears that asking hospital employees to rate their perceptions of patient care quality and work processes yields relatively stable responses over time. We speculate that, in contrast, asking hospital employees how patient care quality and work processes are likely to change with a major HIT implementation taps into their perceptions of work flow modifications and the adjustments needed for such a large intervention. Thus, both versions of the survey are useful, each serving different purposes.

IMPLICATIONS

This study focused on changes in staff perceptions of patient care and work processes related to the implementation of a comprehensive CIS in 7 CAHs. A particular strength of this study was its longitudinal nature, which provided useful insight into changes during and after the process of CIS implementation. As we have shown here, widely held positive perceptions by hospital staff are possible and can be maintained, even after considerable technological change. Overall, even though overwhelmingly positive responses occurred, significant variability among CAHs was observed. The CAHs collaborated on a shared approach to change management; however, we speculate that more positive responses occurred at the hospitals with stronger leadership involvement, provider and staff buy-in, and financial resources. Implementation of such systems in small rural hospitals has been rare1,22,23 but should expand as hospitals strive to implement EHR/CPOE in order to achieve meaningful use. Thus, research on CIS implementation in all settings, and especially small hospitals, is needed to guide the identification of factors and strategies24 necessary for success.

Acknowledgment

The authors thank Kwame A. Nyarko, BA, for assistance with data input.

Author Affiliations: From Center for Health Policy and Research and Department of Health Management and Policy (MMW, SV, TRM), University of Iowa, Iowa City, Iowa; Mercy Medical Center—North Iowa (JLL), Mason City, Iowa; Ellsworth Municipal Hospital (JOB), Iowa Falls, Iowa; Department of Management and Marketing (JRBH), University of Alabama, Tuscaloosa, Alabama; Center for Health Care Quality and Department of Health Management and Informatics (DSW), University of Missouri, Columbia, Missouri.

Funding Source: This work was supported by funding from the Agency for Healthcare Research and Quality (5UC1HS016156), “EHR Implementation for the Continuum of Care in Rural Iowa,” conducted in Mercy Health Network—North Iowa, the University of Iowa Center for Health Policy and Research, the University of Missouri Center for Health Care Quality, and Trinity Health of Novi, Michigan.

Author Disclosures: The authors (MMW, SV, JLL, JOB, TRM, JRBH, DSW) report no relationship or financial interest with any entity that would pose a conflict of interest with the subject matter of this article.

Authorship Information: Concept and design (MMW, SV, TRM, DSW); acquisition of data (MMW, JOB, TRM, DSW); analysis and interpretation of data (MMW, SV, TRM, JRBH); drafting of the manuscript (MMW, SV, JRBH); critical revision of the manuscript for important intellectual content (SV, JLL, JRBH); statistical analysis (SV); provision of study materials or patients (JOB); obtaining funding (JLL, JOB, DSW); administrative, technical, or logistic support (JLL); and supervision (MMW).

Address correspondence to: Marcia M. Ward, PhD, Center for Health Policy and Research, College of Public Health, University of Iowa, N250 CPHB, Iowa City, IA 52242. E-mail: marcia-m-ward@uiowa.edu.

REFERENCES

1. Jha AK, DesRoches CM, Kralovec PD, Joshi MS. A progress report on electronic health records in US hospitals. Health Aff (Millwood). 2010;29(10):1951-1957.

2. Kukafka R, Johnson SB, Linfante A, Allegrante JP. Grounding a new information technology implementation framework in behavioral science: a systematic analysis of the literature on IT use. J Biomed Inform. 2003;36(3):218-227.

3. Hu PJ, Chau PYK, Sheng ORL. Adoption of telemedicine technology by health care organizations: an exploratory study. J Organ Comput Electron Commerce. 2002;12(3):197-221.

4. Zheng K, Padman R, Johnson MP, Diamond MS. Understanding technology adoption in clinical care: clinician adoption behavior of a point-of-care reminder system. Int J Med Inform. 2005;74(7-8):535-543.

5. Oroviogoicoechea C, Watson R. A quantitative analysis of the impact of a computerized information system on nurses’ clinical practice using a realistic evaluation framework. Int J Med Inform. 2009;78(12):839-849.

6. Van der Meijden MJ, Tange HJ, Hasman A. Determinants of success of inpatient clinical information systems: a literature review. J Am Med Inform Assoc. 2003;10(3):235-243.

7. Lee T-T. Evaluation of computerised nursing care plan: instrument development. J Prof Nurs. 2004;20(4):230-238.

8. Otieno OG, Toyama H, Asonuma M, Kanai-Pak M, Naitoh K. Nurses’ views on the use, quality and user satisfaction with electronic medical records: questionnaire development. J Adv Nurs. 2007;60(2):209-219.

9. Oroviogoicoechea C, Watson R, Beortegui E, Remirez S. Nurses’ perception of the use of computerised information systems in practice: questionnaire development. J Clin Nurs. 2010;19(1-2):240-248.

10. Kaplan B, Shaw NT. Future directions in evaluation research: people, organizational and social issues. Methods Inf Med. 2004;43(3):215-231.

11. Crandall D, Brokel J, Schwichtenberg T, et al. Redesigning care delivery through health IT implementation: exploring Trinity Health’s IT model. J Healthc Inf Manag. 2007;21(4):41-48.

12. IHA 2007 Profiles. Hospital and health system characteristics. Iowa Hospital Association website. http://www.ihaonline.org/. Accessed September 1, 2009.

13. Wakefield DS, Halbesleben JRB, Ward MM, et al. Development of a measure of clinical information systems expectations and experiences. Med Care. 2007;45(9):884-890.

14. Halbesleben JR, Wakefield DS, Ward MM, Brokel J, Crandall D. The relationship between super users’ attitudes and employee experiences with clinical information systems. Med Care Res Rev. 2009;66(1):82-96.

15. Wakefield DS, Ward MM, Wakefield BJ. A 10-Rights framework for patient care quality and safety. Am J Med Qual. 2007;22(2):103-111.

16. Houser SH, Johnson LA. Perceptions regarding electronic health record implementation among health information management professionals in Alabama: a statewide survey and analysis. Perspect Health Inform Manag. 2008;5(6):1-15.

17. Poissant L, Pereira J, Tamblyn R, Kawasumi Y. The impact of electronic health records on time efficiency of physicians and nurses: a systematic review. J Am Med Inform Assoc. 2005;12(5):505-516.

18. Kossman SP, Scheidenhelm SL. Nurses’ perceptions of the impact of electronic health records on work and patient outcomes. Comput Inform Nurs. 2009;26(2):69-77.

19. Duclos A, Gilliaizeau F, Colombet I, Coste J, Durieux P. Health staff perception regarding quality of delivered information to inpatients. Int J Qual Health Care. 2008;20:3-21.

20. Singer SJ, Gaba DM, Falwell A, et al. Patient safety climate in 92 US hospitals: differences by work area and discipline. Med Care. 2009;47(1):23-31.

21. Ward MM, Vartak S, Schwichtenberg T, Wakefield DS. Nurses’ perceptions of how clinical information system implementation affects workflow and patient care. Comput Inform Nurs. 2011;29(9):502-511.

22. Li P, Bahensky JA, Jaana M, Ward MM. Role of multihospital system membership in electronic medical record adoption. Health Care Manage Rev. 2008;33(2):1-9.

23. Ward MM, Jaana M, Bahensky JA, Vartak S, Wakefield DS. Clinical information system availability and use in urban and rural hospitals. J Med Syst. 2006;30(6):429-438.

24. Bahensky J, Ward MM, Nyarko K, Li P. HIT implementation in critical access hospitals: extent of implementation and business strategies supporting IT use. J Med Syst. 2011;35(4):599-607.
