WORKING PAPER 20

Do Financial Incentives Increase the Use of Electronic Health Records? Findings from an Experiment

By Lorenzo Moreno, Suzanne Felt-Lisk, and Stacy Dale

September 2013
ABSTRACT

Background

The Electronic Health Records Demonstration (EHRD), implemented by the Centers for Medicare & Medicaid Services (CMS), provided financial incentives to physician practices to use a certified EHR. Practices that met minimum EHR use requirements received payments on a graduated scale, increasing with greater use of EHR functions.

Methods

The demonstration was implemented in four sites and targeted practices with 20 or fewer providers supplying primary care to at least 50 fee-for-service (FFS) Medicare beneficiaries. The demonstration was expected to operate for five years (June 1, 2009 to May 31, 2014); however, it was canceled in August 2011 because 43 percent of the practices did not meet program requirements. The evaluation used a stratified, experimental design (412 treatment and 413 control practices) to estimate the impacts of the payments on adoption and use of EHR functionalities.

Results

In June 2011, treatment group practices were, on average, 9 to 18 percentage points more likely than control group practices to report using the 13 EHR functionalities queried at baseline (2008). The payments increased a summary score of EHR use, which ranged from 1 to 100, by more than 11 points relative to the control group average (54 versus 43).
Conclusion

Moderate incentive payments did not lead to universal EHR adoption and use in a two-year time frame. However, the demonstration showed that incentives can influence physician use of EHRs. Although these results are encouraging for the potential effectiveness of the Medicare EHR Incentive Program, they also suggest that meaningful use of EHRs on a national scale may take longer than anticipated.
For more than a decade, the Institute of Medicine, the federal government, and other influential stakeholders have envisioned health information technology (health IT) as a promising tool for improving the quality of health care and reducing costs. 1,2,3,4,5 This consensus is likely to have influenced the decision by Congress to enact the Health Information Technology for Economic and Clinical Health (HITECH) Act as part of the American Recovery and Reinvestment Act of 2009. 6 HITECH created programs to promote the adoption and use of electronic health records (EHRs) and the electronic exchange of information by eligible providers. 7 These programs provide technical assistance and other support to the target population of eligible professionals and hospitals to achieve meaningful use of EHRs. 8 The largest of these programs, the Medicare and Medicaid EHR Incentive Programs, are charged with providing financial incentives to providers who voluntarily join the program for the meaningful use of certified EHRs. 9

The Electronic Health Records Demonstration (EHRD), funded and implemented by the Centers for Medicare & Medicaid Services (CMS), was designed to evaluate whether providing financial incentives increases physician practices' adoption and use of EHRs. 10 CMS expected the use of this technology to result in structural and organizational changes that would improve the quality of care delivered to chronically ill patients with fee-for-service (FFS) Medicare coverage, while reducing the costs of care and improving practices' financial performance. Lessons from the EHRD evaluation could have direct implications for primary care providers who have joined, or are considering joining, the ongoing Medicare EHR Incentive Program and, therefore, may be of considerable interest to them, health IT policymakers, and other stakeholders.
This paper focuses on the impact of the demonstration on the adoption and use of EHRs; findings on the impacts of EHRD on quality of care and costs will be reported elsewhere.
METHODS

Study Design

CMS initially planned to implement the demonstration in 12 sites in two phases, one year apart. The agency chose four sites for Phase I: Louisiana; Maryland and the District of Columbia; southwestern Pennsylvania; and South Dakota. Phase II was to have consisted of eight more sites starting a year later. However, CMS canceled Phase II before it began because of the passage of HITECH. Therefore, EHRD consisted only of the four Phase I sites. On behalf of CMS, 14 community partners recruited 900 interested practices, which CMS screened for eligibility. The demonstration was expected to operate for five years (June 1, 2009 to May 31, 2014); however, CMS canceled it in August 2011 because 43 percent of practices left the program or did not meet program requirements. 11

The demonstration targeted practices serving at least 50 traditional FFS Medicare beneficiaries with certain chronic conditions for whom the practices provided primary care. Under the original design, primary care providers (physicians, as well as nurse practitioners and physician assistants who provide primary care) in practices with 20 or fewer providers were eligible to earn incentive payments for (1) using at least the minimum functions of a certified EHR (a systems payment, with increasing rewards for increasing use); (2) reporting 26 quality measures for congestive heart failure, coronary artery disease, diabetes, and preventive health services (a reporting payment); and (3) achieving specified standards on clinical performance measures during the demonstration period (a performance payment, with increasing rewards for better adherence to recommended care guidelines). All incentive payments under the demonstration were to be made in addition to the FFS Medicare payments practices receive for submitted claims. Physicians could have received up to $13,000 and practices up to $65,000 over
the first two years of the demonstration. Because the demonstration was terminated, the reporting and performance payments were never made; CMS made only the systems payment for the first two years of the demonstration, in fall 2010 and fall 2011, which totaled $4.5 million.

The EHRD evaluation used a stratified, experimental design to allocate the 825 eligible practices that volunteered for Phase I of EHRD to treatment and control groups (Figure 1). This design was used to achieve balance on practice characteristics that are important predictors of adoption and use of EHRs (Table 1). In February 2009, practices from the four sites were randomized in equal proportions into treatment and control practices within strata defined by site, number of primary care physicians, and whether the practice was in a medically underserved area (MUA).

The evaluation also included site visits to systematically, purposively selected practices in each of the four sites (four treatment practices and two control practices in each site), as well as telephone interviews with seven practices that voluntarily left the demonstration. A two-person team visited the practices during May and June. A semistructured protocol was used during the discussions (which lasted one to two hours per practice) with at least one physician and an administrative staff member knowledgeable about the demonstration.

Data Sources

Key measures for the evaluation, constructed from a web-based Office Systems Survey (OSS), were (1) practices' adoption and use of EHRs and other health IT, and (2) a summary (composite) score that quantifies EHR use for the calculation of the incentive payment.

Soon after the start of the demonstration, CMS determined that seven of the treatment practices and one of the control practices were ineligible because they failed to meet the terms
and conditions of the demonstration (Figure 1). An additional 43 treatment practices voluntarily discontinued participation in the intervention. The OSS was administered to treatment practices between April and June of 2010 and 2011; for control practices, it was administered only in 2011. The OSS collected information on practice characteristics, provider characteristics, and use of EHRs and other health IT. All practices that had been randomized to the treatment or control group, even those that left the intervention, were asked to participate. The final response rates were 87 and 68 percent for treatment and control group practices, respectively.

To calculate EHR summary scores for practices currently using a certified EHR, the OSS measured 53 functions (for example, prescribing medications, ordering laboratory tests and other procedures, and care management and coordination) thought to be connected to improved care (although, for many, a causative link is not yet empirically proven). These functions can also be sorted into five domains: (1) completeness of information, (2) communication about care outside the practice, (3) clinical decision support, (4) use of the system to increase patient engagement/adherence, and (5) medication safety (Supplementary Appendix Table A.1). If practices were to use all 53 functions for three-fourths or more of their patients, the total composite score would equal 100. In addition to calculating this score, composite scores were calculated for the five OSS domains. 12 (Baseline scores cannot be estimated because application data on EHR/other health IT use are available for only 13 of the 53 functions.)

Based on the total composite score for each treatment practice, CMS calculated payments during each demonstration year. Practices received their payments in the fall following the end of each demonstration year.
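The paper does not publish the demonstration's exact scoring formula, but the composite described above (53 functions, with a total score of 100 when all are used for three-fourths or more of patients) can be illustrated with a small sketch. The equal weighting across functions and the partial-credit rule below are assumptions made for illustration only, not the demonstration's actual scoring algorithm.

```python
# Illustrative sketch of an OSS-style composite score. Assumptions (not from
# the paper): each of the 53 functions contributes equally; a function earns
# full credit when used for three-fourths or more of patients, half credit
# for any lesser use, and no credit when unused.

N_FUNCTIONS = 53

def composite_score(usage):
    """usage: dict mapping function name -> fraction of patients (0.0-1.0)."""
    credit = 0.0
    for fraction in usage.values():
        if fraction >= 0.75:      # used for three-fourths or more of patients
            credit += 1.0
        elif fraction > 0.0:      # assumed partial credit for partial use
            credit += 0.5
    return 100.0 * credit / N_FUNCTIONS

# A practice using all 53 functions for >= 75% of patients scores 100.
full_use = {f"function_{i}": 0.8 for i in range(N_FUNCTIONS)}
print(round(composite_score(full_use)))  # 100
```

Under these assumptions, any scoring scheme of this shape would, as the text notes, reward greater use of more functions on a graduated scale.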
Figure 1. EHRD Flow Chart

Statistical Analysis

All randomized practices were included in an intent-to-treat analysis (Figure 1). Using data from all practices that completed the 2011 OSS, treatment-control differences in any EHR/health IT use and in use of each of the 13 EHR functions were estimated using separate regressions. We conducted a similar analysis for the overall OSS summary score and the five OSS domain scores.
The regressions adjusted for the stratifying variables and the baseline measure of the 13 functions. Inclusion of these variables adjusts for any differences between treatment and control groups due to survey nonresponse. Observations were weighted to adjust for survey nonresponse and nonrandom demonstration attrition. We conducted sensitivity tests to confirm that the results were similar in regressions that did not use baseline control variables and in regressions that did not use weights (Supplementary Appendix Table A.2). Analyses were conducted using Stata. 13
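One way to see the logic of the adjusted, weighted treatment-control comparison described above is the sketch below, which computes a weighted difference in means within randomization strata and averages it across strata. This is a deliberate simplification: the evaluation ran regression-adjusted models in Stata, and all data, weights, and the 10-point "true" effect below are simulated for illustration.

```python
# Hedged, simplified sketch of the impact estimation described in the text:
# a weighted treatment-control comparison computed within randomization
# strata and then averaged across strata. The actual analysis was a
# regression with baseline controls; nothing here reproduces it exactly.
import random
from collections import defaultdict

random.seed(0)

# Simulated practices: (stratum, treatment flag, nonresponse weight, outcome)
practices = []
for _ in range(800):
    stratum = random.randrange(4)             # e.g., demonstration site
    treat = random.randrange(2)               # randomized assignment
    weight = random.uniform(0.5, 2.0)         # nonresponse/attrition weight
    p_use = 0.5 + (0.10 if treat else 0.0)    # simulated 10-point effect
    outcome = 1.0 if random.random() < p_use else 0.0
    practices.append((stratum, treat, weight, outcome))

def weighted_mean(rows):
    total = sum(w for _, _, w, _ in rows)
    return sum(w * y for _, _, w, y in rows) / total

def stratified_impact(practices):
    by_stratum = defaultdict(list)
    for row in practices:
        by_stratum[row[0]].append(row)
    impact, total_w = 0.0, 0.0
    for rows in by_stratum.values():
        treated = [r for r in rows if r[1] == 1]
        control = [r for r in rows if r[1] == 0]
        w = sum(r[2] for r in rows)           # weight stratum by its size
        impact += w * (weighted_mean(treated) - weighted_mean(control))
        total_w += w
    return impact / total_w

print(f"estimated impact: {stratified_impact(practices):+.3f}")
```

With simulated data the estimate fluctuates around the built-in 10-percentage-point effect; the sensitivity tests mentioned above correspond to rerunning such an estimator with and without the weights and baseline adjustments.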
RESULTS

Participation

Practices were required to implement and use the minimum functions of a certified EHR each year to qualify for the systems payment (Table 2). Of the 412 originally randomized treatment practices, 57 percent complied with these requirements by the end of the second year of the demonstration (Table 3). The remaining practices either did not respond to the OSS, left the demonstration voluntarily (for example, because they went out of business or merged with other practices), or, more commonly, failed to meet demonstration requirements.

Impacts on Selected Measures of Health IT Use

The analysis of the 2011 OSS data found statistically and substantively significant impacts on any EHR/health IT use. Treatment group practices were nearly 10 percentage points more likely than control group practices (89 versus 80 percent) to report any EHR/health IT use (p < 0.001), controlling for use in 2008 and the stratifying variables (Table 4). Treatment practices also were 9 to 18 percentage points more likely than control practices to report using the following functions: maintaining electronic patient visit notes, keeping electronic patient problem lists, using automated patient-specific alerts and reminders, using electronic disease-specific patient registries, disseminating patient-specific educational materials, making online referrals to other providers, viewing lab tests online, printing and faxing prescriptions, and digitally transmitting prescriptions to pharmacies. In particular, large treatment-control differences exist for use of automated patient-specific alerts and reminders and for electronic disease-specific patient registries (an 18 percentage point treatment-control difference in both cases; p < 0.001). These treatment-control differences were similar in magnitude and statistical
significance, regardless of the use of baseline controls or the application of nonresponse weights (not shown).

Impacts on Health IT Summary Score

EHRD had a statistically significant and positive impact on practices' overall OSS scores, which ranged from 1 to 100, and on all five OSS domain scores (Table 5). After controlling for practice characteristics and baseline health IT use, treatment group practices' overall OSS scores were more than 11 points higher than those of control group practices, on average (54 for treatment versus 43 for control group practices; p < 0.001). In addition, treatment group practices' scores on all five domains were at least 1.5 points higher than those of control group practices (with a maximum possible score of between 17 and 22 points in each domain; p < 0.001). There were notably large impacts on the completeness of information in the EHR and medication safety domains (2.4 and 3.4 points, respectively). In analyses that limited the sample to EHR users (excluding the 96 practices without an EHR), positive impacts on health IT use were present for the completeness of information and medication safety domains; however, there were no significant treatment-control differences in communication of care outside the practice, clinical decision support, or increasing patient engagement (not shown).

Limitations

Although the EHRD evaluation relied on an experimental design, making it a rigorous study, it had several limitations. First, treatment practices could have overstated their EHR use because the level of the incentive payment was determined by the level of use they reported in the OSS. Although these simple attestations were confirmed by a second set of requests for
screenshots and more detailed responses for a random sample of respondents, there was no full, independent verification. Second, the exclusion of eight practices originally classified as eligible but later determined to be ineligible after randomization may have introduced a small degree of selection bias to the OSS intention-to-treat impact estimates. Third, because of differential response rates and nonrandom attrition between the treatment and control groups in the OSS, the comparison between these two groups could also be unreliable, despite the use of nonresponse and attrition analytic weights to minimize these biases. Finally, national generalizations cannot be made because the sample of practices was purposively selected from only four sites. Furthermore, the EHRD practices were probably more advanced in their thinking about and use of EHRs than other small practices nationally. In fact, more than two-fifths of treatment and control group practices (43 and 44 percent, respectively) used an EHR at the time of application to the demonstration (Table 4). In contrast, a national estimate for the same year (2008) suggested that only 10 to 13 percent of practices (albeit defined slightly differently) used an EHR. 14
DISCUSSION

Site visits and interviews with practices that stopped participating in the demonstration suggest two main reasons for the high attrition. First, it is difficult to implement an EHR. Second, many practices lacked some or all of the conditions needed to surmount the difficulties: project management skills; time, labor, and upfront financial resources; and a Medicare FFS caseload large enough to realize sizable incentive payments. In contrast, practices that met demonstration requirements and continued to participate seemed to already have the wherewithal and intention to implement an EHR soon, and the financial incentives of EHRD motivated them to accelerate the process. These findings are consistent with other qualitative studies of EHR implementation. 15,16,17,18,19

Lessons Learned

This evaluation provides some evidence about the health IT experience of a limited sample of small- to medium-size primary care practices serving Medicare FFS beneficiaries. Because of the demonstration's termination, the evidence must be interpreted cautiously. If the demonstration had run for the original five-year term, the effects could have been different from those estimated in the current analysis. Nonetheless, we learned two policy lessons from this limited evaluation.

First, moderate financial incentive levels can influence physician practice use of EHRs, but that level of incentives cannot achieve universal adoption and use in a two-year time frame. Although more than half the practices responded to the financial incentives for implementing and using an EHR system, many practices were not able or willing to do so within the time frame the
demonstration required. Their decision not to respond to the incentives raises the important question of whether the incentives should have been larger.

Second, targeting the incentives to individual practitioners instead of practices might be more effective. Site visits found considerable variation within practices in individual practitioners' use of EHRs; often, decision making on EHR use was at the individual level. However, incentive payments to a practice often were not passed through to individual practitioners; rather, they were used for overall support of the practice or its EHR system. In the HITECH Incentive Program, eligible professionals who receive the incentive payment can assign it to the practice; however, it remains untested whether payment to the practice or to the individual might be more effective.

Policy Context

The demonstration results must be interpreted cautiously, not only because of the early termination of the demonstration, but also because of the rapid, concurrent changes in health IT policy, including financial incentives and available technical assistance. Efforts that overlapped with demonstration goals had the potential to support and encourage treatment and control group practices' adoption and use of EHRs.

Beginning in 2011, eligible providers could begin receiving payments under the HITECH Incentive Program for demonstrating meaningful use of EHRs, which included meeting a core set of required criteria, as well as several selected criteria. There was a four-month overlap between EHRD and the HITECH Incentive Program. In fact, a sizable number of treatment and control practices that responded to the OSS reported changing decisions regarding EHR adoption or the practice's care delivery processes due to the Incentive Program by spring 2011 (44 and 41
percent, respectively). It is unclear whether the demonstration would have had as much influence on EHR adoption and use in an environment unaccompanied by additional EHR-related incentives.

Other federal, state, and local projects had goals similar to those of EHRD. Although many of these initiatives may have enhanced the effects of EHRD, those in the early stages of development seemed to have also made adoption and implementation more complicated. Based on the site visits, some of the most actively participating practices reported they were delaying initial decisions until they could determine how to meet the requirements for multiple program opportunities with a single set of practice changes.

In sum, the demonstration had favorable impacts on EHR use, even though demonstration participation for more than two-fifths of the practices was terminated, mostly because they did not implement or sufficiently use an EHR within the time frame the demonstration required. These positive findings are encouraging for the potential impact of the HITECH Incentive Program, but also cautionary regarding the expectation of rapid conversion to meaningful use on a national scale.
Table 1. Summary of Baseline Characteristics for Treatment and Control Group Practices (Percentages)

Practice Baseline Characteristics (Treatment Group / Control Group / Difference)

Site*
  Louisiana
  Maryland
  Pennsylvania
  South Dakota
Practice Size*
  1 to 2 physicians
  3 to 5 physicians
  6 or more physicians
Located in an MUA*
  Yes
  No
Practice Affiliation
  Unaffiliated
  Affiliated
Located in a Rural Area
  Yes
  No
Participating in Another EHR, Quality Improvement, or Quality Reporting Program
  Yes
  No
Number of Practices

Source: EHRD practice application database.
* Stratifying variables. For all comparisons of baseline characteristics between the treatment and control groups, p >
Affiliated: owned by a hospital, hospital system, or larger medical group, or affiliated with a larger medical group, independent practice association, physician hospital organization, or other entity.
EHR = electronic health record; MUA = medically underserved area.
Table 2. Demonstration Minimum Requirements

Requirement
1. Implement a certified* EHR by the end of the second demonstration year (May 31, 2011)
2. Use the EHR for:
   - Entering patient clinical notes
   - Recording/entering laboratory and other diagnostic test orders
   - Entering laboratory and other diagnostic test results
   - Documenting the ordering of prescription medications (new and refills)

* Valid June 2009 or later. Certification by the old Certification Commission for Health Information Technology or other certification organizations approved by the Office of the National Coordinator for Health Information Technology.
EHR = electronic health record.

Table 3. Summary of Practice Participation in the Demonstration

Status / Treatment Group / Control Group
Practices randomized at the start of the demonstration: 412 (100%) / 413 (100%)
Practices eligible for the year 2 OSS*: 346 (84%) / 389 (94%)
Completed the year 2 OSS: 311 (76%) / 267 (65%)
Reported having an EHR in the year 2 OSS: 264 (64%) / 188 (46%)
Met minimum requirements for payment at the end of year 2: (57%) / n.a.

Source: OSS, conducted in spring and summer 2010 and 2011.
Note: Numbers in parentheses correspond to the percentage of the practices in each status category relative to the total number of practices randomized to the treatment or control group.
* Excludes practices that went out of business or merged practices, and withdrawn or terminated practices that refused to be contacted. (Most withdrawn or terminated practices remained in the survey sample.) Three practices that were asked to complete an OSS validation module but did not complete it or failed to provide the requested screenshots are considered to not have completed the OSS.
The denominator for this estimate is equal to the total number of practices randomized to the treatment group (row 1), except for seven treatment practices and one control practice determined by CMS to be ineligible soon after the start of the demonstration because they failed to meet the terms and conditions of the demonstration.
EHR = electronic health record; n.a. = not applicable; OSS = Office Systems Survey.
Table 4. Impacts of EHRD on Health IT Use, by Function

Columns: Treatment Group Mean at Baseline (Fall 2008) / Control Group Mean at Baseline (Fall 2008) / Treatment Group Adjusted Mean (Spring-Summer 2011) / Control Group Adjusted Mean (Spring-Summer 2011) / Impact

EHR/Health IT Function
  Any EHR/Health IT Use ***
  Electronic Patient Visit Notes ***
  Electronic Patient Problem Lists ***
  Automated Patient-Specific Alerts and Reminders ***
  Electronic Disease-Specific Patient Registries ***
  Patient-Specific Educational Materials ***
  Online Referrals to Other Providers ***
  Laboratory Tests:
    Online order entry
    Online results viewing *
  Radiology Tests:
    Online order entry
    Online results viewing (reports and/or digital films)
  E-Prescribing:
    Printing and/or faxing Rx ***
    Online Rx transmission to pharmacy ***
Number of Practices (Weighted)
Number of Practices (Unweighted)

Sources: OSS, conducted in spring and summer 2011, and data drawn from the applications practices submitted to EHRD in fall 2008.
Notes: Baseline values are from fall 2008 application data. Reported means are regression-adjusted. Regressions control for the stratifying variables (state, MUA, practice size) and the health IT functions practices reported on the application to the demonstration, listed above. The baseline value of any EHR/health IT use is included as a control for any EHR/health IT use at the end of year 2; similarly, the baseline value of each health IT function is included as a control for the corresponding health IT function at the end of year 2. Observations for treatment and control group practices are adjusted for nonresponse to the 2011 OSS and for demonstration attrition. The weighted sample reflects all randomized practices, except for seven treatment practices and one control practice that were determined by CMS to be ineligible soon after the start of the demonstration because they failed to meet the terms and conditions of the demonstration.
* p < 0.05; ** p < 0.01; *** p < 0.001.
CMS = Centers for Medicare & Medicaid Services; EHRD = Electronic Health Records Demonstration; MUA = medically underserved area; OSS = Office Systems Survey; Rx = prescription.
Table 5. Impacts of EHRD on OSS Scores, by Domain

Columns: Treatment Group Adjusted Mean (Spring-Summer 2011) / Control Group Adjusted Mean (Spring-Summer 2011) / Difference

OSS Score (Means)
  Overall OSS score ***
OSS Score Domains
  1. Completeness of information in the EHR ***
  2. Communication of care outside the practice ***
  3. Clinical decision support ***
  4. Increasing patient engagement ***
  5. Medication safety ***
Number of Practices (Weighted)
Number of Practices (Unweighted)

Sources: OSS, conducted in spring and summer 2011, and data drawn from applications practices submitted to EHRD in fall 2008.
Notes: Reported means are regression-adjusted. Regressions control for the stratifying variables (state, MUA, practice size) and the health IT functions practices reported on the application to the demonstration (listed in Table 4). Because the OSS score could not be calculated for the baseline period from the application to the demonstration, the 13 health IT functions measured at baseline are used as a proxy for this score. Observations for treatment and control group practices are adjusted for nonresponse to the 2011 OSS and for demonstration attrition. The weighted sample reflects all randomized practices, except for seven treatment practices and one control practice that were determined by CMS to be ineligible soon after the start of the demonstration because they failed to meet the terms and conditions of the demonstration.
* p < 0.05; ** p < 0.01; *** p < 0.001.
CMS = Centers for Medicare & Medicaid Services; EHRD = Electronic Health Records Demonstration; MUA = medically underserved area; OSS = Office Systems Survey.
REFERENCES

1. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academies Press, 2001.
2. Blumenthal, D. Health Information Technology: What Is the Federal Government's Role? Washington, DC: Commonwealth Fund.
3. Shekelle, P.M., S.C. Morton, E.B. Keeler, et al. Costs and Benefits of Health Information Technology. Evidence Report/Technology Assessment 132, AHRQ Publication no. 06-E006. Rockville, MD: Agency for Healthcare Research and Quality, 2006.
4. Blumenthal, D., and J.P. Glaser. "Information Technology Comes to Medicine." New England Journal of Medicine, vol. 356, no. 24, 2007.
5. Congressional Budget Office. Evidence on the Costs and Benefits of Health Information Technology. Washington, DC: CBO, 2008.
6. U.S. Congress. American Recovery and Reinvestment Act of 2009. P.L. 111-5, February 17, 2009.
7. Redhead, C.S. The Health Information Technology for Economic and Clinical Health (HITECH) Act. Congressional Research Service Report to Congress.
8. Blumenthal, D. "Stimulating the Adoption of Health Information Technology." New England Journal of Medicine, vol. 360, no. 15, 2009.
9. Jha, A.K. "Meaningful Use of Electronic Health Records: The Road Ahead." Journal of the American Medical Association, vol. 304, no. 15, 2010.
10. Centers for Medicare & Medicaid Services. Electronic Health Records (EHR) Demonstration: Demonstration Summary. Accessed July 25, 2012, at Medicare/Demonstration-Projects/DemoProjectsEvalRpts/downloads/EHR_DemoSummary.pdf.
11. Centers for Medicare & Medicaid Services. Update: On August 11, 2011, CMS announced that the demonstration would end early. Accessed July 24, 2012, at Demonstrations-Items/CMS html.
12. Felt-Lisk, S., R. Shapiro, C. Fleming, et al. Evaluation of the Electronic Health Record Demonstration: Implementation Report 2010, Appendix A. Princeton, NJ: Mathematica Policy Research. Accessed October 5, 2012, at Systems/Statistics-Trends-and-Reports/Reports/downloads/Felt-Lisk_EHRD_Final_2010.pdf.
13. STATA software, release 11. College Station, TX: StataCorp.
14. DesRoches, C.M., E.G. Campbell, R.R. Rao, et al. "Electronic Health Records in Ambulatory Care: A National Survey of Physicians." New England Journal of Medicine, vol. 359, no. 1, 2008.
15. Fernandopulle, R., and N. Patel. "How the Electronic Health Record Did Not Measure Up to the Demands of Our Medical Home Practice." Health Affairs, vol. 29, no. 4, 2010.
16. Baron, R.J., E.L. Fabens, M. Schiffman, et al. "Electronic Health Records: Just Around the Corner? Or Over the Cliff?" Annals of Internal Medicine, vol. 143, no. 3, 2005.
17. Miller, R.H., and I. Sim. "Physicians' Use of Electronic Medical Records: Barriers and Solutions." Health Affairs, vol. 23, no. 2, 2004.
18. Frisse, M.E. "Health Information Technology: One Step at a Time." Health Affairs, vol. 28, no. 2, 2009.
SUPPLEMENTARY APPENDIX

Table A.1. EHR Functions Associated with the Five OSS Domains

Domains: Completeness of Information; Communication About Care Outside the Practice

Functions:
- Paper records transitioned to the EHR system
- Paper charts pulled for recent visits
- Method to transition paper records
- Clinical notes for individual patients
- Allergy lists for individual patients
- Problem/diagnosis lists for individual patients
- Patient demographics
- Patient medical histories
- Record that instructions/educational information was given to patients
- Print/fax lab orders
- Fax lab orders electronically from system
- Transmit lab orders electronically directly from system to facility with capability to receive
- Print/fax imaging orders
- Fax imaging orders electronically from system
- Transmit imaging orders electronically directly from system to facility with capability to receive
- Transfer electronic lab results (received in non-machine-readable format) directly into system
- Enter electronic lab results manually
- Receive electronically transmitted lab results directly into system
- Transfer electronic imaging results (received in non-machine-readable format) directly into system
- Record/enter new prescriptions and refills
- Record/enter lab orders
- Scan paper lab results
- Review lab results electronically
- Record/enter imaging orders
- Scan paper imaging results
- Review imaging results electronically
- Enter electronic imaging results manually into electronic system (whether received by fax, mail, or telephone)
- Receive electronically transmitted imaging results directly into system
- Enter requests for referrals/consultation
- Transmit medication lists/information
- Transmit lab results (machine-readable)
- Transmit imaging results (machine-readable)
- Receive electronically transmitted reports directly into system
- Print prescriptions, fax to pharmacy/hand to patient
- Fax prescription orders electronically from system
- Transmit prescription orders electronically directly from system to pharmacy with capability to receive

Domains: Clinical Decision Support; Use of the System to Increase Patient Engagement/Adherence; Medication Safety

Functions:
- Enter clinical notes into templates
- View graphs of height/weight data over time
- View graphs of vital signs data over time
- Flag incomplete/overdue test results
- Highlight out-of-range test levels
- View graphs of lab/test results over time
- Prompt clinicians to order tests/studies
- Review and act on reminders at the time of the patient encounter
- Manage telephone calls
- Exchange secure messages with patients
- Patients view records online
- Patients update information online
- Patients request appointments online (not scored)
- Patients request referrals online (not scored)
- Produce hard-copy or electronic reminders for patients about needed tests, studies, or other services
- Maintain medication list
- Generate new prescriptions
- Generate prescription refills
- Select medication (from a drop-down list, for example)
- Reference information on medications
- Reference guidelines when prescribing
- Search for or generate a list of patients:
  - Requiring a specific intervention
  - On a specific medication
  - Who are due for a lab or other test
  - Who fit a set of criteria (age, for example)
- Generate written or electronic educational information to help patients* understand their condition or medication
- Create written care plans to help guide patients* in self-management
- Prompt provider to review patient self-management plan with patient* during a visit
- Modify self-management plan as needed following a patient* visit
- Identify generic or less expensive brand alternatives at time of prescription entry
- Reference drug formularies to recommend preferred drugs
- Calculate appropriate dose/frequency
- Screen prescriptions for drug allergies, drug-drug interactions, drug-lab interactions, and drug-disease interactions

* Congestive heart failure, coronary artery disease, diabetes, and preventive care patients.
OSS = Office Systems Survey.
Table A.2. Description of Sensitivity Tests

- Compare respondents to the 2011 OSS with the full sample of randomized practices eligible to respond to the OSS, to assess nonresponse bias.
- Estimate the model with and without nonresponse weights, to test whether results were sensitive to the weights.
- Estimate the model with and without baseline control variables (the stratifying variables and baseline use of health IT functions), to test whether results were sensitive to these controls.

OSS = Office Systems Survey.
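As a concrete illustration of the second test in Table A.2, the sketch below contrasts an unweighted and a nonresponse-weighted treatment-control difference in the EHR-use summary score. All scores and weights are invented for the example (the unweighted contrast, 54 versus 43, is chosen to echo the impact reported in the abstract); the actual evaluation estimated regression models rather than simple mean differences.

```python
# Hypothetical sensitivity check: does the estimated impact change when
# nonresponse weights are applied? All numbers below are invented.

def mean(xs, ws=None):
    """Weighted mean of xs; unweighted if ws is None."""
    ws = ws or [1.0] * len(xs)
    return sum(w * x for w, x in zip(ws, xs)) / sum(ws)

# Invented summary scores and nonresponse weights for four practices per arm
treat_scores, treat_weights = [60, 50, 58, 48], [1.0, 2.0, 1.0, 2.0]
ctrl_scores, ctrl_weights = [45, 40, 44, 43], [2.0, 1.0, 1.0, 2.0]

unweighted_impact = mean(treat_scores) - mean(ctrl_scores)
weighted_impact = mean(treat_scores, treat_weights) - mean(ctrl_scores, ctrl_weights)

print(round(unweighted_impact, 2), round(weighted_impact, 2))
```

If the two estimates are close, the impact findings are robust to the weighting choice; a large gap would signal that nonresponse patterns matter for the estimate.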
Improving public well-being by conducting high-quality, objective research and surveys

PRINCETON, NJ - ANN ARBOR, MI - CAMBRIDGE, MA - CHICAGO, IL - OAKLAND, CA - WASHINGTON, DC

Mathematica is a registered trademark of Mathematica Policy Research, Inc.