Special Article

Effect of the Transformation of the Veterans Affairs Health Care System on the Quality of Care

  • Ashish K. Jha, M.D.,
  • Jonathan B. Perlin, M.D., Ph.D.,
  • Kenneth W. Kizer, M.D., M.P.H.,
  • and R. Adams Dudley, M.D., M.B.A.

Abstract

Background

In the mid-1990s, the Department of Veterans Affairs (VA) health care system initiated a systemwide reengineering to, among other things, improve its quality of care. We sought to determine the subsequent change in the quality of health care and to compare the quality with that of the Medicare fee-for-service program.

Methods

Using data from an ongoing performance-evaluation program in the VA, we evaluated the quality of preventive, acute, and chronic care. We assessed the change in quality-of-care indicators from 1994 (before reengineering) through 2000 and compared the quality of care with that afforded by the Medicare fee-for-service system, using the same indicators of quality.

Results

In fiscal year 2000, throughout the VA system, the percentage of patients receiving appropriate care was 90 percent or greater for 9 of 17 quality-of-care indicators and exceeded 70 percent for 13 of 17 indicators. There were statistically significant improvements in quality from 1994–1995 through 2000 for all nine indicators that were collected in all years. As compared with the Medicare fee-for-service program, the VA performed significantly better on all 11 similar quality indicators for the period from 1997 through 1999. In 2000, the VA outperformed Medicare on 12 of 13 indicators.

Conclusions

The quality of care in the VA health care system substantially improved after the implementation of a systemwide reengineering and, during the period from 1997 through 2000, was significantly better than that in the Medicare fee-for-service program. These data suggest that the quality-improvement initiatives adopted by the VA in the mid-1990s were effective.

Introduction

The quality of health care in the United States is variable and too often inadequate.1-10 The Veterans Health Administration in the Department of Veterans Affairs (VA) has been criticized for poor quality of care.11-14 In 1995, the VA launched a major reengineering of its health care system with aims that included better use of information technology, measurement and reporting of performance, and integration of services and realigned payment policies.15-19

We sought to determine how the quality of care provided by the VA changed after reengineering and to compare the quality of care with that provided by another government-funded health care program, the Medicare fee-for-service system. We used measures of quality that primarily focus on process, rather than outcomes, to assess the short-term effect of quality-improvement initiatives, since processes can be changed more quickly and typically do not require risk adjustment.4

Methods

Design

We used data from the VA's External Peer Review Program20 to assess the quality of care from 1994 through 2000 (such data were not available before 1994). Data were collected in 1994 and 1995, just before reengineering was initiated, and annually starting in 1997. We used the VA health care system's performance scores for 1994 and 1995 as base-line values and evaluated changes in performance through 2000. Data from the External Peer Review Program are collected by abstracters trained by the West Virginia Medical Institute, a professional review organization with extensive experience and programs to ensure reliable and accurate data collection. Analyses of these data suggest high interrater reliability (kappa = 0.9).

Comparison of the VA with Medicare

We chose performance indicators in the External Peer Review Program for which there are comparable national data from the Medicare fee-for-service system.4,5 To our knowledge, there have been no prior national quality-of-care comparisons of the VA health care system and Medicare, but Jencks and colleagues recently reported the results of a national survey of the quality of care provided to Medicare beneficiaries in the fee-for-service component of the program.4,5 They used two sets of data: the first covered the period from 1997 through 1999, and the second the period from 2000 through 2001. Since the Medicare sample for screening-mammography rates included women 52 to 69 years of age, we included all female VA patients in that age range for that comparison. Because the Medicare data included adults with diabetes who were younger than 75 years of age, we likewise included only VA patients younger than 75 years of age. Because the Medicare data on the rates of influenza and pneumococcal vaccination included only patients living in the community, we excluded from that comparison any VA patients who were in nursing homes at the time of the survey.

Sampling

Data from the External Peer Review Program are obtained on an ongoing basis from cross-sectional samples. From 1994 through 1999, patients were eligible to be included in a sample if they had made three or more visits to any VA primary care or specialty clinic in the previous 12 months (i.e., two or more visits before the visit in question). In 2000, the sampling frame changed to include patients with 2 years of continuous enrollment in the VA who had made one or more visits in the previous 12 months. There was a 92.6 percent rate of concordance between these two sampling schemes (i.e., 92.6 percent of those who were in the sampling frame in 2000, with its looser eligibility standards, would also have been in the sampling frame under the prior eligibility rules).
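As a rough illustration, the two eligibility rules described above can be sketched as simple predicates; the function and field names here are hypothetical conveniences, not the actual External Peer Review Program schema:

```python
from datetime import date, timedelta

def eligible_pre2000(visit_dates, index_date):
    # 1994-1999 rule: three or more visits to a VA clinic
    # in the 12 months up to and including the index visit.
    window_start = index_date - timedelta(days=365)
    return sum(window_start <= d <= index_date for d in visit_dates) >= 3

def eligible_2000(visit_dates, index_date, enrollment_start):
    # 2000 rule: two years of continuous VA enrollment plus
    # one or more visits in the previous 12 months.
    window_start = index_date - timedelta(days=365)
    enrolled_two_years = (index_date - enrollment_start).days >= 730
    return enrolled_two_years and any(window_start <= d <= index_date
                                      for d in visit_dates)
```

In these terms, the 92.6 percent concordance corresponds to the fraction of patients satisfying `eligible_2000` who also satisfy `eligible_pre2000`.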

Table 1. Quality-of-Care Indicators and Sampling Frame Used for Veterans Affairs (VA) and Medicare Patients.

Among eligible patients, two types of sampling were done. First, a random sample of all patients was obtained that was large enough to yield stable estimates for each of the VA's 22 regional networks. Then, random samples were obtained of patients with common chronic diseases (e.g., diabetes, congestive heart failure, ischemic heart disease, and chronic obstructive pulmonary disease, identified by searching VA inpatient and outpatient data bases for specific International Classification of Diseases, 9th revision, codes). Visits made by VA employees were excluded because most such patients were not otherwise enrolled in VA health care and usually received their care elsewhere. Eligibility criteria and the sampling frame for all performance measures are described in Table 1.

Medicare data were obtained from published data4,5 on reported rates and ranges of sample sizes for each state. These data were collected by the Medicare program with the use of relatively similar sampling schemes.4,5 The comparable measures of inpatient care involved patients with myocardial infarction or congestive heart failure. These data were collected with the use of a random sample of up to 750 patients per state who were discharged with the principal diagnosis of acute myocardial infarction or congestive heart failure. Patients with contraindications to the therapy of choice (aspirin, beta-blockers, or angiotensin-converting–enzyme inhibitors in those with an ejection fraction of less than 40 percent) were excluded from the analysis.

Vaccination rates were obtained from the Behavioral Risk Factor Surveillance System of the Centers for Disease Control and Prevention. Data from this system were obtained through random-digit-dialing telephone surveys of noninstitutionalized adults. These estimates are for all persons 65 years or older; the median number of subjects was 430 per state in 1997. Mammography rates were calculated by determining whether Medicare had paid a claim for a diagnostic or screening mammogram in the previous two years. The sample of patients with diabetes consisted of randomly selected fee-for-service beneficiaries who had had two separate outpatient claims for diabetes or one such inpatient claim within the 12 months preceding the study period. The sampling criteria are further described in Table 1.

Measurements

We used all available measures of quality from the External Peer Review Program that were similar over time to assess the long-term quality of care. We included all VA performance data that were comparable to those for Medicare. These comprise frequently used measures of the quality of prevention (e.g., vaccinations and screening tests), outpatient care of chronic diseases (e.g., annual retinal examinations in patients with diabetes), and inpatient care (e.g., treatment with aspirin after an acute myocardial infarction). The specific performance measures chosen are listed in Table 1. Data on the control of hypertension were not available from the External Peer Review Program in the sample for 1994 through 1995; therefore, we used data from a study of 800 veterans with a mean age of 65.5 years.22 Since not all of these quality processes were measured in all years, results are reported when available.

Statistical Analysis

We calculated rates of services provided each year by dividing the number of patients who received each service by the number of eligible patients in the sample. We used a chi-square test for trend to assess whether performance improved during the sampling period. All analyses were prespecified, and all P values are based on two-sided tests.
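The rate calculation and chi-square test for trend can be sketched as follows. The yearly counts below are invented for illustration and are not the study's data; the test is written here in its standard Cochran–Armitage form for a linear trend in proportions:

```python
from math import erfc, sqrt

def chi_square_trend(received, eligible):
    # Cochran-Armitage chi-square test for a linear trend in
    # proportions across ordered groups (here, successive years).
    t = list(range(len(received)))                 # year scores 0, 1, 2, ...
    N = sum(eligible)
    p_bar = sum(received) / N                      # overall rate
    T = sum(ti * (xi - ni * p_bar)
            for ti, xi, ni in zip(t, received, eligible))
    var = p_bar * (1 - p_bar) * (
        sum(ni * ti * ti for ti, ni in zip(t, eligible))
        - sum(ni * ti for ti, ni in zip(t, eligible)) ** 2 / N)
    stat = T * T / var                             # ~ chi-square with 1 df
    return stat, erfc(sqrt(stat / 2))              # two-sided P value

# Invented yearly counts: patients receiving a service / eligible patients.
received = [270, 400, 550, 650, 720, 780, 800]
eligible = [1000] * 7
rates = [x / n for x, n in zip(received, eligible)]   # rate = received/eligible
stat, p = chi_square_trend(received, eligible)        # steep rise: p < 0.001
```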

National Medicare sample sizes were calculated from published reports.4,5 Since each state has a range of sample sizes listed for each measure of quality, in order to be conservative, each state was assigned the smallest number of patients in its sample range (i.e., if a state had a reported sample size of 31 to 100, we assigned that state a sample size of 31). We also performed sensitivity analyses in which national sample sizes were calculated from the largest sample sizes in the published ranges. Because these assumptions about sample size had no significant effect on the results of the comparisons of Medicare and VA data, we present only the results based on the smallest sample sizes. All analyses were performed with the use of Stata software, version 7.0.
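The conservative sample-size convention, and its effect on a VA-versus-Medicare comparison, can be sketched as follows; the state ranges, rates, and sample sizes here are hypothetical illustrations, not the published figures:

```python
from math import erfc, sqrt

def two_proportion_p(x1, n1, x2, n2):
    # Two-sided z-test for the difference between two proportions,
    # using the pooled standard error.
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return erfc(abs(z) / sqrt(2))

# Hypothetical per-state sample-size ranges for one Medicare indicator;
# each state contributes the lower bound of its published range.
state_ranges = [(31, 100), (101, 200), (201, 300)]
n_medicare = sum(low for low, high in state_ranges)   # conservative total: 333
n_va = 5000                                           # hypothetical VA sample
va_rate, medicare_rate = 0.90, 0.64                   # hypothetical rates
p = two_proportion_p(round(va_rate * n_va), n_va,
                     round(medicare_rate * n_medicare), n_medicare)
```

Using the lower bound of each state's range shrinks the Medicare denominator, which widens the standard error and therefore makes it harder, not easier, to declare a VA advantage statistically significant.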

Results

Table 2. Performance of the Veterans Affairs Program in Fiscal Years 1994–1995 through 2000.

The number of patients included in the External Peer Review Program sample varied from year to year, but 48,505 patients were included in the base-line data collection in the period from 1994 through 1995, and the size of each annual sample rose consistently until 2000, when the total was 84,503 patients. In the period from 1994 through 1995, the performance of the VA health care system was poor in nearly all areas, ranging from a 27 percent rate of pneumococcal vaccination to a 64 percent rate of breast-cancer screening among female patients (Table 2). The rates of aspirin and beta-blocker use were better, with 89 percent of patients who were admitted with a myocardial infarction receiving aspirin at the time of discharge.

The first set of data collected after the implementation of the reengineering efforts (in 1997) showed improved performance in all areas, with pneumococcal and influenza vaccination rates more than doubling, substantial increases in the rates of appropriate diabetes management, and improvements in inpatient management of acute myocardial infarction. Performance rose steadily throughout the 1990s, and by 2000, high rates of screening and vaccination, management of chronic diseases, and inpatient care were reported. There were moderate improvements in the rates of hypertension control, eye examination among patients with diabetes, and screening for colorectal cancer. For the 13 measures for which multiyear data were available, there were significant improvements in 12 measures (P for trend <0.001 by the chi-square test).

Table 3. Comparison of the Performance of the Veterans Affairs (VA) and Medicare Programs from 1997 through 2001.

When we compared VA performance scores from 1997 through 1999 among veterans who met the age and clinical criteria used to assess the quality of care received by patients in the Medicare fee-for-service system, we found 11 overlapping indicators. The performance of the VA system was significantly better than that of Medicare for all 11 measures (Table 3). The smallest difference was in the rate of annual eye examinations among patients with diabetes (absolute difference, 4 percent; P<0.001), and the largest difference was in the rate of mammography (absolute difference, 33 percent; P<0.001). Similarly, when we compared the performance of the VA system in 2000 with that of Medicare in the period from 2000 through 2001, we found 13 overlapping indicators. The VA system performed better on 12 of these indicators, whereas Medicare had a higher rate of annual eye examinations among patients with diabetes.

Finally, we were concerned that undertreatment of the elderly might bias our comparisons, since Medicare has a higher proportion of elderly patients than does the VA system. Therefore, we assessed the performance of the VA system among patients who were at least 65 years of age and those who were younger than 65 years of age. The performance of the VA system did not vary significantly according to age with respect to inpatient care and chronic disease management and was substantially better with respect to vaccinations among elderly patients than among younger patients. Therefore, if we had restricted our VA sample to those who were at least 65 years old, our conclusions would not have changed.

Discussion

We compared the quality of care in the VA health care system before and after its reengineering and found that the quality of care improved dramatically in all domains studied. These improvements were evident within two years after the system was reengineered and continued through fiscal year 2000. When we compared similar indicators of quality in the VA and Medicare fee-for-service systems during similar time periods, we found that the VA system performed better.

There are several possible explanations for the observed improvement in the VA's performance. We believe that the reengineering of VA health care, which included the implementation of a systematic approach to the measurement of, management of, and accountability for quality, was at the heart of the improvement. Routine performance measurements for high-priority conditions such as diabetes and coronary artery disease, emphasizing health maintenance and management of care, were instituted. Performance contracts held managers accountable for meeting improvement goals. Whenever possible, quality indicators were designed to be similar to performance measures commonly used in the private sector. Data gathering and monitoring were performed by an independent agency — the External Peer Review Program. Critical process improvements, such as an integrated, comprehensive electronic medical-record system, were instituted at all VA medical centers. Finally, performance data were made public and were widely distributed within the VA, among key stakeholders such as veterans' service organizations, and among members of Congress.

Another factor that might have contributed to the improvement in the VA system is a secular trend toward better performance industrywide. The indicators we used are included in either the Health Plan Employer Data and Information Set (HEDIS) for outpatient care or core measures used by the Joint Commission on Accreditation of Healthcare Organizations for inpatient care. The focus on these measures may have stimulated improvement among all providers, including the VA. However, it is unlikely to explain the bulk of the improvement in performance, since the VA is not included in HEDIS and has achieved performance levels well above those of the Medicare fee-for-service system on most indicators.

There are several possible reasons why the VA system outperformed Medicare's fee-for-service system. The structural differences between the two systems would make it difficult to ascribe the VA's superior quality to any one feature in this cross-sectional comparison. However, most of the structural differences between the two systems — such as the VA's centralized decision-making capabilities, salaried physician workforce, educational programs, and fixed budgets — were also present in 1995, when the quality of the VA system was worse. Therefore, the VA's superior quality relative to that of Medicare for the period from 1997 through 2000 probably has more to do with the quality-improvement initiatives that were instituted in the mid-1990s than with structural differences.

Some of the observed differences in performance might reflect differences in sampling. Before 2000, the VA obtained data from patients who had made at least two visits before the index visit, an approach that might have biased the results by potentially including only heavy users of the VA health care system. However, after the criteria were changed in 2000 to require only one prior visit, most of the performance rates remained essentially unchanged or improved. Furthermore, the sample population analyzed in 2000 was very similar to that of Medicare, since most patients in each system made at least one visit to a health care provider.23 Since the average annual number of visits per Medicare enrollee is five,24,25 it is unlikely that the VA's sample populations before 2000 were meaningfully different from those of Medicare.

Another potential explanation for our results is differences in patients between the two systems. However, as compared with Medicare enrollees, users of VA health care are more likely to be in poor health; to have a low level of education, disability, or a low income; to be black; and to have higher rates of psychiatric illness.26-30 These characteristics are associated with receiving poorer quality care,31-33 thus making such differences an unlikely explanation for our findings.

Although operational reorganization and the implementation of quality-management principles, including some recently advocated by the Institute of Medicine,10 were most likely important, other differences between the VA and Medicare systems may have had a role. In particular, since the mid-1990s, goal setting and resource allocation have been much more centralized in the VA system than in the private sector. This management structure may have made the VA more amenable to improvement than the less centralized Medicare fee-for-service system.

Two other important differences between the two systems are that the VA allocates its funds on a modified capitation basis34 and that VA managers know they are likely to care for their patients throughout their lives. The combination of these two factors creates an environment in which investments made in health promotion and care management offer a greater return over time for health care providers in the VA system than for those in the Medicare fee-for-service system.

The VA system did not perform as well on measures of hypertension control and colorectal-cancer screening as it did on other measures. This difference may reflect the greater dependence of these measures on the compliance of patients with medical care (especially drug therapy for hypertension or fecal occult-blood testing) and the fact that the importance of colorectal screening is less well recognized by patients and providers than is, for instance, the importance of mammography.

Our study has several limitations. Most important, our results derive from observational data, so we cannot be certain that the improvement seen reflects the quality-improvement interventions. Although the sampling methods used in the External Peer Review Program and those used by Jencks et al.4,5 are quite similar, they are not identical, and the results may therefore be affected by differences in sampling that are not apparent. However, any difference in sampling is unlikely to account for the large differences observed in performance. In addition, because the indicators we studied represent processes of care or intermediate outcomes, they reflect only selected aspects of the overall quality of care. A full assessment would require the measurement of outcomes such as mortality and patient satisfaction. However, there is strong evidence linking the processes of care we used with clinical outcomes, and as Palmer has suggested, process data may reveal more about the performance of health care providers and organizations than do outcomes data.35

Finally, we were able to measure quality in only a few clinical areas, and though the indicators target common diseases, we could not assess the quality of care provided along the entire spectrum of clinical conditions. We therefore cannot generalize our results to encompass the overall quality of care in the VA system, since the focus on these specific areas by VA management may have led to improvements in the targeted conditions alone. However, we did use data from all the clinical domains for which quality was measured.

Because the VA has electronic medical records, some of the differences we observed may be due to better documentation within the VA system than within Medicare. However, given that the Medicare rates were derived with the use of a combination of patient surveys, billing records, and detailed chart review, any deficiencies in documentation in the Medicare sample should have been mitigated somewhat. Furthermore, there may have been underreporting in the VA data set, since some VA patients get care outside the system and this care may not be documented in the VA records. If there were more complete recording of the care received by such patients, the observed differences would be expected to be greater.

Finally, we do not have detailed cost information about the changes instituted by the VA, and therefore, we cannot consider issues of costs. However, we do know that the budget of the Veterans Health Administration was essentially flat between fiscal years 1995 and 2000 while the number of patients increased by over 40 percent. Further research would be needed to determine whether the costs of the VA's quality initiatives justified the clinical benefits achieved.

In conclusion, the reengineering of the Veterans Health Administration appears to have resulted in dramatic improvements in the quality of care provided to veterans. Many of the principles adopted by the VA in its quality-improvement projects, including an emphasis on the use of information technology, performance measurement and reporting, realigned payment policies, and integration of services to achieve high-quality, effective, and timely care, have recently been recommended for the health care system as a whole by the Institute of Medicine.10 Our findings suggest that initiatives based on these principles may substantially improve the quality of care.

Author Affiliations

From the Office of Quality and Performance, Veterans Health Administration, Washington, D.C. (A.K.J., J.B.P.); the Division of General Internal Medicine, San Francisco Veterans Affairs Medical Center, San Francisco (A.K.J.); the Division of General Internal Medicine, Brigham and Women's Hospital, Boston (A.K.J.); the National Quality Forum, Washington, D.C. (K.W.K.); and the Institute for Health Policy Studies, University of California, San Francisco, San Francisco (R.A.D.).

Address reprint requests to Dr. Dudley at the Institute for Health Policy Studies, Box 0936, 333 California St., Suite 265, San Francisco, CA 94118.

References

  1. Wennberg JE, ed. The Dartmouth atlas of health care 1998. Chicago: American Hospital Publishing, 1998.

  2. Schuster MA, McGlynn EA, Brook RH. How good is the quality of health care in the United States? Milbank Q 1998;76:517-563.

  3. Kohn LT, Corrigan JM, Donaldson MS, eds. To err is human: building a safer health system. Washington, D.C.: National Academy Press, 2000.

  4. Jencks SF, Cuerdon T, Burwen DR, et al. Quality of medical care delivered to Medicare beneficiaries: a profile at state and national levels. JAMA 2000;284:1670-1676.

  5. Jencks SF, Huff ED, Cuerdon T. Change in the quality of care delivered to Medicare beneficiaries, 1998-1999 to 2000-2001. JAMA 2003;289:305-312.

  6. The President's Advisory Commission on Consumer Protection and Quality in the Health Care Industry. Quality first: better health care for all Americans: final report to the president of the United States. Washington, D.C.: Government Printing Office, 1998.

  7. Reducing the costs of poor-quality health care through responsible purchasing leadership. Chicago: Midwest Business Group on Health, 2002.

  8. Asch SM, Sloss EM, Hogan C, Brook RH, Kravitz RL. Measuring underuse of necessary care among elderly Medicare beneficiaries using inpatient and outpatient claims. JAMA 2000;284:2325-2333.

  9. Chassin MR, Galvin RW. The urgent need to improve health care quality: Institute of Medicine National Roundtable on Health Care Quality. JAMA 1998;280:1000-1005.

  10. Committee on Quality of Health Care in America. Crossing the quality chasm: a new health system for the 21st century. Washington, D.C.: National Academy Press, 2001.

  11. Gardner J. VA on the spot: care quality, oversight to be probed by Congress. Mod Healthc 1998;28:39.

  12. Kent C. Perspectives: VA under fire for quality of care. Faulkner Grays Med Health 1991;45:Suppl 4p.

  13. Holloway JJ, Medendorp SV, Bromberg J. Risk factors for early readmission among veterans. Health Serv Res 1990;25:213-237.

  14. Zook CJ, Savickis SF, Moore FD. Repeated hospitalization for the same disease: a multiplier of national health costs. Milbank Mem Fund Q Health Soc 1980;58:454-471.

  15. Kizer KW. The “new VA”: a national laboratory for health care quality management. Am J Med Qual 1999;14:3-20.

  16. Journey of change. Washington, D.C.: Department of Veterans Affairs, 1997.

  17. Kizer KW. Health care, not hospitals: transforming the Veterans Health Administration. In: Dauphinais GW, Price C, eds. Straight from the CEO: the world's top business leaders reveal ideas that every manager can use. New York: Simon & Schuster, 1998.

  18. Demakis JG, McQueen L, Kizer KW, Feussner JR. Quality Enhancement Research Initiative (QUERI): a collaboration between research and clinical practice. Med Care 2000;38:Suppl 1:I-17.

  19. Kizer KW. Reengineering the veterans healthcare system. In: Ramsaroop P, Ball MJ, Beaulieu D, Douglas JV, eds. Advancing federal sector health care: a model for technology transfer. New York: Springer-Verlag, 2001:79-96.

  20. Doebbeling BN, Vaughn TE, Woolson RF, et al. Benchmarking Veterans Affairs Medical Centers in the delivery of preventive health services: comparison of methods. Med Care 2002;40:540-554.

  21. VHA performance measurement system: technical manual. Washington, D.C.: Office of Quality and Performance, Veterans Health Administration, 2000.

  22. Berlowitz DR, Ash AS, Hickey EC, et al. Inadequate management of blood pressure in a hypertensive population. N Engl J Med 1998;339:1957-1963.

  23. Trude S, Colby DC. Monitoring the impact of the Medicare Fee Schedule on access to care for vulnerable populations. J Health Polit Policy Law 1997;22:49-71.

  24. Mitchell JB, Menke T. How the physician fee schedule affects Medicare patients' out-of-pocket spending. Inquiry 1990;27:108-113.

  25. Older Americans 2000: key indicators of well-being. Vol. 2003. Washington, D.C.: Federal Interagency Forum on Aging-Related Statistics, 2003. (Accessed April 7, 2003, at http://www.agingstats.gov/chartbook2000/healthcare.html#Indicator29.)

  26. Kazis LE, Ren XS, Lee A, et al. Health status in VA patients: results from the Veterans Health Study. Am J Med Qual 1999;14:28-38.

  27. Kazis LE, Miller DR, Clark J, et al. Health-related quality of life in patients served by the Department of Veterans Affairs: results from the Veterans Health Study. Arch Intern Med 1998;158:626-632.

  28. Jha AK, Shlipak MG, Hosmer W, Frances CD, Browner WS. Racial differences in mortality among men hospitalized in the Veterans Affairs health care system. JAMA 2001;285:297-303.

  29. Klein RE. Data on the socioeconomic status of veterans and VA program usage. Washington, D.C.: Veterans Health Administration, 2001.

  30. Wilson NJ, Kizer KW. The VA health care system: an unrecognized national safety net. Health Aff (Millwood) 1997;16:200-204.

  31. Fiscella K, Franks P, Gold MR, Clancy CM. Inequality in quality: addressing socioeconomic, racial, and ethnic disparities in health care. JAMA 2000;283:2579-2584.

  32. Schneider EC, Zaslavsky AM, Epstein AM. Racial disparities in the quality of care for enrollees in Medicare managed care. JAMA 2002;287:1288-1294.

  33. Schneider EC, Cleary PD, Zaslavsky AM, Epstein AM. Racial disparity in influenza vaccination: does managed care narrow the gap between African Americans and whites? JAMA 2001;286:1455-1460.

  34. Committee on Veterans' Affairs. Veterans Equitable Resource Allocation system (VERA). Washington, D.C.: Government Printing Office, 1997.

  35. Palmer RH. Using health outcomes data to compare plans, networks and providers. Int J Qual Health Care 1998;10:477-483.
