Growing a better NIH

A radical way to fix the nation’s medical-research establishment

By Michael M. Crow
June 19, 2011


The United States spends around $30 billion a year on the National Institutes of Health, an agency that has been called the “jewel in the crown of the federal government.” The NIH is by far the nation’s most important single funder of medical research — the scientific work that drives our university labs, our drug companies, and our major hospitals — and its budget amounts to an enormous bet that by advancing basic medical science, we can reap improvements in national health care.

In one arena, at least, that bet is paying off: America has become the unquestioned global leader in biomedical science. As it has, the NIH has also become critically important to states like Massachusetts, which reaped more than $2 billion in funding last year, fueling a high-tech economy of high-paying jobs.

But biomedical science is not the same thing as health, and in a very important sense, our investment in the NIH is not fully paying off. The agency’s own mission statement holds that its ultimate goal is applying knowledge to “enhance health, lengthen life, and reduce the burdens of illness and disability.” And on that count, America is doing less impressively. Among the large industrialized countries of the Organisation for Economic Co-operation and Development, the United States spends the most money on its health care both as a share of gross domestic product and per capita, according to a 2009 report — but our life expectancy ranked 24th of the 30 countries in the report. And on numerous other measures — including infant mortality, obesity, cancer survival rates, the length of patient hospital stays, and the discrepancy in care between high- and low-income groups — the country fares middling to poor. Our global leadership in research, in other words, has not translated into leadership in health.

To tackle this problem, NIH director Francis Collins made news six months ago by announcing a new “translational research” institute dedicated specifically to converting laboratory findings into medicines and diagnostics that real patients will use. Its supporters hailed the idea as an important step; others in the research sector objected, worried about its impact on the current structure of an agency their work depends on for funding.

But in a larger sense, his proposal — which would create a 28th institute within the NIH bureaucracy — amounts to an admission that America’s medical research establishment is insufficiently focused on outcomes beyond science. For all the money America spends on medical science, its innovations aren’t really improving our health as much as we would hope.

So let me propose a thought experiment: What if the overriding goal of the National Institutes of Health were not further advances in medical research, but actually improving people’s health? Were we to start from scratch, what institutional arrangement would do a better job of improving the health and well-being of the citizens of the United States for the $30 billion we spend every year?

Based on three decades of experience designing large-scale knowledge enterprises, including the Earth Institute at Columbia University and, more recently, the reconceptualized Arizona State University, I would argue that a reorganized NIH would look very different from the current agency. Today we understand more fully that progress in health care results not just from scientific breakthroughs, but from knowing how to integrate those advancements with technological, behavioral, social, and cultural shifts. Understanding the impact of behavior on health, for example, can yield dramatic improvements beyond those we derive from knowledge of the basic biological science behind the disease; and even the most innovative technological advances are ineffective without improvements in clinical practice. To improve Americans’ health, then, the NIH would need to be reconfigured around the many determinants of health — with fundamental scientific research as an important component, but not the only one.

For now, these recommendations will have to remain a thought experiment: Political considerations, and the many interests that depend on the current NIH, make even the announcement of a 28th institute controversial. But as America sinks more and more money into its health care, which currently represents one-sixth of the US economy, and faces growing bills in the future, it is worth considering what the priorities of our immense medical-research apparatus really should be.

In 1945, Vannevar Bush, the director of the Office of Scientific Research and Development under Presidents Roosevelt and Truman, issued his science policy manifesto titled “Science: The Endless Frontier,” which set the stage for US government support of science in exchange for scientists securing national defense, economic prosperity, and a healthy life for the American people. Influenced by this, and especially by the success of the scientific contribution to the World War II victory, the government expanded its investment in all forms of science, but mainly in defense and health.

NIH funding in 1939 totaled less than $500,000 a year, a sum that supported just one institute. Adjusting for inflation, the budget has since increased nearly 4,000-fold — and now funds a Byzantine array of 27 separate institutes and centers. Most of these focus on specific diseases or disease clusters; others focus on stages of life, demographic sectors, or fields such as nursing or alternative medicine.

That the NIH budget has grown at such a rate reflects the strong belief of political supporters, including scientists, activist groups, and other constituencies, that more science inevitably leads to more social good. But in health care, it is becoming increasingly clear that this simplistic model is failing to deliver to its fullest potential.

Fortunately, we are starting to learn what does work. A cluster of studies from public-health and research-and-development economists indicates that progress in treating diseases results not just from scientists tackling them in a lab, but from the complex interactions that take place between academic disciplines, technological innovation, and clinical practice.

Take, for example, the advances in treating and preventing cardiovascular disease, which account for most of the gains in life expectancy in the United States during the past half-century. About one-third of the reduction in mortality has been traced to high-tech invasive treatments, such as coronary bypass surgery; one-third has been linked to medications that treat conditions such as hypertension; and one-third to behavioral changes, such as shifts in smoking habits, diet, and exercise. Biomedical research has been important to this success story, in other words — but so have basic lifestyle changes that are inexpensive for patients to make.

The story of lung cancer is a stark example of the cost of letting scientific momentum alone drive research strategies. According to the NIH’s National Cancer Institute, more than 220,000 people in the United States were diagnosed with cancer of the lung and bronchus last year. Between 80 and 90 percent of lung cancers have been linked to smoking tobacco. Yet of the $2.45 billion the NIH has spent on lung-cancer research during the past decade, most has been directed towards the discovery of molecular and genetic causes and treatments, rather than on establishing how the disease could be prevented by modifying people’s behavior. As promising as such cutting-edge research may sound, the actual results have been disappointing: Thirty-two years of data show that lung-cancer death rates overall are worse than they were in the early years of the “war on cancer,” initiated by President Nixon in the early 1970s. Research from the Centers for Disease Control and Prevention, however, indicates that comprehensive state tobacco control efforts during the past two decades correlate with decreases in mortality. If we want to attack lung cancer, in other words, investment in understanding how to change behavior is critical.

Today, despite its remarkable contributions to fundamental research, the NIH remains a fragmented bureaucracy that principally operates according to the familiar linear model of science, which assumes that more and more fundamental research leads to effective clinical applications and treatments. But basic biology can be maddeningly complex, and as the example of lung cancer shows, pouring millions — even billions — of dollars into basic biological research does not guarantee meaningful progress.

What if the NIH were reconfigured to reflect what we know about the drivers of innovation and progress in health care?

This new NIH could be structured around three institutes. A streamlined reorganization would limit the inevitable balkanization that has come from having separate NIH units dedicated to particular diseases. It would also reflect today’s scientific culture, which is moving towards convergence — especially in the life sciences, where collaboration across disciplines is becoming the norm, advances in one field influence research in others, and emerging technologies are frequently relevant across different fields.

A fundamental biomedical systems research institute could focus on the core questions deemed most critical to understanding human health in all its complexity. This would include the basic biological science focus of much of today’s NIH, but also incorporate research on how behavior, the environment, and society affect people’s health.

Take, for instance, the “obesity pandemic.” In the United States, medical costs related to obesity (currently around $160 billion a year) are projected to double within the decade. And by some estimates, indirect spending associated with obesity by individuals, employers, and insurance payors — for example, on absenteeism, decreased productivity, or short-term disability — exceeds direct medical costs by nearly threefold. The NIH certainly conducts and supports leading research on numerous factors relevant to obesity, but efforts are fragmented: Twenty-seven NIH components are associated with the NIH Obesity Research Task Force, a program established to speed up progress in obesity research.

Within this new institute on fundamental systems, scientists could better coordinate and integrate efforts to investigate all the drivers of a problem like obesity, from genetics to psychology, from sedentary lifestyles to the economics of neighborhoods that lack enough fresh fruits and vegetables.

A second institute could be devoted to research on health outcomes — that is, measuring what improves people’s health. It would seek to determine what actually works in clinical practice. In diabetes, for example, it could examine whether behavioral changes are more important than drugs in the treatment of the disease, or vice versa. It would track whether certain kinds of surgery really improve people’s lives. It would draw on the behavioral sciences, economics, technology, communications, and education as well as fundamental biomedical research. The NIH already funds research in these areas, and that existing research could serve as the basis for expanded programs operating within an organization built expressly for the purpose. If the aim were to halve national obesity levels, for example, project leaders would measure progress against the goal itself, not against some scientific milestone such as the discovery of a gene associated with the problem.

The third institute, a “health transformation” institute, could focus on a problem that increasingly looms over our entire health care system, as the population ages and health care becomes more expensive: how to deliver health care in a way that stays affordable to the nation. If we want a healthier population, this issue is far more important than any particular lab advance. This institute would develop more sustainable cost models, by integrating science, technology, clinical practice, economics, and demographics. In the private sector, this focus on getting bang for the buck is second nature by now: It’s simply what corporations have to do to be successful in a competitive high-tech world. Rather than be rewarded for maximizing the production of new technical knowledge, this institute would receive funding based on its success at producing cost-effective public health improvements.

Though many of the ideas behind this new NIH are already in circulation, building it would require a new mindset — one that focused our national drive for scientific progress and technological innovation clearly on outcomes that benefit society.

In fact, the government has been through this exercise itself on a number of occasions, most recently in 2003 when a committee of the National Research Council considered whether to rethink the organization of the NIH. The idea was deemed both too complex and politically impractical. Despite the “theoretical attractiveness” of a restructuring, the committee concluded that the structure of the NIH is the “result of a set of complex evolving social and political negotiations among a variety of constituencies including the Congress, the administration, the scientific community, the health advocacy community and others interested in research, research training and public policy related to health.”

Shifting the mindset of scientists and policy makers alike must begin somewhere else — perhaps in the system that educates the NIH researchers of tomorrow. At research universities, we’re already seeing some transdisciplinary undergraduate and graduate curriculums that stress the importance of societal outcomes rather than just isolated scientific findings. The Mayo Clinic and Arizona State University, for instance, are jointly developing a master’s degree in the science of health care delivery. In New England, a model for such programs is found in the Dartmouth Center for Health Care Delivery Science, which brings all Dartmouth schools into collaboration with the affiliated academic health system in an effort to improve outcomes nationwide.

It was Harold Varmus, a former NIH director and currently director of the National Cancer Institute, who used the “jewel in the crown” metaphor to describe the NIH in 2001. He expressed concern that “new facets are being added without much thought to overall design, providing a superficial sparkle that may be pleasing to the few, but threatening to the functional integrity of the entire gem.” Especially in these recessionary times, we can no longer afford to simply keep adding to an obsolete model without considering where it is getting us. Spending $30 billion effectively requires that the accretions of the past be replaced with a conceptual framework that better addresses the health care realities and priorities of the 21st century.

Michael M. Crow is president of Arizona State University and previously served as executive vice provost and professor of science policy at Columbia University. This article was adapted from an article published in the journal Nature.