Jason J. Luke, MD, FACP
Since approximately the 1950s, the United States has been the international leader in funding for biomedical research, including cancer research. Over the past several years, however, this funding has dropped off substantially, and the decline is likely to translate into a number of adverse consequences in both the near and distant future.
During much of the past 10 years, the United States Congress has looked to trim budgets wherever possible. Due to an inability to build consensus around reforming the outdated payment systems for Medicare and Medicaid, discretionary spending has been the target of nearly all budget cuts. This has led to historically low funding levels for governmentally sponsored biomedical research, and the effects of this are likely to prove significant in time.
A recent report published in the Journal of the American Medical Association detailed this stark change in funding priorities over the past 20 years.1
Between 1994 and 2004, funding for biomedical research increased by approximately 6% annually; however, this fell precipitously to 0.8% annually from 2004 through 2012 (the last year of analysis).
Over a similar period, from 1994 to 2012, funding from private or commercial sources increased approximately 12% annually. This change in funding has made the pursuit of academic science increasingly difficult and has left many potentially valuable ideas unexplored. It is telling that the current National Cancer Institute (NCI) director, as he steps down, states that while he is satisfied with his 5-year term, budget cuts and government shutdowns have made the effort of running the NCI more difficult.2
The effects of this poor funding environment have the potential for major consequences immediately as well as down the line. Commentaries have already been written describing the effect of funding cuts on the cooperative group clinical trial network.3
This mechanism that has facilitated some of the most important trials in clinical oncology is now severely limited in the number and scope of protocols that can be developed.
Another immediate issue is the lack of funding for noncommercial purposes. Certainly there are many industry partners eager to work with the academic community and practicing physicians; however, they are unlikely to pursue research that might be clinically useful but not commercially successful. Biomarker development and immuno-oncology are areas where this problem is particularly acute. Given the astronomical costs of biologic drugs, a central focus should be determining which patients will not benefit from these treatments.
However, it is not in the interest of pharmaceutical companies to investigate this beyond what is necessary for US Food and Drug Administration approval, as their market share shrinks if only a biomarker-selected subset of patients is prescribed the drug. A longer-term consideration is the development of the next generation of scientists and clinical investigators, who struggle mightily to compete with their more seasoned counterparts for the limited grant funding available. Many junior faculty colleagues of mine have expressed a sense that they have “fought the good fight” and decided that it makes more sense to join industry, where projects can be launched with dedicated funding.
Where will the money come from to fill this gap in research funding? Most likely, the answer is that the gap will not be adequately filled and the total amount of research will decline. For the research that does continue, the most obvious answer seems to be greater dependence on industry and pharmaceutical support.
This raises a host of significant issues, however. The most obvious of these is conflict of interest for the investigators producing the work. Despite investigators’ best intentions, many studies have now found that bias is introduced when research is funded by pharmaceutical companies.4
This problem then raises a larger question: Why has public funding for research via the National Institutes of Health dried up? Sadly, it seems that the answer is that “We, the people…” have, via our duly elected representatives, deemed biomedical research to no longer be of value or, more broadly, come to lack an appreciation for the importance of continued scientific research.
I was struck by a recent article in National Geographic aptly entitled, “The Age of Disbelief–Why Do Many Reasonable People Doubt Science?”5
The article clearly lays out how some of the most important health-related scientific advances of the last 100 years have come to be questioned in the lives of everyday people. From fluoridated water to supplemental vitamins and vaccinations, growing segments of the population are expressing doubt about settled science in a way that undermines the larger society’s faith in scientific research. The reasons for this are complex and often attributed to our diversified media landscape, in which people can easily find reinforcement of incorrectly held beliefs.