This past Halloween, a centuries-old human skeleton resurfaced in the New Haven Green in Connecticut after Superstorm Sandy. It reminded everybody that what is today the city’s central park was once the town’s cemetery.
It also reminded some of us that in an earlier time in New England, life expectancy was about 40 years. It increased dramatically over the 20th century, and today life expectancy in the US is 78 years.
What happened in the 20th century that led to this miraculous doubling in life expectancy? A biomedical revolution. During our lifetime – and that of our grandparents – we have seen medical and public health interventions that allowed us to control parasitic and infectious diseases, eradicate scourges such as smallpox, and protect against killers like polio and measles. These advances were fuelled by scientific discoveries made in basic research – the 20th century saw everything from the discovery of antibiotics to the sequencing of the human genome.
How did this happen? Science, medicine and public health existed in the 1800s and before, but it was not until the 20th century that society recognised the value of science in promoting health and quality of life. This resulted in the creation of public agencies such as the National Institutes of Health (NIH) and the National Science Foundation (NSF) for the funding of basic research.
It was a turning point in science, health and medicine. Public funding of basic research created a partnership between scientists and the general public: the public supports science, and science benefits the public. The basic answers that emerged from these efforts have transformed our health, and our lives.
According to Research!America, 12 million cancer patients are alive today because of advances in medical science, and a person diagnosed with HIV today can expect to mark their 70th birthday, thanks to discoveries fuelled by publicly funded basic research. This partnership also created unprecedented economic growth, including the birth of the biotechnology industry, worth over $430bn today.
But that revolution could come to a screeching halt. The first decade of the 21st century has mostly seen flat budgets for basic research (with the exception of the ARRA influx of funds). For the NIH, the main federal agency that funds health-related research – everything from cancer research to emerging tropical diseases – this has meant the loss of 20 percent of its purchasing power – an effective loss of over $4bn in research funding. Current prospects for research in the US are looking bleak – sequestration, scheduled to take place in March, will result in an additional and immediate $2.5bn in cuts for this agency alone.
A cut this large is unprecedented in the history of the NIH, and could have catastrophic consequences. In 2010, the NIH created 487,900 jobs and produced $68bn in new economic activity. The cuts are projected to result in 30,000 fewer jobs and a $4.5bn decrease in economic activity just for the NIH (even more if one considers other federal agencies affected by these cuts, such as the NSF). The consequences of these cuts would also affect science, medicine and our future health.
NIH funding not only supports new discoveries – it is also used to train our future scientists. Research and discovery do not work like other industries, such as manufacturing, where cuts in investments can lead to scalable, and predictable, decreases in productivity. Instead, research works more like an ecosystem, where synergies between different components – research, discovery, education – are crucial. And like an ecosystem, if you cut enough of it, it could just collapse.
For example, the impending cuts would immediately result in thousands of trained scientists unable to use their knowledge, and in fewer resources to train our future generations of scientists. The cuts would also immediately reduce the number of newly funded scientific projects by 25 percent. This combination of fewer trained scientists, fewer research projects funded, and fewer resources for existing projects will choke the research enterprise.
It could result not in a scalable 25 percent decrease in productivity and discovery, but in the collapse of the whole system. These cuts are being considered just as international investments in research are increasing in China, India, Brazil and Europe. They could also cost the US its leadership in basic research, discovery and innovation.
For research to work, investments in research need to be predictable and at the very least keep up with the rate of inflation. The projected cuts and flat budgets will have “far-reaching consequences for scientific discovery, the economy, and global competitiveness”, as recently announced by the Society for Neuroscience.
When the skeleton was found in New Haven on Halloween’s eve, according to the New Haven Independent, a local resident commented, “You think it was a storm? I think it is a dead man trying to tell a tale.” The mass grave where the skeleton was interred is the resting place for victims of the smallpox epidemic, which in the 1800s was wreaking havoc around the world. Smallpox was the first disease ever eradicated in human history, a titanic achievement of the 20th century.
The achievements our generation will make in the 21st century – which diseases will be eradicated; what incurable illnesses will cease to burden our loved ones; how long and how well our children will be expected to live at birth – are being decided right now, as our representatives in Congress decide how much the American people are willing to invest in – or divest from – basic research.
Daniel Colon-Ramos is an Assistant Professor in the Department of Cell Biology and the Program in Cellular Neuroscience at the Yale School of Medicine and is a Public Voices Fellow with the OpEd Project. You can hear him speak about the value of basic research in medicine in this TEDx talk.