The outbreak of coronavirus, and the Chinese government's rare admission of culpability over its handling of the crisis, highlight just how important it is to prepare for these kinds of potentially deadly epidemics.
Of course, we have plenty of precedents: just over a century ago, the Spanish influenza pandemic of 1918 infected over 500 million people and killed more than 50 million, wiping out a staggering 3% of the world's population.
For all the very welcome advances in modern medicine since then, governments and health officials still need to be alert to the possibility of new threats to the world’s population.
Indeed, it is scientific advances that may present one of the gravest threats. Synthetic biology, while bringing massive benefits to humanity, could unleash something far more deadly. As the University of Oxford's Global Priorities Project makes clear, synthetic biology can overcome the limits currently faced by pathogens, which could lead to pandemics on an unprecedented scale. "If more widely accessible," they warn, "[it] would give terrorist groups the ability to synthesise pathogens more dangerous than smallpox."
Then there is artificial intelligence. Again, AI has the potential to dramatically improve our lives. It is not without risks, however. The doomsday scenario is that AI develops sentience, becomes self-aware and decides that we humans are surplus to requirements. This is not as wacky as it sounds, and the concern has strong support from leading academics such as Professor Nick Bostrom at the University of Oxford.
So, what is the right response to these kinds of big, theoretical-sounding threats? In the context of pandemics, the economist Tyler Cowen suggests it is time for individual governments to increase investment in vaccine research. This makes obvious sense: there could come a time when an outbreak becomes so severe that a country is unwilling to share its vaccine with others. It is therefore important that the UK is able to produce its own.
As for AI, the government should fund research into what is known as the alignment problem, ensuring that we create and develop AI which shares humanity’s values.
It’s not all about money though. Universities and businesses which are currently undertaking research into these topics need to attract top talent from around the world. For the UK, that means designing an immigration system that allows researchers to move here with the minimum of fuss.
This is pretty uncontroversial, and polling suggests voters are very supportive of such high-skilled people making a living in Britain. But while it is encouraging that the £30,000 salary requirement appears to have been dropped, there is still much more that should be done.
We also need to look at our legal framework. When scientific advances occur, legislation often fails to keep up, which means new and exciting technologies are held back by out-of-date laws. New technology may soon make it possible to enhance human bodies. Not only could this lead to a higher quality of life for those who choose enhancement, it could also prove essential to the long-term survival of humanity. However, as I pointed out in a paper for the Adam Smith Institute last year, research in this area could be hampered by archaic legislation.
We have a new government with a fresh, potentially very exciting agenda to improve the lives of ordinary British people. At the same time, the first duty of government is always to protect its people. Though it probably does not present a huge risk to the UK, the emergence of coronavirus should sharpen our resolve to prepare for other threats that have the potential to be altogether more deadly.