“The experiments we're doing today are riskier than they've ever been”

Biosecurity and bio-risk expert Dr Filippa Lentzos discusses the 'explosion' of high-risk research, genetic surveillance, and science misinformation – and says bioscientists must do more to understand the ways their work could be misused or abused

July 19th 2022 

Dr Filippa Lentzos is a senior lecturer in science and international security at King’s College London. She also serves as the NGO coordinator for the international treaty prohibiting biological weapons, and regularly works with the United Nations and the World Health Organization to discuss threats from high-risk bioscience, bio-warfare and bioterrorism.

Working in both King’s Department of Global Health & Social Medicine and its Department of War Studies, Lentzos examines the threat of misuses or abuses of science and technology and helps develop new ways to understand and mitigate against those risks. She believes the recent proliferation of high-risk research across the world has raised the dangers posed by life science to levels not seen before and represents a greater risk to global health security than the development of biological weapons.

She spoke to The Biologist about these and other threats – including cyber attacks on science or medical facilities, health misinformation and disinformation, DIY bioterrorism and genetic surveillance – and how biologists can do more to understand how their work may be misused.

Can you explain what your role as a biosecurity expert typically involves? Are you ever required to inspect facilities or assess capabilities?

My main role is as a lecturer and researcher, but I don’t think there is ever a typical week or month. My area of expertise is the critical study of how we conceive of biological threats and risks, and that involves engagement with a large group of stakeholders from the scientific community. It also involves speaking with policymakers, with diplomats, defence forces, industry and international organisations like the WHO and the UN, all of whom have a stake in how we frame biological risks and biological threats.

I'm not an inspector – the UN actually doesn't have biological weapons inspectors. All biological weapons are prohibited by the Biological Weapons Convention, but when that treaty was negotiated, it very quickly became apparent that it's very difficult to identify what can and can’t be misused in the life sciences, unlike with chemical or nuclear weapons.

While I am not involved in inspections, I do contribute to the debate about how we can ensure compliance and verification with the treaty, and I have been part of trust-building exercises. For example I went to Georgia’s Centres for Disease Control, which had been the subject of a lot of Russian disinformation suggesting they were creating biological weapons there. Georgia invited experts from 20 states to try to assure the world otherwise. I was invited as an independent expert. We spent two days at the facility and produced a report, which said there was nothing to suggest that it is anything other than a public health facility.

It's very interesting that we're now seeing exactly the same sort of disinformation targeting Ukrainian labs.

Lentzos during a biosecurity preparedness exercise in 2016.

Let’s talk about disinformation, which is an area of biosecurity that you study. What is the ultimate risk created by that?

It divides democratic societies and is a threat to democracy. To those who construct disinformation campaigns it often doesn’t matter whether people believe it or not, it's just about bombarding people with different versions of events to sow uncertainty, distrust and confusion.

We’ve seen during the pandemic how disinformation around what SARS-CoV-2 is, or who was responsible, or what medication to take, resulted in some people not taking precautions, not seeking treatment, or seeking the wrong treatment, as well as people attacking medical facilities, care workers and medical staff. It exacerbated anti-immigration sentiments and the anti-vax movement.

Disinformation in the bio-space is nothing new – there were many allegations between the Soviet Union and other communist states and the US about who had biological weapons during the Cold War, and we are all familiar with stories about the origins of AIDS in the 1980s. The difference now is that disinformation campaigns spread much faster and more widely than previously.  

You can’t just tell the truth or provide scientific facts to counter a disinformation campaign. You actually need to expose the way in which these campaigns operate to raise awareness and help people spot what’s happening. That's work that social scientists can play a significant role in.

What do you think is the biggest concern right now in terms of biological risks?

I'm very interested in the explosion of high-containment facilities for working with very high-risk pathogens. My team’s research has shown that there are at least 67 such laboratories in operation or under construction, in 25 countries. Most of these laboratories are located in urban areas, and only a quarter of countries housing these labs score high on international measures of biosafety and biosecurity. Some of the countries establishing these labs don’t have a long tradition of biosafety or doing work the way we do in the West.

Allegations about the biosafety record of facilities in China have fuelled the debate about the origins of COVID-19, and make you realise that the cultural context in which science is done is really important. In very authoritarian countries, you don't stick your hand up and say I was wrong, I've made a mistake, let's learn from this experience. But that is exactly what you need to do to develop a good biosafety culture.

Earlier this year I wrote in Science about self-spreading viruses and work that looks for viruses that might spill over into the human population, and how some of that work actually is very difficult to justify in terms of risk. [‘Risky research on lab-modified self-spreading viruses has yet to present credible paths to upsides,’ Science (2022).] What our paper was trying to do was get a broader group of stakeholders involved in that debate, but it's actually very hard to get that sort of engagement.

How effective are scientists at assessing the risks of their own research?

I think most biologists and life scientists try to do good and make the world a better place, and that needs to be our bottom line. We shouldn't assume that everybody wants to wreak havoc and do bad things. Are they competent? Of course, they're competent in terms of their specific research projects. But most scientists are not trained to be experts in anything other than their particular science, whether in biology, virology or microbiology. They don't have any specialist training in security, or risk, or ethics. I think it's important that they recognise that. It's unfair on them to suggest that they need to be experts in all these things, and that's why it's extremely important today to work in more interdisciplinary groups, where you actually have people who have this expertise. Certainly when it comes to things like risk assessments, you need to have a more inclusive group of expertise represented.

Are scientists biased against risks? Of course they are: anything that we are familiar with we think of as less risky. If that wasn't the case, we would hardly cross the road for fear of being driven over by a car. As a scientist you are not rewarded for taking longer, you are not rewarded for engaging more broadly. You're rewarded for getting funding, writing papers and getting published. So there is a disconnect between careful, responsible, deliberative science and what enables career progression in science.

I think it's important to work towards a better understanding of what responsible science looks like today, and how that corresponds with what broader society thinks responsible science needs to look like today. Scientists have a responsibility to think about how science could be misused, and what their products might be used for. For example, much of the genetic surveillance being pursued in China has been aided by Western companies selling their technology to the Chinese government.

Risk assessment is an art – it's not an exact science and it very much depends on the social context in which that assessment is made. So my appeal is always for a more inclusive, deliberative risk assessment, and that takes time.

Lentzos on a trust-building visit to Tbilisi, Georgia.

Can you tell me a little bit about what you see as the big risks from genetic technologies and the collection and storage of lots of genetic data?

Biology today is increasingly turning into a digital science, so we need to think more carefully about cyber biosecurity risks and data protection. With a cyber attack you could disrupt public health records or genome sequencing information. It could be explicitly disruptive or you might not even know that it's been done. You could disrupt medical processes, the development of drugs, and so on.

We also know that some drugs work better on some subpopulations than others, and we can now screen for compounds that are most effective on individuals with different genomes. You could easily turn that technology around and try to make a deliberately harmful compound for that particular subgroup [‘Dual use of artificial-intelligence-powered drug discovery,’ Nature (2022)]. That would be an ultra-targeted biological weapon – although that to my mind is an extreme worst-case scenario.

What I actually think is more worrying is how the genetic data we have on different subgroups can be used for surveillance of those populations. We’re seeing this in China already, where the state is collecting information on Tibetans and Uighurs, and using that information for social control. China is also building the biggest genetic forensic database we know of. They're even taking samples from children in kindergarten and from pregnancy tests.

Of course, China won't be the only country doing this kind of thing. I think genetic surveillance is something that is perhaps more relevant to people, in terms of security concerns, than personalised weapons. States are not just surveilling your face, your voice and your gait through CCTV cameras: they are going all the way inside your body and surveilling your genome.

The people who developed face recognition didn't know that it would be used to target people in a different country and lock them up, but scientists need to have that reflex. We need to make people smarter about potential misuse and abuses of their own science. I think that is one of the things we need, not just laws and regulations. 

I've been part of a World Health Organization working group that has provided input into a global guidance framework on responsible life sciences that’s coming out in a few weeks. Through several rounds of discussions, we outlined basic principles that are important for safe, secure and responsible science; key tools and mechanisms for biorisk management; and who principal stakeholders are. Scientists have a responsibility of course, but their institutions and the publishers do too, as do funders and international organisations. There's a whole network of stakeholders that needs to be involved to ensure responsible life science is practised.

Can you give me your view on that network, which is mostly self-regulating? How effective is it in keeping us safe from developments in life science and biotechnology, and what are your recommendations for beefing up that system of protection and risk mitigation?

Well, I think there are all kinds of gaps currently, but I don't think the answer is necessarily more regulations. Of course, we need to have statutory laws and regulations in place. For example, we want the Biosafety Level 4 labs currently being developed all over Asia to operate to the same standards that we have in the West. But that's not the only thing: you also need to develop a culture of safe and responsible science.

So what practically can the scientific community do? We came up with a whole list of things but one example is for journal articles to always have a section showing that the authors have considered how the research discussed could be misused. It should become standard to see that there's been a consideration of misuse or of alternative uses in journal articles.

Dr Filippa Lentzos on the top-secret world of bioweapons
“Medics have ethics as part of their curriculum. Chemists are very conscious of the misuse of their science, from the gas clouds of WW1 to the gas chambers of WW2 through to Syria. Physicists are very conscious of the double-edged sword of their research with Hiroshima and Nagasaki. And we often call engineers civil or civilian engineers because their field started out as military engineering. But historically biological weapons have been developed in very, very secret programmes that we still know very little about – they were often kept even more secret than the nuclear programmes. 
“There are horrific examples of biological weapons being developed, of people and non-human animals being deliberately infected with disease to measure disease progression. Attempts to weaponise biological agents have been seen in the US, many European countries, and the Soviet Union, which had a particularly enormous programme. In South Africa there were efforts to develop an anti-fertility weapon that would make black women infertile.
“We need to make space in our curriculum for more consideration of the impact of science on society. I'm not saying we need to have endless history lessons, but I do think there needs to be a component of the curriculum that talks about what responsible science entails.”

There has been much talk of the democratisation of bioscience in recent years, and the lowering of barriers to bioscience, with the suggestion that quite sophisticated biological procedures can be conducted outside of conventional research facilities. How much of a concern is this in terms of bioterrorism or biosafety lapses?

You often hear in security policy discussions that anyone can now become a bioterrorist in their own garden shed, and there are concerns about what's being done in ‘DIY biology’ and hackerspaces. In reality, anyone who has visited one of these hackerspaces knows it's more about how to hold a pipette correctly than about developing some very dangerous virus.

I think there's been too much emphasis on the threat from this community. Of course, we need to take that threat seriously, but the much greater threat, and where I think the policy focus needs to be, is where someone is aiming to do something bad and has the resources and facilities and people at their disposal to do that. If you have the intention, science can enable you to do some really bad things.

Another area is these remote experiments or ‘Cloud labs’, which came onto our radar as a big issue a few years ago. Companies are now setting up labs for hire, meaning researchers can sit on their laptop anywhere around the world and send their experiment to these labs to be done by automated machines. The companies are saying that they only work with trusted partners, but of course they will be keen to expand their market.

So yes, the barriers are most definitely coming down if you want to deliberately do something harmful. But I still think that if you are talking about the risks of sophisticated science, then it should primarily be linked to sophisticated facilities and resources, not amateur communities or a single ‘bad apple’.

Stepping back a little from your day-to-day caseload and the various frameworks and conventions you work on, how safe do you personally feel about biological threats?

I don't lose sleep over biological weapons. I think the threat is something we need to pay attention to and have quiet discussions about, but I don't think it's something we should hype. We do need to have proper international mechanisms in place in case there is a deliberate biological event, including an investigative mechanism so we can establish what happened, but I don't walk around thinking I'm going to be struck down by some awful new pathogen. In the weapons world, we still need to be more worried about nuclear weapons.

I think a much greater risk on the biological side is unintended consequences of research, where you haven't thought through the larger implications if something goes wrong. One of the things that I'm increasingly concerned about is the judgments scientists make about risks: “well, we operate safely, and we know how to do this”.

I'm increasingly worried about lab incidents. One of my research projects maps the growing number of labs we have around the world. There are more labs now than ever, there's more funding going in, there are more projects on pathogens, more people working with dangerous pathogens, and the sorts of experiments we're doing today are riskier than they've ever been. So, yes, I'm more worried about accidents and what could come out of that.

In the grand scheme of things, in this long, dark history of bioweapons and worrying dual-use research and technology, where do you think we sit now? Are we in more peril than ever?

Yes, I do think so. We also have extraordinary science that we can draw on to counter that – we saw that with the pandemic and how quickly we developed a vaccine. So the science is more powerful than ever to counter potential biological harms. But as we also saw with the vaccine, just having the vaccine isn’t everything. There is always a social context, and when you have antivaxxers and when you have disinformation, and all these other factors involved, you realise we can't just have technical solutions, we also need social solutions and social scientists.

Dr Filippa Lentzos is a senior lecturer in Science & International Security in the Department of War Studies and in the Department of Global Health & Social Medicine at King's College London. She is also co-director of the Centre for Science and Security Studies at King’s, an associate senior researcher at the Stockholm International Peace Research Institute, and the NGO Coordinator for the UN's Biological Weapons Convention.