Tuesday, October 18, 2005

The Golden Age of Biosecurity

There's been a good deal of talk among SL4 readers about this opinion piece by Bill Joy and Ray Kurzweil that appeared in the New York Times the other day regarding the recent publication of the 1918 influenza genome.

The piece is significant in part because Bill Joy and Ray Kurzweil don't generally see eye to eye on the subject of technological containment. Joy has been outspoken in the opinion that we should relinquish entire areas of research that could enable widespread or even existential calamities. Kurzweil has strongly dismissed this approach as not just ineffective, but counterproductive -- it would mean denying ourselves the benefits of the research even as those who operate outside the law continue to explore their more sinister uses.

So when Joy and Kurzweil agree that the blueprints for known biological WMDs shouldn't just be posted up on the internet for anyone to download, I'm inclined to pay attention. The knee-jerk reaction from the transhumanist crowd tends to be that information should be free, and that any restrictions push us towards a stagnant totalitarian future, and are therefore bad by definition. I sympathize with this position, and often agree. But there is a place for common sense.

Yes, researchers need free access in order to help us understand and defend against the threat of deadly diseases like the 1918 flu. And I agree that any security arrangements effective enough to actually thwart a determined terrorist would mean the end of such access.

But most of us aren't biological researchers, and determined terrorists aren't the only threat.

So let me make a case not for high security or no security, but for low security.

It is said by those who understand physical security that most locks serve only to keep honest people honest. People who would never think of breaking and entering can be tempted by valuable goods left out in the open.

Low-security locks and their kin also serve as buffers against rash decision-making, and to keep hazards away from those untrained to handle them.

We have a constitutional right to own firearms, but we don't leave them within reach of children, and we usually have a waiting period associated with purchasing a handgun.

Anyone is allowed to pull a fire alarm, but we often cover the alarm with a thin pane of glass to remind us that, while it is easy to sound the alarm, it should not be done lightly.

Joy and Kurzweil compared the 1918 influenza genome to the blueprints to an atomic bomb. This is not the best comparison because "rogue" nations and organizations have little difficulty obtaining this information. The limiter to nuclear proliferation has been the fortunate reality that manufacturing weapons-grade fissile material requires a combination of technical prowess and industrial capacity that is difficult to conceal and crushing to small economies.

Biological weapons are simply not in the same league of difficulty, and while today the number of entities that could reconstruct the 1918 influenza from the genome is small, the number is fast becoming non-trivial. The same technology that now allows genes to be sequenced hundreds of times faster than they could just two decades ago will undoubtedly allow them to be manufactured with equally accelerating speed.

In the early days of the compact disc, anyone could play them, but few could afford a burner and fewer still owned one. MP3 was not a household word, and the internet was just an academic curiosity. It was a golden age for the recording industry. This is where we are with gene manipulation. How long can we really expect it to last?

I can already hear the Slashdot crowd: 'Fool! CD and DVD copy protection schemes have proven a nuisance for the masses and trivial for the pirates to crack.'

But genegineers are not 'the masses', and while I'm worried about terrorists, they don't have a monopoly on my fear. I'm worried about 2013. I'm worried about the grad student left alone with the department's GeneJet 6P one Friday night and looking for something fun to print up. I'm worried about the disgruntled employee of a big pharmaceutical corporation who decides to go postal with the tools he knows best.

Would it be too onerous to perform legitimate research if you had to, say, order your deadlier genomes by mail? If you had to have an operator's license to use a geneprinter?

It may sound silly now, but it's not going to seem silly to the thousands or even millions who will be affected by the release of a deadly organism built in a moment of rashness or stupidity. And in the draconian security climate that will follow, researchers will look back fondly on our golden age and wish that all they had to put up with was snail mail.

Just a little security. That's all I ask, and I won't take more.

1 Comment:

Michael Anissimov said...

I would prefer a lot of security. Until we safely install superhuman safeguards to human-initiated disaster, our civilization will be at dire risk. The more powerful the destructive technology, the harsher the security measures should be. At the very least, the intensity of security should be directly proportional to the magnitude of risk. As long as the security measures don't selectively empower some elitist group, they will, on average, contribute to our species rather than work against it.

4:41 PM  
