On the Dual Uses of Science and Ethics: Principles, Practices and Prospects
edited by Brian Rappert and Michael J. Selgelid
Contents
Contributors. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vii
Acknowledgments. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xv
Abbreviations. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xvii
Introduction
1. Ethics and Dual-Use Research. . . . . . . . . . . . . . . . . . . . . . . . . 3
Michael J. Selgelid
10. The Doctrine of Double Effect and the Ethics of Dual Use. . . 153
Suzanne Uniacke
Conclusion
20. Ethics as . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
Brian Rappert
Appendix A . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
Contributors
Valentina Bartolucci
Valentina Bartolucci is a lecturer at the University of Pisa, Italy, and at the
Marist college campus of Florence, Italy. Valentina is also Visiting Fellow at the
University of Derby (UK). She maintains a close connection with the School for
Social and International Studies at the University of Bradford, UK. Valentina
has a degree in peace studies from the University of Pisa, an MA in conflict
management and human rights from the Sant'Anna School of Advanced
Studies, and a PhD from the University of Bradford. Dr Bartolucci has previously
worked as a junior technical expert for the Italian Ministry of Foreign Affairs
in Morocco, and as a consultant for various international organisations (in
Italy, Burkina-Faso and India). She is currently offering consultancies on issues
linked to terrorism, organised crime, migration, and American and Middle
Eastern foreign policies for various national and international organisations as
well as government departments around Europe. In recognition of her impact
internationally, she was recently made a Member of the Aspen Institute (group
I protagonisti italiani nel mondo). In 2011, she was nominated for the Who's
Who Academic Excellence. Apart from her doctoral thesis on the governmental
discourse on terrorism, Dr Bartolucci has written numerous articles in the fields
of politics, critical discourse analysis, security and surveillance. Her research
mainly addresses the foreign and security policies of the United States and other
states, terrorism and counterterrorism, discourses on threat and emergency, and
strategic communication as anti-terrorism. She was recently awarded a
Fulbright Research Scholar Bursary for 2013-14, to be spent at Arizona State
University working on public diplomacy as anti-terrorism.
Louise Bezuidenhout
Louise Bezuidenhout completed her PhD in cardiothoracic surgery at the
University of Cape Town, South Africa, and has worked as a postdoctoral
scientist in the department of cardiovascular sciences at the University of
Edinburgh, UK. In 2008 she was the recipient of an EU scholarship for the
Erasmus Mundus Master of Bioethics program run by the Catholic University
of Leuven, Radboud University Nijmegen and the University of Padova. These
dual interests in science and bioethics led her to enrol for a PhD in sociology at
the University of Exeter under Professor Brian Rappert, which she completed in
2013. Her research focused on the need for contextual sensitivity in life-science
Steve Clarke
Steve Clarke is a Senior Research Fellow in the Centre for Applied Philosophy
and Public Ethics at Charles Sturt University in Australia. He is also a James
Martin Research Fellow in the Institute for Science and Ethics in the Oxford
Martin School and Faculty of Philosophy at the University of Oxford, and a
Research Fellow in the Uehiro Centre for Practical Ethics, also at Oxford. He
holds a PhD in philosophy from Monash University and has previously held
appointments at the University of Melbourne, the University of Cape Town and
La Trobe University. Steve is a broad-ranging philosopher who has published
more than 60 papers in refereed journals and scholarly collections. His papers
have appeared in such journals as the American Philosophical Quarterly,
the British Journal for the Philosophy of Science, the Journal of Medicine and
Philosophy, Religious Studies and Synthese. He is the author of two books
(Metaphysics and the Disunity of Scientific Knowledge, Ashgate, 1998; and The
Justification of Religious Violence, Wiley-Blackwell, 2014) and the co-editor of
three books. The latest of these is Religion, Intolerance and Conflict: A Scientific
and Conceptual Investigation, edited by Steve Clarke, Russell Powell and Julian
Savulescu (Oxford University Press, 2013).
Nancy D. Connell
Nancy D. Connell is Professor in the Division of Infectious Disease in the
Department of Medicine at Rutgers New Jersey Medical School (RNJMS) and
the Rutgers Biomedical and Health Sciences. Dr Connell, a Harvard University
PhD in microbiology, has made antibacterial drug discovery in respiratory
pathogens such as M. tuberculosis and B. anthracis her major research focus. She
is Director of the Biosafety Level Three (BSL3) Facility of RNJMS's Center for
the Study of Emerging and Re-emerging Pathogens and chairs the university's
Institutional Biosafety Committee. Dr Connell has been continuously funded
by the National Institutes of Health (NIH) and other agencies since 1993 and
serves on numerous NIH study sections and review panels. She has served on a
number of committees of the National Academy of Sciences: for example, the
Committee on Advances in Technology and the Prevention of their Application
to Next Generation Biowarfare Agents (2004); Trends in Science and Technology
Relevant to the Biological Weapons Convention: An International Workshop
(2010); and the Committee to Review the Scientific Approaches Used in the FBI's
Investigation of the 2001 Bacillus anthracis Mailings (2011). Her current work is
with the National Academies of Sciences' Education Institute on Responsible
Science, held in Jordan (2012) and Malaysia (2013).
Michael Crowley
Michael Crowley is the Project Coordinator of the Bradford Non-Lethal Weapons
Research Programme (BNLWRP) and is also a senior research associate for the
Omega Research Foundation. He has worked for 20 years on arms control,
security and human rights issues, including as executive director of the
Verification Research, Training and Information Centre (VERTIC). He also acted
as chairman of the Bio-Weapons Prevention Project. Prior to this he worked at
the Omega Research Foundation exploring options for effective restriction of the
development, trade and use of security equipment and technology employed in
torture and ill treatment. He has also managed the Arms Trade Treaty project
at the Arias Foundation in Costa Rica and worked as senior arms trade analyst
at BASIC. He has also held several research and policy positions with Amnesty
International, both in the UK section and at the International Secretariat. He
holds a BSc in genetics from Liverpool University, and an MRes and PhD from
Bradford University, where his doctoral thesis explored the regulation of
riot-control agents, incapacitating agents and related means of delivery.
Malcolm Dando
Malcolm Dando is Professor of International Security at the University of
Bradford. A biologist by training, his main research interest is in the preservation
of the prohibitions embodied in the Chemical Weapons Convention and the
Biological Weapons Convention at a time of rapid scientific and technological
change in the life sciences. His recent publications include Deadly Cultures:
Biological Weapons Since 1945 (Harvard University Press, 2006), which he edited
with Mark Wheelis and Lajos Rózsa.
Thomas Douglas
Thomas Douglas is a Uehiro Research Fellow in the Oxford Uehiro Centre for
Practical Ethics, Faculty of Philosophy, University of Oxford, and a Junior
Golding Fellow at Brasenose College, Oxford. He trained in medicine (MBChB,
Otago) and philosophy (DPhil, Oxford) and works in applied and normative ethics.
His current research addresses the ethics of producing dangerous knowledge
and of using medical technologies for non-medical purposes, particularly crime
prevention. He has previously worked on organ donation policy, reproductive
decision-making, slippery-slope arguments and compensatory justice.
Dr Nicholas G. Evans
Dr Nicholas G. Evans is an Adjunct Research Assistant at the Centre for
Applied Philosophy and Public Ethics, Charles Sturt University, Canberra. His
dissertation focused on the ethics of censoring dual-use research in the life
sciences through an examination of the history of nuclear science. Nicholas is
the author of a number of articles on military ethics, the philosophy of science,
and ethics, and has taught philosophy, military ethics and physics at universities
around Australia. His research interests include emerging military technologies,
public health ethics, concepts of responsibility and autonomy, and friendship.
John Forge
John Forge is an Honorary Associate in the Unit for History and Philosophy of
Science, Sydney University, having previously worked at the Universities of
New South Wales, Wollongong, Griffith and Macquarie. His research ranged
from the philosophy of physical science, especially explanation, in the early
years, to science and ethics later on. Forge has authored numerous articles and
six books, the latest being The Responsible Scientist (Pittsburgh, 2008), winner
of the David Harold Tribe Prize for Philosophy and the Eureka Prize for Research
Ethics; Designed to Kill: The Case against Weapons Research (Springer, 2012);
and On the Morality of Weapons Research (Ashgate, 2014).
Alexander Kelle
Alexander Kelle is a political scientist by training and a Senior Policy Officer
in the Office of Strategy and Policy of the Organisation for the Prohibition of
Chemical Weapons in The Hague, Netherlands. He contributed to this volume
while senior lecturer in politics and international relations at the Department of
Politics, Languages and International Studies at the University of Bath, UK. Before
moving to Bath, he held positions at Queen's University Belfast, the University
of Bradford, Stanford University, Goethe University Frankfurt and the Peace
Research Institute Frankfurt. His past research has addressed international
security cooperation, with a view to chemical and biological weapons; dual-use
governance of synthetic biology; and the foreign and security policies of
Western liberal democracies.
Seumas Miller
Seumas Miller is a Professorial Research Fellow at the Centre for Applied
Philosophy and Public Ethics (CAPPE) (an Australian Research Council Special
Research Centre) at Charles Sturt University (Canberra), holding a joint position
with the 3TU Centre for Ethics and Technology at Delft University of Technology
(The Hague). He was foundation director of CAPPE (2000-07). He has authored
or co-authored more than 150 academic articles and 15 books, including (with
M. Selgelid) Ethical and Philosophical Consideration of the Dual Use Dilemma
in the Biological Sciences (Springer, 2008), Terrorism and Counter-Terrorism:
Ethics and Liberal Democracy (Blackwell, 2009), The Moral Foundations of Social
Institutions: A Philosophical Study (Cambridge University Press, 2010) and
(with I. Gordon) Investigative Ethics: Ethics for Police Detectives and Criminal
Investigators (Blackwell, 2013).
Brian Rappert
Brian Rappert is a Professor of Science, Technology and Public Affairs and Head
of the Department of Sociology and Philosophy at the University of Exeter.
His long-term interest has been the examination of how choices can be and
are made about the adoption and regulation of security-related technologies,
particularly in conditions of uncertainty and disagreement. Recent books by
Rappert include Controlling the Weapons of War (Routledge, 2006), Biotechnology,
Security and the Search for Limits (Palgrave, 2007), Technology & Security (ed.,
Palgrave, 2007), Biosecurity (co-ed., Palgrave, 2009) and Experimental Secrets
(UPA, 2009). More recently he has been interested in the social, ethical and
epistemological issues associated with researching and writing about secrets, as
in his books Experimental Secrets (2009) and How to Look Good in a War (2012).
David B. Resnik
David B. Resnik (JD, PhD) is a bioethicist and Institutional Review Board (IRB)
Chair at the National Institute of Environmental Health Sciences, National
Institutes of Health. He has published eight books and 200 articles on ethical,
legal, social and philosophical issues in science, medicine and technology, and
is Associate Editor of the journal Accountability in Research.
Michael J. Selgelid
Professor Michael J. Selgelid is Director of the Centre for Human Bioethics at
Monash University in Melbourne, Australia. He was previously Director of the
World Health Organisation Collaborating Centre for Bioethics at The Australian
National University. His research focus is public health ethics, with emphasis
on ethical issues associated with biotechnology and infectious disease. He is
editor of a Springer book series on public health ethics analysis, Co-Editor of
Monash Bioethics Review, and Associate Editor of the Journal of Medical Ethics.
He co-authored (with Seumas Miller) Ethical and Philosophical Consideration of
the Dual-Use Dilemma in the Biological Sciences (Springer, 2008).
Michael Smithson
Michael Smithson is a Professor in the Research School of Psychology at The
Australian National University in Canberra. He is the author of six books and
co-editor of two, and his other publications include more than 140 refereed
journal articles and book chapters. His primary research interests are in
judgment and decision-making under uncertainty, statistical methods for the
social sciences, and applications of fuzzy-set theory to the social sciences.
Judi Sture
Judi Sture is the Head of the Graduate School at the University of Bradford,
UK, where she leads two doctoral research training programs. She lectures in
research ethics and research methodology and is closely involved in devising
and developing postgraduate and ethics policy and practice at the university
and beyond. As a member of the Wellcome Trust Dual-Use Bioethics Group and
associate member of the Bradford Disarmament Research Centre, she is engaged
with colleagues from a number of UK and overseas universities in developing a
bioethics approach to counter biosecurity threats in the life sciences. Judi holds
a BSc(Hons) in archaeology (University of Bradford), in which she specialised
in the study of human skeletal remains, and a PhD (University of Durham) in
biological anthropology, focusing on environmental associations with human
birth defects. Her research in biological anthropology continues, including
further work on developmental defects. She is currently engaged in analysis of
skeletal remains held at the Museum of London and is working with the British
Association for Biological Anthropology and Osteoarchaeology on developing
ethical practice in the profession.
Emmanuelle Tuerlings
Dr Emmanuelle Tuerlings has extensive experience on dual-use research,
biosecurity and global health security. She carries out training and consultancy
with a particular interest in public health. She was a scientist with the Biorisk
Reduction for Dangerous Pathogens (BDP) program at WHO headquarters in
Geneva from 2004 to 2011, where she led the project on Responsible Life
Sciences Research for Global Health Security. Before joining WHO, she was
based at the Harvard Sussex Program, University of Sussex, UK. She also worked
with several international and non-governmental organisations on issues related
to dual-use biological technologies and their governance. She holds an MSc and
a doctorate in science and technology policy from the University of Sussex (UK).
Suzanne Uniacke
Suzanne Uniacke is Director of the Centre for Applied Philosophy and Public
Ethics, Charles Sturt University, Canberra. She has held positions in philosophy
departments in England and Australia and visiting research fellowships at St
Andrews, Harvard and Stirling universities. Professor Uniacke has published
extensively in ethics, applied philosophy and philosophy of law and was editor
of the Journal of Applied Philosophy from 2001 to 2013.
Simon Whitby
Simon Whitby works at the interface between life and associated science and
national security communities to address the threat of deliberate disease in
the context of rapidly advancing dual-use science and technology. Whitby's
work contributes to the discourse on dual-use biosecurity and bioethics and
thus to raising awareness at governmental, civil society, life science and
industry levels about the ethical, legal and social implications of life science
and associated science research. He has been actively engaged in building a
worldwide capability in dual-use bioethics to engage life and associated science
communities in awareness-raising programs about the importance of responsible
conduct of life-science research. Significantly, he has developed an innovative
online distance-learning master's-level train-the-trainer program as well as
short courses in applied dual-use biosecurity/bioethics.
Jim Whitman
Jim Whitman is Professor of Global Governance in the Department of Peace
Studies, University of Bradford, and Director of Postgraduate Studies for the
School of Social and International Studies. His latest book, Governance Challenges
of Nanotechnology, will be published by Palgrave in 2014.
Acknowledgments
A Wellcome Trust award, 'Building a Sustainable Capacity in Dual-Use
Bioethics', supported a January 2010 workshop titled 'Promoting Dual Use
Ethics' at The Australian National University that underpinned the development
of this volume. A grant by the Research Councils UK (ES/K011308/1) enabled
Brian Rappert and Malcolm Dando to undertake work for their chapters.
An award from the Economic and Social Research Council covered the production
costs associated with making this volume Open Access through ANU E Press.
Abbreviations

AAAS
AMA
ANU  The Australian National University
APS
ASM
BBSRC  Biotechnology and Biological Sciences Research Council (UK)
BDP  Biorisk Reduction for Dangerous Pathogens (WHO)
BIO
BMA
BNLWRP  Bradford Non-Lethal Weapons Research Programme
BSL  biosafety level
BSLIII  Biosafety Level Three
Bt  Bacillus thuringiensis
BTWC  Biological and Toxin Weapons Convention
BW  biological weapons
BWC  Biological Weapons Convention
CAPPE  Centre for Applied Philosophy and Public Ethics
CBA
CBRN  chemical, biological, radiological and nuclear
CBW  chemical and biological weapons
CISSM
COB
CSAIL  Computer Science and Artificial Intelligence Laboratory (MIT)
CSIS
CWC  Chemical Weapons Convention
DARPA  Defense Advanced Research Projects Agency (US)
DDE  doctrine of double effect
DHHS  Department of Health and Human Services (US)
DHS  Department of Homeland Security (US)
DIY  do-it-yourself
DOC  Department of Commerce (US)
EGE
ELSI  ethical, legal and social implications
ENSN  European Neuroscience and Society Network
EU  European Union
EVC  expected-value criterion
EVP  expected-value procedure
FBI  Federal Bureau of Investigation (US)
FDA  Food and Drug Administration (US)
fMRI  functional magnetic resonance imaging
GE  genetic engineering
HAC  holistic arms control
HSP  Harvard Sussex Project
IACUCs  institutional animal care and use committees
IAEA  International Atomic Energy Agency
IAP  InterAcademy Panel
IASB  Industry Association Synthetic Biology
IBC  institutional biosafety committee
ICPS  International Consortium for Polynucleotide Synthesis
ICRC  International Committee of the Red Cross
INN  International Neuroethics Network
IPR  intellectual property rights
IRBs  institutional review boards
ISAAA  International Service for the Acquisition of Agri-biotech Applications
ISU  Implementation Support Unit (UN)
IUBMB  International Union of Biochemistry and Molecular Biology
MAS  marker-assisted selection
MIT  Massachusetts Institute of Technology
MOOTW  military operations other than war
MRC  Medical Research Council (UK)
MTA  material transfer agreement
NATO  North Atlantic Treaty Organisation
NCD  Neuroscience Center at Dartmouth
NEST  New and Emerging Science and Technology
NET  Neuroethics New Emerging Team
NGO  non-governmental organisation
NIEHS  National Institute of Environmental Health Sciences (US)
NIH  National Institutes of Health (US)
NMHP  no means to harm principle
NNI  National Nanotechnology Initiative (US)
NPG
NRA
NRC
NRC
NSABB  National Science Advisory Board for Biosecurity (US)
NSF  National Science Foundation (US)
ONR
OPCW  Organisation for the Prohibition of Chemical Weapons
PCSBI
PHAC
PI  principal investigator
PNAS  Proceedings of the National Academy of Sciences
PP
PTSD  post-traumatic stress disorder
QALYs  quality-adjusted life years
QTL  quantitative trait loci
RAC
R&D  research and development
RCR  responsible conduct of research
rEVP
RNJMS  Rutgers New Jersey Medical School
SAP
S&T  science and technology
SEU
sPP
TB  tuberculosis
TMS  transcranial magnetic stimulation
UK  United Kingdom
UN  United Nations
UNESCO  United Nations Educational, Scientific and Cultural Organisation
UNGA  United Nations General Assembly
UNMOVIC  United Nations Monitoring, Verification and Inspection Commission
UNODA  United Nations Office for Disarmament Affairs
UNSC  United Nations Security Council
UNSCOM  United Nations Special Commission
US  United States
USDA  US Department of Agriculture
VBM
VERTIC  Verification Research, Training and Information Centre
WHA  World Health Assembly
WHO  World Health Organisation
WMA  World Medical Association
WMD  weapons of mass destruction
Introduction

1. Ethics and Dual-Use Research

Michael J. Selgelid
Isn't the person most able to land a blow, whether in boxing or any other
kind of fight, also most able to guard against it? And the one who is most
able to guard against disease is also most able to produce it unnoticed?1
Research with potential to be used for both good and bad purposes is now
commonly referred to as 'dual-use research'. While almost any knowledge and
technology can be used for both kinds of purposes, the expression 'dual-use
research of concern'4 is used to refer to research that can be used for especially
harmful purposes, that is, where the consequences of malevolent use would
be potentially catastrophic. Of particular concern are advances in genetics that
might enable development of a new generation of biological weapons of mass
destruction. (In the remainder of this chapter, I will use the expression 'dual-use
research' as shorthand for the expression 'dual-use research of concern'.)
Controversial cases
Such danger is illustrated by the following controversial studies that have been
published during the new millennium.
Synthetic polio
Following the map of the polio genome (published on the Internet), American
scientists bought (via mail order) and strung together corresponding strands of
DNA. Addition of the synthesised genome to 'cell juice' (a solution containing
cellular ingredients, but no live cells) led to production of live polio virus that
paralysed and killed mice. The scientists' aims were, inter alia, to show that
4 National Science Advisory Board for Biosecurity (NSABB) 2007, Proposed Framework for the Oversight
of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research Information,
National Science Advisory Board for Biosecurity, Bethesda, Md, <http://oba.od.nih.gov/biosecurity/pdf/
Framework%20for%20transmittal%200807_Sept07.pdf> (viewed 7 April 2013).
5 Jackson, R. J., Christensen, C. D., Beaton, S., Hall, D. F. and Ramshaw, I. A. 2001, 'Expression of mouse
interleukin-4 by a recombinant ectromelia virus overcomes genetic resistance to mousepox', Journal of
Virology, vol. 75, no. 3, pp. 1205-10.
such a feat would be technically possible and to demonstrate that viruses are
ultimately just chemicals.6 They published their findings, along with a description
of materials and methods, in Science in 2002.7 A danger is that similar techniques
might be used to create biological weapons agents, such as smallpox, which
bioterrorists and/or state-sponsored biological weapons programs might not
otherwise be able to access easily.
Transmissible H5N1
The most recent and, to date, most controversial dual-use life-science research
involved the study of H5N1 (avian) influenza transmissibility among ferrets,
which provide the best model for influenza in humans. While it is estimated that
H5N1 kills 60 per cent of humans infected, it is not (currently) transmissible
between humans. Researchers in the Netherlands and the United States thus
conducted experiments that aimed to determine whether H5N1 might develop
into a human-to-human transmissible strain. Genetic engineering of the virus
and passaging of the altered virus between ferrets led to creation of strains
that were airborne and easily transmissible among ferrets, thus indicating that
natural evolution of a human-transmissible strain of H5N1 might be possible.
Much debate surrounded the question of whether or not this research should
be published in detail. On the one hand, it was argued that publishing these
studies was important because this would facilitate vaccine development and/
or surveillance of relevant changes to H5N1 occurring in nature. The hope with
regard to surveillance is that this would enable earlier detection of emerging
6 Selgelid, M. J. and Weir, L. 2010, 'Reflections on the synthetic production of poliovirus', Bulletin of the
Atomic Scientists, vol. 66, no. 3, pp. 1-9.
7 Cello, J., Paul, A. V. and Wimmer, E. 2002, 'Chemical synthesis of poliovirus cDNA: generation of infectious
virus in the absence of natural template', Science, vol. 297, pp. 1016-18.
8 Tumpey, T. M. et al. 2005, 'Characterization of the reconstructed 1918 Spanish influenza pandemic virus',
Science, vol. 310, no. 5745, pp. 77-80.
Levels of governance
The dual-use phenomenon requires ethical decision-making by various actors
at different levels of the science governance hierarchy. Individual scientists
(insofar as they are at liberty) must decide what research to conduct and/or
publish. Research institutions (insofar as they are at liberty) must decide how to
regulate potentially dangerous research within their confines; how to educate
9 US National Institutes of Health 2011, 'Press statement on the NSABB review of H5N1 research', <http://
www.nih.gov/news/health/dec2011/od-20.htm> (viewed 7 April 2013).
10 World Health Organisation (WHO) 2012, 'Public health, influenza experts agree H5N1 research critical,
but extend delay', <http://www.who.int/mediacentre/news/releases/2012/h5n1_research_20120217/en/>
(viewed 7 April 2013).
11 National Science Advisory Board for Biosecurity (NSABB) 2012, 'March 29-30, 2012 meeting of the
National Science Advisory Board for Biosecurity to review revised manuscripts on transmissibility of A/H5N1
influenza virus', <http://oba.od.nih.gov/oba/biosecurity/PDF/NSABB_Statement_March_2012_Meeting.
pdf> (viewed 7 April 2013).
12 Herfst, S., Schrauwen, E. J. A., Linster, M., Chutinimitkul, S., de Wit, E., Munster, V. J., Sorrell, E. M.,
Bestebroer, T. B., Burke, D. F., Smith, D. J., Rimmelzwaan, G. F., Osterhaus, A. D. M. E. and Fouchier, R. A.
M. 2012, 'Airborne transmission of influenza A/H5N1 virus between ferrets', Science, vol. 336, pp. 1534-41.
13 Imai, M., Watanabe, T., Hatta, M., Das, S. C., Ozawa, M., Shinya, K., Zhong, G., Hanson, A., Katsura,
H., Watanabe, S., Li, C., Kawakami, E., Yamada, S., Kiso, M., Suzuki, Y., Maher, E. A., Neumann, G. and
Kawaoka, Y. 2012, 'Experimental adaptation of an influenza H5 HA confers respiratory droplet transmission
to a reassortant H5 HA/H1N1 virus in ferrets', Nature, vol. 486, pp. 420-8.
researchers working there (regarding dual use and/or ethics); what laboratory
security measures to put into place, and so on.14 Professional societies must
make decisions about the development, promulgation and/or enforcement of
ethical codes of conduct for scientistsand/or decisions about relevant (ethical)
education of their members. Publishers must make decisions regarding processes
of review of papers posing dual-use dangersand they must ultimately decide
which papers to publish. National governments must decide what research
to fund, and the extent to (or manner in) which things like research review,
publication review and/or relevant education of scientists will be mandated,
and they must make decisions about the extent to which controls should be
placed on access to potentially dangerous materials. International governance
bodies (such as the WHO), finally, must make relevant decisions concerning
global policy, for example, whether or not there should be international
guidelines regarding dual-use research, or oversight thereof, and/or what the
content of such guidelines should be.
Ethical dilemmas
In all cases the decisions will be difficult. On the one hand, responsible actors
will want to take actions that will promote the development and use of beneficial
science. On the other hand, they will want to take actions that will prevent
the malevolent use of science (which might sometimes require avoidance of
generation and/or publication of potentially dangerous information). An
implication of the dual-use phenomenon, however, is that it is inherently
difficult to achieve both goals at the same time in the cases where the very same
research that is likely to be beneficial might also be used to cause harm. In the
case of governmental decision-making, for example, a laissez-faire approach to
scientific governance might facilitate scientific advance and the benefits thereby
enabled, but it might also lead to especially dangerous research getting done
and/or published. A more restrictive approach, on the other hand, might
prevent generation and/or publication of dangerous information, but it might
also stifle beneficial scientific advance at the same time. Hence the expression
'dual-use dilemma'.
In any case, it is important to recognise that key decisions posed by dual-use
research are inherently ethical in nature. The decisions faced by the various
actors enumerated above largely concern: 1) the responsibilities of the actors
in question (for example, to what extent would a scientist be responsible if her
research is used to cause harm?); 2) issues of how one should go about promoting
14 Research institutions, of course, already do such things to varying degrees, and numerous relevant
measures (for example, regarding biosafety) are required by law. The point here, however, is that additional
new measures are required to address dual-use research in particular.
benefits while avoiding harms or reducing risks (for example, should a paper be
published if this has a good chance of promoting a significant amount of human
wellbeing but a small chance of causing disaster?); and/or 3) questions about
values and value conflict (for example, how should governments strike a balance
between the goal to promote scientific freedom/progress and the goal to promote
security?). Issues regarding responsibilities, harms, benefits and values (and,
ultimately, what ought to be done) are exactly the kinds of things that ethics
is about.
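The second kind of question is, at root, one of decision-making under uncertainty, of the sort Chapter 9 analyses with rational decision theory. Purely as an illustrative sketch (the probabilities and utilities below are invented for this example and drawn from no chapter or real case), a naive expected-value comparison of publishing versus withholding a risky paper might run as follows:

    # Illustrative only: invented numbers and a naive expected-value criterion (EVC).
    # Publishing offers a large chance of a moderate benefit, a tiny chance of catastrophe.
    p_benefit, u_benefit = 0.95, 100            # assumed gain (e.g. vaccines, surveillance)
    p_disaster, u_disaster = 0.001, -1_000_000  # assumed catastrophic misuse

    ev_publish = p_benefit * u_benefit + p_disaster * u_disaster
    ev_withhold = 0.0  # baseline: forgo both the benefit and the risk

    print(f"EV(publish)  = {ev_publish:.1f}")   # 95.0 - 1000.0 = -905.0
    print(f"EV(withhold) = {ev_withhold:.1f}")  # 0.0

On these invented numbers the small chance of catastrophe dominates; make the assumed disutility less extreme and the ordering reverses. Which numbers to use, and whether expected value is even the right criterion, are themselves ethical questions of exactly the kind just described.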
In the meantime, however, much of the debate about dual-use research has
involved scientists and security experts rather than ethicists in particular.15
While much bioethical discussion has focused on research ethics and the
ethical implications of genetics/biotechnology, it is ironic that relatively little
bioethical discussion has, to date, focused on dual-use research in particular.
This is unfortunate because, given the potential for catastrophic consequences,
dual-use research is surely one of the most important ethical issues regarding
research and genetics/biotechnology.
This volume
The chapters in Part I of this volume, 'Dual Use in Context', map the terrain
of dual-use issues emerging in various areas of life-science research, that is:
nanotechnology (Chapter 2, Jim Whitman); neuroscience (Chapter 3, Valentina
Bartolucci and Malcolm Dando); synthetic biology (Chapter 4, Alexander Kelle);
agriculture (Chapter 5, Simon Whitby); and tuberculosis (Chapter 6, Nancy
Connell).
The chapters in Part II, 'Ethical Frameworks and Principles', explore the
relevance of various existing philosophical frameworks and/or tools of ethical
analysis to the dual-use problem (in cases where their relevance to dual-use
problems has received relatively little previous exploration). In particular, they
examine the impact of environments and institutions on moral development
and ethical reasoning (Chapter 7, Judi Sture); the ethics of weapons research in
general (and the relevance thereof to dual-use research in particular) (Chapter
8, John Forge); application of rational decision theory to the dual-use problem
(Chapter 9, Thomas Douglas); the relevance of the doctrine of double effect to
the dual-use problem (Chapter 10, Suzanne Uniacke); implications of uncertainty
for dual-use decision-making (Chapter 11, Michael Smithson); collective-action
problems associated with dual-use research (Chapter 12, Seumas Miller); the
15 Selgelid, M. J. 2010, 'Ethics engagement of the dual use dilemma: progress and potential', in B. Rappert
(ed.), Education and Ethics in the Life Sciences: Strengthening the Prohibition of Biological Weapons, ANU E
Press, Canberra, pp. 23-34.
relevance of just-war theory to the dual-use problem (Chapter 13, Koos van
der Bruggen); and the relevance of the precautionary principle to dual-use
decision-making and policymaking (Chapter 14, Steve Clarke).
Parts III and IV, 'Ethical Practices' and 'Ethical Futures', consider existing,
developing and future practices and policies regarding ethics and dual-use
life-science research, that is: the prospect of self-regulation by scientists (Chapter
15, David Resnik); lessons learnt from the history of nuclear physics (Chapter
16, Nicholas Evans); dual-use governance in developing countries (Chapter 17,
Louise Bezuidenhout); the responsibilities of individual scientists in the context
of incapacitating chemical and toxin agents (Chapter 18, Michael Crowley);
and the WHOs project on Responsible Life Sciences Research (Chapter 19,
Emmanuelle Tuerlings and Andreas Reis). The Conclusion (Chapter 20) offers a
wide-ranging and agenda-setting summary by Brian Rappert.
This volume is a product of a workshop (funded by the Wellcome Trust) on
'Promoting Dual Use Ethics' (organised by Michael Selgelid and Brian Rappert)
held at the Centre for Applied Philosophy and Public Ethics (CAPPE) at The
Australian National University in Canberra in January 2010.
2. Nanotechnology and Dual-Use Dilemmas
Jim Whitman
1 Many issues in nanoscience and nanotechnology turn on the practical aspects of disciplinarity, for
proponents of nanotechnology, science educators and practising scientists themselves. See Wienroth, M. 2002,
'Disciplinarity and research identity in nanoscale science and technologies', in J. S. Ach and C. Weidemann
(eds), Size Matters: Ethical, Legal and Social Aspects of Nanobiotechnology and Nano-Medicine, Lit Verlag,
Berlin, pp. 157-77.
2 Roco, M. C. and Bainbridge, W. S. (eds) 2002, Converging Technologies for Improving Human Performance:
Nanotechnology, Biotechnology, Information Technology and Cognitive Science, National Science Foundation/
Department of Commerce-sponsored Report, <http://www.wtec.org/ConvergingTechnologies>; Nordmann,
A. 2004, Converging Technologies: Shaping the Future of European Societies, <http://www.ntnu.no/2020/pdf/
final_report_en.pdf>.
When set against the products and promise of nanotechnology, the very
considerable risks and unknowns of the diffusion of nanomaterials might
be expected to generate a great many dilemmas both at policymaking and at
laboratory levels. Since these are not much in evidence either in public debate or
in the momentum of nanoscientific advance, it is clear that the powerful interests
involved and the expected gains and competitive advantages carry more weight
than either ethical disquiet or practical concerns. Although this is hardly a novel
position (of 30 000 bulk chemicals in use in the European Union, only 3000 or
so have been formally assessed for health and environmental effects),9 the moral
significance and practical risks involved in nanotechnologies on present and
planned scales cannot be regarded as one might regard an incremental addition to
an already chemicalised environment. After all, the transformative qualities of
nanotechnology have been much touted by its proponents, which accounts for
the acknowledgment of serious ethical and societal implications that have found
expression in the academic as well as policy and policy-related literatures.10
The acknowledged possibility of deleterious but unwanted outcomes of
nanotechnology, however, in the form of kinds and degrees of toxicity (that is,
a biochemical risk of a familiar sort, albeit perhaps on an exceptional scale) is
not an end to the matter. It is in the realm of the many uses and adaptations of
nanotechnology in practically every field of the physical and medical sciences and
engineering that the larger dangers reside, and as part of them, the prospect of
an explosion of dual-use issues. These are routinely gathered under the generic
term 'societal implications', with 'ethical issues' often given freestanding
consideration, by which it is unclear where military nanotechnology is (or
should be) sited. After all, the research and development of military uses and
adaptations of nanotechnology are by no means either merely prospective or
peripheral to the establishment of nanotechnology. Although many of these
are at least nominally defensive,11 active research and already articulated
war-fighting possibilities and other means of inflicting serious harm have not been
precluded.12 Of the 13 US federal agencies which shared $1.6 billion under the
US National Nanotechnology Initiative for 2009, the Department of Defense was
allocated 28 per cent.13
Military uses and adaptations of nano-enabled products and processes are as
yet not concentrated around dedicated means of offensive war-fighting or new
weapons systems. Instead, the foreseeable nanotechnology developments most
likely to be adopted for military purposes have a very wide range of applications:
smart materials, micro-electromechanical systems, nanocomputing, robotics,
micro-sensors and many more. Jürgen Altmann has noted that if 'the production
facilities for raw material, feedstock, energy, and final products as well as the
transport systems are themselves produced by molecular nanotechnology,
a very fast increase of the production and distribution of military goods is
possible', and one possible outcome of this is that molecular nanotechnology
production of nearly unlimited numbers of armaments at little cost 'would
contradict the very idea of quantitative arms control' and would culminate in
a technological arms race beyond control.14 In any event, since the worldwide
growth of nanotechnology research and development has been propelled by
the expectation that it will usher in 'a new industrial revolution', an 'age of
transitions' and other epoch-defining transformations, government sanction of
nanotechnology developments has always had a strongly competitive edge,
made all the sharper now that it has been linked to conceptions/aspects of
national security.15 Realist fears inform the development of security dilemmas;
and the dangers are already apparent: 'By taking cues from Moscow's centralized
and opaque institutions, Washington risks misperceiving Russia's intentions
and calculations as well as prematurely locking into strategic competition.
The result, if unaltered, could convert the promise of nanotechnology into a
new realm of commercial rivalry and arms racing.'16
The perils of nanotechnology, however, are not limited to arms races of the
sort that can, at least in principle, be addressed by dedicated arms-control
initiatives.17 There has already been concern over the comprehensiveness of the
12 Altmann, J. 2006, Military Nanotechnology, Routledge, London. See also Shipbaugh, C. 2006,
'Offense-defense aspects of nanotechnologies: a forecast of military applications', Nanotechnology, pp. 741-7; Kharat,
D. K., Muthurajan, H. and Praveenkumar, B. 2006, 'Present and future military applications of nanodevices',
Synthesis and Reactivity in Inorganic, Metal-Organic and Nano-Metal Chemistry, vol. 36, pp. 231-5.
13 US National Nanotechnology Initiative, 'Funding', <http://www.nano.gov/html/about/funding.html>.
14 Altmann, op. cit.
15 Vandermolen, LCDR T. D. 2006, 'Molecular nanotechnology and national security', Air & Space Power
Journal, pp. 96-106.
16 Stulberg, A. N. 2009, 'Flying blind into a new military epoch: the nanotechnology revolution, emerging
security dilemmas, and Russia's double-bind', Paper presented at the International Studies Association
Conference, February 2009.
17 Whitman, J. 2011, 'The arms control challenges of nanotechnology', Contemporary Security Policy, vol.
32, no. 1, pp. 99-115.
18 Pinson, R. D. 2004, 'Is nanotechnology prohibited by the Biological and Chemical Weapons Conventions?'
Berkeley Journal of International Law, vol. 22, no. 2, p. 298.
19 See the Special Issue of The International Journal of Robotics Research, vol. 28, no. 4 (April 2009): 'Current
State of the Art and Future Challenges in Nanorobotics'.
can read as much into the definition of 'dual-use research of concern' proposed
by the US National Science Advisory Board for Biosecurity: research that, 'based
on current understanding, can be reasonably anticipated to provide knowledge,
products, or technologies that could be directly misapplied by others to pose
a threat to public health, agriculture, plants, animals, the environment, or
materiel'.22 Moral unease, however, is never automatically congruent with
estimates of the practical significance of possible dangers; and it is in the nature
of a dilemma that countervailing considerations are of roughly equal weight. In
addition, countervailing considerations in life-science dual-use conundrums
(that is, those that bolster the case for either the positive or the negative aspects
of the dual use to be determined) most often entail structural considerations.
This is because of the range of powerful interests involved and because the
encompassing arenas are international and/or global.
The application of bioethics or ethical reasoning more generally is not a matter of
informed individuals unfailingly applying knowledge to circumstance; and this
applies to scientists no less than to any other moral agents.23 In his comprehensive
study of scientists whose research was furthered by collaboration with the
Nazi regime, John Cornwell depicts the striking similarity of pressures facing
scientists then and now: 'The greatest pressures on the integrity of scientists
are exerted at the interface between the professional practice of science and
the demands of award-giving patrons. The narrative of scientific collaboration
with the Nazi regime reveals the pressures of hubris, loyalty, competition and
dependence leading to compromise. In the final analysis the temptation was a
preparedness to do a deal with the Devil in order to continue doing science.'24
Even when there is no obvious devil in the form of a psychopathological regime,
scientific research can but rarely be understood as 'pure' in the sense that it
can wholly be abstracted from the potential for pernicious uses; and ethical
concerns are embedded in risk/benefit calculations that do not present in clear,
dichotomous form. So it is that particular cases of the dissemination of biological
knowledge through open, peer-reviewed publication can be seen as posing
grave risks of misuse;25 but at the same time, there is no enthusiasm for general
prohibitions on scientific exchanges, on which so much beneficent advance
depends, and for which there is no commanding legal reach.26 It follows that in
22 National Science Advisory Board for Biosecurity (NSABB) 2008, 'Frequently asked questions', <http://
www.biosecurityboard.gov/faq.asp#1>.
23 Whitman, J. 2010, 'When dual use issues are so abundant, why are dual use dilemmas so rare?' Research
report for the Wellcome Trust project 'Building A Sustainable Capacity in Dual Use Bioethics', <http://www.
dual-usebioethics.net/>.
24 Cornwell, J. 2004, Hitler's Scientists: Science, War and the Devil's Pact, Penguin, London, p. 462.
25 Indicative examples are listed in Institute of Medicine and National Research Council 2006, Globalization,
Biosecurity and the Future of the Life Sciences, The National Academies Press, Washington, DC, pp. 53-4.
26 Marchant, G. E. and Pope, L. L. 2009, 'The problems of forbidding science', Science and Engineering
Ethics, vol. 15, pp. 375-94.
27 Fisher, E. and Mahajan, R. L. 2006, 'Contradictory intent? US federal legislation on integrating societal
concerns into nanotechnology research and development', Science and Public Policy, pp. 5-16.
with moral and practical risks both profound and pervasive. These are hardly
propitious operating conditions for the scope and efficacy of either nano-ethics
or dual-use bioethics as applied to nanotechnology.
3. What Does Neuroethics Have to Say about the Problem of Dual Use?

Valentina Bartolucci and Malcolm Dando

Introduction
It is clear that in the past advances in neuroscience were used for hostile as well
as peaceful purposes. Lethal chemical nerve agents, after all, interfered with
the acetylcholine neurotransmitter system1 and, during the twentieth century's
East-West Cold War, both sides clearly also made efforts to develop non-lethal
chemical agents for various purposes.2 The use of some form of fentanyl
derivative(s) by Russian security forces to break the 2002 Moscow theatre
siege shows that today at least one major state has deployed such a weapons
system. Many commentators fear that Russia would be far from alone in having
an interest in developing novel incapacitating capabilities if the advances in
neuroscience provide suitable opportunities.3
This chapter sketches the areas of interest and methods used by neuroethicists to
ask what they have had to say about the problem of dual use: the fact that advances
in benignly intended civil neuroscience could produce materials, knowledge
and technologies that might then be used for hostile purposes by others. Of
course, it should be understood from the start that this is no small problem,
as has been made abundantly clear, for example, in the Lemon-Relman report
of the US National Academies in 2006, which, in its second recommendation,
stated that it was necessary to '[a]dopt a broadened awareness of threats beyond
the classical select agents and other pathogenic organisms and toxins, so as to
include, for example, approaches for disrupting host homeostatic and defence
systems and for creating synthetic organisms'.4
1 Dando, M. R. 1996, A New Form of Warfare: The Rise of Non-Lethal Weapons, Brassey's, London.
2 Dando, M. R. and Furmanski, M. 2006, 'Midspectrum incapacitant programs', in M. L. Wheelis, L. Rózsa
and M. R. Dando (eds), Deadly Cultures: Biological Weapons Since 1945, Harvard University Press, Cambridge,
Mass., pp. 236-51.
3 Pearson, A. M., Chevrier, M. and Wheelis, M. L. 2007, Incapacitating Biochemical Weapons: Promise or
Peril? Lexington Books, Lanham, Md.
4 Committee on Advances in Technology and the Prevention of their Application to Next Generation
Biowarfare Threats 2006, Globalization, Biosecurity and the Future of the Life Sciences, The National Academies
Press, Washington, DC.
5 Blank, R. H. 1999, Brain Policy: How the New Neuroscience Will Change Our Lives and Politics, Georgetown
University Press, Washington, DC.
6 Ibid., pp. 11 and 12.
7 de Vries, R. 2007, 'Who will guard the guardians of neuroscience? Firing the neuroethical imagination',
EMBO Reports, vol. 8 (S1), pp. S65-9.
after the turn of the century. Martha Farah,8 in a thoughtful attempt to delineate
the field, argued, for example, that '[b]eginning in 2002, neuroscientists began
to address these issues in the scientific literature and the field gained a name:
neuroethics'. At the same time, key meetings brought together large numbers
of experts and specific neuroethics membership organisations began to be
founded.9
The comparatively recent delineation of neuroethics as a special field is not
too surprising given its acknowledged deep roots within bioethics. However,
as Parens and Johnston carefully pointed out, it is important to understand that
bioethicists have made major errors in the past. If neuroethicists forget past
mistakes they could end up making the same ones in the near future.10 Parens and
Johnston point to three particular problems that could easily arise in the new
field of neuroethics:
1. '[T]he problem of reinventing the bioethical wheel'
2. '[T]he problem of exaggerating what the science can teach us about who we are'
3. '[T]he problem of exaggerating what bioethics research can deliver.'11
Furthermore, given the recent delineation of the specific field of neuroethics, we
should not be surprised to find that there is as yet no comprehensive, widely
accepted view of the scope of the field. For example, one might argue that finding
out what happens in the brain when ethical decisions are being made should be
central to any conception of neuroethics, but that does not appear to be what
practitioners have decided to do. Rather, the field seems to deal more with what
Farah called 'the practical and the philosophical': the implications of advances
in neuroscience for practical social issues and the implications of advances in
neuroscience for our understanding of ourselves (which, of course, overlaps to
some extent with the question of how we make ethical decisions). The sections
of Farah's paper provide illustrations of these different aspects (Table 3.1).
8 Farah, M. J. 2005, 'Neuroethics: the practical and the philosophical', Trends in Cognitive Sciences, vol. 9,
no. 1, pp. 34-40, see Figure 3.1: 'Milestones in the history of ethics in neuroscience'.
9 See, for example, 'The History of Neuroethics' section of the entry on 'Neuroethics' in Wikipedia.
10 Parens, E. and Johnston, J. 2007, 'Does it make sense to speak of neuroethics?' EMBO Reports, vol. 8
(S1), pp. S61-4.
11 Parens and Johnston, op. cit., p. S64.
Neil Levy, in his introduction to the new journal Neuroethics, accepted this
twofold division of the subject but stressed that, to the extent that neuroscience
shows that we are less than rational and autonomous in our decision-making,
that must impact on our understanding of the practical impact of the advances
in neuroscience.12
In an attempt to advance the field of neuroethics, Georg Northoff argued
that both of Farah's subdivisions of neuroethics (which might be simply
characterised as 'what we can do' and 'what we know') should be termed
'empirical neuroethics', and we should accept that there is no sharp distinction
between them (Table 3.2). So, in his view: 'Empirical neuroethics deals with
the empirical and practical aspects of the linkage between neuroscientific and
ethical concepts.'13
Technology                    Questions (old technology)                    New questions
Brain imaging                 Safety; researchers' obligations; validity    Intentional deception; neuromarketing;
                                                                            personal characteristics
Pharmacological enhancement   Safety; validity                              Attention; memory; mood; equity;
                                                                            military research on cyborgs

Source: Northoff, G. 2009, 'What is neuroethics: empirical and theoretical neuroethics', Current Opinion in
Psychiatry, vol. 22, pp. 565-9.
He writes, however, that '[a]lthough there has been much discussion of various
issues in empirical neuroethics, the discussion of methodological and conceptual
issues and thus theoretical neuroethics has remained rather sparse so far'.
Thus, Northoff argues for the need for theoretical neuroethics to focus on 'the
methodological and conceptual aspects of the linkage between neuroscientific
facts and ethical concepts' and for a theoretical neuroscience that is able to give
proper weight to both norms and facts in a 'norm-fact circularity'.
Towards the end of his paper, Northoff notes that his ideas may strike some
as 'a mere playground for theoreticians', a criticism that might be particularly
applied to stressing theoretical neuroscience in this paper on the severely
practical issue of dual use. We think, however, he has a general point to make on
neuroethics methodology that is significant: 'If neuroethics wants to establish
itself as a separate discipline that is different from its neighboring disciplines
like philosophy, ethics and neuroscience, it must develop a special methodology.'
In this view, a discipline or field of study cannot claim to be distinct just
because it studies certain things; it must have developed its own distinctive
methodology as well. It would certainly be wrong to consider that this will be
a simple task given that there are still great philosophical differences about how
we should go about understanding our brains and behaviour.14
14 Evers, K. 2007, 'Towards a philosophy for neuroethics', EMBO Reports, vol. 8, pp. 848-51.
15 Lombera, S. and Illes, J. 2009, 'The international dimension of neuroethics', Developing World Bioethics,
vol. 9, no. 2, pp. 57-64.
16 Illes, J. et al. 2005, 'International perspectives on engaging the public in neuroethics', Nature Reviews
Neuroscience, vol. 6, no. 12, pp. 977-82; Illes, J. et al. 2010, 'Neurotalk: improving the communication of
neuroscience', Nature Reviews Neuroscience, vol. 11, no. 1, pp. 1-20.
17 Wolpe, P. R. 2006, 'Reasons scientists avoid thinking about ethics', Cell, vol. 125, no. 6, pp. 1023-5.
18 Sahakian, B. J. and Morein-Zemir, S. 2009, 'Neuroscientists need neuroethics teaching', Science, vol. 325,
p. 147.
19 Blank, op. cit.
Case studies
Whilst the intention here is to concentrate on the ethical issues involved in the
practical consequences of the advances in neuroscience, it should be made clear
that this is not because we underestimate the importance of the philosophical
issues raised by modern neuroscience. This point has been made regularly in
reviews of the emerging field: an inadequate model of the brain and mind is
likely to cause great misunderstandings and practical difficulties.20 For example,
if someone is in a persistent vegetative state but neuroimaging shows that he or
she is able to respond to some questions,21 can informed consent be achieved for
treatment? Clearly, the brain is not without response, but is there an ability to
decide on complex questions?
If such questions at the more philosophical end of the spectrum of the empirical
issues are set aside, it is clear that neuroethicists have considered questions
arising from three types of new technologies: neuroimaging; pharmacological
enhancement; and non-pharmacological enhancement (such as brain-machine
interfaces, transcranial magnetic stimulation, and direct current stimulation).
There is also now some consideration being given to how the combination of
these technologies with genomics22 and information technology23 may affect the
ethical questions.
Some of the ethical questions that arise with these new technologies are far from
new. In this regard neuroethics has carefully noted the issues relating to the safety
of the technologies, the validity of results drawn from complex analyses derived
from neuroimaging techniques, and how these images are understood by the
media and general public. Neuroethicists have also discussed the obligations
that researchers have for their findings, for example, what to do about findings
that suggest that the person involved could have an increased likelihood of
illness in the future.
Despite acknowledging these issues that have long-running parallels in bioethics,
neuroethicists' writing makes frequent reference to the fact that some of the
issues that arise from the application of these new technologies are novel. As
20 Glannon, W. 2009, 'Our brains are not us', Bioethics, vol. 23, no. 6, pp. 321-9.
21 Owen, A. M. et al. 2006, 'Detecting awareness in the vegetative state', Science, vol. 313, p. 1402.
22 Tairyan, K. and Illes, J. 2009, 'Imaging genetics and the power of combined technologies: a perspective
from neuroethics', Neuroscience, vol. 164, pp. 7-15.
23 Amari, S.-I. 2002, 'Neuroinformatics: the integration of shared databases and tools towards integrative
neuroscience', Journal of Integrative Neuroscience, vol. 1, no. 2, pp. 117-28.
Farah and Wolpe express it: 'The brain is the organ of mind and consciousness,
the locus of our sense of selfhood. Interventions in the brain therefore have
different ethical implications than interventions in other organs.'24
With regard to neuroimaging, there are numerous discussions of the possibility,
and thus implications, of being able to detect when people are intentionally
carrying out a deception. There has been wide discussion of the implications of
being able to detect, through neuroimaging, people's desire for certain products
and the consequences of the growth of neuromarketing. Concerns have also
been expressed about the dangers to privacy if such personal characteristics can
be elucidated by neuroimaging.
With respect to pharmacological enhancement, there are again issues of safety,
validity and communication of findings to a non-expert audience that are not
specific to the concerns of neuroscientists and neuroethicists. Again, however,
the neuroethicists' writing clearly indicates that they believe the possibilities of
enhancing attention, memory and mood and the reverse possibility of damping
down memories (to help people suffering from post-traumatic stress disorder)
do raise new ethical issues. What happens, for example, to those who choose
not to be enhanced, or who cannot afford to be enhanced, in an
enhancement-ridden society? And, again touching on the philosophical, what happens to our
sense of self and worth if we can have a better attention span, memory or mood
not by working to achieve such developments but by popping a pill?
A similar set of novel and not so novel issues arises in regard to non-pharmaceutical
enhancement, but one unusual point can be noted in the concerns expressed
about military funding of work on brainmind interfaces and of new breeds of
cyborgs. Nevertheless, it is clear that this worry about the military implications
of the advances in neuroscience and related technologies is very limited in the
neuroethics literature. The limited nature of the neuroethical debate was noted by
Joelle Abi-Rached in her discussion of the launch of the European Neuroscience
and Society Network. In her view, this has to change and the debate must
also include 'the controversies surrounding the potential application of various
technologies in neurosecurity and counter-terrorism'.25
She correctly references Jonathan Moreno's book Mind Wars: Brain Research and
National Defense26 when making her point. Even there, however, in the only
extended discussion of the possible misapplications of various technologies,
there is very little discussion of the problem of dual use. Indeed, although in
24 Farah, M. J. and Wolpe, P. R. 2004, 'Monitoring and manipulating the human brain: new neuroscience technologies and their ethical implications', Hastings Center Report, vol. 34, no. 3, pp. 35–45.
25 Abi-Rached, J. M. 2008, 'The implications of the new brain sciences', EMBO Reports, vol. 9, no. 12, pp. 1158–62.
26 Moreno, J. D. 2006, Mind Wars: Brain Research and National Defense, Dana Press, Washington, DC.
Cognitive enhancement
In December 2008, Nature carried a commentary by Henry Greely and five
colleagues titled 'Towards responsible use of cognitive-enhancing drugs by the
healthy'. The stated aim of this piece was to propose actions 'that will help
society accept the benefits of enhancement, given appropriate research and
evolved regulation'.32
Arguing that these drugs are just another means devised by our innovative
species to improve itself, they suggested cognitive-enhancing drugs should
be viewed in the same general category as education, good health habits and
information technology. They listed what they viewed as the standard arguments
against the use of these drugs (cheating, unnaturalness and drug abuse) and
dismissed them. They did accept, however, that questions of safety, freedom
(from coercion to enhance) and fairness (towards those who could not afford
enhancement) would need to be addressed. They also suggested that a program
of research, professional guidance, public understanding and regulation
development be undertaken so that responsible use was facilitated. In a direct
critique of this commentary, two Canadian neuroethicists struck a much more
cautious note, arguing that the question of safety was far from settled: 'it is
important to stress that sizeable gaps exist in our current understanding of
the effects, both positive and negative, of neuropharmaceuticals on healthy
individuals.'33
And they pointed out that the simple tasks studied so far in laboratories do not
reflect the complexity and diversity of the activities involved in learning and
thinking. They thus implied that there was a degree of overstatement amongst
those favouring cognitive enhancement, in the same way that gene therapy had
been oversold by its supporters. A more neutral term than 'cognitive enhancement',
they suggested, might be 'non-medical use of prescription drugs'. These critics
also had concerns about the impact on health resources of interest in cognitive
enhancement: for example, that the safety research program suggested by
advocates would eat up resources more urgently needed elsewhere, such as in
treating people who are ill. In short, there does not yet appear to be a settled
view amongst neuroethicists about cognitive enhancement.34
Yet there is a considerable literature on this subject and certainly enough to ask
two questions relevant to dual use: 1) do neuroethicists dealing with cognitive
32 Greely, H. et al. 2008, 'Towards responsible use of cognitive-enhancing drugs by the healthy', Nature, vol. 456, pp. 702–5.
33 Racine, E. and Forlini, C. 2009, 'Expectations regarding cognitive enhancement create substantial challenges', Journal of Medical Ethics, vol. 35, pp. 469–70.
34 Harris, J. and Chatterjee, A. 2009, 'Head to head: is it acceptable for people to take methylphenidate to enhance performance?' BMJ, vol. 338, pp. 1532–3.
enhancement consider the issue of dual use, and 2) do the investigations of the
mechanisms of enhancement by neuroscientists indicate any awareness of the
possibility that their work could be used in the very opposite manner by those
with malign intent?
Most discussions of cognitive enhancement are concerned with the use of drugs
such as: methylphenidate (used medically to help people suffering from attention
deficit hyperactivity disorder) to improve attention; modafinil (used medically
to help people suffering from sleep problems) to improve alertness; and SSRIs
(selective serotonin reuptake inhibitors, used medically to help people suffering
from problems of mood) to improve the mood of people who are unwell. A
particularly interesting subject in regard to this chapter is the use of the drug
propranolol, not to improve memory but to help people to forget emotionally
laden traumatic events: memories that recur in post-traumatic stress disorder
(PTSD).35
In such discussions it is certainly possible to find what appears to be approval
of military-funded research that could fundamentally change war-fighting and
force employment because of, for example, the elimination of the need for sleep
and the maintenance of a high level of cognitive performance.36 On the other
hand, it is rare indeed to find a clear recognition in the neuroethics literature of
the problem of dual use in regard to cognitive enhancement. As Kathinka Evers
pointed out:
It has been suggested that therapeutic forgetting is interesting for
military purposes, for example, to provide soldiers with propranolol
before a battle. A problem here is that if it helps them forget what they
have been subjected to it also helps them forget what they have done to
others.37
This comment is clearly a special case because Evers is one of the few bioethicists
who has addressed the general issue of dual-use bioethics.38
If we turn to the neuroscience, it is clear that safety issues involved in cognitive
enhancement are sometimes well understood. As Cakic recently pointed out:
'For psychostimulants such as methylphenidate, the dangers are real and
relatively well known. Aside from its abuse potential, methylphenidate may
aggravate mental illness, produce sleep disturbances and is associated with
cerebrovascular complications.'39
It is also clear that we know a great deal about how emotion-laden memories
are laid down in mammals and the way in which propranolol can be used to
interfere with the role of noradrenaline in the consolidation and reconsolidation
of such memories.40
Whilst accepting that some modest improvements in capabilities may be achieved,
recent detailed reviews of the science and ethics of cognitive enhancement have
emphasised caveats: first, 'doses most effective in facilitating one behavior
could at the same time exert null or even detrimental effects on other cognitive
domains'. Second, 'individuals with low memory span might benefit from
cognitive-enhancing drugs, whereas high span subjects are overdosed'.41
And, finally: 'evidence suggests that a number of trade-offs could occur. For
example, increases of cognitive stability might come at the cost of a decreased
capacity to flexibly alter behaviour.'
This particular review also discusses six ethical issues found in the literature:
safety; societal pressure; fairness and equality; enhancement versus therapy;
authenticity and personal identity; and happiness and human flourishing. In
the last of these it does refer to the possibility that the blunting of memory
could involve violation of a duty to remember and bear witness to crimes and
atrocities. Thus it touches on the question of dual use, but again there is no
indication that the general point (that all of the work in cognitive enhancement
could be dual use) has been understood.
Another recent review, which notes the problem of such misuse in the blunting of
memories without drawing the general conclusion about dual use, does make the
crucial point that our knowledge of the brain remains limited: 'the fundamental
question is: are we technically ready and do we have sufficient basic knowledge
to develop such drugs without risking a deadly brain doping?'42
A possible response is to say that cognitive enhancement has been much
overhyped and that major brain modifications are years away and therefore
39 Cakic, V. 2009, 'Smart drugs for cognitive enhancement: ethical and pragmatic considerations in the era of cosmetic neurology', Journal of Medical Ethics, vol. 35, pp. 611–15.
40 Tully, K. and Bolshakov, V. Y. 2010, 'Emotional enhancement of memory: how norepinephrine enables synaptic plasticity', Molecular Brain, vol. 3, pp. 15–24; Dando, M. R. 2007, 'Scientific outlook for the development of incapacitants', in Pearson et al., op. cit., pp. 123–48.
41 de Jongh, R. et al. 2008, 'Botox for the brain: enhancement of cognition, mood and pro-social behavior and blunting of unwanted memories', Neuroscience and Biobehavioral Reviews, vol. 32, pp. 760–76.
42 Lanni, C. et al. 2008, 'Cognitive enhancers between treating and doping the mind', Pharmacological Research, vol. 57, pp. 196–213.
we should not worry too much about the state of neuroethics and its lack of
coverage of the problem of dual use. That is indeed what one neuroscientist
stated recently in a review of a book of essays on neuroethics: 'the participants
in this discussion often claim that their speculative approach provides us with
the unique opportunity to discuss the ethical consequences of new technologies
before they are fully developed … However, do we really need a debate on a
technology that will probably never materialise?'43
In this view, we do not really need to be concerned about neuroethics because
the debate amongst neuroethicists is not of great importance. Many people took
amphetamines during and after World War II, so the present use of drugs for
cognitive enhancement is nothing new and, given the limited benefits that are
likely to be available for the foreseeable future, the present phase will surely
pass quietly away.
Conclusion
The field of neuroscience has grown considerably in the past decade. Advances
in neuroscience already have raised important questions on a wide range of
policy issues, such as those affecting neurotoxins and the environment, mental
health, child development, cognitive enhancement, criminal behaviour, the
safety and efficacy of pharmaceuticals and medical devices, and the ethics and
regulation of emerging discoveries. To an ever-increasing understanding of the
45 McCreight, R. 2007, 'Protecting our national neuroscience infrastructure: implications for homeland security', Presentation to National Security and the Future of Strategic Weapons, George Washington University, Institute of Crisis, Disaster and Risk Management, <http://www.chds.us/?fs:file&mode=dl&drm=..%2F..%2Fresources%2Fsummit%2F%2Fsummit07&f=McCreight-GeorgeWashUniv.ppt&altf=McCreightGeorgeWashUniv.ppt>.
46 Bell, C. 2010, 'Neurons for peace: take the pledge, brain scientists', New Scientist, vol. 2746 (8 February).
47 Atlas, R. M. 2009, 'Responsible conduct by life scientists in an age of terrorism', Science and Engineering Ethics, vol. 15, pp. 293–301.
48 Robert, J. S. 2009, 'Toward a better bioethics', Science and Engineering Ethics, vol. 15, pp. 283–91.
49 Sutton, V. 2009, 'Smarter regulations', Science and Engineering Ethics, vol. 15, pp. 303–9.
brain mechanisms associated with core human attributes and values should
also correspond an increasing interest in the possible dual-use implications of
such advancements, given the various ways, not always benign, in which the
new knowledge could be used. Neuroethics addresses the various philosophical
issues around the relationship between brain and mind as well as practical issues
about the impact upon society of our ability to understand and manipulate
the brain. The problem of dual use, consisting in the malign appropriation of
knowledge initially developed for benign purposes, should be an important
focus of neuroethicists' analyses. In particular, the long-term applications and
impacts of neuroscience are likely to be powerful and profound. As pointed out
by Marchant and Gulley: 'Military and intelligence agencies, with the most at
stake from such applications in terms of both benefits and risks, recognize the
potential of neuroscience to revolutionize intelligence gathering and warfare.'50
Furthermore, as pointed out by Jonathan Moreno, the 11 September 2001
attacks have resulted in increased efforts to exploit all technical possibilities
for enhancing security. The Pentagon's Defense Advanced Research Projects
Agency (DARPA) is supporting work at Lockheed Martin on remote brain prints,
and the scientist in charge already claims to be able to tell if a person is thinking
of a certain number. In the words of Moreno, a striking aspect of much of this
and other national security work being done in the field of neuroscience is that
it is dual use: 'potentially applicable to medical therapy or other peaceful
purposes as well as combat, riot control, hostage situations, or other security
problems'.51
Worryingly, it clearly emerges from a preliminary review of the literature on
neuroethics that, while publications abound on issues such as lie detection,
informed consent for certain patients and the implications of neuroimaging, the
problem of dual use is only very marginally addressed.
In the majority of research centres and institutions, no mention is made of the
problem of dual use, and the overwhelming assumption is that neuroscience
works for the betterment of humanity worldwide (see Appendix A). If this is
what neuroethicists are working for, more attention should surely be devoted
to the way advances in neuroscience could be misused. Leading neuroethicists
also very rarely address the issue of dual use in neuroscience, with a few notable
exceptions.52
50 Marchant, G. and Gulley, L. 2010, 'National security neuroscience and the reverse dual-use dilemma', AJOB Neuroscience, vol. 1, no. 2, pp. 20–2.
51 Ibid.
52 Moreno, J. 2005, 'Dual use and the moral taint problem', The American Journal of Bioethics, vol. 5, no. 2, pp. 52–3; Moreno, J. 2008, 'Using neuro-pharmacology to improve interrogation techniques', Bulletin of the Atomic Scientists; Huang, J. Y. and Kosal, M. E. 2008, 'The security impact of the neurosciences', Bulletin of the Atomic Scientists.
Introduction
Over the past decade synthetic biology has emerged as one of the most dynamic
subfields of the post-genomic life sciences. According to a European high-level
expert group, synthetic biology comprises 'the synthesis of complex, biologically
based (or inspired) systems which display functions that do not exist in nature'
[and] 'is a field with enormous scope and potential'.1 Some of the areas where
this expert group argues that synthetic biology could have a major impact
include biomedicine, a sustainable chemical industry, environment and energy,
and biomaterials. If the emerging discipline of synthetic biology can deliver on
the promises of some of its leaders and become as pervasive as computing has
become in the past few decades, we might very well be witnessing a fundamental
shift similar to the one that happened to chemistry with the introduction of the
periodic table. If synthetic biologists live up to some of the more far-reaching
expectations, biology ultimately may become a mechanistic science.
On the one hand, synthetic biology developments show promise of leading
to beneficial applications in a number of areas, such as drug development,2
biodegradation3 and biofuels.4 At the same time, the dual-use character of this
new technoscience carries with it the possibility of synthesised biological parts,
modules and systems being malignly misused. This dual-use potential has been
recognised by practitioners in the field as well as by analysts, albeit at a rather
abstract level and with a focus on one particular subfield of synthetic biology,
that is, DNA synthesis. While this is a positive development, however, these
mostly technical governance measures addressing DNA synthesis capabilities
need to be broadened so as to cover all aspects of synthetic biology and to allow
for a comprehensive bioethical analysis of the field's dual-use implications. In
addition, part of the discourse on the broader societal implications of synthetic
biology can be traced back to the debates on ethical, legal and social implications
1 Synthetic Biology. Applying Engineering to Biology, Report of a New and Emerging Science and Technology
(NEST) High-Level Expert Group, European Commission, Brussels, 2005, p. 5.
2 See Neumann, H. and Neumann-Staubnitz, P. 2010, 'Synthetic biology approaches in drug discovery and pharmaceutical biotechnology', Applied Microbiology and Biotechnology, vol. 87, pp. 75–86.
3 See Kirby, J. R. 2010, 'Designer bacteria degrades toxin', Nature Chemical Biology, vol. 6, pp. 398–9.
4 See Dellmonaco, C. et al. 2010, 'The path to next generation biofuels: successes and challenges in the era of synthetic biology', Microbial Cell Factories, vol. 9.
(ELSI) of genetic engineering. Yet, given their limited nature, these debates do
not provide a solid foundation for a comprehensive discussion and assessment
of synthetic biology's dual-use potential.
This chapter will first outline the scope of synthetic biology as a new subfield in
the life sciences in which different science and engineering disciplines converge.
This will be followed by a brief discussion of synthetic biologys potential for
malign misuse as well as some of the proposals for governance of this new
technoscience. Thus far, the mostly technical character of these proposals
has resulted in a rather limited appreciation of the wider governance issues
related to the full breadth of approaches usually subsumed under the 'synthetic
biology' label. The final section will discuss both academic and institutional
contributions to a bioethically informed discourse on the misuse potential of
synthetic biology.
[Table: Approaches within synthetic biology, tabulating the aims, methods, techniques and examples of each subfield. Recoverable entries include protocells (method: chemical production of cellular containers and insertion of metabolic components; aims: to construct viable approximations of cells and to understand biology and the origin of life), the minimal genome, DNA-based bio-circuits, and the changing of structurally conservative molecules such as DNA.]
Source: Schmidt, M. 2009, 'Do I understand what I can create?' in M. Schmidt, A. Kelle, A. Ganguli-Mitra and H. de Vriend (eds), Synthetic Biology. The Technoscience and Its Societal Consequences, Springer, Dordrecht, p. 84.
11 See, for example, de Lorenzo, V. and Danchin, A. 2008, 'Synthetic biology: discovering new worlds and new words', EMBO Reports, vol. 9, pp. 822–7.
12 Campos, L. 2009, 'That was the synthetic biology that was', in Schmidt et al., op. cit., pp. 5–21.
13 Carlson, R. 2001, 'Open source biology and its impact on industry', IEEE Spectrum, pp. 15–17, as quoted by Campos, op. cit., p. 17.
14 Knight, T. F. 2002, DARPA BioComp Plasmid Distribution 1.00 of Standard Biobrick Components, <http://dspace.mit.edu/handle/1721.1/21167>.
15 For an even more detailed subdivision of the field than the one used here, see Lam, C. M. C., Godinho, M. and dos Santos, V. 2009, 'An introduction to synthetic biology', in Schmidt et al., op. cit., pp. 23–48.
22 See Epstein, G. L. 2008, 'The challenges of developing synthetic pathogens', Bulletin of the Atomic Scientists, vol. 64, pp. 46–7.
23 Garfinkel, M. S., Endy, D., Epstein, G. L. and Friedmann, R. M. 2007, Synthetic Genomics: Options for Governance, <http://www.jcvi.org/cms/fileadmin/site/research/projects/synthetic-genomics-report/synthetic-genomics-report.pdf>, p. 12.
24 Synthetic Biology, op. cit., p. 14.
destruction25 could be easily diverted from their intended benign use to malign
applications that would, for example, aim at delivering pathogenic genes or
target not cancer, but nerve or other essential cells.
such list would be measured in months, not years.31 To emphasise this point,
the report pointed out that [n]ew, unexpected discoveries and applications in
RNAi and synthetic biology arose even during the course of deliberations by
this Committee.32 Instead, the committee developed a classification scheme for
science and technology (S&T) advances with four different groups. These four
groups are
1. technologies that seek to acquire novel biological or molecular diversity
2. technologies that seek to generate novel but predetermined and specific
biological or molecular entities through directed design
3. technologies that seek to understand and manipulate biological systems in a
more comprehensive and effective manner
4. technologies that seek to enhance production, delivery and packaging of
biologically active materials.33
Synthetic biology is explicitly mentioned by the committee in relation to the
first two of these categories. A concise discussion of the future applications of
synthetic biology in the report acknowledges that DNA synthesis technology
could allow for the efficient, rapid synthesis of viral and other pathogen
genomes, either for the purposes of vaccine or therapeutic research and
development, or for malevolent purposes or with unintended consequences.34 It
is thus fair to conclude that the biosecurity community during the deliberations
of the Lemon-Relman Committee had clearly identified synthetic biology, albeit
with an emphasis on DNA synthesis and not the four subfields of synthetic
biology outlined above, as one of the technologies that will have a major impact
on the future biothreat spectrum.
This emphasis on DNA synthesis is also reflected in the approach of the National
Science Advisory Board for Biosecurity (NSABB).35 In order to conduct its work,
NSABB can set up working groups to address specific issues, including one in
the field of synthetic biology. In the first phase of its work, the NSABB synthetic
biology working group has sought to address biosecurity implications of the
de novo synthesis of select agents.36 A report of the synthetic biology working
group on this issue was discussed during a NSABB meeting in October 2006
31 Ibid., p. 3.
32 Ibid., p. 103.
33 Ibid., p. 3.
34 Ibid., p. 109.
35 National Science Advisory Board for Biosecurity (NSABB) 2006, Addressing Biosecurity Concerns Related
to the Synthesis of Select Agents, National Science Advisory Board for Biosecurity, Washington, DC, <http://
oba.od.nih.gov/biosecurity/pdf/Final_NSABB_Report_on_Synthetic_Genomics.pdf>.
36 Select agents are those biological agents and toxins that can pose a severe threat to public, animal or plant
health, or to animal or plant products. For the current list of select agents, see <http://www.cdc.gov/od/sap/
docs/salist.pdf>.
47 Industry Association Synthetic Biology n.d., Report on the Workshop 'Technical Solutions for Biosecurity in Synthetic Biology', <http://www.ia-sb.eu>.
48 Ibid., p. 2.
49 Ibid., p. 16.
In contrast, the notion of making a library of bioparts and modules the object of
dual-use informed governance measures has yet to receive substantial attention.
It seems that in this context more thought has been devoted to issues of open
source versus intellectual property rights (IPR), with large biotechnology
companies increasingly discovering synthetic biology for commercial purposes.50
Unless a systematic discourse on dual-use governance structures for parts-based
and the other synthetic biology subfields identified above commences soon,
it could thus be pre-empted by the IPR-driven attempts to formulate governance
solutions. This could also complicate the realisation of any bioethically informed
dual-use governance approach for synthetic biology that might be developed
out of the approaches and deliberations discussed in the next section.
50 Oye, K. A. and Wellhausen, R. 2009, 'The intellectual commons and property in synthetic biology', in Schmidt et al., op. cit., pp. 121–40.
51 Cho, M. K., Magnus, D., Caplan, A. L., McGee, D. and the Ethics of Genomics Group 1999, 'Ethical considerations in synthesizing a minimal genome', Science, vol. 286, pp. 2087, 2089–90.
52 Ibid., p. 2089.
53 Yearley, S. 2007, 'Review: the ethical landscape: identifying the right way to think about the ethical and societal aspects of synthetic biology research and products', Journal of the Royal Society Interface, <doi:10.1098/rsif.2009.0055.focus>, p. 3.
54 Ibid., p. 6.
Miller and Selgelid, in contrast, provide a much more detailed and in-depth
ethical and philosophical consideration of dual-use issues in the life sciences.55
Although not specifically targeted at synthetic biology, their analysis illuminates
many aspects of bioethical reasoning that are of relevance to synthetic biology
too. Based on 'a particularly morally problematic species of the dual-use
dilemma',56 in the form of a number of experiments of concern, they discuss
the permissibility of certain kinds of research, debate the dissemination of
dual-use research results, and analyse different ethically informed governance
models with which to tackle the dual-use issues presented by the biological
sciences. With regard to this last aspect, Miller and Selgelid discard both the
laissez-faire option of giving the individual scientist complete autonomy over
their research with dual-use potential and the rather draconian option of
complete governmental control. Instead they argue for either a mixed system of
institutional and governmental controls or a governance approach that would
rely on an independent authority being set up.57 Thus, while providing a
detailed discussion of ethical issues in relation to dual-use life-sciences research,
and a narrowing of, in their view, suitable governance options, the analysis
remains in this latter dimension somewhat inconclusive.
In contrast with Miller and Selgelid, Ehni's discussion of the ethical
responsibilities of scientists engaged in dual-use research is more limited in
scope.58 He approaches the issue by discussing the basic conflict between the
freedom of science and the duty to avoid causing harm from two perspectives:
that of moral skepticism and that of the ethics of responsibility of Hans Jonas.59 On
this basis, Ehni evaluates
four basic duties to define the prospective responsibility of scientists:
1) stopping research in some cases, 2) systematically exploring dangers
of dual use in some cases, 3) informing public authorities about possible
dangers resulting from research and the application of its results, and 4)
not publishing results and descriptions of research results and possible
dual-use applications.60
Given the nature of the dual-use issues at hand and the way science is organised,
Ehni concludes along the lines of Miller and Selgelid that [i]t is no solution to
55 Miller, S. and Selgelid, M. J. 2007, 'Ethical and philosophical considerations in the dual-use dilemma in the biological sciences', Science and Engineering Ethics, vol. 13, pp. 523–80.
56 Ibid., p. 531.
57 Ibid., p. 573.
58 Ehni, H.-J. 2008, 'Dual use and the ethical responsibility of scientists', Archivum Immunologiae et Therapiae Experimentalis, vol. 56, pp. 147–52.
59 Ibid., p. 147.
60 Ibid., p. 151.
61 Ibid.
62 Kuhlau, F., Eriksson, S., Evers, K. and Höglund, A. T. 2008, 'Taking due care: moral obligations in dual use research', Bioethics, vol. 22, pp. 477–87.
63 Ibid., p. 481.
64 Ibid.
65 Ibid., p. 481 ff.
66 Ibid., p. 487.
67 Ibid.
including legal, political and ethical considerations. Since synthetic biology may
result in major changes to traditional biology, governance needs to be reflected
at all these levels, finally entering the legal sphere.68
With respect to the ethical consideration of synthetic biology, the EGE distinguishes
between conceptual and specific issues, and addresses both biosafety and
biosecurity under the latter heading.69 In its discussion of potential steps to
be taken, the EGE opinion states under the heading of 'Biosecurity, prevention
of bioterrorism and dual uses' that '[e]thical analysis must assess the balance
between security and transparency',70 and moves on to recommend that ethical
issues that arise because of the potential for dual use should be dealt with at the
educational level: 'Fostering individual and institutional responsibility through
ethics discussion on synthetic biology is a key issue.'71
In addition, the EGE opinion contains three formal recommendations: 1)
linking dual-use bioethics to the Biological Weapons Convention (BWC) by
recommending that this international treaty should incorporate provisions on
the limitation or prohibition of research in synthetic biology; 2) requesting the
European Commission to define a comprehensive security and ethics framework
for synthetic biology; and 3) requesting the establishment of DNA-sequence
databases with supporting, legally based rules and procedures.72
A similarly wide-ranging attempt to chart the ethical issues surrounding
synthetic biology was undertaken by the US Presidential Commission for the
Study of Bioethical Issues (PCSBI), which in December 2010 produced its first
report, entitled New Directions: The Ethics of Synthetic Biology and Emerging
Technologies.73 Guided by five ethical principles, that is, (1) public beneficence,
(2) responsible stewardship, (3) intellectual freedom and responsibility, (4)
democratic deliberation, and (5) justice and fairness,74 the report arrives at
18 recommendations, some of which are informed by the dual-use character of
synthetic biology or seek to address its implications. Of particular relevance
in this context are recommendations 12 and 13. Acknowledging the dynamic
character of the field and the resulting changes in dual-use issues of relevance,
the committee recommends that periodic assessments of safety and security risks
be undertaken. It states:
68 European Group on Ethics 2009, Ethics of Synthetic Biology, Opinion No. 25, European Union, Brussels,
p. 36.
69 Ibid., pp. 424.
70 Ibid., p. 51.
71 Ibid., p. 52.
72 Ibid.
73 Presidential Commission for the Study of Bioethical Issues (PCSBI) 2010, New Directions: The Ethics
of Synthetic Biology and Emerging Technologies, Presidential Commission for the Study of Bioethical Issues,
Washington, DC.
74 Ibid., p. 5.
Risks to security and safety can vary depending on the setting in which
research occurs. Activities in institutional settings may, though certainly
do not always, pose lower risks than those in non-institutional settings.
At this time, the risks posed by synthetic biology activities in both
settings appear to be appropriately managed. As the field progresses,
however, the government should continue to assess specific security and
safety risks of synthetic biology research activities in both institutional
and non-institutional settings including, but not limited to, the do-it-yourself
community … An initial review should be completed within
18 months and the results made public to the extent permitted by law.75
In case this review identifies significant unmanaged security or safety
concerns, recommendation 13 foresees changes to existing oversight and
control mechanisms with a view to making compliance with certain oversight
or reporting measures mandatory for all researchers regardless of funding
sources.76 This last point would lead to a significant tightening of existing
oversight mechanisms as it would expand their reach beyond publicly funded
life-science research and oblige commercial research activities to abide by the
same regulatory framework.
Conclusions
This chapter first set out to illustrate that synthetic biology is one of the most
dynamic new subfields of the life sciences. It offers the potential to live up to
the promise to which the discipline behind the label of genetic engineering has
long aspired: the engineering of biological parts, devices and systems, either to
modify existing ones or to create new ones. By applying the toolbox of engineering
disciplines and information technology to biology, a wide range of potential
applications becomes possible, ranging across scientific and engineering
disciplines. Some of the anticipated benefits of synthetic biology, such as the
development of low-cost drugs or the production of chemicals and energy by
engineered bacteria, are potentially very significant. There are, however, also
significant risks of deliberate or accidental damage. In a way, synthetic
biology can be described as the prototypical emerging dual-use technoscience.
Although first attempts at formulating governance mechanisms can be identified,
these for the most part focus only on one subfield (or, as some would say,
enabling technology) of synthetic biology, that is, large-scale commercial
DNA synthesis. The conceptualisation of dual-use governance processes and
structures for the bioparts and modules-based approach within synthetic
75 Ibid., p. 13.
76 Ibid., p. 14.
biology is, by contrast, still in its infancy (as is the case for the other subfields
outlined above). Academic work on the characteristics of dual-use issues from
a (bio)ethical perspective has increased in volume over recent years, but
stops short of embedding its analyses and recommendations in the wider
institutional and political context in which synthetic biologists find
themselves. Similarly, the opinions and recommendations of the advisory bodies
and committees briefly discussed are quite generic and deal with dual-use issues
among many other ethical questions raised by synthetic biology.
What are thus required are more detailed analyses of the dual-use implications
of the whole of synthetic biology, and systematic dual-use bioethics awareness-raising
efforts that reach all practising synthetic biologists and that are
supplemented by education and training efforts as well as the formulation of
codes and other governance tools that go well beyond the rather technically
orientated order screening by DNA-synthesis companies.
Introduction
This chapter explores the dual-use quality of scientific research and technological
development in the field of phytopathology (plant pathology). It offers a brief
survey of naturally occurring pathogens that have been developed for use in
weapons and considers areas of convergence and overlap between the hostile
use of disease organisms as a form of warfare and the peaceful deployment of
bio-control agents and plant inoculants. The relevance of bio-control agent and
plant inoculant production to the Biological Weapons Convention (BWC) is then
considered. Also included in this chapter is a snapshot of some significant
developments in civil plant science, alluding to the scale and speed of progress
in plant science and technology.
I argue that, since they can be used for both peaceful and hostile purposes, plant
science and technology raise issues of dual-use biosecurity concern that are
thus worthy of ethical consideration. In this connection, this chapter argues
that ethical review processes could usefully be located alongside deliberative
processes that facilitate consideration of the legal and social implications of
plant-science research. Deliberation regarding its potential as dual-use research
of concern may therefore be best located within the context of a comprehensive
system for oversight of scientific research such as that recommended by the US
National Science Advisory Board for Biosecurity (NSABB).1 Therefore, a brief
survey of the contours of the latter is included in the concluding section of the
chapter, which focuses specifically on the role of the principal investigator (PI),
'the most critical element in the oversight of dual-use life-sciences research',2
and the requirement to seek to ensure, through improved awareness and training,
that PIs are sufficiently aware of dual-use research issues and concerns.
1 National Science Advisory Board for Biosecurity (NSABB) 2007, Proposed Framework for the Oversight
of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research Information,
National Science Advisory Board for Biosecurity, Bethesda, Md.
2 Ibid., p. 11.
[Table: Anticrop biological warfare agents by type and host, grouped into Category A, Category B and Category C agents; recoverable examples include rice blast.]
Source: Stockholm International Peace Research Institute (SIPRI) 1973, The Problem of Chemical and Biological Warfare, vol. II, Stockholm International Peace Research Institute, Stockholm.
3 Cecil, F. 1986, Herbicidal Warfare: The RANCH HAND Project in Vietnam, Praeger, New York. See also Karnow, S. 1997, Vietnam: A History, 2nd edn, Penguin, New York.
4 Stockholm International Peace Research Institute (SIPRI) 1973, The Problem of Chemical and Biological Warfare, vol. II, Stockholm International Peace Research Institute, Stockholm.
5 Ibid.
6 United Nations Special Commission (UNSCOM) 1995, Report to the Secretary-General, 11 October.
7 Tucker, J. 1999, 'Biological weapons in the former Soviet Union: an interview with Dr. Kenneth Alibek', The Nonproliferation Review, (Spring–Summer), p. 2.
8 Ibid.
9 Goosen, P. 2001, Statement by Chief Director: Peace and Security, Department of Foreign Affairs,
Pretoria, South African Delegation to the Fifth Review Conference of the Convention on the Prohibition
of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on
their Destruction, Geneva, 19 November, <http://www.opbw.org/rev_cons/5rc/docs/statements/5RC-OSSAFRICA.pdf>.
10 For a systematic study of such state offensive anticrop biological warfare programs, see: Whitby, S. 2001,
Biological Warfare against Crops, Palgrave, London.
and Trichoderma.11 Plant inoculants are formulations containing living
micro-organisms, used in the treatment and propagation of seeds and plant
propagation material for enhancing growth and disease resistance in plants.
They are also used for the restoration of the microflora of soil. Indeed, the
technologies associated with the dissemination of such agents appear to equate
with those used in the dissemination of biological warfare agents.
Prior to the First Review Conference of the BWC, the Preparatory Committee
requested that depositary governments prepare a background paper12 on new
scientific and technological developments relevant to the convention, and
invited states parties to submit their views on such developments. Prepared
by experts of the depositary governments, the review13 looked, inter alia, at the
microbial control of pests.14 Since significant environmental and human health implications
arose from the deployment of synthetic chemical pesticides that had seen
extensive use in Vietnam, this section of the report noted environmental and
human health concerns and questioned the efficacy of the use of agents against
plants that might develop resistance to their use. The review noted, however,
that there had been a remarkable increase in interest in this area. This was
summarised as follows:
Microbiological methods involve the large-scale production of certain
live micro-organisms or their extractable toxins, the formulation of a
liquid or powder product and dissemination of the product by vehicle
or aircraft-borne sprays (or in rodent control, the use of ground bait)
over crops or forests. With live microbial agents death of insect or rodent
occurs through infection; with microbial toxins death is produced by
toxic effects. In some basic respects the whole sequence resembles biological
warfare. [Emphasis added]
Table 5.2 illustrates the methods of production and dissemination of viral,
bacterial and fungal bio-control agents of relevance to the BWC.
11 McSpadden Gardener, B. B. and Fravel, D. R. 2002, 'Biological control of plant pathogens: research, commercialisation, and application in the USA', Plant Health Progress, [Online], <doi:10.1094/PHP-2002-0510-01-RV>.
12 Report of the Preparatory Committee for the Review Conference of the Parties to the Convention on the
Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons
and on their Destruction, BWC/CONF.I/5, 6 February 1980, <http://www.opbw.org>.
13 Not all states parties submitted information to the Secretary-General of the United Nations that referred
either directly or indirectly to potential problems posed by the use of microbial agents against crops. For the
purpose of this discussion, it has been necessary to refer selectively to the official documentation.
14 Report of the Preparatory Committee, op. cit., Appendix E.
[Table 5.2: Methods of production and dissemination of viral, bacterial and fungal bio-control agents of relevance to the BWC.]
* Facultative pathogens include those with mechanisms for infecting human body tissue.
Source: Report of the Preparatory Committee for the Review Conference of the Parties to the Convention on the Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction, BWC/CONF.I/5, 6 February 1980, <http://www.opbw.org>.
15 Background Document on New Scientific and Technological Developments Relevant to the Convention on the Prohibition of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction, BWC/CONF.II/4, 6. 'Microbial Control of Pests', p. 8, <http://www.opbw.org>.
16 Ibid., para. 6.2, p. 8.
[List of potential genetic-engineering developments of concern (items b–i); only item b is recoverable: 'The production through GE of toxins in species beyond those bacteria that produce them in nature.']
Source: Report of the Preparatory Committee for the Review Conference of the Parties to the Convention on the Prohibition of the Development, Production, and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction, BWC/CONF.I/5, 6 February 1980, <http://www.opbw.org>.
21 Kelle, A., Nixdorff, K. and Dando, M. 2006, Controlling Biochemical Weapons: Adapting Multilateral Arms
Control for the 21st Century, Palgrave, Basingstoke, UK, p. 35.
22 The sum total of genetic information of an individual, which is encoded in the structure of
deoxyribonucleic acid (DNA), is called a genome. The study of the genome is termed genomics. Recently, the
order of most of the chemical building blocks, or bases, which constitute the DNA of the genomes of human
beings (estimated to amount to three billion), several other animal species and a variety of human pathogens
and plants has been determined. Over the next few years this remarkable achievement will be completed and
augmented by research into functional genomics, which aims to characterise the many different genes that
constitute these genomes and their variability of action. Such research will also determine how these genes
are regulated and interact with each other and with the environment to control the complex biochemical
functions of living organisms, both in health and in disease.
23 Second Review Conference of the Parties to the Convention on the Prohibition of the Development,
Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on their Destruction, Final
Document PART II Final Declaration, BWC/CONF.II/13/II, <www.opbw.org>.
24 Walbot, V. 2000, 'A green chapter in the book of life', Nature, vol. 408, p. 794. See also The Arabidopsis Genome Initiative 2000, 'Analysis of the genome sequence of the flowering plant Arabidopsis thaliana', Nature, vol. 408 (December), pp. 796–815.
25 World Health Organisation (WHO) 2005, Modern Food Biotechnology, Human Health and Development:
An Evidence-Based Study, World Health Organisation, Geneva, p. 37, <http://www.who.int/foodsafety/
publications/biotech/biotech_en.pdf>.
sequencing of the second complete plant genome,26 that of rice, Oryza sativa, in
2005. The economic and social importance of rice is significant. Rice is the world's
most important food crop, consumed by more than half of the world's population,
and to meet projected demand over the next 20 years production will have to
rise by an estimated 30 per cent.
In connection with progress in plant science outlined above, a first generation
of genetically modified crop products that have emerged in the marketplace
over the past decade has expressed a limited number of characteristics. In
particular, since 1996, millions of acres have been used for the production of
genetically modified crops with, in the case of the United States, large-scale
production of modified varieties of corn, cotton and soya (soya beans) enhanced
by gene-transfer techniques that confer herbicide tolerance and insect resistance
in crops. According to the International Service for the Acquisition of Agri-Biotech Applications report Global Status of Biotech Crops,27 as early as 2005 the
United States had approximately 49.8 million ha planted with crops that were
the product of genetic modification. The extent to which some of these crops
have been adopted in US agriculture and the short time span over which this has
taken place are perhaps indicative of the willingness with which agricultural
enterprises in the United States have embraced such technologies in spite of
concerns raised regarding human health and the environment. According to
the US Department of Agriculture (USDA),28 planting of herbicide-tolerant soya
beans expanded from 17 per cent of US soya bean acreage in 1997 to more than
85 per cent in 2005. Planting with herbicide-tolerant cotton expanded from 10
per cent of US acreage in 1997 to more than 60 per cent in 2005. Planting with
insect-resistant transgenic corn and cotton containing the Bacillus thuringiensis
(Bt) toxin gene also increased significantly over this period in the United States.
USDA figures29 for 1995 reveal that the '[a]doption of all GE cotton, taking into
account the acreage with either or both HT and Bt traits, reached 79 percent in
2005, versus 87 percent for soybeans. In contrast, adoption of all biotech corn
was 52 percent'.
According to the International Service for the Acquisition of Agri-biotech
Applications (ISAAA),30 global biotech planting had cumulatively exceeded 1 billion
ha by 2010.
26 For an analysis of some of the salient features of the rice genome, see: International Rice Genome
Sequencing Project 2005, The map-based sequence of the rice genome, Nature, vol. 436 (August), p. 793.
27 Available at: <http://www.isaaa.org/>.
28 United States Department of Agriculture (USDA) n.d., Adoption of Genetically Engineered Crops in the
U.S.: Extent of Adoption, Economic Research Service, United States Department of Agriculture, Washington,
DC, <http://www.ers.usda.gov/Data/biotechcrops/adoption.htm>.
29 Ibid.
30 International Service for the Acquisition of Agri-biotech Applications (ISAAA) 2010, Global Status of
Commercialized Biotech/GM Crops: 2010, ISAAA Brief, <http://www. ISAAA.org>.
In the past two decades, genome studies have facilitated manipulation of the
genetic characteristics of food crops. Crops can now be produced with built-in
defences against insect pests, such as those derived from the insect pathogen
Bacillus thuringiensis. They can also be manipulated to delay ripening, as in the
case of the slow-ripening Flavr Savr tomato, which was approved for sale in the
United States in 1994. Infertility can be conferred on plant seeds, as in the case
of the controversial Terminator gene.
Plant science is, however, now beginning to focus on a second generation of
crops that have been genetically modified to express a broader and more complex
range of plant traits. The challenge now also extends to assigning functions to
genes, and in the case of Arabidopsis much work31 had been done by 2007.
According to a recent edition of Current Opinion in Plant Biology,32 advances
in understanding at the level of functional genomics will result, in the case of
Arabidopsis, in:
An understanding of the networks through which these genes interact
to control plant development, metabolism, reproduction and other
fundamental processes will accelerate the advent of a new generation of
improved crop products to benefit growers, processors and consumers.
Bringing together knowledge of the function of genes and gene networks,
and of their regulation within the contexts of cell, organ, organism, and
environment will be crucial for achieving the level of precision in crop
engineering that will be required to fuel the development of the next-generation products.
Considerable progress has been made in regard to the re-annotation33 (including
both the structure and the functions of genes) of the gene sequences of
Arabidopsis, and a similar re-annotation of rice is also under way.
The revolution in genomics signals a transition in plant biology from a
descriptive to a predictive science. As Dixon34 points out: 'Genomics (originally
DNA and transcript based, but recently extended to integrate the proteome and
metabolome) has revolutionized the speed of gene discovery for important plant
traits.'
31 Lan, H., Carson, R., Provart, N. J. and Bonner, A. J. 2007, 'Combining classifiers to predict gene function in Arabidopsis thaliana using large-scale gene expression measurements', BioMedCentral, <http://www.biomedcentral.com/1471-2105/8/358>.
32 Salmeron, J. and Herrera-Estrella, L. 2006, 'Plant biotechnology: fast-forward genomics for improved crop production', Current Opinion in Plant Biology, vol. 9, pp. 177–9.
33 Rensink, W. A. and Buell, R. C. 2005, 'Microarray expression profiling resources for plant genomics', Trends in Plant Science, vol. 10, no. 12 (December).
34 Ibid.
35 Fermin-Muñoz, G. A., Meng, B., Ko, K., Mazumdar-Leighton, S., Gubba, A. and Carroll, J. E. 2000, 'Biotechnology: a new era for plant pathology and plant protection', APSnet Feature, May, <http://www.apsnet.org/publications/apsnetfeatures/Pages/Biotechnology.aspx>.
36 Fermin-Munoz, G. A. 2000, 'Enhancing a plant's resistance with genes from the plant kingdom', APSnet Feature, May, <http://www.apsnet.org/publications/apsnetfeatures/Pages/EnhancingPlantResistance.aspx>.
37 Meng, B. and Gubba, A. 2000, 'Genetic engineering: a novel and powerful tool to combat plant virus diseases', APSnet Feature, May, <http://www.apsnet.org/publications/apsnetfeatures/Pages/GeneticEngineering.aspx>.
38 Ibid.
39 Ko, K. 2000, 'Using antimicrobial proteins to enhance plant resistance in biotechnology: a new era for plant pathology and plant protection', APSnet Feature, May, <http://www.apsnet.org/publications/apsnetfeatures/Pages/AntimicrobialProteins.aspx>.
40 Dixon, R. A. 2005, 'Plant biotechnology kicks off into the 21st century', Trends in Plant Science, vol. 10, no. 12; Neal-Stewart, C., jr, 2005, 'Plant functional genomics: beyond the parts list', Trends in Plant Science, vol. 10, no. 12 (December).
41 Wang, Y., Xue, Y. and Li, J. 2005, 'Towards molecular breeding and improvement of rice in China', Trends in Plant Science, vol. 10, no. 12 (December), pp. 610–14.
42 Ibid.
Malign applications
The above developments offer a glimpse of the rapid progress in plant biology
over the past 10 years. The genomics revolution opens up a range of new
possibilities for improvements in the quality and quantity of crop yields, and
an increasing number of techniques and applications will become available for
combating disease that arises naturally or accidentally; however, the same
developments also open up a range of possibilities for malign applications.
Analysts have for many years expressed concern regarding the ways in which
naturally occurring plant pathogens might be deployed for malign purposes.
This might involve the simple introduction into a crop species, for example, of
a pathogen to which no natural immunity exists. Van der Plank's well-known
observations of the seemingly explosive spread of some plant pathogens in the
absence of immunity remain particularly salient in spite of the great progress
that has been made in phytopathology over the past 40 years.
Scenarios may also include the introduction into crops of pathogens that have
mutated naturally in the environment: witness, in this connection, the near-future
possibility of the re-emergence, in regions such as the Middle East and
in countries such as India, of a new and virulent strain of wheat rust, Ug99
49 Ibid.
(Ug99 has been found in Egypt at least, and perhaps in other countries in the
Middle East), and its predicted immense potential for social and human
destruction, as described by Borlaug50 in the New Scientist.
According to Mackenzie,51 to combat this mutation effectively, the production
of enough Ug99-resistant seed to plant our wheat fields might take up to eight
years.
As in the case of bio-control agent production discussed above, there is also a
possibility that crop pathogens could be genetically modified deliberately. The
result could include increased toxicity or pathogenicity. Or, as suggested by
Kelle et al.,52 plants' innate immune systems are vulnerable to manipulation,
possibly affecting the response to pathogen invasion. For example, a plant's
response could be manipulated so as to trigger systemic, rather than localised,
hypersensitive reactions to pathogen invasion. Nixdorff explains the mechanisms
involved in systemic plant resistance thus: The main systemic
signals include salicylic acid, jasmonate and ethylene, which are produced
in response to wounding and insect attack. H2O2 is the most important
response mechanism to pathogen invasion …
involved in downstream signalling, leading to the activation of signalling
cascades in Arabidopsis as well as activation of genes controlling the
production of proteins involved in HR [hypersensitive reaction].
[Therefore] plants may be attacked through their innate immune
systems, for example, by targeting either the receptors of signalling
cascades, or by inhibiting or producing an over-reaction in a signalling
cascade with the use of inhibitors of key components in that cascade.
Kagan53 presents concerns about the introduction of noxious DNA
material, in the form of, for example, a bio-regulator, into a bio-control agent
such as Bacillus thuringiensis in quantities sufficiently large to contaminate a
food product. Indeed, further to this, Choffnes et al.54 also note in relation to
transgenic plants that such plants
could be malevolently engineered to produce large quantities of
bioregulators or toxic proteins, which could either be purified from
plant cells or used directly as biological agents. As with legitimate
production, using transgenic plants as bioreactors would eliminate the
50 Mackenzie, D. 2007, 'Rusting defences in the battle for wheat', New Scientist, 3 April.
51 Mackenzie, D. 2007, 'Billions at risk from wheat super-blight', New Scientist, 3 April, pp. 6–7.
52 Kelle et al., op. cit., p. 76.
53 Kagan, E. 2006, 'Bioregulators as prototypic nontraditional threat agents', Clinics in Laboratory Medicine, vol. 26, no. 2 (June), pp. 421–43.
54 Choffnes, E. R., Lemon, S. M. and Relman, D. A. 2006, 'A brave new world in the life sciences, the breadth of biological threats is much broader than commonly thought and will continue to expand', Bulletin of the Atomic Scientists, vol. 62, no. 5 (September–October), pp. 26–33.
Conclusion
This survey of offensive biological warfare agents, bio-control agents and plant
inoculants of relevance to the BWC, and recent trends in phytopathology
and plant technology, has alluded to a broad range of scientific discovery
and technological innovation. This highlights a number of areas where
consideration, including ethical deliberation, regarding the dual-use nature
of scientific discovery and technological application might usefully be applied.
Indeed, where deliberation might take place, who might be involved, and
what mechanisms and systems might be put in place to facilitate deliberative
processes have been the subjects of considerable attention in the United States.
Since 2006, the NSABB has been tasked,55 inter alia, with advancing proposals
for the development of recommendations for scientific oversight measures and
with recommending how such measures might be developed so as to minimise
the risk of misuse of scientific information. One of the challenges inherent in
pursuing this mandate is how oversight mechanisms might be created that are
both efficient and effective, yet constructed and implemented in such a way as
to avoid stifling life-science innovation. This initiative was viewed as a
preliminary step towards establishing a mechanism of oversight through the
development and implementation of 'a comprehensive system for the responsible
identification, review, conduct, and communication of dual use research'.56
The NSABB proposed that a system of oversight might include the
following seven key features:
The development of Federal Guidelines for oversight of dual use life
science research
Enhanced levels of awareness of dual use research of concern amongst
practicing life scientists
Enhanced, ongoing, mandatory education that raises awareness of dual
use research of concern and addresses the roles and responsibilities of
life scientists
Finally, building on work that was previously set out in the influential
2004 Fink Report,66 NSABB also set out seven criteria for the identification of
endeavours and discoveries that might trigger discussion and review: 'The
following seven classes of experiments might provide a useful framework for
considering the types of endeavours or discoveries that, if proposed, might
trigger review by PIs including, where appropriate, review and discussion
by informed members of the scientific and medical community before they are
undertaken or, if carried out, before they are published in full detail.'67
The experiments include those that:
1. Would demonstrate how to render a vaccine ineffective. This would
apply to both human and animal vaccines. Creation of a vaccine-resistant
smallpox virus would fall into this class of experiments.
2. Would confer resistance to therapeutically useful antibiotics or
antiviral agents. This would apply to therapeutic agents that are used
to control disease agents in humans, animals, or crops. Introduction of
ciprofloxacin resistance in Bacillus anthracis would fall in this class.
3. Would enhance the virulence of a pathogen or render a nonpathogen
virulent. This would apply to plant, animal, and human pathogens.
Introduction of cereolysin toxin gene into Bacillus anthracis would fall into
this class.
4. Would increase transmissibility of a pathogen. This would include
enhancing transmission within or between species. Altering vector
competence to enhance disease transmission would also fall into this class.
5. Would alter the host range of a pathogen. This would include making
nonzoonotic agents into zoonotic agents. Altering the tropism of viruses would
fit into this class.
6. Would enable the evasion of diagnostic/detection modalities. This could
include microencapsulation to avoid antibody-based detection and/or the
alteration of gene sequences to avoid detection by established molecular
methods.
7. Would enable the weaponization of a biological agent or toxin. [Emphasis
added]
Whilst a detailed analysis of the NSABB's oversight system is beyond the scope
of this chapter, the above criteria could be used to trigger interventions
66 Committee on Research Standards and Practices to Prevent the Destructive Application of Biotechnology,
Development, Security, and Cooperation, Policy and Global Affairs, op. cit.
67 NSABB, op. cit., p. 18.
6. The Super TB Experiment: Evolution and Resolution of an Experiment with Dual-Use Concerns
Introduction
Tuberculosis (TB) is a devastating disease, with roughly 10 million new cases
and well over a million deaths per year worldwide.1 Despite enormous advances
in TB research in recent years, the surge of drug-resistant strains and a deadly
synergy with HIV have threatened to destabilise gains made in its control.
During my career as a microbial geneticist, I have studied the physiology of
Mycobacterium tuberculosis, the causative agent of TB, and its interaction with
the human macrophage, its primary host cell. The major function of the
macrophage is to engulf and destroy invading organisms, but many pathogens,
including M. tuberculosis, have developed intricate and clever ways to avoid or
subvert these host defences.
This chapter discusses how a classical approach to asking a very basic question
about microbial pathogenesis can lead to a surprising result and a dual-use
dilemma. The elimination of a single gene from M. tuberculosis created a strain
with increased virulence when measured in a model system (in vitro) but not
during animal infection (in vivo).
Experiments of concern
The concept of dual-use research, the idea that certain kinds of experimental
endeavours might be used for malignant purposes despite their original beneficent
intent, has entered the biomedical lexicon.2 My personal interest in the matter grew out
1 <http://www.who.int/tb/publications/global_report/2010/en/index.html>.
2 National Research Council and the American Association for the Advancement of Science (NRC/AAAS) 2009, A Survey of Attitudes and Actions on Dual Use Research in the Life Sciences: A Collaborative Effort of the National Research Council and the American Association for the Advancement of Science, Washington, DC; American Association for the Advancement of Science (AAAS) 2009, Building the Biodefense Policy Workforce, American Association for the Advancement of Science, Washington, DC; American Association for the Advancement of Science (AAAS) 2008, Professional and Graduate-Level Programs on Dual Use Research and Biosecurity for Scientists Working in the Biological Sciences, American Association for the Advancement of Science, Washington, DC; Atlas, R. and Dando, M. 2006, 'The dual-use dilemma for the life sciences: perspectives, conundrums, and global solutions', Biosecurity and Bioterrorism, vol. 4, no. 3, pp. 1–11.
- host range mutants are developed to understand the mechanism of host range restriction and to develop methods of blocking host-pathogen interaction, the first step of infection
- transmissibility mutants can be created to understand the mechanisms that control transmissibility and to develop methods of reducing transmissibility
- altering virulence can be used to identify the genes encoding virulence factors; interfering with these factors may limit infection and the pathogenesis of disease.
Therefore, any comprehensive genetic analysis of virulence will eventually be
confronted with the possibility of an experiment of concern. The question for
researchers is whether to continue to pursue the line of inquiry or to find an
alternative approach. I now want to consider these issues in relation to my own
work.
situation could have developed: that a bacterium that spends its entire life cycle
in mammalian cells does so in the presence of a small molecule, glutathione, that
is toxic! We characterised this toxicity, and showed that in mouse-derived cells,
when the glutathione levels were high, M. tuberculosis did not grow well, but
when the glutathione levels were low, the bacteria grew very efficiently.8
We were, however, now at a crossroads. First, we showed that glutathione is toxic
to M. tuberculosis, and we found glutathione at toxic levels in all mammalian
cells. Second, we had created a mutant of M. tuberculosis in the laboratory that
was lacking a peptide transporter and incapable of accumulating small peptides,
including glutathione. Therefore, this mutant should have been resistant to the
toxic effects of glutathione. We showed that this was indeed the case.
But if glutathione is toxic to M. tuberculosis and present in all mammalian cells
then it might play a natural role in limiting M. tuberculosis growth. This was our
hypothesis. The logical next step was to see whether our mutant strain survived
better than its normal parent strain during intracellular growth in white blood
cells in tissue culture. It did. In fact, the mutant strain grew to three to five times
higher levels than normal parent strains when inoculated and grown in cultured
macrophage cell lines. Our experimental approach following a logical series
of steps had led us to create a hyper-virulent mutant of M. tuberculosis. We
had generated a strain that survived in host cells better than its normal parent
strain, as a result of the deletion of a single gene encoding one component of
a basic nutrient transport system. Interestingly, the dual-use implications of
our work were brought to my attention only during a discussion of that issue
with members of the Controlling Dangerous Pathogens Project at the Center for
International and Security Studies at Maryland (Elisa Harris and colleagues).
Apparently, I had compartmentalised my thinking about these matters, separating
the experiment itself from the possible ethical concerns associated with it,
despite my own decades-long involvement with the issue of biological weapons
and the responsibility of scientists in preventing a biological arms race.
Virulence factors
Defining virulence is not a simple matter. By a general and superficial definition,
virulence is the degree of pathogenicity of a biological agent, and pathogenicity
8 Venketaraman, V., Dayaram, Y. K., Amin, A. G., Ngo, R., Green, R. M., Talaue, M. T., Mann, J. and Connell, N. D. 2003, 'Role of glutathione in macrophage control of mycobacteria', Infection and Immunity, vol. 71, pp. 1864–71; Venketaraman, V., Dayaram, Y. K., Talaue, M. T. and Connell, N. D. 2005, 'Glutathione and nitrosoglutathione in macrophage defense against Mycobacterium tuberculosis', Infection and Immunity, vol. 73, pp. 1886–9.
is the ability of an organism to cause disease. Both terms are relative and highly
dependent on context. Furthermore, virulence and pathogenicity are frequently
used interchangeably.
Myriad factors contribute to the virulence of an infectious agent. Virulence
factors that cause severe damage to the host are called toxins. Some toxins
act directly on the host cell, such as listeriolysin, which punches holes in the
membranes of host cells and breaks them open.9 Many viruses reproduce to such
high numbers in cells, grabbing bits of the host cell membranes as they go,
that the cells just break apart.10 Other toxins act directly on host cells to alter
their function severely. For example, one component of anthrax toxin destroys
the proteins that orchestrate the immune response and cause swelling in the
tissues surrounding the site of infection, while another component inactivates
a small protein, called G-protein, which regulates the movement of ions across
membranes; its inactivation leads to severe oedema.11 Other toxins block protein
synthesis, so the cell can no longer carry out normal functions (diphtheria
toxin),12 or interfere with neurotransmitter activities, as with the clostridial
toxins, botulinum toxin and tetanus toxin.13 In some, but not all, cases, elimination
of the gene encoding the toxin will render the bacterium completely harmless
(avirulent); this is true with Clostridium botulinum, the causative agent of
botulism. Conversely, expression of the gene encoding the botulinum toxin in
an alternative bacterial host would lead to the creation of a very dangerous
agent. Ken Alibek, a former Soviet bioweapons researcher, claimed that the
Soviet offensive program devoted considerable effort to splicing the botulinum
toxin genes into other bacteria for delivery.14
But in the case of Yersinia pestis (plague)15 and Burkholderia mallei
(glanders),16 for example, both select agents, there are many small, secreted
virulence factors that work together to damage the host; elimination of one or
two of the genes encoding this complex array of effectors will not abolish
virulence, but will merely produce defects in the ability to cause disease (hypo-virulent,
9 Schnupf, P. and Portnoy, D. A. 2007, 'Listeriolysin O: a phagosome-specific lysin', Microbes and Infection, vol. 9, pp. 1176–87.
10 Kaminskyy, V. and Zhivotovsky, B. 2010, 'To kill or be killed: how viruses interact with the cell death machinery', Journal of Internal Medicine, vol. 267, pp. 473–82.
11 Moayeri, M. and Leppla, S. H. 2009, 'Cellular and systemic effects of anthrax lethal toxin and edema toxin', Molecular Aspects of Medicine, vol. 30, pp. 439–55.
12 Iglewski, W. J. 1994, 'Cellular ADP-ribosylation of elongation factor 2', Molecular and Cellular Biochemistry, vol. 138, pp. 131–3.
13 Popoff, M. R. and Bouvet, P. 2009, 'Clostridial toxins', Future Microbiology, vol. 4, pp. 1021–64.
14 Alibek, K. and Handelman, S. 1999, Biohazard, Random House, New York.
15 Viboud, G. I. and Bliska, J. B. 2005, 'Yersinia outer proteins: role in modulation of host cell signaling responses and pathogenesis', Annual Review of Microbiology, vol. 59, p. 689.
16 Sun, G. W. and Gan, Y. H. 2010, 'Unraveling type III secretion systems in the highly versatile Burkholderia pseudomallei', Trends in Microbiology, vol. 18, pp. 561–8.
appropriate safety and security measures. We found that there was no difference
in the survival or replication rate of the mutant compared with the normal
M. tuberculosis strain (unpublished observations). Thus, the hyper-virulent
behaviour we observed in the tissue culture model was not carried through in
an experimental system closer to the human disease state.
Our increased awareness of the dual-use implications of altering virulence
significantly impacted on our deliberations concerning additional testing of our
hyper-virulent mutant of M. tuberculosis. On the one hand, we were under
considerable pressure from our grant reviewers to demonstrate hyper-virulence
in the mouse model. If we did not demonstrate hyper-virulence then they might
not consider funding our project. On the other hand, our deliberations in
the laboratory about the nature of virulence and pathogenicity increased our
appreciation of the complexity of infectious potential and the inherent dual-use
nature of our research. These are the kinds of careful discussions that we would
urge scientists to have at many stages of their work.
laws for the individual and the group because 'popular usages and traditions,
when they include a judgment that they are conducive to societal welfare,
exert a coercion on the individual to conform to them, although they are not
coordinated by any authority'.6 I would argue that this is a relatively accurate
description of the position of most scientists and science stakeholders in the
early years of the twenty-first century.
It is important that recognition of the role of pre-existing cultural folkways
(national/ethnic, economic, religious, social and technological values and so on)
and, within these, domestic (familial) folkways or value sets, is factored into any
consideration of how individuals reach a state of moral and ethical maturity in
their private and professional lives. This is particularly important in science,
where the rapid pace of advance typically outstrips the development of the
ethical responses needed to deal with it, and in which it is typically
supposed that all practitioners share a common set of professional values and
perspectives by virtue of their identity as scientists.
This chapter looks at the issues of moral development and ethical decision-making in terms of the publicly stated group value sets and individual, privately
held value sets. Later in the chapter, I describe the case of a conflicted scientist
as he tries to make sense of a number of professional/personal value conflicts
that face him in his daily work life. This unfortunate individual faces a set of
daily challenges that create an uncomfortable dissonance in him; he needs to
find a way in which to move forward without disadvantaging himself. Such
ways forward cannot be identified unless we have a better understanding of
the origins of the dissonance. By focusing on biological, psychological and
anthropological research into the development of value systems, within the
context of cultural identity, it is possible to highlight possible mechanisms that
may usefully be explored in further research.
Terminology
Many authors use the term 'morals' to refer to ethical practice, standards or
beliefs; indeed, it is common to find the words 'morals' and 'ethics' used
interchangeably. This lack of clarity and consensus in the vocabulary indicates
a lack of understanding of the potential differences between the private and
public values held by individuals and groups. Because this chapter focuses on
the potential variance between private and public value sets, I have chosen to
be specific in the way I use these terms. For the purposes of this chapter, the
term 'morals' will be used to refer to the attitudes and values
6 Ibid., p. 4.
learned in childhood, pre-professional and private life, and the term 'ethics'
to refer to the attitudes and standards learned and required in the context of
professional life.
The word 'moral(s)' tends to be heavily laden with culture-specific values and
especially religious meaning, both of which are highly influential in forming
our early values and views. These values may belong to a value set separate
from that held professionally (although religious values may also direct ethical
values). Furthermore, many people think of morals as the attitudes they
learned from their personal and family lives, with ethics being something
outside the domestic sphere: 'a morality for the private sphere that is personal
and subjective, and a morality for the public sphere that is impersonal and
objective'.7
The English word 'mores' (meaning values, norms and, occasionally, virtues)
is derived from the Latin plural noun mores, which refers to the traditions and
customs practised by specific groups and relates directly to the subject of this
chapter. These traditions and customs are derived from accepted behaviours
and practices in a group rather than from actual laws, and consist of a shared
understanding of the manners and conduct that are acceptable to and expected
by the group. In short, either one conforms to the traditions and customs of the
group or one cannot function effectively as a full member. Indeed, an individual
whose values clash with the predominant values of the group may find that s/he
is not welcome in it at all.
Folkways, according to Sumner, are 'habits of the individual and customs of
the society which arise from efforts to satisfy needs'. These folkways can be
understood as informal mores and, while not generally enforced in any way, they
tend to be perpetuated by members of the group emulating the behaviours and
attitudes of older or more established members. Once folkways are established,
they become, essentially, a value set that is shared by members of the group
and is maintained by group consent rather than by formal governance. Clearly
this process has implications in the biosecurity context when we are seeking to
develop, in effect, a new norm of biosecurity to which all scientists and science
stakeholders can subscribe.
In this chapter, the term culture refers to the culture of origin of the individual
(as in the ethnic sense), as well as to nationality and to religious, economic,
technological and linguistic groupings to which the individual may belong. All
of these dimensions impact on one another, having differing degrees of influence
depending on the context in which the individual is acting at any one time. One
of the outputs of such cultural influence is a value set, or group of value sets.
7 Catchpoole, V. M. 2001, 'A sociobiological, psychosocial and sociocultural approach to ethics education', PhD thesis, Centre for the Study of Ethics, School of Humanities, Queensland University of Technology, Brisbane, p. 274.
These may also be defined as beliefs, attitudes and perspectives on the world.
Clearly, these in turn impact on actions and behaviour patterns. Clashes occur
when an individual is faced with some conflict between the influences of one
or more cultural value sets in a given situation. Which value set should have
precedence?
Ethics may be defined as (potentially) universal standards and commitments
that support justice, care, welfare and rights;8 among other definitions, they may
also be defined as 'a principled sensitivity to the rights of others'.9 The irony
here is, of course, that most people tend to believe that their own moral (private
and/or family-derived) standards are defined in the same way. Variations in
these have a role in defining scientists' and policymakers' ethical standards and
norms, and must be recognised as key variables in any debate about what needs
to be done to prevent the hostile use of the life sciences. Nevertheless, bearing
in mind that cultural variations exist, it is probably reasonable to agree that
any desired socially responsible ethical approach involves the development
of 'adequate commitments that inform actions'10 and that this includes an
understanding of what lies beyond self-interest, and of how we may justify that
to reasonable people and put it into practice.11 Even so, these aspirations, while
striving to be comprehensive and culture-neutral, still incorporate approaches
that may not be prioritised by all cultures, and therein lies part of the difficulty
in seeking a common set of values or norms. Everyone thinks that their existing
moral outlook is common sense, and that their professional ethical values are
common sense too. It is in this balancing act between moral and ethical values
as defined by diverse cultural backgrounds that we need to find a way forward,
one that allows individuals and groups to engage equally and openly with
biosecurity as it mediates between the needs of science and the needs of society.
8 Laupa, M. and Turiel, E. 1995, 'Social domain theory', in W. M. Kurtines and J. L. Gewirtz (eds), Moral Development: An Introduction, Allyn & Bacon, Boston, pp. 455–73.
9 Gilbert, N. 2001, Researching Social Life, Sage, London, p. 45.
10 Catchpoole, op. cit., p. 7.
11 Singer, P. 1993, How Are We to Live? Ethics in an Age of Self-Interest, Text, Melbourne.
sphere, but another in the work context; we may employ yet more value sets in
our relations with people we do not like, or with people we need, such as our
bank manager or our doctor.
obvious factor in survival: those of our ancestors who looked after their own
interests survived to reproduce themselves, and those who did not, failed to do
so.
Work by Krebs and Janicki21 indicated that the evolutionary development
of norms, or memes as Dawkins would call them (definable as culturally
transmitted units of data such as ideas, habits and so on), has resulted
in four identifiable ways in which people operate ethically:
1. they try to get others to invoke the norms that they themselves hold
2. they try to tailor their norms to others so as to enhance their persuasive
impact
3. recipients tend to adapt to the norms that are most advantageous to them
4. people in different sorts of relationships will preach different sorts of norms.
These operating mechanisms are, according to Krebs and Janicki, biologically
determined in that they are evolutionary adaptations to ensure survival.
Whether these mechanisms are truly biological or are solely culturally
determined artefacts is open to debate.
The development of reciprocity, empathy and other relationship-related
characteristics may be considered as part of a psycho-cultural mechanism of
learning morality and ethics. Our capacity to consider the needs of others is
enhanced by our cultural interactions with other people at a psychological level.
Most people learn how to make and sustain friendships, work relationships
and marriage relationships based on cultural concepts such as respect,
interdependence, empathy, cooperation and negotiation, amongst others. All
of these are partial foundations of a moral and ethical perspective, even though
they vary in how they are prioritised and expressed culturally.
Research suggests that humans can exhibit empathy (the capacity to
intellectually understand how another person feels) from an early age, which is
potentially open to development by other cultural factors. Work by a number
of psychologists has shown that even very young children have the capacity
for empathy.22 This runs contrary to the earlier view of Piaget,23 who believed
that very young children were essentially egocentric and unable to perceive
the world from the perspective of others. Hoffman devised a set of levels in the
development of empathy encompassing the first 12 years of life.24
21 Krebs, D. L. and Janicki, M. 2004, 'Biological foundations of moral norms', in M. Schaller and C. Crandall (eds), Psychological Foundations of Culture, Lawrence Erlbaum Associates, Mahwah, NJ, pp. 125–48.
22 For example, Hoffman, M. L. 1988, 'Moral development', in M. H. Bornstein and M. E. Lamb (eds), Social, Emotional and Personality Development. Part III of Developmental Psychology: An Advanced Textbook, Lawrence Erlbaum Associates, Hove, UK, pp. 497–548; Zahn-Waxler, C., Radke-Yarrow, M. and Wagner, E. 1992, 'Development of concern for others', Developmental Psychology, vol. 28, part 1, pp. 126–36.
23 Piaget, J. 1932, The Moral Judgment of the Child, Kegan Paul, London.
24 Hoffman, M. L. 1988, 'Moral development', in Bornstein and Lamb, op. cit.
[Table (garbled in extraction): Kohlberg's levels and stages of moral development. Recoverable entries: Level 2, conventional morality; Level 3, postconventional morality; Stage 4, law and order orientation (abiding by laws for the good of society, not just for the resulting benefits to yourself).]
Source: Kohlberg, L., Levine, C. and Hewer, A. 1983, Moral Stages: A Current Formulation and a Response to Critics, Karger, Basel.
always to the same end. This means that people in any given culture need only
achieve the level of reasoning that is necessary for their culture to operate
effectively.29 Nevertheless, even when different cultures have different attitudes
to an issue, they will express the same reasoning methods when coming to their
differing conclusions. I would suggest that individuals and groups may operate
at several levels simultaneously, depending on the multiple contexts they are
operating in at any one time (home and work, friends and work, friends and
home, and so on), and that this may be a mechanism by which scientists manage
any clash between their private morals and public ethics. In addition, I would
suggest that although we may like to think we are all operating professionally
at Kohlberg's Stage 6, we are probably not, or at least not all the time and in
all circumstances.
Kohlberg's work has been subject to criticism, including the charge that it is
culturally biased. Simpson30 argued that Kohlberg's stage model is essentially
based on the Western philosophical tradition and that it is inappropriate to
apply it to non-Western cultures without adequate reflection on their different
moral outlooks. The model nevertheless appears to remain a useful indicator of
the development of moral reasoning per se, and I would suggest that as long
as cultural variations are effectively and appropriately considered as variables
when applying it, the model retains its credibility.
We must also consider the psychological effects or outcomes of our personal
cultural upbringing. A range of cultural factors may be explained by psychology
as well as by anthropology and the social sciences. Characteristics of cognition,
motivation and social interaction may answer questions about the origins and
development of any culture, and subsequently those subject to the influences
of a culture. For example, a society that values the rights of the individual
will produce people who think differently about any given scenario to people
who originate from a society that does not place the rights of the individual
high on the social agenda. Likewise, cultural views on gender, equality, age,
family structure, education, work, marriage, religion and science all influence
us heavily as we grow up, embedding within us a standard of normality by
which we measure the rest of the world. This is the basis of some of the criticism
of Kohlberg's stage model. It is probably only when we leave the confines of our
own cultural context that we are challenged to focus on our own reactions to
the 'other', and have to negotiate ways forward that enable us to coexist in a
mutually acceptable way. This can, of course, include the transition from the
private, domestic sphere to the professional, public sphere.
29 Ibid.
30 Simpson, E. L. 1974, 'Moral development research', Human Development, vol. 17, pp. 81–106.
resistance to new ideas that challenge the status quo, the selective forgetting
of data that contradict personal views, favouring of work supporting their own
views and a generally conservative approach to their practice. Conflicts between
the personal and professional value sets will obviously arise here.
This conflict may be explained psychologically by cognitive dissonance
theory,37 which describes the tension that arises when an individual's behaviour
or beliefs conflict with beliefs that are integral to his or her self-identity
(for example, when he or she is asked professionally to prioritise a value that is
not equally prioritised in his or her private value set). This tension is referred
to as cognitive dissonance. In order to resolve the dissonance, the individual
can either leave the situation that presents the source of the conflict (in this
case, the professional role) or reduce his attachment to the private value that is
being challenged or compromised. If he does the latter then he may ease the
conflict further by emphasising the positive aspects of the professional value
that is challenging him. Cognitive dissonance is clearly an important factor in
the decision-making process.
Three key psychological strategies are commonly used to alleviate cognitive
dissonance. The individual may decide to seek more supportive beliefs or
ideals that could outweigh the dissonance; in essence, this may mean masking
or suppressing the conflicting value and his response to it, which would, in
turn, potentially cause longer-term psychological stress. The individual
may convince himself that the challenging belief is not really important, and
can therefore be ignored or at least given less priority. Or he may begin to
manipulate or moderate the belief that is challenging him, seeking to bring
it more into line with his other beliefs and patterns of action. All of these
strategies involve rationalisation: the tendency to construct one's own
explanation as to why one has made particular choices in relation to the
challenging belief or action.
Festinger's cognitive dissonance theory has not been without its detractors.
Bem38 suggested that individuals' attitudes may be changed by self-observation
of their own behaviours (rather than by any response to feelings of discomfort
in a challenging situation), followed by inference of the attitudes that must
have produced those behaviours. For some time it was considered that cognitive
dissonance and self-perception theories were in competition; however, work by
Fazio et al.39 identified attitude-congruent
37 See Festinger, L. 1957, A Theory of Cognitive Dissonance, Row & Peterson, Evanston, Ill.; and Festinger, L. and Carlsmith, J. M. 1959, 'Cognitive consequences of forced compliance', Journal of Abnormal and Social Psychology, vol. 58, pp. 203–10.
38 Bem, D. J. 1967, 'Self-perception: an alternative interpretation of cognitive dissonance phenomena', Psychological Review, vol. 74, part 3, pp. 183–200; and Bem, D. J. 1972, 'Self-perception theory', in L. Berkowitz (ed.), Advances in Experimental Social Psychology, Academic Press, New York, pp. 1–62.
39 Fazio, R. H., Zanna, M. P. and Cooper, J. 1977, 'Dissonance and self-perception: an integrative view of each theory's proper domain of application', Journal of Experimental Social Psychology, vol. 13, part 5, pp. 464–79.
as a qualified and employed scientist, his priorities are twofold: accepting and
promoting the scientific method as an independent, value-free and objective
concept, and making sure he gets on well with his workmates and bosses.
He can see that his organisation's new vaccine has considerable economic
potential as well as humanitarian value; however, he is unexpectedly challenged
by the need to manage his response to some new values (in this case, a new
version of the biosecurity norm) which have been presented to him through
an international report40 he has just read. He is so persuaded by the argument
that he has given this new ethical approach a high position in his own
professional value set, and he now wishes to persuade his colleagues of the
perspective he has recognised. He is concerned that one or more of the processes
involved in the vaccine research and development could be used against human
populations if they were to be used maliciously. He sees this very clearly once
he has thought about it, and is concerned that no-one at work has considered
it before. How can he proceed?
He tries to tell his colleagues about his concerns. One or two of them agree that
there may be some theoretical risk, but generally take the view that 'it will
never happen' and tell him that he should forget about it. Some point out to him
that this new process will generate a lot of money for the organisation (and
long-term employment) and that he should not raise problems with it. He tries
to forget his concerns, but as he continues to work on the new process, he sees
more and more chances for it to be misused. Other researchers from different
branches of the organisation have had access to the laboratories during the
work, and no confidentiality mechanisms were in place that would have
prevented them from seeing the work, so the processes could already be
repeated at other sites.
Our scientist's research team leader is planning to publish the work, albeit not
in full, so as to protect the organisations economic advantage. Even so, key
processes and outcomes can be relatively easily worked out by anyone with a
basic knowledge of genetics and vaccine science from what is to be published.
In order to advance his position further in the discipline, his team leader is
also going to give some more detail in an international conference podium
presentation, where he will take questions and presumably give some answers.
Our scientist goes to see the team leader to ask him if it would be possible
to restrict the amount of information given out in the publication and at the
conference, as he has concerns about security and what could happen if the
information fell into the wrong hands (delegations from certain countries with
security problems will be present). His team leader is shocked and annoyed at
the suggestion that he could possibly be involved in work that could be used
to make, in effect, a biological weapon. He challenges our scientist to show him
40 For example, National Research Council, op. cit.
what is wrong with the work, and goes on to say that he cannot be responsible
for what someone else may do with his work. Our scientist is not sufficiently
up to speed with all the possible ways in which the work may be misused, but
he suggests that certain potential outcomes of the research could be used against
the interests of society. He finds it difficult to understand that his colleagues and
team leader cannot see the potential problems that may arise from the team's
work on interfering with the susceptibility of organisms to vaccines; they do
not seem to be concerned that it could be used against human communities. The
team leader then falls back on the argument of scientific freedom and the duty
of the scientist to share and to replicate work with other scientists. Moreover,
he also points out that the research is going to generate a major income stream
for the organisation. He suggests to our scientist that he might wish to consider
his junior position on the team, and points out that contracts will soon be up for
renewal. Over the next few weeks, the team leader begins to give more work to
other colleagues on the team and less to our scientist, who does not receive an
invitation to the conference at which the leader is to speak.
Our scientist manages to find a friend who is quite senior at another workplace,
with whom he shares his concerns in a general way. This new friend suggests
that he could think about what it is about his own values that is causing him so
much stress in this situation. He has another friend, who is not a scientist, who
says that he should ignore the stress and what others say, and simply focus on
himself and what he can do to change himself in order to manage better at work.
Our scientist is now in an ethically challenging situation and does not know
which way to turn. What should he do?
We can review our scientist's position using the concepts that we have been
looking at in this chapter. His upbringing has given him a family-based, culturally
derived value set (a bio-cultural mechanism) that encourages him to abide by the
rules in order to maintain his social position and to gain advancement socially as
he fits in with his 'tribes' both at home and at work (Wilson's theory). He has
grown up with this approach, which has worked so far, but if he has to challenge
it beyond certain narrow limits, he feels emotionally stressed (Griffiths' and
Izard's theories). He is aware that it is necessary to balance competing values
and tries his best through life to manage the periodic clashes between the need
to fit in and the need to maintain his own values (the theories of Sahlins, Gould
and Lewontin on cultural drivers overcoming genetic drivers). He feels that it
is possible to manage this sort of balance by everyone giving and taking in
order for everyone's needs and values to be accommodated (Trivers' reciprocal
altruism). He is aware that some people are manipulators and will take while
appearing to give, but as long as it all balances out he is happy and assumes
that everyone else is as well (Moore's criticism of Trivers, and the work of Mauss
on gift-giving as a means to self-promotion). He is even of the mind, at times,
that Dawkins' ideas on genes as drivers can be related to humans in the sense of
'every man for himself', although he does not like this view of life, as it seems
uncaring and inhumane.
He has a number of options for dealing with these questions (psycho-cultural
mechanisms). He is vaguely aware that he manages the periodic clashes between
his personal and professional value sets by balancing his unconscious feelings of
guilt and shame for doing wrong if and when he steps out of line in the family
or at work (Williamson's theory). His mechanisms for managing these emotional
reactions are those described by Krebs and Janicki: he always manages, usually
unconsciously, to use one of their four ways to make sure he can operate in the
'right' way in any given situation. He is, unconsciously, at different stages of
Kohlberg's model depending on the context he is operating in at any given
time, for example, at home, at work or at play.
Now that he has been confronted with the challenge at work that I have
described above, he is faced with a situation of cognitive dissonance. In order to
remain at work and to flourish in the way he wishes to professionally, he must
now either conform to the prevailing view at work by giving up his concerns
(or at least stop talking about them) or choose to leave and seek work elsewhere.
As a junior member of the team, he does not think he is in a position to influence
the team himself. If he were higher up the food chain, he might feel that this
was a third option.
Our scientist now has four options to mitigate his cognitive dissonance at work,
thereby relieving himself of any sense of ethical responsibility about the possible
outcomes of the scientific activity of his organisation. First, he may choose to
look for some other beliefs that will help him to mask the dissonance so he can
pretend it does not exist; he could do this by saying to himself that his actions
alone will have no effect in the big picture as he is only a junior researcher:
'what do I know?' Second, he may decide to go along with his colleagues and
persuade himself on some level that 'it will never happen'; he could do this
by actively deciding that he is worrying over nothing: he may be the only
person who has seen the potential problems, so maybe he is blowing it all out of
proportion. Third, he may try to manipulate,
in his head, the challenges that the dissonance is causing, so that the views of
his leader and colleagues do not cause him so much difficulty; he may do this by
reassessing the value that he places on the views of his colleagues and leaders,
deciding that their views are not as important to him as he previously thought.
Fourth, he may choose to be a whistleblower and speak publicly in some way
about his concerns.
The first three approaches involve our scientist in a process of rationalisation
whereby he will be able to invent and justify his own explanation as to how
he has reached his eventual position of relative ethical comfort (or discomfort).
Unfortunately, all these actions are likely to lead to further stress, which may
debilitate him in the future. In the first instance, he is effectively degrading his
own value in the team as a scientist and as an individual. In the second, he is
subduing his own intelligence and ignoring what his own values and intellect
are telling him. In the third, he is downgrading the value of those to whom
he naturally looks for learning. All of these rationalisations ultimately mean
compromise of his personal and professional values and are highly likely to
result in further dissonance.
In terms of self-perception theory, all three strategies will simply move him
to various stages along the continuum of attitude-congruent (acceptable) and
attitude-discrepant (unacceptable) behaviour. If he makes himself comfortable
at one point, this means that he will thereby throw up a further dissonance that
requires another relocation on the continuum. This may progress to such an
extent, raising more and more stress, that our scientist may feel that he is in an
impossible situation if he is unable to convince his colleagues and leaders to take
his concerns about the research seriously and to act on them.
The fourth approach, becoming a whistleblower, is, unfortunately, not a
stress-free option, even though it may assuage his moral and ethical concerns.
Our scientist may well blow the whistle on the research, and be the cause of its
being abandoned or amended by his organisation. So far so good: he may have
prevented a possible biosecurity threat from translating into an unwelcome
action; however, if we read Brian Martin's work on whistleblowing,41 we find
that in the process of doing this he is highly likely to be ostracised by his
workmates, sidelined (at best) in the organisation and, more likely, fired from
his post as a result. If not fired, he may find that his contract is not renewed as
the organisation is restructured. Either way, he now has no job, no references
and little hope of continuing to work in the field he had chosen to spend his
life in. This prevents or seriously delays his buying a first house, starting
a family and advancing up the career ladder. Maybe he should have kept his
mouth shut after all?
reduces the risk of biosecurity lapses occurring. One such way is to develop and
foster a new norm of biosecurity; this would become, in time, a part of everyday
scientific practice, just as the biosafety norm is hopefully embedded in daily
activities.
This is best considered in the context of the ways in which scientists carry
out their daily activities and approach potential challenges in their work.
We have seen earlier in this chapter that there are two possible psychological
mechanisms, bio-cultural and psycho-cultural, that may usefully be pursued
further when attempting to identify, understand and predict the changing
perspectives of scientists as they face the challenges of biosecurity in their
professional and personal lives. We should therefore consider these potential
explanations of the derivation and development of value sets or norms when
developing a new ethic of biosecurity and dual use. What we are faced with is,
essentially, a relativist scenario, involving not only the differing values held by
different people about specific concepts, but also the differing values held
within individuals themselves, arising from conflict between their private moral
values and their public ethical values.
Now that we have seen the challenges faced by our hypothetical scientist
(albeit supported by real-life examples in Brian Martin's work), let's
look at how cultural ideals and values can be transmitted effectively, thus,
hopefully, avoiding or reducing the chances of his situation being repeated in
real life.
Norms may be defined as formal or informal rules that govern what is acceptable
in terms of behaviours, attitudes, beliefs and values within a group or culture.
They may be explicitly stated or implicitly approved by the majority; they may
lead to warnings or reproach for those transgressing minor normative rules,
or to more severe punishment for those guilty of the transgression of norms
considered to be of great consequence. Norms develop over time, and are subject
to a range of cultural influences. They are often closely tied to religious and
social forces, and are commonly associated with group or cultural identity. Once
this cultural identification element takes hold, it may be particularly difficult to
dislodge or change a norm. Considering that such norms are usually part of our
childhood development, and therefore of our moral learning process, we can see
how they may come into tension with professional ethical values acquired later.
So how do we develop and implement a new norm? Maybe it is better to
start with a smaller task and look at what Dawkins refers to as 'memes', which
may be thought of as infant norms, for want of a better term.
Memes42 are units of cultural transmission that may be skills, knowledge, beliefs,
attitudes, behaviours or values. They are spread through observation and social
learning43 and have been called 'cultural software'44 that may be passed on as
it is or in modified forms. Dawkins has, in the past, cited the fashion for
wearing a baseball cap back-to-front as an example of a meme that has spread
through certain sections of society. One could argue that a well-entrenched
meme becomes, in effect, a new norm, if it lasts long enough. Crucially, unlike
genes and norms, which can take long periods to change, memes can, apparently,
change very quickly. It may follow, then, that ethical attitudes, knowledge and
characteristics could also be transmitted as memes, which, in passing from
one person to another, will change slightly with each transmission, allowing
development and swift cultural change to occur:
The power of human reason, made possible in part by the memes we
possess, is also the power to mutate those memes and create something
new from something old. We are not simply inheritors of a zealously
guarded patrimony but entrepreneurial products of a new cultural
software which will help constitute future generations of human
beings.45
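To illustrate the contrast drawn above, that memes can change quickly because each transmission alters them slightly, here is a minimal toy simulation. Every quantity in it (the drift size, the number of transmissions, the representation of a meme as a single number) is an invented assumption, offered only to show how small per-transmission changes accumulate.

```python
# Toy sketch: a 'meme' represented as a single number that shifts slightly
# with every person-to-person transmission. All parameters are invented;
# the point is only that small per-transmission changes accumulate fast.

import random

random.seed(0)  # reproducible run

def transmit(meme, drift=0.05):
    """Pass the meme on, changed slightly in transmission."""
    return meme + random.uniform(-drift, drift)

meme = 0.0
for _ in range(200):  # 200 transmissions within a community
    meme = transmit(meme)

print(f"Meme after 200 transmissions: {meme:+.2f}")
```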
In order for old memes to be transmuted into new memes, however, there must
be a dominant host culture that allows change to take place.
In order to enfranchise and empower all people there needs to be a mutual
exchange between equals. If some individuals or groups are marginalised or
excluded for any reason, be it at the political, legal, social or economic levels,
there is no basis for trust, mutual respect and cooperation.46
This is clearly the level at which we need to operate when developing a new
norm of biosecurity as it is embedded in the responsible conduct of science. In
any moves to take on board shared common ethical values, it is vital that we do as
much as possible to support all actors in the biotechnology field so that no-one is
left behind and marginalised. Such help does not need to be financial, although
it could be, at least in part. Help could be provided in the form of intellectual,
educational, training or other relevant forms of support as well. Once a norm
has become established, recognised and shared among growing numbers within
a community, it is important that we define appropriate penalties to be faced by
those breaking the norm. Next, we need to agree on the implementation of those
penalties; without imposed penalties, a norm, by its very nature, is effectively
42 Dawkins, op. cit.
43 Catchpoole, op. cit., p. 77.
44 Balkin, J. M. 1998, Cultural Software: A Theory of Ideology, Yale University Press, New Haven, Conn., and London, p. 43, cited in ibid., p. 265.
45 Ibid.
46 Catchpoole, op. cit., p. 267.
toothless. In today's world, it may be argued that the economic penalty is the
one of greatest effect; however, penalties need not be just economic, but that is
another argument for another day.
Let's now consider the education route as a norm developer. Education is itself a
transmitter of memes. We absorb new memes through the value sets of those who
pass new values on to us. Over time, new memes can become new, deep-rooted
norms. Social norms (such as the baseball cap) may start as memes, developing
into expectations of behaviour, then into controls that regulate behaviour and
identity (the baseball cap again). Philosophical norms imply some obligation or
duty, carrying threatened or real sanctions if they are broken. There is cultural
pressure to abide by the norm, so culture provides a setting in which the
norm may survive and develop.
Ethics and ethical attitudes may be passed on as memes (quick-changing,
initially superficial cultural artefacts) and then develop over time into norms
(more deep-rooted values and attitudes within a culture). Ethics, as norms, can
be implicit or explicit. They can vary from group to group and place to place
and can change over time. It is this capacity for change that we must harness
in order to develop a new ethics of biosecurity so that it becomes a dominant
scientific norm. The biosafety norm is based on the principle of the containment
and safe management of harmful biological agents, in such a way as to
allow ongoing scientific practice; the biosecurity norm needs to be built on
the principle of the containment and safe management of biological knowledge
and processes, also allowing ongoing scientific practice. Just as the biosafety
norm (when implemented properly) draws a line in the sand over which practice
cannot step, so must the biosecurity norm.
So how may we develop research ethics as memes-into-norms and persuade
others to sign up to them in the biosecurity context? This would mean taking a
norm of, say, 'no harm', and defining it in practical ways in order to teach the
principle. Catchpoole believes that new norms can be taught and developed as
memes. She suggests that education can choose which memes to transmit and
which it will not.47 A commonly held value, such as 'care for others', could be
transmitted educationally and instilled as a norm as part of building a
multicultural ethic of biosecurity, with a common value set, around it:
Having care for one another and our world involves finding a balance
between the needs of self, close others and global others in ways that
respect basic human rights and yet which seek to encourage more than a
minimalist commitment to having care for one another … By delineating
the nature of care itself and recognising the ubiquitous nature of power
within relationships, an extended ethic of care provides a set of
transcultural memes for both guiding and evaluating the development of
the ethical form of life in the private and public spheres.48
47 Ibid., p. 126.
The memes of ethics could therefore be taught and learned as a form of cultural
transmission in a multicultural context if they are presented from the perspective
of a commonly held value such as care for others, which few cultures would
dispute.49 At the same time, however, we need to recognise that a range of
cultural pressures may be brought to bear on the implementation of a 'no harm'
or 'care for others' value. It is one thing to state a belief in 'do no harm to others'
publicly, but at the same time be driven by a more pressing value to actually
prepare to do the opposite; we only need to look at those countries in violation
of the BWC, whose 'peaceful purposes' cloak hides offensive weapons
programs, to see real-life cases in action. This prioritisation of pressures is to be
the subject of further research.50 One could argue that the norm of no harm is
so routinely violated in science (for example, in the use of animals in product
testing) that it is diminished as an argument and is no longer worth pursuing
or developing in practice; however, this in itself is an ethical debate: we either
accept the norm of no harm as something to aspire to or give in to an
anarchic world in which care for the other no longer has a place worth arguing
for. I would suggest that it is inside the norm of no harm that much future
research is required. We can readily see that many individuals and groups apply
this norm differently, simply by the way in which they define who is actually
to be protected from harm, in what circumstances and why. This is the area
in which we seem to make our moral and ethical judgments, based on the
culturally appropriate norms with which we face everyday life.
Conclusions
This chapter has shown that we need to appreciate the variety of private moral
backgrounds of individual scientists as we seek to identify common values with
which to form a new norm of biosecurity within the public culture of ethical
science.
It is probable that there is a biological foundation underpinning our moral
development from infancy into adulthood. On top of this genetic and physiological
framework, we can identify a range of psychological processes that further
48 Ibid., p. 47.
49 Ibid., p. 186.
50 Further culture-based research is being planned by Judi Sture and Masamichi Minehata of Bradford
Disarmament Research Centre, looking at how scientists and educators prioritise and manage pressures in
their everyday work.
51 See Renfrew, C. and Bahn, P. 1991, Archaeology: Theories, Methods and Practice, Thames & Hudson,
London.
52 United Nations 2008, BWC/MSP/2008/5, p. 6.
ourselves. Fourth, we need to look out for threats or undue influence from the
five archaeological expressions of culture, as these may exert overt or covert
pressure on scientists.
Finally, by identifying even a single commonly held value, such as care for
others, it may be possible to start laying the foundations of a new value set that
will underpin our efforts in building a sustainable capacity in biosecurity in the
life sciences.
Introduction
This chapter addresses the moral dimension of dual-use research. To set the
scene, I will begin by explaining what I take dual use to be. I understand a
dual-use item to be something that has both a good or neutral (neither good nor
bad) use or application and a bad use. Three different categories of dual-use items
can be distinguished: research, technologies and artefacts.1 These are clearly
different sorts of things. Research is an activity, while technology is a form
of knowledge (knowledge of the techniques for the production of artefacts),
whereas artefacts are objects. But it is also clear that these items are related:
research aims to give us technology, which in turn produces artefacts.2 It is the
last that normally has the immediate impact on the individual, be this good
or bad. For instance, research has led to methods for mass-producing cheap
ammonium nitrate, which has a primary use as a fertiliser and a secondary use
as a bomb-making material. It is the substance, the artefact ammonium nitrate,
that is the fertiliser/bomb component, not the technology or the research. The
import of this is that when we talk of threats or risks associated with dual use,
we need to distinguish between different levels of risk or threat, depending
on the nature of the dual-use item, and between different ways of dealing
with these. For instance, substances like ammonium nitrate can be physically
contained, but the knowledge of how to manufacture them cannot.
My suggested definition of a dual-use item is therefore as follows:
An item (knowledge, technology, artefact) is dual use if there is a
(sufficiently high) risk that it can be used to design or produce a
weapon, or if there is a (sufficiently great) threat that it can be used in
an improvised weapon, where in neither case is weapons development
the intended or primary purpose.3
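Read literally, this definition is a predicate over items, with the 'sufficiently high' and 'sufficiently great' thresholds left as free parameters. The sketch below is one hypothetical formalisation; the numeric scales, field names and threshold values are my own illustrative assumptions, not Forge's.

```python
# A rough, illustrative formalisation of the definition above. The numeric
# risk/threat scales and thresholds are assumptions; the definition itself
# deliberately leaves 'sufficiently high/great' open.

from dataclasses import dataclass

@dataclass
class Item:
    kind: str                  # 'knowledge', 'technology' or 'artefact'
    weapon_design_risk: float  # risk it can be used to design/produce a weapon
    improvised_threat: float   # threat of use in an improvised weapon
    weapons_primary: bool      # is weapons development the intended/primary purpose?

def is_dual_use(item, risk_threshold=0.5, threat_threshold=0.5):
    if item.weapons_primary:
        return False  # the definition excludes purpose-built weapons items
    return (item.weapon_design_risk >= risk_threshold
            or item.improvised_threat >= threat_threshold)

# Ammonium nitrate as an artefact: primarily a fertiliser, but a serious
# improvised-weapon threat, so it comes out as dual use.
fertiliser = Item('artefact', weapon_design_risk=0.2,
                  improvised_threat=0.9, weapons_primary=False)
print(is_dual_use(fertiliser))  # True
```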
1 This is the classification that I gave in Forge, J. 2010, 'A note on the definition of dual use', Science and Engineering Ethics, pp. 111–18. This paper was written as a response to a challenge by David Resnik to come up with a definition of dual use that is neither too narrow nor too broad. I do not claim that my contribution is the last word, but I do think it is along the right lines. I also think that it is enough by way of a starting point for the present discussion. I would, however, note that the sense in which artefacts are said to be objects is a special one; see ibid., p. 115.
2 There are of course other kinds of research and other kinds of technologies besides those that aim to produce artefacts.
3 Forge, op. cit., pp. 111–18.
World War II, and that nuclear weapons prevented the Cold War from becoming
a hot one. As we learn more about the intentions of the former Soviet Union,
the latter proposition is becoming increasingly less plausible. (I am also willing
to argue, though not here, that all scientific research for the ends of weapons
innovation is wrong.12) Whatever one makes of these particular examples,
even someone with blind faith in science would be hard put to maintain that
everything scientists do has turned out for good. It should be stressed that it is
not part of the present position that when something has gone wrong then this
has been done intentionally or that scientists are always to blame. And it is far
from the present position that science either has no effect on us or is bad overall.
Clearly, science has done a great deal of good, as is clear when we review the
history of medical science.13
Once we accept SAP, it seems clear that we would like science to somehow
maximise the good outcomes and minimise the bad ones. The first step, or
maybe the preliminary step, is to get scientists to accept SAP; by this I mean all
scientists and not just those working in applied areas. Surely this should not be
too difficult and is probably already true of the majority of scientists. That all
scientists should be aware that their work can affect others can be argued with
reference to what I call the changed context of science. Indeed, since World War
II, all working scientists really should know that no scientific research is pure
in the sense that it cannot in principle affect people. Once this is established,
we get on to the hard choices: just what kind of research should scientists be
encouraged to do, and what to avoid? Philosophers can have some influence on
the policy process (at least they can in theory) by suggesting some general
guidelines. What I have tried to do is to provide a set of moral principles that
could be used to inform the choices made by scientists themselves, though
policymakers could also use them. There are different ways in which this can be
done, given what I have said so far about responsibility, and in the rest of this
section, I will outline the way I prefer.
The reader may be aware that there are two traditions in moral philosophy,
traditions that seem sharply opposed. According to the consequentialist tradition,
the moral import of an action or choice resides solely in its consequences. The
nature of the act itself, or of how we describe or characterise it, counts for
nothing. Evidently, there will be issues here for the consequentialist, resembling
those discussed above, in regard to how consequences stretching into the future
can be determined and weighted with respect to their moral significance.
Leaving these aside, a consequentialist normative ethic will specify some
12 See my Designed to Kill: The Case against Weapons Research, Springer, Dordrecht, 2012.
13 But even here there is no universal agreement: some strange folk even think that the triple antigen
vaccination is wrong. My general point here is that science is a mixed blessing and that is why it is important
to determine just what the responsibilities of the scientist are; Forge, 2008, op. cit., pp. 28–31.
property, to be taken as the good, and then require moral agents to maximise
this property by their actions. Famous nineteenth-century consequentialists
like Bentham and Mill took happiness to constitute the good, and thus moral
action was such as to maximise happiness, the agent's included. The calculus
has to include consequences where some were made unhappy, and this means
that all sorts of scenarios can be constructed in which happiness is maximised
but some small number of unfortunates is made utterly and forever miserable.
Some of these scenarios are portrayed as counterexamples to the theory. My
own account is non-consequentialist in that it does not hold that consequences
are all that matter.14 I won't make any further criticism of the consequentialist
tradition here.
Non-consequentialist moral philosophy is usually expressed as a set of norms
or rules. What these are, how these are understood and how they are justified
distinguish different non-consequentialist accounts. Modern accounts do not
usually interpret the rules as absolutely binding, and here there is a break
with the Kantian tradition. Kant famously (or notoriously) said that one
should not lie, not even to mislead a maniac looking for one's best friend with
malicious intent. The rules contained in modern accounts are thus often said
to be prima facie, or to lay down prima facie duties: they are to be obeyed in
the first instance, unless there are very good reasons not to obey them. As for
the content of these rules, I follow those like Bernard Gert who think that they
should forbid harming but not require any positive or benefiting action. Here
there is evidently a considerable difference with the traditional consequentialist.
One good reason to accept our view is that it is possible to impartially refrain
from harming everyone, but one cannot impartially benefit everyone. If I have a
limited amount of help to give, I must favour some, but moral action is supposed
to be impartial. Gert's moral philosophy, which I follow closely here, does not
exclude trying to do good; however, good action, which is understood to be
equivalent to the prevention of harm, is taken to be the content of moral ideals,
and hence agents are not bound by rule to prevent harm. It has to be said
that these are quite subtle matters and it is hard to do justice to them here.
Another way to put the difference between the rules and the ideals is as follows:
agents need to justify breaking the rules (recall that these are not absolutely
binding), but they are only encouraged to act in accord with the ideals. Gert refers
to his account as 'common morality'.
To conclude this discussion of the moral basis of responsibilities of the scientist,
we should first note that this way of doing moral philosophy makes a lot of
sense. No person in her right mind wants to be harmed: everyone, wherever
14 Note that non-consequentialism is not the contrary of consequentialism, which would be that
consequences do not matter at all.
they come from, can agree with that.15 Thus everyone can accept a moral code
that proscribes harming, and agree that everyone would be better off with
less harming. Doing good in some positive sense is much more difficult to
universalise. Moreover, striving not to harm others is surely within the reach
of everyone, while looking to maximise the good consequences of one's actions
looks just too hard. Now scientists are in a special position because of the
far-reaching implications of their work, primarily through technology.16 Therefore
it seems fair to attribute to them special responsibilities, and in line with the
proposed moral position, these should enjoin scientists in the first place not
to do research that has harmful outcomes. I have taken this to mean above all
(and at the very least) that scientists should not engage in weapons research;17
however, I acknowledge that scientists should be encouraged to do research
that will have beneficial outcomes. In The Responsible Scientist, I therefore
maintained that the responsibilities of the scientist are two tiered. The first tier
comprises this demand to avoid research that can have harmful outcomes, while
the second encourages scientists to do research that prevents harm. Research
in the biomedical sciences could well fall into the latter category, as could
research that breaks down racial and sexual stereotypes and biases. These two
tiers correspond to the moral rules and the moral ideals of common morality.
Most of this section has been devoted to the forward-looking responsibility of
the scientist, and I will come back in the final section to see how this applies
to dual-use research. The next section, on the other hand, will be concerned
mainly with backward-looking responsibility.
experiments or even theoretical work on this topic count as dual use. There are
two reasons a scientist might think it is morally acceptable for her to work on
this kind of project in applied nuclear physics. She might think that in fact there
is no bad use, or that if she only intends to work on enrichment or reprocessing
insofar as it has a benign use in civilian power reactors and that it deals with
the disposal of hazardous waste, that is okay. The first of these reasons raises an
important general issue about dual usenamely, the basis on which the uses in
question are classified.18 The second raises issues that concern attributions of
backward-looking responsibility. I will address this second matter first.
The second issue can be dealt with by what I call the wide view of
backward-looking moral responsibility, given in The Responsible Scientist. The wide view
holds that scientists are responsible for what they foresee as well as for what they
intend (and they can also be responsible for what they should have foreseen). The
view expressed by our scientist above, that one is only responsible for what one
intends, is what John Mackie called the 'straight rule' of moral responsibility and
what I call the standard view. The latter term is appropriate, as the view seems
to have slipped in as a kind of orthodoxy, being endorsed by Peter Strawson and
John Austin, both, like Mackie, renowned Oxford philosophers, but without
convincing argument. At first sight it seems plausible, because intentions are
given as reasons for action: 'Why do you work on this project?' 'To make better
nuclear fuel rods and remove hazardous nuclear waste.' The obvious question,
then, is why we should only be responsible for actions that we have a reason
for doing.19 For those who do think that responsibility must be tied to reasons
for action, and I am not one of this number, it is easy to see how a foreseen but
not intended outcome or side-effect of an action can be included in a reason for
action as an explicit qualification. In this example, it can be done by adding
'although I am aware that reprocessed fuel can be used in weapons and this
did not make any difference to my choice'. By incorporating foreseen outcomes
in this way, we get what I have called the modified standard view.20 How we
might manipulate or change the standard view is really beside the point, for it is
simply a commonplace that we do hold people responsible for what they foresee
as well as what they intend, in spite of what Mackie, Strawson and Austin might
18 I raised the issue of the role of values in the definition of dual-use items and how different sets of values
could lead to different judgments about what counts as a dual-use item in Forge, 2010, op. cit., p. 117. I have
more to say about this question below.
19 In some ways it would be clearer if this were expressed in terms of interests. Our researcher has an
interest in disposing of hazardous waste, but no interest in making material for nuclear weapons; her reason
for undertaking the project cites the former as her reason for action. Nevertheless, she sees that her work
contributes to the latter end.
20 The modified standard view, and why it is not enough, is discussed at length in Chapter 5 of Forge,
2008, op. cit. Also, it is one thing to recognise that something is a commonplace and quite another to give
it a convincing philosophical rationalehence the length of Chapter 5 and the chapters before and after it.
have thought. And hence the response of our scientist that any other outcomes
of this work are beside the point because they are not what she intends will not
wash.21
On what basis do we decide whether something has both good and bad outcomes
or uses, and hence whether it is a dual-use item? And if there are alternatives that
lead to different judgments, which do we choose? These are important questions,
ones that I cannot address fully here. But for a start we can all agree that bad
outcomes are those that are, or have the means to be, harmful. Common morality
of course would have this implication. If we think of our moral philosophy
as expressing values then common morality elevates the value of not harming
above all others, and would not, for example, allow the harming of a few to
benefit many. Again, this is surely something that most of us would accept: if
we could benefit at the expense of others being harmed then most of us would
not accept such a deal if it were offered to us (at least I hope we would not). So
I assume again that something like the values expressed by common morality
provide a plausible and realistic value system. We will see in a moment how this
can affect classifications of items as dual use. First, however, note that there is
a difference between harmful acts, acts that directly harm moral subjects, and
outcomes of research (the topic of the present discussion), which are such that
they could be the means to harm. Provision of the means to harm is not the same
as harming, and if we are to try to attach the kind of strong moral prohibitions
about harming suggested here to activities that do not directly harm, it seems
that we need some further argument.
In fact I don't think that this is too hard to supply in cases of research whose
explicit aim is to design new and better weaponry.22 Scientists who engage in this
kind of work need justification for what they do: they need to give reasons why
it is acceptable for them to engage in an activity whose objective is to provide
tools whose purpose is to harm others. The typical justification is that we need
weapons to defend ourselves from others, from the bad guys. This is similar
in form to the typical justification that we give when we feel we need to harm
othersnamely, that we need to do so to prevent harm to others and ourselves.
The acceptability of that justification will depend on the circumstances: clearly,
inflicting a great deal of harm to prevent a small amount of harm will not do.
The harm inflicted should in some way be commensurate with that prevented.
But this judgment is more difficult to sustain when it comes to deciding whether
making new and improved weapons (namely, making the means to harm rather
than harming directly) is justified. Are they really necessary? Will others
21 The challenge, of course, is, as in the previous footnote, to provide the philosophical argument.
22 Not too hard, but quite lengthy: in Forge, 2008, op. cit., pp. 155–8, I argue for a means principle that is
such as to transfer responsibility for bad outcomes using artefacts (weapons, for instance) to the designer
of the artefact.
acquire them and use them for evil purposes? What alternatives are there to
weapons research? I think these questions are extremely hard if not impossible
to answer in ways that support weapons research (namely, yes, no, none), and so
I think that weapons research is an activity that should be undertaken only in
exceptional circumstances.23
In this way we can see how to go about addressing questions about matters
that are morally suspect, such as weapons research. When the topic is dual
use, where one of the uses involves weapons, for instance, the questions become more
complicated, but I think we can now see just where the complications arise.
The assumption is that R1 is such that we know it can have bad uses. Any
research that can be harmful needs to be justified, and so far we have focused on
justifications that do not take into account any good use (that is not a product
or function of the bad use). A good use, on the present system of values, is one
that prevents harm; in the case of R1 this is the disposal of hazardous waste.
The role this will play is thus as part of the justification of the project. What
the present system of values and justifications requires for dual-use research is
that the bad use is offset by considerations that appeal to the good use, and
any other relevant considerations, and that these good uses must involve the
prevention of harm. Where the good use does not prevent harm, where it is
neutral as evaluated on the prevent system, it cannot figure in the justification
of the project. In regard to a project like R1, it may therefore even be useful
to proceed as follows: justification needs to take the form as if it were the bad
use that is the main object of the investigation (that this is foreseen but not
intended is irrelevant here), and the good use is to be cited as a reason not to
abandon the project.
or applied, and all scientists are obliged to try to look ahead and see where
their research is going. Everyone has responsibilities when they take on the
role of the scientist, and in this way scientists resemble our train guard.24 But
again there are difficulties: how are scientists able to look into the future and
see where their research will lead, for are not outcomes unpredictable? This
matter was raised above, where I conceded that some research is unpredictable
but also maintained that some is not. Moreover, in the present context where
science is heavily sponsored and funded, it can only operate where it promises
to give useful outcomes. Were this not true then institutions such as defence
departments would not spend more money than anyone on scientific research.25
Science is now far too big and expensive to be left undirected. Of course, when
scientists propose research projects and say what they hope to achieve, they
maintain that these are desirable outcomes, that their methods are proper and
so on. They do not, as a rule, list undesirable outcomes; these will not normally
be the focus of attention of researchers, but I do not see any reason they could
not be. I conclude that the present account of the responsibilities of scientists
has the implication that they are obliged to do their best to look ahead and try
to see if their work will lead to any bad outcomes.
These kinds of assessments involve judgments about risk and threat, and are
therefore more or less uncertain, as indeed are good outcomes, because all
research has some uncertainty associated with it. The present account implies
a cautious and conservative approach, and this may not be welcomed by all
researchers. Consider the following familiar scenario: a research project is aimed
at uncovering the structure of a pathogen with the aim of finding a preventative
therapy or cure for the condition that it causes; however, this knowledge could
also be used for making the pathogen more virulent and resistant to the very
measures that the project was designed to put in place. The present account,
as we saw in section one, sees the prevention of harm as something to be
encouraged. This is the second tier of the two-tiered account of the
forward-looking responsibility of the scientist, and we can all agree that the prevention
of harm is a good thing. But the first tier forbids scientists from doing harmful
research, and that strongly proscribes any bioweapons research, including
bioweapons research that is unintended. Where does this leave the research
project? It does not follow that it should not be undertaken, but it does follow
that justification must be given. And here an assessment of the costs versus the
benefits is needed, as well as control of the results.
24 I discuss the case of Frederic Joliot-Curie, who published nuclear data in 1939 against the urging of
Szilard, in Forge, 2008, op. cit. Fortunately, the data were incorrect and did not seem to interest the Nazi
scientists. Joliot-Curie's actions were irresponsible even back then.
25 For many years the US Department of Defense has spent more money on scientific research than anyone
else.
If there is some risk that the project will be used for bad ends then it seems
clear that not only should the results not be published, but also they should be
tightly controlled. Michael Selgelid26 has raised the question of the censorship
of research in regard to dual-use questions, and has advocated a moderate and
balanced position. The present account will certainly suggest stricter controls
(perhaps another unwelcome implication). From the perspective of common
morality, agents are free to do whatever they wish, as long as this does not
harm others. Thus, scientists are free to do whatever research they like, but
the freedom to research, or research itself, is not given special value or status.
When there is a risk that research will cause harm then it is firmly proscribed.
And when research also has benefits then it should be conducted with effort to
minimise harms, and if this entails censorship then that is what is required for
the project to go ahead. Indeed, it seems that this is precisely the view that makes
the most sense. The response that the whole purpose of publication is to make
ideas and results open to all members of the community who can then build on
them will not do here. Networks of respected colleagues can be informed, but
the risks of open publication may be too great with dual-use research.
Conclusion
I conclude that the account of science and responsibility put forward in
The Responsible Scientist has relevance for the ethics of dual-use research, and
in two sorts of ways. In the first place, the theory of responsibility ties the
scientist to the outcomes of his or her work, and does so more tightly than
other viewpoints. For instance, an account that incorporated the straight rule
and did not see any special sort of forward-looking responsibility for scientists
would have quite different implications; however, I think that the topic of dual
use shows that we want a wide-ranging account of responsibility, and we want
researchers to think carefully about what they do. The second way in which the
account is relevant is in the way it sees responsibility as being discharged or
cashed out. This is with respect to the no-harming ethic of common morality.
I suspect that this will be the more controversial aspect of the present proposals.
26 Selgelid, M. 2007, A tale of two studies: ethics, bioterrorism, and the censorship of science, Hastings
Center Report, vol. 37, no. 3, pp. 35–43.
9. An Expected-Value Approach to
the Dual-Use Problem
Thomas Douglas
In this chapter I examine how expected-value theory might inform responses
to what I call the dual-use problem. I begin by defining that problem. I then
outline a procedure, which invokes expected-value theory, for tackling it. I first
illustrate the procedure with the aid of a simplified schematic example of a
dual-use problem, and then describe how it might also guide responses to more
complex real-world cases. I outline some attractive features of the procedure.
Finally, I consider whether and how the procedure might be amended to
accommodate various criticisms of it. The aim is not to defend the procedure
in its original form, but to consider how far we must deviate from it to evade
the criticisms that I consider. I seek to show that, though it is necessary to
make substantial amendments to the procedure, we need not eschew a role for
expected-value theory altogether. Even if one accepts the criticisms I discuss, it
is possible to defend a role for expected-value theory in responding to dual-use
problems.
the individual states. We can think of the expected value of the gamble as a
weighted average of the values of the individual states that may result, with the
weights given by the probabilities of those states coming about.4
To illustrate, suppose that you are considering whether to engage in a gamble
that involves tossing a coin. If the coin turns up heads, you win $10; if it turns
up tails, you lose $5. The probability of a heads is 0.5, and likewise for tails. If
you don't take the gamble, you win or lose nothing. Thus, the expected value of
the gamble will be given by
probability(heads).value(heads) + probability(tails).value(tails)
= 0.5 × $10 + 0.5 × (−$5)
= $2.50.
The expected value of this gamble is $2.50. On the other hand, the expected
value associated with declining the gamble is $0, as, with certainty, one wins
nothing and loses nothing. Since the expected value of the gamble exceeds that
of the alternative, the expected-value approach under consideration will advise
you to take the gamble.
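The calculation is simple enough to check mechanically. A minimal Python rendering (an illustration added here, not part of the chapter):

def expected_value(outcomes):
    # outcomes: iterable of (probability, value) pairs for one course of action
    return sum(p * v for p, v in outcomes)

gamble = [(0.5, 10.0), (0.5, -5.0)]   # heads: win $10; tails: lose $5
decline = [(1.0, 0.0)]                # with certainty, win or lose nothing

print(expected_value(gamble))    # 2.5
print(expected_value(decline))   # 0.0
# Since 2.5 > 0.0, the approach advises taking the gamble.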
Now let's return to our schematic dual-use problem. The gamble faced by the
scientist has four possible outcomes: I, II, III and IV. So the expected value of
the gamble, Vgamble, will be given by
Vgamble = p(I).v(I) + p(II).v(II) + p(III).v(III) + p(IV).v(IV)
where p(X) is the probability of state X and v(X) is the value of state X. (A
difference with the coin toss example, however, is that in this case, the values of
the states are supposed to reflect their overall value, not simply their value to the
agent, in this case, the scientist. Our question is not whether the scientist ought,
from a self-interested point of view, to undertake the research, but whether it
would be morally right for her to do so. Insofar as the value of an outcome bears
on the moral rightness of the action that brings it about, it is usually thought to
be the overall value that matters.)
Plugging in the values and probabilities that we assigned to these gambles, we
get the following result:
Vgamble = 0.475 × 5000 + 0.025 × (−100 000) + 0.025 × (−95 000) + 0.475 × 0
= −2500.
4 Some use the term 'expected value' in a way that presupposes that the values of individual states are
measured in monetary terms. Expected value can then be contrasted with 'expected utility', which is instead
determined by the amount of utility contained in each state of the world. I use 'expected value' in a way
that is neutral between different metrics of value such as monetary value, utility and health. I thus regard
expected utility as a species of expected value.
That is, an expected 2500 lives will be lost, on net. On the other hand, the expected
value of the alternative (obtaining state IV with certainty) is given by:
Valt = v(IV)
= 0.
The approach under consideration would advise taking the gamble if and only
if Vgamble exceeds Valt, and against taking the gamble if and only if Valt exceeds
Vgamble. Given the probabilities and values that we have used, Valt exceeds Vgamble,
so the approach will recommend against taking the gamble, that is, against
pursuing the scientific project.
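The same arithmetic can be verified in a few lines of Python (the signs of the state values below are reconstructed from the stated net expected loss of 2500 lives; the sketch is mine, not the chapter's):

states = {
    'I':   (0.475,    5000),
    'II':  (0.025, -100000),
    'III': (0.025,  -95000),
    'IV':  (0.475,       0),
}
v_gamble = sum(p * v for p, v in states.values())
v_alt = 0.0   # state IV obtains with certainty
print(v_gamble)           # -2500.0
print(v_gamble > v_alt)   # False: recommend against pursuing the project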
example, there are many different levels of severity (in terms of both lives lost and
other negative consequences) that a bioterrorist attack could have. In addition,
though we assumed above that each outcome would occur either once or not at
all, in actual cases many of the important good and bad applications of scientific
output could occur multiple times. Finally, the development of a technology or
piece of scientific knowledge might have good and bad effects that are unrelated
to how it will subsequently be used (for example, one good effect might be
that there is now more intrinsically valuable knowledge; one bad effect might
be that valuable research funding is consumed).5 These complications all serve
to greatly multiply the number of permutations of different states of the world
that could result from the scientists decision, compared with the simplified
dual-use problem outlined above.
Moving from schematic examples to the real world would thus greatly
complicate any attempt to apply expected-value theory in order to resolve
dual-use problems. However, it does not obviously raise any in-principle difficulties.
One can still envisage an approach to such problems that would: 1) identify the
possible outcomes of each alternative course of action; 2) identify the possible
states of the world, each consisting of combinations of these outcomes;6 3)
explicitly assess the values and probabilities of these states of the world; 4)
use these probabilities and values to calculate the expected value associated
with each alternative; and 5) select the alternative associated with the highest
expected value. I will call this approach the expected-value procedure or EVP.
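As a sketch, the five steps might be wired together as follows (steps 1 to 3, the hard empirical and evaluative work, are here reduced to given inputs; the function name and structure are mine, not part of the chapter):

def expected_value_procedure(alternatives):
    # alternatives: dict mapping each course of action to a list of
    # (probability, value) pairs, one pair per possible state of the world.
    # Steps 1-3 are assumed to have produced these inputs.
    # Step 4: compute the expected value of each alternative.
    expected = {action: sum(p * v for p, v in states)
                for action, states in alternatives.items()}
    # Step 5: select the alternative with the highest expected value.
    return max(expected, key=expected.get)

# Hypothetical usage:
# expected_value_procedure({'publish':  [(0.95, 5000), (0.05, -100000)],
#                           'withhold': [(1.0, 0)]})   # -> 'withhold'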
act is morally right if and only if its consequences will be at least as good as
those of any alternative action. One variant of consequentialism holds that an
action's actual consequences are what determine its rightness. Others hold that
the action's foreseen, foreseeable or likely consequences are what are important.
Proponents of these latter variants typically hold that the goodness of a set
of foreseen/foreseeable/likely consequences is determined by its expected value.
Their variants of consequentialism thus imply that an action is right if and only
if it is associated with at least as much expected value as any alternative. Thus,
if one applies the expected-value procedure outlined above when faced with a
dual-use problem, one can think of oneself as explicitly trying to identify and
adopt the course of action that is morally right according to some variants of
consequentialism.7
Third, the expected-value approach has a track record. The methodology used
in the EVP is a methodology that has been widely employed, often under the
label 'risk analysis', 'risk management' or 'cost-benefit analysis', to inform
major corporate and government decisionsfor example, decisions between
alternative power generation projects or mining operations.8
This strategy is motivated by the thought that some authors have, in other areas,
been too quick to move from the thought that a straightforward expected-value
approach faces fatal problems to the claim, often implicit, that expected-value
theory is not relevant at all to the problem at hand. Insofar as my argument
succeeds in showing that, at least in relation to the dual-use problem, one can
accommodate the most important objections to a straightforward expected-value
approach without giving up entirely on expected-value thinking, I hope I
will have established that such a swift rejection of expected-value theory is not
justified in relation to the dual-use problem.
1. No adequate metric
An initial criticism of the EVP focuses on the need, in that procedure, to evaluate
various possible states of the world. It denies that there is any adequate metric
of value for making such evaluations.
In order to apply the expected-value procedure, we will need a cardinal metric
of value that can be used to evaluate the various states of the world that might
follow from different responses to a dual-use problem.9 Moreover, we will want
this metric to assign a value to every state of the world that is taken as an
input into the EVP, and to capture all of the morally significant features of each
state. Arguably, there is no such metric. One possibility would be to find some
cardinal metric of individual wellbeing. We could then sum the wellbeing of
all individuals in each state of the world and use this measure of aggregate
wellbeing to compare different states of the world. A problem, however, is that
it is controversial whether there is a common cardinal scale on which we can
measure and sum the wellbeing of different people. Some deny that there is any
cardinal metric that allows the wellbeing of different people to be measured in a
reliable way,10 while others question whether interpersonal comparisons of wellbeing
have any meaning at all, implying that, as a matter of principle, there could
be no such metric.11 In addition, an aggregate wellbeing metric might fail to
capture some morally significant features of each state of the world. Some would
argue that, holding aggregate wellbeing constant, differences in the distribution
of wellbeing across individuals are morally significant. For example, some argue
that it is worse if a fixed amount of wellbeing is distributed less equally, justly
9 A cardinal metric is one that preserves orderings uniquely up to positive linear transformations. In a cardinal
metric, the interval between two points on the scale has a consistent meaning: it means the same regardless of
where those points fall on the scale.
10 See, for example, Jevons, W. S. 1970 [1871], The Theory of Political Economy, Penguin Books,
Harmondsworth, UK, Introduction.
11 See, for example, Robbins, L. 1935, An Essay on the Nature and Significance of Economic Science, 2nd edn,
Macmillan, London; Arrow, K. J. 1963 [1951], Social Choice and Individual Values, 2nd edn, Yale University
Press, New Haven, Conn., p. 9. Robbins allows that interpersonal comparisons of wellbeing have meaning as
disguised normative judgments, but denies that they have any descriptive meaning.
or fairly.12 Similarly, others would argue that, in assessing the value of a given
state of the world, it matters not just what distribution of wellbeing it contains,
but also how that distribution came about.
12 See, for example, Temkin, L. 1993, Inequality, Oxford University Press, New York; Persson, I. 2008, The
badness of unjust inequality, Theoria, vol. 69, nos 1–2, pp. 109–24.
3. Bias
In attempting to identify and assign probabilities and values to different
outcomes, agents might have systematic tendencies to neglect or exaggerate
some considerations. For example, people might overstate the disvalue of
4. Demandingness
Applying the rEVP comprehensively and conscientiously might require
substantial time, effort, expertise and financial resourcesall resources that
could otherwise be spent on other worthwhile activities. It might also be
psychologically burdensome in the sense of requiring those who apply it to
override their natural inclinations (suppose that a journal editor strongly
committed to academic freedom decides, on the basis of the rEVP, that she
must heavily censor a submitted article). Applying the rEVP might, due to
the presence of such costs, have negative overall consequences even if it leads
decision-makers to make the best choices. The benefits of making the best
decisions might be outweighed by the costs associated with the means via
which those decisions were reached.
13 Lichtenstein, S., Slovic, P., Fischhoff, B., Layman, M. and Combs, B. 1978, Judged frequency of lethal
events, Journal of Experimental Psychology: Human Learning and Memory, vol. 4, no. 6, pp. 551–78; Sunstein,
C. 2007, Worst-Case Scenarios, Harvard University Press, Cambridge, Mass., p. 6.
14 Babcock, L., Loewenstein, G., Issacharoff, S. and Camerer, C. 1995, Biased judgments of fairness in
bargaining, American Economic Review, vol. 85, no. 5, pp. 1337–43.
15 Desvousges, W. H., Johnson, F. R., Dunford, R. W., Boyle, K. J., Hudson, S. P. and Wilson, N. 1993,
Measuring natural resource damages with contingent valuation: tests of validity and reliability, in J. A.
Hausman (ed.), Contingent Valuation: A Critical Assessment, North-Holland, Amsterdam, pp. 91–159;
Fetherstonhaugh, D., Slovic, P., Johnson, S. and Friedrich, J. 1997, Insensitivity to the value of human life: a
study of psychophysical numbing, Journal of Risk and Uncertainty, vol. 14, pp. 283–300.
16 Tversky, A. and Kahneman, D. 1981, The framing of decisions and the psychology of choice, Science,
vol. 211, no. 4481, pp. 453–8.
attempting to satisfy the expected-value criterion might not be the best way
of in fact satisfying that criterion, or might involve costs that outweigh the
benefits of satisfying it.
This distinction between criteria of rightness and decision procedures has long
been emphasised by consequentialists when presented with concerns about
demandingness and bias and other related objections. Consequentialists have
argued that their theory provides an abstract description of which acts are
right and which are not. It does not provide a procedure via which to decide
how to act.18 In fact, consequentialism is consistent with adopting some other
ethical theory as a procedure for guiding decisions. In an ideal world where
we responded to our evidence in a perfectly rational and costless way, we could
simply apply consequentialism as a decision procedure. But the actual world
is not like that. In the actual world it may, according to consequentialists, be
best not to make decisions by explicitly calculating which act will have the best
consequences.
Drawing a distinction between criteria of rightness and decision procedures
allows us to retain the basic idea underpinning EVP (and thus rEVP) while
deviating from these procedures. More importantly, it provides us with a
higher standard, the EVC, against which we can measure alternative decision
procedures. Other things being equal, we should prefer a decision procedure
that comes closer than some alternative procedure to recommending the courses
of action that are right according to the EVC, the courses of action associated
with the highest expected value. In an ideal world, the best decision procedure
would be the EVP or rEVP. In the actual world, it may be some amendment of
these procedures, or even a wholly different approach. Either way, it is plausible
that we should keep the EVC in mind as a standard by which to judge competing
decision procedures.
I now turn to consider two criticisms that suggest that even the EVC will need
to be rejected.
6. Agent-relativity
The EVC is also blind to the way in which an agent posed with a dual-use
problem contributes to a good or a bad outcome; it focuses only on the probability
and value of the outcome. But, according to many nonconsequentialist ethical
theories, an agents relation to an outcome is also important. Consider a case in
which a scientist performs and publishes some piece of research in synthetic
biology that is intended to develop a new cure for cancer but could also be
misused in unjustified biowarfare. It might be argued that this possible negative
outcome should be discounted relative to at least some of the other outcomes
because
1. It is not caused by the scientist, but is merely allowed to occur. In contrast,
at least some positive outcomes of the researchfor example, the production
of intrinsically valuable knowledgemight be said to be caused by the
scientist.20
2. It is not intended, but is merely foreseen, by the scientist. By contrast, the
possible positive outcome of developing a cure for cancer is intended.21
19 See, for example, Hansson, S. O. 2003, Ethical criteria of risk acceptance, Erkenntnis, vol. 59, no. 3, pp.
291–309.
20 This claim could be grounded on the doctrine of doing and allowing.
21 This claim could be grounded on the doctrine of double effect. See Suzanne Uniackes Chapter 10, in this
volume.
of as a weighted average of the values of those states, with the weights given
by their probabilities. But we could also weight the values using further
factors to accommodate agent-relativity or rational risk aversion. Consider a
nonconsequentialist approach that takes into account the intentions of an agent
posed with a dual-use problem as well as the consequences of her decision.
It may be helpful to think of such an approach as a variant of the
expected-value approach that weights the values of different possible states of the world
according to both their probability and whether they were intended.22 Adopting
such a formalised approach may help to ensure that we are clear and explicit
about precisely how much the intentions of the agent matter.
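A schematic version of such a weighting, in the spirit of the earlier sketches (entirely my illustration; the discount applied to merely foreseen bad outcomes is an assumed parameter, and the chapter does not propose particular weights):

def agent_relative_expected_value(states, foreseen_discount=0.5):
    # states: iterable of (probability, value, intended) triples.
    # Intended outcomes, and good outcomes generally, count at full weight;
    # bad outcomes that are merely foreseen are down-weighted.
    total = 0.0
    for p, v, intended in states:
        weight = 1.0 if (intended or v >= 0) else foreseen_discount
        total += p * v * weight
    return total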
Conclusions
I have outlined an approach to dual-use problems that involves: 1) identifying
possible outcomes of each alternative course of action; 2) identifying possible
states of the world that might result, each consisting of a combination of these
outcomes; 3) explicitly assessing the values and probabilities of these states of
the world; 4) using these probabilities and values to calculate the expected value
associated with each alternative course of action; and 5) selecting the alternative
with the highest expected value.
This expected-value procedure is attractive because it captures some
commonsense intuitions about how to respond to risks and benefits, is consistent
with a major school of ethical thought (consequentialism), and has a track record
in government and corporate decision-making. It is, however, also susceptible
to several criticisms, which may justify deviation from it. Criticisms adverting
to the lack of an adequate metric for ranking states of the world and to empirical
and moral uncertainties suggest that we may need to regard expected values
as providing only an imperfect indication of which course of action to pursue.
Criticisms appealing to bias and demandingness suggest that, even if the right
course of action to take when faced with dual-use problems is always the one
that maximises expected value, it may be best to adopt a decision procedure
that does not involve calculating expected values at all. Finally, criticisms based
on agent-relativity and the rationality of risk aversion go even further: they
suggest that in some cases the right course of action will not be the one that
maximises expected value.
I have not assessed whether these criticisms are persuasive. But I have tried to
show that, even if they are, there may be some value in retaining the expected-value criterion.
22 The expected value assigned to an alternative would then be relative to the agent making the decision
between alternatives; it would no longer be a value that could be ascribed from any standpoint.
Further questions
Suppose that we do keep the EVC in mind, at least as a starting point for further
discussion. Two important further questions arise.
First, what outcomes should be included among the positive and negative
outcomes of a course of action? For example, should the production of knowledge
be included as a positive outcome independent of any positive applications of
that knowledge? This will depend on the controversial question of whether
knowledge has any intrinsic value.
A second question is, to the extent that anyone should ever explicitly assess the
expected value associated with a particular strategy for preventing misuse, who
should do it? For example, should the decision be made by some form of expert
committee, or should it be made by a political or representative body?
I leave these questions as potential topics for future discussion.
to the morality of what she does. This view can take absolutist and
non-absolutist forms. In its absolutist form, it holds that certain types of actions,
such as intentionally killing an innocent person, are intrinsically wrong and
always morally impermissible (absolutely prohibited). A non-absolutist version
maintains a very strong moral constraint against certain types of actions such
as those that involve the intended killing of an innocent person; on this view,
such actions are always intrinsically morally objectionable even if in extreme
circumstances they might be necessary in the absence of any morally acceptable
alternative.2
Not all prominent contemporary normative moral theories regard an actor's
intention as directly morally significant in this way. An entirely outcome-oriented
(consequentialist) theory such as utilitarianism, for instance, does
not share the assumptions on which the DDE is founded and thus regards the
DDE as irrelevant to the moral evaluation of actions that have both good and
bad effects (including instances of dual use). Furthermore, some of those who
accept or are sympathetic towards the moral assumptions behind the DDE
nevertheless regard the DDE itself as problematic in some respects. There is an
extensive critical philosophical literature on issues surrounding the DDE, which
obviously cannot be explored here.3 Since the principal purpose of this chapter
is to address the relevance of the DDE to the ethics of dual use, I shall assume
as a modus operandi that there is a morally significant difference between a bad
effect of an action that the actor intends and one that she foresees but does not
intend and that this distinction can be directly relevant to the moral evaluation
of some actions that have both good and bad effects. This assumption can be
disputed but it represents a widely accepted view.
A key to understanding the nature and purpose of the DDE can be found in
its origins. St Thomas Aquinas appealed to a distinction between an intended,
as opposed to a foreseen, effect of an action in his explanation of why it can
be permissible to kill another person in self-defence.4 Homicide in self-defence
posed a problem for Aquinas because he held the more general position that it
is never permissible for a private person to engage in intentional killing. Does
this view imply that I cannot legitimately use lethal force on an unjust attacker
if it is necessary to save my own life? In responding to this question, Aquinas
claimed that the act of fending off an attacker can have two effects: a good
effect (saving one's own life), which the self-defending actor intends, and also
2 This latter position is sometimes referred to as threshold deontology. See Alexander, L. and Moore, M.
2007, Deontological ethics, in Stanford Encyclopedia of Philosophy, Stanford University Press, Stanford,
Calif., <http://plato.stanford.edu/entries/ethics-deontological/> (viewed 6 February 2012).
3 See, for example, McIntyre, A. 2009, Doctrine of double effect, in Stanford Encyclopedia of Philosophy,
op. cit., <http://plato.stanford.edu/entries/double-effect/> (viewed 6 February 2012).
4 Aquinas, St Thomas 1966, Summa Theologiae, vol. 38, Blackfriars edn, Eyre & Spottiswood, London, 2a,
2ae, 64, article 7.
a bad effect (the attackers death), which the actor can foresee but need not
intend. Thus, Aquinas maintained, homicide in genuine self-defence does not
contravene an absolute prohibition against intended killing.
The distinction between an actions intended effects and those effects that the
actor foresees but does not intend is central to what later came to be known as
the doctrine of double effect. Subsequent to Aquinas's discussion of homicide
in self-defence, the DDE was developed as part of natural-law reasoning about
the morality of a range of actions that can have both good and bad effects, where
the bad effect is something that it would be wrongful to intend. (Those who
invoke the DDE in cases of double effect need not share Aquinas's view that
homicide in self-defence against an unjust attacker is always unintended killing
or that it must be justified as such.5 In contemporary applications of the DDE
the foreseen bad effect in question is usually the death of an innocent person.)
In its traditional form, the DDE is now a general principle of practical ethics
that holds that in some circumstances and under specific conditions it is morally
permissible to cause a foreseen bad effect of a type that is always morally wrong
to intend. These conditions are
1. the act itself must be morally good or at least indifferent
2. the agent must not positively will (intend) the bad effect
3. the good effect must be produced directly by the action, not by the bad effect
4. the good effect must be sufficiently desirable to compensate for the bad
effect.6
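The four conditions operate as a conjunction: failing any one defeats permissibility. A toy Python encoding makes this structure plain (my sketch, not a traditional formulation; the proportionality judgment of condition 4 is supplied as an input rather than computed, since, as discussed below, it is a judgment rather than a calculation):

def dde_permits(act_good_or_indifferent, bad_effect_intended,
                good_produced_via_bad_effect, good_proportionate_to_bad):
    # Conditions 1-4 must all hold for an act of double effect
    # to be morally permissible.
    return (act_good_or_indifferent              # condition 1
            and not bad_effect_intended          # condition 2
            and not good_produced_via_bad_effect # condition 3
            and good_proportionate_to_bad)       # condition 4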
Although the DDE has its origins in Thomistic and natural-law ethics it has been
taken up more widely as part of secular moral thinking by those who maintain
that whether or not an actor intends to bring about a particular bad outcome,
such as an innocent person's death, is itself a morally significant feature of her
action that can make a difference to the morality of what she does. For instance,
the DDE is frequently invoked in a number of prominent contemporary contexts,
most notably in relation to the ethics of war and to issues of medical ethics. These
applications include the foreseen killing of noncombatants as an incidental
effect of aiming at a military target, the use of triage in hospital emergency
rooms and on battlefields, some cases of risky surgery, and the administration
of increased doses of pain relief that suppress respiration.7 It is perhaps worth
pointing out here that within its natural-law context, the DDE was developed
5 For a critical discussion, see Uniacke, S. 1994, Permissible Killing: The Self-Defence Justification of Homicide,
Cambridge University Press, Cambridge, ch. 4.
6 A fuller statement can be found in the New Catholic Encyclopedia, 1967, vol. 4, McGraw-Hill, New York,
pp. 1020–2.
7 See Uniacke, S. 2007, The doctrine of double effect, in R. Ashcroft et al. (eds), Principles of Health Care
Ethics, 2nd edn, Wiley & Sons, Chichester, UK.
effect, including the example just described, where a foreseen bad effect of an
action is clearly incidental to what the actor aims to achieve in acting as she
does.) While the actors intention is central to the guidance offered by the DDE
it is not the sole consideration. For an act of double effect to be permissible the
intended good effect must also be sufficiently morally weighty in comparison
with the foreseen bad effect to warrant causing the bad effect. The DDE's fourth
condition requires a judgment of proportionality. This particular condition
could permit my swerving the car in order to avoid hitting a greater number of
innocent people in the circumstances described above, but it would not permit,
for example, my swerving a car in the direction of a pedestrian in order to avoid
hitting a dog or damaging property.
An obvious question is why we should bother with a distinction between
intention and foresight in circumstances such as these. Why not simply consider
something like the DDE's fourth condition and compare the probable outcomes
of the available alternatives (for example, to swerve or not to swerve) and say
that someone who is faced with making such a decision should act so as to save
as many innocent people as possible? Those who think that the DDE is morally
significant in such cases would reply that if we simply appeal to a principle that
tells us to save as many innocent people as possible and do not also invoke a
distinction between an effect that an actor intends as opposed to an effect that
she (merely) foresees, we commit ourselves to killing some innocent people as
a means of saving others, and this is morally unacceptable. It is one thing that I
put the life of a pedestrian in grave danger in swerving the car in order to avoid
hitting a greater number of people; it would be another thing to cause the death
of an innocent person as a means of preventing harm to a greater number by, for
example, using an innocent person as a human shield.
A related point is that according to the DDE, actions of double effect that
meet its four conditions are morally permissible. (They are not necessarily the
right thing to do all things considered or what the actor is morally required to
do.) If under the fourth condition the foreseen bad effect is disproportionate
in relation to the intended good effect then the action under consideration is
held to be impermissible. The fourth condition does not represent an
all-things-considered judgment based on the action's predicted or actual overall outcome.
This point of clarification is important in the context of a discussion of the
application of the DDE to the ethics of dual use because dual-use dilemmas are
often explicitly discussed solely in cost/benefit terms, as being entirely a matter
of whether the overall likely benefits of an activitys good use can be traded
off against and would outweigh the risks of its malevolent or negligent use. As
with actions of double effect, however, the basic question posed by a dual-use
dilemma is, I take it, whether it would be morally permissible to engage in the
activity in question given the risks of its misuse, and not whether it would be
morally right or morally obligatory to do so.
because they involve the actions of others, and this epistemological difficulty
is prominent in the literature on the ethics of dual use in relation to the
so-called precautionary principle.10 What is less often highlighted is that dual-use
dilemmas also involve the complex question of the moral significance of another
persons further agency in relation both to the degree to which I ought to take
this into account in deciding what it is permissible for me to do and to the
degree to which by acting in a certain way I am morally obliged to try to prevent
the further agency of another person or its bad effects. Above I have brought
out this moral complexity by identifying a number of dissimilarities between
dual-use dilemmas and typical cases of double effect. Might considerations that
inform the DDE and the debate surrounding its application nonetheless assist us
in reasoning about this moral issue in relation to dual use? I will suggest in the
next section that they can.
Prior to that discussion, as a crucial first step, we must ask whether the DDE
is applicable to cases in which the foreseen bad effect is due to the agency of
another person. (If it is not then dual-use dilemmas are not instances of double
effect.) I see no reason in principle why it cannot be. Consider a modified version
of the earlier example in which my cars brakes fail, but where I will hit the
pedestrian only because when I swerve to the left he will be jostled into the
path of my car by someone else. We can also think about the DDE in relation to
a case of duress. Say the Gestapo threatens to torture or kill an innocent hostage
unless I tell them the whereabouts of a Jewish family in hiding. In resisting this
threat I foresee but I do not intend the injury that the Gestapo will inflict on an
innocent hostage, even if I could prevent it by giving in to the threat. In a third
type of case, we might apply the DDE to my decision to work as a bartender,
where in taking up such a position I foresee that despite taking due care, at
some future point I will almost certainly serve alcohol to someone who will then
drive while intoxicated and possibly injure or kill someone.
If the DDE is indeed applicable to examples such as the three just described, in
which a foreseen bad effect is due to the agency of another person, we cannot
conclude simply on this basis that the DDE is therefore applicable to dual-use
dilemmas. This is because there are also various respects in which each of these
three examples differs structurally from a dual-use dilemma. For instance, in
the first, modified car-swerving example, although the bad effect is contingent
upon the intervening agency of another person (who jostles the pedestrian into
the path of my car), nonetheless I will directly hit the pedestrian by swerving
10 This is a general principle that states that if an action or activity has a suspected risk of causing serious
harm to the public or to the environment, and experts disagree about whether the action or activity will or is
likely to cause this harm, then it is up to those proposing to undertake the action or activity to prove that it will
not cause this harm. The implication of the principle is that without such proof they should not proceed.
my car.11 In the second, duress example, the act in question (my refusal to reveal
the family's whereabouts to the Gestapo) is a negative action, as opposed to
something that I do; furthermore, even if I could protect the innocent hostage
by telling the Gestapo what they want to know, the injury inflicted on the
hostage will be wholly due to the actions of another agent (the Gestapo). In the
third, bartender example, the action in question is my assumption of a
longer-term activity, as opposed to an individual action of double effect. Nonetheless,
the third example is probably structurally the closest to an instance of dual
use of the three examples, in that it involves positive action on my part with a
foreseen bad effect that is both indirect and due to the agency of another person.
More importantly, the third example also shares a crucial feature with dual-use
dilemmas that is missing in the second, duress example, but is present in the
first, modified car-swerving example. And this is that in the circumstances as
they actually occur, my positive action plays a specific causal role in bringing
about the bad effectnamely, it enables or aids another person to do something
wrongful (to jostle the pedestrian into the path of my car/to drive while
intoxicated/to create a bioweapon). (In the second example, on the other hand,
my refusal to give in to the Gestapo's threat does not enable them to torture or
kill the innocent hostage: it does not provide them with the means.) For this
reason, in the first and the third examples the description of the foreseen bad
effect of my positive action includes enabling or aiding the wrongful action of
a third party, and this is also true of dual-use dilemmas where what I do (for
example, publish research) enables or aids its malicious or negligent use.
The fact that in each of these cases the further actions of another agent are
properly regarded as a foreseen bad effect of my action presses the following
question: under what conditions am I responsible for the actions of another
person in the sense that I must regard what he or she does as an effect of what
I do for which I can be accountable? This is a broad and complex question that
clearly cannot be addressed in detail in this chapter. Nevertheless, in the next
section I shall identify considerations relevant to the DDE that can also shed
important light on how we should answer this question in the case of dual-use
dilemmas.
good effect. On the contrary, according to the DDE the actor is responsible for the
bad effect in an important sense: namely, she must take it very seriously into account in deciding how to act, and she must morally account for having
brought it about or not prevented it when she might have done otherwise.12 As
an upshot of this responsibility for the bad effect, the actor must try to achieve
the good effect without the bad if she can possibly do so. It might uncritically
be assumed that this type and degree of responsibility for the bad effect are
due to the fact that in typical cases of double effect, although the bad effect is unintended, it is both a foreseen and a direct effect of what the actor chooses to
do. Certainly these features strengthen the case for attribution of responsibility
for the bad effect. But they are not necessary to such attribution. In the first
(modified car-swerving) and the third (bartender) examples discussed earlier,
the foreseen bad effect of my action incorporates the agency of another person,
and this is something that I must take into account in deciding how to act and for
which I can be called to account. To be sure, in the first example, the bad effect
is directly and strongly connected to my swerving the car, but the connection
is less direct in the third example in which my serving someone alcohol might
be a sufficient but not a necessary condition of his being intoxicated and also
an indirect cause of his driving in this condition. In both these examples,
attribution of responsibility for the bad effect to my own agency is due in large
part to the enabling role that my positive action plays in the further agency of
another person. And this will also be true of cases of dual use.
At this point I would like to suggest that I am responsible for the bad effect of
another person's further agency, in the sense that I must take it into account in
deciding how to act and I can be accountable for it (and will bear a degree of
culpability for it in the absence of justification or excuse) if at the time of my
own prior action: 1) I can reasonably foresee that another agent will bring about
this effect; and 2) in acting as I do I provide someone else with the means or the
opportunity to bring about this effect.13 Discussions of dual-use dilemmas tend to focus on the first of these two conditions, that is, on the extent to which another person's further agency is reasonably foreseeable and on the probability
of the suspected bad effect actually occurring. In drawing attention to the
significance of the second condition, I hope to emphasise that where these two
conditions are met, although the bad effect is indirect and due to the agency of
12 Adherents to the DDE regard our obligation to prevent harm as generally less stringent than our obligation
not to do harm. Nonetheless, they also hold that it can be wrong not to prevent harm to an innocent person,
particularly where this is an intended effect of one's inaction. It can be important to the permissibility of some instances of non-prevention of harm (for example, some decisions not to rescue) that the harm, although foreseen, is not intended.
13 Note these two conditions say 'if' and not 'only if'. Some people would also regard me as responsible in this sense for what the Gestapo does to the innocent hostage in the second example. But a person's responsibility for a foreseen outcome of her negative action is a much more contentious issue, especially where it also involves the further agency of someone else, and needn't be taken up here.
Concluding remarks
I have maintained in this chapter that dual-use dilemmas are not paradigm
instances of double effect and I have identified significant similarities and
dissimilarities between the two. All the same, our critical thinking about whether
the DDE and its conditions are indeed relevant to dual-use dilemmas has served
to highlight some morally important aspects of the ethics of dual use that might
otherwise not receive sufficient critical attention. These include the relevance
of a distinction between foresight and intention to the moral permissibility
of an action with both good and bad effects; the conditions under which we
are accountable for the further agency of other people who bring about a bad
effect; and the relatively stringent agent-related obligation to prevent harm or
wrongdoing by others where ones action would provide them with the means
or the opportunity. A suitably morally complex ethical evaluation of dual use
needs to address these various issues. But in a more sceptical vein we can ask
whether we need to refer to the DDE in order to do this.
The DDE is centrally concerned with the moral relevance of a distinction between
intention and foresight, since this distinction is held to be crucial to permissible
action in cases of double effect. It is clear in the case of a dual-use dilemma
that the foreseen bad use is not intended by the agent who faces the dilemma
(although it may well be intended by the further agent who engages in the bad
use). According to the DDE, the fact that an actor does not intend a foreseen
bad effect of her action can be highly significant to the moral permissibility
of what she does. This then will also be a consideration in deliberation about
the permissibility of activities that are potentially dual use. The DDE also
emphasises that in cases of double effect the fact that the bad effect is foreseen
is itself morally significant to the permissibility of the action. The actor must
take the bad effect seriously into account in her deliberations and her action
must be justified in terms of a good effect that is sufficiently morally important
to warrant causing the bad effect. And as I have argued above, this can be so in
cases of double effect where the bad effect will be indirect and due to the further agency of another person.
state spaces are indeterminate. Both state spaces have consequences for decision-making, and these are elaborated in this section. The third section begins by pointing out the dangers in restricting judged probabilities and utilities to falsely precise representations, and by describing the decisional consequences of imprecision. It then brings in psychological considerations such as tendencies towards overconfidence in predictions and confirmation bias.
The discussion of these issues involves a bit of mathematics and some technical
definitions. Both the mathematics and technicalities are necessary, and these are
far from merely academic matters. Dilemmas were not well understood until a
mathematical framework was developed for describing them and distinguishing
them from non-dilemmatic trade-offs, and that development transformed the
economics of public goods and common-pool resources as well as our ideas about
individual versus collective rationality. It also provided heretofore unachievable
insights into the evolution of cooperation, and has spawned a vast literature
in economics, political science and psychology. Likewise, not knowing what all the possible outcomes are is commonplace in real-world decision-making in the face of a largely unknowable future, and is routinely ignored by decision-makers and standard decision frameworks. A systematic consideration of the
consequences thereof includes asking which options (and how many) should be
on the table for responding to dual-use dilemmas. The framework presented
in this chapter has been applied to debates about this issue in law, medicine
and related policy matters. Finally, a careful assessment of the consequence of
uncertainty requires understanding that there are different kinds of uncertainty,
with distinct consequences for reasonable decision-makers. A growing literature
on this topic includes demonstrations of its practical impact in domains such as
insurance.
One reviewer of an earlier draft of this chapter declared that I indulged in 'make-believe' that mathematics can contribute to our understanding of dual-use dilemmas and what to do about them. I have three rebuttals to this dismissive remark. First, as I hope the previous paragraph has made abundantly clear, the mathematics in this chapter are about as far from make-believe as it is possible to be; they concern some of the most important uncertainties in real-world decision-making. Second, without even the simple mathematics that informs this chapter, it is nigh on impossible to properly understand and deal with those uncertainties; words simply will not suffice. Finally, failing to distinguish a trade-off from a dilemma, failing to take into account the fact that we do not know what new technologies will be invented and yet often underestimate the likelihood of their emergence, and not bothering to think carefully about whether we need two, three, 10 or 1000 alternative ways of responding to dual-use dilemmas are all costly mistakes in practice.
Dilemmas or trade-offs?
When are the dilemmas actually social dilemmas, as opposed to trade-offs?
Genuine social dilemmas are harder to resolve than trade-offs. They also present
a fundamental difficulty for rational, self-interested agents because the pursuit
of self-interest in a social dilemma leads to the destruction of the common good.
Moreover, the structure of a social dilemma partly determines the approaches
needed to resolve it.
First, social dilemmas are social. They involve a game structure comprising at
least two decision-makers. Some dual-use dilemmas do not readily yield such
a game structure because they are cast as single-agent decisions. An example
is the concern that research conducted for beneficial purposes might be used
by secondary researchers or other users to construct bioweapons. If these users
would not be able to exploit the research if it were not conducted then the
situation reduces to a single-agent decision:
Research: potential benefits and risk of exploitation
versus
No research: potential costs and no risk of exploitation.
While this decision may be difficult, it is not a social dilemma or even a dilemma in the sense of 'damned if you do and damned if you don't'. Instead, this is
arguably a trade-off wherein each option combines potentially strong positive
and strong negative consequences.
Dual-use dilemmas can become social dilemmas involving multiple agents if the
decisions made by each agent alter the consequences for all of them. Biological
research as an arms race is perhaps the most obvious example. For instance, if
researchers in country A revive an extinct pathogen and researchers in country
B do not, country A temporarily enjoys a tactical advantage over country B while
also risking theft or accidental release of the pathogen. If country B responds
by duplicating this feat then B regains equal footing with A, but has increased
the overall risk of accidental release or theft. For country A, the situation has
worsened not just because it has lost its tactical advantage but also because
the risk of release has increased. Conversely, if A restrains from reviving the
pathogen then B may play A for a sucker by reviving it. It is in each country's
self-interest to revive the pathogen in order to avoid being trumped, but the
collective interest resides in minimising the risk of accidental or malign release.
A similar example of a social dilemma is where countries A and B are considering
whether to eliminate their respective stockpiles of smallpox. The payoff matrix
is shown in Table 11.1. The entries are
R = reward
T = temptation
S = sucker
P = punishment.
This matrix enables a definition of a social dilemma. A social dilemma exists if
these four conditions hold:
R>P
R>S
2R > T + S
T > R or P > S.
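These conditions can be checked mechanically. A minimal sketch in Python (the example payoff numbers are illustrative and not taken from the chapter):

```python
def is_social_dilemma(R, T, S, P):
    """Check the four payoff conditions for a social dilemma:
    R > P, R > S, 2R > T + S, and (T > R or P > S)."""
    return R > P and R > S and 2 * R > T + S and (T > R or P > S)

# Classic Prisoner's Dilemma payoffs (T > R > P > S) satisfy all four:
print(is_social_dilemma(R=3, T=5, S=0, P=1))   # True
# A structure with neither temptation (T > R) nor fear (P > S) fails:
print(is_social_dilemma(R=3, T=2, S=1, P=0))   # False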
Table 11.1 Payoff Matrix (rows: country A's choice; columns: country B's choice)

                B: Eliminate    B: Retain
A: Eliminate    Ra, Rb          Sa, Tb
A: Retain       Ta, Sb          Pa, Pb
Table 11.2 Payoff Ranks (4 = best, 1 = worst)

                B: Eliminate    B: Retain
A: Eliminate    3, 3            2, 4
A: Retain       4, 1            1, 2
Table 11.2 makes it easy to see the roles played by greed and fear in a social dilemma. Each country can obtain its best outcome (rated 4) by retaining its supply if the other country eliminates theirs. Country B's worst outcome (rated 1) and country A's second-worst result occur if each eliminates its supply while the other retains theirs. If both act on fear and/or greed and retain their supplies then the joint outcome is the worst of all four (rated 1 for A and 2 for B).
Different structures yield distinct pressures for and against eliminating smallpox stockpiles. A cooperation index K can be defined from the four payoffs, together with a fear component Kf and a greed component Kg; one standard formulation is illustrated in the sketch following this paragraph.
Thus, in Trust and Prisoner Kf > 0 whereas in Chicken Kf < 0, while in Chicken
and Prisoner Kg > 0, whereas in Trust Kg < 0. In Chicken, Greed is the component
detracting from motivation to eliminate stockpiles, and in Trust, Fear is the
detractor. Prisoner is the only dilemma in which both the Fear and the Greed
components exceed zero, so that both detract from motivation to eliminate. This
is why K generally is lowest for Prisoner.
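A minimal sketch of one standard formulation (the definitions Kf = P - S, Kg = T - R and K = (R - P)/(T - S) are common in this literature, but they are an assumption here rather than the chapter's own formula; the payoff numbers are illustrative):

```python
def components(T, R, P, S):
    """Fear and greed components and a cooperation index
    K = (R - P)/(T - S) (a standard form, assumed here)."""
    Kf = P - S   # fear: being the sucker (S) is worse than mutual punishment (P)
    Kg = T - R   # greed: temptation (T) beats mutual reward (R)
    return Kf, Kg, (R - P) / (T - S)

# Canonical orderings: Prisoner T>R>P>S; Chicken T>R>S>P; Trust R>T>P>S.
games = {"Prisoner": (4, 3, 2, 1), "Chicken": (4, 3, 1, 2), "Trust": (3, 4, 2, 1)}
for name, (T, R, P, S) in games.items():
    print(name, components(T, R, P, S))
# Prisoner: Kf=1, Kg=1, K=1/3; Chicken: Kf=-1, Kg=1, K=1; Trust: Kf=1, Kg=-1, K=1
```

Since T - S = (R - P) + Kf + Kg, positive fear and greed components both inflate the denominator and depress K, which is consistent with K being lowest for Prisoner.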
Another crucial characteristic of a dilemma is the public versus private
nature of the consequences. This strongly influences whether institutional
solutions such as privatisation are potential solutions for social dilemmas.
A good is subtractable if its use by one agent decreases the potential for its use by other agents.
for retaining smallpox supplies. This is because B's actual outcome ranks for
retention are {4, 2} and for elimination they are {3, 1}, whereas A will believe
they are {4, 1} and {3, 2} respectively.
Obviously there is much more to determining the nature of a dual-use dilemma's structure than has been dealt with in this section. The intention here is merely to provide a starting point by posing the question of whether a structure constitutes a social dilemma and, if so, what kind of social dilemma the structure corresponds to. The crucial difference between a social dilemma and a trade-off is that a social dilemma entails a conflict between individual and collective
interests that does not appear in trade-offs. It is plausible, therefore, that the
policies and procedures for dealing with dual-use dilemmas also will need to
distinguish between the two.
Partition indeterminacy
Nearly all formal decision-making frameworks, including SEU, assume
that all possible options and outcomes are known. In other words, the state
space is predetermined. The nature of innovative research implies that in at
least some dual-use dilemmas that assumption is untenable on three counts.
First, the potential outcomes of research often are not completely known. The
accidental creation of a mousepox super strain5 is a case in point, as is the
current state of ignorance regarding potential consequences of third- or fourth-generation nanomaterials. Second, the uses of research outputs also sometimes
are unanticipated. Witness the applications in cryptography of number theory,
a sub-discipline that once was held up as the epitome of pure mathematics
beyond reach of any applicability. Third, the variety of responses to the threat
of research misuse is not predetermined. The first two sources of state space
indeterminacy are matters to be taken into account by those who make judgments
and decisions, and this is the topic of the next subsection. The third, however,
can be a matter of choice, and this is discussed in the subsection thereafter.
The probabilities of the J events must sum to 1, meaning that the entire probability mass is
concentrated on that set of events. Thus, a more knowledgeable rational agent
who has assigned a probability of 1/2 to the first dog winning and 1/4 to the
second dog is compelled to assign the remaining 1/4 to the third.
The number of possible elementary events or states in a space is determined by
the partition of that space. The greyhound race has been partitioned into three
outcomes: dog one wins, dog two wins or dog three wins. Were we to allow
ties, the partition would expand to J = 7. The ignorant agent now would assign
a probability of 1/7 to each dog winning, and the more knowledgeable agent
could distribute the remaining 1/4 probability across the remaining five events
instead of having to allocate it all to the third dog winning. Thus, probability
assignments are partition dependent.
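A small sketch of partition dependence (Python; the uniform ignorance prior and the example numbers follow the text, while spreading the residual mass uniformly is just one possibility):

```python
from fractions import Fraction

def ignorance_prior(events):
    """Uniform ignorance prior over a given partition: 1/J to each of J events."""
    return {e: Fraction(1, len(events)) for e in events}

# Threefold partition: exactly one of three dogs wins.
print(ignorance_prior(["dog1", "dog2", "dog3"]))   # 1/3 each
# Sevenfold partition: three outright wins, three two-way ties, one three-way tie.
print(ignorance_prior(["1", "2", "3", "1&2", "1&3", "2&3", "1&2&3"]))  # 1/7 each

# The knowledgeable agent fixes P(dog1) = 1/2 and P(dog2) = 1/4; the remaining
# 1/4 must all go to dog3 under the threefold partition, but can be spread over
# five residual events (for example, uniformly) when ties are allowed.
remainder = 1 - Fraction(1, 2) - Fraction(1, 4)
print(remainder, remainder / 5)   # 1/4 and 1/20
```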
When partitions are indeterminate, partition dependence poses a problem for
subjective probability assignments. This is not the same problem as unknown
probabilities over a unique and complete partition (for example, where we know
that there are only red and black marbles in a bag but do not know how many
of each).6 It is more profound. In the absence of a uniquely privileged partition,
there is no defensible prior probability distribution to be constructed.
Two separable problems for partitions may arise. One is an incomplete account
of possible events. A unique and complete partition might be attainable in
principle, but we lack the necessary information. The other problem is the
absence of a privileged partition even when one has a complete account of
those possibilities. Shafer7 presented an example of this problem as a motivation
for the belief functions framework. He asked whether the probability of life
existing in a newly discovered solar system should be partitioned into {life, no
life} or {life, planets without life, no planets}. This issue arises naturally when
a decision must be made that involves a threshold or an interval on a continuum.
We shall revisit this particular problem in the next subsection.
Returning to the first problem, the most common situation confronting judges
or decision-makers is partial knowledge of the possible outcomes. We may know
some of the potential uses and misuses of a new biotechnology but not all of
them. We might even be willing to assign a subjective probability that party
X will misuse this technology in ways we can anticipate. But what probability
should we assign to X misusing the technology in ways we haven't anticipated?
Likewise, what probability should we assign to party Y finding a new way to
use the technology for good?
6 Smithson, M. 2009, 'How many alternatives? Partitions pose problems for predictions and diagnoses', Social Epistemology, vol. 23, pp. 347–60.
7 Shafer, G. 1976, A Mathematical Theory of Evidence, Princeton University Press, Princeton, NJ.
the next subsection,13 but these have limited scope. Other criteria could be linked
with strategies for manipulating and exploring judgment biases in informative
ways. As a simple example, expert judges estimating probabilities of adverse
consequences arising from the revival of an extinct pathogen could be randomly
assigned to one of two conditions: a twofold partition (consequence versus no
consequence) or a J-fold partition (a list of anticipated consequences plus a
catch-all category for unanticipated ones). Partition dependence would predict
that the average probability of an adverse consequence in the first condition
should be less than the average sum of the probabilities across the J consequence
categories in the second condition. The results would yield fairly defensible
lower and upper expert estimates of the probability of adverse consequences.
More sophisticated experimental designs would enable the construction and
estimation of relevant partition-dependence effects.
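An illustrative simulation of the predicted effect (a Python sketch; the anchoring-and-adjustment model, the weight lam, the number of categories and the even split of the true probability are all assumptions introduced here, not the chapter's):

```python
def judged(p_true, n_cells, lam=0.4):
    """Judged probability as a weighted average of the true probability and
    the 1/n ignorance prior; lam = 0.4 is purely illustrative."""
    return (1 - lam) * p_true + lam / n_cells

p_adverse = 0.10   # hypothetical true probability of any adverse consequence
K = 5              # adverse-consequence cells: anticipated ones plus a catch-all

lower = judged(p_adverse, 2)                                 # twofold partition
upper = sum(judged(p_adverse / K, K + 1) for _ in range(K))  # (K+1)-fold partition

print(f"twofold estimate ~ {lower:.2f}; summed J-fold estimates ~ {upper:.2f}")
# Partition dependence predicts lower < upper, yielding rough expert bounds.
```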
Finally, let us briefly consider non-standard probability frameworks that are not
partition dependent. These have appeared in the growing literature on generalised
probability theories, and also in behavioural economics.14 Walley15 argues on
normative grounds that imprecise probability frameworks can avoid partition
dependence entirely. He proposes that when judges are permitted to provide
a lower and an upper probability judgment (that is, imprecise probabilities)
every ignorance prior should consist of vacuous probabilities {0, 1}. In the
greyhound race example, the ignorant agent could assign a lower probability
of 0 and an upper probability of 1 to every event regardless of whether the
partition is threefold or sevenfold. The lower and upper probabilities of the first
dog winning would be 0 and 1 regardless of the partition, thereby avoiding
partition dependence. Walley developed an updating method (the imprecise
Dirichlet model) that is partition independent and has generated interest within
the community of imprecise-probability theorists.
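A minimal sketch of the model's interval rule (the lower and upper forms n/(N + s) and (n + s)/(N + s) are the standard statement of Walley's imprecise Dirichlet model; the racing numbers are hypothetical):

```python
def idm_interval(n_i, N, s=1):
    """Imprecise Dirichlet model: lower and upper probabilities for outcome i
    after N observations, n_i of them outcome i; s is the prior-strength
    hyperparameter (s = 1 and s = 2 are the usual choices)."""
    return n_i / (N + s), (n_i + s) / (N + s)

# With no data (N = 0) the prior is vacuous: lower 0, upper 1 for every
# event, whatever the partition; this is how partition dependence is avoided.
print(idm_interval(0, 0))     # (0.0, 1.0)
# After 10 races of which dog one won 4:
print(idm_interval(4, 10))    # (0.36..., 0.45...)
```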
That said, recent studies16 experimentally demonstrated that naive judges are
just as strongly influenced by partitions when making imprecise probability
judgments as they are when making precise probability judgments. Moreover,
they demonstrated that many judges anchor on 1/J as the midpoint of their
lower and upper probability judgments. No applicable de-biasing strategies have
13 Smithson, M. 2006, 'Scale construction from a decisional viewpoint', Minds and Machines, vol. 16, pp. 339–64; and Smithson, 2009, op. cit., pp. 347–60.
14 For example, Grant, S. and Quiggin, J. 2004, 'Conjectures, refutations and discoveries: incorporating new knowledge in models of belief and decision under uncertainty', Paper presented at the 11th International Conference on the Foundations and Applications of Utility, Risk and Decision Theory (FUR XI, Paris), under the joint auspices of the École Nationale Supérieure d'Arts et Métiers (ENSAM) and the École Spéciale des Travaux Publics (ESTP), Paris, 2 July.
15 Walley, P. 1991, Statistical Reasoning with Imprecise Probabilities, Chapman & Hall, London; and Walley, P. 1996, 'Inferences from multinomial data: learning about a bag of marbles', Journal of the Royal Statistical Society, Series B, vol. 58, pp. 3–34.
16 Smithson, M. and Segale, C. 2009, 'Partition priming in judgments of imprecise probabilities', Journal of Statistical Theory and Practice, vol. 3, pp. 169–82.
yet been reported. Nevertheless, the possibility remains that allowing judges to
express one kind of uncertainty (imprecision in their probability assignments)
may militate against the impact of another kind (partition indeterminacy).
17 Connolly, T. 1987, 'Decision theory, reasonable doubt, and the utility of erroneous acquittals', Law and Human Behavior, vol. 11, pp. 101–12.
18 Ibid., p. 111.
19 Smithson, 2006, op. cit., pp. 339–64.
20 For evidence that this also is what humans do, see Smithson, M., Gracik, L. and Deady, S. 2007, 'Guilty, not guilty, or …? Multiple verdict options in jury verdict choices', Journal of Behavioral Decision Making, vol. 20, pp. 481–98.
then the decision-maker should prefer act Rj over Rj-1. The odds threshold wj-1,j therefore is determined by the ratio of utility differences: wj-1,j = (Gj-1 - Gj)/(Hj - Hj-1).
Table 11.3 Twofold Partition of Acts

             R1 (Laissez F.)    R0 (Prohibit)
No misuse    H1 = 1             H0
Misuse       G1 = 0             G0
The simplest set-up of this kind is shown in Table 11.3. There are two possible acts: prohibition or laissez faire. Without loss of generality we may assign H1 = 1 (the best possible outcome) and G1 = 0 (the worst). Therefore, the odds threshold is w01 = q0/(1 - H0), writing q0 for G0, which rearranges to H0 = 1 - q0/w01. Suppose we also wish to restrict w01 > y0 > 1. This should seem reasonable, because we are merely restricting the odds-of-no-misuse threshold to be above 1. Then

H0 > 1 - q0/y0.

For example, if q0 = 0.1 and y0 = 10 then H0 > 0.99; and in fact if q0 = 1 and
y0 = 100 then we also have H0 > 0.99. Thus, no misuse under prohibition has
nearly as high a utility as no misuse under laissez faire, implying that prohibition
hardly decreases utility at all. Moreover, in the special case where prohibition
obviates misuse so that G0 = H0, a high odds threshold yields a correspondingly
high value for G0 and H0. For instance, y0 = 10 implies G0 and H0 both must
exceed 10/11.
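These bounds are easy to verify numerically (Python; the threshold form w01 = q0/(1 - H0) is the reconstruction used above):

```python
def h0_lower_bound(q0, y0):
    """Twofold partition with H1 = 1 and G1 = 0: requiring the odds-of-no-misuse
    threshold w01 = q0/(1 - H0) to exceed y0 forces H0 > 1 - q0/y0."""
    return 1 - q0 / y0

print(h0_lower_bound(q0=0.1, y0=10))    # 0.99
print(h0_lower_bound(q0=1.0, y0=100))   # 0.99
# Special case G0 = H0 (prohibition obviates misuse): w01 = x forces H0 > x/(x+1).
print(10 / (10 + 1))                    # 10/11 = 0.909...
```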
The problem is the inability to simultaneously have a high value of y0, a low q0
and a relatively low H0. The chief result is that a high (and therefore cautious)
odds-of-no-misuse threshold for invoking the prohibition of research requires a
belief that prohibition results in only a very small decrease in utility relative to
the improvement in the (dis)utility of misuse. As in the legal standard-of-proof
case, this difficulty arises because we have only two possible acts. A way around
this is to introduce a third act (middle option). Let us call it Regulate. Table
11.4 shows the utility set-up for this threefold partition.
Table 11.4 Threefold Partition of Acts

             R2 (Laissez F.)    R1 (Regulate)    R0 (Prohibit)
No misuse    H2 = 1             H1               H0
Misuse       G2 = 0             G1               G0
The w01 threshold now determines when the Regulate option is chosen over Prohibit, and a new threshold, w12, determines when Laissez Faire is chosen over Regulate. Now, restricting w12 > y1 implies H1 > (y1 - q1)/y1, writing q1 for G1, which in turn implies

H0 > 1 - (G0 - q1)/w01 - q1/y1.

Setting w01 = 5 and G0 = 0.5, for instance, and using the settings q1 = 0.1 and y1 = 10, gives

H0 > 1 - (0.5 - 0.1)/5 - 0.1/10 = 0.91.

If we are willing to lower the threshold to w01 = 2 and increase G0 to 0.68 then

H0 > 1 - (0.68 - 0.1)/2 - 0.1/10 = 0.7.
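The two worked examples, checked in the same way (Python; the bound is the reconstruction given above):

```python
def h0_bound_threefold(G0, q1, y1, w01):
    """Threefold partition with H2 = 1, G2 = 0 and q1 = G1: w12 > y1 gives
    H1 > (y1 - q1)/y1, which in turn gives H0 > 1 - (G0 - q1)/w01 - q1/y1."""
    return 1 - (G0 - q1) / w01 - q1 / y1

print(h0_bound_threefold(G0=0.5, q1=0.1, y1=10, w01=5))   # 0.91
print(h0_bound_threefold(G0=0.68, q1=0.1, y1=10, w01=2))  # ~0.70
```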
The threefold partition therefore can express a belief that outright prohibition
could substantially negatively affect research (in this last example, a decline in
utility from 1 to 0.7). Nevertheless, there are limits if we take certain additional
constraints into account. It seems reasonable to stipulate that misuse cannot
yield a greater utility than no misuse, so we impose the constraint G0 < H0. As
mentioned earlier, the case where G0 = H0 corresponds to the situation where
prohibition of research eliminates the possibility of misuse of its outputs, so that
there is no difference between the no misuse and misuse states. The restriction
G0 < H0 and the constraint w01 = 1 imply that H0 > 1/2. Higher odds thresholds
increase the lower bound on H0. It is easy to prove that the general relationship
is w01 = x implies H0 > x/(x + 1) = p01, the corresponding probability threshold.
In the two examples above, w01 = 5 implies H0 > 5/6 and w01 = 2 implies H0 > 2/3.
Thus, extreme cases where prohibition of further research would hardly alter
the (dis)utility of misuse of an existing technology impose severe restrictions on
the utility if there is no misuse. Table 11.5 shows a set-up like this, with similar
low values of G1 and G2. We would be inclined to set the odds thresholds w01
and w12 to be very high, say, w01 = 100 and w12 = 1000. The result would be
that H1 and H0 both would be very close to 1: H1 = 0.99999 and H0 = 0.99989.
Therefore, a substantial difference between H1 and H0 (say, due to the inhibition
of scientific progress) can only arise if there is a substantial difference between
G1 and G0 and relatively low threshold odds of no misuse, w01.
Table 11.5 Extreme Disutility of Misuse

             R2 (Laissez F.)    R1 (Regulate)    R0 (Prohibit)
No misuse    H2 = 1             H1               H0
Misuse       G2 = 0             G1 = 0.01        G0 = 0.02
Are there sets of utilities and threshold odds that could satisfy the intuition
that some security measures should be in place when there is only a very small
chance of misuse, but that severe restrictions on research will have a substantial
impact on scientific progress? What would these look like? Table 11.6 illustrates
a set-up similar to an earlier example that is compatible with these intuitions.
The bottom row shows the odds thresholds. Resetting w12 to values greater than
10 has relatively little impact on w01 (or alternatively on utilities G0 and H0 if
we wish w01 to remain at 2) because H1 is already close to 1 and G1 is close to 0.
And of course it is possible to solve for H1 and G1 such that w12 takes a specific
value greater than 10 while w01 is unaffected and remains at 2. Thus, in the
threefold partition of acts we are free to set w12 to very conservative (high)
values while still retaining flexibility regarding w01 or the utilities it comprises.
Table 11.6 Extreme Disutility of Prohibition

                  R2 (Laissez F.)    R1 (Regulate)    R0 (Prohibit)
No misuse         H2 = 1             H1 = 0.99        H0 = 0.6933
Misuse            G2 = 0             G1 = 0.1         G0 = 0.6933
Odds threshold                       w12 = 10         w01 = 2
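The thresholds shown in Table 11.6 can be recovered directly from its utilities (Python; the ratio-of-utility-differences form is the reconstruction given earlier):

```python
def w(G_prev, G_next, H_next, H_prev):
    """Odds threshold between adjacent acts:
    w_{j-1,j} = (G_{j-1} - G_j) / (H_j - H_{j-1})."""
    return (G_prev - G_next) / (H_next - H_prev)

H2, H1, H0 = 1.0, 0.99, 0.6933   # Table 11.6, no-misuse utilities
G2, G1, G0 = 0.0, 0.10, 0.6933   # Table 11.6, misuse utilities
print(w(G1, G2, H2, H1))   # w12 ~ 10
print(w(G0, G1, H1, H0))   # w01 ~ 2
```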
and compares gains and losses in utility as the decision-maker moves from one
act to another. A positive sum indicates greater risk sensitivity in choosing between acts for high values of j, and a negative sum indicates greater risk sensitivity in choosing between acts with low values of j. In Table 11.6 these log ratios are 3.39
and 1.78, so there is greater risk sensitivity in choosing between Regulate and
Prohibit than between Laissez Faire and Regulate. This is simply due to the
greater changes in H and G utilities as we move from Regulate to Prohibit.
Finally, given that the utility scales have no absolute lower or upper bounds, a
reasonable question to ask is whether some bounds are more useful or sensible
than others. The [0, 1] interval probably is not well suited to human judgments
because it lacks two features that have psychological significance: a reference
point representing the status quo and a distinction between being better off
or worse off than the status quo. A well-established empirical and theoretical
literature22 informs us that people judge the utility of future outcomes relative
to a reference point (usually the status quo) instead of in absolute terms, and
that they are more sensitive to losses than to gains.
Table 11.7 presents one way of rescaling Table 11.6 according to these
considerations. Suppose we assign 0 to represent the status quo and represent
the maximal loss by -100. Suppose also that we believe misuse of a research output under laissez faire would yield a loss that is 10 times the magnitude of the gains that could be realised if no misuse occurred. Then G2 = -100 and H2 = 10. The odds thresholds in Table 11.6 partially determine the remaining utility assignments. We require more than one constraint, so let us also stipulate that, under Regulate, the loss due to misuse is 10 times the gain with no misuse. The
end result reveals that we believe we will be worse off than the status quo under
the Prohibit option whether there is misuse or not, but that will be our best
option if the odds of misuse are shorter than 2 to 1.
Table 11.7 Rescaled Utilities from Table 11.6

             R2 (Laissez F.)    R1 (Regulate)    R0 (Prohibit)
No misuse    H2 = 10            H1 = 9.9         H0 = -26.4
Misuse       G2 = -100          G1 = -99         G0 = -26.4
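A quick check that the rescaled utilities preserve the Table 11.6 thresholds (Python; the negative signs follow the surrounding text's description of losses):

```python
# Table 11.7: status quo = 0, maximal loss = -100.
H2, H1, H0 = 10.0, 9.9, -26.4
G2, G1, G0 = -100.0, -99.0, -26.4
print((G1 - G2) / (H2 - H1))   # w12 ~ 10
print((G0 - G1) / (H1 - H0))   # w01 ~ 2
```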
22 Beginning with Kahneman, D. and Tversky, A. 1979, 'Prospect theory: an analysis of decision under risk', Econometrica, vol. 47, pp. 263–91.
Conclusions?
This chapter largely neglects ethical considerations, which may seem odd given the predominantly ethical nature of dual-use dilemmas. Ethical considerations have been set aside to enable a focus on some prerequisites for a fine-grained analysis of dual-use dilemmas, namely, a systematic investigation of specific unknowns that such an analysis would have to contend with. My hope is that ethicists will find useful guidance in this investigation, avoiding some of the pitfalls and traps awaiting the unwary.
28 For example, Russo and Kolzow, op. cit., pp. 17–32.
29 For example, Winman, A., Hansson, P. and Juslin, P. 2004, 'Subjective probability intervals: how to cure overconfidence by interval evaluation', Journal of Experimental Psychology: Learning, Memory, and Cognition, vol. 30, pp. 1167–75.
30 See, for example, Brown, V. A. 2010, 'Collective inquiry and its wicked problems', in V. A. Brown, J. Russell and J. Harris (eds), Tackling Wicked Problems through the Transdisciplinary Imagination, Earthscan, London, pp. 61–84.
31 See Lempert, R., Popper, S. and Bankes, S. 2002, 'Confronting surprise', Social Science Computer Review, vol. 20, pp. 420–40.
Some pertinent unknowns have not been dealt with here, so this chapter cannot
be taken as anything like an exhaustive survey. Nevertheless, we have examined
types of unknowns that are beyond the purview of standard decision theories,
such as state space indeterminacy and imprecision. We have seen that there
are genuinely different kinds of unknowns, not just different sources of the
same kind, and that these play distinct roles. One of the key emergent points
is that many of the unknowns in dual-use dilemmas (and in so-called wicked
problems) are interconnected. They can be traded against one another, and how
one unknown is dealt with has ramifications for other unknowns. Allowing
imprecision in probability assignments, for instance, offers a way of handling
state space indeterminacy. Conversely, choosing the right number of options
can rectify incompatibilities between preferences and decisional probability
thresholds.
We may never be able to attain precise quantification of costs, benefits and
probabilities of outcomes arising from dual-use dilemmas, so a fine-grained
analysis in that sense also is unachievable. After all, accidental findings and
consequences are legion in cutting-edge research and development, and so
irreducible unknowns such as the catch-all underestimation problem are likely
to dog policy formation and decision-making alike. Moreover, as I have argued
elsewhere,32 if we value creativity, discovery and/or entrepreneurship then we
shall have to tolerate at least some irreducible unknowns.
Nevertheless, as Head33 has pointed out, great uncertainty alone is not sufficient
to render a problem wicked in the sense used in most of the literature on
that topic. Wickedness also requires complexity and divergent or contradictory
viewpoints about the nature of the problem and preferences regarding alternative
outcomes. I have tried to show here that even rather simple formal analyses in
the form of thought experiments can frame and structure dual-use dilemmas in
useful ways that avoid some aspects of wickedness, so that at least some of our
psychological foibles can be taken into account and even overcome.
32 Smithson, M. 2008, 'The many faces and masks of uncertainty', in G. Bammer and M. Smithson (eds), Uncertainty and Risk: Multidisciplinary Perspectives, Earthscan, London, pp. 13–26.
33 Head, B. W. 2008, 'Wicked problems in public policy', Public Policy, vol. 3, pp. 101–18.
bats can be used to hit people over the head, as well as for the enjoyment of playing baseball; however, it is implicit in the use of the term 'dual use' in play in the academic literature that the potential harm in question is of a very great magnitude: for example, the potential of nuclear fusion to lead to the creation of the hydrogen bomb, and the potential of genetic engineering to lead to a super virus. Moreover, it is also implicit that the dual-use potential in question is not simply a repeat application of existing science or technology (as is the case with the construction of the first baseball bat) but genuinely new science or technology that can be used to provide a qualitatively or quantitatively new means of harming, for example: the invention of the first explosives, or creating
a highly virulent form of an existing much less virulent pathogen. Hitting a
person over the head with a solid object has been done since time immemorial,
so hitting someone over the head with a baseball bat is hardly a novel means
of harming. By contrast, blowing someone up with gunpowder was initially a
novel means of harming. Moreover, unlike the use of a baseball bat, the use of
an explosive device, such as a 10-tonne bomb, could harm on a very large scale.
Note that accidents involving science and technology, even accidents on a very
large scale such as the Chernobyl disaster, are not dual use in my sense since there
is no secondary evil user, although they may involve unethical behaviour such
as negligence with respect to safety precautions. Nor are weapons designed as weapons (for example, guns) instances of dual-use science and/or technology.
For even if their harmful use is intended to be ultimately for the good, such
weapons are in the first instance designed to harm; their use to harm is not a
secondary, but rather a primary, use.
One paradigmatic case of dual-use research was the biological research on the
mousepox virus. The dilemma was as follows.
Option 1: The scientists ought to conduct research on the mousepox virus
and do so intending to develop a genetically engineered sterility treatment
that combats periodic plagues of mice in Australia.
Option 2: The scientists ought not to conduct the research since it might lead
(and in point of fact did lead) to the creation of a highly virulent strain of
mousepox and the consequent possibility of the creation (by, say, a terrorist group contemplating a biological terrorist attack) of a highly virulent strain
of smallpox able to overcome available vaccines.
It is a dilemma since there are two options with good reasons in favour of both
and it is an ethical dilemma since these options are morally significant. In
essence the dilemma involves a choice between intentionally doing good and
foreseeably providing others with the means to do evil.
A second and more recent paradigm of dual-use research is the biological
research done on a deadly flu virus (A [H5N1]), which causes bird flu. Scientists
in the United States and the Netherlands created a highly transmissible strain
of this virus, albeit, as it emerged, a strain that is not as deadly as ordinary
H5N1. Crucially, the work was done on ferrets, which are considered a very
good model for predicting the likely effects on humans.
As with the mousepox case, there are two options, ethically speaking.
Option 1: The scientists ought to conduct research on the bird flu virus and
do so intending to develop vaccines against similar, but deadly, naturally
occurring and artificially created strains of H5N1.
Option 2: The scientists ought not to conduct the research since it will lead
to the creation of a virus that is transmissible to humans, and is unlikely to
lead to the development of vaccines against similar, but deadly, naturally
occurring and artificially created strains of H5N1.
Notice here that the scientific claim in option two that the research is unlikely
to lead to the development of vaccines contradicts the corresponding claim in
option one. In this respect the dilemma is unlike that posed by the mousepox
research. Moreover, if this scientific claim made in option two is correct then the
justification offered in option one for conducting the research collapses.
In such dual-use cases, the researchers, if they go ahead with the research, will have foreseeably provided the means for the harmful actions of others and, thereby, arguably infringed a moral principle (albeit their infringement might in some cases be morally justified). The principle in question is what we might refer to as the 'no means to harm' principle.3 Roughly speaking, this is the principle that rules out providing malevolent persons with the means to do great harm, a principle that itself ultimately derives from the more basic principle: do no harm.
The 'no means to harm' principle (NMHP) is the principle that one should not
foreseeably (whether intentionally or unintentionally) provide others with the
means to intentionally do great harm and it assumes: 1) the means in question
is a means to do great harm; and 2) the others in question will do great harm,
given the chance.
As with most, if not all, moral principles, the NMHP is not an absolute principle
and, therefore, it can be overridden under certain circumstances. For example,
it is presumably morally permissible to provide guns to the police in order that
they can defend themselves and others. Moreover, as is the case with most, if
3 This principle, or similar ones, is familiar in a variety of ethical contexts. See, for example, Scanlon, T. 1977, 'A theory of freedom of expression', in R. M. Dworkin (ed.), The Philosophy of Law, Oxford University Press, Oxford.
not all, moral principles, the application of the NMHP is very often a matter of
judgment. In the case of the NMHP the need for judgment depends in large part on
the uncertainty of future harms.
The dual-use dilemma is a dilemma for researchers, governments, the community
at large, and for the private and public institutions, including universities and
commercial firms that fund or otherwise enable research to be undertaken.
Moreover, in an increasingly interdependent set of nation-states (the so-called global community) the dual-use dilemma has become a dilemma for
international bodies such as the United Nations. The dilemma is perhaps most
acute in those areas of science and technology that operate on an engineering
or construction model (for example, synthetic biology, nanotechnology), as opposed to a description model, which restricts itself to the description of pre-existing entities and their causal and other relationships (for example, astronomy).
acquire and use anthrax and botulinum toxin, and actually carried out a number of terrorist attacks using the chemical agent sarin), by al-Qaeda (they attempted
to acquire and use anthrax) and by US Government employees in the so-called
Amerithrax attacks (involving the actual use of anthrax).
Given that a small number of animal, human and plant pathogens are readily
obtainable from nature, and that bioterrorists with some microbiological training
could use these to inflict casualties or economic damage, and especially given
the new possibilities provided by synthetic biology (involving the creation of
pathogens de novo), evidently there is a non-negligible bioterrorist threat, and
it is increasing.4 This threat is perhaps greatest in unregulated environments,
including in weak or failing states in which well-resourced international
terrorist groups are allowed to flourish.
Moreover, it would be naive to assume that the scientific community can be
entirely trusted to regulate itself in relation to dual-use problems. After all,
thousands of scientists have worked in the above-mentioned and other WMD programs (for example, biological weapons programs), and in doing so have had as their
institutional collective end the production of biological weapons. Accordingly,
these scientists are directly collectively morally responsible for the existence
of those weapons and, in the case of scientists working for authoritarian
governments, for enabling authoritarian regimes to possess them. Moreover, on
some occasions, as already noted, WMD have actually been used; accordingly,
the scientists involved in the development of these WMD are morally implicated,
even if only indirectly, in the harms caused by such use.
4 Miller, S. 2009, Terrorism and Counter-Terrorism: Ethics and Liberal Democracy, Blackwell, London, ch. 7.
efficient way of organizing the team, and that any attempts to coordinate
their efforts by directives of a superior authority would inevitably
destroy the effectiveness of their cooperation.5
Polanyi's view is that each scientist acts freely but does so
1. on the basis of the work of past scientists
2. with constant reference and adjustment to the work of other contemporary
scientists
3. in the overall service of a collective end of comprehensive knowledge (in the
sense of understanding) of the scientific phenomena in question.
So his conception is one of individual scientific freedom in the overall context
of intellectual interdependence in a joint epistemic project.
The epistemic character of scientific research is obvious: science aims at
knowledge or, more broadly, at understanding. What is perhaps not quite so
obvious, however, is that scientific work is, therefore, epistemic action. We
often contrast the acquisition of beliefs, knowledge and other cognitive states
with actionthe notion of action being reserved for behaviour or bodily action.
Action is intentional and voluntary; on the other hand, knowledge acquisition, it
is often held, is unintentional and involuntary. Consider perceptual knowledge.
If an object, say, a piece of litmus paper placed in a liquid solution in a test tube,
turns blue before our eyes then we come to believe that it has turned blue; we
have no choice in the matter, or so it seems.
This focus, however, on perceptual knowledge of ordinary middle-sized objects
provides a distorted picture of the scientific epistemic project for reasons that
will become clear in what follows. A further point to be made here picks up on
the joint or cooperative nature of scientific work described above. If scientific
research is epistemic action and, at the same time, it is joint action then scientific
research is joint epistemic action.
Naturally, this being an epistemic academic project, the participants are governed
by epistemic principles (for example, replication of experiments), peer-review
processes to ensure quality control, and uncensored publication.
The key notion in play here is what I am referring to as joint epistemic action.
What is joint action?
Joint action consists of multiple individual actions performed by multiple agents
and directed towards a collective endfor example, a team of workers building
the Empire State Building; a team of terrorists destroying the Twin Towers,
5 Polanyi, M. 1951, The Logic of Liberty: Reflections and Rejoinders, Routledge & Kegan Paul, London, p. 34.
killing thousands; and a team of scientists discovering the cure for cancer.6 A
collective end is an individual end that each of the participating agents has, but it is an end that no one agent can realise on his or her own. So each
agent acts interdependently with the other agents in the service of the same,
shared end: the collective end. Again, consider the collective end of a security
organisation such as the Federal Bureau of Investigation (FBI) whose members
may be jointly working to prevent harmnotably, great harms planned by
criminal organisations such as terrorist groups; or consider a team of scientists
working feverishly to develop an antidote for some infectious disease that is
reaching epidemic proportions.
Joint actions exist on a spectrum. At one end of the spectrum there are joint
actions undertaken by a small number of agents performing a one-off simple
action at a moment in timefor example, two lab assistants lifting some
equipment onto a bench. At the other end of the spectrum there are large
numbers of institutionally structured agents undertaking complex and often
repetitive tasks over very long stretches of historyfor example, those who
built the Great Wall of China, or biological scientists developing vaccines.
Joint activity within institutions typically also involves a degree of competition
between the very same institutional actors who are cooperating in the joint
activity. For example, life scientists within a university who are engaged in joint
research and teaching activity in the service of the university's institutional
goals are also competing for academic status within the relevant international
scientific community, and competing in part on the basis of their individual
contribution to those same institutional goals.
Moreover, in many institutional settings organisations compete with one
anotherfor example, business organisations in market settings. Here there is
joint activity at a number of levels. For one thing, each competing organisation
(for example, a single corporation) comprises a team of individual agents who
are cooperating with one another and jointly working to secure the collective
ends of the organisation (for example, a biotech company trying to maximise
market share). For another thing, each team (for example, each corporation) is
engaged in joint compliance with the regulatory framework that governs their
competitive market behaviourthat is, each complies with (say) the regulations
of free and fair competition interdependently with the others doing so and in
the service of ensuring the ongoing existence of the market in question. This
is consistent with the existence of a regulator who applies sanctions to those
organisations which breach the regulations, including safety regulations that
might be regarded as a costly and unnecessary impost on business (for example,
the select agent rule, some mandatory safety procedures in laboratories, and
6 Miller, S. 2001, Social Action: A Teleological Account, Cambridge University Press, Cambridge, ch. 1.
7 Miller, S. 2010, The Moral Foundations of Social Institutions: A Philosophical Study, Cambridge University Press, Cambridge, pp. 50–2 and ch. 2.
8 Miller, 2001, op. cit., pp. 56–9.
The methods of acquiring propositional knowledge are manifold but for scientific
knowledge they include observation, calculation and testimony. Moreover, the
acquisition of these methods is very often the acquisition of knowledge-how: for example, how to calculate, how to use a microscope, how to read an x-ray chart.
In the case of the engineering sciences there is an even more obvious and
intimate relationship between propositional and practical knowledge, since both
are in the service of constructing or making things. Thus in order to build an
aeroplane engineers have to have prior practical (how-to) knowledge and that
practical knowledge in part comprises propositional knowledgefor example,
with respect to load-bearing capacity. Moreover, this engineering model has
increasing applicability in new and emerging sciences such as synthetic biology
and nanotechnology. In the case of synthetic biology, for example, scientists can
develop new vaccines, enhance the virulence and transmissibility of existing
pathogens and even create new pathogens (albeit, presumably using elements of
existing pathogens as building blocks).
What counts as sufficient evidence for the possession of knowledge varies from
one kind of investigation and one kind of investigative context to another.
Thus a scientist seeking a cure for cancer would need his or her experimental
results to be replicated by other scientists and the putative cure (say, a new
drug) would need to be subjected to clinical trials before being made widely
available. A detective investigating a series of murders (for example, by the Yorkshire Ripper) will be focused not only on physical evidence but also on motive (a mental state) and opportunity. Moreover, the evidential threshold for being found guilty is 'beyond reasonable doubt'.
Whereas the acquisition of practical knowledge is readily seen as emanating
from action and, indeed, as being a species of action (knowledge-in-action), the
acquisition of propositional knowledge is a different matter; however, coming to
truly believe that p on the basis of evidence (that is, propositional knowledge acquisition) is action in at least three respects.
First, the agent, A, makes a decision to investigate some matter with a view
to finding out the truth; the action resulting from this decision is epistemic
action. For example, a detective intentionally gathers evidence having as an
end to know who the serial killer of prostitutes in Yorkshire is, that is, who the Yorkshire Ripper is. Thus the detective gathers physical evidence in relation to the precise cause and time of death of the Ripper's victims; the detective also
interviews people who live in the vicinity of the attacks, and so on. Here A has
decided that A will come to have a true belief with respect to some matter, as
opposed to not having any belief with respect to that matter: for example, a true belief with respect to who the Yorkshire Ripper is. A's decision is between
coming to have true belief and being in a state of ignorance, and in conducting
the investigation A has decided in favour of the former. Similarly, a scientist
seeking to discover the genetic structure of some organism makes a decision to
come to have a true belief with respect to this matter rather than remaining in
ignorance.
Second, the agent, A, intentionally makes inferences from As pre-existing
network of beliefs; these inferences to new beliefs are epistemic actions. For
example, a forensic scientist might infer the time of death of a murder victim on
the basis of her prior belief that rigor mortis sets in within 10 hours after death.
Third, in many cases A makes a judgment that p in the sense that when faced
with a decision between believing that p and believing that not p, A decides
in favour of p; again, A is performing an epistemic action. For example, our
detective, A, intentionally makes an evidence-based judgment (mental act) that
Sutcliffe is the Yorkshire Ripper (as opposed to that Sutcliffe is not the Yorkshire
Ripper) and does so having as an end the truth of the matter. Here, A is deciding
between believing that p and believing that not p; but A is still aiming at truth
(not falsity). A is not deciding to believe what he thinks is false. Similarly, our
forensic scientist makes an evidence-based judgment in relation to the cause of
death of the victim having as an end the truth of the matter. Here the scientist
is deciding between believing that the cause of death was x and believing that
the cause of death was not x (but was, say, y).
As is the case with non-epistemic action, much epistemic action (whether it is propositional or practical epistemic action or, more likely, an integrated mix of both) is joint action, that is, joint epistemic action. Joint epistemic action is
knowledge acquisition involving multiple epistemic agents seeking to realise a
collective epistemic end. For example, a team of scientists seeking knowledge of
the cure for cancer is engaged in joint epistemic action.10
In cases of joint epistemic action there is mutual true belief among the epistemic
agents that each has the same collective epistemic endfor example, to discover
the cure for cancer. Moreover, there is typically a division of epistemic labour.
Thus in scientific cases some scientists are engaged in devising experiments,
others in replicating experiments, and so on. So, as is the case with joint action
more generally, joint epistemic action involves interdependence of individual
action, albeit interdependence of individual epistemic action.
As we saw above, knowledge of the cure for cancer, for example, is joint epistemic
action that involves a collective epistemic end and also involves a division of
epistemic labour and interdependence of epistemic action. The further point
to be made here is that there is interdependence in relation to such collective
10 See Miller, 2010, op. cit., ch. 11.
epistemic ends. This is because, given the need for replication of experiments
by others, each can only know that p is the cure for cancer (to continue with our example) given that others also know this; that is, there is interdependence
in relation to the collective end of knowledge.
A collective epistemic end can be both a collective intrinsic good (and thus an end in itself) and the means to further ends. Knowledge of the cure for
cancer is a case in point. Such knowledge consists of propositional and practical
knowledge: knowledge of the cure for cancer and knowledge of how to produce
it. This knowledge, however, has a further (collective) end: the actual
production of the cure (say, a drug). And this end has in turn a still further
end, namely, to save lives.
If knowledge of the cure for cancer is a collective end in itself then it is not
simply a means to individual endsnamely, each having as an end that he/she
knows the cure for cancer. Rather it is mutually believed that knowledge of the
cure for cancer is a collective intrinsic good. In my view, however, moral beliefs
can have motivational force.11 In that case the mutual belief that knowledge of
the cure for cancer is a collective intrinsic good can have motivational force.
It follows from this that, as we saw with joint action more generally, joint epistemic action can be collectively self-motivating and does not necessarily
have to rely on prior affective states such as desires.
Each of these level-two actions is, however, already in itself a joint action with
component individual actions; and these component individual actions are
severally necessary and jointly sufficient for the performance of some collective
end. Thus the individual members of the mortar squad jointly operate the
mortar to realise the collective end of destroying enemy gun emplacements.
Each pilot, jointly with the other pilots, strafes enemy soldiers to realise the
collective end of providing air cover for their advancing foot soldiers. Further,
the set of foot soldiers jointly advances to take and hold the ground vacated by
the members of the retreating enemy force.
At level one there are individual actions directed to three distinct collective
ends: the collective ends of (respectively) destroying gun emplacements,
providing air cover, and taking and holding ground. So at level one there are
three joint actions – namely, the members of the mortar squad destroying gun
emplacements, the members of the flight of planes providing air cover, and the
members of the infantry taking and holding ground. Taken together, however,
these three joint actions constitute a single level-two joint action. The collective
end of this level-two joint action is to defeat the enemy; and from the perspective
of this level-two joint action, and its collective end, these constitutive actions
are (level-two) individual actions.
Importantly for our purposes here there are layered structures of joint epistemic
action. Consider a crime squad, comprising detectives, forensic scientists and so
on, attempting to solve a crime.
At level one, a victim, A, communicates the occurrence of the crime (say, an
assault) and description of the offender to a police officer, B. But A asserting
that p to B is a joint epistemic action; it is a cooperative action governed by
conventions – the convention that the speaker, A, tells the truth and the hearer
trusts the speaker to tell the truth.13
Also at level one, a couple of detectives interview the suspect to determine
motive and opportunity; the detectives are cooperating with one another in the
performance of a joint epistemic action the collective end of which is to discover
motive and opportunity.
Finally, at level one, a team of forensic scientists analyses the available physical
evidence – for example, the DNA of the blood samples of the offender found on the victim is matched to the suspect's DNA; the forensic scientists are engaged
in joint epistemic action to determine whether there is or is not a DNA match.
These three level-one joint epistemic actions are constitutive of a level-two joint
epistemic action – namely, the level-two joint epistemic action directed towards the collective end of determining who committed the crime. Accordingly, when each of the level-one joint epistemic actions is successfully performed then the level-two joint epistemic action is successfully performed – that is, the crime squad solves the crime.
13 Miller, 2010, op. cit., ch. 11.
Now consider an example of a large scientific project conducted by a number of
cooperating organisations and hundreds of scientists over many yearsnamely,
the Human Genome Project. The project involved multiple connected goals – collective ends – and multiple layered structures of joint action, including joint
projects in publishing, undertaken to realise those goals.
In fact most organisations are hierarchical institutions comprising task-defined
roles standing in authority relations to one another, and governed by a complex
network of conventions, social norms, regulations and laws. Consider a science
department in a university or the forensic laboratory in a police organisation:
both comprise heads of department, scientists, laboratory assistants and so on,
and the work of both is governed by scientific norms of observation, replication
of experiments, and so on. So most layered structures of joint action, including
joint epistemic action, are undertaken in institutional settings, and scientific
joint epistemic action is not an exception.
Institutions have de facto purposes/strategic directions – that is, collective ends,
such as to maximise shareholder profit (corporations), find a cure for cancer
(university research team), or build an atomic bomb (military organisation).
Institutions also have specific structures (hierarchical, collegial and so on)
and they have specific cultures (for example, a competitive, status-driven
ethos).14 In this connection consider scientific activity – for example, biological research – undertaken in three different institutional settings: that of the
university, the commercial firm and the military biodefence organisation. Some
of the principal purposes/strategic directions (collective ends) of commercial
firms – for example, to maximise shareholder profits – are quite different from,
and possibly inconsistent with, those of universities (for example, scientific
knowledge for its own sake), and quite different again from those of military
research establishments (for example, to save the lives of our military personnel).
Again, the hierarchical structures within a military research establishment are
quite different from the more collegial structures prevailing in universities; and
the structure of commercial firms is quite different again. The general point to be
made here is that scientific activity is not only a form of complex joint activity (a
layered structure of joint epistemic action); it is activity that is inevitably shaped
by the non-scientific institutional setting in which it is conducted – that is, by
the specific collective ends, structure and cultures of particular institutions.
Here we also need to stress the distinction between the de facto institutional
collective end, structure and/or culture and what it ought to be; cultures,
for example, can vary greatly from one organisational setting to another,
notwithstanding that the type of institution in question is the same or very
similar.
In the light of the above, we can distinguish the normative account of science
as a joint intellectual activity – for example, aimed at knowledge for its own sake – from science as a means to broader social ends (for example, vaccines to
save lives). Moreover, we can distinguish both from the normative account of
specific institutions in which science exists principally as a means – for example,
commercial firms (vaccines to make profit) and a military biodefence organisation
(vaccines to save the lives of our military personnel). Importantly, in the context
of discussion of dual-use concerns, we can distinguish within the normative
account of science (both at the level of joint intellectual activity and at the
level of specific institutions) between its positive ends (for example, knowledge
for its own sake and knowledge as a means to combat disease) and negative
ends (for example, harm prevention). Earlier we discussed the 'no means to harm' principle (NMHP) – namely, the principle that one should not foreseeably
(whether intentionally or unintentionally) provide others (for example,
bioterrorists) with the means to do great harm. Clearly the means in question
is essentially scientific or technological knowledge and this knowledge is the
product of joint epistemic action (indeed, joint epistemic action undertaken in
the context of multiple layered structures of joint epistemic action). Accordingly,
harm prevention in relation to dual-use concerns is, or at least morally ought to
be, a joint enterprise with respect to joint epistemic action; it is something that
biological scientists as members of their scientific community (or communities)
and specific institutions (for example, biology departments in universities,
biotech companies) morally ought to jointly address, including (presumably) by
way of education and regulation of their potentially harmful joint epistemic
action. But let us be clear on the moral responsibilities in play here.
(individual or joint) actions then those jointly responsible for the joint action in
question, and its outcome, may have diminished moral responsibility. Scientists
who engage in dual-use research that is subsequently used in the construction
of WMD may well have diminished responsibility for the harm caused by those
WMD. Diminished responsibility is not, however, necessarily equivalent to no
responsibility. Further points to be made here are as follows.
First, each agent may have full or partial moral responsibility jointly with others for the joint action x and/or its outcome. If, for example, five men
each stab a sixth man once, killing him, each is held fully morally (and legally)
responsible for the death even though no single act of stabbing was either
necessary or sufficient for the death. In some cases each agent might have full
moral responsibility (jointly with others) for some outcome, O – notwithstanding the fact that each only made a very small causal contribution to the outcome – in large part because each is held to have prior full institutional (including legal)
responsibility (jointly with others) for O.
On the other hand, each agent might have partial and minimal moral
responsibility jointly with others if each only makes a very small and incremental
contribution as a member of a very large set of agents performing their actions
over a long period – for example, the scientists who worked on the Human Genome Project.
Second, we need to distinguish cases in which agents have collective moral
responsibility for some joint action or its outcome from cases in which agents
only have collective moral responsibility for failing to take adequate preventative
measures against O taking place. Many untoward dual-use cases are of the latter
kind.
On the other hand, agents may not have any collective moral responsibility
with respect to some foreseeable morally significant outcome, O, if O has a low
probability, takes place in the distant future and involves a large number of
intervening agents.
The collective moral responsibilities of scientists are multiple. Scientists have
a collective institutional (professional) and moral responsibility as scientists to
acquire knowledge for its own sake.
Scientists functioning in universities also have a collective institutional and
moral responsibility to acquire knowledge for the good of humanity – for
example, vaccines for poverty-related diseases.
Regulatory frameworks
In the light of the dangers stemming from the dual uses of science and technology
there is a need to consider a range of regulatory measures to reduce the risks.
Such measures include the imposition of limits on dual-use experiments and on
the dissemination of potentially dangerous information resulting from dual-use
discoveries. These measures themselves exist on a spectrum ranging from the
least intrusive/restrictive to the most intrusive/restrictive.15
Some specific regulatory measures that might be considered include the
following ones.
1. Mandatory physical safety and security regulation: Should there be
regulations providing for physical safety and security in relation to the
storage of, the transport of and physical access to samples of pathogens,
equipment, laboratories and so on? And should compliance with such
regulations be mandatory? In both cases the answer is presumably in the
affirmative.
2. Licensing of dual-use technologies/techniques: Should there be mandatory
licensing of dual-use technologies/techniques/pathogen samples? Only certain
laboratories in the public sector and the private sector might be licensed to
15 For a detailed treatment of these and other options, see Miller and Selgelid, 2008, op. cit.
Collective-action problems
Thus far we have characterised the scientific enterprise as essentially a joint
epistemic one: the emphasis has been on intellectual cooperation to achieve
common scientific (epistemic) goals in an institutional context of minimal
restrictions on the freedom of individual scientists. This picture, however, while
acceptable as far as it goes, is an oversimplification. Specifically, it obscures the
competitive dimension of scientific activity and, in particular, it masks various
poison King Pyrrhus.10 And Samuel von Pufendorff (1632–94) wrote that 'the more civilized nations' condemn certain ways of inflicting harm on an enemy: 'for instance the use of poison'.11
Back to just-war thinking in general. New times and new situations call for revisiting which moral and legal norms should be weighed, and how, in seeking to answer the question of whether (and under what conditions) the use of violence is justified. Sometimes developments lead to more than marginal adjustments of assessments. So after World War II, the question was posed of whether the
invention of nuclear weapons led to novel considerations in relation not only
to the possible use of these weapons, but also to deterrence. Is it, for instance,
morally justified to threaten their use if this would go against all prevailing
criteria regarding what is appropriate in terms of violence? What if deterrence
is the only way to prevent the actual use of nuclear weapons? Such complex
questions raised by this case could lead to the conclusion that the just-war
tradition has lost its relevance and significance.12 But such a view is too rash, given that it is based on only one – however important and shocking – development in the waging of war. In the heyday of the Cold War, (too) many so-called
conventional wars were fought in which just-war criteria could and should
have provided good guidance. More than this: just-war thinking could provide
criteria for moral argument on the paradoxes of nuclear deterrence, as is shown
in the development of a theory of justified deterrence.13
Just-war criteria
In the just-war tradition, a distinction is made between the so-called jus ad
bellum and the jus in bello. The jus ad bellum indicates whether and when it can be justified to start a war. Criteria one to five below refer to this jus ad bellum. The sixth belongs to both domains and the remaining two criteria deal with
the question of the jus in bello: what behaviour is permitted or forbidden during
a war. More recently a third domain has been added: the jus post bellum. There
is much debate about whether this new distinction is useful and necessary for
judging modern wars. Elsewhere I have questioned the utility of a jus post
bellum.14 Setting aside this wider question though, the jus post bellum will not
be considered in this chapter, because it is not very relevant for biosecurity
issues.
Jus ad bellum
1. The war must be waged by a legitimate government
This criterion expresses the assumption that only a legitimate government has
and should have the monopoly on violence. In a sovereign state the government
and the government alone is allowed to use violence. This may seem obvious, but in practice things are often less clear-cut, because the legitimacy of a
government is not always undisputed. Much violence (for example, in civil
wars or anticolonial wars) stems from conflicts over who holds legitimate authority. Often it works out that legitimacy can be decided only post hoc: the winner takes all. In addition, the notion of legitimate authority has had varying meanings over time. Thomas Aquinas had a quite different concept from Hugo Grotius, and in our days too definitions are changing. But the fact remains that the notion of legitimate authority retains a certain core. By convention – actually since the Peace of Westphalia in 1648 – sovereign states hold the
monopoly on violence.15 States and government embody the public domain and
because of that they differ from all other (private) groups and entities. Given
all disputes and discussions about sovereignty, this first just-war criterion
certainly is not ideal, but something better is not available. If the public domain
collapses, the war of all against all remains a possibility, even today. And in
such a situation sovereign authority will be sought. Recent examples can be
found in so-called failed states, where in practice the central government has
virtually disappeared. Many people in that situation eventually accept the
authority of terrorist movements, because – in the view of powerless groups –
their violent and arbitrary exercise of power is always better than the absence
of any authority. To give an example: it is no wonder that in Afghanistan many
people accept (which is not the same as support) the authority of the Taliban
instead of what they see as a condition of anarchy.
2. The war must be waged for a just cause
Fighting war for a just cause is the core of the jus ad bellum. Influenced by
the Charter of the United Nations (1945), the interpretation of what counts
under this category has become more and more restricted in recent times. Only
a reaction against aggression or an intervention that is sanctioned by the UN
Security Council is taken as a legally acceptable basis for waging war. This means that – to give some examples – retribution or the recovery of previously suffered injustices are not acceptable justifications.
14 van der Bruggen, 2009, op. cit.
15 Schrijver, N. 1998, 'Begrensde soevereiniteit – 350 jaar na de Vrede van Munster' [Limited sovereignty – 350 years after the Peace of Münster], Transaktie, vol. 27, no. 2, pp. 141–74.
In a sense the interpretation has also been extended. Since the end of the Cold
War the idea has become widespread that the international community has
the right – if not the duty – to use military force to intervene in cases of gross
violations of human rights. The intervention in Kosovo in 1999 serves as one
example, even though a UN Security Council authorisation was missing. A
further extension came after 11 September 2001, when the war against terror
was added to the list of just wars by many people. The invasion of Afghanistan
to expel the Taliban regime was not explicitly mandated by the UN Security
Council, but the American action was widely seen as a legitimate form of selfdefence supported by resolutions that were adopted after 9/11.16 In relation to
biosecurity the question arises whether and under which conditions the development
or possession of biological weapons can be a reason to start a war. This is not
a purely academic question as is shown below in the heated debates about the
American–British intervention in Iraq.
3. The war must be waged with a right intention
This criterion is in line with the previous one, but it goes further. It says that it
is not justified to have secondary intentions when fighting a war for a just cause.
The only intention should be that just cause (such as resisting the aggressor). If the
defender succeeds in that objective, he must stop fighting. He is not allowed
to aim at military or economic destruction of the opponent. According to the
classical tradition, feelings of revenge also should play no role. Most recent wars
have shown that this standard proves unrealistic. In fact, in most wars, intentions
other than the official ones can be discerned. Economic (oil), political or
personal secondary intentions are never far away. If and how these secondary
intentions played a part in the Iraq war – formally intended to remove biological and chemical weapons – will be illustrated below.
4. All other means of conflict resolution must be exhausted; war is
the last resort
This criterion is designed to prevent governments from taking up arms too
quickly. The question that arises is: when is 'too quickly', and who is to decide?
Parties that are contemplating force would likely argue that it is the only
solution. The relevance of this criterion is that it forces the party contemplating war to make the case for why war is necessary. Is a military intervention the
16 For example, United Nations Security Council (UNSC) Resolution 1373 (2001): <http://daccess-dds-ny.
un.org/doc/UNDOC/GEN/N01/557/43/PDF/N0155743.pdf?OpenElement> (viewed 11 January 2011).
211
last remaining option? Are all other means really exhausted? These kinds of
questions have to be asked and answered. The problem is that the answers to
these and similar questions can never be given with mathematical certainty.
Fortunately, history also shows many examples where war could be prevented.
Humankind has repeatedly escaped the battlefield.17 Of course we owe this to
several causes, but certainly the message of the just-war tradition – and more specifically of this criterion – played a role in the idea that starting a war is a
decision that has to be avoided if possible. This can be illustrated by the way
political leaders try to persuade others that their war is justified and that they
do not have any other possibility. One need not believe what is said to still note
the importance of making a justification. Peace is the rule; war (unfortunately
all too often) the exception.
5. There must be a reasonable chance that the intended purposes
are reached through the war
The purpose of this criterion is to prevent a government from starting a war
if it is clear beforehand that the intended purposes cannot be achieved. At least one objection to this criterion arises immediately: it seems to support the side of the strongest party. Yet military superiority is not always a guarantee of
victory. The most famous example in recent history is the Vietnam War. Despite
overwhelming military superiority, the Americans did not manage to reach their
goals. Especially in a humanitarian intervention, the militarily stronger party
does not always have the greatest interest in continuing the fight. The tragedy
of Srebrenica seems an example. If the UN Protection Force (UNPROFOR, the
UN troops in Bosnia) had used all available resources, the fall of Srebrenica could
have been prevented. Among other factors, the fact that the self-interest of the
Netherlands and the Dutch military was not at stake certainly played a role. The
professional soldiers of Dutchbat had signed up for the army and realised that an ultimate consequence of this was to die, '[b]ut dying for Srebrenica. It was not worth it.'18
6. The objectives and means of war must be in a reasonable
relationship to each other. This is the principle of proportionality
The principle of proportionality can relate to both the jus ad bellum and the
jus in bello. In the first case, the principle refers to waging the war: is there a
proportionate relationship between its (expected) cost and the aim to be reached?
During the war the principle can be applied to judge if a concrete action or the
17 Gerrits, A. and de Wilde, J. 2000, Aan het slagveld ontsnapt. Over oorlogen die niet plaatsvonden, Waburg
Pers, Zutphen. This book is about wars that did not take place.
18 Westerman, F. and Rijs, B. 1997, Srebrenica. Het zwartste scenario [Srebrenica. The Blackest Scenario],
Atlas, Amsterdam/Antwerpen, p. 11. Quote from a Dutchbat soldier.
19 This was the reason that some parents of Dutch soldiers who had been killed in Afghanistan were among the people who resisted the return of Dutch troops. The same argument can be read in the memoir of George W. Bush, where he describes his meetings with parents of killed soldiers: George W. Bush 2010, Decision Points, Crown, New York, ch. 12, pp. 355–94.
20 von Clausewitz, C. 1982, Over de oorlog [On War], Het Wereldvenster, Bussum, p. 15.
In the Lebanon war, the Israeli army caused most of the civilian deaths,
but some (or many) of the villages it attacked were being used by
Hezbollah as bases for rocket attacks on Israeli cities, so the greater
part of the responsibility for civilian deaths in those villages lay with
Hezbollahas did the greater part of the responsibility for the war
itself, which began with a rocket barrage and a Hezbollah raid across
the international frontier. Israel is responsible for deaths caused by
unjustifiable bombings like the Qana raid or by the cluster bombs used
late in the war. But (again) it shouldn't be the proportionality argument
that guides our judgment of those deaths; they were wrong whether or
not they were disproportionate. In Vietnam, Kosovo, and Lebanon, it is
the balance of responsibility that is morally determinative.22
24 'Iraq resolution 1441 advice – original memo', BBC News, 7 March 2003, <http://news.bbc.co.uk/1/
shared/bsp/hi/pdfs/28_04_05_attorney_general.pdf> (viewed 4 October 2010).
25 Dutch Committee of Inquiry on the War in Iraq 2010, Report of Dutch Committee of Inquiry on the
War in Iraq, 'Chapter 8: The basis in international law for the military intervention in Iraq', Netherlands International Law Review, vol. LVII, no. 1, pp. 126–8.
26 Ibid., pp. 87–90.
27 Although not all (applications of) biological weapons lead to mass destruction, I will follow the usual
terminology for chemical, biological, radiological and nuclear (CBRN) weapons as weapons of mass destruction.
not least the desire of President George W. Bush to finish the job of his father.
Moreover, it is well known that Tony Blair had an almost religious zeal to restore
human rights in Iraq.
Were all other means of conflict resolution exhausted and was war indeed the
last resort?
This fourth criterion of the just-war tradition played an important role in the discussions about the Iraq war. Opponents never tired of pointing to the efforts
of the International Atomic Energy Agency (IAEA) and the UN Monitoring,
Verification and Inspection Commission (UNMOVIC) in the search for possible
WMD. Both organisations had confidence in the success of their efforts. Because
of that they were opposed to the strategy of Bush and Blair, which was directed
at starting a conflict – almost independently of the results of the IAEA and
UNMOVIC. The United States and the United Kingdom had, however, developed
their plan of attack and they were not willing to make that plan dependent on
the possible results of the inspections. The planning of the invading countries
was related more to practical circumstances in and around Iraq.28
Looking at the Iraq war, it seems that biological weapons hardly played any role
in the final decision. But in the declaratory policy to justify the war as well as
in the long lead-up to the war, the issue certainly was important. It helped to
make war acceptable: if a country possesses and even uses chemical or biological
weapons, it must be a very despicable regime and thus war is an acceptable way
to get rid of that regime. In other words: the (supposed) possession of WMD as
such is seen as a casus belli.
The next criterion (number five) that should be taken into account is that there
must be a reasonable chance that the intended purposes are reached by waging
war. If the purpose of a war is eliminating possible stocks of WMD or dual-use
capacities, the feasibility of this purpose is not only dependent on the outcome
of the war. Of course it is helpful and perhaps even necessary to have military
superiority in those areas where the WMD or other materials are stockpiled,
but in addition an intensive scientific survey has to be set up. Anyone who
takes a look at the complexity of the activities of the UN Special Commission
(UNSCOM), UNMOVIC, the IAEA and – after the war – the Iraq Survey Group
gets an impression of the work that has to be done to find, identify and eventually
eliminate WMD materials. The inspectors Hans Blix (UNMOVIC) and Mohamed
ElBaradei (IAEA) had repeatedly declared that they would be able to finish their
complex job without military intervention. This makes the argument that war is
necessary to reach the goals of eliminating biological (and nuclear or chemical)
weapons at least implausible. Assuming the hidden goals or intentions of the
28 Woodward, R. 2004, Plan of Attack, Simon & Schuster, New York.
217
Iraq war, it can be argued that the goal of regime change indeed was reached.
Within two months the regime of Saddam Hussein collapsed. But, as we know
now, this was not the end of the fighting.
What can be said of proportionality (criterion six) as an argument for whether
or not to start a war? Proportionality as a criterion of the jus ad bellum
becomes relevant if – according to the other ad bellum criteria – starting a
war can be justified. Proportionality in that case is an added criterion to judge
if the relation between purposes and means is balanced. There is no need to
consider the proportionality criterion if it has already been determined that the
other ad bellum arguments define a war as unjust. Given the fact that despite
all counterarguments the Iraq war took place, the jus in bello context of the
proportionality criterion is applicable. But how to apply it to the specific aspect
of biosecurity – or, more broadly, biological weapons – is not very obvious. The Iraq war – once started – led to many actions that were not and could not be foreseen. This war is no exception to the rule that a war creates its own judgments on proportionality: the longer a war lasts, the less readily some actions are judged to be disproportionate. The formal link with the search for biological weapons drifted
out of sight and out of mind.
Finally, both parties have violated the noncombatant principle many, many
times during the Iraq war. Many civilians were killed by Iraqi militants as well
as by coalition troops.29 But it is not possible to link these violations and the
biosecurity issue. The only thing that can be said is that this war – at least partially – was based on handling the problem of weapons of mass destruction;
however, this was a very exceptional situation. This raises the question of whether and how just-war criteria are relevant for everyday biosecurity policy. This question will be addressed by looking for elements of this policy to which just-war criteria could or should be applied.
Biodefence as bio-offence
According to Article I of the BWC, state parties are allowed to undertake activities for 'prophylactic, protective or other peaceful purposes'. This implies that biodefence
is allowed as far as this is limited to protective or peaceful purposes. But who is
to decide what purposes are protective and peaceful? In practice biodefence can
coincide with bio-offence. And even this could be defended with an argument
that is derived from the debate on nuclear deterrence. The argument could
be that having the ability to produce biological weapons will deter another
state or non-state actor from using their biological weapons because of fear of
retaliation. This of course was the logic of the nuclear-deterrence policy during
the Cold War. In those days many debates were devoted to the question of
whether nuclear deterrence was justified from a moral point of view: was it
suffices to draw attention to these kinds of risk. Although such accidents do not
take place during a war, the collateral damage that almost by definition will be
caused by biological weapons can be seen as a violation of the noncombatant
principle of the just-war tradition.
Conclusion
This chapter started with the observation that biosecurity and the just-war
tradition occupy separate worlds. This observation has been confirmed. There
is not much overlap between the two. But although the overlaps are few, some clear lines can be drawn.
Biological weapons are in the category of weapons of mass destruction, but
biological weapons have much less military value today in deterrence and in
practice than nuclear weapons. The most important example of a war that was
waged for reasons that were linked to biological weapons was the Iraq war of
2003. But this link existed more on paper and in the declaratory policy than in
reality. In so far as the argument was used, it cannot be accepted as a just cause
for the war.
This does not mean that developing and storing biological weapons are justified
from a just-war perspective. Most of these weapons (certainly the ones with
contagious agents) are by definition indiscriminate, and using them would be
a violation of the noncombatant principle. The same argument also applies to a possible bioterrorist use of biological agents. And for the category of non-indiscriminate biological weapons, the argument against their use can be found
in the inhumane character of these means. From a more military point of view,
the argument of Valerius Maximus (armis bella non venenis geri: wars are fought
with weapons, not with poison) still can be seen as valid.
The consideration that possession of biological weapons could be legitimated for
reasons of deterrence is refuted by the current political and military situation.
Because of the BWC, which became possible because of the limited military
value of biological weapons, there is no credible argument that these weapons have a deterrent function.
comparison of any form of the PP with CBA, taking the view that, while CBA has
been developed to be applicable in circumstances where we need to make policy
decisions under risk, the PP has been developed to address circumstances of
uncertainty. The distinction between risk and uncertainty goes back to Knight.16
In his terminology, situations of risk are circumstances where the probabilities
of possible outcomes can be specified, on the basis of reliable evidence, and
situations of uncertainty are circumstances where the probabilities of possible outcomes cannot be specified, on the basis of reliable evidence. Flipping a normal coin creates an instance of risk without uncertainty, as we know all
the possible outcomes of a coin flip (heads and tails) and we are warranted in
specifying particular probabilities for these outcomes. Speculation about the
details of a possible afterlife is a case of uncertainty without risk. There are
many possible accounts of the ways in which an afterlife might be experienced,
but we appear to lack good grounds for assigning probabilities to any of them.
While coin flips are pure instances of risk and speculations about the afterlife
seem to be pure instances of uncertainty, these are exceptional cases. The
majority of real-world cases where decisions involving probabilities need to be
made involve a mixture of risk and uncertainty. Consider a couple of real-world
decisions: 1) an insurer who writes home insurance policies will try to price
those policies on the basis of assessments of the risks to which particular houses
are exposed. It is not possible, however, to anticipate all such risks and work
out exactly how likely these are, so there is inevitably an element of uncertainty
in such assessments. Exact probabilities can, of course, be assigned to possible
outcomes, but such assignments will involve a degree of stipulation. 2) A
patient who is contemplating an operation will want to know what the risks
involved are and the surgeon who is set to conduct the operation will generally
try to transmit this information to the best of her ability. But while it is possible
to anticipate many of the risks involved in a complex operation, estimates of
how likely these are to occur in particular contexts will inevitably involve a
degree of speculation. In many real-world circumstances, we attempt to turn
uncertainties into risks; however, this typically involves some speculation and
so an element of uncertainty remains.17 Most real-world circumstances involve
a mix of risk and uncertainty and the PP and CBA can both be applied to these.
So it is appropriate to make direct comparisons of CBA and the PP in dealing
with most real-world cases.
16 Knight, F. 1921, Risk, Uncertainty, and Profit, Hart, Schaffner & Marx, Boston.
17 Knight's view of risk is based on the (I think plausible) presupposition that we try to assign objective probabilities to the world, and that when we are unable to do so, we are left with uncertainties: LeRoy, S. F. and Singell, L. D. 1987, 'Knight on risk and uncertainty', The Journal of Political Economy, vol. 95, pp. 394–406. Some commentators, such as Friedman, who take the view that we only ever assign subjective
probabilities to the world, hold that we never need to have recourse to the Knightian concept of uncertainty
at all. Friedman, M. 1962, Price Theory: A Provisional Text, Aldine, Chicago.
a choice between exactly two alternatives; however, as is the case with the above
example, there may be contexts in which we are able to make comparative
choices that are more complicated than simple dilemmas, and such possibilities
may be obscured from us by framing our choice as a dilemma.31
Framing a problem as a dual-use dilemma encourages us to consider a choice
between two alternatives in isolation from other possible choices. And at least
some versions of the PP (sPP) can only be applied if we make a decision about
risks in isolation from consideration of alternatives. Other versions of the PP,
including the Wingspread Statement and the Rio declaration, are also structured
around consideration of the risks that are associated with a particular activity
and do not encourage us to consider risks involved with alternatives to that
activity. The concern here is that if the PP is applied to the dual-use dilemma
then the tendency of both of these conceptual structures to focus our attention
on a particular policy option, to the exclusion of consideration of other options,
may reinforce one another. So we have a reason to be especially wary about applying the PP when it is combined with the framing of options as dilemmas.
31 This problem may become resolved as language evolves. The word 'dilemma' is sometimes used these days in a way that is meant to be inclusive of tri-lemmas, quadri-lemmas, and so on. If this usage becomes
sufficiently common then framing a decision as a dilemma will no longer encourage a forced choice between
exactly two alternatives.
32 Kuhlau et al., op. cit., p. 6.
33 Ibid., pp. 18.
But perhaps Kuhlau et al. mean to offer the scientific community a reminder
that there are threats to human health and security that can arise from life-science research, and they are insisting that the scientific community consider
these when deciding whether or not to conduct particular research that may
lead to harms. On this second reading, their version of the PP is supposed to
function like the Rio declaration and is designed to ensure that, when costs and
benefits are weighed up, the potential costs of certain risks are not excluded
from consideration. If this is the sort of reading that is intended then it would
be useful to know what motivates Kuhlau et al. to suppose that there is a danger
of these concerns not being considered, when consideration of them is expected
under the application of ordinary CBA. The Rio declaration was developed
in response to a history of significant risks being ignored on the (fallacious)
grounds that if these had not been established with full scientific certainty
then they should not be considered at all. But is there a reason to think that
threats to human health and security are liable to be ignored by the scientific
community?34 If there is no particular reason to believe that this may happen
then it seems that Kuhlau et al.35 have created a gratuitous version of the PP that
performs no needed function.
On a third possible reading, Kuhlau et al. intend a strong version of the PP.
Their wording is ambiguous, but they may intend that the scientific community
only allows research in the life sciences to go ahead once all concerns regarding
possible threats to human health and security have been met, no matter how
potentially beneficial such research may be. If they do intend this strong
reading then it would be useful to know how the following two questions can
be answered. First, it would be useful to know why they think we should accept
this strong precautionary approach. If the benefits of some piece of research are
judged to outweigh the risks, and if all significant possible costs and benefits
have been considered, why shouldn't we accept the risks of conducting such
research, in order to try to reap potential benefits? Second, we need to be told
how the problem of paradox is to be avoided.36 In many situations there will be
risks involved in not conducting research in the life sciences. If we do not take
the risks involved in developing new vaccines for currently lethal diseases, for
example, we implicitly accept the risks of people continuing to die as a result of
34 Tom Douglas suggests that there is a tradition in the scientific community of denying that scientists
should be held responsible for the ways in which scientific knowledge is employed and that Kuhlau et al.
may be intending to issue a special reminder to scientists, as a way of trying to overcome the influence of
this tradition. Also Selgelid suggests that biological scientists and bioethicists have a history of ignoring the
dual-use potential of genetics. See Selgelid, M. 2010, 'Ethics engagement of the dual-use dilemma: progress and potential', in B. Rappert (ed.), Education and Ethics in the Life Sciences, ANU E Press, Canberra, pp. 23–34.
35 Kuhlau et al., op. cit.
36 Kuhlau et al. consider some objections to the PP, including the objections that it stifles scientific
development, it lacks practical applicability and that it is poorly defined and vague. They do not consider the
applicability of these charges to specific versions of the PP and they do not consider the charge that (strong
versions of) the PP leads to contradictory policy recommendations.
these diseases. It looks like Kuhlau et al.'s version of the PP (on a strong reading)
is susceptible to the problem of there being risks on all sides and so it looks
like it leads to paradoxical recommendations, if applied consistently.37
The PP can be useful in particular contexts when it has specific uses. There
may be particularly good uses for a PP developed to suit dual-use contexts.
Unfortunately, Kuhlau et al.38 are not clear about the uses to which they wish
to put their version of the PP and it looks like their version is not designed to
suit any specific use. If others wish to develop new versions of the PP for dual-use contexts then I would urge that they use precise language and specify what
their version of the PP is intended to achieve.
37 Thanks to Linsey McGoey, Brian Rappert and Tom Douglas for some very helpful comments on an earlier
version of this chapter.
38 Kuhlau et al., op. cit.
Introduction
In the past decade, scientists, policymakers, ethicists and citizens have become
increasingly aware that scientific research that promotes public health and
safety has the potential to be used for terrorist, criminal or other malevolent
purposes.1 The phrase 'dual-use research' refers to research that may have
beneficial as well as detrimental consequences. The National Science Advisory
Board for Biosecurity (NSABB), a US Government committee that provides advice
to researchers and federal agencies, has defined 'dual-use research of concern' as research that, 'based on current understanding, can be reasonably anticipated to provide knowledge, products, or technologies that could be directly misapplied by others to pose a threat to public health and safety, agricultural crops and other plants, animals, the environment or materiel'.2 Examples of published
research that has raised dual-use issues include research on a mousepox virus
that could be used to enhance the virulence of the human smallpox virus,3 a
study showing how to manufacture a polio virus from available sequence data
1 Atlas, R. 2002, 'National security and the biological research community', Science, vol. 298, pp. 753–4; Atlas, R. and Dando, M. 2006, 'The dual-use dilemma for the life sciences: perspectives, conundrums, and global solutions', Biosecurity and Bioterrorism, vol. 4, pp. 276–86; National Science Advisory Board for Biosecurity (NSABB) 2007, Proposed Framework for the Oversight of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research Information, National Science Advisory Board for Biosecurity, Bethesda, Md, <http://oba.od.nih.gov/biosecurity/pdf/Framework%20for%20transmittal%200807_Sept07.pdf> (viewed 30 December 2011); Selgelid, M. 2007, 'A tale of two studies: ethics, bioterrorism, and the censorship of science', Hastings Center Report, vol. 37, no. 3, pp. 35–43; Resnik, D. B. and Shamoo, A. E. 2005, 'Bioterrorism and the responsible conduct of biomedical research', Drug Development Research, vol. 63, pp. 121–33; Selgelid, M. 2009, 'Governance of dual use research: an ethical dilemma', Bulletin of the World Health Organization, vol. 87, pp. 720–3; National Research Council 2004, Biotechnology Research in the Age of Terrorism, The National Academies Press, Washington, DC; National Research Council 2006, Globalization, Biosecurity and the Future of the Life Sciences, The National Academies Press, Washington, DC; Resnik, D. B. 2010, 'Can scientists regulate the publication of dual use research?' Studies in Ethics, Law, and Technology, vol. 4, no. 1, article 6, <http://www.bepress.com/selt/vol4/iss1/art6> (viewed 30 December 2011).
2 NSABB, op. cit.
3 Jackson, R., Ramsay, A., Christensen, C., Beaton, S. and Hall, D. 2001, 'Expression of mouse interleukin-4 by a recombinant ectromelia virus suppresses cytolytic lymphocyte responses and overcomes genetic resistance to mousepox', Journal of Virology, vol. 75, pp. 1205–10.
and mail-order supplies,4 a study on the genetics of human smallpox virus that
could be used to develop a strain of Vaccinia (the virus used in smallpox vaccine)
that would overcome the immune system's natural defences,5 a paper describing
how a terrorist could poison the US milk supply with botulinum toxin,6 and a
paper demonstrating how to reconstruct the 1918 Spanish influenza virus from
published sequence data.7
Preventing the use of scientific research for malevolent purposes raises dilemmas
for ethics and public policy. Although most of these issues have existed in some
form or another since the dawn of science, the globalisation of scientific research,
advances in biotechnology and the spectre of bioterrorism create problems for
bioscientists and other researchers that are complex, difficult and urgent.8 The
fundamental dilemma related to dual-use research is how to protect society
from harm without unduly hampering the advancement of science.9 While
some restrictions on research are widely recognised as necessary to prevent the
misuse of science, administrative, legal and bureaucratic oversight of research
can hinder collaboration, the sharing of data and materials, and publication.10
Policymakers have explored two forms of oversight of dual-use research:
governmental control and self-regulation.11 Government oversight mechanisms
that have been used or proposed include12
regulations, such as the Patriot Act, that control access to and transfer and
storage of dangerous biological, chemical or radiological materials13
laws related to the transfer of technology across national borders14
4 Cello, J., Paul, A. and Wimmer, E. 2002, 'Chemical synthesis of poliovirus cDNA: generation of infectious virus in the absence of natural template', Science, vol. 297, pp. 1016–18.
5 Rosengard, A., Liu, Y., Nie, Z. and Jimenez, R. 2002, 'Variola virus immune evasion design: expression of a highly efficient inhibitor of human complement', Proceedings of the National Academy of Sciences, vol. 99, pp. 8808–13.
6 Wein, L. and Liu, Y. 2005, 'Analyzing a bioterror attack on the food supply: the case of botulinum toxin in milk', Proceedings of the National Academy of Sciences, vol. 102, pp. 9984–9.
7 Tumpey, T. M., Basler, C. F., Aguilar, P. V., Zeng, H., Solórzano, A., Swayne, D. E., Cox, N. J., Katz, J. M., Taubenberger, J. K., Palese, P. and García-Sastre, A. 2005, 'Characterization of the reconstructed 1918 Spanish influenza pandemic virus', Science, vol. 310, pp. 77–80.
8 Atlas and Dando, op. cit.
9 Miller, S. and Selgelid, M. 2007, 'Ethical and philosophical consideration of the dual-use dilemma in the biological sciences', Science and Engineering Ethics, vol. 13, pp. 523–80.
10 Ibid.; Atlas and Dando, op. cit.
11 Miller and Selgelid, op. cit.; Atlas and Dando, op. cit.
12 Miller and Selgelid, op. cit.
13 Malakoff, D. 2002, 'Biological agents: new U.S. rules set the stage for tighter security, oversight', Science, vol. 298, p. 2304; Bhattacharjee, Y. 2010, 'Biosecurity: new biosecurity rules to target the riskiest pathogens', Science, vol. 329, pp. 264–5.
14 National Research Council, 2006, op. cit.
28 Ibid.
29 Resnik, 2009, op. cit.
30 Journal Editors and Authors Group 2003, 'Uncensored exchange of scientific results', Proceedings of the National Academy of Sciences, vol. 100, p. 1464.
31 Resnik, 2009, op. cit.
32 Miller and Selgelid, op. cit.
33 NSABB, op. cit.
34 Bhattacharjee, op. cit.
35 Resnik and Shamoo, op. cit.
36 Resnik, 2009, op. cit.
the National Research Council and the American Association for the Advancement of Science, The National
Academies Press, Washington, DC, <http://books.google.com/books/about/A_Survey_of_Attitudes_and_
Actions_on_Dua.html?id=2-MLru0bHkQC> (viewed 30 December 2011).
39 Cornell University 2007, Institutional Biosafety Committee, <http://www.ibc.cornell.edu/
responsibilities> (viewed 30 December 2011).
40 NSABB, op. cit.
41 Shamoo, A. E. and Resnik, D. B. 2009, Responsible Conduct of Research, 2nd edn, Oxford University Press,
New York.
42 Resnik, D. B. 2010, 'Dual-use review and the IRB', Journal of Clinical Research Best Practices, vol. 6, no.
1, <http://www.firstclinical.com> (viewed 30 December 2011).
Although these three committees have the ability to identify scientific research
with dual-use implications, inevitably some research may fall through the cracks,
such as biomedical research that does not involve dangerous pathogens or animal
or human subjects; chemical research; and engineering research. For example,
the study of how to infect the US milk supply with botulinum toxin would
not have been reviewed by any IBC, IACUC or IRB.43 Because these committees
may lack the expertise to assess the dual-use implications of research that falls
within their purview, the NSABB has recommended that institutions consider
appointing committees that focus on dual-use research.44 It is not known
how many institutions have created such committees, although the National
Institutes of Health (NIH) intramural program has a dual-use committee.45
Review of research by institutional committees is likely to be one of the most
effective ways of preventing harms related to dual-use research. First, most
institutions already have IBCs, IACUCs and IRBs or equivalent committees.
Second, since these committees are accustomed to dealing with a variety of
ethical, legal and social issues related to research, it will not come as a surprise
to most committee members that they should be mindful of dual-use concerns.
Third, these committees are likely to know how to access the appropriate
institutional officials (for example, vice-president for research, compliance
officer) for dealing with dual-use issues. Although institutional committees
are well positioned to deal with dual-use issues, it is not known how often
these committees encounter research that raises dual-use concerns, whether
they recognise the dual-use implications, or how they typically respond to
these situations. More research is needed to get a better understanding of the
effectiveness of institutional committees in dealing with dual-use issues.
requirements for students supported with grant funds.48 Education in RCR varies
considerably. Some institutions require students to complete online modules
in RCR, while others require attendance at classes, workshops or seminars.49
RCR education typically addresses fabrication, falsification, plagiarism, data
management, conflict of interest, authorship, publication, mentoring and
peer review, but this list could be expanded to include dual-use issues. It is
not known how many institutions have conducted education and training in
dual-use research, and more research is needed on this topic. In 2009, the NIH
required that all intramural researchers and trainees receive education on their
social responsibilities related to dual-use research. The educational materials
developed by the NIH included an overview of dual-use research as well as
several case studies.50
While education in dual-use issues is likely to play an important role in
preventing the misuse of scientific research or materials, its effectiveness has been
questioned. According to some studies, RCR education does not decrease the
prevalence of negative behaviour, such as data falsification or plagiarism.51 Other
studies suggest that RCR education can promote awareness of ethical issues and
knowledge of ethical concepts, but does not influence attitudes or behaviour.52
Although there have been no published studies on the effectiveness of dual-use
education and training, it is plausible to hypothesise that these efforts may have
mixed results, given the scientific community's experiences with RCR education
and training. A survey of life scientists indicates they support education and
training initiatives that cover dual-use research;53 however, more research is
needed on the effectiveness of dual-use education, and the effectiveness of
different pedagogical techniques – for example, seminars, online training, and
so on.
Professional codes
The NSABB54 and other commentators55 have recommended that professional
associations should develop codes of ethics (or conduct) to promote self-regulation of dual-use research. In response to the dual-use concerns that arose
in 2001, the American Society for Microbiology (ASM) revised its code of ethics,
which now includes the following paragraph:
ASM members are obligated to discourage any use of microbiology
contrary to the welfare of humankind, including the use of microbes
as biological weapons. Bioterrorism violates the fundamental principles
upon which the Society was founded and is abhorrent to the ASM and
its members. ASM members will call to the attention of the public or
the appropriate authorities misuses of microbiology or of information
derived from microbiology.56
Other professional associations with codes that address dual-use issues include the
International Union of Biochemistry and Molecular Biology (IUBMB)57 and
the American Medical Association (AMA).58 It is not known how many other
professions have such codes but development thus far has been sparse.59 The
NSABB has published some considerations for developing a code of conduct
related to dual-use research in the life sciences, which address such topics
as proposing, managing and conducting research; collaborations; public
communications; and mentoring.60
The effectiveness of professional codes at promoting ethical conduct has been
the subject of much debate.61 Some argue that professional codes are merely
symbolic statements of values that have little effect on behaviour. Others
argue that codes provide useful guidance for members of the profession and
help promote public trust by defining standards of conduct;62 however, most
professional codes lack enforcement mechanisms. Those that are the most
effective are linked to professional licensure or certification, which can result
54 NSABB, op. cit.
55 Rappert, op. cit.
56 American Society for Microbiology (ASM) 2005, Code of Ethics, <http://www.asm.org/ccLibraryFiles/
FILENAME/000000001596/ASMCodeofEthics05.pdf> (viewed 30 December 2011).
57 International Union of Biochemistry and Molecular Biology (IUBMB) 2005, Code of Ethics, <http://www.
babonline.org/bab/babcethics.pdf> (viewed 30 December 2011).
58 American Medical Association (AMA) 2004, Code of Medical Ethics, Opinion 2.078 – Guideline to Prevent Malevolent Use of Biomedical Research, <http://www.ama-assn.org/ama/pub/physician-resources/medical-ethics/code-medical-ethics/opinion2078.shtml> (viewed 30 December 2011).
59 Rappert, B. 2007, 'Codes of conduct and biological weapons: an in-process assessment', Biosecurity and Bioterrorism, vol. 5, pp. 1–10.
60 NSABB, op. cit.
61 Rappert, 2007, op. cit.
62 Bayles, M. 1988, Professional Ethics, 2nd edn, Wadsworth, Belmont, Calif.
Institutional policies
The NSABB has also recommended that institutions develop policies addressing
dual-use issues, complementing existing RCR policies.65 Many scholars and
commentators have argued that institutional policies, combined with education
and training, can help promote ethical conduct in research;66 however, it is
difficult to assess the effectiveness of institutional policies due to social, economic
and cultural differences among institutions. Studies have shown that there is
considerable variation in the content of some institutional ethics policies, such
as conflict-of-interest and misconduct rules.67 In the United States, the content of
institutional policies is largely driven by government mandates, such as NIH or
NSF rules that require institutions to establish conflict-of-interest, misconduct,
animal research and human subjects research policies as a condition of obtaining
funding.68 Federal agencies could promote institutional policy development by
requiring that funding recipients have dual-use policies; however, so far no
federal agency has done so.
Institutional dual-use policies could play a key role in preventing the
misuse of scientific work for malevolent purposes. Policies could address
controlling, securing, transferring and accessing hazardous materials, as well
as publishing research that could be readily used to cause significant harm to
63 Rotunda, R. 2007, Legal Ethics, 3rd edn, West Publishing, St Paul, Minn.
64 Shamoo and Resnik, op. cit.; Rappert, 2007, op. cit.
65 NSABB, op. cit.
66 Institute of Medicine 2002, Integrity in Scientific Research: Creating An Environment that Promotes
Responsible Conduct, The National Academies Press, Washington, DC; Geller, G., Boyce, A., Ford, D. E. and
Sugarman, J. 2010, Beyond compliance: the role of institutional culture in promoting research integrity,
Academic Medicine, vol. 85, pp. 1296–302.
67 Cho, M. K., Shohara, R., Schissel, A. and Rennie, D. 2000, Policies on faculty conflicts of interest at US
universities, Journal of the American Medical Association, vol. 284, pp. 2203–8; McCrary, S. V., Anderson,
C. B., Jakovljevic, J., Khan, T., McCullough, L. B., Wray, N. P. and Brody, B. A. 2000, A national survey of
policies on disclosure of conflicts of interest in biomedical research, New England Journal of Medicine, vol.
343, pp. 1621–6; Lind, R. A. 2005, Evaluating research misconduct policies at major research universities: a
pilot study, Accountability in Research, vol. 12, pp. 241–62.
68 Shamoo and Resnik, op. cit.
Journal policies
Scientific journal dual-use review policies are another form of self-governance.
Journals have been at the centre of the dual-use controversy, as several papers
published by prominent journals have proved contentious. For example, several
members of the US Congress criticised journal editors for publishing the paper
on how to manufacture a polio virus mentioned earlier. The US Department of
Health and Human Services (DHHS) asked the editors of the Proceedings of the
National Academy of Sciences (PNAS) not to publish the paper on contaminating
the US milk supply, also mentioned earlier.70 The editors of PNAS met with
DHHS representatives prior to publication to discuss the benefits and risks of
public dissemination of the findings. The Department of Homeland Security
(DHS) asked the NSABB to review the paper on reconstructing the 1918
pandemic influenza virus prior to publication in Science. Although the NSABB
voted unanimously in favour of publication, the editor of Science publicly
stated he would have ignored the NSABB's recommendations if they had been against
publication.71
Recently, the NSABB recommended that journals omit important details from
research that used genetic-engineering techniques to create a mutated form of
an avian flu virus, A(H5N1), which could be transmitted between mammalian
species, including humans. Key details would be available only to responsible
scientists. Currently, the virus can only be transmitted from birds to other
mammalian species, not between members of mammalian species. Six hundred
people have contracted the virus since 1997, and more than half of them have
died. The journals reviewing the research, Science and Nature, had not made a
decision concerning publication as of the writing of this chapter.72
69 Ibid.
70 Resnik and Shamoo, op. cit.
71 Kennedy, D. 2005, Better never than late, Science, vol. 310, p. 195.
72 Grady, D. and Broad, W. 2011, Seeing terror risk, U.S. asks journals to cut flu study facts, The New York
Times, 20 December 2011, p. A1.
Journals began developing dual-use review policies in the early part of the
twenty-first century, following the publication of several controversial papers.73
Some of the leading journals with dual-use review policies include Science,
journals published by the Nature Publishing Group (NPG) – that is, Nature,
Nature Biotechnology, and so on – and journals published by ASM.74 Review
policies include an additional level of review for papers that raise dual-use
concerns. Outside reviewers with expertise in national security, terrorism or
other relevant subjects may be asked to assist with the review. The outcome
of the review could include a recommendation to publish, to not publish or
to publish with appropriate revisions, such as restrictions on access to key
information.75 The NPG's dual-use policy provides that:
Nature journal editors may seek advice about submitted papers not
only from technical reviewers but also on any aspect of a paper that
raises concerns … As in all publishing decisions, the ultimate decision
whether to publish is the responsibility of the editor of the Nature
journal concerned. The threat posed by bioweapons raises the unusual
need to assess the balance of risk and benefit in publication. Editors
are not necessarily well qualified to make such judgements unassisted,
and so we reserve the right to take expert advice in cases where we
believe that concerns may arise. We recognize the widespread view that
openness in science helps to alert society to potential threats and to
defend against them, and we anticipate that only very rarely (if at all)
will the risks be perceived as outweighing the benefits of publishing a
paper that has otherwise been deemed appropriate for a Nature journal.
Nevertheless, we think it appropriate to consider such risks and to have
a formal policy for dealing with them if need arises. The editorial staff of
Nature journals maintains a network of advisers on biosecurity issues.
All concerns on that score, including the commissioning of external
advice, will be shared within an editorial monitoring group consisting
of the Editor-in-Chief of Nature publications, the Executive Editor of the
Nature research journals, the Chief Biological Sciences Editor of Nature,
and the chief editor of the journal concerned. Once a decision has been
reached, authors will be informed if biosecurity advice has informed
that decision.76
There have been three published studies to date on the dual-use review policies
of scientific journals. In 2009, van Aken and Hunger published a survey of
28 major life-science journals that regularly publish research that may raise
biosecurity issues. They found that 25 per cent of these journals had dual-use
review policies.77 In October 2010, a working group from the NSABB reported
their findings at an NSABB board meeting. Members of the working group
reported on their discussions with the editors of 18 high-impact life-science
journals that had published dual-use review policies online. Members of the
working group found that different models of dual-use review were in place,
and that editors felt they needed additional guidance.78
Though these two studies provide some useful information about the dual-use
policies of scientific journals, they do not provide systematic information about
journal policy development because the sampling was focused (not random) and
the sample sizes were small. To overcome these limitations, my colleagues and
I conducted a larger survey of biomedical journals. We drew a random sample
of 400 journals from the ISI Web of Knowledge Journal Citation Reports 2009
Edition. We eliminated journals that had little chance of reviewing dual-use
research. We sent out an email survey to the editors and reminders after seven
and 14 days if we did not receive a response. Of the 400 journals, 155 responded to
our survey (response rate 39 per cent). Only 7.7 per cent said they had a formal
(written) dual-use policy; 72.8 per cent said they had had no experience with
reviewing dual-use research in the past five years, 5.8 per cent indicated they
had some experience, and 21.8 per cent did not give a definite answer to this
question. We attempted to determine whether several variables were associated
with having a dual-use policy. Belonging to the NPG was the most significant
predictor of having a policy (positive association). Having experience with
reviewing manuscripts with dual-use implications was also positively associated
with having a policy.79
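A quick arithmetic check of these figures (my own illustration; the percentages are as reported above, and the implied count of journals is a rounded inference):

\[
\frac{155}{400} = 38.75\% \approx 39\% \ \text{(response rate)}, \qquad 0.077 \times 155 \approx 12 \ \text{journals with a formal policy}.
\]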
Our research indicates that less than 8 per cent of biomedical journals have a
dual-use policy. This is much lower than the percentage reported by van Aken
and Hunger. This is an important finding, since it shows that the scientific
community has not made much progress in an important area of self-regulation.
Another important finding is that most journals have not had any experience
with reviewing research with dual-use implications. Indeed, some editors said
they had not developed a policy because they saw no need for one, and others
had not even heard of the term 'dual use'. Some editors said they were planning
to develop a policy after they learned about our study.80
77 van Aken and Hunger, op. cit.
78 Report of the NSABB Working Group on Journal Review Policies, op. cit.
79 Resnik, D. B., Barner, D. D. and Dinse, G. E. 2011, Dual use policies of biomedical journals, Biosecurity
and Bioterrorism, vol. 9, pp. 49–54.
80 Ibid.
81 Berg, P., Baltimore, D., Brenner, S., Roblin III, R. O. and Singer, M. F. 1975, Summary statement of the
Asilomar Conference on Recombinant DNA Molecules, Proceedings of the National Academy of Sciences, vol.
72, pp. 1981–4.
82 Kimmelman, J. 2009, Gene Transfer and the Ethics of First-in-Human Research: Lost in Translation,
Cambridge University Press, Cambridge.
83 Ibid.
84 Shamoo and Resnik, op. cit.
85 Ibid.
86 Nuremberg Code 1947, Directives for Human Experimentation, <http://ohsr.od.nih.gov/guidelines/
nuremberg.html> (viewed 30 December 2011).
87 Shamoo and Resnik, op. cit.
Acknowledgments
This research was sponsored by the National Institute of Environmental Health
Sciences (NIEHS), National Institutes of Health (NIH). It does not represent the
views of the NIEHS, NIH or US Government.
2 For example, Zimmerman, P. D. and Loeb, C. 2004, Dirty bombs: the threat revisited, Defense Horizons,
vol. 38, pp. 1–11.
3 Stoll, W. 2007, ²¹⁰polonium, Atw. Internationale Zeitschrift für Kernenergie, vol. 52, no. 1, pp. 39–41.
4 Notable exceptions include Selgelid, M. J. 2007, A tale of two studies: ethics, bioterrorism, and the
censorship of science, The Hastings Center Report, vol. 37, no. 3, pp. 35–48; Finney, J. L. 2007, Dual use:
can we learn from the physicists' experience? in B. Rappert (ed.), A Web of Prevention: Biological Weapons,
Life Sciences and the Governance of Research, Earthscan, London, pp. 67–76; Forge, J. 2008, The Responsible
Scientist, University of Pittsburgh Press, Pittsburgh; Miller and Selgelid, op. cit.
5 National Academies of Science 2004, Biotechnology Research in An Age of Terrorism, The National
Academies Press, Washington, DC, p. 23.
6 Ibid., pp. 8–13.
7 National Academies of Science 2006, Globalization, Biosecurity, and the Future of the Life Sciences, The
National Academies Press, Washington, DC, p. 46.
Table 16.1 Fissile materials contrasted with biological pathogens

Fissile materials | Biological pathogens
Non-living, synthetic | Living, replicative

Source: National Academies of Science 2004, Biotechnology Research in An Age of Terrorism, The National Academies Press, Washington, DC.
further investigation of the nuclear sciences and their history to understand the
pitfalls of regulating science and technology, and apply lessons learnt to dual
use in the life sciences.
Finally, it is not my intention to claim that biology and the nuclear sciences
are the same regarding the dual-use dilemma. I will show where the issues
presented by dual-use materials, knowledge, techniques or technologies in
each field overlap, and where they come apart. The differences between nuclear
science and biology do not preclude an analysis of their similarities; likewise,
those differences may be important in and of themselves.
Second, the examples I have chosen serve to highlight some less-explored
features of the dual-use dilemma in the life sciencesfeatures I believe deserve
significant exploration. Though we may continue to worry about the potential
for state-sponsored biological weapons programs, a primary concern in the
dual-use dilemma is that of small groups or individuals committing acts of
bioterrorism. My examples are chosen to identify issues in regulating these
small-scale operations. This does not mean that there is nothing to be learned
from the nuclear sciences about state-level regulations, or that my analysis
doesn't cover these issues at all. Rather, my focus is meant to target the trajectory
I see the (rapidly) evolving life sciences following, in an effort to identify issues
that will arise or are already coming to light regarding dual-use research.
Materials control
Materials control is an important part of preventing the misuse of science
and technology. If access to the constituent materials of biological or nuclear
weapons is restricted, their use can be restricted – presumably to good uses by
good and responsible actors. As the subject that led the authors of the Fink Report
to their conclusion that the nuclear sciences cannot inform our thinking on the
dual-use dilemma, materials control is a good first point of inquiry for this chapter. The
differences between fissile materials and biological pathogens listed in Table
16.1 are a paradigmatic way of contrasting nuclear science and biology. The
differences between the materials used in both fields present a picture in which
nuclear science is far removed from biology.
There are definitely some pathogens that pose the same type of threat as
nuclear weapons. A smallpox attack could potentially cause up to 167 000
deaths – greater than the number who died when the United States dropped a
nuclear weapon on Hiroshima.8 An anthrax attack on an unprepared population
could cause more than half a million deaths, but only if the emergency response
8 See Samuel, G., Selgelid, M. J. and Kerridge, I. 2010, Back to the future: controlling synthetic life sciences
trade in DNA sequences, Bulletin of the Atomic Scientists, vol. 66, p. 10.
to the act was mismanaged.9 The botulinum toxin – a biological toxin and the
most potent poison in the world – released into the milk supply of the United
States could kill up to 200 000 people, the majority children or the elderly.10
Most recently, there are concerns that genetically modified avian influenza
could kill millions.11
Those listed above, however, are some of only a few pathogens capable of
devastation similar to that visited on Hiroshima. Moreover, comparisons to
Hiroshima are dated and misleading: the W76 warhead, a mainstay of the US
nuclear arsenal, has a yield of 100 kilotons,12 roughly five times that of the
Hiroshima bomb. The W76 bomb is, further, small in yield compared with others
in the US and Russian arsenals. Nuclear weapons range from tactical warheads
of a fraction of a kiloton, to megaton (where one megaton is equivalent to 1000
kilotons) scales.13
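To put these yields on a common scale (a worked comparison of my own, assuming the commonly cited Hiroshima yield of roughly 15–20 kilotons):

\[
\frac{100 \ \text{kt (W76)}}{\sim 20 \ \text{kt (Hiroshima)}} \approx 5, \qquad 1 \ \text{megaton} = 1000 \ \text{kt} = 10 \times 100 \ \text{kt (W76)}.
\]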
Biological attacks, further, suffer from a greater range of contingencies than
nuclear attacks, and are prone to more variable yields than nuclear weapons. Bad
weather – or, in the case of spores that are sensitive to sunlight, good weather –
can seriously mitigate the effect of a bioweapons attack, to say nothing of other
factors. Damage caused by biological weapons can be further mitigated by
strong public health policies, smart post-attack vaccinations, and so on.14 This is
contrasted with the indiscriminate and destructive nature of nuclear weapons,
which are more or less impervious to weather conditions or most attempts to
defend against them.
This is not to say that biological pathogens should be treated less seriously than
fissile materials. A key difference listed in the Fink Report is that the destructive
potential of nuclear weapons is balanced to an extent by the difficulty of
procuring fissile materials. Bioweapons, on the other hand, are likely to become
more attainable, and more heavily augmented for causing death, in years to come.
For the most part, however, the distinction between the (broad) class of things called
9 Wein, L. M., Craft, D. L. and Kaplan, E. H. 2003, Emergency response to an anthrax attack, Proceedings
of the National Academy of Sciences, vol. 100, no. 7, pp. 4346–51. It is worth, in the interest of disclosure,
noting that 660 000 people are predicted to die in the modelled attack in the case where only individuals
symptomatic with the disease are given antibiotic treatment. Wein et al. are quite clear in their study that
there are a number of factors that, taken together, could lower this death count to approximately 10 000.
10 Wein, L. M. and Liu, Y. 2005, Analyzing a bioterror attack on the food supply: the case of botulinum
toxin in milk, Proceedings of the National Academy of Sciences, vol. 102, no. 28, pp. 9984–9.
11 Enserink, M. 2011, Grudgingly, virologists agree to redact details in sensitive flu papers, Science Insider,
20 December 2011, <http://news.sciencemag.org/scienceinsider/2011/12/grudgingly-virologists-agree-to.
html> (viewed 1 February 2012).
12 Where 1 kiloton = 1000 tonnes of TNT equivalent energy of explosion.
13 Norris, R. S. and Kristensen, H. N. 2007, Estimates of the U.S. Nuclear Weapons Stockpile [2007 and 2012],
<http://www.fas.org/programs/ssp/nukes/publications1/USStockpile2007-2012.pdf> (viewed 24 February
2010).
14 See, for example, Wein et al., 2003, op. cit.; Chen, L.-C. et al. 2004, Aligning simulation models of
smallpox outbreaks, Intelligence and Security Informatics, vol. 3073, pp. 1–16; Wein et al., 2005, op. cit.
'biological pathogens' and the (narrow) class of things called 'fissile materials'
is inaccurate; we are working with two classes of materials for which the
comparisons we can make are not strong ones.
Radionuclides
A better comparison, I contend, is between biological pathogens and the
radionuclides described above. Though radionuclides in general are not
dual-use materials to the same degree as fissile materials – which form a small
but significant subset of radionuclides – they nonetheless pose a series of
dual-use issues, some of which I have described above.
Radionuclides are significant for two reasons. First, much as we think there
are dual-use issues with life-sciences materials, technology and research that
are worthy of attention even though they are not dangerous on the scale of,
say, weaponised smallpox, there are dual-use issues in the nuclear sciences that
do not involve nuclear weapons. Second, the way fissile materials were and
are regulated is, as the Fink Report argued, somewhat unique. It is useful to
examine the regulation of other areas of the nuclear sciences, which may have
experienced issues with regulation more in line with what we would expect
from the life sciences.
Though nowhere near as powerful as a thermonuclear detonation, or the expert
use of weaponised smallpox, dirty bombs would cause damage comparable
with less spectacular – and paradigmatic – examples of bioterrorism. First, the
Amerithrax attacks that occurred following the 11 September 2001 attacks on
the World Trade Center killed five and infected a further 17 people. Second, the
attempted attack by the Aum Shinrikyo cult through the dispersion of anthrax
failed,15 but could have easily inflicted hundreds of casualties had it worked;
the Sarin gas attack in 1995 on the Tokyo subway by the same cult killed 12
and injured thousands.16 Finally, the lacing of salad bars with salmonella by
the Rajneeshee cult in Oregon did not kill anyone, but hospitalised 45 and
infected 750.17 These cases are of serious concern, but the number of casualties
(or hospitalisations, in those cases that lacked fatalities) is much smaller than
from a thermonuclear explosion.
Sufficient exposure to radioisotopes can also lead to radiation poisoning. The
most high-profile recent case of this is the death of KGB defector Alexander
Litvinenko from a lethal dose of polonium-210. Polonium-210 is found in
15 See, for example, Miller, J., Broad, W. and Engelberg, S. 2001, Germs: Biological Weapons and America's
Secret War, Simon & Schuster, New York.
16 Tu, A. T. 1999, Overview of the sarin terrorist attacks in Japan, in A. T. Tu and W. Gaffield (eds), Natural
and Selected Synthetic Toxins: Biological Implications, ACS Symposium Series, 20 December, pp. 304–17.
17 Miller et al., op. cit.
technically difficult proposition – or they could wait until flu season to gain
living native samples and modify them. Anthrax, to use another example, is
still a common disease amongst cattle in developing countries.
The advent of do-it-yourself (DIY) or 'garage' biology, where individuals
practise biology at home or in private hobby groups, further undermines the
efficacy of materials control. As the cost of sequencing and synthesis continues
to decline, and as the methods by which DIY biologists procure samples –
individually or through communal open labs ('biological commons') – or create
their own technologies become increasingly sophisticated,25 materials control
will become progressively more costly, or even completely ineffective. Current
regulatory approaches should take this into account when attempting to design
overlapping regulatory measures – the so-called 'web of prevention'26 – to
prevent the misuse of biotechnology.
Moreover, if present expectations for biology are fulfilled, biological materials
stand to become a ubiquitous part of modern lifestyles – not in the way of nuclear
materials, but in the way of microchips.27 Many of these biological materials will
not pose a significant dual-use threat, but there is a good chance that many of
them will. Understanding how biological materials will appear in the everyday
world and how motivated individuals might use them for malevolent or just
manifestly reckless purposes is an important step in understanding dual use
beyond the paradigm of state-level weapons conventions, deterrence and a field-wide security enterprise – a paradigm that is already recognised as untenable or
insufficient for the dual-use threat in the life sciences.
25 See Carlson, R. 2010, Biology is Technology: The Promise, Peril, and New Business of Engineering Life,
Harvard University Press, Cambridge, Mass.
26 Feakes, D., Rappert, B. and McLeish, C. 2007, Introduction: a web of prevention? in B. Rappert and C.
McLeish (eds), A Web of Prevention: Biological Weapons, Life Sciences and the Future Governance of Research,
Earthscan, London, pp. 114.
27 See Carlson, op. cit.
28 See Corera, G. and Myers, J. J. 2006, Shopping for Bombs: Nuclear Proliferation, Global Insecurity, and the
Rise and Fall of A. Q. Khan's Nuclear Network, Oxford University Press, London.
29 Lava Amp, <http://www.lava-amp.com/> (viewed 27 February 2011).
a narrow sense of utility at that. Most of us, reflecting on the nuclear sciences,
would admit that even if they have brought gains to our lives, those gains have
come unequally and at times unfairly.
Restriction of information
At times, it may be that the knowledge produced by a particular piece of
research is dual use. The polio virus case, for example, functioned as a proof
of principle that viruses could be synthesised de novo. This is independent of
which virus – or strain of a particular virus – is to be synthesised. It means that
vaccines could be constructed rapidly, but also that virulent pathogens could
be made. Even then, this could be good, enabling more individuals to study
a virus with an eye to creating medicines against it; however, the knowledge
could also be used to harm others. Jeronimo Cello and Eckard Wimmer, two
of the researchers who worked on the polio virus synthesis, noted that one of
the important outcomes of the research was confirming that viruses, in fact,
behave in a chemical fashion and can be synthesised like chemicals.30 It is
this knowledge, we might think, that is as dangerous as any physical
materials used in the research.
One option is to censor information. We could restrict what projects are
undertaken, or we could restrict – either temporarily or permanently, depending
on the seriousness of the dual-use concern – the publication of research. For
example, a 2005 study in the Proceedings of the National Academy of Sciences
modelling a terrorist attack in which botulinum toxin was released in the US
milk supply was embargoed for approximately a month as the journal, the
National Academy of Sciences and government officials debated the security
concerns in releasing such a report.31 In 2011, the National Science Advisory
Board for Biosecurity (NSABB) recommended the censorship of two studies
in which highly pathogenic H5N1 avian influenza virus was modified to be
transmissible between ferrets (close immunological analogues to humans),
though this recommendation was later reversed.32
Regulating the production of such knowledge by restricting its creation or its
dissemination is typically met with suspicion or outrage. Denying scientists
30 Selgelid, M. J. and Weir, L. 2010, Reflections on the synthetic production of poliovirus, Bulletin of the
Atomic Scientists, vol. 66, no. 3, pp. 1–9.
31 Alberts, B. 2005, Modeling attacks on the food supply, Proceedings of the National Academy of Sciences,
vol. 102, no. 29, pp. 9737–8. See also Wein, L. M. 2008, Homeland security: from mathematical models
to policy implementation: the 2008 Philip McCord Morse lecture, Operations Research, vol. 57, no. 4, pp.
801–11.
32 Evans, N. G. 2013, Great expectationsethics, avian flu and the value of progress, Journal of Medical
Ethics, vol. 39, pp. 209–13.
secrecy in the Atomic Energy Act of 1954, restricting: 1) the design, manufacture
or utilisation of atomic weapons; 2) the production of special nuclear material
(uranium-233 or 235 or plutonium in natural or enriched states); and 3) the use
of special nuclear material in the production of energy.39
The article of concern, written by Howard Morland, was based on information
in the public domain.40 Morland had not, in writing his article, used any
information that was secret. Much was still considered 'born classified', but had
existed in the public domain long enough that it was more or less commonly
accessible knowledge. The information of concern was the description of
radiation implosion, the technical process that allowed the 'super' – better
known as the hydrogen bomb – to operate.41
Morland's stated intentions in releasing the article were to promote discussion
about the arms race and the extensive secrecy of the nuclear complex.42 In this
sense, the knowledge was used in a manner that many would consider to be
good. Nuclear secrecy, while justified in certain areas, has been criticised for
subverting discourse on arms reductions and control of nuclear weapons.43
The US Government's reasons for attempting to censor Morland's article were
that public knowledge was not complete. Potential builders needed more than
just what was publicly available – knowledge that Morland's article, they
claimed, provided. The court, however, rejected this claim, stating that any
deductions that Morland had made regarding the technical mechanisms of the
hydrogen bomb were clearly independently deducible: four countries already
had independently created hydrogen bombs.
Education
Education is advocated as a worthwhile strategy to deal with the dual-use
dilemma.44 Typically, the proposed venues for ethics education are institutions
of higher learning, where professional or apprentice practitioners can be trained
in the ethical implications of their discipline. The aim of this education, then, is
to engage bench scientists or students of science in the ethical issues associated
with their research.
There are, however, other types of education that are crucial to mitigating the
harms of the dual-use dilemma, as demonstrated by the nuclear sciences. These
measures are associated with the education of the public in both ethics and
scientific literacy. Recall that Morland's contention in the Progressive case was
that public engagement and reasonable knowledge of the technical as well as the
political aspects of the regulatory debates and structures in the nuclear sciences
were necessary for a functioning science policy.
Moreover, ordinary citizens may at times be the only witnesses to
problematic acts that fall outside the purview of conventional regulations. Without
appropriate education, such citizens are not well placed to know what
44 See, for example, Rappert, op. cit.; National Science Advisory Board for Biosecurity (NSABB), Strategic
Plan for Outreach and Education on Dual Use Research Issues, National Science Advisory Board for Biosecurity,
Bethesda, Md.
to look for or, sometimes, even what they are looking at. One of the observations
that came from the David Hahn case is that, had they been suitably attentive,
David's parents, teachers and classmates should have known something was
amiss. Particularly at the point at which David began showing up with burns
caused by exposure to radiation, those who interacted with David, we might
think, should have become concerned enough to notify someone.45
Literature on dual use in the life sciences has discussed the merits – albeit to a
lesser degree than the education of lab scientists – of public outreach.46 Such
outreach typically aims to create a culture of acceptance of novel technologies.
The Biotechnology Industry Organization (BIO), for example, notes that 31 per
cent of Americans surveyed by the Robert Wood Johnson Foundation are at
least somewhat concerned about becoming ill from a bioterror attack.47 The aim
of public outreach is often to calm fears about biotechnology, typically by showing
how the emerging science will contribute to security.48
Another reason to pursue extensive public education is that as the life sciences
become ever more ubiquitous in the technological and social landscapes, the
public's comprehension may become an important part of security. In the life
sciences, this is particularly important as 'garage' or DIY biology comes into
its own. DIY biologists are biologists who pursue biological research from their
homes, usually as a hobby or as part of an entrepreneurial activity. While DIY
biologists can create communities that have strong commitments to safety and
security, such as the organisation DIYBio,49 this is not a necessary – much less
mandatory – part of being a DIY biologist. It remains to be seen how such a
community can deal with members who would actively seek to cause harm, or
perform reckless or dangerous experiments.
It may be that in some cases individuals engaged in DIY biology who pursue
reckless or malevolent acts are only subject to observation by their peers,
friends and family. In these cases, oversight is only plausible if those around an
individual engaged in reckless or malevolent activity are suitably knowledgeable
to recognise that something is amiss. If the public is not educated, activity done
outside official laboratories may be overlooked with disastrous consequences.
Moreover, lack of public education may lead to powerful stakeholders dominating
regulatory discourse in ways that are not conducive to the promotion of the good
45 See Silverberg, op. cit., ch. 8 and Epilogue.
46 For example, Presidential Commission for the Study of Bioethical Issues 2010, New Directions: The Ethics
of Synthetic Biology and Emerging Technologies, Washington, DC.
47 Biotechnology Industry Organization (BIO) 2010, Healing, Fueling, Feeding: How Biotechnology is
Enriching Your Life, Biotechnology Industry Organization, Washington, DC, pp. 23–24, 29 n. 63.
48 Carlson, op. cit.; Carlson, R. 2003, The pace and proliferation of biological technologies, Biosecurity and
Bioterrorism: Biodefense Strategy, Practice, and Science, vol. 1, no. 3, pp. 203–14.
49 DIYBio, <http://diybio.org> (viewed 27 February 2011).
52 Carlson, R. 2007, Laying the foundations for a bio-economy, Systems and Synthetic Biology, vol. 1, no.
3, pp. 109–17, <http://www.ncbi.nlm.nih.gov/pubmed/19003445>.
53 As opposed to unjustified amounts of control, which we might think are easier to effect, but which cost
a population much more.
Introduction
As dual-use ethics continues to grow as a topic of discussion, a number of
features are increasingly becoming identifiable in the discourse. While many
of these have been well discussed in a number of other volumes,1 this chapter
focuses on a little-examined characteristic: how issues relating to contextuality
in life-science research are currently addressed in dual-use ethics.
The issue of contextuality in dual-use ethics is an interesting topic for
consideration because it may simultaneously be argued that there is both too much
focus on context and too little. Those suggesting that dual-use ethics has been
predominantly context driven will point to the central role that the 'web
of prevention' rhetoric and policy development have had in the evolving
discussions on responsibility. And they would not be wrong; indeed, much
of the discourse in dual-use ethics has evolved out of control and regulatory
discussions and continues to be strongly influenced by them.
The alternative – that current approaches to dual-use ethics are largely decontextualised – is more difficult to defend (and ultimately much less popular).
As this chapter will elaborate, however, examining the contextual oversights in
dual-use ethics raises some extremely important considerations. In particular,
these considerations shed light on how and why scientists in developing
countries remain marginalised from dual-use ethics discourse and are unlikely
to gain more prominence if current approaches are continued.
In order to elaborate on this position, the chapter will start by briefly examining
dual-use ethics in light of the latter position. It will then go on to highlight
how assumptions made through this position may impact on scientists in non-Western countries. These issues are supported by fieldwork observations from
1 Such as National Research Council 2011, Challenges and Opportunities for Education about Dual-Use Issues
in the Life Sciences, The National Academies Press, Washington, DC.
web of prevention model engenders for scientists. When considering the notion
of partial responsibility, it is important to note that dual-use ethics has widely
endorsed the idea that although '[t]he misapplication of peacefully intended
research may cause moral distress among scientists … it is difficult to argue
that researchers should (solely) be held morally accountable for harm caused
by unforeseen acts of misuse'.11 Indeed, it is usually suggested that scientists
bear only a limited amount of responsibility regarding the reuse of their data, and
bioterrorist activities are thought of as beyond the responsibility of most life
scientists either to prevent or to respond to.12
This notion of partial responsibility is intimately connected to the idea of
distributed responsibility. As suggested by Ehni,13 only a mixed authority
'which is constituted by the scientific community together with governmental
bodies, but with the participation of scientists meeting their responsibilities so
far as possible, can solve the problem'. Indeed, the web of prevention model
engages a wide range of stakeholders who bear some responsibility for
addressing and controlling the dual-use potential of the life sciences, including
security, health and judicial communities. The web of prevention model has been
very influential in structuring discussions regarding dual-use controls and most
commonly includes a number of different areas of intervention, including public
health initiatives, security surveillance, biosafety and biosecurity controls, and
the education of scientists and the development of codes of conduct.14
The web of prevention model has thus been influential in promoting the idea
that scientists bear only partial responsibility for dual-use issues, and that they
cannot be expected to address the dual-use potential of their research alone.
While there has been general agreement on this, there remains considerable
discussion on how this idea of a partial responsibility may be understood.
In recent years, there have been a number of attempts to determine lists of
(conditional) duties for scientists that will clearly elucidate the expectations
placed on them regarding dual-use concerns. These, as promoted by Kuhlau et al.,15
may be summarised as follows:
the duty to prevent bioterrorism
the duty to engage in response activities
the duty to consider the negative implications of their work
11 Kuhlau, F., Eriksson, S., Evers, K. and Hoglund, A. T. 2008, Taking due care: moral obligations in dual
use research, Bioethics, vol. 22, no. 9, pp. 477–87 at p. 483.
12 Ibid., p. 477.
13 Ehni, H.-J. 2008, Dual use and the ethical responsibility of scientists, Archivum Immunologiae Et
Therapiae Experimentalis, vol. 56, pp. 147–52 at p. 151.
14 As discussed in Rappert and McLeish, op. cit.
15 Kuhlau et al., op. cit., pp. 4836.
16 Ehni proposed similar duties (Ehni, op. cit., p. 150): not to carry out a certain type of research;
systematically to anticipate dual-use applications in order to warn of dangers generated by them; to inform
public authorities about such dangers; not to disseminate results publicly, but keep dangerous scientific
knowledge secret.
17 Ibid.
18 National Research Council, 2011, op. cit.
19 There is a considerable amount of discussion about how dual-use ethics should be taught and who should
be teaching the scientists. For an extensive discussion, see Rappert, B. (ed.) 2010, Education and Ethics in the
Life Sciences, ANU E Press, Canberra.
The duties for scientists, as proposed by Kuhlau and her colleagues above, were
subjected to a number of different conditions:20
it must be within their professional responsibility
it must be within their professional capacity and ability
it must be reasonably foreseeable
it must be proportionally greater than the benefits
it must not be more easily achieved by other means.
A brief survey of these conditions shows, it must be noted, that they make no
provision for deficiencies in research environments, lack of support or problems with
carrying out the duties. Rather, these conditions seem to delineate student
scientists from principal investigators, and technicians from researchers. Thus,
these conditions provide little in the way of support for scientists working under
research conditions that may be markedly different from the Western norm.21
It therefore becomes important to question whether dual-use responsibility
duties such as those proposed above make implicit assumptions about research
environments and the implementation of webs of prevention. If the duties
discussed above are therefore re-examined in light of this, a number of key
considerations are noted. First, the duty to prevent bioterrorism, while a
laudable goal, may be seen to be largely dependent on the existence of a web of
prevention in action. Without the combined efforts of the security, health and
judicial stakeholders, and the integrated involvement of governmental and
international bodies,22 it is difficult to conceptualise how this duty could be
carried out.
Second, the duty to engage in response activities raises important questions
about the responsibility of scientists in the absence of coordinated activities.
Is it their responsibility to lobby for the establishment of response activities,
or are their obligations discharged simply because no such activities exist? Furthermore,
if scientists are obligated to engage in response activities, are they similarly
obliged to be involved in those not created in their own milieu? Are scientists,
for example, in developing countries morally obliged to actively participate in
any Western response activity, or is that in fact a form of ethical imperialism?
Third, although the duty to consider the negative implications of their work
may at the outset be seen as self-explanatory, it is vital to consider that risk
and benefit are interpreted quite differently around the world. Therefore, it is
20 Kuhlau et al., op. cit., pp. 4812.
21 As will be discussed in the following sections, these differences could be in regulatory controls, funding,
extra-laboratory service provision, governmental involvement and support, and access to the international
life-science community.
22 As proposed by Miller and Selgelid, op. cit.
23 Bezuidenhout, L. (forthcoming), Moving life science ethics debates beyond national borders: some
empirical observations.
are no structures in place for them to do so; and whether, in the face of extreme
public health crises in many countries, the risk of losing funding outweighs any
threat of terrorism and thus perceptions of risk.
These concerns all require serious consideration. They suggest not only that
achieving 'a common culture of awareness and a shared sense of responsibility'24
amongst the global scientific community may be more complicated than initially
envisioned, but also that current methods of raising dual-use awareness may
alienate – rather than incorporate – scientists from non-Western research
environments. If scientists are presented with such duties during dual-use ethics
education without accompanying discussion on the strengths and limitations of
implementing them contextually, it is just possible that such initiatives may do
more harm than good.
Such hesitations relate to another chapter in this volume, by Judi Sture, which
examines the concept of 'ethical erosion' within communities of learners. It is
possible that the presentation of idealised duties or unattainable standards
of behaviour may significantly detract from attempts to engage scientists in
discourse about dual-use responsibilities. Indeed, studies with scientists in a
number of African laboratories strongly suggest that the wholesale importing
of Western ethical approaches to teach dual use to these scientists met with
limited success.25
These issues are further complicated by the lack of capacity in most developing
countries to invest in home-grown ethics initiatives – at least for the moment.
Thus, as it stands, within developing countries ethics education often remains
largely in the hands of foreign funding agencies or interest groups. It is thus
pertinent to reiterate the question: is a lack of understanding of the structure
of research within developing countries hampering efforts to build capacity
within ethics?
However, what does need to be considered is whether the Western-centric
approach to dual-use responsibility, as presented in current ethics education,
may potentially alienate scientists by presenting them with duties they cannot fulfil.
28 As informed by the fieldwork; Kirigia, J. M., Wambebe, C. and Baba-Moussa, A. 2005, Status of
national research bioethics committees in the WHO African region, BMC Medical Ethics, vol. 6, no. 10; Fine,
J. C. 2007, Investing in STI in sub-Saharan Africa: lessons from collaborative initiatives in research and higher
education, Global Forum: Building Science, Technology and Innovation Capacity for Sustainable Growth and
Poverty Reduction, Washington, DC; and Council on Health Research for Development (COHRED) 2010, Fact
Sheet: NEPAD-COHRED Strengthening Pharmaceutical Innovation in Africa, COHRED, Johannesburg.
that there were often no answers to be had. It was my impression that presenting
these duties without a proper, contextually considered understanding of how they
may be implemented often turned the participants off the dual-use discussion in
its entirety, as it was once again perceived as 'not a problem for Africa'.
If, as is the case in dual-use ethics, the duties continue to be presented as moral
obligations, it is also easy to see how these scientists are placed in ethically
untenable positions that compromise their ethical development. Thus, structural
issues within the research environment, if unaddressed, have the potential to
undermine ethical training. Continually facing the conflict between expected
duties and the realities of the research environment may cause frustration and
resignation amongst scientists and lower the potential for them to become
involved in ethical discussions. It must therefore be asked how ethics training
can reflect these environmental issues.
Introduction
Over the past two decades there has been a revolution in the life sciences
with extremely rapid advances in genomics, synthetic biology, biotechnology,
neuroscience and the understanding of human behaviour. The speed of
progress is staggering. For example, in 1999 a special meeting of the National
Academies of Sciences and the Society for Neuroscience noted that '[t]he past
decade had delivered more advances than all previous years of neuroscience
research combined'.1 Many of these developments have great potential to
benefit humankind – in, for example, the production of more effective, safer
medicines.2 Concern has been raised, however, by a growing number of those
in the scientific and medical communities regarding the dual-use nature of
certain advances with the consequent danger of the new technologies being
misused for the development of a new range of chemical or biological weapons.
Meselson has stated that '[d]uring the century ahead, as our ability to modify
fundamental life processes continues its rapid advance, we will be able not
only to devise additional ways to destroy life, but also be able to manipulate
it – including the processes of cognition, development and inheritance'.3
And he added: 'A world in which these capabilities are widely employed for
hostile purposes would be a world in which the very nature of conflict had
radically changed. Therein could lie unprecedented opportunities for violence,
coercion, repression or subjugation.'4
1 Society for Neuroscience 1999, Neuroscience 2000: A New Era of Discovery, Symposium organised by the
Society for Neuroscience, Washington, DC, 12–13 April.
2 Andreasen, N. 2004, Brave New Brain: Conquering Mental Illness in the Era of the Genome, Oxford
University Press, USA.
3 Meselson, M. 2000, Averting the hostile exploitation of biotechnology, The CBW Conventions Bulletin,
no. 48, pp. 16–19.
4 Ibid., pp. 16–19.
Table 18.1 The biochemical threat spectrum, running from 'Poison' (classical chemical weapons) to 'Infect' (traditional biological weapons)

Classical chemical weapons | Cyanide; Phosgene; Mustard; Nerve agents
Industrial pharmaceutical chemicals | Fentanyl; Carfentanil; Remifentanil; Etorphine; Midazolam; Dexmedetomidine
Bio-regulators, peptides | Substance P; Neurokinin A
Toxins | Staphylococcal enterotoxin B (SEB)
Genetically modified biological weapons | Modified/tailored bacteria and viruses
Traditional biological weapons | Bacteria (Anthrax, Plague, Tularemia); Rickettsia; Viruses

Source: Adapted from: Pearson, G. 2002, Relevant scientific and technological developments for the first CWC Review Conference: the BTWC Review Conference experience, CWC Review Conference Paper, no. 1, University of Bradford, UK, p. 5.
As the ongoing revolution in the life sciences has proceeded, the boundary
between chemistry and biology, and consequently the distinction between
certain chemical and biological weapons, has become increasingly blurred.
Rather than thinking of chemical and biological weapons threats as distinct,
certain analysts including Aas,5 Dando,6 Davison7 and Pearson8 believe it is more
useful to conceptualise them as lying along a continuous biochemical threat
spectrum. This chapter will focus upon research and development of those mid-spectrum agents (pharmaceutical chemicals, bio-regulators and toxins featured
in Table 18.1) that some consider to have potential utility as incapacitating
weapons (incapacitants). The chapter will employ a holistic arms control (HAC)
approach to examine the potential dangers and proposed utility of such agents
and explore the obligations and opportunities for the life-science community to
ensure that such agents are not utilised for hostile purposes.
Incapacitants: A primer
Although certain states and multilateral organisations such as the North Atlantic
Treaty Organisation (NATO)9 have sought to characterise incapacitants, there is
currently no internationally accepted definition for these chemical agents. Indeed,
certain leading international experts believe that such a technical definition is
not possible.10 Whilst recognising the contested nature of this discourse, as a
provisional working description, they can be considered as substances whose
chemical action on specific biochemical processes and physiological systems,
especially those affecting the higher regulatory activity of the central nervous
5 Aas, P. 2003, The threat of mid-spectrum chemical warfare agents, Prehospital and Disaster Medicine, vol.
18, no. 4, pp. 306–12.
6 Dando, M. 2007, Scientific outlook for the development of incapacitants, in A. Pearson, M. Chevrier and
M. Wheelis (eds), Incapacitating Biochemical Weapons, Lexington Books, Lanham, Md, p. 125.
7 Davison, N. 2009, Non-Lethal Weapons, Palgrave Macmillan, Basingstoke, UK, pp. 106–7.
8 Pearson, op. cit.
9 NATO defines an incapacitant as 'a chemical agent which produces temporary disabling conditions which
(unlike those caused by riot control agents) can be physical or mental and persist for hours or days after
exposure to the agent has ceased. Medical treatment, while not usually required, facilitates a more rapid
recovery.' North Atlantic Treaty Organisation (NATO) 2000, Glossary of Terms and Definitions (AAP-6 (V),
Modified version 02), 7 August 2000.
10 A report of an expert meeting organised by Spiez Laboratory concluded that because 'there is no clear-cut line between (non-lethal) ICA [incapacitating chemical agents] and more lethal chemical warfare agents,
a scientifically meaningful definition cannot easily be made. One can describe several toxicological effects
that could be used to incapacitate, but in principle there is no way to draw a line between ICAs and lethal
agents.' See Mogl, S. (ed.) 2011, Technical Workshop on Incapacitating Chemical Agents, Spiez, Switzerland,
8–9 September; Spiez Laboratory 2012, op. cit., p. 10; The Royal Society 2012, Brain Waves Module 3:
Neuroscience, Conflict and Security, Science Policy Centre, The Royal Society, London, pp. 44–5; Organisation
for the Prohibition of Chemical Weapons (OPCW) 2013, Report of the Scientific Advisory Board on Developments
in Science and Technology for the Third Special Session of the Conference of the States Parties to the Chemical
Weapons Convention, Third Review Conference RC-3/DG.1, 8–19 April 2013, 29 October 2013, p. 4.
295
11 Adapted from Pearson et al., op. cit., p. xii. Incapacitants have also been called advanced riot-control
agents, biochemical agents, biotechnical agents, calmatives, incapacitating biochemical weapons and
immobilising agents.
12 See, for example: Lakoski, J., Bosseau Murray, W. and Kenny, J. 2000, The Advantages and Limitations of
Calmatives for Use As A Non-Lethal Technique, College of Medicine Applied Research Laboratory, Pennsylvania
State University.
13 See Aas, op. cit., p. 309.
14 Lakoski et al., op. cit., p. 48.
15 Ibid., p. 48.
Figure 18.2 Indicative Drug Classes and Agents with Potential Utility as Incapacitants

Drug class | Selected compounds | Site of action
Benzodiazepines | Diazepam; Midazolam; Etizolam; Flumazenil (antagonist) | GABA receptors
Alpha2 adrenergic receptor agonists | Clonidine; Dexmedetomidine; Fluparoxan (antagonist) | Alpha2 adrenergic receptors
Dopamine D3 receptor agonists | Pramipexole; Cl-1007; PD 128907 | D3 receptors
Selective serotonin reuptake inhibitors | Fluoxetine; Sertraline; Paroxetine; WO-09500194 | 5-HT transporter
Serotonin 5-HT1A receptor agonists | Buspirone; Lesopitron; Alnespirone; MCK-242; Oleamide; WAY-100,635 | 5-HT1A receptor
Opioid receptor agonists | Morphine; Carfentanil; Naloxone (antagonist) | Mu opioid receptor
Neurolept anaesthetics | Propofol; Phencyclidines | GABA receptors; opioid receptors
Corticotrophin-releasing factor receptor antagonists | CP 154,526 (antagonist) | CRF receptor
Cholecystokinin B receptor antagonists | CCK-4; Cl-988 (antagonist); Cl-1015 (antagonist) | CCKB receptor

Source: Adapted from Lakoski, J., Bosseau Murray, W. and Kenny, J. 2000, The Advantages and Limitations of Calmatives for Use As A Non-Lethal Technique, College of Medicine Applied Research Laboratory, Pennsylvania State University, pp. 15–16.
There is a long history, dating from the late 1940s, of certain state programs
attempting to develop incapacitant weapons employing a range of pharmaceutical
chemicals or toxins.16 Analysis of open-source information from the mid-1990s
onwards indicates that a number of states including China,17 the Czech
Republic,18 Russia19 and the United States20 appear to have conducted research
relating to incapacitants and/or possible means of delivery at some stage during
this period. It is, however, difficult to establish the current situation, and certain
states that have previously shown an interest in developing such agents – such
as the United States – have recently declared that no such activities currently
take place.21
According to the International Committee of the Red Cross (ICRC): 'There is
clearly an ongoing attraction to incapacitating chemical agents but it is not
easy to determine the extent to which this has moved along the spectrum from
academia and industrial circles into the law enforcement, security and military
apparatuses of states.'22
16 See, for example: Crowley, M. 2009, Dangerous Ambiguities: Regulation of Riot Control Agents and
Incapacitants under the Chemical Weapons Convention, University of Bradford, UK; Dando, M. and Furmanski,
M. 2006, Midspectrum incapacitant programs, in M. Wheelis, L. Rózsa and M. Dando (eds), Deadly Cultures:
Biological Weapons Since 1945, Harvard University Press, Cambridge, Mass.; Davison, op. cit.; Furmanski,
M. 2007, Historical military interest in low-lethality biochemical agents, in Pearson et al., op. cit.; Pearson,
A. 2006, Incapacitating biochemical weapons: science, technology, and policy for the 21st century,
Nonproliferation Review, vol. 13, no. 2.
17 Crowley, op. cit., p. 82; Guo Ji-Wei and Xue-sen Yang 2005, Ultramicro, nonlethal and reversible: looking
ahead to military biotechnology, Military Review, July–August, as cited in Pearson, A. 2007, Late and post-Cold War research and development of incapacitating biochemical weapons, in Pearson et al., op. cit.
18 Hess, L., Schreiberová, J., Málek, J. and Fusek, J. 2007, Drug-induced loss of aggressiveness in the
macaque rhesus, Proceedings of 4th European Symposium on Non-Lethal Weapons, 21st–23rd May 2007,
Ettlingen, Germany, European Working Group on Non-Lethal Weapons, Pfinztal: Fraunhofer ICT, V15; Hess,
L., Schreiberová, J. and Fusek, J. 2005, Pharmacological non-lethal weapons, Proceedings of the 3rd European
Symposium on Non-Lethal Weapons, 10th–12th May 2005, Ettlingen, Germany, European Working Group on
Non-Lethal Weapons, Pfinztal: Fraunhofer ICT, V23; Davison, N. and Lewer, N. 2006, Bradford Non-Lethal
Weapons Research Project (BNLWRP) – Research Report No. 8, University of Bradford, UK, p. 50.
19 Klochikin, V., Pirumov, V., Putilov, A. and Selivanov, V. 2003, The complex forecast of perspectives of
NLW for European application, Proceedings of the 2nd European Symposium on Non-Lethal Weapons, 13th–14th
May 2003, Ettlingen, Germany, European Working Group on Non-Lethal Weapons, Pfinztal: Fraunhofer ICT;
Klochinkhin, V., Lushnikov, A., Zagaynov, V., Putilov, A., Selivanov, V. and Zatekvakhin, M. 2005, Principles
of modelling of the scenario of calmative application in a building with deterred hostages, Proceedings of
the 3rd European Symposium on Non-Lethal Weapons, 10th–12th May 2005, Ettlingen, Germany, European
Working Group on Non-Lethal Weapons, Pfinztal: Fraunhofer ICT.
20 Crowley, op. cit., pp. 76–8; Davison, op. cit., pp. 105–42; Furmanski, op. cit.; Pearson, 2007, op. cit.;
Furmanski and Dando, op. cit.
21 Organisation for the Prohibition of Chemical Weapons (OPCW), Executive Council 2013, Statement by
Ambassador Robert P. Mikulak, United States Delegation to the OPCW at the Seventy-Second Session of
the Executive Council, Seventy-Second Session EC-72/NAT.8, 6–7 May 2013, 6 May 2013. In his statement,
Ambassador Mikulak declared '[i]n this context, I also wish very clearly and directly to reconfirm that the
United States is not developing, producing, stockpiling, or using incapacitating chemical agents.'
22 International Committee of the Red Cross (ICRC) 2010, Expert Meeting: Incapacitating Chemical Agents,
Implications for International Law, Montreux, Switzerland, 24–26 March, p. 3.
a According to Davison and Lewer, research to develop sedative and anaesthetic agent combinations
for use as weapons had been funded by the Czech Army under Project No: MO 03021100007. See
Davison and Lewer, op. cit., p. 50.
Ibid., p. 7.
f Hess, L., Votava, M., Schreiberová, J., Málek, J. and Horáček, M. 2010, Experience with a
naphthylmedetomidine-ketamine-hyaluronidase combination in inducing immobilization in
anthropoid apes, Journal of Medical Primatology, vol. 39, no. 3 (June), pp. 151–9.
technologies by US special operations forces in the global war on terrorism, Maxwell Paper, no. 37, Air
University Press, Maxwell Air Force Base, Ala. It should be noted that other authors have questioned the
utility of incapacitants in certain proposed scenarios such as premeditated hostage situations, due to the
availability of countermeasures. See Wheelis, M. 2007, Non-consensual manipulation of human physiology
using biochemicals, in Pearson et al., op. cit., p. 6.
24 Klotz, L., Furmanski, M. and Wheelis, M. 2003, Beware the siren's song: why non-lethal incapacitating
chemical agents are lethal, Federation of American Scientists Paper, <http://www.fas.org/bwc/papers/sirens_song.pdf> (viewed 8 February 2012).
25 Ibid., p. 7.
26 Pearson, 2007, op. cit., p. 70.
27 British Medical Association (BMA) 2007, The Use of Drugs As Weapons: The Concerns and Responsibilities
of Healthcare Professionals, BMA, London, p. 1.
28 Klotz et al., op. cit., p. 1.
For descriptions of the incident, see, for example: Koplow, op. cit.; Pearson et al., op. cit.; British
Broadcasting Corporation (BBC) 2004, Horizon: The Moscow Theatre Siege, BBC 2, 15 January, <http://
www.bbc.co.uk/science/horizon/2004/moscowtheatretrans.shtml> (viewed 30 July 2009); Amnesty
International 2003, Rough Justice: The Law and Human Rights in the Russian Federation, October,
Amnesty International, London, AI Index EUR 46/054/2003.
Specifically, the prohibition against attacking those recognised as hors de combat. See: Henckaerts, J.
2005, Study on customary international humanitarian law: a contribution to the understanding and
respect for the rule of law in armed conflict, International Review of the Red Cross, vol. 87, no. 857, p.
203.
Wheelis, M. 2010, Human impact of incapacitating chemical agents, in ICRC, op. cit.; Levin, D. and
Selivanov, V. 2009, Medical and biological issues of NLW development and application, Proceedings
of the 5th European Symposium on Non-Lethal Weapons, 11th–13th May 2009, Ettlingen, Germany,
European Working Group on Non-Lethal Weapons, Pfinztal: Fraunhofer ICT, V23, p. 7.
See, for example: Human Rights Watch 2002, Independent commission of inquiry must investigate
raid on Moscow theater: inadequate protection for consequences of gas violates obligation to protect
life, Press release, 30 October 2002, Human Rights Watch.
Russian experts discuss use of fentanyl in hostage crisis, ITAR-TASS, [from Moscow in English], 2112
hrs GMT, 30 October 2002, FBIS-SOV-2002-1030, as cited by Perry Robinson, October 2007, op. cit.
Alison, S. 2002, Russian confirms siege gas based on opiate fentanyl, [from Moscow for Reuters],
1257 hrs ET, 30 October 2002, as cited in Perry Robinson, 2007, op. cit.
Riches, J., Read, R., Black, R., Cooper, N. and Timperley, C. 2012, Analysis of clothing and urine
from Moscow theatre siege casualties reveals carfentanil and remifentanil use, Journal of Analytical
Toxicology, vol. 36, pp. 647–56.
to lethal force, but as a means to make lethal force more deadly. During the
October 2002 Moscow theatre siege, those Chechen hostage takers who were
rendered unconscious by the incapacitant were then reportedly shot where
they lay by Russian forces rather than being arrested.34
Facilitation of torture and other human rights violations: As well as potentially
being utilised for torture and ill treatment of individuals, incapacitants could
also facilitate repression of groups by, for example, allowing the capture, en
masse, of large numbers of people participating in peaceful demonstrations.
Militarisation of biology: Analysts35 have warned that the continuing
utilisation of the life sciences in the development of incapacitants could
potentially open the way to more malign objectives, such as the widespread
repression of entire populations.
Camouflage for lethal chemical weapons programs: States could exploit the
limited transparency mechanisms required under the Chemical Weapons
Convention (CWC) for incapacitants and other toxic chemicals designated
for use in law enforcement in order to hide illicit activities.36
Confusion between lethal and nonlethal chemical weapons: A state using
a nonlethal incapacitant during an armed conflict may be perceived by
another party as having used a lethal chemical weapon and thus initiate an
escalating cycle of retaliation leading to actual use of lethal chemical agents.37
47 This chapter does not, therefore, explore a range of parallel processes that can potentially play important
roles in preventing or ameliorating the effects of incapacitant weapons attack, including broadband defence
measures, strengthening public health surveillance and response, and so on. For further discussion of these
issues, see, for example, Pearson, 1998, op. cit.; Pearson, 2001, op. cit.; McLeish and Rappert, op. cit.
recognises that states are the prime actors in existing regulatory regimes,
and allows for and encourages participation by the full range of relevant
stakeholders.
Applying this approach to the case of incapacitants, a HAC regime can be
envisioned comprising:

state-led activities
• adherence to comprehensive legal prohibitions against chemical and biological weapons (CBW) enshrined in the Geneva Protocol, the Biological and Toxin Weapons Convention (BTWC) and the Chemical Weapons Convention (CWC)
• adherence to international humanitarian law (notably, the four Geneva conventions and two additional protocols) and international human rights law (including the Convention against Torture, the International Covenant on Civil and Political Rights and the Universal Declaration on Human Rights)
• adherence to other relevant international law and agreements, including the Rome Statute of the International Criminal Court, the Single Convention on Narcotic Drugs and the UN Convention on Psychotropic Substances
• effective monitoring, verification, investigation and enforcement of the above obligations
• application of stringent export controls and interdiction measures.

engagement by civil society
• conducting societal monitoring and verification
• developing a culture of responsibility amongst the scientific and medical communities built upon strong normative and ethical standards
• developing and advocating mechanisms to strengthen the regime.
Whilst certain authors have previously examined the application of a range of
state-led mechanisms for the regulation of incapacitants,48 the importance of the
life-science community's engagement in this issue has been underexplored. In
the following sections, the author will, therefore, briefly examine the potential
application of the two most pertinent arms-control regimes – the Chemical
Weapons Convention and the Biological and Toxin Weapons Convention –
before concentrating upon the potential roles that civil society, and particularly
48 See, for example: Crowley, op. cit.; Casey-Maslen, S. 2010, Non-Kinetic-Energy Weapons Termed Non-Lethal: A Preliminary Assessment under International Humanitarian Law and International Human Rights
Law, Geneva Academy of International Humanitarian Law and Human Rights, Geneva; ICRC, op. cit.;
International Committee of the Red Cross (ICRC) 2012, Toxic Chemicals As Weapons for Law Enforcement:
A Threat to Life and International Law? Synthesis paper, September, International Committee of the Red
Cross, Geneva; International Committee of the Red Cross (ICRC) 2013, Incapacitating Chemical Agents: Law
Enforcement, Human Rights Law and Policy Perspectives, Expert Meeting, Montreux, Switzerland, 24–26 April
2012, ICRC, Geneva, 13 January 2013.
the scientific and medical communities, can play in preventing the misuse of
biomedical research for hostile purposes, most notably, the development of
incapacitating weapons.
51 Analysis was undertaken of all relevant documents pertaining to this issue publicly available up to 10
September 2013.
52 Although there is no equivalent of an OPCW for the BTWC, the Sixth BTWC Review Conference
decided to create and fund a (three-person) Implementation Support Unit (ISU) within the UN Office for
Disarmament Affairs (UNODA) of the UN Office at Geneva. The ISU was launched in August 2007 and its
mandate was renewed and extended by the Seventh BTWC Review Conference to run until 2016. The ISU
provides administrative support to, and prepares documentation for, meetings agreed by the BTWC Review
Conference. The ISU also facilitates communication among states parties, international organisations, and
scientific and academic institutions, as well as NGOs. It also acts as a focal point for submission of information
by and to states parties, and will support, as appropriate, the implementation by the states parties of the
decisions and recommendations of the Sixth and Seventh BTWC Review Conferences. The ISU, however, has
no authority to undertake verification or compliance activities.
53 Convention on the Prohibition of the Development, Production, Stockpiling and Use of Chemical Weapons and
on their Destruction (Chemical Weapons Convention), 1993, art. I.1.
54 Ibid., art. I.3.
55 Ibid., art. I.4.
has been a collective failure by the CWC states parties and policymaking organs
to effectively address the regulation of incapacitants under the convention. It
is therefore left to individual states parties to interpret the scope and nature of
their obligations with regard to the regulation of such agents.
Box 18.3 The ICRC and Incapacitants: A call to action
A range of civil society organisations and researchers has urged the international governmental
community – particularly the OPCW and the CWC states parties – to take action to address
the dangers of the potential proliferation and inappropriate use of incapacitants.a The ICRC,
however, has noted that '[a]lthough there have been exhortations by some States and by
some in academic circles to address these issues, there has been little or no movement to
date in the relevant multilateral fora'.b Consequently, in March 2010, the ICRC convened the
first of two expert meetings to explore the implications of these issues for international law.
The meeting brought together a group of 33 government and independent experts who were
joined by ICRC staff members.c For details of the second ICRC meeting, see: ICRC, 2013,
op. cit. In the subsequent report of the meeting, the ICRC urged '[s]tates to give greater
attention to the implications for international law of incapacitant chemical agents'. The
organisation also noted that '[t]here is currently an opportunity to address preventatively
the challenges and risks identified'd (emphasis added). In addition, the report concluded that
'[t]here is a clear need to tackle the issues raised by incapacitating chemical agents in
appropriate fora engaging a broad range of experts including policy makers, law-enforcement
professionals, security personnel, military personnel, health professionals, scientists and
lawyers with IHL [international humanitarian law], human rights and disarmament expertise'e
(emphases added).
See, for example: Crowley, op. cit., esp. pp. 57–101, 117–19; Pearson et al., 2007, op. cit.; Perry
Robinson, 2007 and October 2007, op. cit.
Ibid., p. 3.
Ibid., p. 75.
in terms of the countries from which they can operate and consequently the
quantity and quality of information they are able to receive, particularly from
inaccessible regions and closed or semi-closed authoritarian countries.
Hay, A., Giacaman, R., Sansur, R. and Rose, S. 2006, Skin injuries caused by new riot control agent
used against civilians on the West Bank, Medicine, Conflict and Survival, vol. 22, no. 4 (October–December).
Ibid.
The remaining sections of this chapter will explore the current range of
initiatives being undertaken by those in the scientific and medical communities
to nurture a culture of responsibility, beginning with the growing recognition
of the dual-use dilemma and the consequent requirement for effective oversight
of research. This will then be followed by a discussion of the potential utility of
oaths, codes and pledges and the parallel processes of education and awareness-raising in building the appropriate norms of behaviour for the scientific and
biomedical communities. The practical application of such principles by
individual scientists through such practices as whistleblowing will be explored
as well as the duty of individual scientists to inform the policies and practices
of governments in this area.
Research oversight and the dual-use dilemma
To date, much of the discourse amongst the life-science community concerning
how best to combat the proliferation and misuse of chemical and biological
weapons has concentrated on tackling the dual-use dilemma and has been
framed in terms of regulating the actions of individual life scientists conducting
academic research projects and publishing academic articles. Highly
influential in this discourse have been the 2003 Fink Report (Biotechnology
Research in An Age of Terrorism)70 and the 2006 Lemon Report (Globalisation,
Biosecurity and the Future of the Life Sciences), both produced under the auspices
of the National Research Council of the US National Academies. Both reports
highlighted the importance of taking a comprehensive approach to analysing
dual-use research of potential concern. The Lemon Report recommended
the adoption of a broadened awareness of threats beyond the classical select
agents and other pathogenic organisms and toxins, so as to include, for
example, approaches for disrupting host homeostatic and defense systems and
for creating synthetic organisms.71 The broad threat spectrums enunciated
by both reports, particularly that of the Lemon Committee, appear to capture
incapacitants within their scope.
As a result of the concerns and recommendations outlined in the Fink and Lemon
reports and the work of others,72 a range of oversight structures and processes
70 National Research Council 2004, Biotechnology Research in an Age of Terrorism, [Fink Report], The
National Academies Press, Washington, DC, p. 114.
71 National Research Council 2006, Globalisation, Biosecurity and the Future of the Life Sciences, [Lemon
Report], The National Academies Press, Washington, DC, p. 216.
72 A range of analysts has subsequently highlighted the need to address further potential actors of concern
beyond individual life-science researchers. Garfinkel et al., for example, defined three major points for
potential policy intervention – namely: commercial firms that sell synthetic DNA (oligonucleotides, genes or
genomes) to users; owners of laboratory bench-top DNA synthesizers, with which users can produce their
own DNA; the users (consumers) of synthetic DNA themselves and the institutions that support and oversee
their work. See: Garfinkel, M., Endy, D., Epstein, G. and Friedman, R. 2007, Synthetic Genomics: Options for
Governance, J. Craig Venter Institute, Rockville, Md, and San Diego, Calif.
such codes have been undertaken by a wide range of scientific associations and
organisations including the American Society for Microbiology,85 the National
Academy of Sciences, The Royal Society,86 the International Centre for Genetic
Engineering and Biotechnology, the International Union of Biochemistry and
Molecular Biology and the International Council for the Life Sciences. These
activities have been complemented and stimulated by the ICRC as well as the
work of individual scientists87 and academics.88
Such codes may well help to sensitise life scientists to the dangers of dual-use
research, and reinforce the importance of, and promulgate, ethical red lines
where the legal prohibitions or normative taboos are already clearly defined and
widely accepted. The effectiveness of such an approach, to date, has, however,
been questioned by a range of scholars including Corneliussen, Dando, Perry
Robinson and Rappert.89 One important limitation of the majority of such
initiatives is that the resulting codes are aspirational and non-binding in nature,
with no clearly identified penalties for those individuals who breach
the prohibitions and no mechanisms established to monitor and enforce them.
Recognising the limitations of self-governance initiatives to effect change
in this area, some have called for codes of conduct to become binding, with
those breaching such codes facing sanction from their peers (or the state). For
example, Rotblat, in a letter to a Pugwash Workshop on Science, Ethics and
Society in 2004, stated his belief that:
[S]ome believe that the search for knowledge overrides all other
considerations and that scientists should be entitled to ignore the
ethical elements of their work … The harm to society that has resulted
from such attitudes has brought science into disrepute, and action is
needed to restore the proper image of science. The introduction of a
Hippocratic oath is our example of such action, but it should perhaps
be given more than a symbolic value. Perhaps the time has come for a
85 American Society for Microbiology (ASM) 2005, Code of Ethics (Revised and approved by ASM Council in
2005), American Society for Microbiology, Washington, DC.
86 The Royal Society 2005, The Roles of Codes of Conduct in Preventing the Misuse of Scientific Research, The
Royal Society, London.
87 See Atlas, R. and Somerville, M. 2007, Life sciences or death science: tipping the balance towards life
with ethics, codes and laws, in McLeish and Rappert, op. cit.
88 See, for example, Revill, J. and Dando, M. 2006, A Hippocratic oath for life scientists, EMBO Reports,
vol. 7.
89 Rappert, B. 2010, Biological Weapons and Codes of Conduct, <http://projects.exeter.ac.uk/codesofconduct/
Chronology/index.htm> (viewed 22 December 2011).
binding code of conduct, where only those who abide by the code should be
entitled to be practicing scientists, something which applies now to medical
practice [emphasis added].90
Examination of the literature, however, reveals that no such binding codes of
conduct prohibiting research, development or utilisation of incapacitating agents
have been established by any national or international scientific organisation to
date.
Furthermore, Rappert has noted that if codes are to go beyond reiterating
platitudes about the abhorrence of using modern biology toward malign ends,
then they are likely to confront major issues of controversy. For instance,
codes could comment on the acceptability of disputed attempts to develop
non-lethal incapacitating agents.91 Areas of dispute or controversy, such as
incapacitants, are the ones, however, where such codes remain silent or at best
provide ambiguous guidance. Similarly, as with dual-use research oversight,
whilst numerous codes condemn and seek to prevent the involvement of
scientists in the development of biological and chemical weapons by non-state
actors, it is questionable whether enough energy has been devoted to targeting
the involvement of life scientists in state-run weapons programs.
Corneliussen has asked
why voluntary self-governance regimes – and codes of conduct in
particular – are being given so much attention in policy discussions
about preventing the misuse of biological research when they appear to
have significant shortcomings in practice. Indeed, why have individual
scientists become the target of the policy discussions when it is generally
accepted within the disarmament community that the greatest risk of
misuse is at the level of national biological weapons programmes …
Preventing these state-level programmes in the future should therefore
be a primary concern, rather than implementing codes of conduct for
life scientists.92
Corneliussen further contends that:
[T]he current sole focus on codes, and the extensive investment of
resources that accompanies it, might well serve to detract from other more
90 Rotblat, J. 2004, Letter to workshop, in Report of 2nd Pugwash Workshop: Science, Ethics and Society,
Ajaccio, Corsica, 10–12 September 2004, <http://www.pugwash.org/reports/ees/corsica2004/corsica2004.htm#attach2> (viewed 22 December 2011).
91 Rappert, B. 2005, Biological weapons and the life sciences: the potential for professional codes,
Disarmament Forum. Volume 1: Science Technology and the CBW Regimes, p. 56.
92 Corneliussen, F. 2006, Adequate regulation, a stop-gap measure, or part of a package? Debates on codes
of conduct for scientists could be diverting attention away from more serious questions, EMBO Reports, vol.
7 (Special Issue), p. 53.
crucial regulatory measures that target not only individual scientists but
also state programmes. Without this plurality of regulatory measures in
place, codes of conduct are doomed to fail.93
More recently there have been some indications of an increasing awareness
amongst certain life-science communities of the dangers of cooption into state
programs and initiatives undertaken to address these dangers (see Box 18.5).
Box 18.5 A Pledge for Neuroscientists
In January 2010, Dr Curtis Bell, senior scientist emeritus at the Oregon Health and Science
University, circulated a pledge intended to foster opposition amongst neuroscientists to
the application of neuroscience to 'torture and other forms of coercive interrogation or
manipulation that violate human rights and personhood'.a According to Bell, such applications
could include 'drugs that cause excessive pain, anxiety, or trust, and manipulations such as
brain stimulation or inactivation'.b Furthermore, signatories would oppose the application of
neuroscience to aggressive war illegal under international law. The pledge states that 'a
government which engages in aggressive wars should not be provided with tools to engage
more effectively in such wars. Neuroscience can and does provide such tools. Examples
include drugs which damage the effectiveness of soldiers on the other side.'c
Under the pledge, neuroscientists commit to making themselves aware of the potential misuse
of neuroscience for 'violations of basic human rights or international law such as torture
and aggressive war' and commit to refusing to knowingly participate in 'the application of
Neuroscience to violations of basic human rights or international law'.d Bell acknowledges
that '[s]igning this pledge will not stop aggressive wars or human rights violations, or
even the use of neuroscience for these purposes'; however, he believes that by signing,
neuroscientists will help make such applications less acceptable.e
Ibid.
Ibid.
Bell, C. 2010c, Neurons for peace: take the pledge, brain scientists, New Scientist, no. 2746
(8 February).
Despite the energy and resources expended upon the development and
promotion of codes, however, Rappert's stark conclusion is that efforts to
devise meaningful codes have largely floundered. Rappert believes that '[i]n
no small part, this has been due to the lack of prior awareness and attention by
researchers as well as science organisations to the destructive applications of the
life sciences. Before codes can help teach, education is needed.'94
93 Ibid., p. 54.
94 Rappert, 2010, op. cit., p. 14.
95 Dando, M. and Rappert, B. 2005, Codes of Conduct for the Life Sciences: Some Insights from UK Academia,
Bradford Briefing Paper No. 16 (2nd series), University of Bradford, UK, p. 23.
96 Rappert, B., Chevrier, M. and Dando, M. 2006, In-Depth Implementation of the BTWC: Education and
Outreach, Report presented at Sixth BTWC Review Conference, University of Bradford, UK, para. 89.
97 Whitby, S. and Dando, M. 2010, Biosecurity awareness-raising and education for life scientists: what
should be done now? in B. Rappert (ed.), Education and Ethics in the Life Sciences: Strengthening the Prohibition
of Biological Weapons, ANU E Press, Canberra. For a detailed discussion, see: Rappert, B. 2007, Biotechnology,
Security and the Search for Limits: An Inquiry into Research and Methods, Palgrave, Basingstoke, UK; Rappert,
B. 2009, Experimental Secrets: International Security, Codes, and the Future of Research, University Press of
America, Lanham, Md.
98 Dando, M. 2009, Biologists napping while work militarized, Nature, vol. 460, no. 7258, p. 951. Dando's
concerns are echoed in an accompanying Nature editorial entitled 'A question of control: scientists must
address the ethics of using neuroactive compounds to quash domestic crises'.
104 See, for example: Maclean, A. 2006, Historical Survey of the Porton Down Volunteer Programme, Ministry
of Defence, London, <http://www.mod.uk/NR/rdonlyres/7211B28A-F5CB-4803-AAAC-2D34F3DBD961/0/part_iv.pdf> (viewed 17 August 2011), pp. 109–42; Ketchum, J. 2006, Chemical Warfare: Secrets Almost
Forgotten, James S. Ketchum (self-published); Gould, C. and Folb, P. 2002, Project Coast: Apartheid's Chemical
and Biological Warfare Programme, United Nations Institute for Disarmament Research, Geneva.
105 Hess et al., 2005, op. cit.
106 Gross, M. 2010, Medicalized weapons and modern war, Hastings Center Report, vol. 40, no. 1, pp. 34–5.
107 World Medical Association (WMA) 1956, Regulations in Times of Armed Conflict, Adopted by the 10th
World Medical Assembly, Havana, Cuba, October 1956, last amended by the 35th World Medical Assembly,
Tokyo, Japan, 2004, and editorially revised at the 173rd Council Session, Divonne-les-Bains, France, May
2006, para. 2.
in torture, ill treatment and other forms of human rights abuse.108 The WMA
has also sought to develop ethical guidelines prohibiting medical involvement
in the development of chemical or biological weapons.109
In addition, guidance has also been developed by certain national medical
associations and other medical bodies110 on the ethical considerations
surrounding health professionals' involvement in specific weapons research
or weapons development more generally. A number of such bodies have also
established a range of mechanisms and structures to implement ethical standards,
including ethics boards that have the authority to suspend or disbar physicians
from practising medicine in cases of extreme misconduct.
There are thus ethical frameworks and mechanisms in place that describe and
regulate the duty of health professionals to abide by and promote aspects of
human rights and international humanitarian law, and specifically prohibit
engagement in acts such as torture or the development of biological and
toxin weapons. Whilst these and other ethical standards – particularly those
concerned with medical research involving human subjects111 – can in theory be
applied to the development and utilisation of incapacitants, this does not appear
to have occurred in a consistent manner to date. There are no widely accepted
guidelines specifically determining the permissibility or non-permissibility of
physician involvement in the development, testing or utilisation of so-called
nonlethal weapons in general and incapacitants in particular. Indeed the issue
appears, at present, to be both underexplored and contentious, with a spectrum
of opinions held by health professionals and medical ethicists.
Amongst national medical associations, it is the British Medical Association
(BMA) that has taken the lead in the development of ethical guidance for
the health community on the issue of nonlethal chemical weapons, and in
108 World Medical Association (WMA) 1975, Declaration of Tokyo, Adopted by the 29th World Medical
Assembly, Tokyo, Japan, October 1975, and editorially revised at the 170th Council Session, Divonne-les-Bains, France, May 2005, and the 173rd Council Session, Divonne-les-Bains, France, May 2006.
109 World Medical Association (WMA) 1990, Declaration on Chemical and Biological Weapons, Adopted by
the 42nd World Medical Assembly, Rancho Mirage, Calif., October 1990, and rescinded at the WMA General
Assembly, Santiago, 2005; World Medical Association (WMA) n.d., Washington Declaration on Biological
Weapons, paras 18 and 19.
110 For example, the US Army textbook of military ethics urges medical professionals to stay in the
business of healing and not hurting, which includes not participating in or contributing to weapons research
and development. Frisina, M. E. 2003, Medical ethics in military biomedical research, in T. E. Beam and L. R.
Sparacino (eds), Military Medical Ethics, vol. 2, Office of the Surgeon General/Borden Institute, Washington,
DC, pp. 533–61, as cited in Gross, op. cit., p. 37.
111 See, for example: World Medical Association (WMA) 1964, Declaration of Helsinki, Adopted by the
18th WMA General Assembly, Helsinki, June 1964, and last amended at the 59th WMA General Assembly,
Seoul, October 2008, <http://www.wma.net/en/30publications/10policies/b3/index.html> (viewed 22 July
2011); Nuremberg Code, 1949, as detailed in Trials of War Criminals before the Nuremberg Military Tribunals
under Control Council Law No. 10, vol. 2, pp. 181–2, US Government Printing Office, Washington, DC, 1949,
available from: Office of Human Subjects Research, US National Institutes of Health, <http://ohsr.od.nih.gov/guidelines/nuremberg.html> (viewed 22 July 2011).
on their statute books, the effectiveness of such legislation and its enforcement
are variable.128 Furthermore, Martin, who has long experience of working with,
and seeking to protect, whistleblowers in many different spheres, believes that
the track record of whistle-blower protection measures – whistle-blower laws,
hot-lines, ombudsmen and the like – is abysmal. In many cases, these formal
processes give only an illusion of protection. Codes of ethics seem similarly
impotent in the face of the problems.129
It is important that independent scientists, health professionals and professional
bodies – in cooperation with human rights, civil liberties and whistleblowing
organisations – promote the establishment of truly effective mechanisms
under international and domestic law that provide legal protection against
discrimination and criminal prosecution for whistleblowers. Furthermore, given
the failings of the current systems of whistleblower protection, the life-science
and biomedical communities have a duty to support those individuals who
refuse to participate in what they consider immoral research and development
projects, and those who blow the whistle on such activities. A number of
scientific associations and professional bodies have mechanisms for promoting
ethical standards amongst their members, which can also be utilised to support
colleagues facing reprisals for acting ethically.130
Conclusion
In its public statement 'Preventing hostile use of the life sciences: from ethics
and law to best practice', the ICRC has noted that '[a]dvances in the life
sciences hold great promise for humanity', but that there is also great risk if
these same advances are put to hostile use.131 Whether the life sciences will
be employed for the further development and proliferation of incapacitant
weapons with the consequent dangers of the misuse of such weapons for
internal repression or offensive military operations will depend to a significant
degree on the actions of individual scientists and biomedical researchers and on
the role of the life-science community as a whole. At a minimum this must entail
stringent personal and community reflection and regulation of the trajectories
128 See, for example: Calland, R. and Dehn, G. (eds) 2004, Whistleblowing around the World, Open
Democracy Advice Centre, Cape Town, & Public Concern at Work, London; Deiseroth, op. cit.; Devine, T.
1997, The Whistleblower's Survival Guide, Fund for Constitutional Government, Washington, DC; Devine,
T. 2004, Whistleblowing in the United States: the gap between vision and lessons learned, in Calland and
Dehn, op. cit.; Martin, B. 1999a, Suppression of dissent in science, Research in Social Problems and Public
Policy, vol. 7, pp. 105–35; Martin, B. 1999b, The Whistleblower's Handbook, Jon Carpenter, Charlbury, UK.
129 Martin, B. 2007, Whistle-blowing: risks and skills, in McLeish and Rappert, op. cit., p. 6.
130 For example, the AAAS Science and Human Rights Program (SHRP), which works with scientists to
advance science and serve society through human rights, has successfully campaigned on behalf of a number
of scientific whistleblowers. For further information, see: <http://shr.aaas.org/>.
131 ICRC, 2004, op. cit.
19. WHO Project on Responsible Life-Sciences Research for Global Health Security

Introduction
There has been much discussion over the past years about global health security
and how to strengthen it. One area that has generated much activity concerns
the risks posed by accidents and the potential deliberate misuse of life-sciences
research.2 Different actors have proposed a variety of measures to manage such
potential risks.3 Yet little information is available about the needs and capacities
of countries, laboratories and research institutions in this area.
The World Health Organisation (WHO) has developed a self-assessment
questionnaire for laboratory managers and researchers to assess their needs and
1 The author is a staff member of the World Health Organisation. The author alone is responsible for the
views expressed in this chapter and he does not necessarily represent the decisions or policies of the World
Health Organisation.
2 Research accidents are understood as research activities that may unexpectedly pose some risks via
accidental discoveries. World Health Organisation (WHO) 2010, Responsible Life Sciences Research for Global
Health Security. A Guidance Document, WHO/HSE/GAR/BDP/2010.2, World Health Organisation, Geneva.
3 See, for instance: National Research Council 2004, Biotechnology Research in An Age of Terrorism, Committee
on Research Standards and Practices to Prevent the Destructive Application of Biotechnology, The National
Academies Press, Washington, DC; National Science Advisory Board for Biosecurity (NSABB) 2007, Proposed
Framework for the Oversight of Dual Use Life Sciences Research: Strategies for Minimizing the Potential
Misuse of Research Information, National Science Advisory Board for Biosecurity, Bethesda, Md; Miller,
S. and Selgelid, M. 2008, Ethical and Philosophical Consideration of the Dual-Use Dilemma in the Biological
Sciences, Springer, Dordrecht; Report prepared by the Centre for Applied Philosophy and Public Ethics at
The Australian National University for the Australian Department of Prime Minister and Cabinet, National
Security Science and Technology Unit, November 2006; Steinbruner, J. et al. 2007, Controlling Dangerous
Pathogens. A Prototype Protective Oversight System, March, Center for International and Security Studies at
Maryland (CISSM), University of Maryland, College Park; InterAcademy Panel on International Issues (IAP)
2005, IAP Statement on Biosecurity, November; Statement on the scientific publication and security, Science,
vol. 299 (2003), p. 1149; United Nations Meeting of the States Parties to the Convention on the Prohibition
of the Development, Production and Stockpiling of Bacteriological (Biological) and Toxin Weapons and on
their Destruction; Managing Risks of Misuse Associated with Grant Funding Activities. A Joint Biotechnology
and Biological Sciences Research Council (BBSRC), Medical Research Council (MRC) and Wellcome Trust Policy
Statement, September 2005; Wellcome Trust 2005, Guidelines on Good Research Practice, November; Dando, M.
and Rappert, B. 2005, Codes of conduct for the life sciences: some insights from UK academia, Briefing Paper
No. 16 (Second series), May, University of Bradford, UK.
Pillar 2: Ethics. This involves the promotion of responsible and good research
practices, and the provision of tools and practices to scientists and institutions
that allow them to discuss, analyse and resolve in an open atmosphere the
potential dilemmas they may face in their research, including those related
to the possibility of accidents or misuse of the life sciences.
Pillar 3: Biosafety and laboratory biosecurity. This concerns the
implementation and strengthening of measures and procedures to: minimise
the risk of worker exposure to pathogens and infections; protect the
environment and the community; and protect, control and account for
valuable biological materials (VBM) within laboratories, in order to prevent
their unauthorised access, loss, theft, misuse, diversion or intentional release.
Such measures reinforce good research practices and are aimed at ensuring
a safe and secure laboratory environment, thereby reducing any potential
risks of accidents or deliberate misuse.
This approach recognises that there is no single solution or system that will
suit all countries, institutions or laboratories. Each country or institution that
assesses the extent to which it has systems and practices in place to deal with
the risks posed by accidents or the potential deliberate misuse of life-sciences
research will decide which measures are most appropriate and relevant according
to their own national circumstances and contexts.
Likewise, this approach also acknowledges the fact that there is a multiplicity
of stakeholders developing different measures. The public health sector is one
stakeholder among others (for example, national academies of science, ethicists
and security communities), and coordination at national and international levels
is therefore crucial for addressing those types of risks.
One important advantage of this integrated framework is that it recommends
investing capacities in public health areas that either are already in place in
many countries or are being developed for strengthening research capacities
and for addressing laboratory bio-risks. This approach therefore promotes a
sustainable and effective way to address such bio-risks. Indeed it emphasises
that one of the most effective ways to prepare for deliberately caused disease is
to strengthen public health measures for naturally occurring and accidentally
occurring diseases.6 In a similar manner, the promotion of a culture of scientific
integrity and excellence is considered as one of the best protections against the
possibility of accidents and deliberate misuse of life-sciences research and offers
the best prospect for scientific progress and development. A culture of scientific
integrity and excellence can be strengthened through the development and
reinforcement of capacities in the three aforementioned public health pillars.
In adopting the bio-risk management framework for responsible life-sciences
research, countries and institutions are encouraged to consider:
• reinforcing public health capacities in terms of research for health, biosafety and laboratory biosecurity management and ethics
• investing in training personnel (laboratory staff and researchers) and students in ethics, the responsible conduct of research, and biosafety and laboratory biosecurity
• ensuring compliance with biosafety and laboratory biosecurity
• identifying multi-stakeholder issues, with different layers of responsibilities, and encouraging coordination among stakeholders
• using existing mechanisms, procedures and systems and reinforcing local institutional bodies (if they exist).
While recognising the above, countries and institutions may consider drawing
on a bio-risk management framework for responsible life-sciences research,
which has been developed to help countries and institutions identify where
public health resources can be used for addressing the above risks. To facilitate
such analysis, a self-assessment questionnaire on responsible life-science
research has been developed to support health researchers, laboratory managers
and research institutions to identify strengths and to address weaknesses in
6 World Health Assembly Resolution WHA55.16 of 18 May 2002, Global Public Health Response to Natural
Occurrence, Accidental Release or Deliberate Use of Biological and Chemical Agents or Radionuclear Material
that Affect Health, World Health Organisation, Geneva.
each of the three pillars. Among these three pillars, ethics is an important public
health area where countries and institutions can invest capacities to address
such bio-risks.
An ethics framework
Context and related initiatives
The potential duality of research and science, and related moral concerns, is
not a new issue. Following the invention of atomic weapons, many scientists,
including Einstein, Oppenheimer and Russell, voiced concern about the
potential dangers to humanity arising from scientific discoveries.7
Ethical issues arising in research and the corresponding obligations of scientists
have for a long time been one of the key areas of work on ethics of the UN
Educational, Scientific and Cultural Organisation (UNESCO). Already in 1974,
UNESCOs eighteenth general conference adopted the Recommendation on the
Status of Scientific Researchers,8 which enumerated ethical responsibilities and
rights of scientists.
In 1999, UNESCO organised the World Conference on Science, in Budapest,
which gave special attention to ethical principles and responsibilities in the
practice of science.
Since 2005, UNESCO has led a new project, with regional consultations, to
survey the field of science ethics and to evaluate the need for an update of the
1974 document.9
Until recently, however, the bioethics community concerned with research has
focused on the protection of human subjects in clinical trials, and other kinds of
ethical issues raised by genetics and related disciplines. It is only in the past few
years that ethicists have become more engaged with the scientific community
and security experts regarding the ethical questions raised by the potential
misuse of science.
7 See, for example: Russell-Einstein Manifesto and the Pugwash Conferences on Science and World Affairs,
<http://www.pugwash.org/about/manifesto.htm> (viewed 7 February 2012).
8 United Nations Educational, Scientific and Cultural Organisation (UNESCO) 1974, Recommendation
Adopted on the Report of the Commission for Science at the Thirty-Eighth Plenary Meeting on 20 November 1974,
Paris.
9 Scholze, S. 2006, Setting standards for scientists: for almost ten years, COMEST has advised UNESCO on
the formulation of ethical guidelines, EMBO Reports, vol. 7 (SI) (July), pp. S65–7.
Ethics is now recognised in the field as providing a useful framework for the
identification of dilemmas, and the discussion and evaluation of responsibilities
of different stakeholders. Ethics can help promote understanding of the nature
of decisions that researchers and other actors have to make.10 The discussion
of ethical issues can contribute to consensus-building among the stakeholders,
who sometimes have competing interests.
13 For more examples, see WHO, 2010, op. cit., pp. 58–9.
14 Rappert, B. 2007, Codes of conduct and biological weapons: an in-process assessment, Biosecurity and
Bioterrorism: Biodefense Strategy, Practice and Science, vol. 5, pp. 145–54.
Research institutions
As most research takes place in institutions, these also have important
responsibilities in the prevention of accidents and the deliberate misuse of
research. Research institutions have an obligation to ensure that research within
their premises is conducted according to national laws and relevant codes of
conduct. They should sensitise their researchers to dual-use issues – for example,
through education and specific training, promoting discussion and reflection
on research practices, and creating a climate that encourages scientists to come
forward in case of difficulties. Procedures and standards for whistleblowing
should be in place for cases of scientific misconduct.
Another potentially useful approach to address the dual-use dilemma, which
has been proposed at the institutional level, is the ethical review of sensitive
research by an ethics committee. In practice, however, it is not clear which
committees could take on this role. Traditional research ethics committees review
research with human subjects, and not bench science, and their members are
not usually trained in biosecurity issues. With specific training, the members of
research ethics committees could probably be enabled to play this role. Another
15 Somerville, M. and Atlas, R. 2005, Ethics: a weapon to counter bioterrorism, Science, vol. 307, pp. 1881–2.
16 American Association for the Advancement of Science (AAAS) 2008, Professional and Graduate-Level
Programs on Dual Use Research and Biosecurity for Scientists Working in the Biological Sciences: Workshop
Report, American Association for the Advancement of Science, Washington, DC.
Methodology
The self-assessment questionnaire on responsible life-sciences research
lists several statements associated with each pillar, using the Likert scale.
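To illustrate how such Likert-scale responses can be summarised, the following minimal sketch (in Python) tallies hypothetical answers of 18 respondents to a single statement, in the manner of the counts reported under Results below. It is not part of the WHO methodology; the scale labels and the split between agree and strongly agree are assumptions made for the illustration.

    # Minimal sketch, assuming a five-point Likert scale; the WHO questionnaire's
    # actual items, labels and scoring are not specified in this chapter.
    from collections import Counter

    LIKERT = ['strongly disagree', 'disagree', 'undecided', 'agree', 'strongly agree']

    def tally(responses):
        # Count how many respondents chose each Likert option for one statement.
        counts = Counter(responses)
        return {label: counts.get(label, 0) for label in LIKERT}

    # Hypothetical data mirroring one reported result: 17 of 18 respondents
    # agreed or strongly agreed (the 10/7 split is invented), 1 was undecided.
    responses = ['agree'] * 10 + ['strongly agree'] * 7 + ['undecided']
    summary = tally(responses)
    agreement = summary['agree'] + summary['strongly agree']
    print(summary)
    print(agreement, 'of', len(responses), 'respondents agreed or strongly agreed')

Because statements from the different pillars can be analysed in combination, the same tallying step can simply be repeated per statement and the resulting counts compared across pillars.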
The pilot study had some limitations. First, the sample size of the pilot study
is too small to draw conclusions that would be representative of the whole
facility. The collected data are therefore only illustrative of potential needs
and capacities. A second limitation of the self-assessment questionnaire is that
its questions may be considered subjective. The validity and reliability
of the subjective measures have been enhanced through multiple questions
measuring the same subjective state. The self-assessment questionnaire has
indeed been designed in a way that statements regarding the different pillars
can be analysed in combination, thereby reinforcing the data analysis. It should
be noted, however, that one important strength of the questionnaire is its ability
to identify differences in opinions about existing systems and measures so that
the discrepancies can be dealt with – not necessarily through a change in the
system itself but, sometimes, simply through further discussion with the staff about
existing measures.
Results
The data collected by the pilot study regarding the ethics pillar showed the
following.22
• Seventeen of the 18 respondents agreed or strongly agreed that education and/or training is offered on research ethics. One respondent was undecided. This is reinforced by the fact that 15 respondents agreed or strongly agreed with the statement that ongoing research training takes place.
• Likewise, a majority of respondents (15) agreed or strongly agreed that research proposals raising ethical issues (whether or not human or animal subjects are involved) are subject to review.
• A majority of respondents, however, disagreed with the statement that education and/or training is offered on dual use. Nine respondents disagreed or strongly disagreed, and seven respondents were undecided about the statement. Only two agreed with it.
• Eight respondents also disagreed with the statement that researchers are aware of and informed about national and international laws and regulations related to their research, and three respondents were undecided. This can be read in conjunction with the fact that 14 respondents were undecided or disagreed with the statement that information about the national and international laws and regulations related to all fields of science is easily accessible.
• In addition, 13 respondents were undecided or disagreed with the statement that whistleblowing mechanisms exist and make provision for the protection
22 Ibid.
Discussion
The pilot study at the South African facility illustrates the type of useful and
informative feedback the self-assessment questionnaire can provide to laboratory
managers and their staff.23 In this case, the pilot study may seem to indicate that
ethics training is an asset of the facility and that research proposals are being
regularly reviewed. But it also seems to point out some areas for action.
For instance, disagreements occurred in four areas:
• about the education and/or training on dual-use issues
• about their awareness of and information regarding national and international laws and regulations related to their research
• about the existence of mechanisms for staff to report unlawful or irregular conduct (that is, whistleblowing mechanisms) and where to turn for competent advice if they have ethical, safety or security questions relating to their research
• about the publication and implementation of ethical guidelines and practices and mechanisms for investigating non-adherence to ethical standards.
When a majority of respondents disagreed with a statement or was undecided,
this may point to a certain weakness that needs to be addressed. In this case, a
majority of respondents noted the lack of training and/or education on dual-use
issues.
Disagreement among respondents may indicate that measures are in place but
are not sufficiently well communicated to all employees. This
may mean that ethical standards for research may be in place in the facility,
23 Although the pilot study implemented the whole self-assessment questionnaire, only the ethics data are
being reported in this chapter.
24 Kenya E. et al., An Assessment of the Capacity of Research and Diagnostic Laboratories in Nairobi and
Surrounds, June 2012.
Conclusion
20. Ethics as…
Brian Rappert
The contributions to On the Dual Uses of Science and Ethics have extended an
invitation to dwell on a matter of much importance: the unity of knowledge.
Each chapter has considered the potential for the skills, know-how, information
and techniques associated with modern biology to serve contrasting ends. Each
has spoken to steps that might be undertaken to prevent the deliberate spread
of disease. A recurring message has been that, to date, the discussion about the
dual-use potential of the life sciences has been characterised by silences and
absences. To be sure, while some researchers and policymakers have devoted
attention to this topic for years, overall the regard is patchy and problematic.
But as the title of this volume advocates, as we attend to the multiple purposes
served by the life sciences, parallel regard should be directed towards those
served by ethics. Another theme of On the Dual Uses of Science and Ethics has
been that, to date, the ethics discussion about the multiple potentials of the
life sciences has been characterised by silences and absences. This book has
sought to redress the state of analysis by bringing together individuals with
varied backgrounds. Some have been trained as ethicists but had given little
prior attention to the destructive applications of science, while others had long
mulled over this topic without being directly informed by the discipline of ethics.
As part of building the debate, the contributors have asked what role ethics
(particularly bioethics) might serve in averting the deliberate spread of disease.
Part reflection on the preceding chapters, part analysis of the wider literature,
and part programmatic agenda setting, this concluding chapter addresses how
the history, methods and practices of bioethics could enrich current dual-use deliberations. This is conceived as a two-way exchange. The queries,
uncertainties and anomalies raised by examining the unity of knowledge will
also be approached for what they tell us about the preoccupations of bioethics.
Through this, the limits of bioethics, how it must adapt and what is at stake in
suggesting that change is required will be considered.
In asking about uses, regard will also turn to possible abuses of bioethics.
Efforts to bring to bear the intellectual resources from this tradition come with
their own dangers. The assumptions informing normative approaches, how they
are positioned to justify courses of (in)action and the blind-spots of analysis
are just some of the reasons for holding together worries about the direction of
ethics and the life sciences. For instance, one of the common ways bioethics has
positioned itself as a discipline is as an intellectual response to new innovations
in science and technology.1 Ethicists then set out to offer needed analytical
responses to the latest challenges cropping up. Yet, this techno-origin myth2
risks becoming blind to the organisation, funding priorities, agendas and
purposes of ethics.
This chapter sets out to accomplish the above aims through a four-part
analysis. The next section begins by scrutinising assumptions informing
the contributions to On the Dual Uses of Science and Ethics as well as other
commentaries. Undertaking this critical turning back is vital if we wish to set
a productive program for the future. Section three asks how normative and
empirical approaches can be reconciled in efforts to prevent the destructive
application of the life sciences. The fourth section considers the potential
dangers associated with promoting greater bioethical attention to dual use.
Informed by the analysis that precedes it, the final section sets out a path for
future intellectual and practical engagement. That path entails attending to the
whys and hows associated with what is not: a) what is not recognised as posing
a concern in the first place or, if recognised at some level, b) what is not treated
as a problem, and c) if regarded as a problem then what is not acted upon.
Questioning beginnings
This section begins by undertaking a central task for robust ethical analysis:
scrutinising presumptions.
1. Categorical condemnation
The contributors to this volume have begun with a normative position that
has itself attracted little attention: the categorical unacceptability of biological
weapons. This overall evaluation is codified in the 1972 Biological and Toxin
Weapons Convention (BTWC) that forbids states from acquiring or developing
bio-agents and toxins for hostile purposes.
Within international diplomacy, policy deliberation and academic study, the
attribution of an abhorrent or an inhumane standing to these weapons is
commonplace (as in van der Bruggen's chapter in this volume). While some
have impugned the political expediency of the categorical prohibition of
bioweapons vis-à-vis the laxer controls for nuclear weapons (which still enable
1 See Borry, P., Schotsmans, P. and Dierickx, K. 2005, The birth of the empirical turn in bioethics, Bioethics,
vol. 19, no. 1; and Kuhse, H. and Singer, P. (eds), A Companion to Bioethics, Blackwell, London.
2 de Vries, R. 2007, Who will guard the guardians of neuroscience? Firing the neuroethical imagination,
EMBO Reports, vol. 8, pp. S65–9.
3 See Falk, R. 2001, The challenges of biological weaponry, in S. Wright (ed.), Biological Warfare and
Disarmament, Rowman & Littlefield, London.
4 For an examination of the ethics of biological weapons, see, for example: Sims, N. 1991, Morality and
biological warfare, Arms Control, vol. 8, pp. 52–3; and Krickus, R. 1965, On the morality of chemical/biological war, Journal of Conflict Resolution, vol. 9, no. 2, pp. 200–10. Even in attempts to differentiate the
moral standing of types of biological weapons, the starting assumption is often that they all share dubious
qualities, as in Appel, J. M. 2009, Is all fair in biological warfare? The controversy over genetically engineered
biological weapons, Journal of Medical Ethics, vol. 35, pp. 429–32.
5 James, A. 2007, Science & technology policy and international security, in B. Rappert (ed.), Technology
and Security, Palgrave, London.
6 Colwell, R. 2010, Understanding Biosecurity: Protecting against the Misuse of Science in Todays World,
National Research Council, Washington, DC.
7 Zanders, J. P. 2003, International norms against chemical and biological warfare, Journal of Conflict &
Security Law, vol. 8, no. 2, pp. 391410.
reasons.8 In the 1960s, prior to the ratification of the BTWC, scientific societies
(such as the American Society for Microbiology)9 found themselves embroiled
in bitter dispute about the rights and wrongs of offensive programs.
Recognition of this historical contestation is not taken as an opportunity to
reject the denunciation of bioweapons and the corresponding need to prevent
the life sciences from aiding their development. In this sense, this chapter begins
where others in the volume have begun (and ended).
Noting this contestation, though, will be taken as an opportunity to reflect
on the basis for today's commonsense moral appraisals and thus how ethical
analysis can be made to matter. In relation to the topic of this volume, that
means questioning how norms, stigmas and taboos can be bolstered (see below).
Noting this contestation also will be taken as an opportunity to reflect on how
the boundary is set between what is and is not treated as a biological weapon.
As Whitby details in his chapter about plant inoculants, at times the distinction
can be fine to non-existent. The contingency associated with categorisations
is evident elsewhere. During ancient Greek and Roman times, debate about
the morality of deliberate poisoning and the contamination of water supplies
took place alongside debate about the morality of driving enemy troops into
mosquito-infested areas.10 Each of these represented a sort of weaponising of
nature, even if the latter would not fit today under the term 'biowarfare'. And
as yet another aspect of boundary drawing, while the use of pathogens as
weapons to inflict disease might be widely deplored, just when it is deemed
appropriate for biology to serve war-fighting is a matter of much less accord.
Whitman's chapter indicated the range of bio-enabling capabilities that nanotechnology
offers militaries. The moral status of such applications is likely to be a matter of
disagreement in a way it would not be for other areas of nanotechnology given
the link to biology.
What such examples suggest is that the descriptive matter of how things become
understood needs to accompany the normative matter of how things ought to be
judged.
Take the example of bio-defence. As a number of contributors in this volume have
indicated, the acceptability of activities undertaken in the name of protection
is not a matter of unanimity. In no small part this has been due to disagreement
about what distinguishes the knowledge and techniques necessary for defence
from those of offence. As I have argued elsewhere, at stake in past disputes about
8 Balmer, B. 2002, 'Killing without the distressing preliminaries', Minerva, vol. 40, pp. 57–75.
9 Cassell, G., Miller, L. and Rest, R. 1994, 'Biological warfare: role of scientific societies', in R. Zilinskas (ed.),
The Microbiologist and Biological Defense Research, New York Academy of Science, New York.
10 Mayor, A. 2009, Greek Fire, Poison Arrows and Scorpion Bombs, Overlook Duckworth, London.
what is ethically and legally permissible have been thorny questions about how
the purposes of activities can be discerned. Commentators have been divided on
whether purpose can be discerned from the characteristics of the activities or
whether their meaning has to be found in some wider context. In practice, for
many who subscribe to a contextual approach, a sense of 'the context' is often
mutually defined in relation to a sense of the activities under scrutiny. For
instance, expansion of American military funding during the 1980s was one
occasion that featured heated dispute about the appropriateness of bio-defence.
As part of this debate, attempts to evaluate aerosolisation-testing centres were
subjected to contrasting assessments due to alternative characterisations about
the overall purpose of bio-defence programs. But on top of this:
[S]uch characterizations, in turn, [informed] determinations of intent.
Many critics of the US biodefense program treated the secrecy
surrounding it and the wider political posturing of US administrations
as indicators of the dubious intent of the undertaking. Seemingly
acceptable individual projects were re-considered for how they
might inform offensive weapons development. In turn, the number
and character of such ambiguous activities provided justification for
concerns about the ultimate aims served by the program as a whole.11
It is through such – what might well be deemed circular – reasoning that
attempts to assess what is permitted often unfold. Herein, the trust held in a
presidential administration informed the evaluation of particular projects and
vice versa. In recognising such co-definition dynamics, it is possible to go
beyond simply noting the recurring difficulties with making decisions about
whether biodefense research can be ethically justified as 'truly defensive in
nature'.12 Instead it is possible to examine how boundaries and evaluations are
established.
The previous paragraph indicated the need for empirically informed ethical
analysis in relation to matters of dual use – a theme that will be taken up again
in the next section.
2. Neglected dis-ease
As mentioned at the beginning of this chapter, a motivation for this volume –
and a conclusion supported by its contributors – was a sense that dual-use issues
had not hitherto been subjected to significant or sufficient ethical analysis.13
11 Rappert, B. 2005, 'Prohibitions, weapons and controversy: managing the problem of ordering', Social
Studies of Science, vol. 35, no. 2, p. 231.
12 King, N. 2005, 'The ethics of biodefense', Bioethics, vol. 19, no. 4, pp. 432–46.
13 As also voiced in Chadwick, R. 2011, 'Bio- and security ethics: only connect', Bioethics, vol. 25, no. 1,
p. ii.
There are, however, past and present lively areas of deliberation related to the
overall themes of On the Dual Uses of Science and Ethics. The rights of research
subjects have been an enduring topic of mainstream medical ethics. Human
experimentation in weapons programs has more than its share of flagrant
violations of rights.14 Just how much attention research and healthcare priorities
ought to pay to concerns about the deliberate spread of disease has animated
much debate recently, too, amid a feared 'biosecuritisation' of priorities.15 Many
have contended that the increasing funding directed at security post 9/11 has
led to skewed research and healthcare agendas. Herein pathogens that might
kill in warfare are given disproportionate resources compared with those that
do kill on a daily basis. What suffer are human welfare, global justice and
international security.16 How many lives, it might be asked, have been lost
because of the choice to spend scarce resources on non-existent threats? In contrast, others
have advocated for the increasing inclusion of security within public health and
thereby reconceiving what is meant by both.17 Herein, not only do healthcare
systems need to do more to prepare for the intentional spread of disease, but also
security agencies need to recognise the gravity of natural disease for national
protection.18
These points about biosecuritisation suggest how the destructive use of the life
sciences overlaps with healthcare preparations for the outbreak of disease. To
the extent this is correct, turning the previous 'disconnect' between ethics and
dual use into 'connect' should be entirely feasible.
Affinity was a recurring theme of the chapter by Bartolucci and Dando, who
examined: 1) what neuroethics has to say about dual use; and 2) how neuroethics
has positioned itself as similar/different to other areas of ethics. The first of
these is relatively straightforward: neuroethics is notable for its lack of direct
engagement with the dual-use aspects of the life sciences. At the same time,
much of it could be pertinent. The second dimension is more complex. Debates
about distinctiveness have long been at the centre of positioning about whether
there is a need for neuroethics as a field of study in its own right. With the
increase of specialised publications and conferences, over recent years many
14 Harris, S. 1999, 'Factories of death', in T. Beauchamp and L. Walters (eds), Contemporary Issues in
Bioethics, 5th edn, Wadsworth, London, pp. 470–8.
15 Fisher, J. and Monahan, T. 2011, 'The biosecuritization of healthcare delivery: examples of post-9/11
technological imperatives', Social Science & Medicine, vol. 72, pp. 545–52; and Brown, T. 2011, 'Vulnerability
is universal', Social Science & Medicine, vol. 72, pp. 319–26.
16 Enemark, C. 2009, 'Is pandemic flu a security threat?', Survival, vol. 51, no. 1, pp. 191–214; and Choffnes,
E. 2002, 'Bioweapons: new labs, more terror?', Bulletin of the Atomic Scientists, vol. 58, no. 5, pp. 28–32.
17 See Fidler, D. and Gostin, L. 2008, Biosecurity in the Global Age, Stanford University Press, Stanford, Calif.
18 See, for example, de Waal, A. 2010, 'Reframing governance, security and conflict in the light of HIV/
AIDS: a synthesis of findings from the AIDS, security and conflict initiative', Social Science & Medicine, vol.
70, no. 1, pp. 114–20.
22 See Rappert, B. 2007, Biotechnology, Security and the Search for Limits: An Inquiry into Research and
Methods, Palgrave, London, ch. 1.
23 Chambliss, D. F. 1996, Beyond Caring: Hospitals, Nurses, and the Social Organization of Ethics, University
of Chicago Press, London.
24 Context should not, however, simply be thought about as a static backdrop directing action. Financial
and organisational resource constraints motivate the search for different forms of research collaboration that,
in turn, structure possibilities for perception.
25 Douglas, T. and Savulescu, J. 2010, 'Synthetic biology and the ethics of knowledge', Journal of Medical
Ethics, vol. 36, pp. 687–93.
26 King, op. cit., pp. 432–46; Frisina, M. 2006, 'The application of medical ethics in biomedical research',
Cambridge Quarterly of Healthcare Ethics, vol. 15, pp. 439–41; and Tyshenko, M. 2007, 'Management of
natural and bioterrorism induced pandemics', Bioethics, vol. 21, no. 7, pp. 364–9.
32 prominent (Western) journal editors committed themselves to enact peer-review
procedures to assess the risks and benefits of individual manuscripts.
These were meant to determine whether articles needed to be modified
or withdrawn because 'the potential harm of publication outweighs the
potential societal benefits'.27 One concern frequently expressed has been
that such measures would jeopardise the free flow of information that is vital
to civilian science. Such fears though are arguably misplaced for two reasons:
1. experience from at least 2004 has indicated that the risk–benefit logic
of the formal review processes means that few manuscripts, grant
applications or experiment proposals are identified as posing dual-use
concerns (let alone are censored; there are no cases of civilian work being
categorically withheld)28
2. evidence from social studies of science has illustrated how the exchange of
resources and information in research is frequently subject to negotiation
in practice.29
Such observations raise concerns about the ultimate purposes and prospects of
formal oversight procedures. They might either miss important developments or
impose needless layers of bureaucracy.
Operationalisation: Policy initiatives can also be misjudged if they misconstrue
the unit at which ethical decisions are taken and moral standards enforced.
As Miller argues in Chapter 12, reducing ethical concerns to individual
decision-making would be fundamentally flawed. Science must be treated as
a group and community enterprise.
The need for ethical analysis informed by empirical research is readily
acknowledged today in bioethics – indeed, many contend it has always been
so.30 The rub is not whether the empirical and the ethical can be placed side
by side, but rather how. The central difficulty is that of bringing together
what 'is' and what 'ought to be'. The facts of some issue – for instance, how
those in a lab regard their professional responsibilities – do not resolve what
should be the case.31 Clearly, it is not possible to justify standards for morality
on empirical data alone, even if in practice what 'is' often shapes notions of
what 'ought to be'.
27 Journal Editors and Authors Group 2003, Proceedings of the National Academy of Sciences, vol. 100, no.
4, p. 1464.
28 See Nightingale, S. 2011, 'Scientific publication and global security', JAMA, vol. 306, no. 5, pp. 545–6;
and Rappert, B. 2008, 'The benefits, risks, and threats of biotechnology', Science & Public Policy, vol. 35, no.
1, pp. 37–44.
29 Rappert, 2007, op. cit., ch. 2.
30 Herrera, C. 2008, 'Is it time for bioethics to go empirical?', Bioethics, vol. 22, no. 3, pp. 137–46; and Hurst,
S. 2010, 'What empirical turn in bioethics?', Bioethics, vol. 24, no. 8, pp. 439–44.
31 See Bosk, C. 2000, 'The sociological imagination and bioethics', in C. Bird, P. Conrad and A. Fremont
(eds), Handbook of Medical Sociology, Prentice Hall, Upper Saddle River, NJ, p. 403.
Recognising this though is just the start of unpacking how empirical study can
contribute to the normative. As de Vries and Gordijn argue, the conclusion of
an empirical-ethical inquiry is not necessarily a moral judgment or principle.
Nor are the conclusions of these studies necessarily based solely on empirical
results.32 In other words, the tasks undertaken as part of empirical ethics are
many and varied. Seeking some way to move from description to prescription is
just one possibility. Others include detailing conduct, understanding conditions
and processes of meaning making, acknowledging alternative ethical reasoning,
testing the feasibility of moral precepts, and refining ethical theory.
Advising
Bioethicists are often called upon to provide advice in controversies. The 2010
New Directions: The Ethics of Synthetic Biology and Emerging Technologies34 report
of a US presidential commission is one such instance. It recommended various
32 de Vries, R. and Gordijn, B. 2009, 'Empirical ethics and its alleged meta-ethical fallacies', Bioethics, vol. 23,
no. 4, pp. 193–201.
33 Billig, M. 1996, Arguing and Thinking, Cambridge University Press, Cambridge; and Billig, M., Condor, S.,
Edwards, D., Gane, M., Middleton, D. and Radley, A. 1988, Ideological Dilemmas, Sage, London.
34 Presidential Commission for the Study of Bioethical Issues 2010, New Directions: The Ethics of Synthetic
Biology and Emerging Technologies, US Department of Health and Human Services, Washington, DC.
reforms to oversight and reporting procedures for research that were said to be
congruent with widely shared ethical principles. Through such advice giving,
ethics has a chair at the table of high-level public policy.
The need for considered ethical advice has been made before. Green called for
a portion of the multi-billion-dollar per year funding for bio-defence in the
United States to be set aside to examine its ethical, legal and social implications
(ELSI).35 This recommendation was based on the precedent of the Human
Genome Project. And yet, despite seeking to build on this example, Green also
acknowledged various problems with the Human Genome Project ELSI program:
its policy consequences, coordination and focus.
At a more theoretical level, attempts to advise can be misguided if they are
based on a faulty understanding of how analysisbe it ethical or empirical
is made to matter. In an idealised model of policy, analysis acts as an input
into rational decision-making processes through establishing facts and selecting
between options. Medical ethics and bioethics are arguably particularly aligned
with this inputting model given the centrality placed on decision-making. In
contrast, students of policymaking have forwarded a complex path between
knowledge and action, one characterised by iterative movements without
clearly demarcated beginnings or endings. Indeed, some have gone so far as to
question whether it would even be desirable for policymaking to be solely or
largely based on analysis.36 Instead of portraying it as some sort of authoritative
input, the role of analysis has been defined in more limited termssuch as
helping to identify concerns not widely recognised.
Therefore, a danger with greater bioethical scrutiny of dual use would be
that it invests a greater significance in such analysis than it can bear. In practice,
the arguments of ethics might well give a false air of justification to decisions
taken for altogether different reasons.37 Another potential problem is that of
misconstruing the nature of the choices faced. Hoffmaster and Hooker have
argued ethical dilemmas are often radically open-ended and indeterminate.
Herein,
not only is the solution unknown, but the problem itself is initially not
well defined, and the values that ought to drive its investigation and the
35 Green, S. 2005, 'E3LSI research: an essential element of biodefense', Biosecurity and Bioterrorism, vol. 3,
no. 2, pp. 128–37.
36 Lindblom, C. and Cohen, D. 1979, Usable Knowledge, Yale University Press, New Haven, Conn.; Palumbo,
D. and Hallet, M. 1993, 'Conflict versus consensus models in policy evaluation and implementation', Evaluation
and Programme Planning, vol. 16, pp. 11–23.
37 For a consideration of this point, see Winner, L. 1992, 'Citizen virtues in a technological order', Inquiry,
vol. 35, nos 3–4, pp. 341–61.
Guiding
If bioethics might not be thought of as wielding the decisive hand in policymaking,
a more modest goal would be to suggest that it provides guides for assessing the
morality of acts. Along this line, some have sought to derive general guidance
for scientists and research organisations from ethical principles.39 Kuhlau and
colleagues, for instance, outlined various criteria for identifying harm within
the moral responsibility of scientists.40 In line with the ethical principle of
non-maleficence, they argued that '[r]esearchers should be responsible
not only for not engaging in harmfully intended activities but also for research
with harmful implications that they can reasonably foresee'.41
The ability of ethics to derive moral obligations has been a topic of debate since
its inception as a field of study. Much of that discussion has turned on the
prospects for high-level principles to inform situated action. A frequent refrain
against principlism is that it is flawed because what it means to adhere to a
principle (such as autonomy or justice) is always indeterminate at some level.
As instances of ethical concern are never identical, the future application of
normative standards cannot be set out once and for all. Individuals must manage
the relevance of principles, what it means to follow or deviate from them, and
what consequences are likely to follow from violations. For instance, while
the obligation to prevent harm often figures as an ethical bar to bioweapons
development, this prescription does not have the same import for conventional
(read: commonly accepted) forms of weaponry.42 Enabling harm is a goal in
many areas of research. As such, dual-use discussions are often coloured by a
normative starting orientation about which principles matter and why, rather
than deriving a sense of the right course of action from the principles themselves.
38 Hoffmaster, B. and Hooker, C. 2009, 'How experience confronts ethics', Bioethics, vol. 23, no. 4, pp. 214–25.
39 See Ehni, H. J. 2008, 'Dual use and the ethical responsibility of scientists', Archivum Immunologiae et
Therapiae Experimentalis, vol. 56, pp. 147–52; and Green, S., Taub, S., Morin, K. and Higginson, D. 2006,
'Guidelines to prevent malevolent use of biomedical research', Cambridge Quarterly of Healthcare Ethics, vol.
15, pp. 432–9.
40 Kuhlau, F., Erikson, S., Evers, K. and Höglund, A. 2008, 'Taking due care: moral obligations in dual use
research', Bioethics, vol. 22, pp. 477–87.
41 Ibid., p. 481.
42 Rappert, B. 2012, How to Look Good in War, Pluto Press, London, ch. 6.
Deliberating
A different way of thinking about the utility of ethics is to focus on process. If
general ethical analysis has a limited ability to definitively determine the proper
course of action, then it can at least help structure democratic deliberation.45 Herein, as
individuals schooled in conceptual analysis and argumentative logic, ethicists
can ensure the quality of discussions (even if they do not occupy elevated moral
positions). This procedural expertise might be exercised through clarifying
reasoning, ensuring consideration of neglected topics, making hidden values
explicit, providing comparative examples, and so on.
Each of these notionally procedural contributions though could be questioned
regarding how they import (unrecognised) value commitments. The
prescriptive and procedural dimensions of bioethics are exemplified in current
disputes about the merits of the precautionary principle.46 Any discussion of
this notion needs to begin with the recognition that it comes in a multitude
of versions. While in general terms precaution is aligned with not requiring
conclusive demonstration of harm to prompt concern or even action, many
formulations of it exist.47 It is possible, for instance, to distinguish between
43 Beauchamp, T. 1995, 'Principlism and its alleged competitors', Kennedy Institute of Ethics Journal, vol.
5, no. 3, pp. 181–98.
44 Marshall, P. 2001, 'A contextual approach to clinical ethics consultation', in B. Hoffmaster (ed.), Bioethics
in Social Context, Temple University Press, Philadelphia.
45 Boniolo, G. and di Fiore, P. P. 2010, 'Deliberative ethics in a biomedical institution: an example of
integration between science and ethics', Journal of Medical Ethics, vol. 36, pp. 409–14.
46 In relation to how the prescriptive and the procedural meld, a concern for some is that attention to
procedure is a means of decision-making, at least by default. A criticism made of precautionary approaches –
but one that could be made of any attempt to promote and conduct deliberation – is that it ends up significantly
delaying decisions, and that such a delay thereby favours some over others. It is notable in this regard that
precaution has become a byword for inaction and hesitancy in many areas of science policy. See Ledford, H.
2011, 'Hidden toll of embryo ethics war', Nature, vol. 471, p. 279.
47 Stirling, A. 2008, 'Science, precaution, and the politics of technological risk: converging implications
in evolutionary and social scientific perspectives', Annals of the New York Academy of Sciences, vol. 1128,
pp. 95–110.
48 Sandin, P., Peterson, M., Hansson, S., Rudén, C. and Juthe, A. 2002, 'Five charges against the precautionary
principle', Journal of Risk Research, vol. 5, no. 4, pp. 287–99.
49 See as well Harris, J. 2001, 'Introduction: the scope and importance of bioethics', in J. Harris (ed.),
Bioethics, Oxford University Press, Oxford; and Douglas and Savulescu, op. cit.
Because it is imagined that few experiments will need to be given security review, the
emphasis has been with devising a non-demanding tick-box first stage that should exclude
the majority of research from further formal consideration.d In this regard, NSABB has
proposed that the initial review of whether or not research is 'of concern' be undertaken
by the principal investigator (that is, the senior project leader). Herein, this person would
ask of their work, 'based on current understanding, can [it] be reasonably anticipated to
provide knowledge, products, or technologies that could be directly misapplied by others to
pose a threat to public health and safety, agricultural crops and other plants, animals, the
environment, or materiel'.e
To state that assessors must be able to reasonably anticipate a direct threat based on current
understanding sets a high threshold for proof. At this initial stage of the review process,
the determination of the status of research is not intended to impose significant demands
on principal investigators. Should research be found to match the criterion then it would be
subjected to institutional risk review.f
Such an approach can be contrasted with an alternative oversight model proposed by the
Center for International and Security Studies at Maryland (CISSM). This is envisioned as an
international legally binding system requiring the licensing of personnel and research facilities.
The Maryland system also involves independent peer review. An oversight body needs to
approve work going ahead, rather than the investigators making the initial determination. This
was justified on the basis that [i]n addition to having a self-interest in seeing their research
proceed, such individuals are also unlikely to have the security and other expertise necessary
to recognize the possible dual use risks of their work.g
The criteria proposed as part of the risk–benefit analysis in the Maryland system also go
further than the NSABB proposal. As part of assessing research, for instance, individuals
are required to consider whether the same experimental outcome could be pursued through
alternative means, whether the research is being done in response to a validated (credible)
threat, and whether it will yield results definitive enough to inform policy decisions. Such
questions place demands on those taking part in the assessment process beyond those
entailed by the NSABB recommendations. They also require forms of knowledge that the
average principal investigator is unlikely to possess. As another contrast to the NSABB proposals, the
Maryland one provides a metric for evaluating research based on the responses given to the
criteria mentioned.
At the heart of such alternative policy options is the matter of expertise and how this should
be exercised. While NSABB devolves much of the decision-making down to senior individual
scientists who are aided by others, the CISSM proposal places much more emphasis on a
diverse range of expertise structured through mandatory requirements.
a National Science Advisory Board for Biosecurity (NSABB) 2007, Proposed Framework for the
Oversight of Dual Use Life Sciences Research: Strategies for Minimizing the Potential Misuse of Research
Information, National Science Advisory Board for Biosecurity, Bethesda, Md, p. 4.
b Ibid., p. 16.
c As in comments made during National Science Advisory Board for Biosecurity, 20 March 2006.
d Ibid., Appendix 4.
g Harris, E. 2007, 'Dual use biotechnology research: the case for protective oversight', in B. Rappert
and C. McLeish (eds), A Web of Prevention: Biological Weapons, Life Sciences and the Governance of
Research, Earthscan, London, p. 120.
Legitimating
If legitimacy is a function of being in accordance with rules and procedures
justified by shared societal beliefs,50 then one of the roles often sought of ethics is
to legitimate policies and practices. Whether through contributing expert moral
reasoning or facilitating deliberation, bioethicists (and others) are often called
upon to ensure support for core social institutions. An often-voiced reservation
with such a purpose is that the basis for accord might be less than warranted or
genuine. In other words, rather than legitimacy being positively secured, it is
skilfully manufactured through a language supplied by ethics.
Chambliss, for instance, offers a highly critical evaluation of how medical ethics
and bioethics training figure within hospital practice. One of the starting points
for this is the view that organisational hierarchies are often the source of the
dilemmas experienced by professionals.51 Nurses – with relatively little formal
power in relation to doctors or administrators – often struggle with the tension
between doing what they think is right and doing what conforms to official
policy. As a result, it is not enough for them to receive high-minded instruction
about moral principles in order for them to do what is right. For Chambliss,
ethical training in the form of abstract, principle-based talk can not only be
irrelevant to lived experience, but it can also serve to give a false impression
that dilemmas and challenges are being acknowledged and addressed while, in
practice, ethics instruction reinforces relations of hierarchy. Winner too cautions
against the way moral categories and ethical arguments are often forwarded
without attention to the roles and institutions needed to enable notions of the
good to translate into deeds.52 In the absence of such opportunities, ethical
analysis all too easily validates the status quo.
More subtly than this, ethics can also act to reinforce the perception of shared
beliefs and values. Take the case of the previously mentioned dual-use review
procedures initiated by civilian journals, funders and research organisations
since 2004. While they differ in specifics, each proposes a risk–benefit
calculation. Expected societal gains from research are to be measured against
possible security threats, the balance between the two indicating what should
be done. As such, the review procedures are forwarded as embodying core
characteristics of rational decision-making. Ethicists have adopted and endorsed
this rationalistic framework of balancing risks and benefits.53
What has not been elaborated is how risk could be calculated (see by way
of contrast the aforementioned CISSM system for an alternative framework).
Certainly no form of detailed calculus has been set out. Even putting to the side
a demand for exactitude, it is not at all clear, for instance, how reviewers could
specify the risks from the destructive use of fundamental scientific knowledge.
The users, situations, time frame, specific contribution of that knowledge, and
so on, are not well defined. The absence of significant past experience of the
deliberate spread of disease that might at least provide data points for assessing
risks and benefits also frustrates undertaking reviews. In the exceptionally
few instances of experiments where some science bodies have judged risks
in excess of benefits – as in the initial (but subsequently revised) NSABB
recommendations for publication redactions with research on the transmission
of a modified H5N1 virus in a ferret model – detailed cost–benefit calculations
have not been forwarded. Instead, general appeals to notions such as precaution
have been given.54
Perhaps most fundamentally, what positive values or negative implications might
be relevant for weighing are not clear. From one defensive standpoint, it might be
vital to identify and publish research that raises concerns (so as to confirm such
fears and to devise countermeasures). This would seem to be the rationale that
informed the Defense Advanced Research Projects Agency decision to fund the
synthesis of poliovirus.55 In Nancy Connell's chapter, similarly, demonstrating
hyper-virulence in a mouse model was seen as providing the grounds for future
follow-on funding. Thus, what should count as a problem is not simply poorly
defined, but also open for radically opposed interpretations in the first place.
Legitimation dangers do not just include giving undue credence to certain
outcomes. Another danger is perpetuating faith in the presumptions, beliefs
and competencies that underpin the proposal for what should be done. Rockel,
for instance, has offered a trenchant critique of the doctrine of double effect as
applied to modern warfare.56 As contended, its underpinning distinctions related
to foreseeable effects and intentions have provided the material for papering
over systematic deficiencies in military action. In relation to the themes of this
volume, it should also be noted that within the discussion of risks and benefits,
the non-destructive applications are often assumed to fall wholly on the plus
side. This thinking is in line with many public portrayals of science; however,
the extent to which biomedical research is linked to improvements in human
54 Cohen, J. and Malakoff, D. 2012, 'NSABB members react to request for second look at H5N1 flu studies',
Science, 2 March; and Imperiale, M. 2012, Presentation to Dual-Use Research and Biosecurity: Implications
for Science, Governance and the Law, The Hague, 12 March.
55 Selgelid, M. and Weir, L. 2010, 'The mousepox experience', EMBO Reports, vol. 11, pp. 18–24, <http://
www.nature.com/embor/journal/v11/n1/full/embor2009270.html> (viewed 23 April 2013).
56 Rockel, S. 2009, 'Collateral damage: a comparative history', in S. Rockel and R. Halpern (eds), Inventing
Collateral Damage: Civilian Casualties, War, and Empire, Between the Lines Press, Toronto, pp. 1–96.
Stigmatising
As contended previously, the categorical condemnation of bioweapons is
historically contingent and collectively produced. It is because of – not in spite
of – this thoroughgoing social basis that resistance would be offered to attempts
to sanction the employment of life-sciences knowledge and techniques for
destructive ends.59 Looking towards the future, given the numerous possibilities
for the malign applications outlined within the pages of this volume as well as
the difficulties of trying to enforce the prohibition through national security
and policing measures, stigmatisation is likely to be essential.
Taking this to be the case – and agreeing with the need to work against the
deliberate spread of disease – implies a certain agenda for bioethics: it should
work to find ways of strengthening and renewing stigmatisation. This needs to
take place in a manner sensitive to different possible belligerents: states,
sub-state groups and lone individuals. How to prevent current efforts to develop
57 Sarewitz, D. 1996, Frontiers of Illusion, Temple University Press, Philadelphia, p. 150.
58 Houtepen, R. 1998, 'The social construction of euthanasia and medical ethics in the Netherlands', in R.
de Vries and J. Subedi (eds), Bioethics and Society, Prentice Hall, Upper Saddle River, NJ, pp. 117–44.
59 In other words, the categorical nature of the prohibition in the BTWC and the 1925 Geneva Protocol
is laudable not because it reflects an objective and essential truth, but rather because of the choices that
buttress it.
Educating
Who needs to be educated, about what, how and by whom are longstanding
matters of commentary in ethics. The charge that those involved in medicine
and the life sciences are somehow lacking with regard to an appreciation of the
implications of their work is hardly unique to the topic of this volume.62 Yet
moving from such an appraisal to proposals for what needs to be done often
proves contentious. Approaches to ethics tuition differ – notably, between
prescriptive, procedural and virtue-based varieties. Each of these is aligned
with distinct ways of thinking about individuals as moral agents, what count as
appropriate learning techniques and how value disagreement ought to be handled.
Elsewhere I have considered the dilemmas, tensions and pitfalls of education
about the destructive use of life science.63 In this subsection I want to extend that
work by placing the issue of education within a wider political framework. Cribb
offers a valuable inroad into this by distinguishing types of health education.64
For him, a medical model treats education as a way of achieving health outcomes
through providing information that affects patient behaviour. A danger with
this is that healthcare workers assume a highly paternalistic role. Against the
medical model, another approach would be to conceive of education as enabling people
to make informed choices based on their own values and preferences. A danger
with this model is that it treats individuals' values and preferences as deriving
only from them as autonomous agents. An empowerment model, by contrast,
starts with asking about the factors that constrain individuals from realising
their preferences and then envisions education as part of overcoming those
barriers. In other words, the emphasis is with change rather than edification
for its own sake. A social-action model goes one step further by asking what
sort of structural changes in society (for instance, with regard to poverty and
social welfare) are necessary to achieve sought-for health gains. As with the
empowerment model, this one necessarily involves posing wider questions
about what needs reform. Individuals and groups require skills for participation
to effect change. In endorsing the social-action model, as with Chambliss,
Cribb counsels against divorcing education and training from institutional and
organisational conditions.
Such a typology offers a way of classifying the work that has taken place to date
with regard to dual-use education. The overwhelming orientation has been in
62 As, for instance, in the case of Sales, C. and Schlaff, A. 2010, 'Reforming medical education: a review and
synthesis of five critiques of medical practice', Social Science & Medicine, vol. 70, pp. 1665–8.
63 See Rappert, B. 2007, 'Education for the life sciences', in Rappert and McLeish, op. cit.; and Rappert, B.
2010, 'Introduction: education as …', in B. Rappert (ed.), Education and Ethics in the Life Sciences: Strengthening
the Prohibition of Biological Weapons, ANU E Press, Canberra.
64 Cribb, A. 2005, Health and the Good Society: Setting Healthcare Ethics in Social Context, Oxford University
Press, Oxford.
line with the medical model. Herein, researchers or the public are expected to
take on board messages about the potential of the life sciences. The intention is to
help achieve certain thinking or behaviour – such as the competency to identify
'research of concern'. Situated within the international community of states, the
danger of paternalism typically associated with this model is compounded by
that of neo-colonialism. With much of the recent attention to dual use emanating
from North America and Europe, an obvious concern is that the agenda as well
as the ethical approaches employed to understand it (for example, individualist
and principle based) are indebted to strains of Western thinking.65 With the
effort dedicated to education as instruction, much less attention has been given
to empowerment or social-action models. Exceptions include the Kampala
Compact: The Global Bargain for Biosecurity and Bioscience and related DNA for
Peace report.66 Both proposed holding together biosecurity measures with social/
international development agendas. Brian Martin, too, spoke of the importance
of individual and group empowerment in whistleblowing and dual use.67 In
the absence of such practical skills training, a danger of education is that it does
not enable positive reform. Such an outcome can lead to feelings of irrelevance,
indifference or frustration on the part of educators and the educated.
65 For a more general discussion of this danger, see Widdows, H. 2007, 'Is global ethics moral neo-colonialism?
An investigation of the issue in the context of bioethics', Bioethics, vol. 21, no. 6, pp. 305–15.
66 Available at: <http://www.utoronto.ca/jcb/home/documents/DNA_Peace.pdf> (viewed 4 April 2010).
67 Martin, B. 2007, 'Whistleblowers: risks and skills', in Rappert and McLeish, op. cit.
68 In line with Clouser, K. D. 1975, 'Medical ethics: some uses, abuses, and limitations', New England Journal
of Medicine, vol. 293, pp. 384–7; and Hoffmaster, op. cit.
recognised in the first place or, if recognised at some level, what is not treated
as a serious problem; if regarded as a problem then what is not acted upon. In
short, the subject for scrutiny is that of what isn't happening.
A rereading of the Australian IL-4 mousepox experiment illustrates the varied
relevancies of the non-: the potential for IL-4 to enhance virulence was
recognised prior to the mousepox publication by some experts, yet seemingly it
was not subject to much professional (let alone public) debate. In this respect,
what distinguishes the 2001 mousepox controversy from the majority of other
work with a dual-use potential is that the researchers actively voiced their
concerns beyond a closely knit expert coterie. Although counterfactual, it seems
doubtful that this work would have garnered anything like the same attention in
the absence of Ramshaw's communication with New Scientist. The fraught path
to becoming a media agenda item is suggested by what happened subsequently. Despite
the profile of the researchers from the mousepox experiment, 9/11, the anthrax
attacks and much else besides, the publication of results of a follow-on study
indicating IL-4-modified mousepox resisted treatment with an antiviral agent
for smallpox garnered little notice. As another instance of the non-, only years
after the initial controversy did Ramshaw regard his research as entailing a sort
of weaponisation of sterilisation for rodents.
As well, on a different level, the way that mousepox has been one of only a
handful of so-called 'experiments of concern' that are repeatedly put forward
speaks to the narrow, individual case-based approach that has come to dominate
framing how the life sciences might aid destructive purposes.69 Herein what
is held to matter are the choices taken at critical ethical moments (for
example, should these results be published? Should experiments be approved?
And so on).
This framing has characterised recent debate regarding the justifications for
the proposed redaction of research undertaken by Dutch and American-based
researchers who mutated the H5N1 virus in a ferret model in such a way as
to enable it to transmit between mammals. While this case was a matter of
much controversy in early 2012, as with other experiments of concern, what
is perhaps most notable is its exceptionality. With regard confined to a very
limited number of cutting-edge experiments, much less attention has been cast on what
the mundane commercialisation of science means for new biowarfare capacities.
69 As in World Health Organisation (WHO) 2011, Responsible Life Sciences Research for Global Health
Security, WHO/HSE/GAR/BDP/2010.2, World Health Organisation, Geneva, Part 2.
argument about moral equivalence has itself been disputed through the
contention that the widespread belief that a moral difference should be drawn
between killing and letting die itself provides an adequate basis for judging
which one is preferable.
In contrast, other ethicists have sought to move away from direct reference
to action/non-action as the basis for evaluating morality. One way that has
been done is by distinguishing between positive and negative duties. Herein
transgressing negative duties (such as the duty to refrain from killing) is treated as more
serious than transgressing positive ones (such as the duty to intervene to prevent death). Yet this
is problematic because some actions (such as preventing someone from being
saved) cut across the starting distinction between killing and letting die. Still
other ethicists have advocated replacing the focus on action in debates about
euthanasia with that on agency and responsibility.73
The manner in which action/non-action is varyingly configured as relevant
speaks to the importance of how issues are identified and how they are
formulated. Action and inaction have been the locus for moral argument while
also being deemed somewhat beside the point. Thus when attending to dual use
vis-à-vis what is missing, in addition to considering the many ways (in)action
is said to matter, how the debate is framed must be considered: what is taken as
counting as (in)action, what evidence supports such claims, what implications
are said to follow.
This complicated picture of the coverage and place of the non- in ethics is
mirrored in the empirical social sciences. Again while fields such as sociology
and political science are generally preoccupied with what is taking place, what
is not happening has also figured as a subject of study. A recurring undercurrent
of the commentary by sociologists on bioethics is that it does not attend to the
structural and institutional conditions that delimit the possibilities available
to individuals. As such, sociological analysis often purports to attend to what
bioethics systematically ignores.
Another facet of the study of the non- in social research is the examination
of how social concerns about science are nullified. Much of this work starts
with Gieryn's observations about how the boundaries between objectivity/
subjectivity, natural/social realms and expert/lay knowledge are routinely
managed within the practice and portrayals of scientists.74 Such boundary work
is part and parcel of how control is maintained over the goals and standards of
science. Along these lines, Cunningham-Burley and Kerr examined how adept
boundary work enabled geneticists to secure the cognitive authority necessary
to attract funding, while placing themselves as authority figures for speaking
73 Coggon, J. 2008, 'On acts, omissions and responsibility', Journal of Medical Ethics, vol. 34, no. 8.
74 Gieryn, T. 1999, Cultural Boundaries of Science, University of Chicago Press, Chicago.
Non-groundings
As suggested in the previous subsection then, within both ethics and social
science, uneven regard is given to what is not taking place. As a result, attempts
in relation to the dual-use life sciences to combine normative justification with
empirical analysis face two types of problems: how the non- is treated within
fields of study and how these fields can be brought together. In relation to what is
absent, what really needs attention is the status quo and therefore what is likely
to foil efforts to move on from it. As part of this, the normative and the empirical
must seek to identify each other's underlying assumptions. The non- as a topic
of study in this respect proves advantageous. This is so because attending to
what is absent requires not just clarifying thinking, but also inquiring
about the conditions under which quandaries arise and are structured.
If the non- has advantages as a topic for study in fostering this dialogue
between the empirical and the normative, it also has drawbacks. A prominent
one is its open-endedness. In de-anchoring analysis from something definite
to something that could be happening, the range of relevant considerations
multiplies manifold. Appeals to research-community interactions, time and
organisational constraints, awareness, professional socialisation,78 widespread
cultural myths and narratives,79 and so on, are among the reasons that could
be cited to explain the lack of professional attention to dual-use issues. And
since those considerations relate to what is not happening, proving their
counterfactual relevance is not straightforward. This in turn makes choosing
75 Cunningham-Burley, S. and Kerr, A. 1999, 'Defining the social', Sociology of Health & Illness, vol. 21,
no. 5, pp. 647–68.
76 From Hoffmaster, B. 1990, 'Morality and the social sciences', in G. Weisz (ed.), Social Science Perspectives
on Medical Ethics, Kluwer Academic, Boston.
77 Frith, L., Jacoby, A. and Gabbay, M. 2011, 'Ethical boundary-work in the infertility clinic', Sociology of
Health & Illness, pp. 1–16.
78 As suggested in Sture, J. 2009, 'Educating scientists about biosecurity: lessons from medicine and
business', in Rappert, 2009, op. cit.
79 Gordon, D. and Paci, E. 1997, 'Disclosure practices and cultural narratives', Social Science & Medicine,
vol. 44, no. 10, pp. 1433–52.
Methodology
From the previous argument of this chapter it is possible to draw some important
conclusions for the study of what is not going on in relation to concerns about
the destructive use of the life sciences. First, while some of the existing literature
in bioethics and social sciences speaks to how issues are not recognised or
what actions are not taken, more systematic thought is needed. Second, with
the somewhat inevitable reliance on the counterfactual and the speculative,
arguments about the non- require careful justification. Although it might be
possible to make a case for why something is not happening that is persuasive
to many, it is likely to be disputed too. Third, a dialogue must be established
between the normative and the empirical. Locating a discussion of the non- of
dual use within the emerging literature about empirical ethics could provide
additional (normative) analytical resources for probing compared with those
typically called upon by social scientists.
In thinking in more specific methodological terms about how to study non-issues
and non-actions, in this subsection I want to advance two considerations:
1) comparative examination, and 2) interventionist inquiry.
Comparative examination
One of the strategies used in researching the exercise of power has been to match
up situations that shared pertinent similarities in order to account for their
variations. For instance, the responses of communities affected by air pollution
have been juxtaposed to inquire about the reasons for the differences between them. In a
related vein, using time as the variable, periods of major social disturbance have
been examined for the opportunities created (and closed) for unconventional
ideas and practices.
80 For a consideration of objectivist and constructivist orientations to dual use as a social problem, see
Rappert, 2007, op. cit., ch. 1.
Interventionist inquiry
That non-issues are non-issues and non-actions are non-actions speaks to the way
in which a spirit of intervention needs to infuse their study.
To expand, the process of questioning practitioners about matters of dual use is
likely to be an act of questioning assumptions, priorities and world views. For
instance, since 2004 Malcolm Dando and I have conducted seminars for university
faculties and other public research centres in order to inform participants about
current life-science security developments as well as to generate debate about
how research findings should be communicated, whether experiments should
be subject to institutional oversight and what research should be funded.
More than 130 seminars have been undertaken in 15 countries (ranging from
the United Kingdom to Uganda and Japan to Argentina), with more than 3000
participants. While it has been possible to generate lively (but bounded)85
discussion about dual-use issues at these events, as interactions they required
careful management because they asked participants to think about their work
anew. As a result, what was said, how, to whom and when were all subject to
lengthy methodological consideration. The decision to run seminar discussions
akin to focus groups in which attendees were encouraged to deliberate with
each other was itself the result of the limitations experienced in a one-to-one
interview format.86
The manner in which probing about non-issues de facto amounts to a form of
intervention suggests the need to consciously attend to how this intervention
is conducted. Overall, what is required is a systematic process of planning
and execution that allows for learning and experimentation. As part of this,
inquiry should be thought of as a practical, intellectual, action-orientated
and consequentialist form of action.87 Kurt Lewin's often-quoted suggestion
that 'if you want to truly understand something, try to change it' indicates
the potential for deliberate intervention to yield insights not readily obtainable
through unobtrusive means.
In fields of social science such as action research, such practical inquiry is
linked to the aim of transforming social relations.88 As with the empowerment
and social-action models noted in the previous section, the factors that
frustrate change require attention. This practical step entails incorporating
85 See, for instance, Rappert, 2007, op. cit., ch. 5.
86 Ibid., ch. 2.
87 Dewey, J. 1929, The Quest for Certainty, George Allen & Unwin, London.
88 See, for example, Ospina, S., Dodge, J., Godsoe, B., Minieri, J., Reza, S. and Schall, E. 2004, 'From consent
to mutual inquiry', Action Research, vol. 2, no. 1, pp. 47–69; Winter, R. 1996, 'Some principles and
procedures for the conduct of action research', in O. Zuber-Skerritt (ed.), New Directions in Action Research,
Taylor & Francis, London; and Winter, R. 1998, 'Managers, spectators and citizens', Educational Action
Research, vol. 6, no. 3, pp. 361–76.
positive normative goals into the design of inquiry. Central to a robust process of
transformative intervention is to ensure that a conversation takes place between
the methods of inquiry and its normative aims. The latter should promote
scrutiny regarding the fallibility and commitments of the methods employed.
Also, the methods should enable the refinement and revision of what is held as
necessary and desirable. Doing so requires not only a certain kind of intellectual
understanding, but also practical skills.
What seems essential in studying what is absent is to find means of questioning
taken-for-granted assumptions about what counts as an ethical or a social
problem in the first place. Rather than going out and probing straightforwardly
overt, recognised issues widely labelled as ethical or contentious, research
techniques and strategies must help to cultivate thinking afresh in order to
avoid confirmation bias, to encourage alternative hypotheses and to embrace
negative evidence.
Inquiry about non-issues and non-actions then cannot be conceived simply as
an attempt to reveal holes in understanding. Instead, it must be a project of
questioning the historical, political and situational bases for how understandings
are formed and thereby what is counted as missing in the first place. As such,
ensuring inquiry interrogates its own starting points is vital. One interesting
direction to take the non- is to consider the hows and whys regarding ethicists'
lack of engagement with dual-use life science. Some preliminary reflections
have already been given on this matter;89 however, undertaking systematic
empirical research might inform an understanding of not only the priorities and
presumptions of ethicists, but also therefore the likely limits of existing academic
disciplinary resources. Moreover, such a line of empirical investigation provides
an opportunity for bringing to the fore questions about how the normative and
the empirical can be combined.
At stake is the question of how facts, figures, concepts and arguments should be
made sense of in order to assess whether ethicists themselves have been remiss
in their past level of regard. The contention that some issues are being neglected
compared with others is commonplace in bioethics, with its attentiveness to
distributive justice. At times this extends to commentary on bioethics' own
agenda.90 For both, the justifications for claims are open to disagreement in
relation to their underlying ethical assumptions. Appeals to consequences
versus duties versus rights, for instance, can result in far different assessments
about what is lacking from the agenda of bioethics. Each appeal is also reliant on
89 Selgelid, M. 2010, 'Ethics engagement of the dual-use dilemma: progress and potential', in Rappert, 2010,
op. cit.
90 For examples of this, see Selgelid, M. 2008, 'Ethics, tuberculosis and globalization', Public Health Ethics,
vol. 1, no. 1, pp. 10–20; and Selgelid, 2005, op. cit.
91 Further, then, whether some definitive sense of the topic needs to be established (and what that should
be) is a choice that needs to be addressed.
Appendix A
Who is working in neuroethics? Where are
they?
In a preliminary overview of the field, it was noticed that 'neuroethicists are not
saying enough about the problem of dual-use',1 a conclusion promptly validated
by Peter B. Reiner, Professor in the National Core for Neuroethics, who wrote:
The truth is that other than Jonathan Moreno, few neuroethicists have
applied serious scholarship to the issue of dual use. Of course, it is
a simple matter to just say no: neuroscience should only be used for
improving the quality of human life. But frankly, that is too simplistic.2
This appendix provides the reader with an overview of where neuroethics
is carried out, who is working in the field and what neuroethicists say about
dual use. Only some of the main institutions and networks as well as leading
neuroethicists are mentioned as representatives of the wider field.
Where
Center for Neuroscience & Society, University of Pennsylvania
The stated mission of the centre is 'to increase understanding of the impact of
neuroscience on society through research and teaching, and to encourage the
responsible use of neuroscience for the benefit of humanity'. <http://neuroethics.
upenn.edu/>
Center for Cognitive Neuroscience, University of Pennsylvania
Penn's Center for Cognitive Neuroscience is a multidisciplinary community
dedicated to understanding the neural bases of human thought. Their current
research addresses the central problems of cognitive neuroscience, including
perception, attention, learning, memory, language, decision-making, emotion
and development. Methods include functional neuroimaging, behavioural
testing of neurological and psychiatric patients, transcranial and direct-current
1 Dando, M. 2010, 'Neuroethicists are not saying enough about the problem of dual-use', Bulletin of the
Atomic Scientists.
2 Reiner, P. B. 2010, Neuroethics at the Core, 19 July, <http://neuroethicscanada.wordpress.com/2010/07/19/
neuroethics-of-dual-use/>.
issues that neuroscience raises for our legal system through its three Research
Networks (Diminished Brains, Addiction and Decision Making) and its
Education and Outreach Program. <http://neuroethics.stanford.edu>
MacArthur Law & Neuroscience Project
The MacArthur Law and Neuroscience Project investigates the impact of modern
neuroscience on criminal law and in particular the diverse and complex issues
that neuroscience raises for the criminal justice system in the United States.
<www.lawneuro.org>
The Neuroethics Research Unit
The Neuroethics Research Unit is committed to training a new generation of
students in neuroethics through the conduct of collaborative interdisciplinary
research. Research interests include the ethical application of neuroscience
in research and patient care, empirical bioethics research, and pragmatism in
bioethics. It is based at the Montreal Institute of Clinical Research.
<http://www.ircm.qc.ca/microsites/neuroethics/en/index.html>
Novel Tech Ethics
The team at Novel Tech Ethics is committed to public discussion of the ethics
issues that affect all human beings. Key focal points of the Novel Tech Ethics
Research Team include
- how are psychopharmacologies creating a 'new normal' for human behaviour, and what is normal for humans when regenerative medicine mixes and matches cells across species?
- how will the results emerging from neuroimaging studies and behavioural genetics affect our understanding, and social and legal enactment, of free will and responsibility?
- what are the particular kinds of harm and benefit offered by neurological treatment, neurological enhancement and neurological control, and how do these challenge traditional notions and practices of risk assessment?
<www.noveltechethics.ca/site_events.php>
University of Wisconsin: Neuroscience and Public Policy Program
The program emerged from the recognition that rapid advances in
neuroscience demand that research neuroscientists be trained to think critically
about both neuroscience and the making of public policy, and that they have the
appropriate skills, experience and networks to facilitate an effective integration
of the two. The program is hosted by the University of Wisconsin–Madison.
A central element of the program is the weekly Neuroscience and Public Policy
Seminar, which challenges students to synthesise information across neuroscience
and policy research. <http://npp.neuroscience.wisc.edu/index.html>
The Neuroethics New Emerging Team (NET)
The Neuroethics New Emerging Team (NET) is based at Dalhousie University,
Canada. Launched in 2003, the NET aims to undertake an interdisciplinary
study of, and to disseminate its findings on, the ethical issues posed by advances
in neuroscience technology. It is funded by the Canadian Institutes of Health
Research. <http://www.neuroethics.ca/Neuroethics.ca>
Who
Colin Blakemore
Colin Blakemore is a neurobiologist at Oxford University who specialises
in vision and the development of the brain. He also holds professorships at the
University of Warwick and the Duke University–National University of
Singapore Graduate Medical School, where he is chairman of Singapore's
Neuroscience Research Partnership. His research has been concerned with
many aspects of vision, the early development of the brain and the plasticity of
the cerebral cortex.
Turhan Canli
Turhan Canli is Associate Professor at Stony Brook University. The work in Dr
Canli's laboratory focuses on the hormonal and neurogenetic bases of individual
differences in emotion and cognition. The research addresses these questions:
what are the biological mechanisms that can explain human personality? What
is the mechanism by which life experience, in interaction with genetic variation,
influences brain function to generate the behavioural patterns that we associate
with certain personality traits? Do men and women differ in how their brains
respond to these genetic and experiential influences? Can this information be
used to identify healthy individuals at risk of psychopathology? To address these
questions, Dr Canli's team uses a number of different technologies: functional
magnetic resonance imaging (fMRI); transcranial magnetic stimulation (TMS);
molecular genetics; and hormone assays.
Arthur Caplan
Arthur Caplan is the Emanuel and Robert Hart Professor of Bioethics and
Philosophy at the University of Pennsylvania. His research interests focus on
transplantation, research ethics, genetics, reproductive technologies, health
policy and general bioethics.
Jocelyn Downie
Jocelyn Downie is a Professor in the Faculties of Law and Medicine at Dalhousie
University. She works at the intersection of law, ethics and health care. Her
research interests include women's health, assisted death, research involving
humans, and organ transplantation. Her work is interdisciplinary, collaborative
and geared both to contributing to the academic literature and to effecting
change in health law and policy at the federal and provincial levels.
Martha J. Farah
Martha J. Farah is a cognitive neuroscientist at the University of Pennsylvania
who works on problems at the interface of neuroscience and society, including
- the effects of childhood poverty on brain development
- the expanding use of neuropsychiatric medications by healthy people for brain enhancement
- novel uses of brain imaging in, for example, legal, diagnostic and educational contexts
- the many ways in which neuroscience is changing the way we think of ourselves as physical, mental, moral and spiritual beings.
Kenneth R. Foster
Kenneth R. Foster is a professor of bioengineering. His research interests relate
to biomedical applications of non-ionising radiation, from audio through to
microwave frequency ranges, and to the health and safety aspects of electromagnetic
fields as they interact with the body. For example, he examines exposure
among workers in electrical occupations and the possibility (or lack thereof) of an
associated cancer risk. Another, somewhat broader, topic of interest is technological
risk and the impact of technology (principally, electro-technologies) on humans.
His goal in this area is to examine technology, putting its relative risks and
benefits to society into perspective. What he hopes to impart is a better perception
of the social use of science.
Michael Gazzaniga
Michael Gazzaniga is a professor of psychology and the Director of the Sage
Center for the Study of the Mind at the University of California, Santa Barbara.
He oversees an extensive and broad research program investigating how the
brain enables the mind. Over the course of several decades, a major focus of
his research has been an extensive study of patients who have undergone
split-brain surgery, which has revealed the lateralisation of functions across the
cerebral hemispheres.
Henry Greely
Henry Greely is the Deane F. and Kate Edelman Johnson Professor of Law at
Stanford University. A leading expert on the legal, ethical and social issues
surrounding health law and the biosciences, Hank specialises in the implications
of new biomedical technologies, especially those related to neuroscience, genetics
and stem-cell research. He frequently serves as an advisor on Californian, national
and international policy issues. He is chair of California's Human Stem Cell Research
Advisory Committee and served from 2007 to 2010 as co-director of the Law
and Neuroscience Project, funded by the MacArthur Foundation. Active in
university leadership, Professor Greely chairs the steering committee for the
Stanford Center for Biomedical Ethics and directs both the law school's Center
for Law and the Biosciences and the Stanford Interdisciplinary Group on
Neuroscience and Society. He also serves on the Scientific Leadership
Council for the university's interdisciplinary Bio-X Program.
Ronald M. Green
Ronald M. Green is the Eunice and Julian Cohen Professor for the Study of Ethics
and Human Values at Dartmouth College, where he has been a member of the
Religion Department since 1969; he also directs Dartmouth's Ethics Institute,
a consortium of faculty concerned with teaching and research in applied and
professional ethics. Professor Green's research interests are in genetic ethics,
biomedical ethics and issues of justice in healthcare allocation. He is the author
of six books and more than 130 articles in theoretical and applied ethics.
Judy Illes
Judy Illes is Associate Professor of Pediatrics (Medical Genetics) and Director of
the Program in Neuroethics at the Stanford Center for Biomedical Ethics. She also
co-founded the Stanford Brain Research Center (now the Neuroscience Institute
at Stanford), and served as its first executive director between 1998 and 2001.
Today, Dr Illes directs a strong research team devoted to neuroethics, and
specifically to issues at the intersection of medical imaging and biomedical ethics.
These include the ethical, social and legal challenges presented by advanced
functional neuroimaging.
Peter B. Reiner
academic life, Dr Reiner refocused his scholarly work in the area of neuroethics,
with interests in neuro-essentialism, the neuroethics of cognitive enhancement
and the commercialisation of neuroscience.
Paul R. Wolpe
Paul R. Wolpe is Associate Professor of Psychiatry in the Department of
Psychiatry at the University of Pennsylvania, where he also holds appointments
in the Department of Medical Ethics and the Department of Sociology. He is
President of the American Society for Bioethics and Humanities and is
Co-Editor of the American Journal of Bioethics. Dr Wolpe serves as the first Chief
of Bioethics for the National Aeronautics and Space Administration (NASA).