
Genomics

Building bioethics into the future of life sciences innovation

Gene editing accelerates a détente between the laboratory and social sciences over questions that direct future research

by Rick Mullin
August 27, 2018 | A version of this story appeared in Volume 96, Issue 34

 

A man and a woman stand on stairs.
Credit: Rick Friedman
Ethicist Jeantine Lunshof has an office in the laboratory of chemist and genetics professor George Church at Harvard. They agree that scientists are asking the right questions about the ethical implications of their research.

In brief

The advent of low-cost, easily accessed gene-editing technology has accelerated a decades-long effort to establish a means of assessing the safety and ethical deployment of scientific innovation. The prospect of clinic-ready tools for altering the human germ line has heightened anxieties in the social sciences and in the laboratory, where scientists are increasingly raising nontechnical questions that can be answered only in collaboration with ethicists, lawyers, regulators, and the public. A détente is emerging between two disciplinary approaches to answering questions: the philosophical and the empirical. It is likely to change how science is done in the future.

In 2015, Jennifer Doudna, codeveloper of the CRISPR/Cas9 gene-editing technology, convened a meeting in Napa, Calif., aimed at setting the parameters for ethical use of the groundbreaking genomic technique. Her template for the proceedings was a meeting of scientists that took place in 1975 down the coast at the Asilomar Conference Center near Monterey. There, the pioneers of biotech agreed to stop laboratory work with recombinant DNA until they could reach a consensus on how and where experiments could be pursued safely.

It is not surprising that Doudna, a professor of chemistry, biochemistry, and microbiology at the University of California, Berkeley, chose to model her event on the Asilomar conference. The science community still considers that meeting to be a model for vetting the safety, ethical implications, and societal impact of innovation.

Nor was it surprising that she was criticized for doing so. Societal concern over the impact of genomic research has risen steadily since the early days of biotechnology, and critics felt a meeting such as Doudna’s in 2015 needed to include voices from outside the laboratory—those of ethicists, regulators, and the public.

Also, the stakes are higher. Altering the human germ line—introducing changes to genes that influence heritable traits—was not considered a threat in 1975. Now, “we are for the first time contemplating changing the genomes of individuals, and in that sense changing the genomes of the whole world of human beings,” says biologist David Baltimore, former president of California Institute of Technology and one of the coordinators of the Asilomar meeting.

Baltimore also attended Doudna’s Napa event, which he says laid the groundwork for other meetings that have incorporated more diverse voices, including one scheduled for November in Hong Kong, as well as a panel organized by the U.S. National Academies of Sciences, Engineering & Medicine. It helps that the field of bioethics is more established and the science community is more open to discussing its work with nonexperts, Baltimore says. Overall, momentum is building for a multidisciplinary vetting of scientific innovation that some say will permanently change how research is done.

“Asilomar was 40 years ago now,” says Hugh Whittall, director of a U.K. organization of ethicists called the Nuffield Council on Bioethics. “I think over those 40 years, but in particular over the last, say, five years, many people working in science increasingly recognized that these kinds of areas of work need to be opened up to a much wider public discourse.”

Nevertheless, mixing social and laboratory sciences can be challenging. It involves melding philosophical and empirical approaches to answering questions. Conflicts of interest arise when scientific innovators try to lead discussion of how to safely and ethically apply their inventions. And although scientists today are generally more open to discussion, some still resist communicating with the public.



Drawing lines

Questions about the safe and ethical use of innovation predate the development of gunpowder. Coordinated endeavors to answer those questions and control access to high-risk technologies arose more recently. Many would say such efforts started with the international Treaty on the Non-Proliferation of Nuclear Weapons, signed in 1968.

Photo of a woman sitting in front of a bookshelf in an office.
Credit: Courtesy of Jennifer Doudna
Jennifer Doudna, a codeveloper of the CRISPR/Cas9 gene-editing technology, compares introducing a new technology to opening Pandora's box: "You can’t close the box again. You wouldn’t want to anyway because of the potential of great things to come from it."

With heightened focus on biotechnology in the 1980s, the precautionary principle took hold, calling on researchers to anticipate harm before it occurs. Then came the decoding of the human genome and the plummeting cost of genomic technologies, accompanied by rising innovator anxiety.

“When we first started to discuss application in the human germ line, I was very worried about the pace at which the CRISPR genome editing was being deployed, and I wondered whether there was appropriate caution being exercised,” Doudna says. In particular, rumors had surfaced of Chinese researchers using the technology on human embryos. Doudna’s concerns prompted her to organize the Napa meeting.

The findings from that meeting, published in Science, proposed that “the potential safety and efficacy issues” posed by CRISPR “be thoroughly investigated and understood before any attempts at human engineering are sanctioned, if ever, for clinical testing” (Science 2015, DOI: 10.1126/science.aab1028).

Doudna’s meeting was followed by a National Academies summit in Washington, D.C., that marked the launch of a panel charged with compiling genome-editing guidelines. The panel published recommendations last year as “Human Genome Editing: Science, Ethics, and Governance.” It envisions heritable human genome editing to treat or prevent a disease in cases in which clinicians lack a reasonable alternative treatment and researchers have a strong understanding of the gene directly involved in the disease. The panel called for “maximum transparency,” regulatory oversight, and multigenerational follow-up.

Similarly, the Nuffield Council on Bioethics last month published “Genome Editing and Human Reproduction: Social and Ethical Issues.” That guidance states that heritable human genome editing “could be morally acceptable” under criteria similar to the National Academies committee’s.

Doudna, who last year published “A Crack in Creation: Gene Editing and the Unthinkable Power to Control Evolution,” a book coauthored with Samuel H. Sternberg, is pleased with the level of interdisciplinary discussion of CRISPR that followed the Napa meeting.

“Over the last three years there has really been more active engagement between the scientific community and bioethicists who do this professionally,” Doudna says. In the process, many scientists have come forward to question the wisdom of moving ahead at full speed before reaching a consensus on safety, environmental consequences, and ethics when research can have major societal impact.

Nuffield Council Director Whittall agrees with Doudna that scientists and ethicists are increasingly engaged in working together to chart a safe and ethical course for technologies such as gene editing and to create more regulatory oversight.

But he says that discussions are still largely the domain of academics and professionals—lab scientists, ethicists, philosophers, and lawyers—and lack participation of the general public. “That is why we in our report call for the U.K. government to set up a body that specifically has the job of forming and promoting a broad and inclusive public debate around gene editing and related questions,” Whittall says.

And other work is still needed before making decisions about editing the human genome. “We are not close to the amount of research needed to gauge precision and safety,” said bioethicist R. Alta Charo, a professor of law and bioethics at the University of Wisconsin, Madison, in a 2017 presentation. Charo was one of two nonscientists at the Napa gathering, and she cochairs the National Academies genome editing committee.

Questions from the bench

At no point in recent deliberations has there been a call to halt research entirely, as the scientists at Asilomar agreed to do. Scientists have preferred to continue experimenting, dealing with ethical questions as they arise.

“The danger of using philosophy first and mapping everything out is that on any journey, unexpected things happen,” says Insoo Hyun, a professor of bioethics and philosophy at Case Western Reserve University.

However, holding off on asking questions is also inadvisable, Hyun says. Waiting to discuss ethical values until the technology is advanced enough for clinical trials leading to babies with altered genomes is too late, he says. “We should be having the discussion about whether it is even desirable to try to do germ-line engineering, even in the dish, right now.”

Some scientists are highly committed to seeking a consensus on nontechnical ethical questions about research and technology application, Hyun says. “There are people who will sit on a committee for a year and a half writing guidelines. They are not necessarily a representative group of scientists, but they are very sincerely concerned about making sure the public does get to have some input and to have some of their concerns aired out and heard and taken seriously by scientists.”

“My experience is that ethical questions or deeper questions are identified and raised by the scientists themselves in the first place,” says Jeantine Lunshof, a philosopher and ethicist working in the biological engineering laboratory of chemist and genetics professor George Church at Harvard University. Lunshof, who began working with Church in 2006, says her daily access to Church’s genomic engineering lab gives her a front-row view of research and of how researchers are crossing over into the nontechnical realm to assess societal impact and direct the use of their science.

Lunshof, who rejects the term “embedded ethicist,” with its watchdog implications, bemoans a general lack of understanding of the function of an ethicist.

“It is not the case, as some people think, that I sit here as an ethicist and identify incredible or terrible or horrible ethical issues,” Lunshof says. “The scientists themselves have a very good feeling for moral issues and values questions, where they arise. Being an ethicist in the room, I can react to that immediately and say, ‘Hey! We should discuss that.’ I don’t throw cold water on anyone. I am not the ethics police. A sign on my door says that.”


That’s not to say that ethical concerns should never slow science. “I’m as close as you can get to being a scientist and an ethicist in the same person,” says Church, who claims to have published 21 articles on ethics, “and I do not think we should be green-lighting anything.”

He agrees with Lunshof that the important questions regarding ethics in the laboratory are generated by scientists. “Technologists have, for better or worse, the ability to see further out because we happen to see the technology and where it’s going,” he says.

At the same time, Church advocates for greater regulation of new technologies. As an extreme example, Church has called for government licensing and even surveillance of synthetic biologists. “I feel that everybody that passes through my lab even temporarily should be under surveillance for life,” Church says. “Nobody forced them to be synthetic biologists. They did that voluntarily, and they gave up some of their civil liberties when they did that.”

If Church draws a bright line anywhere on the ethical application of science, he says, it would be at any point where proceeding—or not proceeding—would pose a threat to the human species.

But not all scientists are interested in debating where such lines should be drawn. For some it’s an issue of believing that nonscientist critics lack the expertise to understand the consequences of the research.

Others say, “That’s not my department. I’m not responsible for the ethics; I’m not paid to do ethics; it’s a distraction,” Church says. “Then there are other people, like me, who say, ‘Well, it doesn’t take that much time; I’m going to publish a few papers on ethics and give a few talks.’ ”

Lunshof attests that Church approaches problems as both a scientist and a philosopher, essentially bridging the objective and the subjective. “George’s thinking goes beyond empirical truth,” she says.

Crossing over

But crossing over from scientific research to the social sciences may not be as easy as Church makes it sound. Bioethics, a fledgling discipline at the time of the Asilomar conference, has matured into a complex academic field whose protocols and practices draw on a variety of social sciences, philosophy, theology, and public policy, territory unfamiliar to most chemists and biologists.

“When scientists say we need more science literacy, I turn around and say we need more ethics literacy,” says Françoise Baylis, a professor of bioethics and philosophy at Dalhousie University.

Like Lunshof, Baylis sees her role as a facilitator. She wants to “create spaces for more and more participation,” she says, while sharing her own views that are informed by a philosophical commitment to social justice. Her reception by the science community has been mixed.

“I am perceived by some as an interesting interlocutor,” she says. “They enjoy talking with me, whether they agree with me or not. Often they don’t agree with me but find it interesting that somebody from my perspective engages with them. Others are more dismissive: I am a Luddite, I’m not a scientist, I don’t understand the science, and I have no right to speak to anything.” Still others, she says, are extremely responsive to her concerns and agree that there are questions of social justice that should be addressed in guiding research.

Photo of a woman standing outside a building.
Credit: Melissa DeWitte
Sociologist Jenny Reardon, a founding director of the Science & Justice Research Center at the University of California, Santa Cruz, advocates multidisciplinary academic training to meld the social and laboratory sciences.

Social justice is a central issue in bioethics, but it’s poorly understood outside ethicist circles, says Jenny Reardon, a professor of sociology at the University of California, Santa Cruz, where she was a founding director of the Science & Justice Research Center. Most people, scientists included, view it as a matter of determining which groups in society have access to beneficial technologies. But there is also the question of which groups in society decide which technologies will be developed, she says.

Imagination and ideation matter in developing technologies. “If you just have a certain kind of person designing the technology—say, all tech people from Silicon Valley—they are going to design technologies that aren’t going to be useful and may even be harmful for some groups of people,” says Reardon, who is working to create a more robust system for interdisciplinary bioethics, including academic training that melds the social and laboratory sciences. “We haven’t figured out how to create collectives with the expertise we need to determine what the world will look like with new technology and what rights we will have in it,” Reardon says.

As constructs for interdisciplinary vetting of science emerge, Reardon says it is important that bioethicists maintain an independent perspective on laboratory research. Ben Hurlbut, a professor of biology and society at Arizona State University, agrees. “There is a certain kind of bioethicist that has been incorporated into the structures of bioscience research itself,” he says. He acknowledges that such ethicists play an important role, but he questions whether their proximity to research teams allows them to adequately address “the far-reaching questions that need to be asked to inform the trajectories of research.”

Lunshof, an example of such an ethicist, contends that there are advantages and disadvantages to working either within a laboratory team or independently. “I would not claim that one is better than the other,” she says. “You get very different views and apply different theories. They are complementary.”

Lunshof, who this year began working on-site at the Massachusetts Institute of Technology Media Lab as well as with Church’s group, adds that ethicists situated in laboratories often collaborate with independent ethicists, bringing both perspectives together. She says, for example, that she and Hyun at CWRU are in the preliminary stages of a project focused on work in the Church lab.

Hurlbut contends that ultimately a broader range of views is required. “The real challenge is to achieve the kinds of inclusive, heterogeneous, potentially divisive, but rich forms of democratic deliberation” to guide the trajectory of science and technology in a way that respects human integrity, Hurlbut says.

Conflict and resolution

Ethicists agree that gene editing has accelerated efforts to establish a balance of voices in bioethics. Fears of designer babies and killer viruses date back to the earliest genomic technologies, “but this is different,” says Dominique Brossard, chair of the department of life sciences communication at the University of Wisconsin, Madison. “We are not just talking about people who feel a technology challenges their value system—people who, by the way, are entitled to talk,” she says. Rather, “people who are involved with the technology and understand the dimensions of the ethics are pointing out the fact that this is going too fast. Scientists are calling the technology into question.”

When those scientists are also innovators, there are obvious possible conflicts of interest, Brossard and others say. “We need to be vigilant in seeing that whoever talks about a scientific advance really states clearly where they are coming from,” Brossard says. “People need to understand where they are coming from and understand their own biases.”

Kevin Esvelt—director of the sculpting evolution group at the MIT Media Lab and an innovator in the use of CRISPR technology in gene drive, a technique that accelerates natural evolution by driving a particular suite of genes throughout a population—says scientists are well equipped to identify possible areas of conflict. He agrees with Brossard that they must state them from the outset.

“Our role is to present what we believe is technically possible, acknowledging its incompleteness,” Esvelt says. “We have guesses on the pros and cons, but they are guesses. We acknowledge that. We also acknowledge that we’re biased.

“Then, society and everyone in it, please help us determine the best course of action,” Esvelt says.

Elizabeth Heitman, a professor in the Program in Ethics in Science & Medicine at the University of Texas Southwestern Medical Center and a cochair of the National Academies Committee on Gene Drive Research in Non-Human Organisms, concurs with Esvelt that scientists are in a good position to identify conflicts of interest as well as ethical questions about their work. But those questions will not be resolved, she says, without scientists engaging a well-informed general public.

“There may be a time that gene drive is suggested for a problem that some people take seriously and others don’t,” Heitman says. “Who is going to be the person or group to make the decision to go forward? It’s likely to be a combination of people suffering with the problem and people with the technology to do the work.”

Scientists who refuse to engage with ethicists and the public will find themselves at a disadvantage. “Just because you are a scientist and have invented something doesn’t mean you have authority over it,” says Fred Gould, an entomologist and codirector of the Genetic Engineering & Society Center at North Carolina State University. He points to the National Academies report’s advocacy of participatory decision-making. Resistance from the science community based on ethicists and the public not fully understanding the science wears thin, he says. “You are a pretty poor scientist if you can’t explain what these things are about to an ethicist,” he says.

Accepting responsibility

Researchers on the front lines acknowledge that putting new technologies into the world is a learning experience, one that opens pathways to beneficial use while raising red flags regarding misuse. They are also anxious about the public’s perception of the pros and cons and about their personal responsibility for how technologies they introduce are deployed by others. Meeting that responsibility, Church says, is a matter of taking the time to investigate ethical issues—a process that by no means impedes science.

Industry has a role to play as well. Merck KGaA convened an ethics panel in 2011 to guide the company on its work in genomics and its marketing of genomic equipment. When Merck acquired Sigma-Aldrich in 2015, becoming the lead supplier of CRISPR and other gene-editing technologies, its panel—an international group comprising scientists and a range of nonscience professionals, including a theologian—turned to issues such as verifying the legitimacy of customers.

“Are they legitimate researchers?” asks Steven Hildemann, a physician and chief medical officer in Merck’s biopharma division and cochair of the ethics panel. “Can we put in checks and balances to determine who we will work with and who we won’t? We are talking about gene editing now, not new software or the next iPhone.”

Individual scientists developing new technologies that could be exploited also need to ask questions, Doudna says. “I have learned a lot more of the fundamental aspects of human biology than I knew before,” she says of her experience with CRISPR. “I really am a biochemist and a chemist by training, not a human biologist by any stretch, and I came to appreciate that there is a lot of very fundamental knowledge about early human development and embryogenesis that is unknown.”

Doudna is cautious about any clinical applications of human genome editing. She says she is especially concerned about a backlash resulting from any use of CRISPR that is perceived as harming society or the environment.

Introducing new technology is like opening Pandora’s box, Doudna says. “You can’t close the box again. You wouldn’t want to anyway because of the potential of great things to come from it,” Doudna says. “It just requires that the folks involved in the original science actively engage in a discussion of how to regulate it, how to self-regulate it, and how to appropriately steward it in the future.”

Esvelt is adamant about his personal responsibility for the uses of technologies and techniques he develops. “I outright assume that I am personally morally responsible for the consequences of everything that I research,” he says. “That is, I am on the hook for anything anyone ever does with CRISPR-based gene drive because I am the one who invented that and introduced that to the world.”

Esvelt says he advocates a change in “the modus operandi” of science, whereby innovators closely scrutinize the work others do with their inventions. “This should make anyone in my field deeply uncomfortable,” he says, “in that I am by definition a busybody, poking my nose into what they are doing. I am deeply concerned with fundamentally changing how we do science because six years ago, no one even imagined we would be able to readily edit entire wild species.”

CORRECTION: This story was updated on Aug. 27 to remove the word “widely” from the amount of criticism Doudna received for modeling her 2015 Napa meeting on the 1975 Asilomar conference. Several people did criticize Doudna at the time, but “widely” implied a larger immediate response than occurred.
