How do you handle the data of a scientist who violates all the norms of his field? Who breaches the trust of a community that spans the entire globe? Who shows a casual disregard for the fate of the whole human species?
On the one hand, you might want to learn from such a person’s work; to have a full and open dissection of everything that went wrong. Because, spoiler, there was a lot that went wrong in the case in question. But rewarding such “abhorrent” behavior, as one scientist put it, with a publication—the currency of the scientific world—would send a message that ethical rules only exist to be broken.
This is the precarious situation in which we find ourselves today, as scientists hash out the next chapter of the human gene-editing scandal that erupted two weeks ago, when the Chinese scientist He Jiankui revealed that for the last two years he has been working in secret to produce the world’s first Crispr-edited babies. Scientists denounced the work almost unanimously, citing its technical failures as well as its deep breaches of ethical (and possibly legal) lines. What’s much less certain is what should happen to the work, now that it’s been done.
Hours after He presented data on the twin girls at an international genome editing summit in Hong Kong, copies of his slides were already circulating in email inboxes and on Twitter. Scientists scrutinized the work, 280 characters at a time, and pointed out all the questions that remained unanswered. It was the kind of conversation that normally would take place under the auspices of a journal. But He, who made his announcement over YouTube, has so far produced no manuscript for public consumption. A paper describing this work is reportedly under peer review, and a second one about additional Crispr experiments in human embryos was rejected by an international journal over ethical and scientific concerns, STAT reported Monday morning.
Scientists are beginning to grapple with the very real possibility that He’s work may never be awarded publication status, along with its attendant sheen of legitimacy. And that may be the academic justice he deserves. But it also highlights an intractable tension embedded in scientific publishing: policing bad actors comes at the cost of scientific censorship.
“It’s a very dicey issue,” says Michael Eisen, a molecular biologist at the University of California, Berkeley, and a staunch advocate of open-access publishing. “There need to be consequences for people who do things that are deemed to be unethical. You don’t want to have a system that gives people reasons to just randomly experiment on people.”
The scientific publishing system, imperfect as it may be, has remained relevant in an era where anyone can buy a URL, self-publish a paper, and push it out to social media platforms reaching millions of people, all in the span of an afternoon. The reason is that data wants to be seen in context, in conversation with other data. Through the connective tissue of citations, scientific journals establish a common set of vetted facts to debate, challenge, and be inspired by. They ensure some modicum of permanence for those facts, so that people today, tomorrow, and 100 years into the future can all point to the same digital object identifier assigned at publication and know that they’re all talking about the same thing.
What then are the scientific costs to building a foundation for the field of human germline editing with one very consequential brick conspicuously missing? Disappearing the data down a memory hole presents logistical challenges as well as philosophical ones. Does the original sin of He Who Must Not Be Named preclude society from studying these twin babies as they grow up and maybe have children of their own? Addressing these questions will require decoupling the knowledge-building purpose of scientific publishing from the career-building one.
Now, lest you think these are just #ivorytowerproblems, let’s be real for a second. There are going to be more Crispr babies. Maybe not next year or the year after that. But they’re coming, and not just in China. Last week, Harvard researchers announced that they plan to edit the DNA of human sperm to see if it’s possible to create IVF babies with lower risks of developing Alzheimer’s later in life. All around the world, researchers are doing studies in mice and monkeys, filing patents, and starting companies, all with an eye toward a future where germline editing becomes a legal, socially acceptable technology. How the scientific community responds in the present moment will have huge consequences for how, and how fast, that happens.
“You would hate for some future experiment to fail or have some problem that could be avoided had people studied what happened here,” says Eisen. “In some sense there might even be an ethical duty for people to consider what was done.” And despite the uproar, scientists have not backed a moratorium; research on embryo editing continues.
During the Hong Kong summit, an audience member asked He if he would be willing to post his work to a public forum, such as the biology preprint server bioRxiv, so the scientific community could have access to the data. He said that the journal considering his manuscript had advised against posting anything to bioRxiv until the paper had passed peer review. He did not specify which journal. Nor did He return WIRED’s requests for comment. But scientists who have seen the manuscript doubt it will pass peer review any time soon, if ever.
“It was a very shoddy paper, very incomplete. What I saw wouldn’t pass any journal,” says Eric Topol, a cardiologist and director of the Scripps Research Translational Institute who reviewed He’s manuscript for the Associated Press. Other scientists have also denounced the experiment as a technical failure, based on the slides He presented in Hong Kong.
The edit He was trying to mimic is a 32-base-pair deletion in the CCR5 gene that occurs naturally in some people of Northern European ancestry. Having two copies of that specific mutation prevents cells from making a functional CCR5 receptor, which HIV uses as a doorway into human immune cells. Instead, He introduced two new, unstudied mutations in one twin, Nana. In the other, Lulu, Crispr managed to edit only one copy of the CCR5 gene, again with a novel alteration. That means her unedited copy will still make CCR5, and she will likely remain susceptible to HIV. No one knows whether the novel mutations will provide a protective effect; they might even be harmful. On top of that, early data suggests that both girls have a patchwork of edited and non-edited cells, a phenomenon known as mosaicism.
The work’s moral failings are equally numerous. Besides choosing to cripple a normal gene to reduce the risk of a preventable, controllable disease neither child had, He personally took study participants through the informed consent process, in which he had no training, and during which he falsely described his work as an “AIDS-vaccine development project.” The consent documents made no mention of the risks involved in disabling the CCR5 gene—including the potential for increased susceptibility to other viruses like West Nile and influenza. And the hospital where He claimed to have ethical approval denied knowledge of any such project and said in a statement that the signatures on the approval form are suspected to be forgeries.
The dilemma now, Topol says, is whether any publication or preprint server should be party to something so deeply sunk in a moral morass. “This hasn’t come up before because nothing has breached the ethics of human research like this,” says Topol. “It’s highly problematic to publish it anywhere.”
That includes bioRxiv, which was launched in 2013 by scientists at Cold Spring Harbor Laboratory to make scientific information available faster. Submissions to bioRxiv go through a quick (24-48 hour) screening process that filters out obviously non-scientific material, plagiarism, and thinly veiled submissions by activists. Scientists wanting to upload human studies have to list registered clinical trial IDs, meaning the studies have passed some form of ethical review.
He’s Crispr baby work was technically listed with China’s clinical trial registry, but it does not appear he sought prior approval from Chinese regulators. According to the AP, the study was registered on November 8, 2018, long after it began. Richard Sever, a molecular biologist and bioRxiv co-founder, declined to comment on He’s work specifically, but he did say that the preprint server would exercise its right to turn away any papers with known ethical or legal violations. “Our intention is not to provide a platform that seems to endorse or encourage unethical work,” says Sever. “That would be a very dangerous precedent for bioRxiv.”
All this hand-wringing over the moral complicity of publishing platforms raises a tree-falling-in-the-forest line of existential questioning: If no one will publish what He did, does that mean it’s not science?
Depends on what you mean by that.
Science with a small “s” is a human enterprise as old as humanity itself. Nibbling on that tasty-looking mushroom and waiting a few hours to see if you get sick? That’s hypothesis testing. Try it a few more times with successively bigger bites, maybe add a bit of open-fire cooking; you’ve got a scientific method going. He’s human experiment is clearly science in this sense.
Whether it will become Science with a big “S” remains to be seen. This more rigorous meaning of Science—which seeks to accrue knowledge by progressively, and systematically, reducing uncertainty—has only been around a few hundred years. Its arrival was marked by the development of the scientific paper, published in the pages of peer-reviewed journals. Before the 1600s, scientists communicated over private correspondence or in lectures. The scientific paper then became, and still is, the enabling unit of Science as a progressive, global enterprise.
So what, then, is to be done with the work of researchers like He, who step outside the bounds of acceptable Science? It’s a question that has mostly come up only in a backward-looking way, applied to studies that might have met the ethical standards of their day but have since been roundly denounced. The Tuskegee study—which denied African-American men syphilis treatment—comes to mind, as does Operation Sea-Spray, the US Navy’s covert release of supposedly harmless bacteria over San Francisco, later linked to a cluster of infections and at least one death.
Then you have the case of Edward Jenner, who in the 1790s began experimenting on people with cowpox, injecting them with material taken from diseased dairy cows to see if it would protect them against smallpox. The Royal Society rejected his paper on the topic. Feeling it was an important public health contribution, Jenner published his case studies privately. The account led to the formation of mass vaccination campaigns and the eventual eradication of smallpox from the face of the Earth.
He’s few public statements have hinted at his ambitions to be a modern-day Jenner, ambitions that may have blinded him to his transgressions. Now the scientific establishment will have to decide if it too will wear blinders. Never before has the academic publishing world had to contend in real time with research that nearly everyone agrees was profoundly wrong. And if anything, the last two weeks have made it all too clear just how unprepared anyone is to do that.