About six months ago, the College of American Pathologists (CAP) surveyed the labs that it accredits to perform diagnostic tests. The organization asked how many labs were already offering genetic tests using “next-generation” DNA-sequencing technology and how many planned to start offering them soon. Such tests can guide the diagnosis and treatment of patients with cancer, some forms of heart disease, and rare genetic diseases.
Out of the 176 labs that responded to the survey, 19% were already offering these tests and 55% said they planned to use the tests within three years. Just a few years ago, scientists thought such next-gen sequencing was about a decade away from regular clinical use.
But things are moving faster than expected. Although there are advantages for patients, the speed also means testing facilities already face challenges that can lead to missed diagnoses. That’s because of an unsettling truth: The gene variants identified in sequence data sometimes differ depending on which bioinformatics tools are used to analyze that data.
Worse, little standardization exists to guide clinical labs on how to best use the tools. As the cost continues to drop, next-gen sequencing will soon be adopted by smaller labs, according to Ira M. Lubin, a geneticist at the Centers for Disease Control & Prevention (CDC). “You’re always going to get a result. Whether it’s a good result is not going to be obvious unless you have real experts” to analyze it, he says.
Regulations surrounding sequencing tests are still evolving in light of continuing concerns about test quality and reproducibility. And uncertainties remain about the ethics of informing physicians and patients about test findings on disease tendencies unrelated to the original medical conditions that prompted the tests.
Next-gen sequencing is a catch-all term for the various successors to Sanger sequencing, the method used a decade ago for the Human Genome Project. Sanger sequencing is a laborious method—the first human genome analysis took more than a decade to complete and cost nearly $3 billion.
Sanger sequencing is still used for single-gene tests, where it continues to be a cost-effective approach. “For single genes, it’s actually cheaper to do it by Sanger,” says Heidi L. Rehm, director of the Laboratory for Molecular Medicine at Partners HealthCare Center for Personalized Genetic Medicine, in Cambridge, Mass.
But for larger-scale sequencing tasks, next-gen technologies are much faster and cheaper. Though they differ in the particulars, the techniques all involve sequencing many short, overlapping fragments simultaneously. Those short sequences are lined up against a reference genome to determine where they fit in the overall genome. Because the sequences are so short, each location must be analyzed in multiple fragments to be sure which base is there.
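The pileup logic can be sketched with a toy example. Everything below—the reference, the reads, and their alignment positions—is invented for illustration; in practice an aligner such as BWA or Bowtie finds the positions, and real pipelines must also handle sequencing errors, gaps, and ambiguous placements.

```python
# Toy illustration: short overlapping reads are aligned against a
# reference, and the base at each position is called by majority vote
# across the reads that cover it.
from collections import Counter

reference = "ACGTACGGTCA"

# Each read is (alignment start position in the reference, sequence).
reads = [
    (0, "ACGTA"),
    (2, "GTACG"),
    (4, "ACGGT"),
    (6, "GGTCA"),
]

def consensus(reference, reads):
    """Call the most common base observed at each reference position."""
    piles = [Counter() for _ in reference]
    for start, seq in reads:
        for offset, base in enumerate(seq):
            piles[start + offset][base] += 1
    # Positions with no coverage fall back to 'N' (unknown).
    return "".join(p.most_common(1)[0][0] if p else "N" for p in piles)

print(consensus(reference, reads))  # → ACGTACGGTCA
```

Because a middle position such as 4 is covered by three reads, a sequencing error in any one of them is outvoted by the other two—this redundancy is why each location must be read in multiple fragments.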
Although multiple platforms qualify as next-gen sequencing systems, instruments from two companies dominate the clinical landscape: the HiSeq and MiSeq from Illumina and the Ion Torrent systems from Life Technologies. “The machines are such that the lab operator doesn’t have to do much,” says Nazneen Aziz, director of molecular medicine at CAP and lead staff member of its Next-Generation Sequencing Work Group. Next-gen sequencing makes it “so easy to generate these sequences at low cost in an automated fashion,” Aziz says. “That’s why it’s being adopted.”
Next-gen technology is being used in clinical testing labs mostly to sequence panels of multiple genes but also to analyze exomes—protein-coding regions of genomes—and even whole genomes.
Most of those tests are panels of genes associated with cancer risk. But labs are offering panels for a variety of other diseases as well. Some panels involve only a few genes, whereas others involve hundreds of potential culprits.
Genomics & Pathology Services at the School of Medicine at Washington University in St. Louis focuses exclusively on diagnostic gene panels; its workhorse offering is a panel of 40 cancer-related genes. The gene panel orientation stems from the way the university service is funded—by government and private health insurance rather than by grants or philanthropy. Insurance programs “only pay for information that directly impacts the care of a patient now,” says John D. Pfeifer, vice chair for clinical affairs in the university’s department of pathology and immunology.
Other labs are starting to offer exome- and whole-genome sequencing. Rather than focusing on only a handful of specific genes, these types of tests look at all genes, either just the protein-coding regions or the whole thing, including the regions between genes. For many of these genes, the connection to disease is not well understood.
Earlier this year, the Medical College of Wisconsin’s Human & Molecular Genetics Center became the first lab to offer full-service clinical whole-genome sequencing to patients worldwide. The sequencing and analysis cost $17,000, but other clinical costs can vary from patient to patient. Whether such tests are covered by insurance varies from case to case, a medical college spokeswoman says. Other labs are following suit. The Laboratory for Molecular Medicine at Partners HealthCare Center for Personalized Genetic Medicine is in the process of launching its exome- and genome-sequencing service.
The most common situation calling for clinical exome or whole-genome sequencing involves what David P. Bick, the medical director for genetics at Children’s Hospital of Wisconsin and a professor at the Medical College of Wisconsin, calls the “diagnostic odyssey.” In such cases, “a patient has been seen by a geneticist here and there. They’ve had several rounds of testing over the years, where people do their best guess about which gene to chase” for the condition the patient has. But they’ve exhausted the usual set of diagnostic tools and have not been able to come up with an answer, he says.
A prime example of such a situation is the 2009 case of Nicholas Volker, a child with a gastrointestinal disorder for which physicians had not been able to pinpoint a cause. After multiple surgeries, they finally turned to whole-exome sequencing. The analysis revealed a variant in the XIAP gene that turned out to cause an immune condition. The child was then cured with a bone marrow transplant.
But such dramatic results are far from guaranteed. Genomic diagnosticians strike gold in only about a quarter of cases, Bick says. “If you can sequence all the genes, why can’t you find all the answers?” he asks. Although a lab might analyze 20,000 genes, he says, disease associations might be known for only 3,000 or 4,000 of them, and “there are many thousands of genes where we don’t know what a change in the gene will do to a person.”
Labs also face challenges in data analysis and interpretation. When applied to the same genome, some algorithms detect particular variants that others miss. The labs involved are “all using best practices, but there is no one standard protocol,” Aziz says.
The underlying problem is that gene sequence data are just hard to interpret. The ability to assemble the genome depends on having sufficient depth of coverage, which is the number of times a particular location in the DNA sequence is read. Exome and genome sequencing require greater coverage depth than do gene panels, and that extra depth is expensive and time-consuming.
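That definition of coverage depth can be made concrete with a minimal sketch. The read alignments below are invented; a clinical pipeline would extract the same information from an aligner’s output rather than from hand-written tuples.

```python
# Depth of coverage: the number of reads overlapping each reference
# position. Reads are given as (start position, length) alignments.
def coverage_depth(ref_length, reads):
    depth = [0] * ref_length
    for start, length in reads:
        for pos in range(start, min(start + length, ref_length)):
            depth[pos] += 1
    return depth

# Four 5-base reads over an 11-base reference:
reads = [(0, 5), (2, 5), (4, 5), (6, 5)]
print(coverage_depth(11, reads))  # → [1, 1, 2, 2, 3, 2, 3, 2, 2, 1, 1]
```

The ends of the reference are covered only once in this toy example, which is exactly the kind of low-depth region where a lab cannot confidently call the base—hence the expense of sequencing deeply enough across an entire exome or genome.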
The overall DNA sequence is determined by assembling raw sequence data from relatively short segments into coherent sequences, and from that overall sequence, variants are “called,” or identified, relative to a reference genome. For some parts of the genome, those steps remain challenging. And gaps still exist in the National Center for Biotechnology Information (NCBI) reference genome that most labs use. In regions with a single base repeated many times, counting the number of bases correctly can be hard. And regions rich in the bases guanine and cytosine remain difficult to sequence.
Then, after the sequence is assembled, the variants have to be identified. Variants fall into four major categories: single-nucleotide variants, where one nucleotide is substituted for another; insertions and deletions, where one or more nucleotides are added to or subtracted from the sequence; copy number variants, in which a relatively large region of the genome is present more or fewer times than normal; and translocations, in which sections move from one chromosome to another. Examples of each of these variant types are associated with diseases, so being able to find them is important.
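The first two categories can be illustrated with a small sketch that classifies a called variant from its reference and alternate alleles, the convention used in VCF variant files; the example calls are invented. Copy-number variants and translocations are detected differently—from read depth and paired-end signals—and are beyond this toy.

```python
# Classify a called variant by comparing the reference allele with the
# alternate allele observed in the sample (VCF-style REF/ALT strings).
def classify(ref_allele, alt_allele):
    if len(ref_allele) == len(alt_allele) == 1:
        return "single-nucleotide variant"
    if len(alt_allele) > len(ref_allele):
        return "insertion"
    if len(alt_allele) < len(ref_allele):
        return "deletion"
    return "other (e.g., multi-nucleotide substitution)"

print(classify("A", "G"))    # single-nucleotide variant
print(classify("A", "AT"))   # insertion
print(classify("ATG", "A"))  # deletion
```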
“Right now, the ability to accurately call variants varies with the [bioinformatics] tool,” Rehm says. “There’s an incredible lack of standardization.”
Rehm is serving on a CDC committee to define a variant identification system for clinical use. She also has a grant to create a centralized environment for interpreting genomic variants. For that grant, she’s working closely with bioinformaticians at NCBI, which is developing ClinVar, a database of medical phenotypes and genetic variations.
“The idea is for all clinical laboratories to share their variant interpretations in a public environment,” Rehm says. “Effectively, you crowd-source your interpretation. We will be able to uncover discrepancies in labs where they’re interpreting things differently. As we develop and improve resources for sharing and interpreting data, that will gradually lessen the challenges we’re dealing with on the interpretative side.”
In addition to methodological issues like inconsistent interpretation of variants, the spread of clinical next-gen sequencing also raises regulatory issues. The labs that run the tests are subject to regulation by the Centers for Medicare & Medicaid Services under a set of standards called the Clinical Laboratory Improvement Amendments (CLIA). Labs performing medical tests have to be CLIA-certified. In addition, they can be accredited by CAP, which is generally considered a higher quality standard.
But the tests themselves are currently not regulated. They instead fall under the category of lab-developed tests—procedures developed in-house rather than performed using kits. Lab-developed gene-sequencing tests have so far not been regulated by the Food & Drug Administration. FDA has a draft guidance for lab-developed tests under administrative review, but it is unclear whether and when that document could affect genetic tests. It has not even been released for public comment yet.
Nevertheless, under CLIA and CAP rules, such tests must be validated to show that they meet defined performance characteristics. For example, Rehm’s lab did an extensive platform validation when it launched its first test. Rehm and coworkers ran 400 confirmed mutations to show that they could detect substitutions, insertions, and deletions. They also validated a method for detecting copy-number variants. Then, when they ran their first two tests, they analyzed 80 samples to validate the platform. Each new test needs to be validated, but the extensive up-front platform validation the lab had carried out made subsequent validations easier.
The increased popularity of clinical next-gen sequencing is leading to efforts to standardize how such tests are run, especially as they move out of specialized genetic testing labs into more general clinical labs. “Everybody is hungry for standards, as they allow wider adoption of the technology,” CAP’s Aziz says.
CDC’s Division of Laboratory Science & Standards, which promotes the quality of laboratory practice, has convened two working groups since 2011 to address different aspects of clinical next-gen sequencing, according to CDC’s Lubin. The first working group, which met in 2011, developed a quality framework that could be implemented in all labs doing next-gen sequencing. The framework recommended steps for labs to take when establishing a new test and for defining CLIA performance characteristics, such as accuracy.
“There are quality metrics that you can apply to every step of the process,” Lubin says. The work group focused on the overall process rather than on detection of individual variants because “it’s not practical to quality control every possible outcome,” Lubin says. “The work group made recommendations for what can be practically done to give a level of confidence that one is able to detect a variant if it’s present.” The first working group published its recommendations in November 2012 (Nat. Biotechnol., DOI: 10.1038/nbt.2403).
The second working group met in October 2012 to develop guidance for bioinformatics tools for next-gen sequencing. The recommendations from the second working group have not yet been published.
CAP is also continuing to develop standards for next-gen sequencing. The organization published the first next-gen sequencing checklist in July 2012; later this month it will publish laboratory standards for noninvasive maternal-screening tests, which also use next-gen sequencing.
In addition, CAP is putting together a methods-based proficiency test for next-gen sequencing. When taking the test, labs will be asked to identify specific variants in a genomic sample. “We are trying to see if labs running next-gen sequencing for clinical testing can get the variant calls right,” Aziz says.
For the proficiency test, labs will receive DNA from a healthy individual. Each lab will process the sample as it would any patient sample. If it uses next-gen sequencing for a gene panel, it would run that gene panel. If it does exome or genome tests, it would follow those procedures instead.
The first test will be educational and not too rigorous, Aziz says. “We want to encourage people, and not make them shy away from taking the test,” she says. “We’ll give them a summary evaluation of how they did compared to peers and not grade them. Over the years, we’ll make it more stringent in areas that we feel need to be assessed more rigorously.”
The test is not yet ready. “We’re launching a pilot late this year,” Aziz says. “Next year, hopefully by the first quarter, we’ll be launching the full product.”
Increased use of DNA testing, especially of exomes and genomes, brings ethical considerations with it as well. One concern is what to do when a test ordered for one diagnostic purpose finds genetic variants relevant to other diseases, so-called secondary or incidental findings.
The American College of Medical Genetics & Genomics (ACMG) has addressed the question head-on. In March, ACMG released a guideline on the reporting of incidental findings, including a list of 56 genes that ACMG says labs should report to the ordering physician regardless of the patient’s wishes. It is then up to the physician to decide whether or not to inform the patient. Some gene variants become less important if the patient is already facing something more severe.
Reactions to the recommendations have been mixed, according to Bruce R. Korf, president of the ACMG Foundation for Genetic & Genomic Medicine. “There was pretty wide agreement that somebody had to put a stake down and create a starting point,” he says. But there have been strong responses to the perceived lack of patient autonomy. “Some people feel that patients should have the right to choose which results they do and do not want to see,” he says. For example, Madhuri Hegde, executive director of the genetics lab at Emory University, thinks patients should be able to opt out of receiving incidental findings.
“As this debate is brought out into the public and aired, the public is going to decide the answer to this question,” Medical College of Wisconsin’s Bick says.
Elaine Lyon, medical director of genetics at ARUP Laboratories, a national reference lab owned by the University of Utah, is glad that ACMG made its recommendations. “It’s nice to have an organization provide a list of what it considers clinically actionable,” she says. “We tried to define it ourselves, but we didn’t feel comfortable.”
Rehm was on the committee that wrote the recommendations, but as the head of a lab charged with implementing those recommendations, she is planning to put them into effect slowly. For a while, her lab will allow patients to opt out of receiving incidental findings, but she plans to collect information on why they choose to do so. Only then, she says, will she be comfortable with the idea of not allowing opt-outs.
Most important, for next-gen sequencing—especially of exomes and genomes—to succeed over the long term, the public has to decide how it wants the technology used. “Whole-genome and whole-exome [sequencing] will fail as tests if the public doesn’t believe that they’re being used in the way they want,” Bick says. ◾