Issue Date: April 24, 2017
Big data, big deadlines spur change in toxicity testing
Attendees at the 2003 Chemspec exhibition in Manchester, England, may not remember the product introductions. They may not recall that the sector was in a slump. But many still speak of the attack on Huntingdon Life Sciences, in which company representatives were hit with paint bombs thrown by animal welfare activists.
Abusive treatment of animals at Huntingdon’s Princeton, N.J., laboratory had been exposed by an undercover agent of People for the Ethical Treatment of Animals (PETA) in 1998, making the firm a target for protesters calling for an end to the use of animals in chemical toxicology testing.
The confrontation in Manchester, and others like it, created a perception that industry and animal advocates are locked in irreconcilable conflict, pitting the requirement for in vivo chemical testing against an absolutist demand for its elimination. The idea of a major company such as Dow Chemical sitting down and coming to any productive terms with a group such as PETA seemed remote.
But in fact, Dow is one of many chemical companies working closely with PETA as industry, advocacy groups, and regulators embark on a new era of chemical testing, one in which data-enhanced alternative methods are emerging to meet rapidly growing demand for toxicological testing.
“We work a lot with Dow,” says Amy Clippinger, associate director of the PETA International Science Consortium. PETA still has a unit that works on exposing abuses, but her unit has grown in importance since it launched five years ago. Indeed, she says, PETA has been engaged with industry on regulatory testing since 1997.
“We can point to valid and robust methods that can be used instead of animals.”
—Amy Clippinger, associate director, PETA International Science Consortium
The development of alternative testing methods has accelerated in recent years with advances in science, Clippinger explains. In vitro methods have steadily gained acceptance in corporate labs as well as with regulators, and emerging techniques for managing large amounts of data provide an opportunity for moving away from animal testing.
Kristie Sullivan, vice president of research policy at the Physicians Committee for Responsible Medicine, which advocates for alternatives to animal testing in chemical and drug research, adds that with reauthorization of the Toxic Substances Control Act (TSCA), the U.S. Environmental Protection Agency has set risk-based safety standards with deadlines for the evaluation of chemicals. TSCA, Sullivan notes, now specifies a requirement to reduce or replace animal testing.
Meanwhile, the European Union’s Registration, Evaluation, Authorisation & Restriction of Chemicals (REACH) initiative has provided a powerful incentive to develop new ways of testing chemicals. It has set a 2018 deadline for registering chemicals made or imported into the EU in volumes between 1 and 100 metric tons—a directive expected to fall heavily on small and medium-sized chemical manufacturers.
As Clippinger and Sullivan see it, the table is set for significant advances in alternative testing methodology. “It’s just a matter of really putting the pieces together, getting the relevant stakeholders organized, and making the policy changes,” Clippinger says.
She and others agree that an important piece, data analysis, is now moving firmly into position thanks to the rise of a technique called read-across. Scientists use the method to assess the toxicity profile of new molecules by comparing their structures with those of molecules for which toxicity has been determined. Read-across is viewed as a potential step forward from established quantitative structure-activity relationship (QSAR) comparisons in that it can be developed to compare data such as in vitro bioactivity profiles in addition to molecular structures.
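The comparison at the heart of read-across can be illustrated with a toy sketch. The code below is purely hypothetical—the fingerprints, compound names, and labels are invented for illustration and do not reflect any specific commercial tool—but it shows the basic idea: score a query compound's similarity to reference compounds with known toxicity and infer a label from its nearest neighbors.

```python
# Toy read-across sketch (illustrative only, not any specific product).
# A query compound's toxicity label is inferred from the labels of the
# most structurally similar reference compounds. The fingerprints and
# labels here are hypothetical; real workflows derive fingerprints from
# chemical structures and, increasingly, from in vitro bioactivity profiles.
from collections import Counter

def tanimoto(a: set, b: set) -> float:
    """Tanimoto similarity between two binary fingerprints (sets of 'on' bits)."""
    union = len(a | b)
    return len(a & b) / union if union else 0.0

# Hypothetical reference set: name -> (fingerprint bits, known label)
REFERENCE = {
    "cmpd_A": ({1, 4, 7, 9},  "sensitizer"),
    "cmpd_B": ({2, 3, 8},     "non-sensitizer"),
    "cmpd_C": ({1, 4, 9, 11}, "sensitizer"),
}

def read_across(query_bits: set, refs=REFERENCE, k: int = 2) -> str:
    """Rank references by similarity to the query; majority-vote the top k labels."""
    ranked = sorted(refs.values(),
                    key=lambda fp_label: tanimoto(query_bits, fp_label[0]),
                    reverse=True)
    labels = [label for _, label in ranked[:k]]
    return Counter(labels).most_common(1)[0][0]

print(read_across({1, 4, 9}))  # prints "sensitizer": both nearest neighbors are sensitizers
```

In practice the similarity metric and the evidence being compared matter enormously—which is why, as the article notes, read-across has historically required expert judgment rather than a simple nearest-neighbor vote.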
The technique received a boost last month when the testing and validating firm UL introduced a read-across software product targeted at companies filing under REACH. Called REACHAcross, the software has garnered interest from industry, regulators, and government agencies such as the U.S. National Institutes of Health.
“We are starting to get to a position where we can actually understand the mechanistic basis behind certain types of toxicities,” says Nicole Kleinstreuer, deputy director of the National Toxicology Program Interagency Center for the Evaluation of Alternative Toxicological Methods at NIH.
Armed with data analysis tools, researchers are beginning to map molecular interactions key to cellular and tissue events in humans, she says. Progress has occurred quickly in the area of skin sensitization, accelerated by a 2013 EU marketing ban on cosmetics materials tested on animals.
More work is needed in other areas, such as testing for developmental toxicity and carcinogenicity, according to Kleinstreuer. “It’s all about the complexity of the end points,” she says. “We are hoping that in the next five to 10 years, nonanimal testing strategies will replace existing methods for acute toxicological end points beyond skin and eye.”
In the U.S., she emphasizes, regulators have been spurring the bulk of the activity, much of it through the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM). Composed of multiple government agencies, ICCVAM is working on a national road map for developing test methods and has a working group focused on read-across data management.
“Big data is absolutely necessary to the progress we are seeing,” Kleinstreuer says.
UL’s REACHAcross software has roots in an effort to work with a big, new database—the data trove amassed at the European Chemicals Agency, which administers REACH. The software effort was led by Thomas Hartung, a professor at Johns Hopkins University’s Center for Alternatives to Animal Testing who previously worked for the European Commission and helped develop the REACH legislation and organize test guidance.
According to Hartung, companies associated with the Johns Hopkins center, including Dow, BASF, ExxonMobil, and many drug and cosmetics firms, have been using read-across for years. He estimates that 75% of REACH filings include read-across data derived by expert statisticians.
“The trouble is, there are very few experts who know how to do this,” he says. “And they all work at the big companies.” His group set out to build information technology support for nonexperts.
Hartung credits Thomas Luechtefeld, a Ph.D. student at Johns Hopkins, with spearheading the software development. “We built a web crawler for getting data out of REACH,” Luechtefeld says. “The really interesting thing about REACH is that it’s the largest repository for in vivo toxicological data ever.”
For example, he says, REACH has skin sensitization data on 5,000 chemicals. Comparable public data before REACH covered about 250 chemicals.
Luechtefeld and a partner launched a spin-off, ToxTrack, to develop the software, signing a product development contract with UL two years ago. Craig Rowlands, senior toxicologist at UL, says his company saw opportunity in REACH’s looming 2018 deadline for registering toxicity data on 20,000 to 40,000 chemicals. Moreover, UL sees wider application ahead for the software. It is working on a phase two of REACHAcross with broader capabilities.
UL has consulted with Dow on its next stage of development, according to Sue Marty, a toxicology science leader who heads Dow’s predictive toxicology program. Dow has been test-running read-across and QSAR techniques on its own data, Marty says, because it found that available software produced results of inconsistent quality.
The firm took an interest in REACHAcross when it met with UL in December. “It sounded like they were planning to help companies that need a lot of help with 2018,” she says. “Their phase two will be for more sophisticated users like us.”
Marty points to Dow’s 2025 sustainability goals, which anticipate that the company will reduce animal testing by 30%. She says the firm has long been a proponent of intelligent experimental design and regularly meets with PETA to discuss testing alternatives.
“They understand our need for safety information and the regulatory requirements under which we work,” Marty says. “I think that we have found some common ground in the desire to find better ways to generate safety information and more sustainable materials.”
Clippinger says PETA is hopeful that the push to eliminate animal testing will gain ground. “Back in 1997, it was more of an ethics-based conversation taking place,” she says. “Today, we are hiring more and more scientists and having more and more success because we can point to valid and robust methods that can be used instead of animals.”
In Luechtefeld’s view, new testing methods make the shift possible. “The only way to convince people to change is by creating something better,” he says. “That’s the approach I like better than being a zealot and telling people they are awful for doing animal testing. Let’s push the technology to where we don’t need animal testing.”
- Chemical & Engineering News
- ISSN 0009-2347
- Copyright © American Chemical Society