As emphasis grows on managing data in pharmaceutical labs and plants—collecting and storing it, sharing it, mining it for pertinent information—a basic question often goes unanswered: Is the data any good?
The quality and completeness of data associated with making drugs are, however, huge concerns at agencies such as the U.S. Food & Drug Administration. And drug companies are being forced to pay attention to data in response to a sharp increase in the number of FDA warning letters citing inadequate data integrity.
Quality managers who work on improving the thoroughness and accuracy of data agree that the recent spike in warnings does not reflect a sudden problem with data integrity. It is, they say, the result of industry and regulators coming late to the realization that computer-managed data is in bad shape.
Regulators, in particular, have struggled with the conversion from paper-based to electronic filing, given the difficulty of discovering gaps in experimental data stored electronically. Lab workers, meanwhile, face the question of which data need to be stored in a laboratory information management system (LIMS) and how best to configure all the digital laboratory equipment that feeds data into these systems.
All involved agree that the biggest challenge stems from the human element: the need to adequately train scientists on electronic systems and to prevent them from deleting preliminary results that might interest investigators looking for evidence of testing into compliance, or from flat-out falsifying stored data.
“It’s a huge story right now,” says Barbara Unger, a former quality and regulatory affairs manager at Eli Lilly & Co. and Amgen who started a data integrity consulting firm in 2014.
Data system remediation projects are under way across the industry—at drug firms and at the contract services firms that manufacture active ingredients and finished drugs for them. Efforts are both time-consuming and costly, Unger says. “And regulatory agencies are aware that it doesn’t happen overnight.”
FDA, regulators in Europe, and even the World Health Organization stepped up in 2016 with separate guidance documents clarifying requirements under long-standing codes, such as FDA’s 21 CFR Part 11 and Part 211, that established the ground rules for electronic data decades ago.
These documents—specifying the need for complete, consistent, and accurate data that indicate safety, identity, strength, quality, and purity of a drug or drug candidate—have also spurred efforts to improve data quality systems and procedures. The scope of the quality of data that regulators require is reflected in an acronym coined by FDA: ALCOA, for attributable, legible, contemporaneously recorded, original or a true copy, and accurate.
Many U.S. drug and drug chemical makers have been lax in initiating data quality programs largely because of a perception that data integrity violations happen overseas, specifically in India. This view was spawned after Ranbaxy Laboratories and a plethora of smaller Indian firms were caught either falsifying or sloppily transcribing lab documentation between 2004 and 2008.
India did lead in FDA data integrity warnings in 2017. But the U.S. was second, with China close behind.
Indeed, problems in the U.S. made the news at about the same time as the Ranbaxy incident when whistle-blowers alerted FDA about fraudulent data at two drug companies, Leiner Health Products and Able Laboratories, both of which had previously been inspected and received clean bills of health.
Alerted to what to look for, inspectors came back and found sufficient data problems to issue both companies warning letters. Leiner and Able have since gone out of business. And FDA went into training.
“I came at this with the prejudice that it all came out of India,” Unger, the consultant, recalls. “But it started in the U.S. and goes back to the 1990s.”
The dearth of citations in the U.S. before 2015 and the sharp rise since then resulted from a push by FDA. Spurred by incidents such as those at Ranbaxy, Leiner, and Able, the agency spent several years training its auditors on how to identify problems with computers and work processes that could result in gaps in stored data.
“FDA has taken the lead on this,” Unger says. “They started looking far more closely at electronic data capture, approval, review, and archiving mechanisms and at what controls are in place so that analysts can’t simply delete data they don’t like.”
“Based on the increase in warning letters in regard to data integrity, FDA is certainly picking up their inspections, knowledge, and readiness as they go around the world,” adds Andrew McNicoll, vice president of quality systems and compliance at Patheon, a provider of manufacturing and other services to the drug industry. Regulators in Canada, the U.K., and elsewhere also have increased their focus on data integrity, he says.
Researchers have obfuscated data on paper since the dawn of research. But redacting data on paper generally leaves a mark. Electronic data that isn’t entered into permanent memory, on the other hand, disappears entirely.
Nor are most labs using technology to its fullest potential to manage data integrity, says Monica Cahilly, president of Green Mountain Quality Assurance, a consultancy that has been hired by FDA and other regulators to train inspectors. “This is actually the biggest cause of the uptick in data integrity risk.”
Cahilly adds that researchers coming into pharmaceutical labs from graduate school may not realize that eliminating data that doesn’t give them the “right answer” compromises an electronic filing. They may also be writing data in paper notebooks and not entering it into an electronic database. The proliferation of stand-alone digital tools, Cahilly says, makes it hard to supervise lab activity and identify researchers in need of training.
Regulators, meanwhile, are on the lookout for data from repeated experiments as an indication of testing into compliance. Lab workers must document why they have made changes that result from necessary repetition. “Truth is an iterative thing,” Cahilly says.
Providing compliance-ready data systems has become a competitive front for LIMS vendors and makers of lab instruments such as chromatographs. Most have adapted their systems to support compliance with FDA requirements, Unger says, but success is a matter of effectively implementing these systems in a network with other lab tools.
Heather Longden, marketing manager for informatics and regulatory compliance at Waters Corp., a leading maker of chromatography systems, says the human element, including not only tampering but also system deployment gaffes, poses the greatest threat to data integrity. But she acknowledges that the interconnectivity of chromatography, LIMS, benchtop digital devices, and even financial management software creates a complex data environment and a management challenge.
The integrity of electronic data is further complicated by the enduring preference for paper in laboratories. “We go through laboratories and find that nobody is looking at the electronic data. They are printing it out,” Longden says. “They stopped doing that in banking a long time ago.”
Longden says Waters’s chromatography system, called Empower, has accommodated FDA requirements for data integrity since the agency issued 21 CFR Part 11 some 20 years ago. The company, she says, spends a lot of time guiding users on implementing the technology. “We are very conscious that just having a tool in your hand is not enough,” she says. “You need to know how and when to use it.”
LIMS vendors make similar claims. “Our approach is to explain to customers and to auditors how we use our tools,” says Trish Meek, director of commercial operations for LIMS at Thermo Fisher Scientific.
Meek notes that problems with data integrity are typically not the result of a technology shortcoming. “If you look at where things fail, it is typically human based,” she says. “I think it’s really just about understanding the regulations and what needs to be done with system integration and validation with a mind toward ensuring data integrity. People need to be trained on what that is.”
Jeff Vannest, senior director of product management at LIMS vendor LabVantage Solutions, says the entry of data into temporary memory is a crux in managing electronic data.
Traditional software allows users to delete data entries before saving them, he says. Recent versions of LabVantage’s software automatically save data when a user exits a data entry field, such as a cell in an Excel spreadsheet. The latest version, introduced last year, also queries the user about the reason for changes in data entered.
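The commit-on-exit behavior Vannest describes can be sketched in a few lines. This is a minimal illustration of the general pattern, not LabVantage’s actual software; the class and field names are hypothetical. The key ideas are that leaving a field immediately writes the value to an audit trail, and that any later change to committed data requires a stated reason.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class AuditEntry:
    """One immutable line in the field's audit trail."""
    timestamp: str
    old_value: Optional[str]
    new_value: str
    reason: Optional[str]

class AutoSaveField:
    """Hypothetical sketch of a data entry field that commits its value
    the moment the user exits it, and refuses silent edits afterward."""

    def __init__(self, name: str):
        self.name = name
        self.committed: Optional[str] = None
        self.audit: List[AuditEntry] = []

    def exit_field(self, entered: str, reason: Optional[str] = None) -> None:
        # The first commit needs no justification; changing already
        # committed data without a documented reason is rejected.
        if self.committed is not None and reason is None:
            raise ValueError(
                f"{self.name}: a reason is required to change committed data")
        self.audit.append(AuditEntry(
            timestamp=datetime.now(timezone.utc).isoformat(),
            old_value=self.committed,
            new_value=entered,
            reason=reason,
        ))
        self.committed = entered
```

With this design there is no window in which a user can enter a value, see it, and delete it before it reaches permanent memory: the act of leaving the field is the save.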
McNicoll emphasizes the complexity of a problem involving documentation, work practice, and equipment implementation, all of which impact data integrity and all of which are inspected by regulators.
“We spend a lot of our time on behavior—educating the organization,” he says. Software developers need to focus on regulators’ expectations, he adds, and labs need to keep pace with software upgrades.
Patheon consults with its clients on data management, McNicoll says. “The maturity of programs across the industry varies. Some companies are just embarking; others are further along.”
Mark E. Newton, a quality assurance scientist who recently retired from Lilly, is critical of some lab equipment vendors. “Some products don’t meet the needs of a regulated laboratory,” he says. The biggest problems are at the bench where digital devices on scales, spectroscopy tools, and other equipment do not automatically record data in permanent memory.
“Most of the manipulation of data happens before you get into the big systems such as LIMS,” Newton says.
As a result, labs need to design networks that capture required data to stored memory. “We advise configuring systems in which the user cannot see the result of calculations until they save the data they enter,” Newton says. This prevents them from omitting differing results from repeated experiments or from flat-out falsifying data.
Drug company quality managers have taken on data integrity collectively through an industry group, the International Society for Pharmaceutical Engineering (ISPE), which recently published its own guidance document for managing data filed electronically with FDA and other agencies.
Lorrie Schuessler, computer systems quality assurance manager at GlaxoSmithKline in Collegeville, Pa., is cochair of ISPE’s data integrity special interest group. She notes that the current focus on data integrity also results from the geographic dispersal of drug development and manufacturing.
“Studies are no longer done in one place where all the data is generated and analyzed at one point,” she says. “It’s done all across the globe.” This globalization is enabled by computers, she says, but technology can’t be relied on to maintain data integrity.
“Technology can only handle so much. There have to be procedures,” Schuessler says. “The hope is that systems will get better and technology will support the controls we need, but there will always be human intervention.”
Data integrity experts agree that the pharmaceutical industry will be devoting increased resources to systems supporting electronic data and that regulators will keep up the pressure with inspections. “It is an expensive remediation, and it will continue for the foreseeable future,” Unger says. “There are some companies who have for the most part done nothing.”
Cahilly adds that the cost of maintaining a data system is a crucial consideration. Compliant companies can only get costs down so far. “A site that is falsifying data—or where you go into the lab and instruments are still in their wrappers yet somehow they’ve got all this ‘lab data’—will be able to undercut them on cost all the time.”