Pharmaceutical R&D, generally viewed as a grand stage for advanced technologies, is living up to its reputation these days. It’s got new machines for genomic analysis, technology for precisely editing genes, and all the software necessary to manage the data these things produce.
Cloud computing and modeling engines enhanced by machine-learning algorithms have already caught on in pharmaceutical labs, to the point that researchers and their information technology suppliers say the sector is closing in on a vision of the lab of the future—one based on digital connectivity and focused on bringing breakthrough drugs into the world.
Manufacturing drugs and getting them to patients, however, involves a completely different landscape of technology, somewhat less flashy but arguably more important given the dire need to avoid something that happens every day in research: failure.
“When people go to a pharmacy, they expect the drug to be at the pharmacy,” says John Baldoni, senior vice president of platform technology and science at GlaxoSmithKline. “If people go and find that the drug is not there because of a problem in manufacturing—that’s the worst thing that can happen.”
Baldoni is spearheading what he views as a revolutionary change at GSK’s manufacturing sites: switching from batch manufacturing to continuous flow processing wherever possible. The goal is to gain advantage over the competition by cutting cost and time in manufacturing, increasing quality, and reducing plant size.
His peers at other firms are also initiating fundamental change. In many cases, they are targeting simplification and unification of informatics systems. In all cases, they are designing engines to gather, store, and analyze huge amounts of data and mobilize those data in process design and control efforts.
Production managers and their system suppliers agree that a vision of the plant of the future is taking shape downstream from the new world of digitally integrated research. But the advance of new technology has been slower, given that manufacturing plants are more highly regulated than labs and that systems validated to the U.S. Food & Drug Administration’s current Good Manufacturing Practice (cGMP) quality standard are not well positioned for sudden change.
The dire consequence of failure that Baldoni points to has also bred a culture of risk aversion whereby technologies such as cloud computing, which took some getting used to on the part of researchers, are still considered off-limits for manufacturing.
But that culture is changing as drug companies begin to explore closer links between lab and manufacturing informatics. Pharmaceutical plant managers are also learning that manufacturing data, once deployed primarily to ensure that factories operate within validated performance parameters, can be used to support continuous improvement of processes. The result is a shift in the focus of plant informatics from compliance to competitiveness.
Breaking down barriers
On many levels, one can discern a palpable frustration: a tension between seeing the advantages offered by new technology and a natural reluctance to make changes to tightly controlled, FDA-validated plants. Cloud computing—the use of external, third-party-run servers for data storage and calculations—may be the best example.
“The pharma industry doesn’t like the cloud so much,” says Torsten Winkler, head of the life sciences business for Europe at Honeywell Process Solutions, a plant automation systems vendor. “They like cloud technology, but not the cloud.”
Software deployment lags in drug manufacturing. Based on 120 respondents to a survey of manufacturers of active and inactive pharmaceutical ingredients:
51% use no automation software to manage production processes
16% use automation software to fully control processes
15% analyze production processes in real time
9% share data fully across operations
91% have siloed data in departments or at plants
25% use software to collaborate with contract manufacturers and other contractors
Source: Biovia
The internet itself, however, is growing in importance in the plant, Winkler says, especially in regard to managing data from the digital devices that are proliferating there. They include discrete processors on individual valves and pumps, which taken as a whole have become known as the “industrial internet of things.”
Much of the concern about cloud computing is unfounded, Winkler maintains. FDA, he says, does not require physical servers at plant sites and allows data to be managed in the cloud.
Still, data security remains a concern. Winkler suggests making a distinction between data used for process automation and the more sensitive data used for process analysis. Automation data can be segregated from analytics data and deployed on a cloud-based system. Companies can also consider an internal cloud behind the corporate firewall and supported by a company’s own servers rather than those of a host such as Google or Amazon.
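In practice, that segregation amounts to a routing rule at the data layer. The following minimal sketch, with invented tag names and endpoints, illustrates the idea of keeping control data on premises while letting cleared analysis data go to a cloud host:

```python
# A minimal sketch of Winkler's segregation idea: control (automation) data
# stays on premises; only data cleared for analysis goes to a cloud host.
# All tag names and endpoints below are hypothetical.

ON_PREM_HISTORIAN = "https://historian.plant.internal/api/points"  # behind the firewall
ANALYTICS_CLOUD = "https://analytics.example-cloud.com/ingest"     # external host

AUTOMATION_TAGS = {"valve_07.position", "pump_03.speed"}   # process automation
ANALYTICS_TAGS = {"reactor_01.temp", "reactor_01.yield"}   # process analysis

def route(tag: str) -> str:
    """Return the destination a reading from this tag should be sent to."""
    if tag in AUTOMATION_TAGS:
        return ON_PREM_HISTORIAN   # never leaves the plant network
    if tag in ANALYTICS_TAGS:
        return ANALYTICS_CLOUD     # acceptable for cloud-based analytics
    raise ValueError(f"unclassified tag: {tag}")  # unclassified data stays put

print(route("reactor_01.temp"))  # -> the cloud ingest endpoint
```

A real plant would back such a rule with network architecture, such as firewalls and segregated zones, rather than software alone.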
Whether or not data management systems run in a cloud, the primary goal in drug manufacturing, Winkler says, is digital connectivity across the supply chain, establishing integrated informatics within the plant, between plants, between manufacturing and research, and between the plant and financial business systems such as enterprise resource planning software.
Honeywell will introduce an upgraded version of its Experion Process Knowledge System distributed control system later this year, according to Winkler, adding a feature called Unit Operation Controller that provides data integration support. The upgraded system also allows users to add or remove capacity from a batch process without reprogramming a central controller.
Paul Denny-Gouldson, a vice president at informatics system vendor IDBS, also sees a growing interest in cloud computing in the plant and other cracks in the cultural barriers to adopting new technology. “We are seeing much more openness to trying new things,” he says. “With the current number of early adopters, attitudes toward cloud-based content and capabilities, and its implications in manufacturing, are evolving very quickly.”
As such, the informatics gap between the plant and the R&D lab is closing. Big data tools, advanced automation, and artificial intelligence are providing IT support for the design of commercial-scale chemical manufacturing processes at the earliest stage of research, eliminating the need for technology transfer as drug candidates advance toward the market, Denny-Gouldson says. Data, which are now compartmentalized in R&D, pilot manufacturing, and commercial manufacturing, will flow through a single IT system.
Setting this up, however, is a huge undertaking. “Imagine the billions of data from millions of variables,” Denny-Gouldson says. Manufacturers need to work out data relationships and try to model the data so that when they move from process development to manufacturing, the model is running—automatically monitoring the system at every point and making changes.
“The utopian place to be is where the model will run the process to the point where there is no manual intervention. But we are nowhere near that yet,” he acknowledges.
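In its simplest form, the loop Denny-Gouldson envisions, a model checking live measurements and correcting the process, might look like the sketch below. The model, the sensor, and the tolerance are all hypothetical stand-ins:

```python
import random  # stands in for real plant measurements in this sketch

def model_predict(setpoint: float) -> float:
    """Hypothetical process model: the value expected at a given setpoint."""
    return 0.98 * setpoint

def read_sensor(setpoint: float) -> float:
    """Stand-in for a live measurement; a real system reads a historian or DCS."""
    return 0.98 * setpoint + random.gauss(0.0, 0.5)

def control_loop(setpoint: float, tolerance: float = 1.0, steps: int = 5) -> float:
    """Compare each measurement with the model and correct the setpoint."""
    for _ in range(steps):
        expected = model_predict(setpoint)
        measured = read_sensor(setpoint)
        error = measured - expected
        if abs(error) > tolerance:   # the model, not an operator, intervenes
            setpoint -= error        # simple proportional correction
    return setpoint

print(control_loop(100.0))
```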
And system connectivity will ultimately require manufacturers to accept some level of cloud computing. “The ability to do computationally intensive and data-intensive calculations is what the cloud was born for,” Denny-Gouldson says. “These guys in manufacturing will need to jump on the bus and get there as quickly as possible.”
Competing on data
Software vendors agree that new uses for data are impacting manufacturing and informatics strategies. “One of the core questions as you gather all that data is: What is that data for?” says Martin Dittmer, a product manager at Rockwell Automation. In the past, the main concern was compliance—showing regulators that manufacturing has been executed according to predefined specifications. “Now, there is a major shift in data use to asking questions about how to become more competitive and increase productivity.”
Dittmer adds that FDA has undergone a similar shift in its perspective on data. “They realize that by focusing only on the data necessary to prove they followed a validated process, manufacturers are missing out on valuable data that can be used to improve production and bring products to market faster.”
Khris Kammer, a market development manager at Rockwell, adds that the agency has opened the door to technology that tests process variables during production, rather than after manufacturing, to ensure quality compliance. The shift is facilitating efforts to convert from batch to continuous processes.
Rockwell has been adding these capabilities with upgrades to its PharmaSuite manufacturing execution system (MES) and FactoryTalk data management software for distributed digital devices. One new product, FactoryTalk Analytics for Devices, supports the industrial internet of things concept. “It is quite literally a box that plugs into a plant network,” Kammer says. The box picks up data from control devices—digital instruments on valves and pumps, for example—for analysis. It also alerts users to problems in production.
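The pattern Kammer describes, an edge device that reads instrument signals off the plant network and raises alerts, boils down to a polling loop. The sketch below is a toy illustration of that concept, not the FactoryTalk API; the tags and alarm limits are invented:

```python
# Hypothetical alarm limits for device-level signals; a real edge box would
# be configured with, or learn, limits for each instrument it watches.
LIMITS = {
    "valve_07.stroke_time_s": (0.5, 2.0),
    "pump_03.vibration_mm_s": (0.0, 4.5),
}

def read_device(tag: str) -> float:
    """Stand-in for reading a tag off the plant network."""
    return {"valve_07.stroke_time_s": 2.3, "pump_03.vibration_mm_s": 3.1}[tag]

def poll_once() -> list:
    """Check every watched tag against its limits and collect alert messages."""
    alerts = []
    for tag, (lo, hi) in LIMITS.items():
        value = read_device(tag)
        if not lo <= value <= hi:
            alerts.append(f"ALERT: {tag} = {value:.2f}, outside [{lo}, {hi}]")
    return alerts

print(poll_once())  # flags the slow valve stroke in this made-up data
```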
Biovia, the former Accelrys, now owned by Dassault Systèmes, has a similar product, Biovia Discoverant, according to Stephen Hayward, a marketing manager. The software “pulls in different pieces of data from manufacturing processes and puts them into one place to do real-time monitoring of process variables, understand critical process parameters, and manage controls on an ongoing basis.”
The company recently published statistics on software adoption in the pharmaceutical manufacturing sector, polling managers at plants that make cGMP and non-cGMP materials. It found that half of them run factories without MES or other software products and that more than 90% of them are dealing with siloed data across operations.
But the landscape is changing quickly, according to Hayward. “Companies are looking at the plant from a holistic perspective,” he says. The goal at many drug companies is to establish a single informatics network.
System vendors agree that the technology needed to support these networks has been available for years but that industry has not sought to implement it until recently. Kammer at Rockwell says PharmaSuite was introduced back in 2011, but interest is now picking up as drug companies begin to adopt internet-supported strategies for building plant informatics networks.
“There is a hockey stick curve happening,” he says. “These are not new technologies, even in consumer markets. But mobile devices, cloud computing, data integration technologies—these few things are coming together in a perfect storm, driving rapid adoption in pharmaceutical manufacturing.”
Pfizer is working to establish an integrated manufacturing regimen based on standard work practices supported by a single informatics infrastructure. The company elected to use Rockwell’s MES as the foundation for that infrastructure and currently has the software controlling operations at 25 plants around the world.
Stephen Giles, Pfizer’s senior director of business technology, says the lack of a unified approach to manufacturing work processes is one reason that IT has not advanced as quickly in the plant as it has in R&D at Pfizer.
“A centralized team drives work habits in research labs,” he says. “There has been more standardization from the center. In production, plants have operated in isolation with a lot of customized IT.” Now, Pfizer wants all plants working on the same version of MES software as part of a unified supply chain that integrates with labs.
“The big focus at the foundation is accessing data and pulling it to analytical tools that will most likely be in the cloud,” he says.
But there are clouds and there are clouds. Giles says Pfizer is unwilling to climb into an open cloud-computing world with manufacturing data. Instead, the company is building what he calls a virtual private cloud to ensure data security. Despite Pfizer’s push to integrate systems, the approach means data servers will be installed at each manufacturing facility, providing local, internet-based data storage and management.
Johnson & Johnson’s Janssen research arm has also been working to establish standard practices in manufacturing and the informatics to support them. Marc Hooybergs, senior director of execution systems for pharmaceuticals at the company, notes that the idea of a plant of the future is nebulous because the technology to support work in the plant evolves quickly. Once a system is installed it is outdated, he notes, so the focus needs to be on work processes, not IT.
“Ultimately we want to deploy optimized business processes across manufacturing and then tailor informatics to these processes rather than the other way around,” Hooybergs says.
Janssen, like others in the sector, is on a path that began with moving from paper-based data collection and analysis to a software-based system. But the technologies available are ramping up as open communication standards proliferate. “And cloud computing is growing exponentially,” adds Adam Fermier, Janssen’s scientific director for strategic business support. Janssen continually vets options to enhance informatics connectivity between plants and between manufacturing, research, and business systems.
Although companies have a lot of technologies available and a general means of plugging them into an informatics backbone, Fermier says structuring all the resulting data into a standardized format for modeling, visualization, and reporting is an ongoing challenge.
GSK’s Baldoni agrees. “Data systems in the plant aren’t that unsophisticated,” says Baldoni, who is spearheading a project to deploy supercomputers and machine learning in drug discovery at GSK. “They are capable of capturing data in flow and at the right frequency. The challenge is not finding commercial systems that are fully capable. The challenge is visualization.”
GSK, he says, is deploying such systems and adapting them to its informatics infrastructure—an integrated network that is evolving as the company moves toward continuous process manufacturing. Much of the work is about developing data modeling, automated signal detection, and visualization tools.
Keeping up with the volume of data entering the plant is another challenge. “Moving to the future, when we look at the amount of data that are being generated in continuous flow manufacturing, we will require much more data and data storage, much more computer power,” Baldoni says. “Cloud computing will be the norm.”
The new world
To the extent that the pharmaceutical industry is playing catchup on informatics, it is also confronting problems of cross-vendor compatibility that have created headaches in other sectors for decades. “Vendors have a low motivation to allow information exchange between systems,” Janssen’s Fermier says. Still, communication standards are emerging that promise greater interoperability in multivendor informatics networks.
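OPC UA is the most prominent of these open standards in process automation. Assuming a compliant server, a client reads a tag the same way no matter which vendor built the underlying system. A minimal sketch using the community python-opcua package, with a hypothetical endpoint and node ID:

```python
from opcua import Client  # community "python-opcua" package (pip install opcua)

# Hypothetical endpoint; any OPC UA-compliant server exposes its address
# space the same way, regardless of the vendor behind it.
client = Client("opc.tcp://historian.plant.internal:4840")
client.connect()
try:
    node = client.get_node("ns=2;s=Reactor01.Temperature")  # hypothetical node ID
    print("reactor temperature:", node.get_value())
finally:
    client.disconnect()
```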
Giles at Pfizer adds that picking one company to supply an MES does not necessarily preclude linking that software with other vendors’. In fact, he says, Rockwell was selected largely on the basis of the firm’s expertise at crafting a multivendor network and its ability to consult with the drugmaker on informatics system development.
Rockwell’s Dittmer goes so far as to characterize concerns about being locked into one vendor as 1990s thinking, given the rise of open communications standards over the past 20 years. The problem of how to visualize data, he concedes, is more of a 21st-century challenge.
“You can argue that modern dashboards and reporting capabilities should help with this challenge,” he says. “However, there are under-the-surface aspects that need to be looked into first.”
Data from disparate sources need to be entered in a common format. And data require context. “For example, if a series of measurements of a process device can’t be contextualized by the related equipment, related product order, and produced batch, those measurements are potentially worthless,” Dittmer says. Finally, data require integrity—they need to be legible, contemporaneous, and accurate.
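Dittmer’s point can be made concrete with a record structure that stores a reading together with the equipment, order, and batch that give it meaning, plus the attribution and timestamp that integrity rules demand. A minimal sketch; the field names are illustrative:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class Measurement:
    """A process reading plus the context that makes it meaningful."""
    tag: str              # which instrument, e.g., "reactor_01.temp"
    value: float
    unit: str
    equipment_id: str     # context: the related equipment ...
    product_order: str    # ... the related product order ...
    batch_id: str         # ... and the produced batch
    recorded_by: str      # integrity: attributable to a person or system
    recorded_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)  # contemporaneous
    )

m = Measurement("reactor_01.temp", 78.4, "degC",
                equipment_id="RX-01", product_order="PO-4711",
                batch_id="B-0153", recorded_by="operator_12")
print(m)
```

Stripped of the context fields, the same reading would be just a number; with them, it can be joined to batch records and equipment histories.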
Getting the data entry, modeling, and visualization in place will require customization and collaboration between system users and vendors. And a significant human element will come into play. Fermier at Janssen notes that young engineers are entering manufacturing with advanced data analytics skills. Baldoni agrees but says staff skills will need to keep up with new technology.
“It’s a different world,” Baldoni says. “The type of people that are going to be taking care of those computers and the type of people who are going to be visualizing the data are trainable, but I don’t think they’re there yet.”