Having made great progress in implementing information technology in the laboratory over the past decade, pharmaceutical companies are facing a new challenge in managing and sharing data as they create networks of external partners for discovery and development. The push to find cures through collaboration and to cut costs through outsourcing has created a need for a direct digital link between drug firms and the academic labs and contract research organizations with which they are now in business.
In many cases these links will be forged outside corporate IT firewalls, in what’s called the cloud.
As was the case with implementing electronic laboratory notebooks (ELNs) and other digital research tools in years past, the drug industry lags behind others in establishing collaborative IT. This is true whether the collaborations take place over traditional data connections or via cloud computing—the use of external computer resources hosted by companies such as Google, Amazon, and Microsoft that possess immense data-processing assets.
The industry’s hesitancy to move forward this time, however, differs from its slowness to adopt efficient IT in the lab. Whereas scientists initially resisted ELNs because of a cultural aversion to sharing research data, drug companies are now concerned over the security of their primary asset: data about drugs in development.
Recent news about Chinese hackers infiltrating the databases of defense contractors and other companies in the U.S., including Google, illustrates the risk. To many in the drug industry, the idea of aggressive and talented data hackers in China suggests an existential threat.
“All these news stories highlight the threat of hackers,” says Michael Elliott, chief executive officer of Atrium Research & Consulting, a market research and consulting firm specializing in IT for life sciences research. “And it’s a pretty serious threat. The level of sophistication is very high. You figure, if they can reach Google, they can reach anyone.”
Data security, however, is more than a matter of protecting intellectual property from online pirates. Companies also need to devise protocols for secure access to data by contract research organizations and other partners, and firms need to begin building an external computing environment where information can be processed by multiple users. By holding off on external IT links for fear of possible data theft, companies risk holding back the development of research data management.
“We can speak of a devolution in informatics,” Elliott says. “Many times companies with sophisticated IT, data capture, and security strategies go back to using Excel spreadsheets and paper notebooks when exchanging information with outside partners.”
External data processing is a recent phenomenon in the pharmaceutical industry that challenges drug companies’ sense of control over key assets, says Rudy Potenzone, vice president of product strategy at PerkinElmer Informatics. “There have always been research collaborations, but they had been very internally focused. By and large, you did not have your core research activities exposed,” he says. Today, however, companies looking for new cures have little choice but to share intellectual property.
PerkinElmer, a pioneer in ELN software, faces a challenge similar to the one its users face, Potenzone says. “ELN has been one of our key areas of product development for 12 years. We have worked in partnership with a lot of our customers to build a sturdy internal environment,” he says. “Now how do we open this up and allow projects to be done outside the firewall? It’s a big question. How do we get data from external sources in?”
The answers are emerging, again in partnership with customers, Potenzone says. PerkinElmer is working on designing cloud computing applications for its ELNs and other Web-based means of sharing data securely.
Thermo Fisher Scientific, which markets a laboratory information management system (LIMS), has experienced an adoption lag with its life sciences customers in the cloud-computing realm. “We actually launched a cloud-based on-demand LIMS that supports multitenancy three years ago,” says Trish Meek, director of product strategy at Thermo Fisher. “No one has implemented multitenancy.”
While it waits for customers to become comfortable with the cloud, Thermo Fisher is customizing its Watson LIMS to support online collaborative research with in-house computers. For example, Meek notes that the company offers software and services that promote integrated IT for research partners. The product, called Connects, establishes a data channel between IT systems from multiple vendors.
Accelrys, another laboratory IT supplier, recently introduced its Externalized Collaboration Suite, which combines ELN and other laboratory IT with recently acquired technology, notably the Hit Explorer Operating System (HEOS), a collaborative IT platform it acquired from Scynexis last year. HEOS is a software-as-a-service (SaaS) product that works in the cloud.
Software designed to enhance security has been in the works in other industries for at least a decade, and drug companies are taking notice. In 2000, for example, Exostar began offering SaaS products for collaborative IT integration in the aerospace and defense industries. Today, a third of the company’s work is in life sciences, a segment that has grown rapidly over the past two years, says Vijay Takanti, Exostar’s vice president of security and collaboration. The firm counts several major drug companies, including Merck & Co., among its customers.
According to Takanti, establishing external IT links is largely a matter of setting standards and policies, rules that allow users access to information and ways to check the identity of other users on the system. Although this is a complicated management task, companies need to avoid putting onerous security safeguards in place. “Scientists want to get things done,” Takanti says.
Drug companies have themselves also spawned some early efforts at supporting collaboration with IT. Collaborative Drug Discovery (CDD), a supplier of chemistry and biology database software that supports research partnerships, was spun off from Eli Lilly & Co. nine years ago as part of the same program that produced InnoCentive, an online “open innovation” resource that counts as an early instance of crowdsourcing in pharmaceutical research.
The National Institutes of Health uses the CDD product CDD Vault to support collaboration among companies working on the NIH Blueprint for Neuroscience Research, a consortium of industry and academic research labs investigating how individual brain cells and neural circuits interact.
The drug industry is coming together around another initiative, this one launched in 2009 by Johnson & Johnson. TranSMART is an open-access research platform hosted on Amazon’s cloud-computing service (C&EN, May 9, 2011, page 22). Shortly after it was launched, it was adopted by the Innovative Medicines Initiative, a European program that promotes collaborative research among major drug companies. Sanofi and Pfizer are among the drugmakers now using TranSMART.
Michael Braxenthaler, head of strategic alliances at Roche and CEO of TranSMART, which incorporated as a nonprofit company earlier this year, compares the platform to Linux, a widely licensed computer operating system that popularized open-access software. TranSMART hopes to build a community of life sciences users around its data and knowledge management platform for translational research, but it is only getting started. “We are where Linux was 10 or 12 years ago,” Braxenthaler acknowledges.
He says the industry has been working hard to eliminate technological snags that have made it difficult for partners to communicate via the Internet using ELNs. But the emergence of cloud computing has triggered a great deal of reluctance to externalize IT. “Things are moving fast,” he says. “However, information on our compounds is our lifeblood. Nobody wants this accessed in an uncontrolled fashion.”
Phyllis Post, executive director of IT account management at Merck, sees externalizing laboratory IT as a broad corporate undertaking rather than strictly a research venture. She acknowledges that external IT strategies have legal as well as scientific ramifications that complicate making big changes.
“I don’t believe that the cultural revolution in sharing information in the science sector has been overstated,” she says, “but rather that, given the conservative nature of the sector with regard to information sharing, it is taking a little longer than might have been anticipated to shift the behaviors and related capabilities.”
Still, software vendors and other industry watchers are waiting for the pharmaceutical industry to fully embrace the cloud. According to Etzard Stolte, chief technology officer for health and life sciences at Hewlett-Packard, the drug industry lags behind considerably because of a residual aversion to sharing data that outweighs its growing appetite for accessing data.
This disconnect, he says, was illustrated by Hewlett-Packard’s recent attempt to host an openly searchable database of molecules that failed in clinical trials. Drug companies expressed great interest in the resource and none whatsoever in contributing to it. “There is a very big difference between an interest in looking at data and the willingness to share it,” Stolte says.
Atrium Research’s Elliott agrees that unwillingness to share data is still a problem. Moreover, he sees a fundamental misunderstanding of IT security in the drug industry.
“I would argue that cloud applications have better protection than most companies have behind their firewalls,” he says. Security, in the long run, has more to do with how data are structured than where they are stored or how they are accessed. Data are just data without context, and molecular structures don’t mean much without associated information, he says.
“People have to look at the security issue differently,” Elliott says. “Saying you can’t use the cloud is the wrong way of looking at it. Every system is vulnerable.”