
Business

The Human Element In Lab Informatics

Technology-savvy researchers will steer the laboratory toward digital integration

by Rick Mullin
October 20, 2014 | A version of this story appeared in Volume 92, Issue 42

A bearded man in a lab coat sits at a computer terminal.
Credit: Shutterstock

Information technology is a contentious topic in almost any business setting. New developments in IT and automation are invariably linked to big changes in work processes, which tend to rub organizations and workers the wrong way.

Take the installation of enterprise resource planning (ERP) software—a mania that swept industries worldwide during the reengineering craze of the 1990s. Supporting a vision of the future promulgated by consultants, software vendors, and a few forward-looking company employees, ERP software meant a huge IT makeover.

Horror stories abounded, and most people still cringe at the word reengineering. But over the course of a decade, virtually all large corporations installed ERP software and now would have a hard time doing business without it.

A vision of the digital future in the laboratory has taken longer to emerge, partly because of the extraordinary mix of equipment that would have to be integrated in most labs and partly because of the science world’s cultural aversion to sharing data and changing procedures. The complexity of scientific information also makes rewiring the lab a daunting project.

But pressure to speed up product discovery and development, often stemming from an explosion of data, is bringing centralized IT into focus in laboratories at materials science and pharmaceutical companies. Software vendors are vying to supply the needed IT backbone for this lab of the future. And scientists—especially younger, more digitally savvy ones—are starting to get on board.

Vendors’ Visions

Leading informatics companies have been on the acquisition trail for years, hoping to extend their reach into the lab. In April, the most acquisitive of them, Accelrys, was itself acquired by France’s Dassault Systèmes, which develops software for three-dimensional modeling and product life-cycle management. Dassault folded Accelrys into a new group called Biovia with a goal no less ambitious than supplying the software needed for the lab of the future.

“I think of today, tomorrow, and beyond,” says Gene Tetreault, senior director of products and marketing at Biovia. “Today, labs have some really cool technology, and we are solving some problems. But it is not paradigm shifting. We are automating the way we are used to doing work.”

The world of tomorrow, he says, is a matter of getting “best-of-breed systems to work together in a cohesive way,” an integration of IT and automation into one digital platform. But it is beyond tomorrow when the much-vaunted “paradigm” will shift, Tetreault says.

Most corporate laboratories have one foot in today and one in tomorrow, according to Tetreault’s construct. They have yet to convert entirely from paper notebooks to electronic laboratory notebooks, or ELNs. Meanwhile, much work still needs to be done by lab software vendors and users before an integrated IT environment gels. For most users, integration is as far into the future as anyone is willing to gaze.

“That’s cool,” Tetreault says. “I’m happy that’s a futuristic view because right now it doesn’t exist, so it therefore has to exist in the future. But I don’t think of tomorrow as the future. I think of the future as years out where cars are flying and crazy things are happening.”

In laboratories, he says, work will be done in wholly new ways, including virtualized experimentation. Biovia is pulling together the IT to support the future lab from both its lab automation products and its modeling and simulation software. The trick, Tetreault admits, will be arriving at an effective means of connecting what are now two distinct varieties of laboratory IT.

But integrating pieces of the laboratory puzzle has long been a focus at Accelrys, according to Matt Hahn, the firm’s chief technology officer. Hahn notes that the company built out its product line by adding ELNs, automated inventory management, and a collaborative research platform through a series of acquisitions. In recent years, efforts to integrate lab functions have led to connections among the lab, the manufacturing plant, and the business office.

“We were on this journey to deal with end-to-end problems,” he says. “And we began to see that we were butting up with companies like SAP and Dassault.” Talks about a merger with the French firm were the inevitable next step.

“There was a common vision,” Hahn explains. “Dassault was dealing with product life-cycle management in discrete manufacturing—planes, trains, and automobiles. We were coming at it from the scientific process angle.”

At the science services firm Thermo Fisher Scientific, the informatics integration platform is called, reasonably enough, Integration Manager. The company is six years into a program dubbed Connects in which it works with customers to design integrated IT for the laboratory, according to Trish Meek, director of product strategy with the firm’s life sciences division.

“We are trying to drive beyond functional requirements to build an entire laboratory into a single solution,” she says. In most labs, this will be a matter of integrating Thermo Fisher products, including ELNs and laboratory information management systems (LIMS), with automation and informatics components in customer labs.

But unlike Dassault, Thermo Fisher is making no effort to link automation and informatics on one side with simulation and modeling on the other. “I do think they are fundamentally different endeavors,” Meek says. Given the choice between replacing scientific experiments with simulation and running traditional experiments more efficiently with a high level of data analytics, the company opts for the latter, she says.

Indeed, many laboratories are trying to catch up and aren’t necessarily ready for everything the software industry wants to sell them. There is a risk of simply automating what you already do, says Paul Denny-Gouldson, vice president of strategic development at the informatics firm IDBS. “It’s a very sensible place to start. But then you get to the next step.”

That step is generally taken in a conservative environment. “The big thing customers remind us of is that they don’t want to get to the future at the expense of usability and scalability,” Denny-Gouldson says. Researchers are also concerned about an interruption to their work processes, which are inextricably linked to computers. “The system we provide represents the environment that people come to in the morning. It’s the first thing they turn on and the last thing they turn off.”

IDBS focuses on the scientist’s electronic interface in the lab—the ELN or, as the company calls its product, the E-WorkBook. The firm also offers data integration software that it acquired with InforSense in 2009. IDBS’s focus on the IT in the user’s hand has given the firm a good read on what researchers want from the lab of the future, according to Denny-Gouldson. And that is mobility.

“Laboratory informatics is lagging in many respects compared to researchers’ outside lives,” he says. “Researchers ask, ‘When will we get our wearables in the lab?’ ”—technologies such as Google Glass, the intelligent eyeglass product.

“What scientists want to do is science,” Denny-Gouldson says. “They want to find a way to get rid of the rubbish and the noise of things.” The lab of the future, he predicts, will aggregate the routine manual calculations and record keeping into a real-time data historian, “and this will be a shocking change.”
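A data historian of the kind Denny-Gouldson describes is, at bottom, an append-only, timestamped store that instruments write to as readings occur. A minimal sketch of the idea in Python, with an invented schema and invented instrument names, could look like this:

```python
# Minimal sketch of a real-time data historian: every routine reading is
# written as it happens, rather than transcribed by hand later.
# The table layout and instrument names are hypothetical.
import sqlite3
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE historian (ts REAL, instrument TEXT, tag TEXT, value REAL)")

def record(instrument: str, tag: str, value: float) -> None:
    """Append one timestamped reading to the historian."""
    db.execute("INSERT INTO historian VALUES (?, ?, ?, ?)",
               (time.time(), instrument, tag, value))

record("balance-01", "mass_g", 2.0417)
record("ph-meter-02", "pH", 7.31)
print(db.execute("SELECT COUNT(*) FROM historian").fetchone()[0])  # -> 2
```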

GUSH
A screen grab of the BaseSpace software.
Credit: Illumina
Genomics data generated by Illumina’s sequencing tools are automatically shunted to the cloud for storage and processing.

Complex Information

The lab instrumentation and informatics provider Waters is also focused on the researcher’s workday. “The reality is that the individual customer needs to run experiments and make a decision,” says Rohit Khanna, the firm’s vice president of marketing and informatics. “They have to release a batch. They have to get ready to go to the Food & Drug Administration. The more you minimize all the activity along the way, the better off you are. We have no desire to add one more point solution to the mix.”

The rise of intelligent instrumentation has increased not only the volume but also the variety of electronic data in the lab, explains Leonard Weiser, director of strategic programs at Waters. And much of it needs to be entered into a central IT infrastructure.

NEXT GEN
Two women work on a laptop in a lab.
Credit: Biovia
A new breed of researchers born in the digital age will influence decisions on how computers and automation evolve in the laboratory.

“In most labs, the traditionally structured LIMS is pretty well in place, though it may not be doing everything a customer wants,” Weiser says. “It is certainly not linked to and providing data to the larger community outside the laboratory. But where most of our labs are struggling is in moving between the structured data that fit into the LIMS and a variety of unstructured information, which could be everything from JPEG pictures, social media, or e-mail from partners.”
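Weiser’s split can be made concrete in code. In this hedged sketch, the record layout and field names are invented rather than drawn from any Waters product: the LIMS-friendly values are typed fields, while the unstructured artifacts are only tracked by reference.

```python
# Hypothetical illustration of structured LIMS data alongside unstructured
# attachments; the schema is invented, not a real LIMS or Waters format.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Any, Dict, List

@dataclass
class SampleRecord:
    # Structured fields a traditional LIMS handles well
    sample_id: str
    assay: str
    result: float
    units: str
    recorded_at: datetime
    # Unstructured artifacts (images, e-mails, raw files) linked by reference
    attachments: List[Dict[str, Any]] = field(default_factory=list)

rec = SampleRecord("S-0042", "HPLC purity", 99.2, "%", datetime.now())
rec.attachments.append({
    "kind": "image/jpeg",
    "path": "/archive/S-0042/chromatogram.jpg",  # hypothetical location
    "source": "e-mail from partner lab",
})
print(len(rec.attachments))  # -> 1
```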

Waters has developed a lab informatics platform called Unifi that Weiser describes as a melding of Empower, Waters’s chromatography data management software; MassLynx, the firm’s mass spectrometry data management tool; and NuGenesis, a Web-based data management system acquired in 2004. Unifi was introduced in the biopharmaceutical market three years ago and is being adapted for broader release next year.

Just as instrumentation has become more intelligent, researchers have become more IT savvy, Khanna notes. Weiser adds that IT departments have also evolved. “They are smaller,” he says. “They are no longer the dedicated groups supporting business applications. Now they provide a basic structure and conduit for computing.” Users manage their own applications, increasingly via mobile devices.

Khanna notes that the pendulum of IT architecture is swinging again. Over the past three decades, central mainframe computing gave way to personal computers, which in turn were displaced by “thin client” terminals that reinstated central data banks and informatics computing. Now, again, the IT power in the lab is at the bench.

Perhaps the most powerful new tool on the bench is genomic sequencing, a technique that produces reams of data. Although such systems have generally operated with minimal connection to centralized lab IT, Illumina, the leading provider of sequencing devices, has worked to link sequencing to a broader swath of research data.

“We have tried to build an ecosystem,” says Jordan Stockton, director of marketing for enterprise informatics at Illumina. “The heart and soul of genomics processing is sequencing, but it’s pretty clear to anyone doing this work that the bases that come off sequencers are useless without metadata as well as data on scientific results of people doing similar work.”

Last year Illumina acquired NextBio, a supplier of software designed to aggregate large quantities of phenotypic and genomic data. According to Stockton, many Illumina customers already use cloud computing to store and process genomic data via the firm’s BaseSpace Web networking application.

Caution From Users

Streaming data to the cloud is heady-sounding stuff, but industry analysts throw a bit of cold water on IT suppliers’ visions of the future. And users, whose attitudes toward IT range from curmudgeonly to clairvoyant, maintain a time-honored skepticism when presented with the notion of any kind of system overhaul by software vendors.

“There is an incentive to change, from a growth perspective, but there is also an aversion to risk,” says Michael Elliott, chief executive officer of the consulting firm Atrium Research, of efforts to integrate IT in research. This is especially true in regulated industries, in which alterations in manufacturing or product development usually require an elaborate validation process.

Elliott acknowledges that chemistry-oriented labs are in “a period of transition” but questions whether active consolidation among suppliers is moving research informatics in the right direction. “Many acquisitions have been happening at a very rapid pace without a clear line of sight on integration and the final state of what will happen,” he says.

Jim Brown, president of the consulting firm Tech-Clarity, points to cultural barriers within the research community as a drag on change. “Management will accept automating the lab as a good thing to do,” he says, “but scientists are not always willing to be automated and move things electronically.” Brown adds that corporate discovery, development, and commercialization efforts are characterized by “fiefdoms” that are not naturally given to sharing space, including cyberspace.

Kerry Hughes, advanced computing leader in R&D at Dow Chemical, observes that finding a vendor to meet a lab’s needs has traditionally involved a fraught exchange.

“Every time you mention a feature to a vendor, he nods his head,” Hughes says. “It’s inbred. Vendors have to say ‘yes.’ Then they will somehow pull together a combination of their features that approximates what you’re talking about. But when you bring that software in, you have to integrate it, you have to translate it. You have to do all these things, and it’s just not soup yet.”

That soup is hard to make. Corporate research labs, Hughes notes, are not monolithic entities. “Big corporations tend to buy bits and pieces of things over time,” he says, “so you have various levels of data maturity in the laboratory. And you have to play your resources in accordance with your levels of maturity. We might have a place where they are still writing things in paper notebooks. The first step there is just to get electronic notebooks on the ground.”

Dow is also investing in bringing researchers themselves up to speed. Many Ph.D. chemists and chemical engineers arrive at the company with an inadequate background in statistical analysis and require on-the-job training before Dow can derive value from IT intended to manage an increasingly data-rich lab environment.

Lloyd Colegrove, director of data services at Dow, sees the lack of analytical skill among scientists as a limiting factor in laboratory IT development as well as a reason that existing lab informatics falls short of managing research data effectively.

“Laboratories may be collecting data, but organizations by and large have no idea what to do with it,” Colegrove says. “And quality management teams, currently the primary statisticians in manufacturing and R&D labs, are unlikely to adapt to the kind of multivariate, multidimensional modeling tools currently available, given a 30-year history of using univariate, one-dimensional approaches.

“Although the art today is really, really good,” he says, “it is going to take us a long time to change these old habits.”
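Colegrove’s univariate-versus-multivariate distinction is easy to demonstrate. The sketch below uses invented data and is not a Dow method: a reading can pass every single-variable three-sigma check yet be a glaring outlier once the correlation between variables is considered, which is what a multivariate statistic such as Hotelling’s T² captures.

```python
import numpy as np

rng = np.random.default_rng(0)
# 200 historical readings of two strongly correlated process variables
# (invented data; correlation is about 0.9)
cov_true = [[0.25, 0.09], [0.09, 0.04]]
history = rng.multivariate_normal([10.0, 5.0], cov_true, size=200)
mean, sigma = history.mean(axis=0), history.std(axis=0)
cov = np.cov(history, rowvar=False)

def univariate_ok(x, k=3.0):
    # The traditional approach: one control chart per variable, in isolation.
    return bool(np.all(np.abs(x - mean) <= k * sigma))

def hotelling_t2(x):
    # Multivariate distance that respects correlation between variables.
    diff = x - mean
    return float(diff @ np.linalg.inv(cov) @ diff)

x = np.array([10.9, 4.6])      # each variable within 3 sigma on its own...
print(univariate_ok(x))        # True: every univariate chart passes
print(hotelling_t2(x) > 13.8)  # True: far beyond a ~99.9% chi-square limit
```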

Characterizing the future of lab IT at Dow, Colegrove envisions LIMS and plant data historians merging into “a giant format for all types of informatics and analytics” across the company. “But as for tying R&D into manufacturing for electronic scale-up using data and models cobbled through decades—or perhaps a century for a company like Dow—that is so far out in the future,” he says.

Large drug companies are also confronting changes in lab IT from the standpoint of their diverse base of systems and procedures. “Bristol-Myers Squibb labs vary greatly depending on the functional area,” says Jason Bronfeld, the firm’s executive director of preclinical and pharmaceutical development informatics. “We have different electronic lab notebook deployments with varying degrees of integration with other systems and equipment.”

This complex environment presents unique challenges for IT vendors. “From the perspective of information flow, today it is much easier to build your own computer from component parts or to automate and control a pilot plant than it is to assemble a laboratory,” Bronfeld points out. “I think this is a reflection of the immaturity of labs and their technology.”

He does see lab IT eventually developing to be comparable with supply-chain management software in manufacturing. “Initially, the manufacturing supply chain was an intramural exercise,” he says. “Today, companies think in terms of supply-chain networks. This same thing is happening in the lab, but because of the diversity of labs and the immaturity of lab-supply-chain infrastructure—standards, formats, interoperability, supply-chain tools—it is much more difficult.”

Standards for formatting and communicating data are vital yet generally lacking in any multivendor IT system, and they present a particular challenge in the lab, Bronfeld says. “To achieve end-to-end integration, both the format of information and the meaning of information must be harmonized across the process,” he says. “Neither of these is true today.”

A Lack Of Standards

In an effort to develop standards, 11 major pharmaceutical firms, including BMS, Amgen, Pfizer, and GlaxoSmithKline, got together in 2012 and formed a group called Allotrope Foundation.

“The lack of standards is clearly a significant part of why we are not at the lab of the future yet,” says Dana Vanderwall, one of the founders of Allotrope and associate director of cheminformatics at BMS. There is no such thing as simply plugging disparate IT components together, especially when they come from more than one vendor, he points out. “Nearly every integration is a bespoke integration. Every handoff is different.”
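In practice, Bronfeld’s two requirements, shared format and shared meaning, and Vanderwall’s bespoke handoffs come down to hand-written adapters. Both vendor formats in this sketch are invented for illustration:

```python
# Two invented instrument-export formats mapped onto one common schema.
# Real vendor files differ; this only illustrates why each integration
# ends up bespoke: both field names (format) and units (meaning) vary.
def from_vendor_a(row: dict) -> dict:
    # Vendor A reports concentration in mg/mL under its own field names.
    return {"sample": row["SampleID"], "conc_mg_ml": float(row["Conc"])}

def from_vendor_b(row: dict) -> dict:
    # Vendor B reports the same quantity in g/L (1 g/L == 1 mg/mL),
    # so meaning as well as format has to be mapped.
    return {"sample": row["sample_name"], "conc_mg_ml": float(row["c_gL"])}

records = [
    from_vendor_a({"SampleID": "S-1", "Conc": "0.52"}),
    from_vendor_b({"sample_name": "S-2", "c_gL": "0.48"}),
]
print(records)  # one harmonized schema, two hand-written adapters
```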

Allotrope has contracted with the German software firm Osthus to develop software to support IT interoperability for the pharmaceutical industry as well as related industries such as agricultural chemicals and food processing.

Odd among IT standards organizations for having an all-user membership, Allotrope recently launched a partnership program through which vendors are invited to collaborate. The program’s first workshops are scheduled to take place this week in Chicago.

The lack of standards has encouraged firms to advance a homegrown approach to IT development. Amgen, for example, developed an in-house informatics system called Research Gateway that consolidates data on molecules and targets relevant to therapeutic candidates, according to Peter Grandsard, executive director of research at the company.

“It pulls all the information you need on the screen,” he says. “It’s a handy tool for looking at the status of candidates and at the chemical matter and biological matter—but mostly the chemical matter—for many different targets as well.”

Software developers would do well to treat what firms such as Amgen are building in-house as a guide for their commercial products. They may find that the conservative wins out over the radical, even in the lab of the future.

A conservative approach does not preclude fundamental change, according to Dow’s Colegrove, but the focus is going to be on work process rather than hardware and software. “We have to work within the constraints we have today to make incremental changes for tomorrow to show people what tomorrow can bring.”

Hughes, Colegrove’s colleague, adds that he would like to see lab IT systems develop in such a way as to support the kind of random and unexpected discoveries that fuel innovation.

“The phrase I commonly use to describe the future environment we’re striving for is ‘engineered serendipity,’ ” he says. “An oxymoron to be sure, but I often find that drawing the tension between the words in phrases like this helps us to think more creatively about a problem, perhaps resulting in a vision that gains clarity over time.” Lab automation, he says, needs to provide a foundation for the human element, for the stumbling into fortuitous discoveries that often move science forward.

Vendors and lab IT managers alike express some hope for the future, with humans—a new generation of researchers born in the digital era—leading the way. “As the workforce demographics change within the pharmaceutical industry and at equipment or software vendors,” BMS’s Bronfeld says, “increased expectations about life in the digital sea will fuel innovation. In many ways, it’s these people who are pulling the lab forward, not holding it back.”


To explore the IT and automation links among labs, manufacturing, and offices, go to http://cenm.ag/cloudlab.

