When Daniela Jansen moved recently, she made a surprising discovery. There, in a box she had in storage, were her lab notebooks from graduate school at the University of Freiburg, in Germany, in the 1980s. “I don’t know how they ended up there, but they did not stay at the university,” she says.
Jansen, now a product marketing manager with the laboratory software firm Accelrys in Cologne, Germany, says she immediately saw how lab notes tossed into boxes on moving day can mean lost intellectual property, a real concern for academic laboratories. But she also realized that labs in which researchers work on paper may have bigger problems.
“I discovered two things that made the notebooks totally useless,” she says. “One thing was my handwriting, and another was that I had glued in the thin-layer chromatography plates.” The plates had deteriorated to the point of becoming illegible.
Those notebooks might just as easily have walked out the door of an industrial lab in the 1980s, but such a thing would be rare in industry today. Spurred by an increase in research data over the past decade, industrial chemistry laboratories have made enormous leaps in installing information technology (IT) for research, automation, and statistical analysis. Industrial labs have concurrently undergone a culture change in which researchers, trained to hoard experimental results, have embraced software, such as electronic laboratory notebooks (ELNs), that requires them to share data.
No such leaps have taken place in academic labs, however. This is partly because of a lack of commercial systems scaled for use in universities and partly because of a slower evolution away from proprietary research. In addition, university labs are often not ready to use more advanced Internet options, such as cloud computing, that might help.
Lab managers in academia agree, however, that the pressure is mounting to improve IT in their laboratories. The information surge experienced by all researchers provides a practical incentive to automate data collection and analysis. It also requires universities to amend undergraduate curricula to prepare students for the world of IT-based statistical analysis ahead of them. The increase in industry/academic research partnerships has introduced academic researchers to corporate IT policies along with some of the tools used in industry. And research managers who recently left industry to lead academic labs are bringing an industrial IT mind-set, if not the actual tools, with them.
Academia would seem to be a huge potential market for software system suppliers. ELNs have penetrated less than 10% of the academic research market as opposed to 60% of the industrial biopharmaceutical market, according to Atrium Research, a research IT consulting firm.
“The academic lab is a very complex area,” observes Michael Elliott, chief executive officer of Atrium. “On the one hand you have places like the Karolinska Institute and the Whitehead Institute, and on the other you have the smaller labs that are funded grant by grant.” The teaching aspect of an academic lab also differentiates it from an industrial lab, which is focused entirely on research with a more stable staff.
“It’s difficult to make generalizations about academic labs,” Elliott says, “which is why nobody from the commercial vendor side has ever gotten it right.”
For budgetary reasons, academic labs often manage data with open-source or common tools such as Microsoft Excel and with in-house-built software. “But I find an interesting dichotomy,” Elliott says. “There are concerns about having the budget and the resources to put in a commercial product, yet labs will spend an awful lot of resources and human capital to make some of these open-source solutions work or to customize and develop their own code.”
Jansen at Accelrys acknowledges the academic market is a tough sell. ELNs do not have appropriate data analysis capabilities for academic labs. Larger laboratory information management systems are of little interest because of their size, cost, and maintenance requirements. And the functionality of all these systems outstrips the needs of most academic labs.
Like most IT vendors, Accelrys has no product line specifically for academic research. Jansen says the Accelrys Notebook Cloud, a product the firm acquired in 2011, was originally developed for small labs for data collection and storage on the cloud. She and others note, however, that there is still some skepticism about cloud computing at universities.
Many academic lab managers view the current state of IT affairs in their labs as unsustainable. They are also frustrated over a lack of easy options.
Keith J. Stevenson, professor of chemistry at the University of Texas, Austin, directs a group of 30 students as part of an undergraduate program called the Freshman Research Initiative. The students in his group conduct parallel experiments in the area of nanomaterial-based metal catalysts. Each student has a dedicated research space, Stevenson says, and contributes to a databank.
The data set the group collects is complex, and managing it requires software with a sophisticated graphic component, Stevenson says. It cannot be done on Excel spreadsheets alone, but Excel is in the mix in his lab’s home-grown data management system. “We basically do it all ad hoc,” he says. “We don’t have funds to invest in something like a laboratory management system that would have all these instruments networked to a general ELN.”
But Stevenson’s lab has begun using commercial data visualization software, supplied by the software maker OriginLab, that provides students a window into the project database. Origin provided UT Austin with a site license for software with three-dimensional graphics and drawing functionality that can be applied to data and metadata analyses on a shared network.
In addition to facilitating research, the Origin software exposes students to one kind of commercial IT tool they are likely to encounter in their careers as chemists, Stevenson says.
Rosina M. Georgiadis, an associate professor of chemistry at Boston University, is also working on integrating IT into the undergraduate laboratory. She is experiencing frustration similar to Stevenson’s regarding access to affordable commercial software. One concern is the diversity of data types and files produced by various instruments in the laboratory.
“Each instrument gives you results in a different format,” she says. “A lot of times, instrumentation won’t give you raw data. It gives you ratios, precalibrated results, and so on.” The researcher has to keep track of all of it. “Are we still printing out the plots and pasting them in our notebooks? Turns out researchers are still doing that,” Georgiadis says. “There isn’t really an ELN that can keep track of all the data coming from a dozen sources.”
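The data-wrangling burden Georgiadis describes, with each instrument exporting data its own way, is often handled in labs with small ad hoc conversion scripts. As a hypothetical sketch (the file formats, field names, and instrument labels here are invented for illustration, not drawn from any particular vendor), such a script maps each instrument's export onto one shared record shape so downstream plotting and analysis see uniform data:

```python
import csv
import io

# Hypothetical per-instrument parsers: each instrument exports data
# differently, so each parser maps a raw export onto the same record shape.
def parse_uvvis(text):
    # Assumed UV-vis export: CSV with "wavelength_nm,absorbance" columns.
    reader = csv.DictReader(io.StringIO(text))
    return [{"instrument": "uvvis",
             "x": float(row["wavelength_nm"]),
             "y": float(row["absorbance"])} for row in reader]

def parse_fluorimeter(text):
    # Assumed fluorimeter export: whitespace-separated "emission intensity" pairs.
    records = []
    for line in text.strip().splitlines():
        emission, intensity = line.split()
        records.append({"instrument": "fluor",
                        "x": float(emission),
                        "y": float(intensity)})
    return records

PARSERS = {"uvvis": parse_uvvis, "fluor": parse_fluorimeter}

def normalize(kind, text):
    """Dispatch to the right parser so every data source yields a
    uniform list of {instrument, x, y} records."""
    return PARSERS[kind](text)

uv = normalize("uvvis", "wavelength_nm,absorbance\n260,0.52\n280,0.61")
fl = normalize("fluor", "520 1043\n540 998")
print(uv[0], fl[0])
```

The point of the sketch is the dispatch table: adding a thirteenth instrument means writing one more parser, not reworking the analysis code — which is roughly the labor that commercial ELN and data management systems promise to absorb.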
Georgiadis has, however, begun using Origin software to visualize experimental results. She considers it a practical tool for laboratories in which researchers will have access to data processed by instruments rather than raw data itself. Very few labs that students move on to in their studies will have that particular software, Georgiadis says, but the skills they gain by using it are transferable.
“My goal is to train students who understand the underlying chemistry and underlying foundation of the measurements they are making and how to interpret the data,” she says. “The great thing about computers is that once you have an expectation of what you want any particular software to do, you can take that expectation to a different platform and figure out how to make that platform do what you need it to do.”
Many professors and chemistry department managers have prior industrial research experience, either in chemistry or in data analysis software. Jay Deiner, a chemistry professor at New York City College of Technology, previously worked as an ink chemist at Hewlett-Packard. He is familiar with how IT is used in industrial laboratories, he says. “That is definitely not similar to what is going on in my particular lab.”
Software development generally tends toward standardization and the creation of large suites of coordinated automation and data processing. “But as a physical chemist in an academic lab, my experience is the reverse,” Deiner says. “We have a range of very specialized and sophisticated software.” Origin’s visualization software was recently added to the mix at City Tech, Deiner says, noting that IT development is financed in part by grants from the National Science Foundation and the City University of New York.
Like Georgiadis, Deiner hopes to turn that software diversity to an advantage in educating chemists. He teaches a course called Instrumental Methods of Analysis. “I try to expose students to software I hope enables them to analyze the data in more sophisticated ways—and with software they are likely to encounter if they go forward in research.”
Some academic labs are equipped with robust software because they are working directly with industry. The Moulder Center for Drug Discovery Research at Temple University’s School of Pharmacy, for example, has a Tripos Discovery 360, an advanced data workflow system used at major drug companies.
Magid Abou-Gharbia, director of the Moulder Center, came to Temple in 2008 following a 26-year career at Wyeth, where he headed chemistry and screening sciences. His mission was to create a drug discovery center at the university with a contract research services component.
“When I came in, I started from scratch,” he says. “We began with bringing in cheminformatics systems three years ago, but then we hired pharmacologists and a protein biology person.” There was a need for a system that could handle data from the full spectrum of drug research.
“I remembered the Tripos system at Wyeth,” he says. “So I negotiated with Tripos. I use the system in my lectures, so they get some PR. Nobody in academia would have thought of getting something like this a few years ago. Now, more and more industrial scientists are moving into academia, and a lot of centers of excellence are being established.”
Industrial-scale academic research labs such as the Moulder Center and the Vanderbilt Center for Neuroscience Drug Discovery are only a small corner of the academic landscape, but sources agree that cross-pollination between industry and academia in the area of research IT, even in smaller labs, is a powerful trend affecting laboratory and research culture.
There is also a consensus that the business model is primed for change. “The emphasis is on trying to get a better engagement between small labs and comprehensive IT systems,” says Mike Payne, a physics professor at the University of Cambridge who has worked as a systems development consultant to Accelrys and also with a U.K. government effort to facilitate more efficient research IT.
“To me, the solution to all of this is a move away from the old-fashioned licensing model to a software-as-a-service model,” Payne says, referring to a trend in which software users access applications via the Internet as needed rather than licensing multifunctional software. Such an approach would require vendors to begin marketing research automation and statistical analysis applications online, à la carte. But it would also require change in the academic research mind-set regarding Internet-based IT, Payne says.
“The problem with big computing in academia is that you don’t mind wasting Ph.D. students’ time learning how to log on to 500 different code systems and moving files by hand between them,” he says. But options already exist for doing things more efficiently. “There is a cloud. Who cares about which particular disc my file is on? Let’s get in the cloud.”
Anxiety over storing data in the cloud is a giant leap from worrying about it being carried out of the laboratory by accident in paper notebooks. But a greater anxiety over access to scientific information is opening minds in academia to new ideas. “We are going to need some tools that are still being developed,” Georgiadis says. “And some of them will involve cloud computing.”