
Policy

Research Misconduct

Rules for handling misconduct are now in place, but new issues are surfacing

by Bette Hileman, C&EN Washington
August 29, 2005 | A version of this story appeared in Volume 83, Issue 35

Credit: PHOTO BY JOHN RIZZO/GETTY IMAGES

The issue of research misconduct first gained broad attention in the 1980s, when Congress investigated several high-profile cases. Afterward, the federal government struggled to come up with a definition of research misconduct that was acceptable to the 18 agencies that sponsor research.

In October 1999, the White House Office of Science & Technology Policy proposed a definition that would eventually replace the variety of definitions used by federal agencies. It defined research misconduct as falsification, fabrication, and plagiarism in proposing, performing, or reviewing research, or in reporting research results. In April 2002, a federal rule on handling allegations of misconduct at those institutions receiving National Science Foundation funding was finalized. And in June this year, a similar rule for those entities receiving Public Health Service (PHS) funding was adopted. Most large universities have now set up their own guidelines on research misconduct.

The number of serious cases of misconduct involving PHS funding has varied between 10 and 14 annually, says Chris B. Pascal, director of HHS's Office of Research Integrity (ORI). The Inspector General's Office at NSF, on the other hand, handles only up to five serious misconduct cases per year.

Even though a federal definition of misconduct and rules for handling allegations are now in place, new issues have arisen that threaten to shake the scientific community.

MARTINSON
Credit: COURTESY OF BRIAN MARTINSON

In June, Brian C. Martinson, a research investigator at HealthPartners Research Foundation, and his colleagues published a paper in Nature saying that many researchers have engaged in conduct that is detrimental to research, although it does not fit the federal definition of scientific misconduct (2005, 435, 737).

And on Aug. 7, the Seattle Times published a long investigative report alleging that some physicians have been paid to engage in a serious form of misconduct. According to the article, physicians managing clinical drug trials have revealed the likely results of this research to Wall Street analysts and investors before the trial outcomes were publicized, or sometimes even known, by the drugs' sponsors. This release of information enabled investors to engage in insider trading.

Also, since 2002, a growing number of universities and institutes have been fined for making false claims to the government regarding research grants from the National Institutes of Health. Apparently, grant money for particular projects has been diverted into other research or used to cover routine expenses. Fines have ranged from $2.4 million to $6.5 million.

Despite heightened awareness of the issues of scientific misconduct, some researchers continue to engage in practices involving falsification, fabrication, and plagiarism. ORI concluded investigations of several high-profile cases last year.

FOR EXAMPLE, in September, Eric T. Poehlman will be sentenced for fabricating research data in numerous federal research grant applications while working as a professor at the University of Vermont College of Medicine. Over a decade, these grants generated about $2.9 million in funding for his research, primarily in the fields of menopause, aging, and hormone supplements. He has been ordered to retract or correct 10 research articles. Poehlman is the first scientist to be barred for life from receiving grants from the federal government, and he faces up to five years in prison.

Ali Sultan, formerly a malaria researcher at the Harvard School of Public Health, was found guilty of plagiarizing text and figures on a grant application to study malaria drugs. In a deceptive e-mail message he presented to an inquiry committee, he tried to blame one of his postdoctoral students for the plagiarism. Since then, he has accepted a position as an assistant professor of microbiology and immunology at the Weill Cornell Medical College in Qatar.

According to ORI's annual report for 2004, the number of allegations of scientific misconduct received by the office rose 50% between fiscal 2003 and 2004, and 2004's total, 274, was the highest since the federal government established a program to deal with research misconduct. Pascal, however, says the upswing in the number in 2004 does not indicate a rise in cases of serious misconduct.

"A lot of the allegations don't even fit within our jurisdiction. They are authorship disputes, Food & Drug Administration issues, a whole variety of things," he says. "Out of 270 or so allegations in 2004, we found only 30 or 40 that fit the definition of misconduct and deserved a formal inquiry or investigation," he explains. The higher number may simply reflect increased awareness of misconduct, he says.

David E. Wright, consultant at ORI and professor of history of American science and technology at Michigan State University, agrees: "The regulations on scientific misconduct have been around long enough, and there is enough training in the responsible conduct of research that people are a little more alert to problems and perhaps more willing to report them than they have been before."

Experts have a variety of theories to explain why misconduct occurs and what can be done to prevent it. Some misconduct results from deviant personalities (sociopaths) and may not be possible to eliminate entirely, except by reducing opportunities, they say. But such cases are rare.

A large number of misconduct incidents in academia stem from sloppy data management practices, Wright says. Such practices can "allow misconduct to occur, especially by junior people, and perpetuate itself over a period of time," he explains. Junior people who are not well-mentored can get caught up in this, he says.

"Variability and flexibility in data management do give the opportunity for individuals who want to take advantage of the system to cheat," Pascal says.

Industry, on the other hand, usually has strict data management practices that require lab notebooks to be dated and countersigned each day. But it is prone to another kind of misconduct. Scientists working for pharmaceutical companies often feel pressured, consciously or unconsciously, to design clinical trials so they will favor their company's product over the competitor's, Wright says. "And now that academia is accepting a lot of research money from industry, academic scientists may feel some of the same pressures."

GILAD
Credit: COURTESY OF DEBBI GILAD

ANOTHER CAUSE of misconduct is that researchers' expectations differ somewhat between academia and industry. Among experts contacted by C&EN, there is a general consensus that researchers in industry value secrecy. They publish as few details of their research as possible in an attempt to keep competitors from stealing their discoveries. And if they manage to steal a discovery from a competitor without violating a patent, that is considered commendable.

In the academic world, on the other hand, "secrecy is not supposed to factor in," says Debbi G. Gilad, research integrity and compliance officer at the University of California, Davis. "You are supposed to be sharing your data, sharing your results for the good of the academic community and perhaps for mankind," she explains. "In the private sector, it's about keeping your cards close to you because they could be a trade secret that someone could turn into something bigger for his or her company." When industry and academic researchers collaborate, there is the potential for some misunderstandings and differing expectations, she says.

One of the best ways to prevent research misconduct is to "get the institutions to adopt standard data policies" that apply throughout the organization, Pascal says. "The policies need to have sufficient detail to provide guidance to research staff," he says. If there are a wide variety of disciplines within the institution, then each discipline needs a data policy that is relevant to the specific type of research it is involved in, he explains.

In addition, the lab chief should provide mentoring or formal training on data policies, Pascal says. "Some lab chiefs do this already because they think it is important, but many do not.

"If the data are not correct, we are just wasting our time, even if the problem does not involve research misconduct," he observes. "If the data are sloppy if data points are eliminated to make a graph look prettier, things like that we are wasting our money, and the research won't be helpful to the community."

Training programs in the responsible conduct of research carried out by universities can help prevent misconduct, Wright says. "These programs are gathering some momentum," he says. ORI has developed a set of nine principles on the responsible conduct of research covering data management, conflicts of interest, human subjects, animal welfare, mentorship, peer review, research collaborations, and publication and authorship issues. It also has written a 93-page guidance document that explains how to implement the principles.

"Some universities have had to step up to the plate very aggressively with training programs because of unfortunate events that happened at their institutions," Gilad says. "Other institutions are not responding as proactively or as aggressively as they should." One of the best ways to make people aware of these issues is "through reaching out to the younger researchers and hoping that we can provide them the tools that they need to help them through difficult challenges," she says. Once they talk about these misconduct issues among themselves, "that, perhaps, will have an osmosis effect on more senior researchers," she explains.

Another major deterrent to misconduct, Pascal says, "would be to have authorship policies within the institutions," that is, policies that spell out the requirements for being listed as an author or coauthor on a scientific article. Often the individual principal investigator decides what to do about authorship without "a lot of good communication with the other researchers in the group," he says.

In his commentary in Nature, Martinson contends that a high percentage of researchers engage in a wide range of questionable behavior that threatens the integrity of science. With support from ORI and NIH, he and two colleagues from the University of Minnesota conducted an anonymous survey of about 6,900 mid-career and early-career scientists who were funded by NIH grants. The scientists were asked whether they had engaged in 16 specific behaviors during the past three years. About 3,250 scientists responded to the survey.

Only 0.3% of the respondents said they had falsified or cooked research data, and 1.4% admitted to plagiarism. But 33% said they had engaged in at least one of the questionable behaviors outside of falsification, fabrication, and plagiarism that can damage the integrity of science. These questionable behaviors include "failing to present data that contradict one's own previous research," "changing the design, methodology, or results of a study in response to pressure from a funding source," and "dropping observations or data points from analyses based on a gut feeling that they are inaccurate."

This survey shows, Martinson says, "there are a lot of behaviors outside the definition of research misconduct that are at least in a gray area, and some are clearly wrong." In the aggregate, he continues, these behaviors have a more corrosive influence on science than the rare instances of outright falsification, fabrication, and plagiarism.

In the U.S., he explains, the commonly accepted wisdom about research misconduct has been that it is "fraud and fraud is rare." It has been assumed that the blame for misconduct should fall on individual scientists and that "fixing" the behavior of individuals will solve all the problems, he says.

In contrast, Martinson says, the root cause of many questionable behaviors lies not in individuals, but in institutional structures within which science operates. "We've got a different set of drivers operating today than we had 30 or 40 years back," he says. "The kinds of financial incentives, the kinds of commercial pressures, the kinds of competitive pressures we have among scientists today are not supportive of the traditional norms of science."

Martinson observes that "the rank and file of scientists today are actually composed of very junior researchers: graduate students, postdocs, research fellows, research associates, temporary faculty members," people with marginal careers and marginal attachments to their institutions. "Many of these scientists told us that the processes used to distribute resources in science are not fair," he says, and scientists with these opinions were significantly more likely to have engaged in questionable behavior.

There is a serious disconnect between the supply of scientists on the one hand and the demand for them on the other, Martinson explains. The number of tenured positions in academia is very low compared with the number of graduate students who aspire to such positions, he says. "We have ramped up the competition for the limited resources in science to a point that is very likely leading to dysfunction. Some of the dysfunction is showing up in the misbehaviors seen in the survey."

The investigation published by the Seattle Times uncovered a very different sort of problem, one involving physicians mostly at the top of the economic ladder. It identified 26 cases in which doctors at leading universities and research institutions, some of whom were managing clinical drug trials, revealed or hinted at the outcomes of the trials to Wall Street analysts and investors. In contracts with pharmaceutical companies, the doctors had signed confidentiality agreements to keep all information about the trials secret until drug sponsors announced the results.

The physicians allegedly were paid $300 to $500 per hour to consult with Wall Street investors. Some of their revelations resulted in large swings in the prices of certain pharmaceutical stocks.

The Securities & Exchange Commission (SEC) is now investigating these cases. If the allegations are true, the doctors could be convicted of breach of contract, and the doctors and the investors could be convicted of insider trading. The financial implications of the cases are large. But the cases most likely have another negative effect: Leaking information about drug trials can introduce bias into the trials.

Eric J. Topol, a prominent cardiologist at the Cleveland Clinic at Case Western Reserve University, and David Blumenthal, a physician at Massachusetts General Hospital, laid the foundation for the Seattle Times investigation in a June 1 commentary in the Journal of the American Medical Association (2005, 293, 2654). In the article, they claim that almost one out of 10 U.S. physicians is currently engaged in a formal consultancy with the investment industry and that the proportion of academic physicians "with relationships to financial entities" is likely "considerably higher." The largest category of investment firms involved with doctors is hedge funds, which bet on stocks to rise or fall and manage more than $1 trillion in assets. "These relationships create opportunities both for useful exchange of information and for potential conflicts of interest, particularly in the area of clinical research," Topol and Blumenthal write.

Universities and research institutes are also under scrutiny. Over the past three years, suits have been filed against four universities and the Mayo Clinic claiming that NIH research grant funds were not used properly. The suits were filed with the Justice Department under the False Claims Act.

In May 2005, the Mayo Clinic paid $6.5 million to settle a suit, and in April 2005, the University of Alabama, Birmingham, settled a suit for $3.4 million. In 2004, two false claims suits relating to NIH grants were settled: one by Harvard University for $2.4 million and one by Johns Hopkins University for $2.6 million. In 2003, Northwestern University settled a false claims suit for $5.5 million.

IN EACH CASE, a whistleblower filed suit under the False Claims Act's qui tam provisions. These allow private parties to sue entities that have submitted false claims to the government and receive a portion of the settlement if the Justice Department reaches a monetary agreement with the defendant. And in each case, the defendant has not admitted any wrongdoing.

"By definition, grants are open-ended for funding research in a particular area, and granting agencies are fairly generous in according grantees the right to follow promising leads within the general area of their projects," says Howard H. Garrison, director of the Office of Public Affairs at the Federation of American Societies for Experimental Biology. "It is my understanding that most grants and granting agencies are fairly comfortable with scientists following whatever directions their research seems to be leading them."

It remains to be seen whether universities will continue to be sued over misuse or diversion of grant funds. Also, it is not clear that universities and other research institutions will try to deal with the grayer areas of research misconduct that fall outside the standard definition. It is clear, however, that the issue of physicians giving advice to investors will receive a great deal of scrutiny, both by SEC and by the academic community.
