


Policy

NSF Revamps Data-Sharing Policy

Mandatory grant application supplements detailing how data will be shared will increase transparency, agency hopes

by David Pittman
September 27, 2010 | A version of this story appeared in Volume 88, Issue 39

Credit: Shutterstock

The National Science Foundation will soon require that grant applications contain a detailed plan explaining how researchers expect to publicly share the data generated by the agency’s funding. Applications without the newly required supplements, which can be up to two pages long, will be ineligible for funding. NSF is still finalizing details—what data must be shared, how, and when. Meanwhile, some scientists are skeptical about the value of the new requirement.

“My understanding is that a detailed description of the new requirement will be released in October,” says Celeste Rohlfing, acting director of NSF’s Chemistry Division. “But proposals need not address it until mid-January.”

NSF insists that its current policy on data sharing isn’t changing; rather, agency officials have been dissatisfied with researchers’ inconsistent compliance with it. “Investigators are expected to promptly prepare and submit for publication … all significant findings from work conducted under NSF grants,” the current policy states. It goes on to say that NSF expects researchers to share primary data with other researchers at only incremental cost and within a reasonable time.

About NSF’s current approach, “The enforcement could be stronger,” says Philip Bogden, NSF program director for the Office of Cyberinfrastructure and chair of the NSF data work group that developed the revised recommendations on data sharing. He says the recently formed National Science Board (NSB) Task Force on Data Policy will likely investigate adding more oversight to strengthen the policy, such as the data-sharing plans it will soon require of grant applications.

By requiring a data-sharing plan, NSF officials believe they’ll get a better handle on the effectiveness of the agency’s current approach. For example, NSF can consider how well researchers have followed through on their stated data-sharing plans when deciding whether to approve future applications. Noncompliance with past plans, therefore, could spell doom for future research proposals. NSF will also require investigators to discuss their data sharing in their projects’ final reports.

With an annual budget near $7 billion, NSF receives more than 45,000 competitive grant applications a year and awards more than 11,500 requests annually.

Mandatory data-sharing plans in grant applications are the result of pressure from Congress and the NSB Task Force on Data Policy. “Congress feels any research done with federal dollars has to be available to everybody,” says Maria K. Burka, NSF program director for process and reaction engineering who has worked with the NSF data work group.

By forcing researchers to publicly share their research data, NSF officials hope to provide greater transparency to taxpayer-supported research, enable scientists to better verify and reproduce work, and better support collaboration and record keeping in the science fields.

Applicants for NSF funding can expect to see the complete details of the new requirements at least 90 days before the change goes into effect. The agency plans to post on its website sometime in October a list of frequently asked questions (FAQ), as well as specific guidance from its various directorates, divisions, and programs. NSF officials expect to begin requiring proposal applications to include a data-sharing plan by mid-January.

NSF emphasizes that it is not implementing a single, rigid policy for the data it will require researchers from different disciplines to release. Recognizing that each field of science has its own nuances, the agency instead will allow the individual disciplinary divisions to craft their own plans on data sharing.

“What constitutes reasonable procedures will be determined by the community of interest through the process of peer review and program management,” Bogden says. “These standards are likely to evolve as new technologies and resources become available.”

That approach, however, leaves a lot of ambiguity, especially for researchers who may apply to different NSF divisions for funding. According to a draft FAQ, “data” could encompass not only data itself but also publications, samples, physical collections, software, and models.

José-Marie Griffiths, chair of the NSB Task Force on Data Policy, says researchers will have to share more data than what is routinely published in most academic journals. “I do think a lot of individual researchers would think that when they publish their results they are in fact sharing their data,” Griffiths says. “But of course, they’re sharing what they’ve done with the data, the results, and the outcome.”

NSF wants to ensure the release of all of the primary or original data generated during the research—not just those that are traditionally published—so other scientists can verify and replicate the work. Many researchers will likely use their university’s archiving systems or websites to post data that NSF expects them to share.

As to when data must be made available, “The expectation is that all data will be made available after a reasonable length of time,” the draft FAQ states. “However, what constitutes a reasonable length of time will be determined by the community of interest through the process of peer review and program management.”

Bogden says NSF does not have a “one-size-fits-all data management plan. It really ends up being very respectful of each community’s needs.” He notes also that a data-sharing requirement in grant applications is not the same as an open-access mandate, which generally refers to the availability of journal articles to the public.

Rohlfing says chemists shouldn’t be apprehensive about the pending changes. She reiterates that NSF is striving for “clear, effective, and transparent implementation of long-standing NSF policy.”

“The Chemistry Division will make implementation as streamlined as possible. The required data-management plan is a maximum of two pages and can be much shorter. Robust data management practices are becoming essential as the nature of chemical research changes,” Rohlfing says. “Intellectual property will be respected. The intent of this policy is to further discovery.”

While NSF officials develop the new guidelines, some in the chemistry world wonder what good the new requirement will do.

“Not disseminating information that you get from your research is scientifically like being against mom and apple pie,” says Georgetown University chemistry professor Richard G. Weiss, who currently holds two NSF grants. “It sort of befuddles me that they would put in this requirement.”

NSF’s annual progress report, which principal investigators must complete, and current proposal review forms already ask how findings will be disseminated, according to Louis Kuo, a chemistry professor at Lewis & Clark College, in Portland, Ore. “For the sake of accountability,” the mandatory application supplement for data sharing “is a good idea,” Kuo says. “But procedurally, it’s redundant.”

“Increasing these formal requirements doesn’t do anything in my opinion,” says Anatoly B. Kolomeisky, a chemistry professor at Rice University.

And sharing data beyond what is published and presented at scientific meetings could present complications, says Kenneth M. Nicholas, an organometallic chemistry researcher at the University of Oklahoma, Norman. It could encroach on patenting opportunities, he notes, and could lead to elimination of traditional oversight practices, such as peer review before publication. NSF officials, he says, must walk a fine line between “balancing that right to know against intellectual property and scientific accuracy.”
