
Energy

Call For Clean Energy Innovation

Summit explores applications of high-performance computing to achieve more efficient energy use

by Rajendrani Mukhopadhyay
June 13, 2011 | A version of this story appeared in Volume 89, Issue 24

SUPERCOMPUTER
Credit: Los Alamos National Laboratory
Roadrunner, the supercomputer at Los Alamos National Laboratory, is being used to answer fundamental scientific questions, such as how materials perform under extreme conditions.

With the turmoil in the Middle East and North Africa, high gas prices in the U.S., and growing alarm over climate change, energy issues can no longer be ignored. But securing reliable and affordable energy in the U.S. remains a challenge that will require scientific, technological, and policy innovation.

BIODIESEL QUALITY
Credit: Shutterstock
High-performance computing is helping to explain why biodiesel in Europe, such as that produced by this Italian facility, performs better than biodiesel made in the U.S.

High-performance computing—where complicated computations get solved in supercomputers and computer clusters—is one technique that researchers are using to help solve long-standing problems, such as biofuel efficiency. This approach and the need for a strategic energy policy that provides direction and support for implementing innovations were the focus of last month’s National Summit on Advancing Clean Energy Technologies, a two-day summit organized by the Bipartisan Policy Center, the Howard Baker Forum, and Lawrence Livermore National Laboratory (LLNL). Speakers came from disciplines spanning finance, government, industry, and academia.

The summit focused on various forms of energy, including nuclear, coal, and liquid fuels, and delved into how high-performance computing could help create more efficient and less harmful ways of exploiting different energy sources. John P. Holdren, the White House science adviser, explained that the world isn’t in danger of running out of fossil fuels in the near future, but it faces “economic, political, and environmental risks of fossil-fuel dependence using current technologies.” The alternatives to fossil fuels, such as nuclear power and biofuels, in their current state are riddled with similar economic, political, and environmental risks. “We need technological innovation,” Holdren said bluntly.

To meet this call for innovation, scientists are using high-performance computing. For example, Charles Westbrook, retired division leader for physics and chemistry at LLNL and current president of the Combustion Institute, explained how modeling and simulations are helping researchers understand biofuels.

Diesel is described by a parameter called the cetane number, which is a measure of the combustion quality of the fuel during compression ignition. Biodiesel in the U.S. is made from soybean oil and has a cetane number of 47, Westbrook explained. But biodiesel in Europe, based on rapeseed and canola oil, has a cetane number of 54, indicating it’s a more efficient fuel. Researchers are using supercomputers and computer clusters to understand the difference in cetane number, and therefore in performance, between these two biofuels.

“We are starting to understand that the differences in the cetane numbers of the biodiesels are due to the number of carbon-carbon double bonds buried deep in these molecules,” Westbrook said. He noted that high-performance computing helped explain at the molecular level the fundamental concept of the cetane number.
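To make the trend Westbrook describes concrete, the sketch below computes a composition-weighted cetane number for two biodiesel blends. The per-ester cetane values and blend fractions are hypothetical placeholders for illustration only; the LLNL work relies on detailed chemical-kinetics simulations run on supercomputers, not this kind of back-of-envelope average.

```python
# Illustrative only: relate the average number of C=C double bonds in a
# biodiesel blend to a composition-weighted cetane number.
# All per-ester cetane values and blend fractions below are assumptions.

# ester name: (number of C=C double bonds, assumed cetane number)
esters = {
    "methyl_stearate":   (0, 75),
    "methyl_oleate":     (1, 58),
    "methyl_linoleate":  (2, 40),
    "methyl_linolenate": (3, 25),
}

# Hypothetical mass fractions for a soy-based and a rapeseed-based blend.
blends = {
    "soy_based":      {"methyl_stearate": 0.15, "methyl_oleate": 0.25,
                       "methyl_linoleate": 0.52, "methyl_linolenate": 0.08},
    "rapeseed_based": {"methyl_stearate": 0.10, "methyl_oleate": 0.62,
                       "methyl_linoleate": 0.20, "methyl_linolenate": 0.08},
}

for name, comp in blends.items():
    avg_double_bonds = sum(f * esters[e][0] for e, f in comp.items())
    blend_cetane = sum(f * esters[e][1] for e, f in comp.items())
    print(f"{name}: avg C=C per molecule = {avg_double_bonds:.2f}, "
          f"blend cetane ~ {blend_cetane:.0f}")
```

With these assumed numbers, the blend richer in polyunsaturated esters (more carbon-carbon double bonds) comes out with the lower cetane number, which is the qualitative relationship Westbrook points to.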

Another example of the impact high-performance computing is having on scientific and technological R&D was noted by David S. Sholl, a professor of chemical and biomolecular engineering at Georgia Institute of Technology. He described how his team is using modeling to trim years off the R&D timetable in designing new membrane materials to capture carbon dioxide at a coal-fired power plant.

“There are, conservatively, 10,000 different materials we could use in making these membranes,” Sholl said, adding that it’s “simply impractical to do 10,000 different experiments, each of which might take three weeks or a month.” To rapidly sort through which materials hold the most promise, Sholl’s team applies high-performance computing to make “quantitative estimates of the properties of these materials so we can rapidly go from 10,000 to perhaps 100 materials.”
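A minimal sketch of that screening workflow appears below. The scoring function is a hypothetical stand-in for the atomistic property calculations Sholl’s group actually performs; only the shape of the workflow, computing a figure of merit for every candidate and keeping roughly the best 1%, reflects the approach described.

```python
# Sketch of high-throughput computational screening: score every candidate
# material cheaply in silico, then shortlist the best for slow experiments.
# predicted_performance() is a placeholder, not a real property model.
import heapq
import random

def predicted_performance(material_id: int) -> float:
    """Stand-in for a computed figure of merit, e.g. predicted CO2
    permeability combined with CO2/N2 selectivity."""
    return random.Random(material_id).random()   # deterministic fake score

candidates = range(10_000)        # the ~10,000 conceivable membrane materials

# Keep the ~100 best-scoring materials (about 1%) for lab testing,
# instead of running 10,000 month-long experiments.
shortlist = heapq.nlargest(100, candidates, key=predicted_performance)

print(f"screened {len(candidates)} candidates, kept {len(shortlist)} for experiments")
```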

However, Thomas Mason, the director of Oak Ridge National Laboratory, pointed out that there are still limits to what the current state of the art in high-performance computing is capable of handling. Specifically, he emphasized that computers have difficulty dealing with complex problems such as climate change, one of the big factors that will impact future energy consumption.

Mason explained that high-performance computing is now limited in the amount of data it can handle. This technical limitation forces researchers to average the parameters that go into describing weather over large grid cells, which greatly limits the chemical and physical conclusions they can draw about how weather and climate change are interconnected. Capturing in computer simulations all the rich chemical and physical science that occurs in small, local regions and then understanding it on a more global scale “is something we absolutely cannot do with today’s computers,” Mason stated.
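The averaging Mason refers to can be pictured as coarse-graining a finely resolved field onto a much coarser grid, as in the sketch below. The grid sizes are arbitrary illustrative choices; real climate models involve far more variables and physics.

```python
# Illustration of sub-grid averaging: detail resolved on a fine grid is
# collapsed to one mean value per coarse cell, so local chemistry and
# physics inside each cell can no longer be distinguished.
import numpy as np

fine = np.random.rand(1024, 1024)   # hypothetical fine-scale field (e.g., humidity)

block = 128                         # each coarse cell covers a 128 x 128 patch
n = fine.shape[0] // block
coarse = fine.reshape(n, block, n, block).mean(axis=(1, 3))

print(fine.shape, "->", coarse.shape)   # (1024, 1024) -> (8, 8)
```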

That was also the message delivered by David Turek, vice president of deep computing at IBM. He said innovation is sorely needed to enable computers to manage myriad forms and sources of data without sucking up large amounts of energy.

According to Turek, the problem ahead isn’t how to build a faster computer processor, but rather how to handle the data deluge that comes from all kinds of devices, from mobile phones to radio-frequency identification tags attached to merchandise in retail stores. The data deluge will continue to grow and has to be handled in a timely manner, often within seconds. “We need to reconsider architecture and structure of high-performance computing systems to be something more than just partial differential equation solvers,” Turek stated. “We have to conceptualize them quite explicitly to do data analytics effectively in real time.”
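As a toy illustration of the shift Turek describes, away from long batch computations and toward answers that stay fresh within seconds, the sketch below keeps a running average over a short sliding window of incoming readings. The data source and window length are invented for the example.

```python
# Toy real-time aggregation: summarize a continuous stream of readings
# (phones, RFID tags, sensors) over the last few seconds instead of
# storing everything for a later batch job.
from collections import deque
import random
import time

WINDOW_SECONDS = 5.0
window = deque()                     # (timestamp, value) pairs still in the window

def ingest(value: float) -> float:
    """Add one reading and return the average over the last WINDOW_SECONDS."""
    now = time.time()
    window.append((now, value))
    while window and now - window[0][0] > WINDOW_SECONDS:
        window.popleft()             # discard readings that have aged out
    return sum(v for _, v in window) / len(window)

for _ in range(5):
    print(round(ingest(random.random()), 3))
```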

With that performance comes the problem of energy use, which will worsen over time. “There is a huge amount of energy consumption in the deployment of high-performance computing,” Turek said. He explained that by the end of this year, 100 billion kWh of energy will be spent just running data centers. “Every time you do another Google search, remember, somewhere, somehow, a watt is being spent,” he noted. “All of us are contributors to this problem.”
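Taking Turek’s figure as annual data-center consumption in kilowatt-hours (an assumption; the talk gave only the number), a quick conversion shows what that implies as a continuous power draw:

```python
# Back-of-envelope conversion of the data-center figure, assuming it refers
# to 100 billion kilowatt-hours consumed over a year (an assumption).
annual_kwh = 100e9
hours_per_year = 24 * 365
average_power_kw = annual_kwh / hours_per_year            # kWh per hour = kW
print(f"average continuous draw ~ {average_power_kw / 1e6:.1f} GW")   # ~11.4 GW
```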

The summit also included discussions of the need for a comprehensive U.S. energy policy to foster and implement innovations such as those resulting from studies using high-performance computing. Such a robust policy has been sorely lacking in the past, despite much talk over the decades about the U.S.’s need to bolster its energy security.

“In the U.S., we’re not so much addicted to oil; we are addicted to simple solutions and silver bullets,” said Shirley Ann Jackson, president of Rensselaer Polytechnic Institute, as she articulated why the U.S. has been dogged by energy problems. “As a consequence, for almost 40 years, we have failed to put into place strategic, comprehensive, and robust energy security policies.”

Jackson urged the implementation of a national strategic energy master plan, calling it “a tool for rationalizing, focusing, and prioritizing the decisions of government, industry, and academia.” Such a plan would encompass a diverse energy portfolio and would include timelines and goals to implement the portfolio.

Similarly, retired Gen. James Jones, former national security adviser for President Barack Obama, lamented the potential harm facing the U.S. in the absence of an energy policy. “No other issue is a bigger threat to our long-term national security than is our continuing failure to address our energy challenge,” he said.

During his talk, Jones described two pressing issues: the twin needs for structural and organizational reforms and a strategic energy master plan. “If we strategically organize and equip our federal agencies to succeed in the 21st century, and [if] we fund successful programs at levels commensurate with the challenges we hope to overcome, we can produce new innovative technologies and overcome immense energy challenges,” he said.

He urged that federal agencies be reorganized and consolidated so that they could face energy challenges with a more united front. Without a more efficient government, “we will be held hostage to what I call the ‘à la carte technology solutions,’ which are interesting when we hear about them, but they disappear very quickly and nothing seems to get done,” Jones stated. “The diffuse network of decisionmakers operating in a complex environment, many times with unclear or unaligned goals, must be coordinated strategically.”

Steven E. Koonin, undersecretary for science at the Department of Energy, noted that making changes to the energy infrastructure is not going to be easy because of its sheer size, complexity, and expense. He and others pointed out that countries like China are investing heavily in a wide variety of energy sources and, in many ways, outpacing the U.S. in their investment in petroleum alternatives.

But the U.S. has a notable advantage over other countries in high-performance computing, which “will not only help the competitiveness but our ability to change the energy system rapidly,” Koonin said.

When the summit drew to a close, Tomás Díaz de la Rubia, deputy director for science and technology at LLNL, said that the wide-ranging discussions from the summit would be used by the organizing committee to develop a national strategic plan. The stakeholder-developed plan would outline a partnership between the public and private sectors to enable effective adoption of innovative and clean technologies by the energy sector.
