Volume 95 Issue 2 | pp. 5-6 | News of The Week
Issue Date: January 9, 2017 | Web Date: January 5, 2017

Density functional theory heads the wrong way

Theoretical method may be getting the right answer for the wrong reasons
Department: Science & Technology
News Channels: Organic SCENE, Materials SCENE
Keywords: theoretical chemistry, computational chemistry, DFT, density functional theory
[Figure: User-friendly DFT methods, including ones based on the local density approximation (LDA) and variants of the generalized gradient approximation (GGA), gave increasingly more accurate electron density values until about 2000. Since then the errors have been growing. Credit: Ivan S. Bushmarinov/Russian Academy of Sciences]

Density functional theory (DFT) is a widely used computational method for carrying out quantum calculations in chemistry, materials science, and biology research. Despite its enormous popularity and ongoing modifications and updates, DFT seems to be getting worse at predicting key electron properties, according to a study (Science 2017, DOI: 10.1126/science.aah5975).

The finding suggests that users of DFT programs should carefully evaluate and benchmark results of their computations. And it may encourage DFT specialists and developers to redouble their efforts to improve the method’s capabilities.

For decades, researchers have depended on quantum methods to calculate electronic structures, bond lengths, and molecular geometries and energies. The values of those properties and others can be obtained with high accuracy from wave-function-based quantum methods. But applying those methods to all but the simplest chemical systems is complex, arduous, and expensive.

DFT simplifies the calculations. It sidesteps the use of wave functions to account for the motions of a molecule’s atoms and electrons. Instead, DFT determines electronic properties from the three-dimensional densities of the systems’ electron clouds. That simplification has helped put quantum calculations in the hands of large numbers of researchers, not just hard-core theoreticians.
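In textbook terms, Kohn-Sham DFT rests on the Hohenberg-Kohn result that the ground-state energy is a functional of the electron density ρ(r) alone. A sketch of the usual decomposition of that energy (standard notation, not taken from the study itself) is:

E[\rho] = T_s[\rho] + \int v_{\mathrm{ext}}(\mathbf{r})\,\rho(\mathbf{r})\,d\mathbf{r} + \frac{1}{2}\iint \frac{\rho(\mathbf{r})\,\rho(\mathbf{r}')}{\lvert\mathbf{r}-\mathbf{r}'\rvert}\,d\mathbf{r}\,d\mathbf{r}' + E_{\mathrm{xc}}[\rho]

Here T_s is the kinetic energy of a non-interacting reference system, the middle terms describe the external potential and the classical electron-electron repulsion, and E_xc, the exchange-correlation functional, is the only piece that must be approximated. The hundreds of competing "functionals" are, in essence, competing approximations to E_xc.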

But as Michael G. Medvedev and Ivan S. Bushmarinov of the Russian Academy of Sciences and coworkers now report, although DFT continues to provide ever more accurate energy values, thanks to ongoing method development and refinements, it is getting worse at correctly predicting electron densities.

The team carried out DFT calculations on 14 types of atoms and ions using 128 different functionals (the mathematical recipes that map an electron density to an energy) developed since 1970. They compared those results with ones obtained from high-level ab initio wave function methods, which are known to be highly accurate. They found that until about 2000, DFT-calculated energies and electron density values improved hand in hand. Since then, however, the energies have continued to improve, but the densities have become less accurate.
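The paper's full benchmarking protocol is more elaborate, but the basic comparison can be sketched in a few lines with the open-source PySCF package. Everything below (the choice of neon, the PBE functional, the basis set, and the integrated absolute-error metric) is an illustrative assumption, not the authors' setup:

# A minimal sketch, assuming PySCF is installed: compare a DFT electron
# density against a coupled-cluster reference density on a numerical grid.
import numpy as np
from pyscf import gto, scf, dft, cc
from pyscf.dft import numint

# A small closed-shell test atom; the study used 14 kinds of atoms and ions.
mol = gto.M(atom="Ne 0 0 0", basis="cc-pvtz")

# DFT density for one functional (change mf.xc to scan other functionals).
mf = dft.RKS(mol)
mf.xc = "PBE"
mf.kernel()
dm_dft = mf.make_rdm1()  # one-particle density matrix in the AO basis

# CCSD reference density, transformed from the MO to the AO basis.
hf = scf.RHF(mol).run()
mycc = cc.CCSD(hf).run()
dm_mo = mycc.make_rdm1()
dm_cc = hf.mo_coeff @ dm_mo @ hf.mo_coeff.T

# Evaluate both densities on the same quadrature grid and compare.
grids = dft.gen_grid.Grids(mol)
grids.build()
ao = numint.eval_ao(mol, grids.coords)
rho_dft = numint.eval_rho(mol, ao, dm_dft)
rho_cc = numint.eval_rho(mol, ao, dm_cc)

# Grid-weighted integral of |rho_DFT - rho_CC| over all space, in electrons.
err = np.sum(np.abs(rho_dft - rho_cc) * grids.weights)
print(f"Integrated density error, PBE vs. CCSD: {err:.6f} e")

Repeating such a comparison across many functionals, grouped by the year each was published, is the kind of analysis that reveals the trend the authors report.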

For some applications in chemistry and biology, the energies and geometries of molecules are the most important pieces of information, says Sharon Hammes-Schiffer, a chemistry professor and specialist in computational methods at the University of Illinois, Urbana-Champaign.

“If the electron density does not affect these properties, then perhaps the inaccurate electron density is irrelevant,” she adds. An inaccurate density, that is, may have little effect on chemical bonding.

The key issue in Hammes-Schiffer’s view is that some modern functionals “may be giving the correct energies for the wrong reason.” She argues that this subject merits further investigation because “most scientists would prefer to obtain the correct answer for the correct reason.”

Describing the study as “provocative,” Martin Head-Gordon, a theoretician at the University of California, Berkeley, says one way forward is a combinatorial approach to developing new functionals that use fewer adjustable parameters than some modern ones do. His group is working on that approach and plans to test it on electron densities.


This article has been translated into Spanish by Divulgame.org.

 
Chemical & Engineering News
ISSN 0009-2347
Copyright © American Chemical Society
Comments
Andreas Klamt (Tue Jan 10 08:41:24 EST 2017)
The result correlates with my experience that the use of modern functionals never improved the electrostatic properties, i.e. the solvation energies, over the BP86 quality. That is the reason why we still develop our solvation model COSMO-RS based on the "old" functional BP86. We have tested many of the new functionals which are all computationally more expensive, but none of them resulted in better solvation free energies.
Sergio Ricardo de Lazaro (Wed Jan 11 12:36:16 EST 2017)
The efficiency of DFT methods has always been a central point of discussion, since these calculations take the electron density as the foundation for all properties. With the first functionals, such as LDA and GGA, the results could be quite confusing; a notable example was the possibility of the electron density not being localized around the nucleus. The evolution of functionals is therefore always positive, because the results obtained always depend on them. Of course, good calculations require a solid grasp of fundamental quantum chemistry concepts to analyze the results, and many parameters can change a structural simulation and the properties obtained from it.
The precision of quantum calculations rests mainly on microscopic energies, such as Hamiltonian energies and the total energy, because these are calculated directly. Other energies, especially those associated with weak interactions, are more difficult because they are calculated indirectly.
Things are more complicated than a general analysis of a single property suggests.
Sant (Mon Jan 16 19:37:02 EST 2017)
I am not surprised. We started seeing a lot more scatter with the new functionals, and it often required us to go back to older functionals to really identify the source of a problem. The training sets used are really inadequate, and more holistic approaches are currently doing a better job. I think the community mostly trusts the theory groups to do a good job and to be careful in managing the complex couplings in parameter space. I still think the article is overly biased toward the troubles. The trouble mainly comes from a whole lot of new functionals trying to push DFT past its limits. It may only be telling us that there is some inherent limitation to DFT and its accuracy. This is not a bad thing: it is a call to action to move beyond DFT.