Abstract
A deep learning approach to density functional theory achieves higher accuracy than traditional methods while maintaining computational efficiency by learning electronic structure representations directly from data.
Density Functional Theory (DFT) underpins much of modern computational chemistry and materials science. Yet, the reliability of DFT-derived predictions of experimentally measurable properties remains fundamentally limited by the need to approximate the unknown exchange-correlation (XC) functional. The traditional paradigm for improving accuracy has relied on increasingly elaborate hand-crafted functional forms. This approach has led to a longstanding trade-off between computational efficiency and accuracy, which remains insufficient for reliable predictive modelling of laboratory experiments. Here we introduce Skala, a deep learning-based XC functional that surpasses state-of-the-art hybrid functionals in accuracy across the main-group chemistry benchmark set GMTKN55 with an error of 2.8 kcal/mol, while retaining the lower computational cost characteristic of semi-local DFT. This demonstrated departure from the historical trade-off between accuracy and efficiency is enabled by learning non-local representations of electronic structure directly from data, bypassing the need for increasingly costly hand-engineered features. Leveraging an unprecedented volume of high-accuracy reference data from wavefunction-based methods, we establish that modern deep learning enables systematically improvable neural exchange-correlation models as training datasets expand, positioning first-principles simulations to become progressively more predictive.
Community
Today we're sharing a major Skala update: a new paper and model release.
Skala is a deep-learning exchange-correlation functional for DFT developed at Microsoft Research. It currently reaches 2.8 kcal/mol on GMTKN55, wins 32 of the 55 subsets, and leads in accuracy among all functionals below the double-hybrid rung, while retaining semi-local DFT cost.
Why does this matter? Electrons are the glue holding atoms together in molecules and materials. Better XC functionals mean better predictions for reaction energies, barriers, structures, and properties that matter for chemistry, materials, and catalysis.
DFT is the workhorse of computational chemistry, but its predictive power is bottlenecked by the unknown exchange-correlation functional. Climbing Jacob's ladder usually buys accuracy by paying more compute. The result is a functional zoo—one specialized tool per domain.
Our long-run aim is to make that zoo obsolete. Skala is our bet that a single deep-learning functional can be the right choice for any system or property. Today's results are a major step toward that goal, not the destination.
The core idea: instead of leaning on increasingly expensive hand-designed non-local ingredients, Skala learns non-local electronic representations directly from the electron density with a scalable neural architecture, at semi-local DFT cost.
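To make the contrast concrete: a semi-local functional evaluates the XC energy as a pointwise function of the density (and possibly its derivatives) integrated on a numerical grid. The sketch below shows that structure for textbook LDA exchange, which is standard DFT and not Skala itself; the function name and toy grid are illustrative choices of ours. Skala keeps this cheap grid-integration pattern but replaces the hand-designed integrand with a neural model fed by learned non-local features of the density.

```python
import numpy as np

def lda_exchange_energy(rho, weights):
    """Semi-local XC pattern: E_xc = sum_i w_i * e_xc(rho_i).

    Here e_xc is the LDA exchange energy density,
    e_x(rho) = -(3/4) * (3/pi)**(1/3) * rho**(4/3).
    A learned functional keeps this grid sum but swaps the
    hand-written integrand for a neural model with richer,
    non-local inputs.
    """
    c_x = -(3.0 / 4.0) * (3.0 / np.pi) ** (1.0 / 3.0)
    return float(np.sum(weights * c_x * rho ** (4.0 / 3.0)))

# Toy check: a uniform density rho0 in a box of volume V,
# where the integral has the closed form c_x * rho0**(4/3) * V.
rho0, volume, npts = 0.3, 8.0, 1000
rho = np.full(npts, rho0)
weights = np.full(npts, volume / npts)  # uniform quadrature weights
energy = lda_exchange_energy(rho, weights)
```

Because the cost is one pass over the grid (plus, for Skala, a neural evaluation per point), the scaling stays in the semi-local regime rather than the quartic-in-basis-size territory of hybrids.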
The model is trained on about 400k accurate energy differences spanning atomization energies, conformers, affinities, reaction pathways, and non-covalent interactions. That's why Skala performs strongly across main-group chemistry rather than only on one narrow benchmark.
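Training on energy differences rather than absolute energies can be sketched as follows. This is an illustrative toy objective of our own construction, not Skala's actual training code: each reaction is a stoichiometric combination of system energies, and the loss penalizes deviation from a high-accuracy reference value.

```python
import numpy as np

def energy_difference_loss(pred_energies, reactions, ref_deltas):
    """Mean-squared loss on energy differences.

    pred_energies: (n_systems,) model total energies
    reactions: list of (indices, stoichiometric coefficients),
        so Delta E = sum_i c_i * E[indices[i]]
    ref_deltas: reference differences from wavefunction methods
    Illustrative sketch only, not Skala's actual objective.
    """
    deltas = np.array([
        np.dot(pred_energies[idx], coeffs)
        for idx, coeffs in reactions
    ])
    return float(np.mean((deltas - np.asarray(ref_deltas)) ** 2))

# Hypothetical reaction A + B -> AB with reference Delta E = -0.5,
# given model energies for [A, B, AB]:
pred = np.array([-1.0, -1.5, -3.1])
reactions = [([0, 1, 2], np.array([-1.0, -1.0, 1.0]))]
loss = energy_difference_loss(pred, reactions, [-0.5])
```

Framing the targets as differences is what lets one model absorb atomization energies, barriers, conformer gaps, and non-covalent interactions in a single objective.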
It's not just about energies, either. We now show accurate dipole moments, hybrid-level equilibrium geometries, XC integration costs that stay in the semi-local DFT regime, and practical paths into production codes, directions shaped in part by feedback from the DFT community.
The model is available now on GitHub, PyPI and conda-forge. We're already working to bring Skala to major DFT codes, including Psi4 and CP2K, through GauXC and direct code integrations.
If you try Skala on a system you care about, let us know what you find.
arXiv: 2506.14665