Differentiable OFDFT
Automatic differentiation meets orbital-free density functional theory
Automatic differentiation (AD) is an indispensable utility of modern machine learning libraries. In the realm of scientific computing, AD has been leveraged to accelerate prototyping and unlock new opportunities previously obscured by the need for tedious and, at times, intractably complicated pen-and-paper derivatives. The PROFESS-AD software (Tan et al., 2023) is one such example, where I applied AD to the field of orbital-free density functional theory (OFDFT).
OFDFT is an approximate route to the quantum mechanical results of the more computationally demanding Kohn-Sham DFT (KSDFT), which does involve orbitals, at a fraction of the cost: OFDFT scales linearly with the number of atoms with a small prefactor, while conventional KSDFT scales cubically. Central to OFDFT is the energy density functional, which maps the electron density, a scalar field in 3D space (often defined on a grid), to a single scalar energy. The ground state of the atomistic system in question is found variationally: the ground state electron density is the one that minimizes the energy functional, reminiscent of the loss function to be minimized in typical machine learning problems. PROFESS-AD leverages this link, building its engine for orbital-free density optimizations on the AD capabilities and gradient-based optimization utilities of PyTorch. A toy model is shown below to illustrate the density optimization process.
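The variational idea can be sketched in a few lines of PyTorch. This is a minimal toy, not PROFESS-AD's actual API: a 1D density in a harmonic external potential, with a Thomas-Fermi-style kinetic term standing in for a real kinetic energy functional. The grid, potential, and parameterization below are illustrative assumptions; the point is that AD supplies the gradient of the energy with respect to the density parameters, so a standard optimizer can drive the density to its variational minimum.

```python
import torch

# Toy 1D orbital-free density optimization (illustrative sketch only;
# this is NOT the PROFESS-AD API, just the same variational idea).
torch.manual_seed(0)

N_ELEC = 2.0                       # number of electrons (normalization)
x = torch.linspace(-5.0, 5.0, 201)
dx = (x[1] - x[0]).item()
v_ext = 0.5 * x**2                 # toy harmonic external potential

# Unconstrained parameters; softmax keeps the density positive and
# normalized to N_ELEC by construction.
theta = torch.zeros_like(x, requires_grad=True)

def density(theta):
    return N_ELEC * torch.softmax(theta, dim=0) / dx

def energy(n):
    # Thomas-Fermi kinetic term (3D coefficient, used here for illustration)
    c_tf = 0.3 * (3.0 * torch.pi**2) ** (2.0 / 3.0)
    t_tf = c_tf * torch.sum(n ** (5.0 / 3.0)) * dx
    e_ext = torch.sum(v_ext * n) * dx
    return t_tf + e_ext

opt = torch.optim.Adam([theta], lr=0.05)
for step in range(500):
    opt.zero_grad()
    e = energy(density(theta))
    e.backward()                   # AD supplies dE/dtheta automatically
    opt.step()

n_final = density(theta).detach()
print(f"Final energy: {energy(n_final).item():.4f}")
print(f"Electron count: {(n_final.sum() * dx).item():.4f}")
```

The same pattern, a density parameterized to satisfy its physical constraints, a scalar energy functional, and a gradient-based optimizer, carries over to the real 3D, plane-wave setting, where hand-deriving the functional derivatives would otherwise be the bottleneck.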