Multiscale Machine-Learned Interatomic Potentials (MMLIPs)

Project Summary

All-atom, GNN-based Machine-Learned Interatomic Potentials (MLIPs) are highly accurate but scale poorly beyond a few million atoms, making large-scale biomolecular simulations computationally prohibitive. This project aims to bridge the atomic and coarse-grained scales while retaining physical fidelity.
To that end, it proposes a new family of foundation interatomic potentials, Multiscale MLIPs, which extend existing E(3)-equivariant models. The core idea is to learn representations not only for atoms, but also for coarse-grained primitives such as small molecules or molecular complexes. The model operates on a molecular graph of mixed granularity, in which each primitive carries its own degrees of freedom (for example center-of-mass position, spatial orientation, stress tensor, charge distribution or dipole moment). The student will design and implement novel (equivariant) message-passing networks that learn interactions both within and between these scales, aiming to retain physical fidelity while dramatically reducing computational cost.
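To make the mixed-granularity idea concrete, below is a minimal, hypothetical sketch of a single message-passing step over a graph containing both atom nodes and coarse-grained (CG) primitive nodes. The class name MixedGranularityMP, the edge-type keys and all dimensions are illustrative assumptions, not part of the project; for brevity the messages use only rotation-invariant pairwise distances rather than a full E(3)-equivariant formulation (which would typically build on a library such as e3nn, as in the NequIP model from the reading list).

```python
# Hypothetical sketch of mixed-granularity message passing (not the project's
# actual architecture). Atom and CG nodes share one feature tensor; separate
# message functions are used for each (sender type, receiver type) pair.
import torch
import torch.nn as nn


class MixedGranularityMP(nn.Module):
    """One message-passing step over a graph with atom and CG-primitive nodes."""

    def __init__(self, hidden_dim: int = 64):
        super().__init__()
        # One message MLP per edge type: atom->atom, atom->CG, CG->atom, CG->CG.
        self.msg_mlps = nn.ModuleDict({
            key: nn.Sequential(nn.Linear(2 * hidden_dim + 1, hidden_dim), nn.SiLU(),
                               nn.Linear(hidden_dim, hidden_dim))
            for key in ("atom_atom", "atom_cg", "cg_atom", "cg_cg")
        })
        self.update = nn.Sequential(nn.Linear(2 * hidden_dim, hidden_dim), nn.SiLU(),
                                    nn.Linear(hidden_dim, hidden_dim))

    def forward(self, h, pos, edges, edge_types):
        # h: (N, hidden_dim) node features; pos: (N, 3) positions
        # edges: (E, 2) sender/receiver indices; edge_types: list of E keys above
        agg = torch.zeros_like(h)
        for k, (s, r) in enumerate(edges.tolist()):
            dist = (pos[s] - pos[r]).norm(dim=-1, keepdim=True)   # invariant feature
            m = self.msg_mlps[edge_types[k]](torch.cat([h[s], h[r], dist]))
            agg[r] = agg[r] + m                                   # sum aggregation
        return h + self.update(torch.cat([h, agg], dim=-1))       # residual update


# Toy usage: 3 atoms plus 1 coarse-grained primitive (e.g. a small molecule).
h = torch.randn(4, 64)
pos = torch.randn(4, 3)
edges = torch.tensor([[0, 1], [1, 0], [2, 3], [3, 2]])
edge_types = ["atom_atom", "atom_atom", "atom_cg", "cg_atom"]
out = MixedGranularityMP()(h, pos, edges, edge_types)
print(out.shape)  # torch.Size([4, 64])
```

Keeping a distinct message function per scale pair lets the network learn different interaction physics within and between scales; a production implementation would vectorise the edge loop with scatter operations and carry equivariant (tensor-valued) features alongside the invariant ones.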

Potential Supervisors

  • Dr Danilo Jimenez Rezende (Principal Scientist and Head of AI Research, EIT)
  • Additional Supervisor(s) from the University of Oxford

Skills Recommended

  • Strong mathematical and physics background
  • Experience with graph neural networks and deep learning
  • Proficiency in Python and a deep learning framework such as PyTorch or JAX
  • A basic understanding of quantum mechanics

Skills to be Developed

  • Designing novel neural network architectures
  • Multiscale modelling
  • Advanced ML for scientific simulation (ML4Sci)
  • Developing next-generation ML interatomic potentials

University DPhil Course(s) 

Relevant Background Reading

  • Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E. and Kozinsky, B., 2022. E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications, 13(1), p.2453.
  • Schütt, K.T., Sauceda, H.E., Kindermans, P.J., Tkatchenko, A. and Müller, K.R., 2018. SchNet – A deep learning architecture for molecules and materials. The Journal of Chemical Physics, 148(24).
  • Husic, B.E., Charron, N.E., Lemm, D., Wang, J., Pérez, A., Majewski, M., Krämer, A., Chen, Y., Olsson, S., De Fabritiis, G. and Noé, F., 2020. Coarse graining molecular dynamics with graph neural networks. The Journal of Chemical Physics, 153(19).