From the Graph Machine Learning Telegram Group
The breakthrough in Denoising Diffusion Probabilistic Models (DDPMs) happened about 2 years ago. Since then, we have seen dramatic improvements in generation tasks: GLIDE, DALL-E 2, and the recent Imagen for images, Diffusion-LM in language modeling, diffusion for video sequences, and even diffusion for reinforcement learning.
Diffusion might be the biggest trend in GraphML in 2022 - particularly when applied to drug discovery, molecule and conformer generation, and quantum chemistry in general. These models are often paired with the latest advancements in equivariant GNNs. Recent cool works you'd want to take a look at include:
- Equivariant Diffusion for Molecule Generation in 3D (Hoogeboom et al, ICML 2022)
- Generative Coarse-Graining of Molecular Conformations (Wang et al, ICML 2022)
- GeoDiff: a Geometric Diffusion Model for Molecular Conformation Generation (Xu et al, ICLR 2022)
- Torsional Diffusion for Molecular Conformer Generation (Jing and Corso et al, 2022)
Where can you learn more about DDPMs and their (quite advanced) mathematics? Luckily, there is a good batch of new educational blog posts with step-by-step illustrations of the diffusion process and its implementation - try them out! A minimal sketch of the forward (noising) step follows the list below.
- The Annotated Diffusion Model by Niels Rogge and Kashif Rasul (HuggingFace)
- Improving Diffusion Models as an Alternative To GANs by Arash Vahdat and Karsten Kreis (NVIDIA)
- What are Diffusion Models? by Lilian Weng (OpenAI)
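To give a concrete flavor of what these posts walk through: the DDPM forward (noising) process has a closed form, q(x_t | x_0) = N(sqrt(ᾱ_t) x_0, (1 - ᾱ_t) I), so any x_t can be sampled directly from x_0. Below is a minimal NumPy sketch of that step, assuming the linear beta schedule from the original DDPM paper; the function and variable names are illustrative choices, not taken from any of the posts above.

```python
import numpy as np

# Linear beta schedule; the values (1e-4 to 0.02 over T=1000 steps) follow the original DDPM setup
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)      # \bar{alpha}_t = prod_{s<=t} (1 - beta_s)

def q_sample(x0, t, rng):
    """Draw x_t ~ q(x_t | x_0) = N(sqrt(abar_t) * x_0, (1 - abar_t) * I)."""
    noise = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * noise
    return x_t, noise                # a denoiser is trained to predict `noise` from (x_t, t)

rng = np.random.default_rng(0)
x0 = rng.standard_normal((8, 3))     # e.g., 3D coordinates of a toy 8-atom "molecule"
x_t, eps = q_sample(x0, t=500, rng=rng)
```

Training a diffusion model then amounts to regressing a neural network (an equivariant GNN, in the molecular works above) onto the added noise `eps`, and generation runs the learned denoising steps in reverse from pure Gaussian noise.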