Abstract
Classical diffusion models typically rely on isotropic Gaussian noise, treating all regions uniformly and overlooking structural information that may be vital for high-quality generation. We introduce an edge-preserving diffusion process that generalizes isotropic models through a hybrid noise scheme. At its core is an edge-aware scheduler that transitions smoothly from edge-preserving to isotropic noise, allowing the model to capture fine structural details while generally maintaining global performance. To measure the impact of structure-aware noise on the generative process, we analyze and evaluate our edge-preserving process against isotropic models in both diffusion and flow-matching frameworks. Importantly, we show that existing isotropic models can be efficiently fine-tuned with edge-preserving noise, making our approach practical for adapting pre-trained systems. Beyond improvements in unconditional generation, it offers significant benefits in structure-guided tasks such as stroke-to-image synthesis, improving robustness, fidelity, and perceptual quality. Extensive evaluations (FID, KID, CLIP-score) show consistent improvements of up to 30%, highlighting edge-preserving noise as a simple yet powerful advance for generative diffusion, particularly in structure-guided settings.
On the left, a classic isotropic diffusion process (top row) is compared to our hybrid edge-aware diffusion process (middle row). We propose a hybrid noise (bottom row) that transitions progressively from anisotropic (t = 0) to isotropic noise (t = 499); we use this edge-aware noise for both training and inference. On the right, we compare both noise schemes within the SDEdit framework (Meng et al., 2022) for stroke-based image generation. Our model consistently outperforms DDPM’s isotropic scheme, is more robust to visual artifacts, and produces sharper outputs without losing structural detail.
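The caption above can be sketched in code. The snippet below is a minimal illustration, not the paper's exact method: the finite-difference edge detector and the linear blend weight `tau` are our own assumptions, standing in for whatever edge measure and schedule the model actually uses. The key idea it demonstrates is that noise is attenuated near strong edges at t = 0 and becomes plain isotropic Gaussian noise as t approaches the final step.

```python
import numpy as np

def edge_map(img):
    """Per-pixel edge strength via finite differences, normalized to [0, 1].
    A simple stand-in for the paper's (unspecified here) edge detector."""
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]   # horizontal gradient
    gy[:-1, :] = img[1:, :] - img[:-1, :]   # vertical gradient
    mag = np.sqrt(gx**2 + gy**2)
    return mag / (mag.max() + 1e-8)

def hybrid_noise(img, t, T=500, rng=None):
    """Sample hybrid noise: edge-aware (anisotropic) at t=0, purely
    isotropic at t=T-1. The linear schedule tau = t/(T-1) is an
    illustrative assumption."""
    rng = np.random.default_rng(rng)
    eps = rng.standard_normal(img.shape)    # isotropic Gaussian noise
    edges = edge_map(img)
    # Edge-aware variant: damp noise near strong edges so structure survives.
    aniso = eps * (1.0 - 0.9 * edges)
    tau = t / (T - 1)                       # 0 at t=0, 1 at t=T-1
    return (1.0 - tau) * aniso + tau * eps
```

At t = 499 (with T = 500) the blend weight is exactly 1, so the sample reduces to the standard isotropic noise used by DDPM, matching the caption's description of the schedule.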
BibTex reference
@article{vandersanden2024edge,
  title={Edge-preserving noise for diffusion models},
  author={Vandersanden, Jente and Holl, Sascha and Huang, Xingchang and Singh, Gurprit},
  year={2024}
}