Add pipeline tag, paper description, GitHub link, datasets, pretrained models, and usage example.

#6 by nielsr (HF Staff) - opened

Files changed (1): README.md (+62 -1)

README.md CHANGED
---
datasets:
- bayes-group-diffusion/GAS-teachers
license: mit
tags:
- arxiv:2510.17699
pipeline_tag: unconditional-image-generation
---

# GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver

This repository contains the implementation and models for the paper [GAS: Improving Discretization of Diffusion ODEs via Generalized Adversarial Solver](https://huggingface.co/papers/2510.17699).

While diffusion models achieve state-of-the-art generation quality, they still suffer from computationally expensive sampling. Recent works address this issue with gradient-based optimization methods that distill a few-step ODE diffusion solver from the full sampling process, reducing the number of function evaluations from dozens to just a few. However, these approaches often rely on intricate training techniques and do not explicitly focus on preserving fine-grained details. In this paper, we introduce the **Generalized Solver (GS)**: a simple parameterization of the ODE sampler that does not require additional training tricks and improves quality over existing approaches. We further combine the original distillation loss with adversarial training, which mitigates artifacts and enhances detail fidelity. We call the resulting method the **Generalized Adversarial Solver (GAS)** and demonstrate its superior performance compared to existing solver training methods.
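
The description above combines a distillation loss with an adversarial term. A generic sketch of such a combined objective (the symbols and the weighting $\lambda$ are illustrative assumptions, not notation from the paper):

$$\mathcal{L}_{\text{GAS}} = \mathcal{L}_{\text{distill}} + \lambda\,\mathcal{L}_{\text{adv}}$$

Here $\mathcal{L}_{\text{distill}}$ matches the few-step student solver to the full teacher sampling process, and $\mathcal{L}_{\text{adv}}$ is a GAN-style discriminator loss that penalizes artifacts in the generated images.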

Code: https://github.com/3145tttt/GAS

## Datasets

The teacher data is available on the [Hugging Face Hub](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers). For Stable Diffusion, we provide datasets with 6,000 and 30,000 samples for each teacher NFE. The datasets and their links are listed below:

| Dataset | Hugging Face Hub |
| :-- | :-- |
| CIFAR-10 | [50k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/edm/cifar10/dataset.pkl) |
| FFHQ | [50k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/edm/ffhq/dataset.pkl) |
| AFHQv2 | [50k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/edm/afhqv2/dataset.pkl) |
| LSUN-Bedrooms | [50k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/ldm/lsun_beds256/dataset.pkl) |
| ImageNet | [50k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/ldm/cin256-v2/dataset.pkl) |
| Stable Diffusion | NFE=5: [6k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=5/dataset_6k.pkl), [30k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=5/dataset_30k.pkl); <br> NFE=6: [6k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=6/dataset_6k.pkl), [30k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=6/dataset_30k.pkl); <br> NFE=7: [6k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=7/dataset_6k.pkl), [30k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=7/dataset_30k.pkl); <br> NFE=8: [6k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=8/dataset_6k.pkl), [30k samples](https://huggingface.co/datasets/bayes-group-diffusion/GAS-teachers/blob/main/sd-v1/nfe=8/dataset_30k.pkl) |
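
The pickles above can also be fetched programmatically. A minimal sketch using `huggingface_hub`, assuming the repository layout shown in the table; the `dataset_path` helper is hypothetical, not part of the released code:

```python
# Sketch: download one of the GAS teacher dataset pickles from the Hub.
# The dataset_path helper is hypothetical; it mirrors the repo layout above.
import pickle

REPO_ID = "bayes-group-diffusion/GAS-teachers"

def dataset_path(family, name, nfe=None, size=None):
    """Build the in-repo path of a teacher dataset pickle."""
    if family == "sd-v1":
        # Stable Diffusion data is split by teacher NFE and sample count.
        return f"sd-v1/nfe={nfe}/dataset_{size}.pkl"
    return f"{family}/{name}/dataset.pkl"

if __name__ == "__main__":
    # Requires `pip install huggingface_hub`; the files are large.
    from huggingface_hub import hf_hub_download

    local = hf_hub_download(
        REPO_ID, dataset_path("edm", "cifar10"), repo_type="dataset"
    )
    with open(local, "rb") as f:
        teacher_data = pickle.load(f)
    print(type(teacher_data))
```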

## Pre-trained models

Pre-trained **GS** and **GAS** checkpoints are available on the [Hugging Face Hub](https://huggingface.co/bayes-group-diffusion/GAS-students). The datasets and their links are listed below:

| Dataset | Hugging Face Hub |
| :-- | :-- |
| CIFAR-10 | [link](https://huggingface.co/bayes-group-diffusion/GAS-students/tree/main/edm/cifar10) |
| FFHQ | [link](https://huggingface.co/bayes-group-diffusion/GAS-students/tree/main/edm/ffhq) |
| AFHQv2 | [link](https://huggingface.co/bayes-group-diffusion/GAS-students/tree/main/edm/afhqv2) |
| LSUN-Bedrooms | [link](https://huggingface.co/bayes-group-diffusion/GAS-students/tree/main/ldm/lsun_beds256) |
| ImageNet | [link](https://huggingface.co/bayes-group-diffusion/GAS-students/tree/main/ldm/cin256-v2) |
| Stable Diffusion | [link](https://huggingface.co/bayes-group-diffusion/GAS-students/tree/main/sd-v1) |
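
A checkpoint folder can likewise be mirrored locally with `huggingface_hub`. A minimal sketch, assuming the per-dataset folder layout shown in the table above; the `checkpoint_pattern` helper is illustrative:

```python
# Sketch: mirror one dataset's GS/GAS checkpoint folder from the Hub.
def checkpoint_pattern(family, name):
    """Glob pattern selecting a single dataset's checkpoint files."""
    return f"{family}/{name}/*"

if __name__ == "__main__":
    # Requires `pip install huggingface_hub`.
    from huggingface_hub import snapshot_download

    local_dir = snapshot_download(
        "bayes-group-diffusion/GAS-students",
        allow_patterns=[checkpoint_pattern("edm", "cifar10")],
    )
    print(local_dir)
```

The downloaded files can then be passed to the `--checkpoint_path` flag of `generate.py`.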

## Usage

### Generating Images

To generate a batch of images using a teacher solver, run:

```bash
# Generate 64 images and save them as out/*.png
python generate.py --config=configs/edm/cifar10.yaml \
    --outdir=out \
    --seeds=00000-63 \
    --batch=64
```

To generate images from a trained GS checkpoint:

```bash
# Generate 50000 images using 2 GPUs and a checkpoint from checkpoint_path
torchrun --standalone --nproc_per_node=2 generate.py \
    --config=configs/edm/cifar10.yaml \
    --outdir=data/teachers/cifar10 \
    --seeds=50000-99999 \
    --batch=1024 \
    --steps=4 \
    --checkpoint_path=checkpoint_path
```

For more details on setup, training, and evaluation, please refer to the [official GitHub repository](https://github.com/3145tttt/GAS).

## Citation

```bibtex