The Diffusion Duality (DUO)
To use the pre-trained model for masked language modeling, use the following snippet:
from transformers import AutoModelForMaskedLM, AutoTokenizer

# See the `DUO` collection page on the hub for a list of available models.
tokenizer = AutoTokenizer.from_pretrained('gpt2')
model = AutoModelForMaskedLM.from_pretrained('s-sahoo/duo-distilled')
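As a quick sanity check, a forward pass produces per-token logits over the vocabulary. This is a minimal sketch, assuming the checkpoint follows the standard `AutoModelForMaskedLM` interface shown above (the prompt string is an arbitrary example, and if the hub repo ships custom modeling code you may additionally need `trust_remote_code=True`):

import torch

# Hypothetical usage sketch: tokenize a prompt and run a single forward
# pass with the tokenizer and model loaded above.
inputs = tokenizer('The diffusion duality connects discrete and', return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# One logit vector per input position, over the tokenizer's vocabulary.
print(logits.shape)  # (1, sequence_length, vocab_size)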
For a hands-on example, check out this Colab notebook. For more information and implementation details, visit our GitHub repository: DUO.
The model has a context length of 1024 and is similar in size to GPT2-medium, with approximately 130 million non-embedding parameters. It was trained for 1M steps on the OpenWebText corpus.
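Because the context length is 1024 tokens, longer inputs should be truncated before a forward pass. A minimal sketch using the tokenizer's standard truncation arguments (the text here is a placeholder, not from the original card):

# Hypothetical example: truncate input to the model's 1024-token context.
long_text = 'word ' * 5000  # placeholder text longer than the context window
inputs = tokenizer(long_text, return_tensors='pt', truncation=True, max_length=1024)
print(inputs['input_ids'].shape)  # at most (1, 1024)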
For more details, please see our paper: The Diffusion Duality.
Project page: https://s-sahoo.com/duo
Please cite our work using the BibTeX entry below:
@inproceedings{sahoo2025the,
  title={The Diffusion Duality},
  author={Subham Sekhar Sahoo and Justin Deschenaux and Aaron Gokaslan and Guanghan Wang and Justin T Chiu and Volodymyr Kuleshov},
  booktitle={Forty-second International Conference on Machine Learning},
  year={2025},
  url={https://openreview.net/forum?id=9P9Y8FOSOk}
}
Subham Sekhar Sahoo ([email protected])