---
tags:
- datasets
- discord
- chatml
- conversation
- dialogue
- multi-turn
- single-turn
- fine-tuning
- reward-model
- llm-training
- chat-dataset
- open-source
- anonymized-data
- slang
- casual-dialogue
license: apache-2.0
language:
- en
pretty_name: Discord‑OpenMicae
size_categories:
- 100K<n<1M
---
> **Discord-OpenMicae** is a dataset of anonymized Discord conversations from late spring to late summer 2025 for training and evaluating conversational AI models in a ChatML-friendly format.
- **250k+ Single-Turn Exchanges (STX)** – standalone user → reply pairs
- **100k+ Multi-Turn Chains** – two-participant reply chains, variable length
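Both subsets follow the same ChatML-style turn structure. As a quick illustration (the `to_chatml` helper and the field layout here are hypothetical, not the dataset's actual schema), a single exchange renders like this:

```python
def to_chatml(turns):
    """Render a list of (role, text) turns as a ChatML string
    using <|im_start|>/<|im_end|> delimiters."""
    return "".join(
        f"<|im_start|>{role}\n{text}<|im_end|>\n" for role, text in turns
    )

# A single-turn exchange (STX): one user message, one reply.
stx = [("user", "anyone tried the new patch yet?"),
       ("assistant", "yeah, load times feel way better")]
print(to_chatml(stx))
```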
---
Nomic Atlas Map
---
## Features
- Human-only dialogues (no bots)
- Links, embeds, and commands removed
- Trading posts, code blocks, and LFG removed
- Two-author chains only
- Consecutive self-replies from the same author merged into a single message
- Cleaned and deduplicated for relevance
- Primarily English, with some other languages present
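The self-reply merge from the list above can be sketched as collapsing consecutive messages by the same author (a minimal illustration, not the actual cleaning pipeline):

```python
def merge_self_replies(messages):
    """Collapse consecutive messages from the same author into one.

    `messages` is a list of (author_id, text) tuples in chronological order.
    """
    merged = []
    for author, text in messages:
        if merged and merged[-1][0] == author:
            # Same author as the previous message: append to it.
            merged[-1] = (author, merged[-1][1] + "\n" + text)
        else:
            merged.append((author, text))
    return merged

chain = [("a1", "wait"), ("a1", "actually nvm"), ("a2", "lol ok")]
print(merge_self_replies(chain))  # a1's two messages collapse into one
```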
## Use
- Fine-tuning conversational models
- Training relevance/reward models
- Dialogue generation research
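For relevance/reward-model training, one common setup is to pair each prompt with its real reply (positive) and a reply drawn from another conversation (negative). A minimal sketch, assuming each sample exposes plain `prompt`/`response` strings (hypothetical field names):

```python
import random

def make_reward_pairs(samples, seed=0):
    """Build (prompt, chosen, rejected) triples: the real reply is
    'chosen'; a reply shuffled in from another sample is 'rejected'."""
    rng = random.Random(seed)
    rejected = [s["response"] for s in samples]
    rng.shuffle(rejected)
    return [
        {"prompt": s["prompt"], "chosen": s["response"], "rejected": r}
        for s, r in zip(samples, rejected)
        if s["response"] != r  # drop accidental self-pairs
    ]

samples = [{"prompt": "u up?", "response": "barely"},
           {"prompt": "gg", "response": "gg wp"}]
print(make_reward_pairs(samples))
```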
## Dataset
| Subset | Samples | Description |
|--------|-----------|---------------------------------------|
| STX | 260,670 | Single-turn prompt/response pairs |
| Chains | 101,480 | Multi-turn conversations (2 authors) |
## Text Statistics
| Metric | Value |
|-----------------------|------------:|
| Samples (count) | 362,150 |
| Min length (tokens) | 24 |
| Max length (tokens) | 106 |
| Mean length (tokens) | 61.96 |
| Median length (tokens)| 59 |
| Std dev (tokens) | 14.62 |
- **Total tokens:** 22,437,828 (using the [Hermes-3-Llama-3.2-3B tokenizer](https://huggingface.co/NousResearch/Hermes-3-Llama-3.2-3B))
- **Total characters:** 106,956,446
- **Total words:** 14,950,203
- **Assistant blocks:** 480,917
### Length Distribution (tokens)
| Bin (tokens) | Count |
|--------------|--------:|
| 31–38 | 19,953 |
| 39–46 | 21,765 |
| 47–54 | 76,180 |
| 55–62 | 99,760 |
| 63–70 | 60,461 |
| 71–78 | 36,277 |
| 79–86 | 21,161 |
| 87–94 | 14,873 |
| 95–102 | 9,614 |
| 103–110 | 2,721 |
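The distribution above uses fixed-width 8-token bins. A sketch of how such a histogram can be built from per-sample token counts (whitespace splitting stands in for the Hermes tokenizer here):

```python
from collections import Counter

def bin_counts(lengths, lo=31, width=8):
    """Count lengths into fixed-width bins starting at `lo`
    (e.g. 31-38, 39-46, ...); lengths below `lo` are skipped."""
    bins = Counter()
    for n in lengths:
        if n < lo:
            continue
        start = lo + ((n - lo) // width) * width
        bins[(start, start + width - 1)] += 1
    return dict(sorted(bins.items()))

lengths = [len(t.split()) for t in ["a " * 40, "b " * 52, "c " * 55]]
print(bin_counts(lengths))  # {(39, 46): 1, (47, 54): 1, (55, 62): 1}
```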
## License
This project is licensed under the Apache License 2.0.
## How to cite

```bibtex
@misc{discord-openmicae,
  title  = {Discord-OpenMicae},
  author = {mookiezi},
  year   = {2025},
  url    = {https://huggingface.co/datasets/mookiezi/Discord-OpenMicae}
}
```
## Related
- [mookiezi/Discord-Dialogues](https://huggingface.co/datasets/mookiezi/Discord-Dialogues)
- [mookiezi/Discord-Micae-Hermes-3-3B](https://huggingface.co/mookiezi/Discord-Micae-Hermes-3-3B)
- [NousResearch/Hermes-3-Llama-3.2-3B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.2-3B)
## Disclaimer
All data was collected in accordance with Discord's Terms of Service.
[micae](https://20000.online/micae) · [openmicae](https://20000.online/openmicae) · [discord-dialogues](https://20000.online/discord-dialogues)