PicchiEv/DPO-m2-beta03
Safetensors · qwen3
Commit History
Upload 11 files
b4682a0 (verified) · PicchiEv committed on May 25, 2025
initial commit
a0adb1f (verified) · PicchiEv committed on May 25, 2025