sumo43/SOLAR-10.7B-Instruct-DPO-v2.0
Branch: main (1 contributor, 1 commit, 1.52 kB total)
Latest commit: 0f40421 "initial commit" by sumo43, almost 2 years ago

File            Size     Last commit     Committed
.gitattributes  1.52 kB  initial commit  almost 2 years ago