sumo43/SOLAR-10.7B-Instruct-DPO-v2.0

  • 1 contributor
History: 1 commit
sumo43: initial commit (0f40421, almost 2 years ago)
  • .gitattributes (1.52 kB): initial commit, almost 2 years ago