OpenMOSE's Collections
hxa079 RWKV-Transformer Hybrid series

updated 6 days ago

The HXA079 family of hybrid models combines RWKV recurrent architectures with Transformer-based attention, designed for efficient long-context inference.
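To make the combination concrete, here is a minimal, illustrative numpy sketch contrasting the two token-mixing styles these hybrids interleave: an RWKV-style recurrence that carries a fixed-size state in O(T) time, and standard causal softmax attention with O(T²) pairwise scores. This is a toy assumption for intuition only — the functions and the scalar `decay` parameter are hypothetical and do not reproduce the actual hxa079 layer math.

```python
import numpy as np

def rwkv_like_mix(x, decay):
    # RWKV-style recurrence: a single fixed-size state updated per token,
    # so cost is linear in sequence length T. (Toy version; real RWKV
    # time-mixing uses learned, channel-wise decays and gating.)
    T, D = x.shape
    state = np.zeros(D)
    out = np.empty_like(x)
    for t in range(T):
        state = decay * state + x[t]
        out[t] = state
    return out

def causal_attention(q, k, v):
    # Standard Transformer attention: all-pairs scores with a causal mask,
    # so cost grows quadratically with sequence length T.
    T, D = q.shape
    scores = q @ k.T / np.sqrt(D)
    mask = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[mask] = -np.inf
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

rng = np.random.default_rng(0)
T, D = 8, 4
x = rng.standard_normal((T, D))
h = rwkv_like_mix(x, decay=0.9)   # recurrent (RWKV-style) block
y = causal_attention(h, h, h)     # attention (Transformer-style) block
print(h.shape, y.shape)  # (8, 4) (8, 4)
```

A hybrid stack alternates layers of each kind: the recurrent layers keep per-token cost and memory constant as context grows, while the interleaved attention layers retain precise token-to-token retrieval.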


  • OpenMOSE/RWKV-Seed-OSS-36B-hxa079

    37B • Updated about 1 month ago • 4 • 3

  • OpenMOSE/RWKV-Qwen3-30B-A3B-2507-Instruct-hxa079

    31B • Updated Sep 29 • 4

  • OpenMOSE/RWKV-Qwen3-32B-hxa079-High

    34B • Updated 21 days ago • 182 • 2

  • OpenMOSE/RWKV-Qwen3-32B-hxa079-Low

    34B • Updated 18 days ago • 37

  • OpenMOSE/RWKV-Qwen3-32B-Hybrid-GGUF

    34B • Updated 5 days ago • 2.93k • 1

  • OpenMOSE/RWKV-Qwen3-32B-Hybrid-High-GGUF

    34B • Updated 7 days ago • 99

  • OpenMOSE/RWKV-Qwen3-15B-hxa079

    15B • Updated Sep 24 • 5 • 2

  • OpenMOSE/RWKV-Qwen3-30B-A3B-2507-Instruct-Hybrid-GGUF

    31B • Updated 5 days ago • 536 • 1