Efficient MoE-based LLM (Collection): Mixture-of-Experts Large Language Models with Advanced Quantization • 5 items • Updated 20 days ago • 23
nota-ai/Solar-Open-100B-NotaMoEQuant-NVFP4 (Text Generation) • 59B params • Updated 21 days ago • 244 • 4
Efficient Large Vision-Language Model (Collection): ERGO, an LVLM trained with RL on efficiency objectives; https://github.com/nota-github/ERGO • 3 items • Updated Feb 22 • 25
Compressed Stable Diffusion (Space, running): Compare image generation results from original and compressed AI models • 116