| Model | Task | Params | Updated | Downloads | Likes |
|---|---|---|---|---|---|
| laion/CLIP-ViT-B-32-laion2B-s34B-b79K | Zero-Shot Image Classification | 0.2B | Jan 22 | 1.69M | 134 |
| laion/CLIP-ViT-L-14-laion2B-s32B-b82K | Zero-Shot Image Classification | 0.4B | Jan 16, 2024 | 576k | 60 |
| laion/CLIP-ViT-H-14-laion2B-s32B-b79K | Zero-Shot Image Classification | 1.0B | Jan 22 | 588k | 429 |
| Bingsu/clip-vit-base-patch32-ko | Zero-Shot Image Classification | 0.2B | Nov 8, 2022 | 1.95k | 7 |
| IDEA-CCNL/Taiyi-CLIP-RoBERTa-102M-ViT-L-Chinese | Feature Extraction | 0.1B | May 25, 2023 | 47 | 19 |
| Bingsu/clip-vit-large-patch14-ko | Zero-Shot Image Classification | 0.4B | Nov 18, 2022 | 19.3k | 17 |
| mattmdjaga/clip-vit-base-patch32_handler | Zero-Shot Image Classification | 0.2B | Aug 28, 2023 | 20 | — |
| lyua1225/clip-huge-zh-75k-steps-bs4096 | Zero-Shot Image Classification | — | Dec 16, 2022 | 26 | 18 |
| laion/CLIP-convnext_base_w-laion2B-s13B-b82K | Zero-Shot Image Classification | — | Apr 18, 2023 | 3.57k | 4 |
| laion/CLIP-convnext_base_w-laion_aesthetic-s13B-b82K | Zero-Shot Image Classification | — | Apr 18, 2023 | 317 | 5 |
| laion/CLIP-convnext_base_w_320-laion_aesthetic-s13B-b82K | Zero-Shot Image Classification | — | Apr 18, 2023 | 2.31k | 3 |
| laion/CLIP-convnext_base_w-laion2B-s13B-b82K-augreg | Zero-Shot Image Classification | — | Apr 18, 2023 | 524k | 7 |
| laion/CLIP-convnext_base_w_320-laion_aesthetic-s13B-b82K-augreg | Zero-Shot Image Classification | — | Apr 18, 2023 | 5.72k | 4 |
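
Any of the zero-shot checkpoints above can be tried through the Transformers zero-shot image classification pipeline, assuming the repository ships Transformers-compatible CLIP weights (the laion/CLIP-* repos do). The sketch below is illustrative only: the image path and candidate labels are placeholders, and any listed model id can be substituted.

```python
from transformers import pipeline

# Minimal zero-shot image classification sketch using one of the checkpoints
# listed above; the image path and candidate labels are placeholders.
classifier = pipeline(
    task="zero-shot-image-classification",
    model="laion/CLIP-ViT-B-32-laion2B-s34B-b79K",
)

predictions = classifier(
    "example.jpg",  # local path or URL to an image
    candidate_labels=["a photo of a cat", "a photo of a dog", "a photo of a bird"],
)
print(predictions)  # list of {"label": ..., "score": ...} entries, sorted by score
```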