ggml-whisper-unofficial: Curated Fine-Tuned Whisper Models

A continuously updated repository collecting community fine-tuned Whisper models (i.e., models not officially released by OpenAI), converted to the ggml format used by whisper.cpp.

All models are converted from community fine-tuned versions, with a focus on domain-specific performance (e.g., anime voice, industry-specific speech); the ggml conversion makes them usable for CPU/GPU inference via whisper.cpp. New models will be added regularly.

Models in this repository

| Model | Ref (Original Fine-Tuned Model) | SHA-256 | Size |
|---|---|---|---|
| ggml-bagasshw-whisper-large-v2.bin | bagasshw/whisper-large-v2 | e954f27623d162b81dd02542fbf5658fc9d7627db02484799779066e43127556 | 3.09 GB |
| ggml-litagin-anime-whisper.bin | litagin/anime-whisper | bce42c15312fcdbdcc8432b2ce63ba967f4cc3d2cc783fd6ab2ba264caf0db8d | 1.52 GB |
| ggml-whisper-hindi-large-v2.bin | vasista22/whisper-hindi-large-v2 | 62cc7c4135faad16ac412c2f2e2e9f761a94c2b64b4e8b6a88c1ef2b8b5837ae | 3.09 GB |
| ggml-whisper-large-v2-ru.bin | mitchelldehaven/whisper-large-v2-ru | cf16dfa83c23c6697379b676223c2a03df27ca549b536bd9434722350c72a8f0 | 3.09 GB |
| ggml-whisper-large-v2-uk.bin | mitchelldehaven/whisper-large-v2-uk | 0e351d0c45dddd42b079c0a1498d4b72fde442565a71fbe5872a49782acb96d7 | 3.09 GB |
| ggml-whisper-tamil-large-v2.bin | vasista22/whisper-tamil-large-v2 | 80ccc7c0fbed733e7df98ff9bb4d7d6316e461fc3096283e21c3514f9f00ca4b | 3.09 GB |
| ggml-whisper-telugu-large-v2.bin | vasista22/whisper-telugu-large-v2 | 0bdcf1e093c9802c2a873a87b7d6db21f537fcd72a6394ff68e103299b70123c | 3.09 GB |
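Since these files are several gigabytes, it can be worth verifying a download against the SHA-256 values in the table. A minimal Python sketch (the file path is a placeholder for wherever you saved the model):

```python
import hashlib
from pathlib import Path

# Expected checksum copied from the table above (ggml-litagin-anime-whisper.bin).
EXPECTED_SHA256 = "bce42c15312fcdbdcc8432b2ce63ba967f4cc3d2cc783fd6ab2ba264caf0db8d"

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MiB chunks so multi-GB models don't need to fit in RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

model_path = Path("ggml-litagin-anime-whisper.bin")  # placeholder: adjust to your download location
actual = sha256_of(model_path)
print("checksum OK" if actual == EXPECTED_SHA256 else f"checksum mismatch: {actual}")
```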

Models from other repositories

| Repository | URL | License |
|---|---|---|
| kotoba-tech/kotoba-whisper-v1.0-ggml | https://huggingface.co/kotoba-tech/kotoba-whisper-v1.0-ggml | apache-2.0 |
| kotoba-tech/kotoba-whisper-v2.0-ggml | https://huggingface.co/kotoba-tech/kotoba-whisper-v2.0-ggml | apache-2.0 |
| distil-whisper/distil-large-v3-ggml | https://huggingface.co/distil-whisper/distil-large-v3-ggml | mit |
| distil-whisper/distil-large-v3.5-ggml | https://huggingface.co/distil-whisper/distil-large-v3.5-ggml | mit |
| distil-whisper/distil-large-v2 | https://huggingface.co/distil-whisper/distil-large-v2 | mit |
| distil-whisper/distil-small.en | https://huggingface.co/distil-whisper/distil-small.en | mit |
| distil-whisper/distil-medium.en | https://huggingface.co/distil-whisper/distil-medium.en | mit |
| xezpeleta/whisper-large-v3-eu | https://huggingface.co/xezpeleta/whisper-large-v3-eu | apache-2.0 |
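The repositories above can be fetched directly from the Hugging Face Hub, for example with the huggingface_hub package (a sketch assuming that package is installed; it downloads the whole repository and returns the local cache path):

```python
from huggingface_hub import snapshot_download

# Download one of the ggml repositories listed above; the returned path
# points to a local directory containing the .bin model file(s).
local_dir = snapshot_download(repo_id="kotoba-tech/kotoba-whisper-v1.0-ggml")
print("model files downloaded to:", local_dir)
```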

Note: You can clone the Whisper PyTorch models from the original repositories and convert them to ggml format yourself using the convert-h5-to-ggml.py script from whisper.cpp, as sketched below.
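For illustration, a minimal Python sketch of that conversion step, assuming local clones of the fine-tuned Hugging Face model, openai/whisper, and whisper.cpp. All paths are placeholders, and the argument order follows the script's usage string; check your whisper.cpp checkout, since it may differ between versions.

```python
import subprocess
from pathlib import Path

# All paths below are placeholders -- adjust them to your local clones.
hf_model_dir = Path("whisper-large-v2-ru")   # git clone of the fine-tuned HF model
whisper_repo = Path("whisper")               # git clone of openai/whisper (tokenizer / mel filter assets)
whisper_cpp  = Path("whisper.cpp")           # git clone of ggerganov/whisper.cpp
output_dir   = Path("ggml-out")
output_dir.mkdir(exist_ok=True)

# Run the conversion script shipped with whisper.cpp:
#   convert-h5-to-ggml.py <model dir> <path to openai/whisper repo> <output dir>
subprocess.run(
    [
        "python",
        str(whisper_cpp / "models" / "convert-h5-to-ggml.py"),
        str(hf_model_dir),
        str(whisper_repo),
        str(output_dir),
    ],
    check=True,
)
print("ggml model written to", output_dir)
```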
