Distil-whisper Alternatives
Similar projects and alternatives to distil-whisper
FLiPStackWeekly
FLaNK AI Weekly covering Apache NiFi, Apache Flink, Apache Kafka, Apache Spark, Apache Iceberg, Apache Ozone, Apache Pulsar, and more...
TornadoVM
TornadoVM: A practical and efficient heterogeneous programming framework for managed languages
pyvideotrans
Translate a video from one language to another and add dubbing, with support for speech recognition and transcription, speech synthesis, and subtitle translation.
distil-whisper discussion
distil-whisper reviews and mentions
New OpenAI Whisper model: "turbo"
Details will be shared tomorrow, but from what I have read, they have distilled the large model's decoder into this "turbo" model, which has only 4 layers instead of 32; the encoder should remain the same size. This is similar to https://github.com/huggingface/distil-whisper, except that the model is distilled on multilingual data instead of English-only, and the decoder has 4 layers instead of 2.
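The layer counts quoted above give a quick feel for how much of the model survives distillation. A minimal back-of-the-envelope sketch, assuming (hypothetically) that encoder and decoder layers contribute roughly equally to total model size — the real per-layer parameter counts differ, so treat the percentages as rough intuition only:

```python
# Rough sketch of the decoder-only distillation described above.
# Assumption (not official figures): encoder and decoder layers are of
# comparable size, so model size is proportional to total layer count.

ENCODER_LAYERS = 32  # unchanged by distillation, per the comment above
DECODER_LAYERS = 32  # original large-model decoder depth


def relative_size(decoder_layers_kept: int) -> float:
    """Fraction of layers remaining after shrinking only the decoder."""
    total = ENCODER_LAYERS + DECODER_LAYERS
    kept = ENCODER_LAYERS + decoder_layers_kept
    return kept / total


turbo = relative_size(4)   # "turbo": decoder cut from 32 to 4 layers
distil = relative_size(2)  # distil-whisper: decoder cut to 2 layers

print(f"turbo keeps  ~{turbo:.0%} of the layers")   # ~56%
print(f"distil keeps ~{distil:.0%} of the layers")  # ~53%
```

Even though only about half the layers are removed, the speedup can be much larger than 2x, because the decoder runs once per generated token during autoregressive decoding while the encoder runs only once per audio chunk — which is why cutting decoder depth is where the latency win comes from.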
- FLaNK Stack 05 Feb 2024
Distil-Whisper: a distilled variant of Whisper that is 6x faster
Training code will be released in the Distil-Whisper repository this week, enabling anyone in the community to distill a Whisper model in their choice of language!
- FLaNK Stack Weekly 06 Nov 2023
AI — weekly megathread!
Hugging Face released Distil-Whisper, a distilled version of Whisper that is 6 times faster, 49% smaller, and performs within 1% word error rate (WER) on out-of-distribution evaluation sets [Details].
- Distil-Whisper: distilled version of Whisper that is 6 times faster, 49% smaller
- Distil-Whisper is up to 6x faster than Whisper while performing within 1% Word-Error-Rate on out-of-distribution eval sets
Distilling Whisper on 20,000 hours of open-sourced audio data
- GitHub page: https://github.com/huggingface/distil-whisper/tree/main
Talk-Llama
Is https://github.com/huggingface/distil-whisper on its way to whisper.cpp?
Stats
huggingface/distil-whisper is an open-source project licensed under the MIT License, an OSI-approved license.
The primary programming language of distil-whisper is Python.