LangChain launched LangChain Templates, a collection of easily deployable reference architectures for a wide variety of popular LLM use cases [Details].
Hugging Face released Distil-Whisper, a distilled version of Whisper that is 6 times faster, 49% smaller, and performs within 1% word error rate (WER) on out-of-distribution evaluation sets [Details].
Chatd: a desktop application that lets you chat with your documents using a local large language model (Mistral-7B). The local LLM runner is bundled in, so no separate setup is required [Link].
Related posts
- Distil-Whisper: a distilled variant of Whisper that is 6x faster
- Distil-Whisper: distilled version of Whisper that is 6 times faster, 49% smaller
- Distil-Whisper is up to 6x faster than Whisper while performing within 1% Word-Error-Rate on out-of-distribution eval sets
- Talk-Llama
- Distilling Whisper on 20,000 hours of open-sourced audio data