You do not need to buy an expensive GPU just to run your transcription! Services like Paperspace offer free and paid GPU compute, which lets you run the Whisper Python model in web-based Jupyter notebooks, for example. That is exactly how I first used Whisper. However, as we will see, if your audio is not too long (or, alternatively, you are patient), you can run C++ code locally on your CPU. If you want to see how, read on!
whisper.cpp is a lightweight C++ implementation, by Georgi Gerganov, of the original Whisper Python model. It is optimized to run on Apple Silicon processors, but it also runs on Intel processors. The app is CPU-intensive, so it is not ideal for a 15-year-old laptop that is already on its last legs! To follow this guide, you need to be comfortable running code in the Terminal. It is not essential that you know C++, but some previous experience compiling C++ code will make setting things up a little easier.
Before you can use whisper.cpp, you need to clone the repo and compile the C++ code into a binary. We use CMake to help build the binary. CMake is a cross-platform build tool that is useful when working with C++: it generates a Makefile, setting compiler paths for any third-party libraries. On macOS, you can install CMake with Homebrew. We will also need FFmpeg installed locally, so let's feed two birds with one scone!
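The steps above can be sketched as the following terminal commands. This is a minimal outline, assuming you have Homebrew and git installed; the clone URL and CMake invocation follow the whisper.cpp repository's documented build process, but check the repo's README for the current instructions.

```shell
# Install CMake and FFmpeg in one go with Homebrew (macOS)
brew install cmake ffmpeg

# Clone the whisper.cpp repository
git clone https://github.com/ggerganov/whisper.cpp
cd whisper.cpp

# Generate the build files, then compile an optimized (Release) binary
cmake -B build
cmake --build build --config Release
```

After the build finishes, the compiled binaries land in the `build` directory, ready to run against your audio files.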