-
Local-LLM-Comparison-Colab-UI
Compare the performance of different LLMs that can be deployed locally on consumer hardware. Run them yourself with the Colab WebUI.
You can try llama.cpp or KoboldCPP for CPU inference, or try online demos like HuggingChat or Selfee.
I made a series of Colab notebooks for the different models: https://github.com/Troyanovsky/Local-LLM-comparison