DistiLlama
Chrome extension to summarize or chat with web pages and local documents using locally running LLMs. Keep all of your data and conversations private. 🔐
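Extensions like this typically talk to a local Ollama server over its HTTP API (default port `11434`), so page text never leaves the machine. Below is a minimal sketch of such a call, not DistiLlama's actual implementation: it assumes Ollama is running locally with a model like `llama2` pulled, and the function names are illustrative.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_summary_request(page_text: str, model: str = "llama2") -> dict:
    """Build the JSON payload for a non-streaming summarization call."""
    return {
        "model": model,
        "prompt": f"Summarize the following page:\n\n{page_text}",
        "stream": False,  # ask for a single JSON response instead of a token stream
    }

def summarize(page_text: str) -> str:
    """POST the payload to the local Ollama server and return the generated text."""
    payload = json.dumps(build_summary_request(page_text)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Because everything goes through `localhost`, the privacy claim holds: neither the page contents nor the conversation is sent to a remote service.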
https://github.com/shreyaskarnik/DistiLlama (feedback, suggestions, and PRs are welcome)
I saw that it says it uses Ollama, but Ollama is not available on Windows yet.
Related posts
- open-webui VS LibreChat - a user suggested alternative (2 projects | 29 Feb 2024)
- Apple Silicon Llama 7B running in docker?
- Ask HN: How to structure Rust, Axum, and SQLx for clean architecture?
- Jack Dorsey says that he's not on the Bluesky board anymore
- Run Large and Small Language Models locally with ollama