-
web-stable-diffusion
Bringing stable diffusion models to web browsers. Everything runs inside the browser with no server support.
It's not supported by TVM yet, but there is support for Qualcomm Hexagon.
You can kinda see some of the supported backends gated behind flags in the cmake file: https://github.com/mlc-ai/relax/blob/mlc/CMakeLists.txt
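If you have a local build of that relax fork installed, you can also check which of those backends it was actually compiled with at runtime rather than reading the CMake file. Here is a minimal sketch, assuming the fork installs as the `tvm` package and exposes `tvm.support.libinfo()` and `tvm.runtime.enabled()` the way upstream TVM does; the flag names mirror the `USE_*` options in that CMakeLists.txt.

```python
# Minimal sketch: list which backends a local TVM/relax build enables.
# Assumes the relax fork installs as the `tvm` package, like upstream TVM.
import tvm
from tvm import support

def enabled_backends():
    """Return the USE_* build-time options that were switched on."""
    info = support.libinfo()  # dict of compile-time options -> values
    return {k: v for k, v in info.items()
            if k.startswith("USE_") and v not in ("OFF", "NOT-FOUND", "")}

if __name__ == "__main__":
    for flag, value in sorted(enabled_backends().items()):
        print(f"{flag} = {value}")
    # Runtime check for a few specific targets:
    for target in ("vulkan", "metal", "cuda", "hexagon"):
        print(target, "runtime enabled:", tvm.runtime.enabled(target))
```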
> a local wikipedia dump
There exists (at least) a project to train an LLM on local documents: privateGPT - https://github.com/imartinez/privateGPT
That link leads to the source, if you want to check the exact text:
> You'll need to wait 20-30 seconds (depending on your machine) while the LLM model consumes the prompt and prepares the answer. Once done, it will print the answer and the 4 sources it used as context from your documents
As that first sentence suggests, it may not be practical, especially on an Orange Pi.
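For anyone unfamiliar with what that flow looks like, here is a toy sketch of the retrieval step only, not privateGPT's actual code: split local files into chunks, score them against the question, and keep the top 4 as the context handed to the LLM alongside the prompt (those are the "4 sources" it reports). Everything here is stdlib and illustrative; the directory and file layout are made up.

```python
# Toy sketch of local-document retrieval: pick the 4 chunks most similar
# to a question, roughly the "4 sources" a tool like privateGPT reports
# alongside its answer. Illustrative only; not privateGPT's implementation.
from pathlib import Path
from collections import Counter

def chunks(text, size=500):
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def score(question, chunk):
    """Crude bag-of-words overlap; a real tool uses embeddings instead."""
    q = Counter(question.lower().split())
    c = Counter(chunk.lower().split())
    return sum(min(q[w], c[w]) for w in q)

def top_sources(question, doc_dir, k=4):
    """Return the k best-matching (filename, chunk) pairs from local .txt files."""
    scored = []
    for path in Path(doc_dir).glob("*.txt"):
        for chunk in chunks(path.read_text(errors="ignore")):
            scored.append((score(question, chunk), path.name, chunk))
    scored.sort(reverse=True)
    return [(name, chunk) for _, name, chunk in scored[:k]]

if __name__ == "__main__":
    for name, chunk in top_sources("how does ingestion work?", "docs"):
        print(name, "->", chunk[:80], "...")
```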
Yup, here's their web stable diffusion repo: https://github.com/mlc-ai/web-stable-diffusion
The input is a model (weights + runtime lib) compiled via the mlc-llm project: https://mlc.ai/mlc-llm/docs/compilation/compile_models.html
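To make "weights + runtime lib" concrete: the compile step produces a platform-specific library (.so/.dylib/.wasm) plus a directory of weight shards and config. Below is a minimal sketch for a native (non-web) target using upstream TVM's loader API; the directory and file names are assumptions based on typical mlc-llm output, so check your own build directory.

```python
# Minimal sketch: inspect the two pieces mlc-llm compilation produces,
# a compiled runtime library and the weight shards it expects at load time.
# Paths and file names below are assumptions; adjust to your build output.
from pathlib import Path
import tvm

artifact_dir = Path("dist/my-model")          # hypothetical output directory
lib_path = artifact_dir / "my-model-cuda.so"  # the compiled runtime lib
params_dir = artifact_dir / "params"          # the exported weight shards

# Load the compiled kernels/graph; this is the "runtime lib" half.
runtime_lib = tvm.runtime.load_module(str(lib_path))
print("loaded module:", runtime_lib)

# The "weights" half is a set of binary shards plus metadata that the
# chat runtime maps back onto the model's parameters.
shards = sorted(params_dir.glob("params_shard_*.bin"))
total_mb = sum(p.stat().st_size for p in shards) / 1e6
print(f"{len(shards)} weight shards, {total_mb:.1f} MB total")
```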
Related posts
-
Web StableDiffusion
-
[Stable Diffusion] Web Stable Diffusion: running Stable Diffusion directly in the browser without a GPU server
-
Now that they started banning stable diffusion on google colab, what's the cheapest and the best way to deploy stable diffusion?
-
Bringing stable diffusion models to web browsers
-
mlc-ai/web-stable-diffusion: Bringing stable diffusion models to web browsers. Everything runs inside the browser with no server support.