-
ollama-webui
Discontinued ChatGPT-Style WebUI for LLMs (Formerly Ollama WebUI) [Moved to: https://github.com/open-webui/open-webui]
On Apple silicon Macs:
https://ollama.ai/
ollama pull mixtral
For a ChatGPT-esque web UI:
https://github.com/ollama-webui/ollama-webui
docker run -d -p 3000:8080 \
  --add-host=host.docker.internal:host-gateway \
  -v ollama-webui:/app/backend/data \
  --name ollama-webui \
  --restart always \
  ghcr.io/ollama-webui/ollama-webui:main
Navigate to http://localhost:3000
You can also use Ollama in LangChain.
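Once the server is running, Ollama also exposes a local REST API (port 11434 by default). A minimal sketch of building a non-streaming /api/generate request with only the standard library, assuming the default endpoint and the mixtral model pulled above:

```python
import json

# Ollama's local REST API listens on port 11434 by default.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> bytes:
    """Build the JSON body for a non-streaming /api/generate call."""
    payload = {
        "model": model,    # e.g. the "mixtral" model pulled above
        "prompt": prompt,
        "stream": False,   # ask for one complete JSON response
    }
    return json.dumps(payload).encode("utf-8")

body = build_generate_request("mixtral", "Why is the sky blue?")
# Sending it requires a running Ollama server, e.g.:
#   req = urllib.request.Request(OLLAMA_URL, data=body,
#                                headers={"Content-Type": "application/json"})
#   urllib.request.urlopen(req)
```

LangChain's Ollama integration wraps this same local API, so anything the raw endpoint accepts can also be driven from a chain.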
-
ollama
Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1 and other large language models.
-
CogVLM is very good in my (brief) testing: https://github.com/THUDM/CogVLM
The model weights seem to be under a non-commercial license, so not truly open source, but it is "open access" as you requested.
-
-
LM Studio (the tool they linked) is definitely not open source, and doesn't even offer a pricing model for business use.
LLMStudio is, but I suspect that was a typo in their comment. https://github.com/TensorOpsAI/LLMStudio
-
> The output quality is not "ruined" at all.
That was my experience as well.
I also tried a 2-bit version, and it was horrible.
However, there is a new approach in the works[1] (merged yesterday) that works surprisingly well at 2.10 bits per weight (12.3 GB model size).
[1] https://github.com/ggerganov/llama.cpp/pull/4773
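The quoted size is consistent with a Mixtral-scale parameter count. A back-of-the-envelope check, assuming roughly 46.7B total parameters (Mixtral 8x7B's approximate size, an assumption on my part) and ignoring quantization metadata overhead:

```python
# Rough size estimate for a quantized model:
#   bytes ~= parameters * bits_per_weight / 8
params = 46.7e9          # assumed total parameter count (Mixtral 8x7B scale)
bits_per_weight = 2.10   # from the linked llama.cpp PR

size_gb = params * bits_per_weight / 8 / 1e9
print(f"{size_gb:.1f} GB")  # ~12.3 GB, matching the quoted model size
```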
Related posts
-
AWS Bedrock Anthropic Claude tool call integration with Microsoft Semantic Kernel
-
Nvidia on NixOS WSL – Ollama up 24/7 on your gaming PC
-
Model Context Protocol integration with Microsoft Semantic Kernel
-
Sidekick: Local-first native macOS LLM app
-
Exploring AI Frameworks: A Deep Dive into Semantic Kernel and My Open Source Contributions