- ruby-openai: OpenAI API + Ruby! 🤖❤️ Now with Assistants, Threads, Messages, Runs and Text to Speech 🍾
- signal-aichat (discontinued): An AI chatbot for Signal powered by Google Bard, Bing Chat, ChatGPT, HuggingChat, and llama.cpp
- text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
- memos: An open-source, lightweight note-taking service. Easily capture and share your great thoughts.
If you’re curious: https://github.com/dannyvfilms/gptemailsummary
I've been playing with ruby-openai, a Ruby gem (i.e., library) that interfaces with the OpenAI API. I'm building a small movie-recommendation tool for my own use, partly to teach myself how this stuff works. It's unlikely I'll ever build it into a proper app, since I'm fine interacting through the shell and text files rather than a web UI. But if there's interest, I'll push my tool up to GitHub.
I released a chatbot for Signal the other day that supports ChatGPT and local llama.cpp models via the OpenAI API. Bing Chat works as well, but through a separate API.
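Since llama.cpp's server exposes an OpenAI-compatible endpoint, the same Ruby client can, in principle, be pointed at a local model by overriding the base URL. A rough sketch, assuming the llama.cpp server is running locally (e.g. started with `./server -m model.gguf --port 8080`), that ruby-openai's `uri_base` option is used to redirect requests, and that the local server ignores the token; the `local_client_options` helper and the `LOCAL_LLM` env-var guard are mine:

```ruby
# Hypothetical helper: build client options targeting a local llama.cpp server.
def local_client_options(host: "localhost", port: 8080)
  {
    access_token: "not-needed-locally", # the local server ignores the token
    uri_base: "http://#{host}:#{port}/v1" # OpenAI-compatible endpoint prefix
  }
  end

# Only attempt a request when explicitly enabled, since it needs a running server.
if ENV["LOCAL_LLM"]
  require "openai" # the ruby-openai gem
  client = OpenAI::Client.new(**local_client_options)
  response = client.chat(
    parameters: {
      model: "local", # llama.cpp serves whatever model it was started with
      messages: [{ role: "user", content: "Hello!" }]
    }
  )
  puts response.dig("choices", 0, "message", "content")
end
```

The nice part is that switching between ChatGPT and a local model is then just a matter of swapping the client options.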
Oobabooga (text-generation-webui) is the easiest and most complete way to run local LLMs. Once it's installed, you can paste the name of the model you want from Hugging Face and it will download it. You still have to make sure it's compatible and tweak things from time to time.
https://github.com/usememos/memos has an "Ask AI" feature that's tied to the OpenAI API. I've never used it, though.