-
gpt-llama.cpp
A llama.cpp drop-in replacement for OpenAI's GPT endpoints, allowing GPT-powered apps to run off local llama.cpp models instead of OpenAI.
-
AGiXT
AGiXT is a dynamic AI Agent Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.
-
Auto-LLM-Local
Discontinued. I created my own Python script, similar to AutoGPT, where you supply a local LLM model such as alpaca13b (the main one I use), and the script can use the supplied tools to achieve your objective. The code fully works as far as I can tell; a chain takes me about 5 minutes on my slow laptop.
-
AutoGPT4J
A repository for a Java implementation of AutoGPT. It is heavily inspired by AutoGPT, if not a clone of some of its functionality, though it deviates in how Agents are implemented and in planned support for more open-source models.
That is nice and all, but AgentLLM has been able to do this for a while now, and it has a nice web UI for interacting with agents as they run, along with support for GPU acceleration via the oobabooga API and many other options.
Related posts
-
AGiXT: A local automation platform with memories and SmartGPT-like prompting. Works with Ooba/LCPP/GPT4All, and more
-
How big of a jump is 13B Vicuna Uncensored vs 30B Vicuna Uncensored?
-
Langchain, Langchain.js, vs AutoGPT for local agent development
-
Is there an alternative to AgentGPT that I can run on my CPU with 32 GB of RAM?
-
"Question answering over Docs" langchain integration into Textgen