Has anyone managed to get AutoGPT working with a local LLM that can use the inbuilt commands?

This page summarizes the projects mentioned and recommended in the original post on /r/AutoGPT

  • LocalAI

    The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. Drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many other model architectures, and can generate text, audio, video, and images, with voice-cloning capabilities.

  • That's essentially what I am running; see https://github.com/go-skynet/LocalAI. I'm only running it on CPU, hence it's very slow. I'd consider investing in a decent GPU if things work out (I do have an old 8GB 1070 Ti, but I'm currently running a 24GB model). A minimal sketch of pointing an OpenAI client at a LocalAI endpoint follows this list.

  • gorilla

    Gorilla: An API store for LLMs

  • This project https://github.com/ShishirPatil/gorilla seems to be designed as an LLM that can understand and utilize various APIs. Unfortunately, it is not currently open-source.

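Since LocalAI advertises itself as a drop-in replacement for the OpenAI API, the basic wiring is just a base-URL override. The sketch below assumes LocalAI's default port (8080) and uses a placeholder model name ("ggml-gpt4all-j"), neither of which comes from the original thread; substitute whatever model your LocalAI instance actually serves.

```python
# Minimal sketch: talking to LocalAI through the standard openai Python client.
# Assumptions (not from the original post): LocalAI listens on its default
# port 8080, and a model named "ggml-gpt4all-j" has been loaded.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # LocalAI endpoint instead of api.openai.com
    api_key="sk-local",                   # LocalAI typically ignores the key, but the client requires one
)

response = client.chat.completions.create(
    model="ggml-gpt4all-j",  # placeholder: use the model name served by your instance
    messages=[
        {"role": "system", "content": "You are an autonomous agent."},
        {"role": "user", "content": "List the files in the current directory."},
    ],
)
print(response.choices[0].message.content)
```

For AutoGPT itself the same idea applies: point its OpenAI base URL at the LocalAI server in its configuration (the exact variable name differs between AutoGPT versions). Whether the built-in commands then work depends largely on how reliably the local model follows AutoGPT's expected JSON command format.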