Alpaca.cpp is extremely simple to get working.

This page summarizes the projects mentioned and recommended in the original post on /r/Oobabooga.

  • alpaca.cpp

    (Discontinued) Locally run an Instruction-Tuned Chat-Style LLM

  • Alpaca.cpp is extremely simple to get up and running. You don't need Conda environments, Linux or WSL, Python, CUDA, or anything else. It's a single ~200 kB EXE that you just run, with a ~4 GB model file placed in the same directory. That's it. (A minimal sketch of that setup follows this list.)

  • llamacpp-for-kobold

    (Discontinued) Port of Facebook's LLaMA model in C/C++ [moved to https://github.com/LostRuins/koboldcpp]

  • Try this: https://github.com/LostRuins/llamacpp-for-kobold

  • text-generation-webui

    A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

  • This PR, https://github.com/oobabooga/text-generation-webui/pull/447, could in theory make it easy to run Alpaca with the C++ exe as the backend and the oobabooga web UI as the interface.

  • alpaca.http

    Locally run an Instruction-Tuned Chat-Style LLM
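
As an illustration of the alpaca.cpp setup described above, here is a minimal Python sketch (not from the original post) that checks for the chat executable and the quantized model file in the current directory and then launches the interactive chat. The file names "chat"/"chat.exe" and "ggml-alpaca-7b-q4.bin" follow the upstream README, but treat them as assumptions and adjust them to match your download.

    import subprocess
    import sys
    from pathlib import Path

    # Assumed names (based on the upstream README): the "chat" binary and the
    # 4-bit quantized weights file. Adjust if your files are named differently.
    exe = Path.cwd() / ("chat.exe" if sys.platform == "win32" else "chat")
    model = Path.cwd() / "ggml-alpaca-7b-q4.bin"

    if not exe.exists():
        sys.exit("Put the alpaca.cpp chat executable in this directory first.")
    if not model.exists():
        sys.exit("Put the ~4 GB model file next to the executable.")

    # The chat binary is assumed to load the model from its working directory
    # by default, so no command-line flags are needed for a basic session.
    subprocess.run([str(exe)], check=True)

Running the executable directly from a terminal works just as well; the sketch only makes the "one EXE plus one model file in the same folder" layout explicit.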

NOTE: The number of mentions on this list counts mentions in common posts plus user-suggested alternatives, so a higher number indicates a more popular project.

