Running oobabooga with Alpaca on Apple Silicon (M1/M2)

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • text-generation-webui

    A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

  • I struggle to find a working install of oobabooga and an Alpaca model. I'm running on a MacBook Pro M2 with 24 GB.
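For the install issue above, a minimal sketch using oobabooga's bundled macOS launcher script (the script name and repo layout are assumptions based on the current text-generation-webui repository):

```shell
# Clone the web UI and run its one-click macOS launcher,
# which sets up its own Python environment on first run.
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui
./start_macos.sh
# On Apple Silicon, prefer GGUF models with the llama.cpp loader;
# place downloaded model files under text-generation-webui/models/ first.
```

Once the web UI starts, the model can be selected from the Model tab in the browser interface.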

  • alpaca-electron

    The simplest way to run Alpaca (and other LLaMA-based local LLMs) on your own computer

  • dalai

    The simplest way to run LLaMA on your local machine

  • llama.cpp

    LLM inference in C/C++

  • Use llama.cpp
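The "Use llama.cpp" suggestion can be sketched as follows. The model filename is a placeholder, and the build and binary names assume a checkout from the era of this post (Metal acceleration is enabled by default on Apple Silicon):

```shell
# Build llama.cpp from source; on M1/M2 Macs, plain `make`
# produces a Metal-accelerated binary.
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
make

# Run an Alpaca-style GGUF model; replace the path with your own
# downloaded quantized model file.
./main -m models/alpaca-7b.Q4_K_M.gguf \
  -p "Below is an instruction that describes a task." \
  -n 256 --ctx-size 2048
```

On M-series Macs the model lives in unified memory, so a 4-bit 7B quant fits comfortably within the 24 GB mentioned in the post.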

