Alpaca recreation without LoRA (released as a diff)

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • point-alpaca

    An Alpaca recreation without LoRA, released as a diff against the original LLaMA weights (the project from the original post).

  • text-generation-webui

    A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

  • Steps to apply the alpaca-lora-7b LoRA in text-generation-webui (a programmatic sketch follows this list):
    1. Navigate to the text-generation-webui folder.
    2. Ensure it is up to date with: git pull https://github.com/oobabooga/text-generation-webui
    3. Re-install the requirements if needed: pip install -r requirements.txt
    4. Navigate to the loras folder and download the LoRA with: git lfs install && git clone https://huggingface.co/tloen/alpaca-lora-7b
    5. Load LLaMA-7B in 8-bit mode only: python server.py --model llama-7b --load-in-8bit
    6. Select the LoRA in the Parameters tab.
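    The same workflow can also be reproduced outside the web UI. Below is a minimal sketch using the transformers and peft libraries; the base-model repo id (decapoda-research/llama-7b-hf) is an assumption standing in for a Hugging Face conversion of LLaMA-7B, so substitute whatever local copy you actually have.

    ```python
    # Minimal sketch: load LLaMA-7B in 8-bit and apply the alpaca-lora-7b LoRA
    # programmatically (mirrors steps 4-6 above). Requires transformers, peft,
    # bitsandbytes, and accelerate to be installed.
    import torch
    from transformers import LlamaForCausalLM, LlamaTokenizer
    from peft import PeftModel

    BASE = "decapoda-research/llama-7b-hf"  # assumed HF conversion of LLaMA-7B; replace with your own copy
    ADAPTER = "tloen/alpaca-lora-7b"        # the LoRA downloaded in step 4

    tokenizer = LlamaTokenizer.from_pretrained(BASE)

    # 8-bit loading corresponds to --load-in-8bit in step 5
    model = LlamaForCausalLM.from_pretrained(
        BASE,
        load_in_8bit=True,
        torch_dtype=torch.float16,
        device_map="auto",
    )

    # Applying the adapter corresponds to selecting the LoRA in the Parameters tab (step 6)
    model = PeftModel.from_pretrained(model, ADAPTER)

    prompt = "### Instruction:\nExplain what a LoRA adapter is.\n\n### Response:\n"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
    ```

    Generation settings such as temperature and top_p are left at their defaults here; the web UI exposes them in the Parameters tab.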



Related posts