1. Navigate to the text-generation-webui folder.
2. Ensure it is up to date: git pull https://github.com/oobabooga/text-generation-webui
3. Re-install the requirements if needed: pip install -r requirements.txt
4. Navigate to the loras folder and download the LoRA: git lfs install && git clone https://huggingface.co/tloen/alpaca-lora-7b
5. Load LLaMA-7B in 8-bit mode only: python server.py --model llama-7b --load-in-8bit
6. Select the LoRA in the Parameters tab.
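The steps above can be collected into a single setup script. This is a sketch: it assumes text-generation-webui is already cloned in the current directory and that the base model folder is named llama-7b; adjust paths and the model name to match your install.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Assumes the repo is already cloned; adjust the path as needed.
cd text-generation-webui
git pull https://github.com/oobabooga/text-generation-webui

# Re-install requirements in case they changed upstream.
pip install -r requirements.txt

# Download the alpaca-lora-7b LoRA into the loras folder.
cd loras
git lfs install
git clone https://huggingface.co/tloen/alpaca-lora-7b
cd ..

# The LoRA requires the base model to be loaded in 8-bit mode.
python server.py --model llama-7b --load-in-8bit
```

After the server starts, open the web UI and select the LoRA in the Parameters tab.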