How I Run 34B Models at 75K Context on 24GB, Fast

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • exui

    Web UI for ExLlamaV2: https://github.com/turboderp/exui

  • HuggingFaceModelDownloader

    Simple Go utility to download Hugging Face models and datasets

  • Download a 3-4bpw exl2 quantization of a 34B Yi 200K model (not a Yi base 32K model, and not a GGUF). GPTQ kind of works, but it will severely limit your context size. I use this instead of git for downloads: https://github.com/bodaay/HuggingFaceModelDownloader
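To see why the 3-4bpw exl2 route fits 75K context in 24 GB while GPTQ does not, it helps to budget the VRAM by hand. The sketch below is back-of-the-envelope arithmetic, not a measurement: it assumes Yi-34B's published config (60 layers, 8 KV heads via GQA, head dim 128) and an 8-bit KV cache, which ExLlamaV2 supports.

```python
# Back-of-the-envelope VRAM budget for a 34B exl2 model at long context.
# Assumed Yi-34B config values: 60 layers, 8 KV heads (GQA), head dim 128.
N_PARAMS = 34e9
BPW = 3.0                      # exl2 bits per weight (the 3-4bpw range from the post)
LAYERS, KV_HEADS, HEAD_DIM = 60, 8, 128
CTX = 75_000                   # target context length in tokens
KV_BYTES_PER_ELEM = 1          # 8-bit KV cache (FP16 would be 2 bytes per element)

# Quantized weights: params * bits / 8 bytes.
weights_gb = N_PARAMS * BPW / 8 / 1e9

# KV cache: K and V each store KV_HEADS * HEAD_DIM elements per layer per token.
kv_gb = 2 * LAYERS * KV_HEADS * HEAD_DIM * KV_BYTES_PER_ELEM * CTX / 1e9

print(f"weights ~{weights_gb:.1f} GB, KV cache ~{kv_gb:.1f} GB, "
      f"total ~{weights_gb + kv_gb:.1f} GB")
```

Under these assumptions the weights land around 12.8 GB and the 75K-token cache around 9.2 GB, just under a 24 GB card. A 4-bit GPTQ quant with an FP16 cache would push both terms up (roughly 17 GB of weights plus an ~18 GB cache), which is why GPTQ forces a much shorter context on the same GPU.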



Related posts

  • HuggingFace Is Down

    1 project | news.ycombinator.com | 28 Feb 2024
  • Huggingface alternative

    1 project | /r/LocalLLaMA | 4 Jul 2023
  • Show HN: Download HuggingFace Models/Datasets easily and super fast

    2 projects | news.ycombinator.com | 24 Jun 2023
  • Simple Utility in Go to download HuggingFace Models

    2 projects | /r/LocalLLaMA | 23 Jun 2023
  • Show HN: Simple Utility in Go to Download HuggingFace Models

    1 project | news.ycombinator.com | 23 Jun 2023