gptqlora VS chain-of-thought-hub

Compare gptqlora vs chain-of-thought-hub and see how they differ.

gptqlora

GPTQLoRA: Efficient Finetuning of Quantized LLMs with GPTQ (by qwopqwop200)

chain-of-thought-hub

Benchmarking large language models' complex reasoning ability with chain-of-thought prompting (by FranxYao)
                 gptqlora        chain-of-thought-hub
Mentions         2               10
Stars            94              2,361
Growth           -               -
Activity         7.6             6.9
Latest commit    11 months ago   8 days ago
Language         Python          Jupyter Notebook
License          MIT License     MIT License
The number of mentions indicates the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.

gptqlora

Posts with mentions or reviews of gptqlora. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-02.
  • (2/2) May 2023
    14 projects | /r/dailyainews | 2 Jun 2023
    GPTQLoRA: Efficient Finetuning of Quantized LLMs with GPTQ (https://github.com/qwopqwop200/gptqlora/tree/main)
  • GPTQLoRA: Efficient Finetuning of Quantized LLMs with GPTQ
    1 project | /r/LocalLLaMA | 24 May 2023
    The difference from QLoRA is that GPTQ is used for model quantization instead of NF4 (Normal Float 4) + DQ (Double Quantization). The advantage is that you can expect better performance, because GPTQ quantizes more accurately than the conventional bitsandbytes approach. The downside is that GPTQ is a one-shot quantization method, so it is less convenient than bitsandbytes and, unlike bitsandbytes, not universal. I'm still experimenting, but it seems to work. At least, I hope it gives people using LoRA more options. https://github.com/qwopqwop200/gptqlora/tree/main
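The idea the review describes can be sketched in miniature: the base weights are frozen in quantized (4-bit) form, while a trainable low-rank LoRA update is kept in full precision and added at forward time. This is a hedged toy illustration only; it uses plain round-to-nearest quantization, not the actual GPTQ algorithm, and all names (`quantize_4bit`, `W`, `a`, `b`) are made up for the sketch.

```python
# Toy sketch (NOT GPTQ itself): round-to-nearest symmetric 4-bit
# quantization of a frozen base weight, plus a full-precision
# rank-1 LoRA delta applied at forward time.

def quantize_4bit(w, scale):
    """Map a float weight to a signed 4-bit integer in [-8, 7]."""
    q = round(w / scale)
    return max(-8, min(7, q))

def dequantize_4bit(q, scale):
    """Recover an approximate float weight from its 4-bit code."""
    return q * scale

# Frozen base weights (pretend this is one row of a weight matrix)
W = [0.12, -0.53, 0.07, 0.91]
scale = max(abs(w) for w in W) / 7          # per-row scale factor
Wq = [quantize_4bit(w, scale) for w in W]   # stored as 4-bit codes

# Trainable LoRA update (rank 1 here): delta_w_i = b * a_i
a = [0.1, 0.0, -0.2, 0.3]
b = 0.5

# Effective weight at forward time: dequantized base + LoRA delta
W_eff = [dequantize_4bit(q, scale) + b * ai for q, ai in zip(Wq, a)]
```

Only `a` and `b` would receive gradients during finetuning; the 4-bit codes `Wq` stay fixed, which is what keeps memory use low in both QLoRA and GPTQLoRA.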

chain-of-thought-hub

Posts with mentions or reviews of chain-of-thought-hub. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-08.
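For context on what the benchmark evaluates: chain-of-thought prompting prepends worked exemplars that show intermediate reasoning steps before the final answer, so the model imitates that step-by-step style. A minimal sketch, with a made-up helper name (`build_cot_prompt`) and a standard grade-school-math exemplar, might look like:

```python
# Hedged sketch of assembling a one-shot chain-of-thought prompt.
# The exemplar demonstrates intermediate reasoning ("5 + 6 = 11")
# before stating the final answer.

EXEMPLAR = (
    "Q: Roger has 5 tennis balls. He buys 2 cans of 3 balls each. "
    "How many balls does he have now?\n"
    "A: Roger starts with 5 balls. 2 cans of 3 balls is 6 balls. "
    "5 + 6 = 11. The answer is 11.\n\n"
)

def build_cot_prompt(question: str) -> str:
    """Prepend the worked exemplar, then leave 'A:' open for the model."""
    return EXEMPLAR + f"Q: {question}\nA:"
```

Benchmarks like chain-of-thought-hub then score the model's completion of the trailing "A:" against the known answer.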

What are some alternatives?

When comparing gptqlora and chain-of-thought-hub you can also consider the following projects:

tree-of-thoughts - Plug in and Play Implementation of Tree of Thoughts: Deliberate Problem Solving with Large Language Models that Elevates Model Reasoning by at least 70%

DB-GPT - AI Native Data App Development framework with AWEL(Agentic Workflow Expression Language) and Agents

GirlfriendGPT - Girlfriend GPT is a Python project to build your own AI girlfriend using ChatGPT4.0

llm-leaderboard - A joint community effort to create one central leaderboard for LLMs.

chathub - All-in-one chatbot client

guidance - A guidance language for controlling large language models. [Moved to: https://github.com/guidance-ai/guidance]

airoboros - Customizable implementation of the self-instruct paper.

gorilla - Gorilla: An API store for LLMs

llm-humaneval-benchmarks