alpaca.cpp
OpenChatKit
| | alpaca.cpp | OpenChatKit |
|---|---|---|
| Mentions | 94 | 23 |
| Stars | 9,878 | 8,996 |
| Growth | - | 0.2% |
| Activity | 9.4 | 7.1 |
| Latest commit | about 1 year ago | 17 days ago |
| Language | C | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
alpaca.cpp
-
LLaMA Now Goes Faster on CPUs
Where's the 30B-in-6GB claim? Ctrl+F "GB" in your GH link finds [0], which is neither by jart nor by ggerganov but by another user, who is promptly told to look at [1], where Justine denies that claim.
[0] https://github.com/antimatter15/alpaca.cpp/issues/182
-
Is there potential to short NVDA?
You can just download the language model, dude! Not everyone needs to make their own, and the open-source models literally get better every day.
- [Oobabooga] Alpaca.cpp is extremely easy to work with.
-
Hollywood’s Screenwriters Are Right to Fear AI
Alpaca
-
Square Enix’s AI Tech Demo Is a Staggering Failure
Square could have also trained a more specific data source for their NLP, very similar to Alpaca. Alpaca was trained from interactions from a larger dataset. So while it isn't as smart, it's still able to understand instructions and act upon them.
- [Singularity] I am Alpaca 13B - ask me anything
-
Alpaca Vs. Final Jeopardy
The model I found was in 8 parts. The alpaca.cpp chat client (chat.cpp) needs to be modified to run the 8-part model, as documented here: https://github.com/antimatter15/alpaca.cpp/issues/149
-
LocalAI: OpenAI compatible API to run LLM models locally on consumer grade hardware!
Try the instructions in this GitHub repo: https://github.com/antimatter15/alpaca.cpp. It's not the best one, but I was able to run this model on my Linux machine with 16 GB of memory. I think it's a good starting point.
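Those instructions boil down to a short build-and-run sequence; a sketch of it (the quantized weights file has to be obtained separately and placed next to the binary):

```shell
# Build the chat client from source (requires make and a C/C++ toolchain).
git clone https://github.com/antimatter15/alpaca.cpp
cd alpaca.cpp
make chat

# Place the quantized weights (ggml-alpaca-7b-q4.bin) in this directory,
# then start the interactive prompt.
./chat
```

On a machine without a big GPU this still works, since inference runs entirely on the CPU; the 4-bit 7B weights fit comfortably in 16 GB of RAM.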
-
What educational materials do you think would be most useful during/after collapse?
Doesn't run offline. If you're running something without a beefy-ish GPU, there's https://github.com/antimatter15/alpaca.cpp .
-
ChatGPT Reignited My Passion For Coding
Yes, at the moment I'm toying with Alpaca 7B/13B in a local install.
OpenChatKit
- OpenChatKit - OSS Framework for building chatbots
-
How should I get an in-depth mathematical understanding of generative AI?
ChatGPT isn't open-sourced, so we don't know what the actual implementation is. I think you can read Open Assistant's source code for application design. If that is too much, try Open Chat Toolkit's source code for developer tools. If you need a very bare implementation, you should go for lucidrains/PaLM-rlhf-pytorch.
- OpenChatKit
- OpenChatKit: Open-source kit for setting up a local, libre, LLM chatbot
-
I created a locally-run ai assistant for UE5’s documentation
For a locally run open-source option, I'd recommend taking a look at OpenChatKit. It's built on top of a couple of different open-source LLMs that have been fine-tuned for use as chatbots. I've only messed around with the online demo a little, but from what I've read it is supposed to run on a laptop and be almost as good as ChatGPT 3.5.
-
[D] Are there any MIT licenced (or similar) open-sourced instruction-tuned LLMs available?
OpenChatKit https://github.com/togethercomputer/OpenChatKit
-
[D] Is there currently anything comparable to the OpenAI API?
Togethercomputer released OpenChatKit a few weeks ago. I haven't tested it, but it looks promising: https://github.com/togethercomputer/OpenChatKit
What are some alternatives?
gpt4all - gpt4all: run open-source LLMs anywhere
roomGPT - Upload a photo of your room to generate your dream room with AI.
llama.cpp - LLM inference in C/C++
Open-Assistant - OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
coral-pi-rest-server - Perform inferencing of tensorflow-lite models on an RPi with acceleration from Coral USB stick
wik - wik is used to get information about anything on the shell using Wikipedia.
ggml - Tensor library for machine learning
simple-llm-finetuner - Simple UI for LLM Model Finetuning
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
minChatGPT - A minimum example of aligning language models with RLHF similar to ChatGPT
alpaca-lora - Instruct-tune LLaMA on consumer hardware
simpleAI - An easy way to host your own AI API and expose alternative models, while being compatible with "open" AI clients.