GPT-4-LLM
Instruction Tuning with GPT-4 (by Instruction-Tuning-with-GPT-4)
gpt4all
gpt4all: run open-source LLMs anywhere (by nomic-ai)
| | GPT-4-LLM | gpt4all |
|---|---|---|
| Mentions | 5 | 139 |
| Stars | 4,012 | 65,076 |
| Growth | - | 3.3% |
| Activity | 5.4 | 9.8 |
| Last commit | 12 months ago | 7 days ago |
| Language | HTML | C++ |
| License | Apache License 2.0 | MIT License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
GPT-4-LLM
Posts with mentions or reviews of GPT-4-LLM.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-08-22.
-
Fine-tuning LLMs with LoRA: A Gentle Introduction
I'm using the Instruction Tuning with GPT-4 dataset, which is hosted on Huggingface.
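The GPT-4-LLM data follows the Alpaca-style instruction format: JSON records with `instruction`, `input`, and `output` fields. A minimal stdlib-only sketch of turning one such record into a training prompt; the template wording follows the common Alpaca convention, and the sample record below is invented for illustration:

```python
import json

# An invented sample record in the Alpaca-style format used by
# Instruction Tuning with GPT-4 (instruction / input / output fields).
record_json = """
{
  "instruction": "Rewrite the sentence in the passive voice.",
  "input": "The committee approved the proposal.",
  "output": "The proposal was approved by the committee."
}
"""

def build_prompt(record: dict) -> str:
    """Format one instruction record into a single training prompt
    (Alpaca-style template; adjust to match your fine-tuning code)."""
    if record.get("input"):
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{record['instruction']}\n\n"
            f"### Input:\n{record['input']}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{record['instruction']}\n\n"
        "### Response:\n"
    )

record = json.loads(record_json)
prompt = build_prompt(record)
# The model is trained to continue the prompt with record["output"].
print(prompt)
```

During fine-tuning, each prompt is paired with the record's `output` as the completion the model learns to generate.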
- (31F). Lost 1.8% body fat and gained 1.3 lbs muscle mass in 2 weeks!
-
What’s the current best model that will run well locally on a 3090?
No, GPT4 x Alpaca, GPT4 Alpaca, and GPT4All use different datasets. GPT4 x Alpaca uses GPTeacher, GPT4 Alpaca uses Microsoft Research's GPT-4-LLM, and GPT4All uses its own. The general community tends to consider GPT4All the weakest of the three.
-
GPT4-X-Alpaca 30B 4-bit, by MetaIX based on LoRA by chansung
For anyone wondering how this compares with the 13B GPT4 x Alpaca, the dataset used is different. The 13B GPT4xAlpaca uses the GPTeacher dataset, while this uses the Microsoft Research dataset from Instruction Tuning with GPT-4. It should be a direct upgrade to Stanford's Alpaca, and I'll add it to the wiki as GPT4 Alpaca without an x to differentiate it.
-
GPT-4 Takes the Lead in Instruction-Tuning of Large Language Models: Advancing Generalization Capabilities for Real-World Tasks
Github: https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM
gpt4all
Posts with mentions or reviews of gpt4all.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2024-02-27.
- Show HN: I made an app to use local AI as daily driver
-
Ollama Python and JavaScript Libraries
I don’t know if Ollama can do this but https://gpt4all.io/ can.
-
Ask HN: How do I train a custom LLM/ChatGPT on my own documents in Dec 2023?
Gpt4all is a local desktop app with a Python API that can be trained on your documents: https://gpt4all.io/
-
WyGPT: Minimal mature GPT model in C++
The readme page is cryptic. What does 'mature' mean in this context? What is the sample text a continuation of?
Having a GIF of the thing in use would be great, similar to the gpt4all readme page (https://github.com/nomic-ai/gpt4all).
-
LibreChat
Check https://github.com/nomic-ai/gpt4all instead.
-
OpenAI Negotiations to Reinstate Altman Hit Snag over Board Role
"I ran performance tests on two systems; here are the results of system 1, and here are the results of system 2. Summarize the results, and build a markdown table containing x, y, z rows."
"extract the reusable functions out of this bash script"
"write me a cfssl command to generate an intermediate CA"
"What is the regex for _____"
"Here are my accomplishments over the last 6 months, summarize them into a 1 page performance report."
etc etc etc
If you're not using GPT4 or some LLM as part of your daily flow you're working too hard.
Get GPT4All (https://gpt4all.io), log into OpenAI, drop $20 on your account, get an API key, and start using GPT4.
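The workflow above comes down to sending chat-completion requests with that API key. A hedged, stdlib-only sketch of building such a request; the endpoint and payload shape follow OpenAI's chat completions API, the key is a placeholder, and the request is constructed but deliberately not sent:

```python
import json
import urllib.request

API_KEY = "sk-..."  # placeholder; substitute your own key
ENDPOINT = "https://api.openai.com/v1/chat/completions"

def build_request(prompt: str, model: str = "gpt-4") -> urllib.request.Request:
    """Build (but do not send) an OpenAI chat-completions request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_request("Extract the reusable functions out of this bash script: ...")
# To actually send it:
#   with urllib.request.urlopen(req) as resp:
#       reply = json.load(resp)["choices"][0]["message"]["content"]
```

In practice most people use the official `openai` client instead of raw HTTP, but the request it sends has this same shape.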
-
They banned ChatGPT at work: is this normal?
An offline version, though not as advanced - https://github.com/nomic-ai/gpt4all ; https://gpt4all.io/index.html
- GPT4All: An ecosystem of open-source on-edge large language models - by Nomic AI
-
Why use OpenAI's ChatGPT3.5 online service, if you can instead host your own local llama?
Take a look at https://gpt4all.io, their docs are pretty awesome
-
Ask HN: Are you using a local LLM? If yes, what for?
I run one. I built an iMessage-like frontend to it using plain JS and a Python websocket backend. I mostly just use it for curiosity and playing with different prompts. I only have 16GB of RAM to dedicate to it, so I use an 8B parameter model which is enough for fun and chitchat, but I don't find it good enough to replace ChatGPT.
https://github.com/nomic-ai/gpt4all