| | ChatGLM-6B | pythia |
|---|---|---|
| Mentions | 17 | 7 |
| Stars | 39,341 | 2,041 |
| Growth | 1.6% | 2.4% |
| Activity | 8.4 | 7.8 |
| Latest commit | 3 months ago | 8 days ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ChatGLM-6B
- What are the current fastest multi-GPU inference frameworks?
  ChatGLM seems to be pretty popular, but I've never used it before.
- A CEO is spending more than $2,000 a month on ChatGPT Plus accounts for all of his employees, and he says it's saving 'hours' of time
  There are also locally hosted options that approach the effectiveness of ChatGPT. This GLM, for example, was specifically trained to run on a single consumer-grade GPU.
- Open Source Chinese LLMs
- ChatGLM-6B: run locally on consumer graphics card (6GB of GPU memory required)
- Ask HN: Open source LLM for commercial use?
- Coding LLaMA Model?
  A link for y'all. Definitely gonna try to mess around with this!
- Some socioeconomic questions about GPT, AI, and the future: seeking your advice
- FLiPN-FLaNK Stack Weekly for 20 March 2023
- ChatGLM-6B - an open source 6.2 billion parameter English/Chinese bilingual LLM trained on 1T tokens, supplemented by supervised fine-tuning, feedback bootstrap, and Reinforcement Learning from Human Feedback. Runs on consumer grade GPUs
- ChatGLM: Open bilingual language model based on General Language Model framework
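The "6GB of GPU memory" claim above comes down to quantization arithmetic: a 6.2-billion-parameter model only fits on a small consumer card once its weights are stored at low precision. A rough back-of-the-envelope sketch (the 6.2B parameter count is from the posts above; ignoring activation and KV-cache overhead is a simplifying assumption):

```python
# Rough memory estimate for model weights at different precisions.
# Activation and KV-cache overhead is ignored; real usage is somewhat higher.

def weight_memory_gb(n_params: float, bits_per_param: int) -> float:
    """Bytes needed for the weights alone, in gigabytes (1 GB = 1e9 bytes)."""
    return n_params * bits_per_param / 8 / 1e9

n = 6.2e9  # ChatGLM-6B parameter count
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: {weight_memory_gb(n, bits):.1f} GB")
```

At FP16 the weights alone are about 12.4 GB, at INT8 about 6.2 GB, and at INT4 about 3.1 GB, which is consistent with the claim that the quantized model runs on a 6 GB card.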
pythia
- If you can't reproduce the model then it's not open-source
  You can grep for bad words. What you can't do (unless hoops are jumped through) is verify that the weights came from the same dataset. You can set the same random seed and still get different results; the calculations are not that deterministic (https://pytorch.org/docs/stable/notes/randomness.html#reprod...).
  > I am overall skeptical that this is true in the case of LLMs
  This skepticism seems reasonable. EleutherAI has documentation for reproducing training (https://github.com/EleutherAI/pythia#reproducing-training), but so far I haven't seen it lead to anything.
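The nondeterminism point above is rooted in floating-point arithmetic: addition is not associative, so a parallel reduction (as performed on a GPU) that sums the same numbers in a different order can produce different bits even with identical seeds. A minimal stdlib-only illustration of the underlying effect:

```python
# Floating-point addition is not associative, so summation order matters.
# This is why parallel GPU reductions can differ run-to-run despite fixed seeds.
a, b, c = 1e20, -1e20, 1.0

left_to_right = (a + b) + c   # a + b cancels exactly to 0.0, then + 1.0
right_to_left = a + (b + c)   # b + c rounds back to -1e20, then cancels to 0.0

print(left_to_right)  # 1.0
print(right_to_left)  # 0.0
```

Here 1.0 is far below the spacing between adjacent doubles near 1e20, so it vanishes when added first, and the two groupings disagree.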
- Local Alternatives of ChatGPT and Midjourney
  LLaMA, Pythia, RWKV, Flan-T5 (self-hosted), FlexGen
- Ask HN: Open source LLM for commercial use?
- A New AI Research Proposes Pythia: A Suite of Decoder-Only Autoregressive Language Models Ranging from 70M to 12B Parameters
  GitHub: https://github.com/EleutherAI/pythia
- Pythia: Interpreting Autoregressive Transformers Across Time and Scale
- AI computing startup Cerebras releases open source ChatGPT-like models
- Is there a way to easily train ChatGPT or GPT on custom knowledge?
  Pythia is another smaller option that seems to have pretty good performance, as does FLAN. Both are okay for commercial use AFAIK (though double-check for yourself).
What are some alternatives?
llama.cpp - LLM inference in C/C++
lollms-webui - Lord of Large Language Models Web User Interface
alpaca.cpp - Locally run an Instruction-Tuned Chat-Style LLM
geov - The GeoV model is a large language model designed by Georges Harik that uses Rotary Positional Embeddings with Relative distances (RoPER). We have shared a pre-trained 9B parameter model.
stanford_alpaca - Code and documentation to train Stanford's Alpaca models, and generate the data.
GLM-130B - GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)
Open-Assistant - OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
InvokeAI - InvokeAI is a leading creative engine for Stable Diffusion models, empowering professionals, artists, and enthusiasts to generate and create visual media using the latest AI-driven technologies. The solution offers an industry leading WebUI, supports terminal use through a CLI, and serves as the foundation for multiple commercial products.
datagen - Generate authentic looking mock data based on a SQL, JSON or Avro schema and produce to Kafka in JSON or Avro format.
basaran - Basaran is an open-source alternative to the OpenAI text completion API. It provides a compatible streaming API for your Hugging Face Transformers-based text generation models.
stable-diffusion-webui - Stable Diffusion web UI