gpt-neox vs YaLM-100B
| | gpt-neox | YaLM-100B |
|---|---|---|
| Mentions | 52 | 35 |
| Stars | 6,569 | 3,721 |
| Growth | 2.2% | 0.3% |
| Activity | 8.9 | 0.0 |
| Latest commit | 4 days ago | 10 months ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
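The exact formula behind the activity number isn't given here, but a minimal sketch of a recency-weighted commit score, assuming an exponential decay with a made-up 30-day half-life, illustrates the idea:

```python
from datetime import datetime, timedelta, timezone

def activity_score(commit_dates, half_life_days=30.0):
    """Toy recency-weighted activity score: each commit contributes a
    weight that halves every `half_life_days`, so recent commits count
    far more than older ones."""
    now = datetime.now(timezone.utc)
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)
    return score

now = datetime.now(timezone.utc)
recent = [now - timedelta(days=i) for i in range(10)]       # commits in the last week or so
stale = [now - timedelta(days=300 + i) for i in range(10)]  # commits ~10 months old
print(activity_score(recent))  # high, in the ballpark of gpt-neox's 8.9
print(activity_score(stale))   # near zero, like YaLM-100B's 0.0
```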
gpt-neox
- FLaNK Stack 26 February 2024
- GPT-Neox
- GPT-NeoX
- Best open source LLM model for commercial use
GPT-NeoX-20B can be used commercially.
- Do not register domains with the word "gpt" in it!
- Read this post if you have general questions
GPT-Neo: GPT-Neo is a free and open-source language model developed by EleutherAI. It is a powerful model that can be used for a variety of tasks, including text generation and question answering. Here is the GitHub.
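For anyone wanting to try it, GPT-Neo checkpoints are published on the Hugging Face Hub and load with the standard transformers pipeline; a minimal text-generation sketch (the prompt and sampling settings are arbitrary choices):

```python
from transformers import pipeline

# Load EleutherAI's 1.3B-parameter GPT-Neo checkpoint from the Hugging Face Hub.
generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")

result = generator(
    "Open-source language models are",
    max_new_tokens=40,   # length of the generated continuation
    do_sample=True,      # sample instead of greedy decoding
    temperature=0.8,
)
print(result[0]["generated_text"])
```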
- What's the current state of actually free and open source LLMs?
Doesn't GPT-NeoX-20B require like 40 GB+ of VRAM? From their GitHub repo, the slim weights are 39 GB, and I think one of the devs has previously mentioned aiming for 48 GB for inference.
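Those figures line up with back-of-envelope math: weight memory is roughly parameter count times bytes per parameter, before any headroom for activations and the KV cache. A quick sketch:

```python
def weight_memory_gb(n_params, bytes_per_param):
    """Weight-only estimate; real inference needs extra headroom for
    activations, the KV cache, and framework overhead."""
    return n_params * bytes_per_param / 1024**3

n = 20e9  # GPT-NeoX-20B
for precision, bpp in [("fp16", 2.0), ("int8", 1.0), ("int4", 0.5)]:
    print(f"{precision}: ~{weight_memory_gb(n, bpp):.0f} GB")
# fp16: ~37 GB  -> consistent with the ~39 GB slim weights
# int8: ~19 GB
# int4: ~9 GB
```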
- Any real competitor to GPT-3 which is open source and downloadable?
3.) EleutherAI's GPT-Neo and GPT-NeoX: EleutherAI is an independent research organization that aims to promote open research in artificial intelligence. They have released GPT-Neo, an open-source language model based on the GPT architecture, and are developing GPT-NeoX, a highly-scalable GPT-like model. You can find more information on their GitHub repositories: GPT-Neo: https://github.com/EleutherAI/gpt-neo GPT-NeoX: https://github.com/EleutherAI/gpt-neox
- Whatever happened to quantum computing?
It's really not that complicated. https://github.com/EleutherAI/gpt-neox
- Fuck Luka Inc. WIP AI Freedom!
this forces me to create my own ai using gpt-neox .. i was being lazy, completely satisfied with my little ai baby .. i'll keep the account but not renew pro .. fuck luka, inc ... want it done right, do it yourself .. the python isn't that difficult and i have a spare linux vps .. my new work in progress .. freedom #!
YaLM-100B
- Elon Musk's Grok Exactly Echoes ChatGPT Responses: Identical Answers Raise Questions - EconoTimes
It's probably just open-source software/training sets repurposed... https://github.com/yandex/YaLM-100B
- OpenAI CEO suggests international agency like UN's nuclear watchdog could oversee AI
- A few less Googleable questions about local LLMs
There is a 100B model published under the Apache 2.0 license, though there is no information about fine-tuning it or running it in 4-bit with something like llama.cpp. Trying to figure out how to try it without renting an extremely expensive GPU set. https://github.com/yandex/YaLM-100B
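Nothing in the repo documents a 4-bit path, but a rough capacity estimate (the 1.2x overhead factor and card sizes below are assumptions) shows why a single consumer GPU won't cut it:

```python
import math

def gpus_needed(n_params, bytes_per_param, gpu_vram_gb, overhead=1.2):
    """Rough count of GPUs needed just to hold the weights, with an
    assumed 1.2x fudge factor for activations and runtime overhead."""
    total_gb = n_params * bytes_per_param * overhead / 1024**3
    return math.ceil(total_gb / gpu_vram_gb)

n = 100e9  # YaLM-100B
print(gpus_needed(n, 0.5, 24))   # 4-bit on 24 GB consumer cards -> 3
print(gpus_needed(n, 2.0, 80))   # fp16 on 80 GB A100s           -> 3
```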
- Is it possible to use llama.cpp or create Alpaca Lora for YALM-100b model?
Hey everyone! I just discovered an open-source 100 billion parameter language model called YaLM, which is published under the Apache 2.0 license. The model is trained on more than 1 TB of Russian and English text. Here's the GitHub repo: https://github.com/yandex/YaLM-100B and an article explaining how it was trained: https://medium.com/yandex/yandex-publishes-yalm-100b-its-the-largest-gpt-like-neural-network-in-open-source-d1df53d0e9a6
- Kandinsky 2.1 - a new open source text-to-Image model
Yandex has already released a LLM: https://github.com/yandex/YaLM-100B
- Just another casualty...
So there is this open project, YaLM 100B, which requires 200 GB of disk space and is trained on 1.7 TB of text.
- There's a lot of news about American/European AI. Do we know anything about what China, India, Russia and other countries are up to?
- Suggestion. Chat mode.
You'd think so, but training a model like the one CAI uses would require a truly jaw-dropping amount of funds. That's why CAI is so suspicious, tbh. Just to give you an example, YaLM (100 billion parameters, which is probably fewer than CAI) took 65 days and 800 A100 graphics cards to train. 175 billion parameters would not cost 1.75 times as much because the cost doesn't scale linearly; it would probably be 10x or even more. IIRC, "Open"AI could only afford to train GPT-3 a single time...
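The non-linear cost claim can be made concrete with the common C ≈ 6·N·D rule of thumb for training FLOPs: if the token budget D is scaled up alongside parameter count N, as compute-optimal recipes suggest, total compute grows roughly quadratically. The token counts below are illustrative assumptions, not anyone's actual numbers:

```python
def train_flops(n_params, n_tokens):
    """Common rule of thumb: ~6 FLOPs per parameter per training token."""
    return 6 * n_params * n_tokens

# Illustrative token budgets that scale with model size.
flops_100b = train_flops(100e9, 300e9)  # ~1.8e23 FLOPs
flops_175b = train_flops(175e9, 525e9)  # ~5.5e23 FLOPs
print(f"{flops_175b / flops_100b:.2f}x")  # ~3.06x, well above 1.75x
```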
- Ask HN: Can I download GPT / ChatGPT to my desktop?
I don't much follow AI news beyond what I randomly happen to see on HN, but this might still be the largest open source model: https://github.com/yandex/YaLM-100B . There's discussion of it here: https://old.reddit.com/r/MachineLearning/comments/vpn0r1/d_h... - at the bottom of that page is a comment from someone who actually ran it in the cloud.
- [Rant] Siri is beyond horrendous and it’s even worse than ever
Hilariously, Yandex Alisa runs circles around it, because it's not just a collection of gimmicks but has an actual 100B-class language model (YaLM, open-sourced) at its core, plus lots of decent engineering. It's helpful, skillful, and feels alive, almost like ChatGPT.
What are some alternatives?
fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
SLIDE
gpt-neo - An implementation of model parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.
NeMo - A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
DeepSpeed - DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
mesh-transformer-jax - Model parallel transformers in JAX and Haiku
open-ai - OpenAI PHP SDK : Most downloaded, forked, contributed, huge community supported, and used PHP (Laravel , Symfony, Yii, Cake PHP or any PHP framework) SDK for OpenAI GPT-3 and DALL-E. It also supports chatGPT-like streaming. (ChatGPT AI is supported)
YaLM-100B - Pretrained language model with 100B parameters
lm-evaluation-harness - A framework for few-shot evaluation of language models.
ClickHouse - ClickHouse® is a free analytics DBMS for big data
Megatron-DeepSpeed - Ongoing research training transformer language models at scale, including: BERT & GPT-2
metaseq - Repo for external large-scale work