| | DALLE-mtf | hivemind |
|---|---|---|
| Mentions | 41 | 40 |
| Stars | 435 | 1,837 |
| Growth | 0.0% | 1.5% |
| Activity | 0.0 | 5.4 |
| Last commit | about 2 years ago | about 1 month ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
DALLE-mtf
- How Open is Generative AI? Part 2
This vision is in line with EleutherAI, a non-profit organization founded in July 2020 by a group of researchers. Driven by the perceived opacity and the challenge of reproducibility in AI, their goal was to create leading open-source language models.
- The open source learning curve for AI researchers
- EleutherAI: Empowering Open-Source Artificial Intelligence Research
- Seeking advice on fine-tuning Pythia for semantic search in a non-English language
My current idea is to use an EleutherAI Pythia model (the base of Databricks Dolly). I would like to know whether translating the Dolly-15k dataset into the desired language with state-of-the-art translation tools like DeepL would be a viable approach to fine-tuning the Pythia base model. I want to use this model for semantic search, so perfection is not a necessity.
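For reference, a minimal sketch of that setup using Hugging Face transformers, assuming the translated Dolly records sit in a local JSONL file. The Pythia size, file name, prompt template, and hyperparameters are illustrative assumptions, not a tested recipe:

```python
# Minimal sketch: instruction-tune a Pythia base model on a translated
# Dolly-15k file. Model size and hyperparameters are placeholders.
import torch
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "EleutherAI/pythia-1.4b"  # any Pythia checkpoint works here
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # Pythia ships without a pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Assumed file layout: one JSON object per line with
# {"instruction": ..., "context": ..., "response": ...}
dataset = load_dataset("json", data_files="dolly_15k_translated.jsonl")["train"]

def to_text(example):
    # Simple instruction/response template; adapt to taste.
    prompt = (f"### Instruction:\n{example['instruction']}\n\n"
              f"### Response:\n{example['response']}")
    return tokenizer(prompt, truncation=True, max_length=512)

tokenized = dataset.map(to_text, remove_columns=dataset.column_names)
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments("pythia-dolly-translated",
                           per_device_train_batch_size=4,
                           num_train_epochs=2,
                           bf16=torch.cuda.is_available()),  # assumes Ampere+
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```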
- Does anyone want to collaborate to make anti-capitalist AI?
There are open source AI efforts, like EleutherAI. Needless to say, they are lagging behind big players, but it's better than nothing.
- ChatGPT is bonkers.
The new GPT-3.5 isn't aware of what GPT-3.5 or davinci-002 are (repeatable) and claimed that it was designed by EleutherAI and has only 6 billion parameters (I wasn't able to repeat that, but didn't really try).
- My teacher has falsely accused me of using ChatGPT to write an assignment.
Hi, my name is Stella Biderman and I run EleutherAI, one of the foremost non-profit research institutes in the world that train and study large language models. I have been involved with the majority of models to hold the title “largest open source GPT model in the world” and have dabbled in using plagiarism detection tools to identify code written by GPT-J.
- dolly-v2-12b
dolly-v2-12b is a 12-billion-parameter causal language model created by Databricks. It is derived from EleutherAI’s Pythia-12b and fine-tuned on a ~15K-record instruction corpus generated by Databricks employees, released under a permissive license (CC-BY-SA).
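For illustration, loading it follows the pipeline pattern shown on the Databricks model card; the bfloat16 and device_map settings assume a single large GPU, which is an assumption about your hardware:

```python
# Load dolly-v2-12b via the transformers pipeline, per the model card pattern.
import torch
from transformers import pipeline

generate_text = pipeline(
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,   # halves memory vs. float32
    trust_remote_code=True,       # the repo ships a custom instruction pipeline
    device_map="auto",            # spread weights across available devices
)
res = generate_text("Explain the difference between Pythia and Dolly.")
print(res[0]["generated_text"])
```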
- Futurism: "The Company Behind Stable Diffusion Appears to Be At Risk of Going Under"
It is true that Emad needs to find an appropriate business model. The good news is that the hype is still ongoing. I'm sure Emad can grab another round of liquidity injection; he has plenty of resources. Remember, he is also from the finance industry. He has https://www.eleuther.ai/, which can supply a secure, in-house custom LLM equivalent to BloombergGPT.
- How can AI be used to protect against exploitative use of other AI?
By promoting fully open-source AI, i.e. making datasets, models, methodology and codebases freely available and transparent. What OpenAI claimed to be aiming for, basically.
hivemind
- You can now train a 70B language model at home
https://github.com/learning-at-home/hivemind is also relevant
- Would anyone be interested in contributing to some group projects?
I really hope you'll join me, for the Petals support at least! A single docker-compose.yml file is all we need, for now. If we are able to find enough people willing to host some smaller models, perhaps we could expand into Hivemind and create our own custom foundation model one day?
- Hivemind: Train deep learning models on thousands of volunteers across the world
- Could a model not be trained by a decentralized network? Like SETI@home, or kinda-sorta like Bitcoin. Petals accomplishes this somewhat, but if raw compute power is the only barrier to open source, I'd be happy to try organizing decentralized computing efforts
Decentralized deep learning: https://github.com/learning-at-home/hivemind
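As a rough sketch of what hivemind provides, loosely following its quickstart: a regular PyTorch optimizer is wrapped in hivemind.Optimizer so that peers sharing a run_id discover each other over a DHT and average their updates. The model, run_id, and batch sizes here are placeholder assumptions:

```python
# Sketch of collaborative training with hivemind (quickstart-style).
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset
import hivemind

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
base_opt = torch.optim.SGD(model.parameters(), lr=0.01)

# Dummy data so the sketch is self-contained.
data_loader = DataLoader(
    TensorDataset(torch.randn(320, 784), torch.randint(0, 10, (320,))),
    batch_size=32)

# The first peer starts a fresh DHT; later peers pass initial_peers=[...].
dht = hivemind.DHT(start=True)

opt = hivemind.Optimizer(
    dht=dht,
    run_id="demo_run",         # all peers with the same run_id train together
    batch_size_per_step=32,    # samples this peer contributes per step
    target_batch_size=10_000,  # global batch size before peers average state
    optimizer=base_opt,
    use_local_updates=True,    # apply local steps immediately, average later
    verbose=True,
)

for x, y in data_loader:
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()
    opt.zero_grad()
```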
- Orca (built on llama13b) looks like the new sheriff in town
https://github.com/learning-at-home/hivemind - the same people are behind it; it was made before Petals, I think.
- Do you think that AI research will slow down to a halt because of regulation?
Not if we rise to meet that challenge. Here are a few tools that facilitate AI research in the face of an advanced persistent threat: Hivemind, a distributed PyTorch framework.
- LLM@home
Yeah, there's Hivemind, and there's research on how to chunk out the training workload so it can be scaled up. Not sure why there's commentary that latency issues would limit this sort of enterprise; the architecture typically isn't designed for liveness. Other subfields of distributed training/inference include zero-knowledge machine learning. Besides all of that, there's also adversarial computation like SafetyNets and refereed delegation of computation.
- [D] Google "We Have No Moat, And Neither Does OpenAI": Leaked Internal Google Document Claims Open Source AI Will Outcompete Google and OpenAI
We already have the software for it. There are some projects, but the one I'm most familiar with is https://github.com/learning-at-home/hivemind for training, and its sister project https://petals.ml/ for running large models distributed.
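On the Petals side, the client API mirrors transformers. A hedged sketch, with the model name taken from the Petals examples (public swarm availability changes over time, so this is an assumption):

```python
# Sketch of Petals distributed inference: the model's transformer blocks run
# on volunteer-hosted servers while embeddings and sampling stay local.
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

model_name = "petals-team/StableBeluga2"  # example name from the Petals docs
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("A quick test of distributed inference:",
                   return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=16)
print(tokenizer.decode(outputs[0]))
```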
- Run 100B+ language models at home, BitTorrent‑style
I'm not entirely sure how the approach they're using works [0], but I study federated learning, and one of the highly-cited survey papers has several chapters (5 and 6 in particular) addressing potential attacks, failure modes, and bias [1].
0: https://github.com/learning-at-home/hivemind
1: https://arxiv.org/abs/1912.04977
- SETI@home Is in Hibernation
The Hivemind project is just that
https://github.com/learning-at-home/hivemind
What are some alternatives?
VQGAN-CLIP - Just playing with getting VQGAN+CLIP running locally, rather than having to use colab.
replika-research - Replika.ai Research Papers, Posters, Slides & Datasets
CLIP-Guided-Diffusion - Just playing with getting CLIP Guided Diffusion running locally, rather than having to use colab.
GLM-130B - GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)
dalle-mini - DALL·E Mini - Generate images from a text prompt
Super-SloMo - PyTorch implementation of Super SloMo by Jiang et al.
big-sleep - A simple command line tool for text to image generation, using OpenAI's CLIP and a BigGAN. Technique was originally created by https://twitter.com/advadnoun
alpa - Training and serving large-scale neural networks with auto parallelization.
gpt-3 - GPT-3: Language Models are Few-Shot Learners
mesh-transformer-jax - Model parallel transformers in JAX and Haiku
DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
HiveMind-core - Join the OVOS collective, utils for OpenVoiceOS mesh networking