- box: Discontinued framework [Moved to: https://github.com/0-5788719150923125/vtx] (by 0-5788719150923125)
- hivemind: Decentralized deep learning in PyTorch. Built to train models on thousands of volunteers across the world.
The other project I've recently spent a lot of time on is, in large part, a chat bot. It uses Hugging Face's Transformers for inference, Lightning AI for training, and Docker to automate nearly every task. At every level, it was designed for the desktop user (small models, a single GPU). The bot can connect to Discord, Telegram, Twitch, Reddit, Twitter, and the Source (of course!)
Recently, I added support for Petals to this bot. I was hoping to experiment with prefix-tuning of smaller models (like BLOOM-560m), but nobody is hosting them, and I don't have enough VRAM to host them myself! I would greatly appreciate some additional GPUs here.
I really hope you'll join me, if only for the Petals support! A single docker-compose.yml file is all we need for now. If we can find enough people willing to host some smaller models, perhaps we could expand into Hivemind and create our own custom foundation model one day?
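As a rough sketch of what that single file might look like, here is a minimal docker-compose.yml for hosting a Petals server. The image name, command, and port are assumptions based on Petals' published Docker usage, not the author's actual config; check the Petals README for the current invocation.

```yaml
# Hypothetical docker-compose.yml for serving a slice of a small model
# on the public Petals swarm. Image, command, and port are assumptions.
version: "3.8"
services:
  petals-server:
    image: learningathome/petals:main
    # Serve blocks of BLOOM-560m to the swarm (model name is illustrative)
    command: python -m petals.cli.run_server bigscience/bloom-560m
    ports:
      - "31330:31330"
    deploy:
      resources:
        reservations:
          devices:
            # Expose one NVIDIA GPU to the container
            - driver: nvidia
              count: 1
              capabilities: [gpu]
```

With a file like this, contributing a GPU should be a single `docker compose up -d` away, which is the whole point of the ask above.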