| | petals | boinc |
|---|---|---|
| Mentions | 98 | 213 |
| Stars | 8,684 | 1,918 |
| Growth | 1.5% | 1.0% |
| Activity | 8.3 | 9.6 |
| Last commit | 5 days ago | 1 day ago |
| Language | Python | PHP |
| License | MIT License | GNU Lesser General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
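The activity number is only described as recency-weighted. As a rough illustration (the half-life value and the exponential-decay scoring function here are hypothetical, not the site's published formula), such a score could be computed like this:

```python
def activity_score(commit_ages_days, half_life_days=30.0):
    # Each commit contributes 0.5 ** (age / half_life): a commit made today
    # counts 1.0, one made a half-life ago counts 0.5, and so on.
    # Illustrative only -- the site's actual weighting is not published.
    return sum(0.5 ** (age / half_life_days) for age in commit_ages_days)

# Five recent commits outscore five much older ones.
recent = activity_score([1, 2, 3, 5, 8])
old = activity_score([200, 210, 220, 230, 240])
print(recent > old)  # True
```

Any monotonically decreasing weight would behave similarly; the exponential form just makes "recent commits have higher weight" concrete.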
petals
-
Mistral Large
So how long until we can do an open source Mistral Large?
We could make a start on Petals or some other open source distributed training network cluster possibly?
[0] https://petals.dev/
-
Distributed Inference and Fine-Tuning of Large Language Models over the Internet
You can check out their project at https://github.com/bigscience-workshop/petals
- Make no mistake—AI is owned by Big Tech
- Would you donate computation and storage to help build an open source LLM?
-
Run 70B LLM Inference on a Single 4GB GPU with This New Technique
There is already an implementation along the same line using the torrent architecture.
https://petals.dev/
-
Run LLMs in BitTorrent style
Check it out at Petals.dev.
- Is distributed computing dying, or just fading into the background?
-
Ask HN: Are there any projects currently exploring distributed AI training?
https://github.com/bigscience-workshop/petals
-
Mistral 7B: The Complete Guide to the Best 7B Model
https://github.com/bigscience-workshop/petals
Inference only: https://lite.koboldai.net/
- Run LLMs at home, BitTorrent‑style
boinc
-
Bitcoin Block 840000
The only way I can foresee a cryptocoin actually holding value is if spending the coin meant spending processing cycles and RAM doing things like this:
https://en.wikipedia.org/wiki/List_of_volunteer_computing_pr...
But in a more general sense, less like https://boinc.berkeley.edu/ and more like AWS...
Actually holding computing power in a distributed network is the only way for it to have value.
-
Distributed Inference and Fine-Tuning of Large Language Models over the Internet
Made me think of Gridcoin and BOINC https://boinc.berkeley.edu/
-
Have you ever donated your computing power with BOINC? Take 5 minutes to fill out the 2023 BOINC Census!
The BOINC Census is back for another year! BOINC is open source software and a network for volunteer computing. People can use it to donate their CPU/GPU power to various scientific research areas like cancer, drug discovery, mapping the galaxy, and more.
- Berkeley Open Infrastructure for Network Computing
-
Ask HN: What should I do with my leftover bandwidth?
A few years back, I was in a similar situation and found BOINC (https://boinc.berkeley.edu/) to be a great way to contribute. It's a platform that lets you support various scientific research projects by sharing your computational power and bandwidth. However, it's worth noting that BOINC tends to be more CPU/GPU-intensive than bandwidth-heavy.
- If you have a decent computer, you could contribute to science by installing Boinc. A couple of different projects are researching COVID cures.
-
It's never too late for Mapping the Mayo Way! Get crunching (mapping)!
Sign up or log in to the Milky Way MayoCoin team (CPU only) or the Einstein MayoCoin team (GPU and CPU) using a BOINC account. Use your Reddit or Discord username.
-
Ask HN: How can I make my idle CPU time useful to others?
Berkeley Open Infrastructure for Network Computing (BOINC)
https://boinc.berkeley.edu/
https://boinc.berkeley.edu/projects.php
It offers a unified management experience, with the ability to subscribe to various projects and set priorities/schedules for work units.
-
Scientific computing on a personal machine vs university resources
BOINC (https://boinc.berkeley.edu/) could probably be a good solution for you. You can send me a DM, and I can help you work out whether it would suit your research. By default, running your computations on BOINC requires setting up a server, but we can handle that and run your research on our own server first; that would help you start faster, and you can decide later whether you need a separate server. And yes - it's totally free.
- Boinc
What are some alternatives?
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
android - 📱 The ownCloud Android App
llama - Inference code for Llama models
pwnagotchi - (⌐■_■) - Deep Reinforcement Learning instrumenting bettercap for WiFi pwning.
alpaca-lora - Instruct-tune LLaMA on consumer hardware
fairgame - Tool to help us buy hard to find items.
GLM-130B - GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)
android - 📱 Nextcloud Android app
Auto-GPT - An experimental open-source attempt to make GPT-4 fully autonomous. [Moved to: https://github.com/Significant-Gravitas/Auto-GPT]
pyLoad - The free and open-source Download Manager written in pure Python
Open-Assistant - OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
openhab-android - openHAB client for Android