YaLM-100B vs boinc

| | YaLM-100B | boinc |
| --- | --- | --- |
| Mentions | 35 | 213 |
| Stars | 3,722 | 1,918 |
| Growth | 0.1% | 1.0% |
| Activity | 0.0 | 9.6 |
| Last commit | 10 months ago | 6 days ago |
| Language | Python | PHP |
| License | Apache License 2.0 | GNU Lesser General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
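The exact formula behind the activity number isn't given here; below is a minimal sketch of how such a recency-weighted, percentile-ranked score could be computed. The 30-day half-life and the two-project example population are illustrative assumptions, not the site's actual parameters.

```python
from datetime import datetime, timedelta, timezone

def raw_activity(commit_dates, half_life_days=30.0):
    """Recency-weighted commit count: each commit contributes a weight that
    halves every `half_life_days`, so recent commits dominate older ones.
    NOTE: illustrative sketch; the real metric's formula is unpublished."""
    now = datetime.now(timezone.utc)
    total = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        total += 0.5 ** (age_days / half_life_days)
    return total

def to_activity_scale(raw, all_raw):
    """Map a raw value to the 0..10 scale by percentile rank among all tracked
    projects, so a score of 9.0 means 'top 10% of projects' as described."""
    rank = sum(r <= raw for r in all_raw) / len(all_raw)
    return round(10.0 * rank, 1)

# Example: a project committing daily for two weeks vs. one whose last
# commit was ~10 months (300 days) ago.
now = datetime.now(timezone.utc)
busy = raw_activity([now - timedelta(days=i) for i in range(14)])
stale = raw_activity([now - timedelta(days=300)])
scores = [busy, stale]
print(to_activity_scale(busy, scores), to_activity_scale(stale, scores))
```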
YaLM-100B
- Elon Musk's Grok Exactly Echoes ChatGPT Responses: Identical Answers Raise Questions - EconoTimes
It's probably just open-source software/training sets repurposed... https://github.com/yandex/YaLM-100B
- OpenAI CEO suggests international agency like UN's nuclear watchdog could oversee AI
- A few less Googleable questions about local LLMs
There is a 100B model published under the Apache 2.0 license. Though there is no information about fine-tuning it or using it in 4-bit with something like llama.cpp. Trying to figure out how to try it without renting an extremely expensive GPU set. https://github.com/yandex/YaLM-100B
- Is it possible to use llama.cpp or create an Alpaca LoRA for the YaLM-100B model?
Hey everyone! I just discovered an open-source 100 billion parameter language model called YaLM, which is published under the Apache 2.0 license. The model is trained on more than 1 TB of Russian and English text. Here's the GitHub repo: https://github.com/yandex/YaLM-100B and an article explaining how it was trained: https://medium.com/yandex/yandex-publishes-yalm-100b-its-the-largest-gpt-like-neural-network-in-open-source-d1df53d0e9a6
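For a sense of why running this without an expensive GPU set is hard, and what 4-bit quantization buys, here is a quick back-of-the-envelope footprint calculation. It counts weights only; runtime overhead such as the KV cache and activations comes on top.

```python
# Rough memory footprint of a 100B-parameter checkpoint at various precisions.
# Weights only; KV cache and activations add more on top of these numbers.
PARAMS = 100e9

for name, bytes_per_param in [("fp16", 2.0), ("int8", 1.0), ("int4 (4-bit)", 0.5)]:
    weights_gb = PARAMS * bytes_per_param / 1e9
    print(f"{name:>12}: ~{weights_gb:,.0f} GB of weights")

# fp16 : ~200 GB -> consistent with the ~200 GB of disk the repo asks for
# int8 : ~100 GB
# int4 : ~ 50 GB -> still beyond a single consumer GPU, but within reach
#                   of a high-RAM machine doing CPU inference
```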
- Kandinsky 2.1 - a new open source text-to-image model
Yandex has already released an LLM: https://github.com/yandex/YaLM-100B
- Just another casualty...
So there is this open project, YaLM 100B: it requires 200 GB of disk space and was trained on 1.7 TB of text.
- There's a lot of news about American/European AI. Do we know anything about what China, India, Russia and other countries are up to?
- Suggestion. Chat mode.
You'd think so, but training a model like the one CAI uses would require a truly jaw-dropping amount of funds. That's why CAI is so suspicious, tbh. Just to give you an example, YaLM (100 billion parameters, which is probably less than CAI) took 65 days and 800 A100 graphics cards to train. 175 billion parameters would not cost 1.75 times more, because it's not a linear function; it would probably be 10x or even more. IIRC, "Open"AI could only afford to train GPT-3 a single time...
- Ask HN: Can I download GPT / ChatGPT to my desktop?
I don't much follow AI news beyond what I randomly happen to see on HN, but this might still be the largest open source model: https://github.com/yandex/YaLM-100B . There's discussion of it here: https://old.reddit.com/r/MachineLearning/comments/vpn0r1/d_h... - at the bottom of that page is a comment from someone who actually ran it in the cloud.
- [Rant] Siri is beyond horrendous and it’s even worse than ever
Hilariously, Yandex Alisa runs circles around it, because it's not just a collection of gimmicks but has an actual 100B-class language model (YaLM, open-sourced) at its core, plus lots of decent engineering. It's helpful, skillful, and feels alive, almost like ChatGPT.
boinc
- Bitcoin Block 840000
The only way I can foresee a cryptocoin actually holding value is if spending the coin meant spending processing cycles and RAM doing things like this:
https://en.wikipedia.org/wiki/List_of_volunteer_computing_pr...
But in a more general sense, less like https://boinc.berkeley.edu/ and more like AWS...
It's the only way to have value: actually holding computing power in a distributed network.
- Distributed Inference and Fine-Tuning of Large Language Models over the Internet
Made me think of Gridcoin and BOINC https://boinc.berkeley.edu/
- Have you ever donated your computing power with BOINC? Take 5 minutes to fill out the 2023 BOINC Census!
The BOINC Census is back for another year! BOINC is open-source software and a network for volunteer computing. People can use it to donate their CPU/GPU power to various scientific research areas like cancer, drug discovery, mapping the galaxy, and more.
- Berkeley Open Infrastructure for Network Computing
- Ask HN: What should I do with my leftover bandwidth?
A few years back, I was in a similar situation and found BOINC (https://boinc.berkeley.edu/) to be a great way to contribute. It's a platform that lets you support various scientific research projects by sharing your computational power and bandwidth. However, it's worth noting that BOINC tends to be more CPU/GPU-intensive than bandwidth-heavy.
- If you have a decent computer, you could contribute to science by installing Boinc. A couple of different projects are researching COVID cures.
- It's never too late for Mapping the Mayo Way! Get crunching (mapping)!
Sign up or log in to the Milky Way MayoCoin team (CPU only) and Einstein MayoCoin team (GPU and CPU) using a BOINC account. Use your Reddit or Discord username.
- Ask HN: How can I make my idle CPU time useful to others?
Berkeley Open Infrastructure for Network Computing (BOINC)
https://boinc.berkeley.edu/
https://boinc.berkeley.edu/projects.php
It has a unified management experience, with the ability to subscribe to various projects and set priorities/schedules for work units; see the sketch after this entry.
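For the command-line side of that management experience, the BOINC client ships a `boinccmd` tool. Here is a minimal sketch of driving it from Python, assuming `boinccmd` is on PATH and a BOINC client is running locally; the project URL and account key are placeholders, and your install's exact options are listed by `boinccmd --help`.

```python
import subprocess

def boinccmd(*args):
    """Thin wrapper around the boinccmd CLI that ships with the BOINC client."""
    result = subprocess.run(
        ["boinccmd", *args], capture_output=True, text=True, check=True
    )
    return result.stdout

# Attach to a project (URL plus the account key from the project's website).
# boinccmd("--project_attach", "https://einsteinathome.org/", "YOUR_ACCOUNT_KEY")

# List current work units and their progress (requires a running client).
print(boinccmd("--get_tasks"))

# Suspend/resume a project to prioritize another project's work units.
# boinccmd("--project", "https://einsteinathome.org/", "suspend")
# boinccmd("--project", "https://einsteinathome.org/", "resume")
```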
- Scientific computing on a personal machine vs university resources
BOINC (https://boinc.berkeley.edu/) could probably be a good solution for you. You can send me a DM, and I can help you clarify whether this is something that could help with your research. By default, to run your computations on BOINC you need to create a server, but we can deal with that and run your research on our own server first; that could help you start faster, and you can decide later whether you need a separate server. And yes - it's totally free.
- Boinc
What are some alternatives?
gpt-neox - An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
android - 📱 The ownCloud Android App
SLIDE
pwnagotchi - (⌐■_■) - Deep Reinforcement Learning instrumenting bettercap for WiFi pwning.
NeMo - A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Automatic Speech Recognition and Text-to-Speech)
fairgame - Tool to help us buy hard to find items.
mesh-transformer-jax - Model parallel transformers in JAX and Haiku
android - 📱 Nextcloud Android app
YaLM-100B - Pretrained language model with 100B parameters
pyLoad - The free and open-source Download Manager written in pure Python
ClickHouse - ClickHouse® is a free analytics DBMS for big data
openhab-android - openHAB client for Android