manim
metaseq
| | manim | metaseq |
|---|---|---|
| Mentions | 144 | 53 |
| Stars | 57,952 | 6,386 |
| Growth | - | 1.0% |
| Activity | 8.6 | 6.2 |
| Latest commit | 13 days ago | 7 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
manim
-
3Blue1Brown: Visualizing Attention, a Transformer's Heart
That is definitely one of the things he does better than most. He actually wrote a custom library for math animations: https://github.com/3b1b/manim
-
Where Is Noether's Principle in Machine Learning?
Not quite what you're looking for, but worth pointing out that Grant Sanderson of 3Blue1Brown has published the "framework" he uses for his math videos on GitHub.
https://github.com/3b1b/manim
-
3Blue1Brown Calculus Blog Series
3b1b uses a python library for creating those videos.
https://github.com/3b1b/manim
-
Animating High School Maths Curriculum
Manim, 3b1b's animation library is open source: https://github.com/3b1b/manim
-
Why do people think animation involves a ton of coding?
Coming to motion design, this rumour takes off because there are programming libraries like Manim and Motion-Canvas which are actually used to generate animations from code. You can search for the 3Blue1Brown channel on YouTube.
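The "animations from code" idea can be sketched in a few lines of plain Python. This is a toy, not manim's actual API: a hypothetical `animate` helper tweens an object's attribute toward a target over a fixed number of frames, which is roughly the job a scene's play call does in a real animation library.

```python
class Dot:
    """Minimal stand-in for an animatable object."""
    def __init__(self, x=0.0):
        self.x = x

def animate(obj, attr, target, n_frames=30):
    """Yield intermediate values, linearly interpolating obj.attr to target."""
    start = getattr(obj, attr)
    for i in range(1, n_frames + 1):
        alpha = i / n_frames          # progress in [0, 1]
        value = start + (target - start) * alpha
        setattr(obj, attr, value)     # mutate the object frame by frame
        yield value

dot = Dot(0.0)
frames = list(animate(dot, "x", 2.0, n_frames=4))
# frames == [0.5, 1.0, 1.5, 2.0]; a renderer would draw one image per value
```

Real libraries add easing curves, a renderer, and a scene graph on top, but the core is exactly this: deterministic frames computed from code rather than drawn by hand.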
- Do you know any small YouTubers in the style of Micode?
-
Stickman fucks around with math and finds out
It kinda looks like this: https://github.com/3b1b/manim, but that would be a crazy usage of it. Wondering if they’re compositing Manim with a more traditional animation suite.
-
Online classes in china 🔥
Probably used this for the animation: https://github.com/3b1b/manim
- Material python
-
What language for creating mathematical modeling program?
3blue1brown has had success with their tool manim, which uses Python.
metaseq
-
Training great LLMs from ground zero in the wilderness as a startup
This is a super important issue that affects the pace and breadth of iteration of AI almost as much as the raw hardware improvements do. The blog is fun but somewhat shallow and not technical or very surprising if you’ve worked with clusters of GPUs in any capacity over the years. (I liked the perspective of a former googler, but I’m not sure why past colleagues would recommend Jax over pytorch for LLMs outside of Google.) I hope this newco eventually releases a more technical report about their training adventures, like the PDF file here: https://github.com/facebookresearch/metaseq/tree/main/projec...
- Chronicles of OPT Development
-
See the pitch memo that raised €105M for four-week-old startup Mistral
The number of people who can actually pre-train a true LLM is very small.
It remains a major feat with many tweaks and tricks. Case in point: the 114 pages of the OPT-175B logbook [1]
[1] https://github.com/facebookresearch/metaseq/blob/main/projec...
- Technology: "Austro-ChatGPT" – but no money for testing
- OPT (Open Pre-trained Transformers) is a family of NLP models trained on billions of tokens of text obtained from the internet
- Current state-of-the-art open source LLM
-
Elon Musk Buys Ten Thousand GPUs for Secretive AI Project
Reliability at scale: take a look at the OPT training log book for their 175B model run. It needed a lot of babysitting. In my experience, that scale of TPU training run requires a restart about once every 1-2 weeks—and they provide the middleware to monitor the health of the cluster and pick up on hardware failures.
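The babysitting pattern the comment describes (crash, detect, restart from the last checkpoint) can be sketched as a toy loop. This is not metaseq's actual middleware; the checkpoint file format and the `fail_at` knob are invented for the illustration:

```python
import json
import os

def train(total_steps, ckpt_path, fail_at=None):
    """Toy training loop that resumes from a checkpoint after a crash."""
    step = 0
    if os.path.exists(ckpt_path):
        with open(ckpt_path) as f:
            step = json.load(f)["step"]   # resume where we left off
    while step < total_steps:
        step += 1                          # stand-in for one real training step
        with open(ckpt_path, "w") as f:
            json.dump({"step": step}, f)   # real runs checkpoint far less often
        if fail_at is not None and step == fail_at:
            raise RuntimeError("simulated hardware failure")
    return step

# A supervisor process would catch the failure and relaunch:
ckpt = "opt_toy.ckpt"
if os.path.exists(ckpt):
    os.remove(ckpt)
try:
    train(100, ckpt, fail_at=42)
except RuntimeError:
    pass                                   # "hardware failure" at step 42
resumed = train(100, ckpt)                 # picks up at step 42, finishes at 100
```

At 175B-parameter scale the same shape applies, except checkpoints are hundreds of gigabytes and the "supervisor" also has to diagnose which node failed.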
-
Is AI Development more fun than Software Development?
I really appreciated this log from Facebook's large language model training for showing how troublesome AI development can be: https://github.com/facebookresearch/metaseq/tree/main/projects/OPT/chronicles
-
Visual ChatGPT
Stable Diffusion will run on any decent gaming GPU or a modern MacBook, meanwhile LLMs comparable to GPT-3/ChatGPT have had pretty insane memory requirements - e.g., <https://github.com/facebookresearch/metaseq/issues/146>
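The memory gap the comment alludes to is easy to ballpark: weights alone at fp16 precision take parameter-count × 2 bytes, before counting activations or optimizer state. A quick sketch (the ~1B figure for a Stable-Diffusion-class model is a rough assumption):

```python
def weight_memory_gb(n_params, bytes_per_param=2):
    """Approximate memory for model weights alone (fp16 = 2 bytes/param)."""
    return n_params * bytes_per_param / 1e9

gpt3_like = weight_memory_gb(175e9)   # 350.0 GB, far beyond any single gaming GPU
sd_like = weight_memory_gb(1e9)       # 2.0 GB, fits comfortably on consumer hardware
```

That two-orders-of-magnitude difference is why GPT-3-class inference needed multi-GPU serving while Stable Diffusion runs on a laptop.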
-
Ask HN: Is There On-Call in ML?
It seems so, check this log book from Meta: https://github.com/facebookresearch/metaseq/blob/main/projec...
What are some alternatives?
geogebra - GeoGebra apps (mirror)
stable-diffusion - A latent text-to-image diffusion model
reanimate - Haskell library for building declarative animations based on SVG graphics
gpt-2 - Code for the paper "Language Models are Unsupervised Multitask Learners"
Tools-to-Design-or-Visualize-Architecture-of-Neural-Network - Tools to Design or Visualize Architecture of Neural Network
nlp-resume-parser - NLP-powered, GPT-3 enabled Resume Parser from PDF to JSON.
matplotplusplus - Matplot++: A C++ Graphics Library for Data Visualization 📊🗾
GLM-130B - GLM-130B: An Open Bilingual Pre-Trained Model (ICLR 2023)
NumPy - The fundamental package for scientific computing with Python.
cupscale - Image Upscaling GUI based on ESRGAN
jupyter-manim - manim cell magic for IPython/Jupyter to show the output video
ChatGPT.el - ChatGPT in Emacs