CogVideo vs jax

| | CogVideo | jax |
|---|---|---|
| Mentions | 39 | 82 |
| Stars | 3,512 | 28,174 |
| Growth | 1.6% | 2.4% |
| Activity | 2.4 | 10.0 |
| Latest commit | 11 months ago | 6 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
CogVideo
-
InstructPix2Pix Video: "Turn the wave into trash"
Additionally, two open source demo models, [CogVideo](https://github.com/THUDM/CogVideo) by a group of CS students and a model by [Antonia Antonova](https://antonia.space/text-to-video-generation), have presented their own innovative methods of generating video from text.
-
Effortpost: The Future Of Media Synthesis and AI Art
The second thing that will happen is the appearance of AI video and audio. Google has shown two programs for video generation, one which is fairly high quality and the other which can make long videos with several scenes. Meta has also demonstrated their own. We've already seen other projects like CogVideo, as well as many others that are currently being worked on. It's likely that these techniques will become so refined that over the next year or two, they'll have a boom similar to image generation programs. And eventually, they'll have a similar application in video editing, once coherence is adequate. Select a person's shirt, and it stays that way for the remainder of the scene. Change an actor's hairstyle in real time, or add characters that didn't exist into a scene and let the computer figure out the desired level of realism. This'll revolutionize VFX to a degree where making an effects-heavy film will be less about wrangling complex toolsets and more about making aesthetic choices of style and placement.
- AI Content Generation, Part 1: Machine Learning Basics
- Can we please make a general update on all the "most important" news/repos available?
-
Stable Diffusion Public Release – Stability.ai
Check out https://github.com/THUDM/CogVideo - progress is being made on coherent video generation.
Characters and dialogue are effectively solved; just look at GPT-3.
The entity behind StableDiffusion is also supporting generative music art, so let's see what is coming out of that: https://www.harmonai.org/
We are currently far away from generating a production quality movie with AI, but I don't think it's going to be nearly as long as a lifetime. In my opinion, we'll have high quality AI shorts within the decade.
-
How far away are we from having AI like DALL-E 2 be able to create other media like 3D models or video?
CogVideo and a CogView web app.
-
Does training transformers on large corpuses of music files have some hidden difficulty which makes it impossible?
A better comparison to AI music generation would be video generation, which has not improved much since I saw the first examples some years ago. The latest iteration is stuff like CogVideo, and it is only able to generate 4-second videos with moderate-to-strong artifacts.
-
[R] CogVideo: Large-scale Pretraining for Text-to-Video Generation via Transformers + Gradio Web Demo
github: https://github.com/THUDM/CogVideo
- CogVideo: Code and 9.4B Model for Text-to-Video Generation via Transformers
-
CogVideo (text-to-video) model, code, and demo are available
GitHub repo.
jax
-
The Elements of Differentiable Programming
The dual numbers exist just as surely as the real numbers and have been used for well over 100 years.
https://en.m.wikipedia.org/wiki/Dual_number
Pytorch has had them for many years.
https://pytorch.org/docs/stable/generated/torch.autograd.for...
JAX implements them and uses them exactly as stated in this thread.
https://github.com/google/jax/discussions/10157#discussionco...
As you so eloquently stated, "you shouldn't be proclaiming things you don't actually know on a public forum," and doubly so when your claimed "corrections" are so demonstrably and totally incorrect.
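For readers following along, here is a minimal sketch of the dual-number mechanism in JAX via jax.jvp (forward-mode autodiff); the function `f` is made up for illustration:

```python
import jax
import jax.numpy as jnp

# Forward-mode autodiff propagates (primal, tangent) pairs,
# which is exactly dual-number arithmetic.
def f(x):
    return jnp.sin(x) * x ** 2

# Value of f at x = 1.5 and its derivative in the direction 1.0.
primal_out, tangent_out = jax.jvp(f, (1.5,), (1.0,))
print(primal_out)   # f(1.5)
print(tangent_out)  # f'(1.5) = cos(1.5) * 1.5**2 + 2 * 1.5 * sin(1.5)
```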
-
Julia GPU-based ODE solver 20x-100x faster than those in Jax and PyTorch
On your last point, as long as you jit the topmost level, it doesn't matter whether or not you have inner jitted functions. The end result should be the same.
Source: https://github.com/google/jax/discussions/5199#discussioncom...
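A small sketch of that nesting behavior (toy functions, not from the linked thread):

```python
import jax
import jax.numpy as jnp

@jax.jit
def inner(x):
    return jnp.tanh(x) + 1.0

@jax.jit
def outer(x):
    # Tracing `outer` inlines `inner`, so XLA compiles one fused
    # program; the inner jit neither helps nor hurts here.
    return inner(x) * 2.0

print(outer(jnp.arange(4.0)))
```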
-
Apple releases MLX for Apple Silicon
The design of MLX is inspired by frameworks like NumPy, PyTorch, Jax, and ArrayFire.
-
MLPerf training tests put Nvidia ahead, Intel close, and Google well behind
I'm still not totally sure what the issue is. Jax uses program transformations to compile programs to run on a variety of hardware, for example, using XLA for TPUs. It can also run CUDA ops for Nvidia GPUs without issue: https://jax.readthedocs.io/en/latest/installation.html
There is also support for custom C++ and CUDA ops if that's what is needed: https://jax.readthedocs.io/en/latest/Custom_Operation_for_GP...
I haven't worked with float4, but I can imagine that new numerical types would require some special handling. But I assume that's the case for any ML environment.
But really, you probably mean fixed-point 4-bit integer types? It looks like at least some work has been done on that in Jax: https://github.com/google/jax/issues/8566
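As a rough sketch of that hardware portability (illustrative only, no MLPerf code implied):

```python
import jax
import jax.numpy as jnp

# The same jitted code is compiled by XLA for whichever backend
# is present: 'cpu', 'gpu' (CUDA/ROCm), or 'tpu'.
print(jax.default_backend())
print(jax.devices())

x = jnp.ones((1024, 1024))
y = jax.jit(lambda a: a @ a)(x).block_until_ready()
print(y.shape)
```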
-
MatX: Efficient C++17 GPU numerical computing library with Python-like syntax
> Are they even comparing apples to apples to claim that they see these improvements over NumPy?
> While the code complexity and length are roughly the same, the MatX version shows a 2100x speedup over the Numpy version, and is over 4x faster than the CuPy version on the same GPU.
NumPy doesn't use the GPU by default unless you use something like Jax [1] to compile NumPy code to run on GPUs. I think a more honest comparison would mainly compare MatX running on the same CPU as NumPy, and focus the GPU comparison against CuPy.
[1] https://github.com/google/jax
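To make the distinction concrete, here is a hedged sketch of NumPy-style code that JAX compiles for a GPU when one is available (`normalize` is just an example function):

```python
import jax
import jax.numpy as jnp

@jax.jit
def normalize(x):
    # Plain NumPy-style array math, staged out through XLA.
    return (x - x.mean()) / x.std()

x = jnp.linspace(0.0, 1.0, 1_000_000)
print(normalize(x)[:3])
print(jax.devices())  # shows whether a GPU/TPU backend was used
```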
-
JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
Actually that never changed. The README has always had an example of differentiating through native Python control flow:
https://github.com/google/jax/commit/948a8db0adf233f333f3e5f...
The constraints on control flow expressions come from jax.jit (because Python control flow can't be staged out) and jax.vmap (because we can't take multiple branches of Python control flow, which we might need to do for different batch elements). But autodiff of Python-native control flow works fine!
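A minimal sketch of that distinction (the function is made up for illustration):

```python
import jax

def f(x):
    # Python-native control flow: fine for jax.grad, but jax.jit
    # would fail here because the branch can't be staged out.
    if x > 0:
        return 3.0 * x ** 2
    else:
        return -x

print(jax.grad(f)(2.0))   # 12.0, via the x > 0 branch
print(jax.grad(f)(-1.0))  # -1.0, via the other branch
```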
-
Julia and Mojo (Modular) Mandelbrot Benchmark
For a similar "benchmark" (also Mandelbrot) that took place in a Jax repo discussion: https://github.com/google/jax/discussions/11078#discussionco...
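Not the code from the linked discussion, but a minimal JAX escape-time Mandelbrot sketch to show what such a benchmark typically looks like:

```python
import jax
import jax.numpy as jnp

@jax.jit
def mandelbrot(re, im):
    # Escape-time iteration over a complex grid, 50 fixed steps.
    c = re[None, :] + 1j * im[:, None]

    def body(_, state):
        z, count = state
        z = z * z + c
        return z, count + (jnp.abs(z) <= 2.0)

    init = (jnp.zeros_like(c), jnp.zeros(c.shape, jnp.int32))
    _, count = jax.lax.fori_loop(0, 50, body, init)
    return count

grid = jnp.linspace(-2.0, 2.0, 256)
print(mandelbrot(grid, grid).sum())
```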
-
Functional Programming 1
2. https://github.com/fantasyland/fantasy-land (A bit heavy on jargon)
Note there is a Python version of Ramda available on PyPI, and there are a lot of FP tidbits inside JAX:
3. https://pypi.org/project/ramda/ (Worth making your own version if you want to learn, though)
4. For nested data, JAX tree_util is epic: https://jax.readthedocs.io/en/latest/jax.tree_util.html and also their curry implementation is funny: https://github.com/google/jax/blob/4ac2bdc2b1d71ec0010412a32...
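As a quick, hedged illustration of the tree_util point in item 4 (the nested parameter structure below is made up):

```python
import jax
import jax.numpy as jnp

params = {
    "layer1": {"w": jnp.ones((2, 2)), "b": jnp.zeros(2)},
    "layer2": [jnp.ones(3), jnp.full(3, 2.0)],
}

# tree_map applies a pure function to every leaf of an
# arbitrarily nested dict/list/tuple structure.
scaled = jax.tree_util.tree_map(lambda leaf: 0.1 * leaf, params)
print(jax.tree_util.tree_structure(scaled))
```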
Anyway don’t put FP on a pedestal, main thing is to focus on the core principles of avoiding external mutation and making helper functions. Doesn’t always work because some languages like Rust don’t have legit support for currying (afaik in 2023 August), but in those cases you can hack it with builder methods to an extent.
Finally, if you want to understand the middle of the midwit meme, check out this wiki article and connect the free monoid to the Kleene star (0 or more copies of your pattern) and Kleene plus (1 or more copies of your pattern). Those are also in regex so it can help you remember the regex symbols. https://en.wikipedia.org/wiki/Free_monoid?wprov=sfti1
The simplest example might be {0}^*, in which case the empty string "" is included (because we use *, which allows zero or more copies).
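Spelling that connection out in notation (standard definitions only, nothing beyond the wiki article):

```latex
% Free monoid on the one-letter alphabet {0}: the Kleene star
% admits the empty word, the Kleene plus does not.
\{0\}^* = \{\varepsilon,\ 0,\ 00,\ 000,\ \dots\}
\qquad
\{0\}^+ = \{0,\ 00,\ 000,\ \dots\} = \{0\}^* \setminus \{\varepsilon\}
```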
-
Best Way to Learn JAX
Hello! I'm trying to learn JAX over the next couple of weeks. Ideally, I want to be comfortable with using it for projects after about 3 weeks to a month, although I understand that may not be realistic. I currently have experience with PyTorch and TensorFlow. How should I go about learning JAX? Is there a specific YouTube tutorial or online course I should use, or should I just use the tutorial on https://jax.readthedocs.io/? Any information, advice, or experience you can share would be much appreciated!
- Codon: Python Compiler
What are some alternatives?
stable-diffusion-ui - Easiest 1-click way to install and use Stable Diffusion on your computer. Provides a browser UI for generating images from text prompts and images. Just enter your text prompt, and see the generated image. [Moved to: https://github.com/easydiffusion/easydiffusion]
Numba - NumPy aware dynamic Python compiler using LLVM
dalle-playground - A playground to generate images from any text prompt using Stable Diffusion (past: using DALL-E Mini)
functorch - functorch is JAX-like composable function transforms for PyTorch.
stable-diffusion - Optimized Stable Diffusion modified to run on lower GPU VRAM
julia - The Julia Programming Language
stable-diffusion-webui-feature-showcase - Feature showcase for stable-diffusion-webui
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
stable-diffusion - A latent text-to-image diffusion model
Cython - The most widely used Python to C compiler
imagen-pytorch - Implementation of Imagen, Google's Text-to-Image Neural Network, in Pytorch
jax-windows-builder - A community supported Windows build for jax.