pyp vs DALLE-pytorch

| | pyp | DALLE-pytorch |
|---|---|---|
| Mentions | 5 | 20 |
| Stars | 1,366 | 5,493 |
| Growth | - | - |
| Activity | 6.3 | 2.5 |
| Latest commit | 2 months ago | 3 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pyp
-
Modern Linux Tools vs. Unix Classics: Which Would I Choose?
> I too can never remember jq syntax when I need to. I usually just end up writing a Python script
Same here! That's why for small things I made pyxargs [1] to use Python in the shell. In another thread I also just learned of pyp [2], which I haven't tried yet but looks like it'd be even better for this use case.
[1] https://github.com/elesiuta/pyxargs
[2] https://github.com/hauntsaninja/pyp
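For a sense of why pyp can stand in for jq or awk in simple cases, here is a minimal sketch in the style of its README (assuming pyp is installed and on PATH; `x` and `lines` are pyp's magic variables for the current line and for all of stdin):

```sh
# first three characters of every filename
ls | pyp 'x[:3]'

# aggregate over the whole input: sum a column of numbers
seq 1 10 | pyp 'sum(map(int, lines))'
```

The last expression is printed automatically, so there is no explicit print() to remember.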
-
Shshsh is a bridge that connects Python and shell
I have bookmarked/tried so many Python/Shell mashups over the years.
IMHO the following is about the only one that's tasteful and not going off the deep end: https://github.com/hauntsaninja/pyp
-
Easily handle CLI operations via Python instead of regular Bash programs
I wrote a similar tool a while back that lets you create your own "magic" variables. I use `f` all the time! https://github.com/hauntsaninja/pyp#pyp-lets-you-configure-y...
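For context, the linked section describes pyp's config hook, which works roughly as sketched below: point PYP_CONFIG_PATH at a Python file, and anything defined there can be used by name in pyp expressions. The helper `shout` is invented here for illustration; the `f` mentioned above belongs to the commenter's own tool, not to pyp.

```sh
# hypothetical config file with a helper that pyp can pick up by name
export PYP_CONFIG_PATH=~/.pyp_config.py
cat > "$PYP_CONFIG_PATH" <<'EOF'
def shout(s: str) -> str:
    return s.upper() + "!"
EOF

ls | pyp 'shout(x)'   # e.g. README.MD!, SETUP.PY!, ...
```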
-
A Tour of the Oil Language
Thank you for the extensive and thoughtful comment! This does help clarify your approach quite considerably. I wonder, since you are hoping to attract collaborators, whether there is some kind of formal spec for the language somewhere? For example, you mentioned parallel efforts: suppose I wanted to write a port to pure C; is there any way, short of reading every one of your posts and trying to contain the whole language in my head at once, for me to know exactly what I need to implement?
Something I've been trying to figure out: what is the exact relationship at present between OSH and Oil? When you say "OSH" do you mean the language, or the shell itself "oil shell"? If Oil is not something I can download, why exactly does that `const v = max(1, 2)` statement work in osh? It's clearly not just a Bash implementation, it's got other features. Is that a subset of Oil's features? Which subset?
Since you're also interested in other shells, you might have a look at pyp [1]. It captures a lot of the way I personally would like to use some future shell. If the features of pyp were integrated into the shell itself, you wouldn't need an external command, you could just (for example) pipe the output of one program into a python-like statement that mangles the incoming strings in some way, and pipe that out to some xargs-like program to use in a subshell. (The fact that you apparently can't use the pipe in what Xonsh calls "Python mode" is for me the central limiting feature of that shell.)
[1] https://github.com/hauntsaninja/pyp
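As a rough sketch of the pipeline shape described above, with pyp standing in for the hypothetical built-in "python-like statement" stage (the filenames and downstream command are made up):

```sh
# mangle incoming strings with a Python expression, then hand them to an xargs-like stage
find . -name '*.jpg' | pyp 'x.replace(".jpg", ".png")' | xargs -n1 echo would-convert
```

If a shell absorbed this, the middle stage would be an inline expression instead of an external command.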
-
9 Command-Line Tools to Go to Infinity & Beyond
9. Pyp
DALLE-pytorch
-
The Eleuther AI Mafia
It all started originally on lucidrains/dalle-pytorch in the months following the release of DALL-E (1). The group started as `dalle-pytorch-replicate` but was never officially "blessed" by Phil Wang who seems to enjoy being a free agent (can't blame him).
https://github.com/lucidrains/DALLE-pytorch/issues/116 is where the Discord got kicked off originally. There are a lot of other interactions between us on GitHub there. You should be able to find when Phil was approached by Jenia Jitsev, Jan Ebert, and Mehdi Cherti (all founding LAION members), who graciously offered the chance to replicate the DALL-E paper using their available compute on the JUWELS and JUWELS Booster HPC systems. This all predates Emad's arrival. I believe he showed up around the time of guided diffusion and GLIDE, but it may have been a bit earlier.
Data work originally focused on amassing several of the bigger datasets of the time. Getting CC12M downloaded and trained on was something of an early milestone (robvanvolt's work). A lot of early work was like that though, shuffling through CC12M, COCO, etc. with the dalle-pytorch codebase until we got an avocado armchair.
Christoph Schuhmann was an early contributor as well and great at organizing and rallying. He focused a lot on the early data scraping work for what would become the "LAION-5B" dataset. I don't want to credit him with the coding, and I'm ashamed to admit I can't recall who did much of the work there - but a distributed scraping program was developed (the name was something@home... not scraping@home?).
The discord link on Phil Wang's readme at dalle-pytorch got a lot of traffic and a lot of people who wanted to pitch in with the scraping effort.
Eventually a lot of people from Eleuther and many other teams mingled with us; some sort of non-profit org was created in Germany, I believe, for legal purposes. The dataset continued to grow and the group moved from training DALLEs to fine-tuning diffusion models.
The `CompVis` team was a great inspiration at the time, and much of their work on VQGAN and then latent diffusion models basically kept us motivated. As I mentioned, a personal motivation was Katherine Crowson's work on a variety of things like CLIP-guided VQGAN, diffusion, etc.
I believe Emad Mostaque showed up around the time GLIDE was coming out? I want to say he donated money for scrapers to be run on AWS to speed up data collection. I was largely hands off for much of the data scraping process and mostly enjoyed training new models on data we had.
As with any online community, things got pretty ill-defined, roles changed over, volunteers came and went, etc. I would hardly call this definitive, and that's at least partially the reason it's hard to trace as an outsider. That much of the early history is scattered about GitHub issues and PRs can't have helped, though.
-
Thoughts on AI image generators from text
Here you go: https://github.com/lucidrains/DALLE-pytorch
-
[P] DALL·E Mini & Mega demo and production API
Here are some other implementations of Dalle clones in Pytorch by various authors in the ML and DL community: https://github.com/lucidrains/DALLE-pytorch
- New text-to-image network from Google beats DALL-E
-
[Project] DALL-3 - generate better images with fewer tokens through clip guided diffusion
If in general DDPM > GAN > VAE, why do transformer image generators all use VQVAE to decode images? Wouldn't it be better to use a diffusion model? I was wondering about this and started experimenting with different ways to decode vector-quantized embeddings with a diffusion model - see the discussion here. After a lot of trial and error I got something that works pretty well.
- Still waiting for dall-e
-
Ask HN: Computer Vision Project Ideas?
- "Discrete VAE", used as the backbone for OpenAI's DALL-E, reimplimented here (and other places) https://github.com/lucidrains/DALLE-pytorch (code for training a discrete VAE)
-
Crawling@Home: Help Build The World's Largest Image-Text Pair Dataset!
Here's the DALLE-pytorch git repo.
-
(from the discord stream) I'm so hyped for this game. This generation is really good.
I am very excited. When AI Dungeon was released and I saw them filtering stuff, I thought that one day there would be an open source version of this without filters; the same goes for any future open-sourced GPT-X. Now imagine if we could also train an open source DALL-E and integrate it into NovelAI. Wouldn't that be even more awesome?
-
When was the last time you were as happy about something as a child?
Maybe with https://github.com/lucidrains/DALLE-pytorch and https://github.com/kobiso/DALLE-reproduction
What are some alternatives?
InquirerPy - :snake: Python port of Inquirer.js (A collection of common interactive command-line user interfaces)
DALL-E - PyTorch package for the discrete VAE used for DALL·E.
Pawky - The Python version of awk
DALLE2-pytorch - Implementation of DALL-E 2, OpenAI's updated text-to-image synthesis neural network, in Pytorch
shyaml - YAML for command line
deep-daze - Simple command line tool for text to image generation using OpenAI's CLIP and Siren (Implicit neural representation network). Technique was originally created by https://twitter.com/advadnoun
DALLE-datasets - This is a summary of easily available datasets for generalized DALLE-pytorch training.
Command-line-text-processing - :zap: From finding text to search and replace, from sorting to beautifying text and more :art:
imagen-pytorch - Implementation of Imagen, Google's Text-to-Image Neural Network, in Pytorch
theme.sh - A script which lets you set your $terminal theme.
CoCa-pytorch - Implementation of CoCa, Contrastive Captioners are Image-Text Foundation Models, in Pytorch