x-transformers vs euporie

|  | x-transformers | euporie |
|---|---|---|
| Mentions | 10 | 20 |
| Stars | 4,147 | 1,453 |
| Growth | - | - |
| Activity | 8.7 | 9.7 |
| Last commit | 3 days ago | 4 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
x-transformers
- x-transformers
- GPT-4 architecture: what we can deduce from research literature
- Doubt about transformers
- The GPT Architecture, on a Napkin

It is all documented here, in writing and in code: https://github.com/lucidrains/x-transformers

You will want to use rotary embeddings if you do not need length extrapolation.
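To make that concrete, here is a minimal sketch of enabling rotary embeddings with x-transformers, following the `TransformerWrapper`/`Decoder` pattern from the project README; all sizes below are placeholder values, not recommendations:

```python
from x_transformers import TransformerWrapper, Decoder

# Decoder-only model with rotary positional embeddings switched on.
# num_tokens, max_seq_len, dim, depth and heads are illustrative values.
model = TransformerWrapper(
    num_tokens = 20000,         # vocabulary size
    max_seq_len = 1024,         # maximum sequence length
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        rotary_pos_emb = True   # rotary embeddings instead of absolute positions
    )
)
```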
- [R] DeepMind's Gato: a generalist learning agent

It is just a single transformer encoder, so just use https://github.com/lucidrains/x-transformers with ff_glu set to True.
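For reference, that flag is set the same way as in the previous sketch; a placeholder-sized example using the `Encoder` variant and the documented `ff_glu` option:

```python
from x_transformers import TransformerWrapper, Encoder

# Encoder-only transformer with the GLU feedforward variant enabled.
# All sizes are placeholders.
model = TransformerWrapper(
    num_tokens = 256,
    max_seq_len = 1024,
    attn_layers = Encoder(
        dim = 512,
        depth = 6,
        heads = 8,
        ff_glu = True    # gated linear unit feedforward, as suggested above
    )
)
```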
- [D] Transformer sequence generation - is it truly quadratic scaling?

However, I've come across the concept of key/value caching in transformer decoders recently (e.g. Figure 3 here): because each output (and hence each input, since the model is autoregressive) only depends on previous outputs (inputs), we don't need to re-compute the key and value vectors for all t < t_i at timestep i of the sequence. My intuition leads me to believe, then, that (unconditioned) inference for a decoder-only model uses an effective sequence length of 1 (the most recently produced token is the only input that requires computation), making attention a linear-complexity operation per decoding step. This thinking seems to be validated by this GitHub issue, and this paper (2nd paragraph of the Introduction).
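A minimal, self-contained sketch of one cached attention step; the function name, cache layout, and shapes here are hypothetical, not x-transformers' actual caching API:

```python
import torch

def attend_step(x_t, W_q, W_k, W_v, cache):
    """One autoregressive step. x_t is (batch, dim): the newest token only.

    Keys/values for earlier timesteps are read from the cache rather than
    recomputed, so each step costs O(t) instead of O(t^2).
    """
    q = x_t @ W_q                         # query for the new token: (batch, dim)
    cache['k'].append(x_t @ W_k)          # cache this step's key ...
    cache['v'].append(x_t @ W_v)          # ... and value
    K = torch.stack(cache['k'], dim=1)    # (batch, t, dim), reused across steps
    V = torch.stack(cache['v'], dim=1)
    scores = (q.unsqueeze(1) @ K.transpose(1, 2)) / K.shape[-1] ** 0.5
    attn = torch.softmax(scores, dim=-1)  # (batch, 1, t)
    return (attn @ V).squeeze(1)          # output for the new token: (batch, dim)

# Usage: cache = {'k': [], 'v': []}; call attend_step once per generated token.
```

Note that generating n tokens still does O(n²) work in total; caching makes each individual step linear, which is the point the quote is making.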
- [D] Sudden drop in loss after hours of no improvement - is this a thing?

The Project - Model: The primary architecture consists of a CNN with a transformer encoder and decoder. At first I used my own implementation of self-attention, but since it was not converging I switched to the x-transformers implementation by lucidrains, as it includes improvements from many papers. The objective is simple: the CNN encoder converts images to a high-level representation and feeds it to the transformer encoder for information flow; finally, a transformer decoder decodes the text character by character with an autoregressive loss. After two weeks of trying different things, training never converged within the first hour, which is the usual mark I use to validate whether a model is learning.
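A hypothetical sketch of that kind of image-to-text pipeline on top of x-transformers; the class name, CNN backbone, and all sizes are made up for illustration, and the cross-attention wiring follows the `cross_attend=True` pattern from the x-transformers README:

```python
import torch
from torch import nn
from x_transformers import TransformerWrapper, Encoder, Decoder

class ImageToText(nn.Module):
    """Toy CNN -> transformer encoder -> autoregressive character decoder."""

    def __init__(self, vocab_size = 100, dim = 256):
        super().__init__()
        self.cnn = nn.Sequential(                     # stand-in CNN backbone
            nn.Conv2d(3, dim, 3, stride = 2, padding = 1), nn.ReLU(),
            nn.Conv2d(dim, dim, 3, stride = 2, padding = 1),
        )
        self.encoder = Encoder(dim = dim, depth = 4, heads = 8)
        self.decoder = TransformerWrapper(
            num_tokens = vocab_size,
            max_seq_len = 256,
            attn_layers = Decoder(dim = dim, depth = 4, heads = 8, cross_attend = True),
        )

    def forward(self, images, char_tokens):
        feats = self.cnn(images)                      # (b, dim, h, w) feature map
        feats = feats.flatten(2).transpose(1, 2)      # (b, h*w, dim) token sequence
        context = self.encoder(feats)                 # information flow over patches
        return self.decoder(char_tokens, context = context)  # next-char logits
```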
- Hacker News top posts: May 9, 2021

X-Transformers: A fully-featured transformer with experimental features (25 comments)
- [D] Theoretical papers on transformers? (or attention mechanism, or just seq2seq?)

One thing I've looked at is the fact that there's no obvious reason to distinguish between W_K and W_Q in the formulation of a transformer, as far as I can tell. However, if you build a transformer where you merge the two matrices, it still learns, but not as well. You can try out the code here. The training loss can be seen here, though we aborted the run because of how poorly it was doing.
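One way to see why tying might hurt, offered as a hedge rather than an established result: with W_Q = W_K, the pre-softmax score between positions i and j is (x_i W)(x_j W)^T, a symmetric bilinear form, so the logit matrix is symmetric and purely asymmetric attention patterns become inexpressible. A hypothetical single-head module showing the tied variant:

```python
import torch
from torch import nn

class TiedQKAttention(nn.Module):
    """Single-head self-attention with the query and key projections merged.

    Illustrative only: with q = k, the score matrix (xW)(xW)^T is symmetric,
    which constrains the attention patterns the layer can express.
    """

    def __init__(self, dim):
        super().__init__()
        self.qk = nn.Linear(dim, dim, bias = False)   # shared W_Q = W_K
        self.v = nn.Linear(dim, dim, bias = False)

    def forward(self, x):                             # x: (batch, seq, dim)
        q = k = self.qk(x)
        scores = q @ k.transpose(-2, -1) / x.shape[-1] ** 0.5
        return torch.softmax(scores, dim = -1) @ self.v(x)
```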
euporie
- I'm building a new web browser

Currently it's part of euporie-notebook, but I'm planning on splitting it out and publishing the web browser as an independent project.
- VT330/VT340 Sixel Graphics

You can get most of the way there with euporie: https://github.com/joouha/euporie

I don't support audio yet, but it should be possible using DECPS escape sequences.
- UnicodePlots

If you use euporie [1], you can draw plots in a Jupyter notebook in the terminal using matplotlib and friends, and have them displayed using terminal graphics.

[1] https://github.com/joouha/euporie
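Nothing euporie-specific is needed in the cell itself; a minimal sketch, assuming a terminal and euporie setup with graphics support:

```python
# Run inside a euporie notebook cell; euporie renders the figure using
# terminal graphics (e.g. sixel or kitty) where the terminal supports it.
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(0, 2 * np.pi, 200)
plt.plot(x, np.sin(x))
plt.title("sine wave, displayed inline in the terminal")
plt.show()
```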
- Xonsh kernel for Jupyter

Now with xontrib-jupyter you can use the xonsh language in the web-based Jupyter Notebook, JupyterLab, and the terminal-based Euporie.
- Neovim workflow for machine learning / data scientist. Struggling with jupyter notebooks.

https://github.com/joouha/euporie in a separate terminal works fine for me.
- data science (jupyter notebooks) with vim?

Why synchronize if you can stay in the terminal?
- euporie - Jupyter notebooks in the terminal
- CLIs and TUIs packages

I would like a Rust lib to build a terminal UI like this: https://i.imgur.com/d5mo8ce.png - that's euporie (https://github.com/joouha/euporie), implemented in Python using prompt_toolkit - it's very pretty and even the mouse works...
- Ask HN: Those making $0/month or less on side projects – Show and tell
I'm working on a TUI Jupyter Notebook editor, euporie, which allows you to run and edit Jupyter Notebooks in the terminal.
https://github.com/joouha/euporie
It's useful for editing and running notebooks on remote servers over SSH, or inside containers where setting up port forwarding is not possible or too difficult, or if you just like working in the terminal.
It's open-source, and I have no idea how I would go about monetizing it!
I've spent a lot of time recently working on euporie's HTML renderer, which I'm planning on using to make a new terminal web-browser.
- I have reached Vim nirvana
If people are looking for a more JupyterLab like environment for the terminal, you could try euporie [1] (I am the author).
It supports vim- and emacs-style key bindings, and can display rich cell output like images and widgets.
[1] https://github.com/joouha/euporie
What are some alternatives?
EasyOCR - Ready-to-use OCR with 80+ supported languages and all popular writing scripts including Latin, Chinese, Arabic, Devanagari, Cyrillic, etc.
jupyter-vim-binding - Jupyter meets Vim. Vimmer will fall in love.
TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
jupynium.nvim - Selenium-automated Jupyter Notebook that is synchronised with NeoVim in real-time.
flamingo-pytorch - Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of Deepmind, in Pytorch
jupyter-kernel.nvim - Get (IPython) Jupyter kernel completion suggestions and object inspection into Neovim.
DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
ttyplot - a realtime plotting utility for terminal/console with data input from stdin
memory-efficient-attention-pytorch - Implementation of a memory efficient multi-head attention as proposed in the paper, "Self-attention Does Not Need O(n²) Memory"
vimpyter - Edit your Jupyter notebooks in Vim/Neovim
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch
SpecBAS - An enhanced Sinclair BASIC interpreter for modern PCs