x-transformers vs crawl

| | x-transformers | crawl |
|---|---|---|
| Mentions | 10 | 659 |
| Stars | 4,147 | 2,213 |
| Growth | - | 0.5% |
| Activity | 8.7 | 10.0 |
| Latest commit | 3 days ago | 2 days ago |
| Language | Python | C++ |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub.
Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
x-transformers
- x-transformers
- GPT-4 architecture: what we can deduce from research literature
- Doubt about transformers
- The GPT Architecture, on a Napkin
It is all documented here, in writing and in code: https://github.com/lucidrains/x-transformers
You will want to use rotary embeddings if you do not need length extrapolation.
- [R] Deepmind's Gato: a generalist learning agent
It is just a single transformer encoder, so just use https://github.com/lucidrains/x-transformers with ff_glu set to True.
- [D] Transformer sequence generation - is it truly quadratic scaling?
However, I've recently come across the concept of key/value caching in transformer decoders (e.g. Figure 3 here): because each output (and hence each input, since the model is autoregressive) depends only on previous outputs, we don't need to recompute the key and value vectors for all t < t_i at timestep i of the sequence. My intuition, then, is that (unconditioned) inference for a decoder-only model has an effective sequence length of 1 (the most recently produced token is the only input that requires new computation), making attention a linear-complexity operation per step. This thinking seems to be validated by this GitHub issue and this paper (2nd paragraph of the Introduction).
- [D] Sudden drop in loss after hours of no improvement - is this a thing?
The Project - Model: The primary architecture consists of a CNN with a transformer encoder and decoder. At first I used my own implementation of self-attention, but since it was not converging I switched to the x-transformers implementation by lucidrains, as it includes improvements from many papers. The objective is simple: the CNN encoder converts images to a high-level representation and feeds it to the transformer encoder for information flow. Finally, a transformer decoder tries to decode the text character by character with an autoregressive loss. After two weeks of trying different things, training still did not converge within the first hour, which is the usual mark I use to validate whether a model is learning.
- Hacker News top posts: May 9, 2021
X-Transformers: A fully-featured transformer with experimental features (25 comments)
- [D] Theoretical papers on transformers? (or attention mechanism, or just seq2seq?)
One thing I've looked at is that, as far as I can tell, there's no obvious reason to distinguish between W_K and W_Q in the formulation of a transformer: the attention logits depend only on the product W_Q W_K^T. However, if you build a transformer where you merge the two matrices, it doesn't learn as well; it still learns, but noticeably worse. You can try out the code here. The training loss can be seen here, though we aborted the run because of how poorly it was doing.
crawl
- Slay the Spire 2 Announced – Using Godot
It's probably not as rigorous as what you're thinking of but the devs of DCSS have cited online win rates of certain combinations as the impetus for balance changes before.
https://crawl.develz.org
- The Mana World Classic – Open-Source MMORPG
In a similar vein, see Dungeon Crawl Stone Soup, a free and open-source roguelike that's been continuously developed by volunteers for 20 years: http://crawl.develz.org/
- Trog
They are entirely too humanoid. I mean, look at these splash screens: Kiku, Ignis, Chei. While those images shouldn't be taken as canonical, they at least demonstrate the general inhumanity of the Crawl pantheon.
- Games you can play for 20+ hours and not get bored?
Since you like turn-based games too, try some old-school roguelikes. Many are open-source freeware so you have nothing to lose but time. I've been playing Dungeon Crawl Stone Soup for almost 10 years.
- Any suggestions for a beginner roguelike? Something that's not infuriating
Since Brogue's already been mentioned, I'd add Dungeon Crawl Stone Soup and Tales of Maj'Eyal as pretty beginner-friendly games.
- New Oka should have the option to refuse gifts, like refusing Ru sacrifices
For people who don't follow trunk, Okawaru's gifting has been changed (and arguably nerfed):
- Game to Play at Work
My personal first-timer recommendation? Maybe NetHack or Dungeon Crawl?
- Early thoughts on the new shapeshifter (transmuter) mechanics
- Games without a hunger mechanic.
- What is your favourite open source game(s)?
Dungeon Crawl Stone Soup - Free traditional roguelike with fair mechanics and a lot of variety between species/skill/god choices (~25 gods, and maybe with the exception of Sif/Veh and Oka/Trog, they are very distinct). The tiles are great. There are many developers and they are very welcoming of code or vault contributions. Reducing incentives to play tediously is one of the design goals. Easily hundreds of hours of gameplay for free. Playable online (connecting to a server through your browser/terminal) or offline (terminal or tiles version). There have been win streaks of 50+ games with a variety of species/background combos, so you know it's mostly fair (it IS possible for the RNG to give you an unwinnable game), but it's very difficult if your goal is to win every game.
What are some alternatives?
EasyOCR - Ready-to-use OCR with 80+ supported languages and all popular writing scripts, including Latin, Chinese, Arabic, Devanagari, Cyrillic, etc.
seed-search - Utilities to catalog and search data for Dungeon Crawl Stone Soup dungeon generation seeds
TimeSformer-pytorch - Implementation of TimeSformer from Facebook AI, a pure attention-based solution for video classification
angband - A free, single-player roguelike dungeon exploration game
flamingo-pytorch - Implementation of 🦩 Flamingo, state-of-the-art few-shot visual question answering attention net out of Deepmind, in Pytorch
Cataclysm-DDA - Cataclysm - Dark Days Ahead. A turn-based survival game set in a post-apocalyptic world.
DALLE-pytorch - Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
NetHack - Official NetHack Git Repository
memory-efficient-attention-pytorch - Implementation of a memory efficient multi-head attention as proposed in the paper, "Self-attention Does Not Need O(n²) Memory"
BrogueCE - Brogue: Community Edition - a community-led fork of the much-loved minimalist roguelike game
performer-pytorch - An implementation of Performer, a linear attention-based transformer, in Pytorch
SpecBAS - An enhanced Sinclair BASIC interpreter for modern PCs