RASP
An interpreter for RASP as described in the ICML 2021 paper "Thinking Like Transformers" (by tech-srl)
tracr
By google-deepmind
| | RASP | tracr |
|---|---|---|
| Mentions | 3 | 2 |
| Stars | 267 | 467 || Growth | 4.1% | 1.1% |
| Activity | 6.5 | 7.8 |
| Last commit | 2 months ago | 3 months ago |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
RASP
Posts with mentions or reviews of RASP.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-06-15.
- Thinking Like Transformers
there's also an interpreter for RASP as described in the paper :) https://github.com/tech-srl/RASP
Sasha's blog (your link) has a nice walkthrough of long addition with RASP though!
I'm not sure if it's from the authors of the paper, but this appears to be that: https://github.com/tech-srl/RASP
- [D] How to truly understand attention mechanism in transformers?
tracr
Posts with mentions or reviews of tracr.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-06-15.
- Thinking Like Transformers
To quote someone: RASP is like Matlab, designed by Satan.
There is an interpreter for a RASP like language if you want to try it out: https://srush.github.io/raspy/
And deepmind published a compiler from RASP to Transformer weights: https://github.com/deepmind/tracr
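The RASP primitives these interpreters and tracr's compiler work with can be sketched in a few lines of plain Python. This is an illustrative toy only, not the tech-srl interpreter's actual API or tracr's: the function names `select` and `aggregate` and the averaging behavior follow the paper's description, but everything else here is an assumption for demonstration.

```python
# Toy sketch of RASP's two core primitives, select and aggregate.
# Illustrative only -- not the tech-srl or tracr API.

def select(keys, queries, predicate):
    """Boolean selection matrix: row q marks the keys k where predicate(k, q) holds."""
    return [[predicate(k, q) for k in keys] for q in queries]

def aggregate(selection, values):
    """For each query position, average the values at the selected key positions."""
    out = []
    for row in selection:
        picked = [v for chosen, v in zip(row, values) if chosen]
        out.append(sum(picked) / len(picked) if picked else 0)
    return out

# Example: reversing a sequence by selecting the mirrored index.
tokens = [3, 1, 4, 1, 5]
n = len(tokens)
indices = list(range(n))
flip = select(indices, indices, lambda k, q: k == n - 1 - q)
reversed_tokens = aggregate(flip, tokens)
# reversed_tokens == [5.0, 1.0, 4.0, 1.0, 3.0] (floats, since aggregate averages)
```

The point of the exercise, as the paper argues, is that each select/aggregate pair corresponds to an attention head, so programs written this way map onto transformer layers.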
- 🚨 Deepmind Open Sources Tracr: A Tool for Compiling Human-Readable Code to the Weights of a Transformer Model
Quick Read: https://www.marktechpost.com/2023/03/02/deepmind-open-sources-tracr-a-tool-for-compiling-human-readable-code-to-the-weights-of-a-transformer-model/
Paper: https://arxiv.org/pdf/2301.05062.pdf
GitHub: https://github.com/deepmind/tracr
What are some alternatives?
When comparing RASP and tracr you can also consider the following projects:
pytorch-seq2seq - Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.