RWKV-LM

RWKV is an RNN with transformer-level LLM performance that can be trained directly like a GPT (parallelizable). It thus combines the best of the RNN and the transformer: strong performance, fast inference, VRAM savings, fast training, "infinite" context length (ctx_len), and free sentence embeddings.
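To make the "RNN at inference time" claim concrete, below is a minimal sketch of the per-channel WKV recurrence at the heart of RWKV (RWKV-4 style): each step keeps only a constant-size running state, which is why inference is fast and memory-light. This is an illustrative simplification, not RWKV-LM's actual implementation; the function name is hypothetical, and the exponentials are left unstabilized (the real kernels subtract a running maximum for numerical safety).

```python
import numpy as np

def wkv_recurrent(k, v, w, u):
    """Naive RWKV-4-style WKV recurrence (illustrative, not numerically stabilized).

    k, v : (T, C) key and value sequences
    w    : (C,)  per-channel decay rate (positive)
    u    : (C,)  per-channel bonus applied to the current token
    Returns a (T, C) output; runs in O(T) with O(C) state per step.
    """
    T, C = k.shape
    num = np.zeros(C)            # running exp-weighted sum of values
    den = np.zeros(C)            # running sum of exp weights
    out = np.zeros((T, C))
    for t in range(T):
        # the current token receives an extra bonus u before mixing
        cur = np.exp(u + k[t])
        out[t] = (num + cur * v[t]) / (den + cur)
        # decay the state, then absorb the current token (without bonus)
        num = np.exp(-w) * num + np.exp(k[t]) * v[t]
        den = np.exp(-w) * den + np.exp(k[t])
    return out
```

Because `out[t]` depends on the past only through `num` and `den`, generation needs no growing key/value cache, unlike a transformer; during training, the same quantity can instead be computed for all timesteps in parallel, which is what makes GPT-style training possible.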
