RWKV-LM
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), combining the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
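The "fast inference" and "infinite" ctx_len claims follow from the RNN formulation: a fixed-size state is updated once per token, so per-token cost and memory stay constant no matter how long the context grows. The toy sketch below (hypothetical names, not the real RWKV cell) shows only the shape of that computation:

```python
# Toy sketch of RNN-style inference: a fixed-size state is folded forward
# one token at a time. This is NOT the actual RWKV update rule, just an
# illustration of why per-token cost does not grow with context length.

def rnn_step(state: float, token: int, mix: float = 0.9) -> float:
    """Fold one token into a fixed-size running state (toy update rule)."""
    return mix * state + (1.0 - mix) * float(token)

def run_context(tokens) -> float:
    state = 0.0
    for t in tokens:  # one O(1) step per token; state size never grows
        state = rnn_step(state, t)
    return state
```

After a million tokens the state occupies the same memory as after ten, whereas a transformer's attention cache grows with every token it has seen.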
RWKV model inference: https://github.com/BlinkDL/ChatRWKV (fast CUDA).
See https://github.com/BlinkDL/RWKV-LM for details on the RWKV Language Model (100% RNN).
Q8_0 models: only for https://github.com/saharNooby/rwkv.cpp (fast CPU).