memory-efficient-attention-pytorch
Discontinued. Implementation of memory-efficient multi-head attention, as proposed in the paper "Self-attention Does Not Need O(n²) Memory".
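The core idea of the paper is to process keys and values in chunks while maintaining a running max and running softmax normalizer, so the full n×m attention matrix is never materialized. A minimal NumPy sketch of that chunked accumulation (function name, shapes, and chunk size are illustrative, not the repository's actual API):

```python
import numpy as np

def chunked_attention(q, k, v, chunk_size=2):
    """Single-head attention computed over key/value chunks.

    q: (n, d) queries; k, v: (m, d) keys/values.
    Memory per step is O(n * chunk_size) instead of O(n * m).
    """
    n, d = q.shape
    scale = d ** -0.5
    out = np.zeros((n, v.shape[1]))
    running_max = np.full((n, 1), -np.inf)   # running row-wise max of scores
    running_sum = np.zeros((n, 1))           # running softmax normalizer

    for start in range(0, k.shape[0], chunk_size):
        k_c = k[start:start + chunk_size]
        v_c = v[start:start + chunk_size]
        scores = (q @ k_c.T) * scale                      # (n, chunk)
        chunk_max = scores.max(axis=1, keepdims=True)
        new_max = np.maximum(running_max, chunk_max)
        # rescale previously accumulated output/normalizer to the new max
        correction = np.exp(running_max - new_max)
        exp_scores = np.exp(scores - new_max)
        out = out * correction + exp_scores @ v_c
        running_sum = running_sum * correction + exp_scores.sum(axis=1, keepdims=True)
        running_max = new_max

    return out / running_sum
```

The rescaling by `correction` is the log-sum-exp trick applied incrementally: it keeps the result numerically identical to ordinary softmax attention regardless of how the keys are chunked.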
Check out these efficient attention mechanisms, which are almost drop-in replacements: efficient attention, flash attention.