Flash-attention-jax Alternatives
Similar projects and alternatives to flash-attention-jax
- EfficientZero — Open-source codebase for EfficientZero, from "Mastering Atari Games with Limited Data" (NeurIPS 2021).
- CodeRL — Official code for the paper "CodeRL: Mastering Code Generation through Pretrained Models and Deep Reinforcement Learning" (NeurIPS 2022).
- XMem — [ECCV 2022] Long-term video object segmentation with an Atkinson-Shiffrin memory model.
- DeepSpeed — A deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
- msn — Masked Siamese Networks for Label-Efficient Learning (https://arxiv.org/abs/2204.07141)
flash-attention-jax reviews and mentions
- [D] Most important AI papers this year so far in my opinion + Proto AGI speculation at the end
  FlashAttention: Fast and Memory-Efficient Exact Attention with IO-Awareness. Paper: https://arxiv.org/abs/2205.14135 GitHub: https://github.com/HazyResearch/flash-attention and https://github.com/lucidrains/flash-attention-jax
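The core idea of the FlashAttention paper cited above is to compute exact attention block by block with a running softmax, so the full (n x n) score matrix is never materialized. The NumPy sketch below illustrates that "online softmax" trick in isolation; it is a simplified illustration of the technique, not code from either linked repository, and the function names are my own.

```python
import numpy as np

def naive_attention(q, k, v):
    # Reference implementation: materializes the full (n, n) score matrix.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def blockwise_attention(q, k, v, block_size=64):
    # Streams over key/value blocks, keeping only a running row-max and
    # row-sum so past contributions can be rescaled as new blocks arrive.
    # This is the "online softmax" idea at the heart of FlashAttention;
    # the real kernels additionally tile queries and fuse everything on-chip.
    n, d = q.shape
    scale = 1.0 / np.sqrt(d)
    out = np.zeros_like(q)
    row_max = np.full((n, 1), -np.inf)
    row_sum = np.zeros((n, 1))
    for start in range(0, k.shape[0], block_size):
        kb = k[start:start + block_size]
        vb = v[start:start + block_size]
        s = (q @ kb.T) * scale                    # partial scores, (n, block)
        new_max = np.maximum(row_max, s.max(axis=-1, keepdims=True))
        correction = np.exp(row_max - new_max)    # rescale earlier blocks
        p = np.exp(s - new_max)
        row_sum = row_sum * correction + p.sum(axis=-1, keepdims=True)
        out = out * correction + p @ vb
        row_max = new_max
    return out / row_sum

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((256, 32)) for _ in range(3))
assert np.allclose(blockwise_attention(q, k, v), naive_attention(q, k, v), atol=1e-6)
```

Because the rescaling is exact, the blockwise result matches standard attention to floating-point precision while needing only O(block_size * n) score memory at a time.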
Stats
lucidrains/flash-attention-jax is an open-source project licensed under the MIT License, an OSI-approved license.