TransformerEngine Alternatives
Similar projects and alternatives to TransformerEngine
- Whisper: High-performance GPGPU inference of OpenAI's Whisper automatic speech recognition (ASR) model (by Const-me)
- autocvd: Tool to automatically set CUDA_VISIBLE_DEVICES based on GPU utilization. Usable from the command line and from code.
- warp-drive: Extremely fast end-to-end deep multi-agent reinforcement learning framework on a GPU (JMLR 2022)
- nanoGPT: The simplest, fastest repository for training/finetuning medium-sized GPTs.
- liberate-fhe: A Fully Homomorphic Encryption (FHE) library for bridging the gap between theory and practice, with a focus on performance and accuracy.
- Pytorch: Tensors and dynamic neural networks in Python with strong GPU acceleration
- FastFold (discontinued): Optimizing AlphaFold training and inference on GPU clusters (by hpcaitech)
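The autocvd entry above can be illustrated with a minimal sketch. This is not autocvd's actual implementation; the function name and the hardcoded sample string are hypothetical, standing in for output that a real run would capture from `nvidia-smi --query-gpu=index,utilization.gpu --format=csv,noheader,nounits` via `subprocess`.

```python
import os

def pick_least_utilized_gpu(smi_output: str) -> int:
    """Return the index of the GPU with the lowest utilization,
    given CSV output of the form "index, utilization" per line."""
    best_idx, best_util = None, None
    for line in smi_output.strip().splitlines():
        idx_s, util_s = (field.strip() for field in line.split(","))
        idx, util = int(idx_s), int(util_s)
        if best_util is None or util < best_util:
            best_idx, best_util = idx, util
    return best_idx

# Hypothetical sample output: GPU 1 is the least busy of three devices.
sample = "0, 87\n1, 3\n2, 45"
chosen = pick_least_utilized_gpu(sample)

# Restrict CUDA to the chosen device, as autocvd's description suggests.
os.environ["CUDA_VISIBLE_DEVICES"] = str(chosen)
print(chosen)  # 1
```

Setting `CUDA_VISIBLE_DEVICES` before any CUDA context is created is what makes frameworks like PyTorch see only the selected device.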
TransformerEngine reviews and mentions
- GPUs for Deep Learning in 2023 – An In-depth Analysis
Would be curious to see your benchmarks. Btw, Nvidia will be providing fp8 support in a future release of CUDA: https://github.com/NVIDIA/TransformerEngine/issues/15
I think TMA may not matter as much for consumer cards given the disproportionate amount of fp32 / int32 compute that they have.
Would be interesting to see how close to theoretical folks are able to get once CUDA support comes through.
Stats
NVIDIA/TransformerEngine is an open-source project licensed under the Apache License 2.0, an OSI-approved license.
The primary programming language of TransformerEngine is Python.