Transformer-Models-from-Scratch VS nn

Compare Transformer-Models-from-Scratch vs nn and see what their differences are.

nn

🧑‍🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), GANs (cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠 (by lab-ml)
                 Transformer-Models-from-Scratch    nn
Mentions         1                                  26
Stars            58                                 47,503
Growth           -                                  7.6%
Activity         0.0                                7.7
Last commit      almost 2 years ago                 27 days ago
Language         Jupyter Notebook                   Jupyter Notebook
License          -                                  MIT License
The number of mentions indicates the total number of mentions that we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
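The growth and activity figures above can be illustrated with a small sketch. Note this is a hypothetical reconstruction for illustration only; the listing does not publish its actual formulas, and the function names and the rank-to-score mapping here are assumptions.

```python
def mom_growth(stars_last_month: int, stars_this_month: int) -> float:
    """Month-over-month star growth as a percentage (assumed definition)."""
    if stars_last_month == 0:
        return float("inf")  # no baseline to grow from
    return 100.0 * (stars_this_month - stars_last_month) / stars_last_month


def activity_score(rank: int, total_projects: int) -> float:
    """Map a project's activity rank (1 = most active) to a 0-10 score.

    Under this assumed mapping, a score of 9.0 or higher corresponds to
    the top 10% of tracked projects, matching the example in the text.
    """
    percentile = 1.0 - rank / total_projects
    return round(10.0 * percentile, 1)


# A project whose star count grew from 44,000 to 47,503 in a month:
growth = mom_growth(44_000, 47_503)   # about 8.0%

# A project ranked 100th most active out of 1,000 tracked projects:
score = activity_score(100, 1_000)    # 9.0 -> top 10%
```

The key point is that both numbers are relative: growth compares a project to its own past, while activity compares it to every other tracked project.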

Transformer-Models-from-Scratch

Posts with mentions or reviews of Transformer-Models-from-Scratch. We have used some of these posts to build our list of alternatives and similar projects.

nn

Posts with mentions or reviews of nn. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-01-09.

What are some alternatives?

When comparing Transformer-Models-from-Scratch and nn, you can also consider the following projects:

OpenNMT-py - Open Source Neural Machine Translation and (Large) Language Models in PyTorch

GFPGAN-for-Video-SR - A colab notebook for video super resolution using GFPGAN

pytorch-seq2seq - Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.

labml - 🔎 Monitor deep learning model training and hardware usage from your mobile phone 📱

ganbert-pytorch - Enhancing the BERT training with Semi-supervised Generative Adversarial Networks in Pytorch/HuggingFace

functorch - functorch is JAX-like composable function transforms for PyTorch.

emotion-classifier - An attention-based BiLSTM for emotion classification.

ZoeDepth - Metric depth estimation from a single image

tf-transformers - State-of-the-art faster Transformer with TensorFlow 2.0 (NLP, Computer Vision, Audio).

onnx-simplifier - Simplify your onnx model

Basic-UI-for-GPT-J-6B-with-low-vram - A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2000-token context, 3.5 GB for a 1000-token context). Model loading takes 12 GB of free RAM.

Behavior-Sequence-Transformer-Pytorch - A PyTorch implementation of the BST model from Alibaba: https://arxiv.org/pdf/1905.06874.pdf