MTR
- Rethinking Data Augmentation for Tabular Data in Deep Learning
Tabular data is the most widely used data format in machine learning (ML). While tree-based methods outperform DL-based methods in supervised learning, recent literature reports that self-supervised learning with Transformer-based models outperforms tree-based methods. In the existing literature on self-supervised learning for tabular data, contrastive learning is the predominant approach, and data augmentation is essential for generating the different views it requires. However, data augmentation for tabular data has been difficult due to its unique structure and high complexity. Moreover, existing methods propose three main components together: model architecture, self-supervised learning method, and data augmentation. As a result, previous works have compared overall performance without considering these components separately, and it is unclear how each component affects actual performance. In this study, we focus on data augmentation to address these issues. We propose a novel data augmentation method, $\textbf{M}$ask $\textbf{T}$oken $\textbf{R}$eplacement ($\texttt{MTR}$), which replaces a portion of each tokenized column with the mask token; $\texttt{MTR}$ exploits the properties of the Transformer, which is becoming the predominant DL-based architecture for tabular data, to perform data augmentation on each column embedding. Through experiments with 13 diverse public datasets in both supervised and self-supervised learning scenarios, we show that $\texttt{MTR}$ achieves competitive performance against existing data augmentation methods and improves model performance. In addition, we discuss the specific scenarios in which $\texttt{MTR}$ is most effective and identify the scope of its application. The code is available at https://github.com/somaonishi/MTR/.
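Because MTR operates directly on the column-wise token embeddings, a short sketch makes the mechanism concrete. The following is a minimal, hypothetical PyTorch sketch, not the authors' reference implementation (see https://github.com/somaonishi/MTR/ for that): the class name, the mask probability `p`, and the tensor shapes are illustrative assumptions. During training, each column's embedding is independently replaced with a learnable [MASK] embedding with probability `p`.

```python
# Minimal sketch of mask-token replacement for a tabular Transformer,
# assuming FT-Transformer-style feature tokenization: one embedding of
# size d_token per column. All names and defaults are assumptions.
import torch
import torch.nn as nn

class MaskTokenReplacement(nn.Module):
    def __init__(self, d_token: int, p: float = 0.15):
        super().__init__()
        self.p = p
        # One learnable [MASK] embedding shared across all columns.
        self.mask_token = nn.Parameter(torch.zeros(d_token))
        nn.init.normal_(self.mask_token, std=0.02)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        # tokens: (batch, n_columns, d_token), one embedding per column.
        if not self.training:
            return tokens
        # Per sample and per column, decide independently whether to
        # swap that column's embedding for the mask token.
        replace = torch.rand(tokens.shape[:2], device=tokens.device) < self.p
        return torch.where(replace.unsqueeze(-1), self.mask_token, tokens)

# Usage: apply after the feature tokenizer, before the Transformer blocks.
# tokens = tokenizer(x_num, x_cat)                  # (B, n_cols, d)
# tokens = MaskTokenReplacement(d)(tokens)          # augmented view
```

Keeping the augmentation at the embedding level is what lets a single mask token handle numerical and categorical columns uniformly, which is the Transformer property the abstract alludes to.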
rtdl
- [Project] Improving deep learning for tabular data with numerical embeddings (FT-Transformer)
Found relevant code at https://github.com/yandex-research/rtdl (a sketch of one such numerical embedding follows this list)
- [P] pytorch-widedeep v1.0.9: the Perceiver and the FastFormer for tabular data are now available in the library
- [P] pytorch-widedeep model alert: SAINT and the FT-Transformer are now available in the library
- [R] Revisiting Deep Learning Models for Tabular Data
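The "numerical embeddings" mentioned in the first post above are per-feature encodings of numerical columns, studied in the rtdl line of work (see rtdl-num-embeddings below). As a rough illustration, here is a minimal sketch of one such scheme, piecewise-linear encoding; the bin edges from training-set quantiles and the function name are assumptions for illustration, and the actual rtdl API differs.

```python
# Minimal sketch of piecewise-linear encoding for a single numerical
# feature, in the spirit of "On Embeddings for Numerical Features in
# Tabular Deep Learning". Not the library's API; names are assumptions.
import numpy as np

def piecewise_linear_encode(x: np.ndarray, edges: np.ndarray) -> np.ndarray:
    # x: (n,) feature values; edges: (n_bins + 1,) sorted bin boundaries.
    # Output: (n, n_bins). A bin is 1 if x lies past it, 0 if before it,
    # and a fractional fill for the bin that contains x.
    lo, hi = edges[:-1], edges[1:]
    frac = (x[:, None] - lo[None, :]) / (hi - lo)[None, :]
    return np.clip(frac, 0.0, 1.0)

# Example: bin edges taken from training-set quantiles of the feature.
train_x = np.random.default_rng(0).normal(size=1000)
edges = np.quantile(train_x, np.linspace(0.0, 1.0, 9))  # 8 bins
emb = piecewise_linear_encode(train_x[:5], edges)       # shape (5, 8)
```

The resulting vectors are typically fed through a per-feature linear layer to produce the column tokens consumed by a model such as the FT-Transformer.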
What are some alternatives?
Papers-in-100-Lines-of-Code - Implementation of papers in 100 lines of code.
tab-transformer-pytorch - Implementation of TabTransformer, an attention network for tabular data, in PyTorch
rtdl - Research on Tabular Deep Learning [Moved to: https://github.com/yandex-research/rtdl]
pytorch-widedeep - A flexible package for multimodal deep learning that combines tabular data with text and images using Wide and Deep models in PyTorch
tabular-dl-pretrain-objectives - Revisiting Pretraining Objectives for Tabular Deep Learning
100DaysofMLCode - My journey to learn and grow in the domain of Machine Learning and Artificial Intelligence by performing the #100DaysofMLCode Challenge. Now supported by bright developers adding their learnings :+1:
ArtLine - A Deep Learning based project for creating line art portraits.
best_AI_papers_2021 - A curated list of the latest breakthroughs in AI (in 2021) by release date with a clear video explanation, link to a more in-depth article, and code.
tabnet - PyTorch implementation of the TabNet paper: https://arxiv.org/pdf/1908.07442.pdf
rtdl-num-embeddings - (NeurIPS 2022) On Embeddings for Numerical Features in Tabular Deep Learning
creative-prediction - Creative Prediction with Neural Networks
rtdl-revisiting-models - (NeurIPS 2021) Revisiting Deep Learning Models for Tabular Data