Papers-in-100-Lines-of-Code VS MTR

Compare Papers-in-100-Lines-of-Code vs MTR and see what their differences are.

                Papers-in-100-Lines-of-Code    MTR
Mentions        3                              1
Stars           582                            9
Growth          -                              -
Activity        5.4                            4.9
Last commit     3 days ago                     7 months ago
Language        Python                         Python
License         MIT License                    MIT License
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects we track.

Papers-in-100-Lines-of-Code

Posts with mentions or reviews of Papers-in-100-Lines-of-Code. We have used some of these posts to build our list of alternatives and similar projects.

MTR

Posts with mentions or reviews of MTR. We have used some of these posts to build our list of alternatives and similar projects.
  • Rethinking Data Augmentation for Tabular Data in Deep Learning
    1 project | /r/BotNewsPreprints | 18 May 2023
    Tabular data is the most widely used data format in machine learning (ML). While tree-based methods outperform DL-based methods in supervised learning, recent literature reports that self-supervised learning with Transformer-based models outperforms tree-based methods. In the existing literature on self-supervised learning for tabular data, contrastive learning is the predominant method. In contrastive learning, data augmentation is important to generate different views. However, data augmentation for tabular data has been difficult due to the unique structure and high complexity of tabular data. In addition, three main components are proposed together in existing methods: model structure, self-supervised learning methods, and data augmentation. Therefore, previous works have compared the performance without comprehensively considering these components, and it is not clear how each component affects the actual performance. In this study, we focus on data augmentation to address these issues. We propose a novel data augmentation method, $\textbf{M}$ask $\textbf{T}$oken $\textbf{R}$eplacement ($\texttt{MTR}$), which replaces the mask token with a portion of each tokenized column; $\texttt{MTR}$ takes advantage of the properties of Transformer, which is becoming the predominant DL-based architecture for tabular data, to perform data augmentation for each column embedding. Through experiments with 13 diverse public datasets in both supervised and self-supervised learning scenarios, we show that $\texttt{MTR}$ achieves competitive performance against existing data augmentation methods and improves model performance. In addition, we discuss specific scenarios in which $\texttt{MTR}$ is most effective and identify the scope of its application. The code is available at https://github.com/somaonishi/MTR/.
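The augmentation the abstract describes, swapping a fraction of per-column token embeddings for a shared mask token, can be sketched as follows. This is a minimal NumPy illustration of the idea, not the repository's actual implementation; the function name, the zero-valued mask token, and the masking probability are all assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def mtr_augment(x, mask_token, p=0.2, rng=rng):
    """Mask-Token-Replacement-style augmentation (illustrative sketch).

    x          : (batch, n_cols, d) array of per-column token embeddings
    mask_token : (d,) embedding of the [MASK] token
    p          : probability of replacing each column token with the mask
    """
    mask = rng.random(x.shape[:2]) < p   # (batch, n_cols) boolean mask
    out = x.copy()
    out[mask] = mask_token               # swap the selected column tokens
    return out, mask

# Toy usage: 4 rows, 5 columns, 8-dimensional column embeddings.
x = rng.standard_normal((4, 5, 8))
mask_token = np.zeros(8)
aug, mask = mtr_augment(x, mask_token, p=0.3)
```

Two augmented views produced this way (with independent masks) could then serve as the "different views" the contrastive-learning setup in the abstract requires.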

What are some alternatives?

When comparing Papers-in-100-Lines-of-Code and MTR you can also consider the following projects:

EasyOCR - Ready-to-use OCR with 80+ supported languages and all popular writing scripts, including Latin, Chinese, Arabic, Devanagari, Cyrillic, etc.

rtdl - Research on Tabular Deep Learning (Python package & papers) [Moved to: https://github.com/Yura52/rtdl]

taichi-ngp-renderer - An Instant-NGP renderer implemented using Taichi

rtdl - Research on Tabular Deep Learning [Moved to: https://github.com/yandex-research/rtdl]

CelebV-HQ - [ECCV 2022] CelebV-HQ: A Large-Scale Video Facial Attributes Dataset

tabular-dl-pretrain-objectives - Revisiting Pretraining Objectives for Tabular Deep Learning

rtdl-num-embeddings - (NeurIPS 2022) On Embeddings for Numerical Features in Tabular Deep Learning

nni - An open source AutoML toolkit to automate the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyper-parameter tuning.

artbench - Benchmarking Generative Models with Artworks

pi-GAN-pytorch - Implementation of π-GAN, for 3d-aware image synthesis, in Pytorch

magic3d-pytorch - Implementation of Magic3D, Text to 3D content synthesis, in Pytorch

giraffe - This repository contains the code for the CVPR 2021 paper "GIRAFFE: Representing Scenes as Compositional Generative Neural Feature Fields"