Transformers4Rec vs NeuRec
| | Transformers4Rec | NeuRec |
|---|---|---|
| Mentions | 4 | 2 |
| Stars | 1,030 | 1,031 |
| Growth | 4.1% | - |
| Activity | 5.3 | 0.0 |
| Last commit | 2 days ago | about 1 year ago |
| Language | Python | Python |
| License | Apache License 2.0 | - |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
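One way to build intuition for "recent commits have higher weight" is an exponentially decayed commit count. The function below is purely illustrative (the tracker's real formula is not published here); the name `activity_score` and the half-life parameter are assumptions for the sketch:

```python
def activity_score(commit_ages_weeks, half_life=26):
    """Toy recency-weighted activity score: each commit counts for
    less the older it is, halving every `half_life` weeks.
    Illustrative only -- not the tracker's actual formula."""
    return sum(0.5 ** (age / half_life) for age in commit_ages_weeks)

recent = activity_score([1, 2, 3, 4])       # four commits in the last month
stale = activity_score([52, 60, 70, 80])    # four commits, all roughly a year old
print(recent > stale)  # True
```

Under any scheme like this, a project with a handful of fresh commits outscores one with the same number of year-old commits, which matches how the activity column above separates the two projects.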
Transformers4Rec
- New item prediction modules in open source libraries
Okay, do you think a recommendation engine for a final year project is too simple?
It's fine for a thesis project IMO! Recommendation is very much an active field with cool recent developments. If your supervisors are still sceptical you could try implementing one of the recent papers that apply transformers (like https://github.com/NVIDIA-Merlin/Transformers4Rec) or zone in on cold start problems in your domain.
- Show HN: Transformers4Rec - a new library for Transformers on Recommender Systems
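The core idea behind transformer-based recommenders like Transformers4Rec is next-item prediction: run self-attention over a session's item sequence and score every catalogue item as the next click. The sketch below illustrates that idea in plain NumPy with one causal attention layer; it is not Transformers4Rec's API, and all names (`next_item_scores`, the random embeddings, the toy catalogue size) are assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

n_items, d = 50, 16                               # toy catalogue size, embedding dim
E = rng.normal(size=(n_items, d)) / np.sqrt(d)    # one embedding per item
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def next_item_scores(session):
    """Score every catalogue item as the next click for a session
    (a list of item ids), using one causal self-attention layer."""
    X = E[session]                                # (t, d) sequence of embeddings
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    att = Q @ K.T / np.sqrt(d)                    # (t, t) attention logits
    mask = np.triu(np.ones_like(att), k=1).astype(bool)
    att[mask] = -np.inf                           # causal mask: no peeking ahead
    H = softmax(att) @ V                          # contextualised item states
    return H[-1] @ E.T                            # last state scored vs all items

scores = next_item_scores([3, 17, 42, 8])
print(scores.shape)  # (50,) -- one score per catalogue item
```

A real library stacks many such layers, learns the weights from click logs, and adds positional and side-feature embeddings; the masking trick is what keeps training causal so the model cannot look at future items in the session.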
NeuRec
What are some alternatives?
TabFormer - Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Artificial-Intelligence-Deep-Learning-Machine-Learning-Tutorials - A comprehensive list of Deep Learning / Artificial Intelligence and Machine Learning tutorials - rapidly expanding into areas of AI/Deep Learning / Machine Vision / NLP and industry specific areas such as Climate / Energy, Automotives, Retail, Pharma, Medicine, Healthcare, Policy, Ethics and more.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Ringer-Client - Ringer is a new messaging app for Windows. It's currently in beta, but let's work together to make it better. The code here is a snapshot, so it might not work as intended. To get the latest stable build, please download the installer.
BERT-pytorch - Google AI 2018 BERT pytorch implementation
kogpt - KakaoBrain KoGPT (Korean Generative Pre-trained Transformer)
Multimodal-Toolkit - Multimodal model for text and tabular data with HuggingFace transformers as building block for text data
Neural-Scam-Artist - Web Scraping, Document Deduplication & GPT-2 Fine-tuning with a newly created scam dataset.
transformer-mlm - Implementation of Transformer Encoders / Masked Language Modeling Objective
recs-at-resonable-scale - Recommendations at "Reasonable Scale": joining dataOps with recSys through dbt, Merlin and Metaflow
Locomotive - Toolkit for training/converting LibreTranslate compatible language models 🚂