adaptnlp vs Deep-Learning-Experiments

| | adaptnlp | Deep-Learning-Experiments |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 414 | 1,081 |
| Growth | 0.0% | - |
| Activity | 0.0 | 8.3 |
| Last commit | over 2 years ago | about 1 month ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
adaptnlp
Tools to use for a semantic-search question-answering system
Check out adaptnlp
Case Sensitivity using HuggingFace & Google's T5 model (base)
Yes, there are capitals in the tokenizer vocabulary of t5-base and t5-small, so both support capitalization. A few days ago I was using t5-small through adaptnlp for extractive summarization and capitalization was working fine (https://github.com/Novetta/adaptnlp). AdaptNLP is basically just a transformers wrapper, so if you can't figure out a solution, you could just dissect their source code.
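The claim that t5-base and t5-small preserve capitalization is easy to verify directly. A minimal sketch, assuming the `transformers` library is installed and the t5-small checkpoint can be downloaded, showing that a tokenize/decode round trip keeps the original casing:

```python
# Assumes network access to fetch the t5-small tokenizer from the
# Hugging Face Hub (checkpoint name "t5-small" is standard).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")

text = "The Quick Brown Fox"
# Encode to token IDs, then decode back to text.
ids = tokenizer(text)["input_ids"]
decoded = tokenizer.decode(ids, skip_special_tokens=True)

# T5's SentencePiece vocabulary contains cased pieces, so the
# round trip preserves capitalization.
print(decoded)
```

If the tokenizer lowercased its input, the decoded string would differ from the original; with T5's cased SentencePiece vocabulary it comes back unchanged.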
Deep-Learning-Experiments
EEE 197 - Deep Learning
Hello, I took the course last semester. Many of us dropped because of the difficulty of the assignments, but it's doable. The course itself is open source and available on GitHub: https://github.com/roatienza/Deep-Learning-Experiments
What are some alternatives?
Basic-UI-for-GPT-J-6B-with-low-vram - A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2,000-token context, 3.5 GB for a 1,000-token context). Model loading requires 12 GB of free RAM.
conformal_classification - Wrapper for a PyTorch classifier which allows it to output prediction sets. The sets are theoretically guaranteed to contain the true class with high probability (via conformal prediction).
keytotext - Keywords to Sentences
DeepLearning - Contains all my works, references for deep learning
fastai - The fastai deep learning library
python_autocomplete - Use Transformers and LSTMs to learn Python source code
gector - Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)
nn - 🧑🏫 60 Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, sophia, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
browser-ml-inference - Edge Inference in Browser with Transformer NLP model
pytorch-deepdream - PyTorch implementation of the DeepDream algorithm (Mordvintsev et al.). Additionally, I've included playground.py to help you better understand the basic concepts behind the algorithm.
Transformers-Tutorials - This repository contains demos I made with the Transformers library by HuggingFace.
TTS - :robot: :speech_balloon: Deep learning for Text to Speech (Discussion forum: https://discourse.mozilla.org/c/tts)