adaptnlp vs BLOOM-fine-tuning
| | adaptnlp | BLOOM-fine-tuning |
|---|---|---|
| Mentions | 2 | 1 |
| Stars | 414 | 38 |
| Growth | 0.0% | - |
| Activity | 0.0 | 1.7 |
| Last commit | over 2 years ago | about 1 year ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | - |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
adaptnlp
- Tools to use for Semantic-searching Question Answering System
- Check out adaptnlp
Case Sensitivity using HuggingFace & Google's T5 model (base)
Yes, there are capitals in the tokenizer vocabularies of t5-base and t5-small, so both support capitalization. A few days ago I was using t5-small through adaptnlp for extractive summarization, and capitalization was working fine (https://github.com/Novetta/adaptnlp). AdaptNLP is essentially a wrapper around transformers, so if you can't figure out a solution, you could dissect its source code.
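The claim above is easy to check directly: a minimal sketch, assuming the `transformers` library is installed and the `t5-small` tokenizer can be downloaded, that round-trips a cased sentence through the tokenizer to confirm capitalization is preserved.

```python
# Sketch: verify that the t5-small tokenizer vocabulary is cased,
# i.e. capitalization survives an encode/decode round trip.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("t5-small")

text = "Paris is the Capital of France."
ids = tokenizer(text).input_ids
decoded = tokenizer.decode(ids, skip_special_tokens=True)

print(decoded)  # the capitalized words come back capitalized
```

The same check against `t5-base` (just swap the model name) behaves identically, since both share a cased SentencePiece vocabulary.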
BLOOM-fine-tuning
What are some alternatives?
Basic-UI-for-GPT-J-6B-with-low-vram - A repository to run GPT-J-6B on low-VRAM machines (4.2 GB minimum VRAM for a 2,000-token context, 3.5 GB for a 1,000-token context). Model loading requires 12 GB of free RAM.
gpt-j-fine-tuning-example - Fine-tuning 6-Billion GPT-J (& other models) with LoRA and 8-bit compression
keytotext - Keywords to Sentences
voice-assistant-whisper-chatgpt - This repository guides you through creating your own smart virtual assistant, similar to Google Assistant, using OpenAI's ChatGPT and Whisper. The entire solution is built with Python and Gradio.
fastai - The fastai deep learning library
llm-applications - A comprehensive guide to building RAG-based LLM applications for production.
gector - Official implementation of the papers "GECToR – Grammatical Error Correction: Tag, Not Rewrite" (BEA-20) and "Text Simplification by Tagging" (BEA-21)
language-planner - Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
browser-ml-inference - Edge Inference in Browser with Transformer NLP model
xrays-and-gradcam - Classification and gradient-based localization of chest radiographs using PyTorch.
Transformers-Tutorials - This repository contains demos I made with the Transformers library by HuggingFace.
ML-Workspace - 🛠 All-in-one web-based IDE specialized for machine learning and data science.