tiger vs discus
| | tiger | discus |
|---|---|---|
| Mentions | 3 | 1 |
| Stars | 385 | 60 |
| Growth | - | - |
| Activity | 9.0 | 7.7 |
| Last commit | 6 months ago | 6 months ago |
| Language | Jupyter Notebook | Python |
| License | Apache License 2.0 | MIT License |
Stars: the number of stars that a project has on GitHub. Growth: month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
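The exact activity formula isn't published here, but the description above (recent commits weighted more heavily than older ones, then ranked relative to all tracked projects) maps naturally onto an exponentially decayed commit count converted to a percentile. Below is a minimal sketch under those assumptions; the half-life and the 0-to-10 scaling are illustrative guesses, not the tracker's real parameters:

```python
from datetime import datetime, timezone
import math

def activity_score(commit_dates, half_life_days=30.0):
    """Sum a decayed weight per commit so recent commits count more.

    Assumed exponential-decay form; the site's actual formula
    is not published.
    """
    now = datetime.now(timezone.utc)
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        # Halve a commit's weight every `half_life_days` days.
        score += math.exp(-age_days * math.log(2) / half_life_days)
    return score

def relative_activity(score, all_scores):
    """Map a raw score onto a 0-10 relative scale: 9.0 means the
    project outranks roughly 90% of tracked projects."""
    below = sum(1 for s in all_scores if s < score)
    return 10.0 * below / len(all_scores)
```

On this reading, tiger's 9.0 versus discus's 7.7 means tiger's recent-commit-weighted score ranks higher among tracked projects, even though both repos last committed around the same time.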
tiger
- FLaNK Stack Weekly for 13 November 2023
- TigerLab – open-source LLM toolkit (RAG, FineTune, AI safety)
Introducing TigerLab, an open-source LLM toolkit providing solutions across a variety of LLM domains (RAG, fine-tuning, search, AI safety).
More about TigerLab: https://github.com/tigerlab-ai/tiger
You can also find more experiments at https://www.tigerlab.ai
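For readers unfamiliar with the RAG pattern that TigerLab targets, here is a minimal, dependency-free sketch of the idea: retrieve the documents most relevant to a query, then build a grounded prompt from them. Every name here is illustrative; this is not TigerLab's actual API, which lives in the repo linked above.

```python
# Illustrative RAG sketch: bag-of-words retrieval + prompt assembly.
from collections import Counter
import math

def bag_of_words(text):
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    """Return the k documents most similar to the query."""
    q = bag_of_words(query)
    ranked = sorted(documents, key=lambda d: cosine(q, bag_of_words(d)),
                    reverse=True)
    return ranked[:k]

def build_prompt(query, documents):
    """Assemble a prompt that grounds the model in retrieved context."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

Real toolkits replace the bag-of-words step with learned embeddings and a vector store, and send the assembled prompt to an LLM; the retrieve-then-generate shape is the same.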
discus
What are some alternatives?
canopy - Retrieval Augmented Generation (RAG) framework and context engine powered by Pinecone
kani - kani (カニ) is a highly hackable microframework for chat-based language models with tool use/function calling. (NLP-OSS @ EMNLP 2023)
inshellisense - IDE-style command-line autocompletion
tdk-demo - A collection of TDK demo projects that use different databases and options
engblogs - learn from your favorite tech companies
LongLoRA - Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)
TencentPretrain - Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
distilabel - ⚗️ distilabel is a framework for synthetic data and AI feedback for AI engineers who require high-quality outputs, full data ownership, and overall efficiency.
YiVal - Your Automatic Prompt Engineering Assistant for GenAI Applications
start-llms - A complete guide to starting and improving your LLM skills in 2024 with little background in the field, and to staying up to date with the latest news and state-of-the-art techniques!
ragna - RAG orchestration framework ⛵️
LLM-Adapters - Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"