| | MoE-LLaVA | setfit |
|---|---|---|
| Mentions | 2 | 13 |
| Stars | 1,693 | 2,001 |
| Growth | 9.0% | 4.8% |
| Activity | 9.5 | 9.2 |
| Last commit | 8 days ago | 10 days ago |
| Language | Python | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
MoE-LLaVA mentions

- FLaNK Stack 05 Feb 2024
- Smarter Summaries with Finetuning GPT-3.5 and Chain of Density

setfit mentions

- [Discussion] Convince me that this training set contamination is fine (or not)

  It did, sorry for the hasty edits! I removed that part because I realized that there isn't a compelling enough reason for me to believe that text similarity is clearly inappropriate. In fact, you can train the Pr(condition | chat) classifier I suggested above using similarity training! Use SetFit for that. In the end you'll get both a classifier and a similarity model.
- Ask HN: What's the best framework for text classification (few-shot learning)?

  [3] https://github.com/huggingface/setfit
- Is it worth using LLMs like GPT-3 for text classification?

  There are also related approaches like SetFit, which calculate embeddings from pretrained transformer models and then fit a classifier on top of the embeddings. I've yet to try it, but it supposedly works well with very few labelled examples.
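The embed-then-classify pattern described in that comment can be sketched in a few lines. This is an illustrative toy, not SetFit itself: the bag-of-words `embed()` below is a stand-in so the sketch runs offline, where a real setup would use a pretrained sentence-transformer encoder, and all names and example texts here are made up.

```python
# Sketch of the embed-then-classify pattern behind SetFit-style few-shot
# classification: encode each text as a fixed-size vector, then fit a
# lightweight classifier head on a handful of labelled examples.
# NOTE: this bag-of-words embed() is a stand-in so the sketch runs offline;
# a real setup would use a pretrained sentence encoder instead.
import numpy as np
from sklearn.linear_model import LogisticRegression

train_texts = [
    "great product works perfectly",  # positive
    "awful broke after a day",        # negative
    "love it highly recommend",       # positive
    "terrible quality do not buy",    # negative
]
train_labels = [1, 0, 1, 0]

# Fixed vocabulary built from the training texts.
vocab = {tok: j for j, tok in enumerate(
    sorted({t for s in train_texts for t in s.split()}))}

def embed(texts):
    """Stand-in encoder: bag-of-words counts over the training vocabulary."""
    vecs = np.zeros((len(texts), len(vocab)))
    for i, text in enumerate(texts):
        for token in text.lower().split():
            if token in vocab:
                vecs[i, vocab[token]] += 1.0
    return vecs

# Fit the classifier head on embeddings of the few labelled examples.
clf = LogisticRegression().fit(embed(train_texts), train_labels)
pred = clf.predict(embed(["highly recommend this great product"]))
```

The actual SetFit library goes one step further: it first contrastively fine-tunes the sentence encoder on pairs generated from the labelled examples, and only then fits the classification head, which is what makes it effective with so few labels.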
- LLMs for Text Classification (7B parameters)
- GPT-3 vs GPT-Neo / GPT-J for startup classification
- Ideas on how to improve classification and scoring using Mean Pooled Sentence Embeddings

  You could have a look at setfit.
- SetFit (Sentence Transformer Fine-tuning) - Fewshot Learning without prompts [D]

  Found relevant code at https://github.com/huggingface/setfit + all code implementations here
- Most Popular AI Research Sept 2022 - Ranked Based On Total GitHub Stars

  Efficient Few-Shot Learning Without Prompts: https://github.com/huggingface/setfit, https://arxiv.org/abs/2209.11055v1
What are some alternatives?
DeepSeek-Coder - DeepSeek Coder: Let the Code Write Itself
iris - Transformers are Sample-Efficient World Models. ICLR 2023, notable top 5%.
sqlchat - Chat-based SQL Client and Editor for the next decade
whisper - Robust Speech Recognition via Large-Scale Weak Supervision
onnxruntime - ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator
VToonify - [SIGGRAPH Asia 2022] VToonify: Controllable High-Resolution Portrait Video Style Transfer
TornadoVM - TornadoVM: A practical and efficient heterogeneous programming framework for managed languages
motion-diffusion-model - The official PyTorch implementation of the paper "Human Motion Diffusion Model"
openvino - OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
git-re-basin - Code release for "Git Re-Basin: Merging Models modulo Permutation Symmetries"
ClickBench - ClickBench: a Benchmark For Analytical Databases
storydalle