KoAlpaca
KoAlpaca: an open-source language model that understands Korean instructions (by Beomi)
cabrita
Fine-tuning InstructLLaMA with Portuguese data (by 22-hours)
| | KoAlpaca | cabrita |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 1,516 | 548 |
| Growth | - | 1.1% |
| Activity | 4.7 | 3.1 |
| Last commit | 17 days ago | about 1 year ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | Apache License 2.0 | Apache License 2.0 |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
KoAlpaca
Posts with mentions or reviews of KoAlpaca. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-04-05.
- Is it possible to fine-tune the LLaMA model to better understand another language?
  Korean: https://github.com/Beomi/KoAlpaca
cabrita
Posts with mentions or reviews of cabrita. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-04-05.
- Teaching LLaMA to reason in another language (help!)
  There was an attempt to teach LLaMA 1 Portuguese: https://github.com/22-hours/cabrita, so I used the same dataset on Llama2-13B-chat to update the project. But, like some of you have been experiencing, the model goes off its rocker after around 100 tokens, doesn't know when to stop, often lapses into English while still being correct, etc.
- Is it possible to fine-tune the LLaMA model to better understand another language?
  Portuguese: https://github.com/22-hours/cabrita/
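Both projects fine-tune LLaMA on instruction data translated into the target language. A minimal sketch of how one such record is typically rendered into a training prompt, assuming the standard Stanford Alpaca template (the `format_prompt` helper and the example record are hypothetical; each repo's exact template may differ):

```python
def format_prompt(instruction: str, inp: str = "", output: str = "") -> str:
    """Render one instruction record into an Alpaca-style training prompt.

    Records with an input field get the three-section template; records
    without one get the shorter two-section template.
    """
    if inp:
        prompt = (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{inp}\n\n"
            "### Response:\n"
        )
    else:
        prompt = (
            "Below is an instruction that describes a task. Write a response "
            "that appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            "### Response:\n"
        )
    return prompt + output

# Hypothetical Portuguese record, as a cabrita-style dataset might contain:
print(format_prompt("Traduza para o inglês.", "Bom dia!", "Good morning!"))
```

During fine-tuning the loss is usually computed only on the tokens after `### Response:`, and an end-of-sequence token is appended after the output so the model learns when to stop generating, which is relevant to the runaway-generation issue described in the post above.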
What are some alternatives?
When comparing KoAlpaca and cabrita, you can also consider the following projects:
Local-LLM-Langchain - Load local LLMs effortlessly in a Jupyter notebook for testing purposes alongside LangChain or other agents. Contains Oobabooga and KoboldAI versions of the LangChain notebooks with examples.
text-generation-webui-colab - A Colab Gradio web UI for running large language models
kruk - Ukrainian instruction-tuned language models and datasets
Chinese-LLaMA-Alpaca - Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment