| | Yi | Chinese-LLaMA-Alpaca |
|---|---|---|
| Mentions | 9 | 4 |
| Stars | 7,141 | 17,348 |
| Growth | 2.8% | - |
| Activity | 9.4 | 8.3 |
| Latest commit | 4 days ago | 3 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Yi
-
Yi: Open Foundation Models by 01.ai
The model license:
https://github.com/01-ai/Yi/blob/main/MODEL_LICENSE_AGREEMEN...
1) Your use of the Yi Series Models must comply with the Laws and Regulations as
-
Chinese Startup Is Winning the Open Source AI Race
01.ai's Yi model has performed well and there are several strong fine-tunes on Huggingface also.
I wonder what definition of "open source" the author is using or if he even read the license Yi is released under.
The Yi model license agreement [1] restricts usage and requires compliance with the "laws and administrative regulations of the mainland of the People's Republic of China" and they have a separate license that you can apply for if you want to use Yi commercially. [2]
Kudos to the 01.ai team on a strong LLM, but I do wonder if Wired and others should be a little more careful with the use of "open source" when describing AI models.
1. https://github.com/01-ai/Yi/blob/main/MODEL_LICENSE_AGREEMEN...
-
EU regulation implications?
I doubt it. Training a foundation model (important to distinguish from merely fine-tuning a language model) makes very little economic sense when there are already plenty of open-source options. In fact, almost all AI startups use or fine-tune these models. Only big research centers will do foundational research, as has always been the case. (Mistral and 01.ai are outliers, and I don't see how they're ever going to recoup their costs.)
-
What the heck is so great about this model?
Yi-34b: https://github.com/01-ai/Yi
-
Yi-34B-Chat
The 6B model is unfortunately still a base text-completion model. I've been waiting for the Chat version of it to be open-sourced :). The 01-ai team is working on it! https://github.com/01-ai/Yi/issues/173
- 01.AI (零一万物), a global company for AI 2.0 large-model technology and applications - source of the Yi 34B LLM
- Kai-Fu Lee's Yi-34B uses Llama's architecture except for two tensors renamed
- 01-AI/Yi: A series of large language models trained from scratch
Chinese-LLaMA-Alpaca
-
Chinese-Alpaca-Plus-13B-GPTQ
I'd like to share with you today the Chinese-Alpaca-Plus-13B-GPTQ model, a 4-bit GPTQ-quantised version of Yiming Cui's Chinese-LLaMA-Alpaca 13B for GPU inference.
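For context, GPTQ-style quantisation stores weights as 4-bit integers with a per-group floating-point scale. The real GPTQ algorithm also compensates rounding error weight by weight; the sketch below shows only the grouped 4-bit storage idea in plain numpy, as a toy illustration rather than the actual library code:

```python
import numpy as np

def quantize_4bit(weights, group_size=32):
    """Symmetric 4-bit group quantisation: each group of `group_size`
    weights shares one float scale; values are stored in [-8, 7]."""
    w = weights.reshape(-1, group_size)
    scales = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scales[scales == 0] = 1.0  # avoid division by zero for all-zero groups
    q = np.clip(np.round(w / scales), -8, 7).astype(np.int8)
    return q, scales

def dequantize_4bit(q, scales, shape):
    # Recover an approximation of the original weights.
    return (q.astype(np.float32) * scales).reshape(shape)

rng = np.random.default_rng(0)
w = rng.normal(size=128).astype(np.float32)
q, s = quantize_4bit(w)
w_hat = dequantize_4bit(q, s, w.shape)
max_err = float(np.abs(w - w_hat).max())
```

Storing 4-bit integers plus one scale per 32 weights is what shrinks a 13B model enough to fit on a single consumer GPU, at the cost of the small reconstruction error measured above.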
-
How to train a new language that is not in the base model?
You could follow what people did with Chinese-LLaMA, just for Korean. You might want to have a pure Korean corpus before feeding in a translation dataset. How big is it, by the way?
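The Chinese-LLaMA recipe extends the tokenizer with language-specific tokens and resizes the model's embedding matrix before continued pretraining on the new-language corpus. A toy numpy sketch of the resize step, with hypothetical tokens and a common mean-initialisation heuristic (the real repo works on LLaMA's 32,000 x 4096 table via Hugging Face tooling):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the base model's vocabulary and embedding table.
old_vocab = ["<s>", "</s>", "hello", "world"]
d_model = 8
old_emb = rng.normal(size=(len(old_vocab), d_model)).astype(np.float32)

# New tokens produced by a tokenizer trained on a Korean corpus
# (hypothetical examples for illustration).
new_tokens = ["안녕", "하세요"]

# Keep the old rows untouched; initialise each new row with the mean of
# the existing embeddings, then let continued pretraining refine them.
mean_row = old_emb.mean(axis=0)
new_emb = np.vstack([old_emb, np.tile(mean_row, (len(new_tokens), 1))])
vocab = old_vocab + new_tokens
```

Because the original rows are preserved, the extended model starts out behaving identically on the old vocabulary; only the new tokens need to be learned from scratch.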
- Open Source Chinese LLMs
-
Is it possible to fine-tune the LLaMA model to better understand another language?
Chinese: https://github.com/ymcui/Chinese-LLaMA-Alpaca
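Chinese-LLaMA-Alpaca does exactly that, combining the vocabulary extension above with LoRA fine-tuning. A minimal numpy sketch of the LoRA idea (a frozen pretrained weight W plus a trainable low-rank update B·A, with B zero-initialised so training starts from the base model's behaviour), not the repo's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 16, 4  # model dimension and LoRA rank, with r << d

W = rng.normal(size=(d, d)).astype(np.float32)         # frozen pretrained weight
A = (rng.normal(size=(r, d)) * 0.01).astype(np.float32)  # trainable down-projection
B = np.zeros((d, r), dtype=np.float32)                 # trainable up-projection, zero-init

def lora_forward(x):
    # Because B starts at zero, B @ A is zero and the output initially
    # matches the frozen model; only A and B receive gradient updates.
    return x @ W.T + x @ (B @ A).T

x = rng.normal(size=(2, d)).astype(np.float32)
y = lora_forward(x)
y_base = x @ W.T
```

Training only A and B (2·d·r parameters per layer instead of d²) is what makes adapting a 13B model to a new language feasible on modest hardware.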