| | Cornucopia-LLaMA-Fin-Chinese | ray-llm |
|---|---|---|
| Mentions | 19 | 4 |
| Stars | 536 | 1,163 |
| Growth | - | 5.9% |
| Activity | 4.4 | 8.6 |
| Latest commit | 11 months ago | 7 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
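The page does not define the activity metric precisely, only that recent commits weigh more than older ones and that the score maps to a percentile (9.0 ≈ top 10%). A minimal sketch of how such a score could work, assuming exponential decay of commit weight and percentile ranking (both are assumptions, not the site's actual formula):

```python
from datetime import datetime, timedelta

def activity_score(commit_dates, now, half_life_days=30.0):
    """Recency-weighted commit count: each commit contributes
    0.5 ** (age_in_days / half_life_days), so a commit from today
    counts ~1.0 and one from a half-life ago counts ~0.5."""
    return sum(0.5 ** ((now - d).days / half_life_days) for d in commit_dates)

def to_relative_activity(score, all_scores):
    """Map a raw score to a 0-10 scale by percentile rank across all
    tracked projects, so 9.0 means the project beats ~90% of them
    (i.e. it is in the top 10%)."""
    rank = sum(s <= score for s in all_scores) / len(all_scores)
    return round(rank * 10, 1)

# Usage: a project with three commits this week outscores one whose
# last commits are ~10 months old.
now = datetime(2024, 1, 1)
recent = [now - timedelta(days=d) for d in (1, 2, 3)]
stale = [now - timedelta(days=d) for d in (300, 310, 320)]
print(activity_score(recent, now) > activity_score(stale, now))
```

The half-life and the percentile mapping are illustrative parameters; any monotone decay function would give the same qualitative ordering described in the text.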
Mentions of Cornucopia-LLaMA-Fin-Chinese: none listed.

Mentions of ray-llm:
- Aviary: Compare Open Source LLMs for cost, latency and quality
[N] Aviary: Comparing Open Source LLMs for cost, latency and quality
Aviary is an open source utility for comparing leading OSS LLMs. https://aviary.anyscale.com/
- Anyscale's Aviary is a dashboard for evaluating Open Source LLMs
- Aviary simplifies OSS LLM eval and deployment
What are some alternatives?
Baichuan-7B - A large-scale 7B pretraining language model developed by BaiChuan-Inc.
AutoGPTQ - An easy-to-use LLMs quantization package with user-friendly apis, based on GPTQ algorithm.
tableQA-Chinese - Unsupervised tableQA and databaseQA on chinese finance question and tabular data
safe-rlhf - Safe RLHF: Constrained Value Alignment via Safe Reinforcement Learning from Human Feedback
Chinese-LLaMA-Alpaca - Chinese LLaMA & Alpaca large language models, with local CPU/GPU training and deployment.
AtomGPT - A Chinese-English pretrained large language model aiming to match ChatGPT's level of capability.
HugNLP - CIKM 2023 Best Demo Paper Award. HugNLP is a unified and comprehensive NLP library built on HuggingFace Transformers. Start hugging for NLP now! 😊
Huatuo-Llama-Med-Chinese - Repo for BenTsao (original name: HuaTuo, 华驼): instruction-tuning large language models with Chinese medical knowledge.
Ray - Ray is a unified framework for scaling AI and Python applications. Ray consists of a core distributed runtime and a set of AI Libraries for accelerating ML workloads.
storium-backend - Source code for the web backend for hosting story generation models in the EMNLP 2020 paper "STORIUM: A Dataset and Evaluation Platform for Human-in-the-Loop Story Generation"
pinferencia - Python + Inference - Model Deployment library in Python. Simplest model inference server ever.