| | AST-1 | llama.onnx |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 61 | 331 |
| Growth | - | - |
| Activity | 10.0 | 7.3 |
| Last Commit | about 1 year ago | 11 months ago |
| Language | Python | - |
| License | GNU General Public License v3.0 only | GNU General Public License v3.0 only |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Posts mentioning these projects:
Qnap TS-264
You can find LLM models in the ONNX format here: https://github.com/tpoisonooo/llama.onnx
Langchain question and answer without OpenAI
You also need an LLM to do this. Pick one from the llama family; other works like llama.onnx, alpaca-native, and the llama models on Hugging Face are also worth checking.
What are some alternatives?
text-generation-webui-colab - A colab gradio web UI for running Large Language Models
llama.cpp - LLM inference in C/C++
llama.go - llama.go is like llama.cpp in pure Golang!
Chinese-LLaMA-Alpaca - Chinese LLaMA & Alpaca large language models with local CPU/GPU training and deployment
fastT5 - ⚡ boost inference speed of T5 models by 5x & reduce the model size by 3x.
openvino - OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
motorhead - 🧠 Motorhead is a memory and information retrieval server for LLMs.
llama2.openvino - This sample shows how to implement a llama-based model with OpenVINO runtime
LocalAI - The free, open-source OpenAI alternative. Self-hosted, community-driven, and local-first. A drop-in replacement for OpenAI that runs on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many other model architectures, and can generate text, audio, video, and images, with voice-cloning capabilities.