Vanilla-llama Alternatives
Similar projects and alternatives to vanilla-llama based on common topics and language
- **coral-pi-rest-server** – Perform inference of TensorFlow Lite models on a Raspberry Pi with acceleration from a Coral USB stick.
- **LLaVA** – [NeurIPS'23 Oral] Visual Instruction Tuning (LLaVA), built towards GPT-4V level capabilities and beyond.
- **chat-llama-discord-bot** – A Discord bot for chatting with LLaMA, Vicuna, Alpaca, MPT, or any other Large Language Model (LLM) supported by text-generation-webui or llama.cpp.
- **xTuring** – Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our Discord community: https://discord.gg/TgHXuSJEk6
vanilla-llama reviews and mentions
- **How to extract vector embeddings from passages analyzed with LLaMA** – I shouldn't have any trouble with the second step, but I'm not sure how to get started on the first one. I found a Python package for interfacing with LLaMA, but its examples focus on just generating text, and I'm not sure how I would actually get embedding vectors or anything beyond text generation. Ideally, I would like to not even just create embedding vectors but rather directly hook up some new layers to LLaMA for supervised learning.
- Has anyone used LLaMA with a TPU instead of GPU?
- [P] vanilla-llama: a hackable plain-PyTorch implementation of LLaMA that can run on any system (if you have enough resources)
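On the embedding question above: with a plain-PyTorch implementation like vanilla-llama, a common approach is to run a forward pass, take the activations from the model's last hidden layer, and mean-pool them over the non-padding tokens to get one vector per passage. A minimal sketch of the pooling step, using NumPy arrays standing in for the model's activations (the shapes and the 4096 hidden dimension are illustrative assumptions, not vanilla-llama's API):

```python
import numpy as np

def mean_pool(hidden_states, attention_mask):
    """Average token activations into one embedding per sequence.

    hidden_states: (batch, seq_len, dim) last-layer activations.
    attention_mask: (batch, seq_len), 1 for real tokens, 0 for padding.
    """
    mask = attention_mask[:, :, None].astype(hidden_states.dtype)
    summed = (hidden_states * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1, None)  # avoid divide-by-zero on empty rows
    return summed / counts

# Random activations standing in for LLaMA's last hidden layer
# (hidden dim 4096 matches LLaMA-7B; any model output of this shape works).
hidden = np.random.randn(2, 5, 4096)
mask = np.array([[1, 1, 1, 0, 0],
                 [1, 1, 1, 1, 1]])
embeddings = mean_pool(hidden, mask)
print(embeddings.shape)  # (2, 4096)
```

The same pooled vectors can feed a small supervised head (e.g. a linear layer) stacked on top of the frozen LLaMA backbone, which addresses the "hook up new layers for supervised learning" part of the question.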
Stats
galatolofederico/vanilla-llama is an open source project licensed under the GNU General Public License v3.0 only, which is an OSI-approved license.
The primary programming language of vanilla-llama is Python.