fastLLaMa Alternatives
Similar projects and alternatives to fastLLaMa
-
open_llama
OpenLLaMA, a permissively licensed open source reproduction of Meta AI’s LLaMA 7B trained on the RedPajama dataset
-
serge
A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.
-
RedPajama-Data
The RedPajama-Data repository contains code for preparing large datasets for training large language models.
-
gpt-llama.cpp
A llama.cpp drop-in replacement for OpenAI's GPT endpoints, allowing GPT-powered apps to run off local llama.cpp models instead of OpenAI.
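Because gpt-llama.cpp mimics the OpenAI request schema, an existing GPT-powered app only needs to point its client at a different base URL; the request body itself stays the same. A minimal sketch of that payload (the local host/port and model name here are assumptions, not gpt-llama.cpp's documented defaults):

```python
import json

# Endpoint you would point an OpenAI-style client at instead of
# https://api.openai.com/v1 (host/port are assumptions; check the
# gpt-llama.cpp repo for its actual defaults).
LOCAL_BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt: str, model: str = "llama-7b") -> dict:
    """Build an OpenAI-compatible /chat/completions payload.

    Because the local server exposes the same schema, this exact
    payload can be POSTed to either api.openai.com or LOCAL_BASE_URL.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }

payload = build_chat_request("Summarize llama.cpp in one sentence.")
print(json.dumps(payload, indent=2))
```

The design choice this illustrates: keeping the wire format identical means no application code changes, only configuration, when swapping OpenAI for a local llama.cpp model.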
fastLLaMa reviews and mentions
-
[N] OpenLLaMA: An Open Reproduction of LLaMA
If your GPU isn't good enough, you could use llama.cpp, which runs on CPU, or one of its forks like fastLLaMa.
-
Serge... Just works
This is possible through fastLLaMa in Python, or through gpt-llama.cpp, an API wrapper around llama.cpp.
-
llama-cpp-python VS fastLLaMa - a user suggested alternative
2 projects | 25 Apr 2023
It is better: it includes many low-level C++ optimizations.
-
[P] LoRA adapter switching at runtime to enable Base model to inherit multiple personalities
u/_Arsenie_Boca_ you can have a look at this discussion for more info https://github.com/PotatoSpudowski/fastLLaMa/discussions/48
-
[P] fastLLaMa, A python wrapper to run llama.cpp
Repo Link
-
Stats
PotatoSpudowski/fastLLaMa is an open source project licensed under the MIT License, an OSI-approved license.
fastLLaMa is marked as "self-hosted". This means that it can be used as a standalone application on its own.
The primary programming language of fastLLaMa is C.