-
Eucalyptus-Chat
Discontinued. A frontend for large language models like 🐨 Koala or 🦙 Vicuna running on CPU with llama.cpp, using the API server provided by llama-cpp-python. NOTE: I had to discontinue this project because maintaining it takes more time than I can and want to invest. Feel free to fork :)
There is a language model called Koala, developed by researchers at UC Berkeley and based on LLaMA by Meta. Using llama.cpp, one can easily run this model on the CPU instead of on a dedicated GPU. The Python library llama-cpp-python makes it usable from Python and includes an API server.
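As a sketch of that setup (the model filename and path are placeholders, and the flags follow the llama-cpp-python documentation, not this project), the bundled API server can be started like this:

```shell
# Install llama-cpp-python with its optional server dependencies.
pip install 'llama-cpp-python[server]'

# Start the OpenAI-compatible API server with a converted and
# quantized Koala model file (path and filename are placeholders).
python -m llama_cpp.server --model ./models/koala-7b-q4.gguf
```

By default the server listens on localhost port 8000 and exposes OpenAI-style endpoints, which is what a frontend like Eucalyptus can talk to.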
I put all this together and built a chat-like frontend on top of it: Eucalyptus. It's not well documented, but you should be able to use it without problems, provided you have the model files for Koala (which, by the way, must be converted and should be quantized using llama.cpp).
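To illustrate how a frontend like this talks to the server, here is a minimal hedged sketch. It builds an OpenAI-style completion request and sends it to a locally running llama-cpp-python server; the endpoint path, port, and parameter defaults are assumptions based on that library's OpenAI-compatible API, not code taken from Eucalyptus itself.

```python
import json
import urllib.request

# Assumed server address; llama-cpp-python's server defaults to port 8000.
API_URL = "http://localhost:8000/v1/completions"

def build_completion_request(prompt, max_tokens=128, temperature=0.7):
    """Build the JSON body for an OpenAI-style completion request."""
    return {
        "prompt": prompt,
        "max_tokens": max_tokens,
        "temperature": temperature,
    }

def complete(prompt):
    """POST the request to the local server and return the generated text."""
    body = json.dumps(build_completion_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["text"]

if __name__ == "__main__":
    # Requires the API server from the previous step to be running.
    print(complete("Tell me something about koalas."))
```

A chat frontend essentially loops over this call, appending each user message and model reply to the prompt it sends next.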