-
localGPT
Chat with your documents on your local device using GPT models. No data leaves your device; 100% private.
See here: https://github.com/marella/chatdocs#configuration (the context_length setting in the chatdocs.yml file)
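Based on the linked configuration docs, the context length lives under the model backend's config block in chatdocs.yml. A minimal sketch (the exact key names should be checked against the chatdocs README; the model id here is only an example):

```yaml
# chatdocs.yml — sketch, assuming the ctransformers backend
ctransformers:
  config:
    context_length: 2048  # raise the default context window
```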
You can try localGPT. It's a fork of privateGPT that uses Hugging Face models instead of llama.cpp. By default it uses TheBloke/vicuna-7B-1.1-HF, which is not commercially viable, but you can quite easily change the code to use something like mosaicml/mpt-7b-instruct or even mosaicml/mpt-30b-instruct, which fit the bill.
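Swapping the model usually means changing the model id where localGPT loads it. A hedged sketch of what that change might look like with the transformers library (the function name `load_model` is illustrative, not localGPT's actual code; `trust_remote_code=True` is needed because MPT models ship custom model code on the Hub):

```python
# Sketch: loading an MPT model in place of the default Vicuna checkpoint.
# Note: this downloads several GB of weights on first run.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "mosaicml/mpt-7b-instruct"  # instead of "TheBloke/vicuna-7B-1.1-HF"

def load_model(model_id: str):
    # MPT models define their architecture in repo code, hence trust_remote_code
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)
    return pipeline(
        "text-generation",
        model=model,
        tokenizer=tokenizer,
        max_new_tokens=256,
    )
```

The returned pipeline can then be wrapped for LangChain (which localGPT uses) via its HuggingFacePipeline adapter.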