-
Note that this is a very basic implementation of a retriever. Standard practice is to use something like ChromaDB or Azure Cognitive Search for indexing and retrieval. Also, you can find the complete list of stop words in this GitHub gist.
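A basic retriever of this kind can be sketched as a simple keyword-overlap ranker. This is only an illustration: the stop-word list below is a tiny placeholder (the gist linked above has the full list), and the documents are made-up examples.

```python
# Minimal keyword-overlap retriever: rank documents by how many query
# terms they share, after dropping stop words.
# STOP_WORDS here is a tiny illustrative subset, not the full list.
STOP_WORDS = {"a", "an", "the", "is", "of", "to", "and", "in", "for", "what"}

def tokenize(text):
    # Lowercase, keep alphanumeric tokens, and drop stop words.
    cleaned = "".join(c if c.isalnum() else " " for c in text.lower())
    return {t for t in cleaned.split() if t not in STOP_WORDS}

def retrieve(query, documents, top_k=2):
    # Score each document by the size of its token overlap with the query.
    q = tokenize(query)
    ranked = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:top_k]

# Placeholder corpus for demonstration.
docs = [
    "InfluxDB is a time series database.",
    "ChromaDB is an open-source embedding database.",
    "LangChain helps build LLM applications.",
]
print(retrieve("What is an embedding database?", docs, top_k=1))
```

A real system would replace the set-overlap score with TF-IDF or embedding similarity, which is exactly what the vector databases mentioned above provide.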
-
The next step is to pass this context to an LLM. We'll use OpenAI's gpt-3.5-turbo model. You'll need an OpenAI API key for this, which you can create on the OpenAI platform.
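Passing the retrieved context to the model can be sketched as below, using the OpenAI Python SDK (v1.x). The system prompt wording and the sample context are our own illustrative choices, not part of the original article.

```python
import os

def build_messages(context: str, question: str) -> list[dict]:
    # Ground the model by stuffing the retrieved context into the prompt.
    return [
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ]

# Only call the API when a key is configured (pip install openai).
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=build_messages(
            "ChromaDB is an open-source embedding database.",  # retrieved context
            "What is ChromaDB?",
        ),
    )
    print(response.choices[0].message.content)
```

Keeping the prompt assembly in a separate function makes it easy to unit-test the grounding logic without hitting the API.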
-
Source code
Related posts

- Retrieval Augmented Generation Frameworks: LangChain
- Txtai – A Strong Alternative to ChromaDB and LangChain for Vector Search and RAG
- Build Your Own RAG App: A Step-by-Step Guide to Setup LLM locally using Ollama, Python, and ChromaDB
- Chroma – the open-source embedding database
- Show HN: Embeddings Solution for Personal Journal