Langroid Alternatives
Similar projects and alternatives to langroid
- text-generation-webui: A Gradio web UI for Large Language Models, with support for multiple inference backends.
- ollama: Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1, and other large language models.
- txtai: 💡 All-in-one open-source embeddings database for semantic search, LLM orchestration, and language model workflows.
- Typesense: Open-source alternative to Algolia + Pinecone, and an easier-to-use alternative to Elasticsearch. ⚡🔍✨ A fast, typo-tolerant, in-memory fuzzy search engine for building delightful search experiences.
- Back In Time: A comfortable and well-configurable graphical front end for incremental backups, with a command-line version also available. Modified files are transferred, while unchanged files are linked into the new snapshot folder using rsync's hard-link feature, saving storage space. Restoring is straightforward via a file manager, the command line, or Back In Time itself.
- swarm: Educational framework exploring ergonomic, lightweight multi-agent orchestration. Managed by the OpenAI Solution team.
- agency: 🕵️‍♂️ A library designed for developers eager to explore the potential of Large Language Models (LLMs) and other generative AI through a clean, effective, and Go-idiomatic approach. (by neurocult)
langroid discussion
langroid reviews and mentions
- Understanding the BM25 full text search algorithm
In the Langroid[1] LLM library we have a clean, extensible RAG implementation in the DocChatAgent[2]. It combines several retrieval techniques, both lexical (BM25, fuzzy search) and semantic (embeddings), with re-ranking (cross-encoder, reciprocal rank fusion), as well as re-ranking for diversity and lost-in-the-middle mitigation:
[1] Langroid - a multi-agent LLM framework from CMU/UW-Madison researchers https://github.com/langroid/langroid
[2] DocChatAgent Implementation -
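To make the two lexical/fusion techniques named above concrete, here is a minimal, self-contained sketch of BM25 scoring and reciprocal rank fusion. It is illustrative only and is not Langroid's actual implementation.

```python
# Illustrative sketch (not Langroid's code): BM25 scoring plus
# reciprocal rank fusion of multiple rankings.
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Score each doc (a list of tokens) against the query terms with BM25."""
    N = len(docs)
    avg_len = sum(len(d) for d in docs) / N
    # document frequency of each query term
    df = {t: sum(1 for d in docs if t in d) for t in query_terms}
    scores = []
    for doc in docs:
        tf = Counter(doc)
        score = 0.0
        for t in query_terms:
            if df[t] == 0:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            denom = tf[t] + k1 * (1 - b + b * len(doc) / avg_len)
            score += idf * tf[t] * (k1 + 1) / denom
        scores.append(score)
    return scores

def reciprocal_rank_fusion(rankings, k=60):
    """Fuse several rankings (lists of doc ids, best first) into one ranking."""
    fused = Counter()
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking):
            fused[doc_id] += 1.0 / (k + rank + 1)
    return [doc_id for doc_id, _ in fused.most_common()]

docs = [["lexical", "search", "bm25"], ["semantic", "embeddings"], ["bm25", "ranking", "fusion"]]
print(bm25_scores(["bm25", "search"], docs))          # lexical scores per doc
print(reciprocal_rank_fusion([[2, 0, 1], [0, 2, 1]])) # fused ranking of doc ids
```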
- Ask HN: What Open Source Projects Need Help?
Langroid: https://github.com/langroid/langroid
Langroid (2.7k stars, 20k downloads/month) is an intuitive, lightweight, extensible, and principled Python framework for easily building agent-oriented LLM-powered applications, from CMU and UW-Madison researchers. You set up Agents, equip them with optional components (LLM, vector store, and tools/functions), assign them tasks, and have them collaboratively solve a problem by exchanging messages.
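A rough sketch of that Agent/Task setup, following the quick-start pattern in Langroid's documentation; exact class, enum, and parameter names may differ across versions.

```python
# Sketch of Langroid's Agent/Task pattern (based on its documented
# quick-start; names may have changed across versions).
import langroid as lr
import langroid.language_models as lm

# configure the LLM component
llm_cfg = lm.OpenAIGPTConfig(chat_model=lm.OpenAIChatModel.GPT4o)

# create an Agent equipped with the LLM
agent = lr.ChatAgent(lr.ChatAgentConfig(llm=llm_cfg))

# wrap the Agent in a Task and run an interactive loop
task = lr.Task(agent, name="Bot", system_message="You are a helpful assistant.")
task.run("Hello")
```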
- Swarm, a new agent framework by OpenAI
You can have a look at Langroid, an agent-oriented LLM framework from CMU/UW-Madison researchers (I am the lead dev). We are seeing companies using it in production in preference to other libs mentioned here.
https://github.com/langroid/langroid
- Every Way to Get Structured Output from LLMs
https://github.com/langroid/langroid/tree/main/examples/basi...
[1] Langroid: https://github.com/langroid/langroid
- Show HN: Mesop, open-source Python UI framework used at Google
This is very interesting. To build LLM chat-oriented web apps in Python these days, I use Chainlit[1], which I find is much better than Streamlit for this. I've integrated Chainlit into the Langroid[2] multi-agent LLM framework via a callback-injection class[3] (i.e., hooks to display responses from the various entities).
One of the key requirements in a multi-agent chat app is the ability to display the steps of sub-tasks nested under parent tasks (to any level of nesting), with the option to fold/collapse sub-steps so that only the parent steps are visible. I was able to get this to work with Chainlit, though it was not easy, since their sub-step rendering mental model seemed more aligned to a certain other LLM framework whose name partially overlaps with theirs.
That said, I am very curious whether Mesop could be a viable alternative for this type of nested chat implementation, especially if the overall layout can be much more flexible (which it seems it can be) and more production-ready.
[1] Chainlit https://github.com/Chainlit/chainlit
[2] Langroid: https://github.com/langroid/langroid
[3] Langroid ChainlitAgentCallback class: https://github.com/langroid/langroid/blob/main/langroid/agen...
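The nested sub-step display mentioned above can be sketched with Chainlit's Step context manager; this is an illustrative toy, not the Langroid callback code referenced in [3].

```python
# Sketch of nested, foldable sub-steps in Chainlit: cl.Step contexts
# nest, and the UI lets you collapse children under the parent step.
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    async with cl.Step(name="Parent task") as parent:
        parent.output = "Delegating to sub-agents..."
        async with cl.Step(name="Sub-task 1") as child:
            child.output = "Result from sub-agent 1"
        async with cl.Step(name="Sub-task 2") as child:
            child.output = "Result from sub-agent 2"
    await cl.Message(content="Parent task done.").send()
```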
- OpenAI: Streaming is now available in the Assistants API
This was indeed true in the beginning, and I don't know if it has changed. Inserting messages with the assistant role is crucial for many reasons, for example if you want to implement caching, or otherwise edit/compress a previous assistant response for cost or other reasons.
At the time I implemented a workaround in Langroid[1]: since you can only insert a "user"-role message, prepend the content with "ASSISTANT:" whenever you want it to be treated as an assistant-role message. This actually works as expected, and I was able to implement caching with it. I explained it in this forum post:
https://community.openai.com/t/add-custom-roles-to-messages-...
[1] The Langroid code that adds a message with a given role, using the "assistant spoofing" trick described above:
https://github.com/langroid/langroid/blob/main/langroid/agen...
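A sketch of that spoofing workaround against the OpenAI Assistants API (my own illustration, not the exact Langroid code behind the truncated link above):

```python
# Sketch of the "assistant spoofing" workaround: the Assistants API
# (at the time) only accepted "user"-role messages when inserting into
# a thread, so an intended assistant message is inserted as a user
# message whose content is prefixed with "ASSISTANT:".
from openai import OpenAI

client = OpenAI()
thread = client.beta.threads.create()

def add_message(thread_id: str, role: str, content: str):
    if role == "assistant":
        # cannot insert assistant-role messages directly; spoof via prefix
        role, content = "user", f"ASSISTANT: {content}"
    return client.beta.threads.messages.create(
        thread_id=thread_id, role=role, content=content
    )

add_message(thread.id, "user", "What is 2 + 2?")
add_message(thread.id, "assistant", "2 + 2 = 4")  # e.g. a cached prior reply
```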
- FLaNK Stack 29 Jan 2024
- Ollama Python and JavaScript Libraries
Same question here. Ollama is fantastic, as it makes it very easy to run models locally. But if you already have a lot of code that processes OpenAI API responses (with retry, streaming, async, caching, etc.), it would be nice to be able to simply switch the API client to Ollama without having to maintain a whole other branch of code that handles Ollama API responses. One way to do an easy switch is to use the litellm library as a go-between, but it's not ideal (and I also recently found issues with their chat formatting for Mistral models).
For an OpenAI-compatible API, my current favorite method is to spin up models using oobabooga text-generation-webui (TGW). Your OpenAI API code then works seamlessly by simply switching the api_base to the ooba endpoint. Regarding chat formatting, even ooba's Mistral formatting has issues[1], so I am doing my own in Langroid using the HuggingFace tokenizer's apply_chat_template [2].
[1] https://github.com/oobabooga/text-generation-webui/issues/53...
[2] https://github.com/langroid/langroid/blob/main/langroid/lang...
Related question: I assume Ollama auto-detects and applies the right chat-formatting template for a model?
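For reference, a minimal sketch of the apply_chat_template approach mentioned above, using the transformers library (the model name is just an example, and this is not the Langroid code behind link [2]):

```python
# Sketch: render OpenAI-style chat messages into a Mistral-specific
# prompt string using the HuggingFace tokenizer's chat template.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mistralai/Mistral-7B-Instruct-v0.2")

messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "Paris."},
    {"role": "user", "content": "And of Belgium?"},
]

# returns the formatted prompt (with [INST] ... [/INST] markers for Mistral)
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```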
- Pushing ChatGPT's Structured Data Support to Its Limits
We (like simpleaichat from the OP) leverage Pydantic to specify the desired structured output, and under the hood Langroid translates it to either the OpenAI function-calling params or, for LLMs that don't natively support function-calling, auto-inserts appropriate instructions into the system prompt. We call this mechanism a ToolMessage:
https://github.com/langroid/langroid/blob/main/langroid/agen...
We take this idea much further: you can define a method in a ChatAgent to "handle" the tool and attach the tool to the agent. For stateless tools, you can define a "handle" method in the tool itself, and it gets patched into the ChatAgent as the handler for the tool.
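A rough sketch of that ToolMessage pattern, based on Langroid's documented usage; the exact class paths, field names, and method signatures may differ across versions.

```python
# Sketch of a stateless Langroid ToolMessage (names per the docs at the
# time; may have changed).
import langroid as lr
from langroid.agent.tool_message import ToolMessage

class SquareTool(ToolMessage):
    # Pydantic fields define the structured output the LLM must produce
    request: str = "square"                        # tool name the LLM uses
    purpose: str = "To compute the square of a <number>."
    number: int

    def handle(self) -> str:
        # stateless tool: the handler lives on the tool itself and gets
        # patched into the ChatAgent as the handler for this tool
        return str(self.number ** 2)

agent = lr.ChatAgent(lr.ChatAgentConfig())
agent.enable_message(SquareTool)   # attach the tool to the agent
```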
- Ask HN: How do I train a custom LLM/ChatGPT on my own documents in Dec 2023?
Many services/platforms are careless or disingenuous when they claim they "train" on your documents, when what they actually mean is that they do RAG.
An under-appreciated benefit of RAG is the ability to have the LLM cite sources for its answers (which are, in principle, automatically or manually verifiable). You lose this citation ability when you fine-tune on your documents.
In Langroid (the Multi-Agent framework from ex-CMU/UW-Madison researchers) https://github.com/langroid/langroid
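A rough usage sketch of the DocChatAgent referenced earlier, which provides this kind of RAG-with-citations; the config defaults and the ingestion call shown here (e.g. ingest_doc_paths and the example file path) are assumptions based on the documentation of the time and may not match the current API.

```python
# Sketch: RAG over local documents with Langroid's DocChatAgent
# (method/config names are assumptions; check the current docs).
from langroid.agent.special.doc_chat_agent import DocChatAgent, DocChatAgentConfig

config = DocChatAgentConfig()                    # defaults: embeddings + vector store
agent = DocChatAgent(config)
agent.ingest_doc_paths(["my_docs/paper.pdf"])    # hypothetical local path

# answers are grounded in the ingested documents and include source citations
response = agent.llm_response("What method does the paper propose?")
print(response.content)
```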
Stats
langroid/langroid is an open-source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of langroid is Python.