langflow vs PrivateGPT4Linux

| | langflow | PrivateGPT4Linux |
|---|---|---|
| Mentions | 28 | 23 |
| Stars | 17,467 | 15 |
| Growth | 12.6% | - |
| Activity | 10.0 | 4.1 |
| Latest commit | 4 days ago | 7 days ago |
| Language | JavaScript | Shell |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
langflow
-
News: DataStax just bought our startup Langflow
Hey folks, I'm the Head of DevRel @ DataStax here, and I just wanted to share with the HN community that, in conjunction with this big acquisition news, the LF team has shipped 1.0-alpha of Langflow.
It's a simple `pip install` and the team would love any and all feedback!
https://github.com/logspace-ai/langflow/
-
Node-based AutoGen with local LLMs inside ComfyUI
You can also check langflow, a node UI for langchain https://github.com/logspace-ai/langflow
- Show HN: Rivet – open-source AI Agent dev env with real-world applications
-
Using Retrieval Augmented Generation to Clear Our GitHub Backlog
There are a few tools out there, like AgentGPT (https://github.com/reworkd/AgentGPT, although it's a more conversational interface) and Langflow (https://github.com/logspace-ai/langflow), among others. I think most developers definitely prefer a code-first interface, like a library, but I haven't found one that's great yet. We've used these in the past but didn't have the best experience, so I'd love to hear if anyone has worked with a library they found really flexible.
- Show HN: ChainForge, a visual tool for prompt engineering and LLM evaluation
-
Anyone know how to get LangFlow working with oobabooga?
I found this thread talking about it here: https://github.com/logspace-ai/langflow/issues/263
-
Found a fun little open source project called Flowise. It's a drag & drop UI to build your customized LLM flow using LangchainJS
also check https://github.com/logspace-ai/langflow
-
What exactly is AutoGPT?
AutoGPT is basically a demo of what you can do with Langchain. If you want to play with Langchain in a drag-and-drop blueprint environment, I suggest Langflow.
-
Launch HN: Fastgen (YC W23) – Visual Low-Code Back End Builder
Hi, I like this! I'm curious what drove the decision to use the vertical block builder style you chose. I'm partial to node-based editors and have been building things with React Flow recently. LangFlow [1] is a good example, but there are lots of UIs that use a similar interface (e.g. Blender [2] and Unity [3]).
[1] https://github.com/logspace-ai/langflow
[2] https://docs.blender.org/manual/en/3.5/interface/controls/no...
[3] https://unity.com/features/unity-visual-scripting
-
Having fun testing CanvasGPT - a new project launching soon
Here's an open source version that's very similar: LangFlow
PrivateGPT4Linux
- PrivateGPT: Interact with your documents using the power of GPT, 100% privately, no data leaks
-
Need guidance in this sea of information on how to set up a local AI
I found things like this dataset and LocalAI, and I followed the article to get PrivateGPT and the GPT4All groovy.bin, but I'm completely lost, and it feels like the more I research the internet or ask BingAI for answers, the more questions I get instead. At this stage I don't know what goes where, whether there's a difference between source documents and datasets, whether I should run this from my 2 TB SSD or keep the data on my 8 TB HDD, or whether all this will even work on my PC.
-
Several newb questions
No, same as the last question: it does not have access to anything except the model data itself. However, there are some approaches that can let LLMs access LOCAL documents, which means you could have a program that extracts data from the database into a local folder containing TEXT files. This could also work for 2 (I didn't mention it in 2 because online datasets are REALLY big; it would take the model hours to give an answer. If the database is not large, then there might be a shot). Check https://github.com/imartinez/privateGPT (must be GPT4All-compatible models, sadly).
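The extract-to-text-files approach described above can be sketched in a few lines of Python. This is a minimal, hypothetical example: the database file `demo.db`, the `notes` table, and the `source_documents` output folder are all made-up names for illustration; privateGPT itself is not called here, it would simply ingest the resulting folder afterwards.

```python
import sqlite3
from pathlib import Path

def export_table_to_text(db_path: str, table: str, out_dir: str) -> int:
    """Dump each row of `table` to its own .txt file so a local
    document-QA tool (e.g. privateGPT) can ingest the folder.
    Returns the number of files written."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.execute(f"SELECT rowid, * FROM {table}")
        cols = [d[0] for d in cur.description]
        count = 0
        for row in cur:
            # One "column: value" line per field keeps the text readable
            # for both humans and embedding-based retrieval.
            lines = [f"{c}: {v}" for c, v in zip(cols, row)]
            (out / f"{table}_{row[0]}.txt").write_text("\n".join(lines))
            count += 1
        return count
    finally:
        conn.close()

# Build a tiny throwaway database with a hypothetical schema, then export it.
conn = sqlite3.connect("demo.db")
conn.execute("CREATE TABLE IF NOT EXISTS notes (title TEXT, body TEXT)")
conn.execute("DELETE FROM notes")
conn.execute("INSERT INTO notes VALUES ('specs', 'max current 2A')")
conn.commit()
conn.close()
print(export_table_to_text("demo.db", "notes", "source_documents"))  # → 1
```

The point of the per-row text files is just to get the data into a shape the ingestion step understands; for a large database you would batch rows per file instead, since thousands of tiny files slow ingestion down.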
-
What solution would best suit a SaaS for reading and answering questions about data from PDF files uploaded by users
I've been doing exactly this with an open source repository called PrivateGPT: imartinez/privateGPT, "Interact privately with your documents using the power of GPT, 100% privately, no data leaks" (github.com)
- How to run an open source AI model, offline, on my own computer?
- Check out my script which installs privateGPT for Linux!
-
are there any tools or frameworks similar to "langchain" or "llamaindex" but implemented or designed in a language other than Python?
Not really; you will probably need to change the data location and the LLM provider in the example code to get it running. But you don't have to implement that yourself: there are a couple of projects that already do it, like privateGPT. I use it for searching datasheets, got it up and running in a few hours, and I'm pretty happy with it so far.
-
Intern tasked to make a "local" version of chatGPT for my work
PrivateGPT can do that.
- I've made privateGPT work for Linux check it out (documents)
- I've made privateGPT work for Linux check it out
What are some alternatives?
Flowise - Drag & drop UI to build your customized LLM flow
private-gpt - Interact with your documents using the power of GPT, 100% privately, no data leaks
langchain-visualizer - Visualization and debugging tool for LangChain workflows
nanoGPT - The simplest, fastest repository for training/finetuning medium-sized GPTs.
langchain - ⚡ Building applications with LLMs through composability ⚡ [Moved to: https://github.com/langchain-ai/langchain]
Local-LLM-Comparison-Colab-UI - Compare the performance of different LLMs that can be deployed locally on consumer hardware. Run it yourself with the Colab WebUI.
Voyager - An Open-Ended Embodied Agent with Large Language Models
GPTQ-for-LLaMa - 4 bits quantization of LLaMa using GPTQ
llm - An ecosystem of Rust libraries for working with large language models
serge - A web interface for chatting with Alpaca through llama.cpp. Fully dockerized, with an easy to use API.
llm-chain - `llm-chain` is a powerful rust crate for building chains in large language models allowing you to summarise text and complete complex tasks