|  | Flowise | llama.cpp |
|---|---|---|
| Mentions | 21 | 778 |
| Stars | 25,017 | 57,984 |
| Growth | 9.0% | - |
| Activity | 9.9 | 10.0 |
| Latest commit | 2 days ago | 2 days ago |
| Language | TypeScript | C++ |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
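The recency weighting behind such an activity number can be sketched as a toy formula (hypothetical; the exact weighting used by the tracker is not published):

```python
from datetime import date, timedelta

def activity_score(commit_dates, today, half_life_days=30):
    """Toy recency-weighted commit count: each commit contributes
    2 ** (-age / half_life_days), so recent commits weigh more."""
    return sum(
        2 ** (-(today - d).days / half_life_days) for d in commit_dates
    )

today = date(2024, 5, 1)
recent = [today - timedelta(days=n) for n in (1, 2, 3)]
stale = [today - timedelta(days=n) for n in (300, 310, 320)]

# Three fresh commits outweigh three ten-month-old ones.
assert activity_score(recent, today) > activity_score(stale, today)
```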
Flowise
- FLaNK Stack Weekly 12 February 2024
- Docker Image not running. Error Command failed with exit code 127
The GitHub repo is: https://github.com/FlowiseAI/Flowise
- Show HN: Rivet – open-source AI Agent dev env with real-world applications
- https://github.com/FlowiseAI/Flowise
It's absolutely OK if the answer is "Yes"; I think that in this hot market each product will find its place, and competition is also motivating :)
It would also be nice to add Rivet here:
- Show HN: ChainForge, a visual tool for prompt engineering and LLM evaluation
- Bing claims my Yamaha sound bar has a 3.5 mm mini-jack, and when the error is pointed out it doubles down by inventing a reference to a non-existing manual, including a firmware update adding a physical 3.5 mm input port
If you're doing anything more custom or advanced than the regular ChatGPT type of interface can handle, you can use Flowise to build your own bot with any number of advanced plugins: internet search, calculators, recursion, file read/write access, long-term memory, other AIs, and more.
- How to add SystemMessage to ConversationalRetrievalQAChain?
Dove into Flowise Docs for a useful example.
- How to create a Langchain application that can chat with multiple large JSON files
For rapid prototyping try Flowise
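The core pattern behind such a chain, independent of any framework, is flattening each JSON file into small "path: value" chunks that a retriever can index. A minimal sketch (helper name and path syntax are illustrative, not from Flowise or LangChain):

```python
import json

def json_to_chunks(payload, path="$"):
    """Flatten a parsed JSON value into 'path: value' strings,
    one per leaf, so a retriever can index them individually."""
    if isinstance(payload, dict):
        for key, value in payload.items():
            yield from json_to_chunks(value, f"{path}.{key}")
    elif isinstance(payload, list):
        for i, value in enumerate(payload):
            yield from json_to_chunks(value, f"{path}[{i}]")
    else:
        yield f"{path}: {payload}"

doc = json.loads('{"user": {"name": "Ada", "langs": ["en", "fr"]}}')
chunks = list(json_to_chunks(doc))
# chunks -> ['$.user.name: Ada', '$.user.langs[0]: en', '$.user.langs[1]: fr']
```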
- What exactly is AutoGPT?
Flowise as well, you are right!
- LocalAI v1.18.0 release!
Flowise
- April 2023
Drag & drop UI to build your customized LLM flow using LangchainJS (https://github.com/FlowiseAI/Flowise)
llama.cpp
- IBM Granite: A Family of Open Foundation Models for Code Intelligence
If you can compile stuff, then looking at llama.cpp (what Ollama uses) is also interesting: https://github.com/ggerganov/llama.cpp
The server is here: https://github.com/ggerganov/llama.cpp/tree/master/examples/...
And you can search for any GGUF on Hugging Face.
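Putting the pieces from that comment together, a typical way to try the bundled HTTP server looks like this (the model path is a placeholder; substitute any GGUF you download):

```shell
# Build llama.cpp first (e.g. `make`), then launch the example HTTP server.
# The model filename below is a placeholder -- use any GGUF from Hugging Face.
# -m: model path, -c: context size in tokens.
./server -m ./models/mistral-7b-instruct.Q4_K_M.gguf -c 4096 --host 127.0.0.1 --port 8080
```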
- Ask HN: Affordable hardware for running local large language models?
Yes, Metal seems to allow a maximum of 1/2 of the RAM for one process, and 3/4 of the RAM allocated to the GPU overall. There’s a kernel hack to fix it, but that comes with the usual system integrity caveats. https://github.com/ggerganov/llama.cpp/discussions/2182
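For reference, the "kernel hack" in that discussion is a sysctl override of the GPU wired-memory limit. The exact key varies by macOS version, so treat the line below as an illustration and read the linked discussion before trying it:

```shell
# Raise the Metal wired-memory limit on Apple Silicon (Sonoma-era key name;
# earlier macOS versions used a different sysctl -- see the linked discussion).
# The value is in MB (here ~28 GB on a 32 GB machine) and resets on reboot.
sudo sysctl iogpu.wired_limit_mb=28672
```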
- Xmake: A modern C/C++ build tool
- Better and Faster Large Language Models via Multi-Token Prediction
For anyone interested in exploring this, llama.cpp has an example implementation here:
https://github.com/ggerganov/llama.cpp/tree/master/examples/...
- Llama.cpp Bfloat16 Support
- Fine-tune your first large language model (LLM) with LoRA, llama.cpp, and KitOps in 5 easy steps
Getting started with LLMs can be intimidating. In this tutorial, we show you how to fine-tune a large language model using LoRA, with the help of tools like llama.cpp and KitOps.
- GGML Flash Attention support merged into llama.cpp
- Phi-3 Weights Released
well https://github.com/ggerganov/llama.cpp/issues/6849
- Lossless Acceleration of LLM via Adaptive N-Gram Parallel Decoding
- Llama.cpp Working on Support for Llama3
What are some alternatives?
langflow - ⛓️ Langflow is a dynamic graph where each node is an executable unit. Its modular and interactive design fosters rapid experimentation and prototyping, pushing hard on the limits of creativity.
ollama - Get up and running with Llama 3, Mistral, Gemma, and other large language models.
llama.go - llama.go is like llama.cpp in pure Golang!
gpt4all - gpt4all: run open-source LLMs anywhere
chatbot-ui - AI chat for every model.
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
rivet-example
GPTQ-for-LLaMa - 4 bits quantization of LLaMA using GPTQ
deepdoctection - A Repo For Document AI
ggml - Tensor library for machine learning
hugo-quick-start - Hugo Quick Start on Render
alpaca.cpp - Locally run an Instruction-Tuned Chat-Style LLM