Neurite vs graph-of-thoughts

| | Neurite | graph-of-thoughts |
|---|---|---|
| Mentions | 47 | 2 |
| Stars | 768 | 1,867 |
| Growth | - | 3.9% |
| Activity | 9.2 | 6.4 |
| Latest commit | 6 days ago | 27 days ago |
| Language | JavaScript | Python |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Neurite
- Q* Could Be It - Forget AlphaGO - It's Diplomacy - Peg 1 May Have Fallen - Noam Brown May Have Achieved The Improbable - Is this Q* Leak 2.0?
- Node interface physics simulated within a fractal
- Tuesday Self-Promotion Thread
- Neurite has seen some significant improvements. Will be creating more video documentation soon.
- Looking for open-source projects to contribute to
- How to include mindmaps into notes?
- Do you think we will be able to create our "own internet" in the future? By that I mean, simulating sites like reddit populated with AIs reacting to our content.
- How to make ChatGPT remember previous logs?
graph-of-thoughts
- Q* Could Be It - Forget AlphaGO - It's Diplomacy - Peg 1 May Have Fallen - Noam Brown May Have Achieved The Improbable - Is this Q* Leak 2.0?

  We introduce Graph of Thoughts (GoT): a framework that advances prompting capabilities in large language models (LLMs) beyond those offered by paradigms such as Chain-of-Thought or Tree of Thoughts (ToT). The key idea and primary advantage of GoT is the ability to model the information generated by an LLM as an arbitrary graph, where units of information ("LLM thoughts") are vertices, and edges correspond to dependencies between these vertices. This approach enables combining arbitrary LLM thoughts into synergistic outcomes, distilling the essence of whole networks of thoughts, or enhancing thoughts using feedback loops. We illustrate that GoT offers advantages over the state of the art on different tasks, for example increasing the quality of sorting by 62% over ToT while simultaneously reducing costs by >31%. We ensure that GoT is extensible with new thought transformations and thus can be used to spearhead new prompting schemes. This work brings LLM reasoning closer to human thinking or brain mechanisms such as recurrence, both of which form complex networks. Website & code: https://github.com/spcl/graph-of-thoughts
- Graph of Thoughts: Solving Elaborate Problems with Large Language Models

  Code for the paper: https://github.com/spcl/graph-of-thoughts
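The core idea from the abstract — thoughts as graph vertices, dependency edges between them, and an aggregation transformation that merges several thoughts into one — can be sketched in a few lines of Python. This is an illustrative toy, not the actual graph-of-thoughts library API; the `Thought` class and `aggregate` helper are invented here for demonstration, using the paper's sorting example (merging partially sorted sublists).

```python
# Toy sketch of the Graph-of-Thoughts structure: thoughts are vertices,
# parent links are dependency edges, and aggregate() combines several
# thoughts into a new one. Hypothetical names; not the real library's API.
from dataclasses import dataclass, field


@dataclass
class Thought:
    content: str
    parents: list = field(default_factory=list)  # incoming dependency edges


def aggregate(thoughts, combine):
    """Merge several thoughts into one, recording the dependency edges."""
    merged = combine([t.content for t in thoughts])
    return Thought(content=merged, parents=list(thoughts))


# The paper's sorting use case: two sorted sublists merged into one.
a = Thought("[1, 4, 7]")
b = Thought("[2, 3, 9]")
merged = aggregate([a, b], lambda cs: str(sorted(
    x for c in cs for x in eval(c))))
print(merged.content)  # → [1, 2, 3, 4, 7, 9]
```

In the real framework the `combine` step would itself be an LLM call, and further transformations (refinement via feedback loops, scoring, pruning) operate over the same graph of dependencies.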
What are some alternatives?
- tree-of-thought-llm - [NeurIPS 2023] Tree of Thoughts: Deliberate Problem Solving with Large Language Models
- trex - Enforce structured output from LLMs 100% of the time
- tree-of-thoughts - Plug-and-play implementation of Tree of Thoughts: Deliberate Problem Solving with Large Language Models that elevates model reasoning by at least 70%
- prompttools - Open-source tools for prompt testing and experimentation, with support for both LLMs (e.g. OpenAI, LLaMA) and vector databases (e.g. Chroma, Weaviate, LanceDB)
- openai-cookbook - Examples and guides for using the OpenAI API
- llm-guard - The Security Toolkit for LLM Interactions
- zep - Zep: Long-Term Memory for AI Assistants
- prompt-lib - A set of utilities for running few-shot prompting experiments on large language models
- AgentGPT - 🤖 Assemble, configure, and deploy autonomous AI Agents in your browser
- ToolEmu - A language model (LM)-based emulation framework for identifying the risks of LM agents with tool use
- auto-gpt-web - Set Your Goals, AI Achieves Them
- ChainFury - 🦋 Production-grade chaining engine behind TuneChat. Self-host today!