| | AutoGPT | openai-cookbook |
|---|---|---|
| Mentions | 180 | 215 |
| Stars | 161,405 | 55,954 |
| Star growth (monthly) | 0.7% | 1.0% |
| Activity | 9.9 | 9.5 |
| Latest commit | 4 days ago | 3 days ago |
| Language | JavaScript | MDX |
| License | MIT License | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
AutoGPT
- Accessible AI for Everyone
- AGI has, in some sense, been achieved: Tell me why I am wrong
Define agency. Does AutoGPT or BabyAGI fit the definition?
- The Emergence of Autonomous Agents
This leap is evident in projects like BabyAGI and AutoGPT, showcasing how such agents can prioritize and execute tasks based on a pre-defined objective and the results of previous actions, such as sales prospecting or ordering pizza.
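The "prioritize and execute tasks based on a pre-defined objective and previous results" loop can be sketched in a few lines. This is a toy illustration, not BabyAGI's or AutoGPT's actual code: `fake_llm` stands in for a real model call, and the follow-up-task step is hard-coded where a real agent would ask the LLM to propose new tasks.

```python
from collections import deque

def fake_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; returns a canned result string."""
    return f"result of: {prompt}"

def run_agent(objective: str, initial_task: str, max_steps: int = 3) -> list[str]:
    """Tiny agent loop: pop the front task, execute it against the
    objective, record the result, and enqueue a follow-up task."""
    tasks = deque([initial_task])
    results = []
    for step in range(max_steps):
        if not tasks:
            break
        task = tasks.popleft()
        result = fake_llm(f"Objective: {objective}. Task: {task}")
        results.append(result)
        # A real agent would have the LLM derive new tasks from `result`.
        tasks.append(f"follow up on step {step + 1}")
    return results

print(len(run_agent("order pizza", "find a pizzeria")))  # 3 steps executed
```

The key design point is that each iteration's output feeds the next iteration's task queue, which is what distinguishes these agents from a single prompt-response exchange.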
- An experimental open-source attempt to make GPT-4 autonomous
- [Long read] Deep dive into AutoGPT: A comprehensive and in-depth step-by-step guide to how it works
A system and a user message are constructed from the task given by the user in code and passed to the LLM as input.
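The system/user message construction described above can be sketched as follows. This is a simplified illustration of the pattern, not AutoGPT's actual prompt (the real system prompt is much longer and includes the agent's goals, constraints, and available commands); the wording here is invented.

```python
def build_messages(task: str) -> list[dict]:
    """Assemble a system and a user message from the user's task,
    in the shape expected by chat-style LLM APIs."""
    system_prompt = (
        "You are an autonomous agent. Decide the next command to run "
        "and respond in JSON."
    )
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": f"Determine the next step for: {task}"},
    ]

messages = build_messages("research pizza delivery options")
# messages[0] is the system message, messages[1] the user message;
# this list is what gets passed to the LLM as input.
```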
- 1000 Member Celebration and FAQ
A: How much do you know? If you can easily read code (Python in this example, though anyone who can read code will benefit), you should check out Auto-GPT. If you are looking to explore different options, check out this doc on AI Agents.
- Agents: An Open-source Framework for Autonomous Language Agents - AIWaves Inc 2023
Also, I think most agents I have seen have implemented some form of long/short-term memory. Why does it say AutoGPT doesn't support it? https://github.com/Significant-Gravitas/Auto-GPT/tree/master/autogpts/autogpt/autogpt/memory
- MetaGPT: The Next Evolution or Just More Hype?
In my newest experiment, I try out MetaGPT, which is supposed to be better than AutoGPT according to MetaGPT's paper.
- List of Awesome AI Agents like AutoGPT and BabyAGI / Many open-source Agents with code included!
In my opinion the most interesting Agents:
- Auto-GPT: https://github.com/Significant-Gravitas/Auto-GPT
- BabyAGI: https://github.com/yoheinakajima/babyagi
- Voyager: https://github.com/MineDojo/Voyager / Paper: https://arxiv.org/abs/2305.16291

I would also add:
- ChemCrow (Augmenting large-language models with chemistry tools): https://github.com/ur-whitelab/chemcrow-public/ / Paper: https://arxiv.org/abs/2304.05376
- We've released Auto-GPT v0.4.5!
Check out the new Re-Arch README and ARCHITECTURE_NOTES.
openai-cookbook
- Question-Answer System Architectures using LLMs
A pretrained LLM is a closed-book system: it can only access information it was trained on. With domain fine-tuning, the system can draw on additional material. An early prototype of this technique was shown in the OpenAI cookbook: text from the target domain was embedded using an API, and at query time the embeddings most semantically similar to the question were retrieved to formulate an answer. Although this approach evolved into retrieval-augmented generation, it's still a useful technique for adapting a Gen2 (2020) or Gen3 (2022) LLM into a question-answering system.
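The embed-then-retrieve-by-similarity step described above can be sketched without any API at all. This toy version uses bag-of-words counts in place of real embeddings (a real system would embed with a model such as OpenAI's embeddings endpoint), but the cosine-similarity ranking is the same idea.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy embedding: bag-of-words counts. A real system would call an
    embeddings API here instead."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank documents by similarity to the query; the top hits are what
    would be pasted into the LLM prompt as context."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = ["the capital of France is Paris", "GPT-4 is a language model"]
print(retrieve("what is the capital of France", docs))
```

Swapping `embed` for a real embedding model, and prepending the retrieved documents to the prompt, is essentially the architecture the cookbook notebook demonstrates.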
- Ask HN: High quality Python scripts or small libraries to learn from
https://github.com/openai/openai-cookbook/blob/main/examples...
- Collection of notebooks showcasing some fun and effective ways of using Claude
- OpenAI Cookbook: Techniques to improve reliability
- OpenAI Cookbooks
- How to fine-tune a ViT/convnet to focus on the layout of the input room image and ignore other things?
It sounds like you are trying to tweak embeddings for similarity search. Rather than fine-tuning the model's layers, you may want to try training a linear transformation on the existing model's output embeddings. OpenAI has a cookbook on how to do that. You will need some data, though - but I think you can try it with ~20 pieces of synthetically generated data.
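The "train a linear transformation on frozen embeddings" idea can be sketched as a least-squares fit. This is a minimal illustration under invented synthetic data (the random `anchors`/`targets` pairs stand in for embedding pairs you want pulled together), not the cookbook's actual training procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these are frozen model embeddings for ~20 synthetic pairs:
# after the transform, anchors[i] should land near targets[i].
anchors = rng.normal(size=(20, 8))
targets = anchors @ rng.normal(size=(8, 8)) * 0.5 + rng.normal(size=(20, 8)) * 0.05

# Learn a linear map W by least squares instead of fine-tuning the model.
W, *_ = np.linalg.lstsq(anchors, targets, rcond=None)

def transform(embedding: np.ndarray) -> np.ndarray:
    """Apply the learned linear transformation to a frozen embedding."""
    return embedding @ W

# The transformed anchors should approximate their targets.
err = np.linalg.norm(transform(anchors) - targets) / np.linalg.norm(targets)
print(f"relative error: {err:.3f}")
```

The appeal of this approach is exactly what the answer says: the base model stays untouched, so you only need enough pairs to fit one small matrix rather than enough to fine-tune a network.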
- Best base model 1B or 7B for full finetuning
tutorial from OpenAI https://github.com/openai/openai-cookbook/blob/main/examples/Question_answering_using_embeddings.ipynb
- Resources to learn ChatGPT and the OpenAI API
OpenAI Cookbook
- OpenAI Cookbook
- Another Major Outage Across ChatGPT and API
OpenAI community repo with lots of examples: https://github.com/openai/openai-cookbook
What are some alternatives?
langchain - ⚡ Building applications with LLMs through composability ⚡ [Moved to: https://github.com/langchain-ai/langchain]
gpt4all - gpt4all: run open-source LLMs anywhere
gpt4-pdf-chatbot-langchain - GPT4 & LangChain Chatbot for large PDF docs
llama.cpp - LLM inference in C/C++
chatgpt-retrieval-plugin - The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
Auto-Vicuna
askai - Command Line Interface for OpenAi ChatGPT
JARVIS - JARVIS, a system to connect LLMs with ML community. Paper: https://arxiv.org/pdf/2303.17580.pdf
gpt_index - LlamaIndex (GPT Index) is a project that provides a central interface to connect your LLM's with external data. [Moved to: https://github.com/jerryjliu/llama_index]
SuperAGI - <⚡️> SuperAGI - A dev-first open source autonomous AI agent framework. Enabling developers to build, manage & run useful autonomous agents quickly and reliably.
txtai - 💡 All-in-one open-source embeddings database for semantic search, LLM orchestration and language model workflows