| | llama_index | gpt-llama.cpp |
|---|---|---|
| Mentions | 78 | 12 |
| Stars | 41,098 | 599 |
| Growth | 4.4% | 0.7% |
| Activity | 9.9 | 8.2 |
| Latest commit | 5 days ago | almost 2 years ago |
| Language | Python | JavaScript |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
llama_index
- Complete Large Language Model (LLM) Learning Roadmap
  Resource: LlamaIndex Documentation
- Quick tip: Replace MongoDB® Atlas with SingleStore Kai in LlamaIndex
  The notebook is adapted from the LlamaIndex GitHub repo.
- Show HN: Route your prompts to the best LLM
- LlamaIndex: A data framework for your LLM applications
- FLaNK AI - 01 April 2024
- Show HN: Ragdoll Studio (fka Arthas.AI) is the FOSS alternative to character.ai
  For anyone curious about llamaindex's "prompt mixins": they're actually dead simple: https://github.com/run-llama/llama_index/blob/8a8324008764a7... - and maybe no longer supported.
  I basically reinvented this wheel in ragdoll, but made it more dynamic: https://github.com/bennyschmidt/ragdoll/blob/master/src/util...
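The "dead simple" pattern the commenter describes can be illustrated in a few lines. This is a minimal sketch of the general mixin idea only, not LlamaIndex's actual `PromptMixin` API: each mixin class contributes named prompt templates, and a composed class merges them by walking the method resolution order.

```python
# Sketch of the "prompt mixin" pattern: each mixin contributes named
# prompt templates; a composed class collects and merges all of them.
# Class and prompt names here are illustrative, not from any library.

class PromptMixin:
    def _get_prompts(self) -> dict:
        return {}

    def get_prompts(self) -> dict:
        # Walk the MRO from base to derived so later mixins can
        # override earlier prompts with the same key.
        prompts = {}
        for cls in reversed(type(self).__mro__):
            getter = cls.__dict__.get("_get_prompts")
            if getter is not None:
                prompts.update(getter(self))
        return prompts


class SummaryPrompts(PromptMixin):
    def _get_prompts(self):
        return {"summary": "Summarize the following text:\n{text}"}


class QAPrompts(PromptMixin):
    def _get_prompts(self):
        return {"qa": "Answer using the context:\n{context}\nQuestion: {question}"}


class MyEngine(SummaryPrompts, QAPrompts):
    """Composing the mixins yields both prompt sets."""


engine = MyEngine()
prompts = engine.get_prompts()
```

The dynamic variant the commenter built in ragdoll presumably swaps the class hierarchy for runtime composition, but the merge step is the same idea.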
- LlamaIndex is a data framework for your LLM applications
- How to verify that a snippet of Python code doesn't access protected members
- 🆓 Local & Open Source AI: a kind ollama & LlamaIndex intro
  Being able to plug in third-party frameworks (Langchain, LlamaIndex) so you can build complex projects
- I made an app that runs Mistral 7B 0.2 LLM locally on iPhone Pros
  Mistral Instruct does use a system prompt.
  You can see the raw format here: https://www.promptingguide.ai/models/mistral-7b#chat-templat... and you can see how LlamaIndex uses it here (as an example): https://github.com/run-llama/llama_index/blob/1d861a9440cdc9...
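As the linked prompting guide describes, Mistral Instruct wraps turns in `[INST] ... [/INST]` markers, and since the template has no dedicated system token, a system prompt is typically prepended to the first user message. A small sketch of that formatting, assuming that convention (the exact whitespace handling varies between implementations):

```python
def format_mistral_chat(messages: list[dict]) -> str:
    """Format chat messages into Mistral-7B-Instruct's [INST] template.

    Sketch based on the format in the prompting guide:
        <s>[INST] instruction [/INST] response</s>[INST] ... [/INST]
    There is no system token, so a system prompt is folded into the
    first user turn. Whitespace details are an assumption.
    """
    system = ""
    turns = []
    for m in messages:
        if m["role"] == "system":
            system = m["content"]
        else:
            turns.append(m)

    out = "<s>"
    first_user = True
    for m in turns:
        if m["role"] == "user":
            content = m["content"]
            if first_user and system:
                content = f"{system}\n\n{content}"
            first_user = False
            out += f"[INST] {content} [/INST]"
        else:  # assistant turn closes with the end-of-sequence token
            out += f" {m['content']}</s>"
    return out


prompt = format_mistral_chat([
    {"role": "system", "content": "Always answer briefly."},
    {"role": "user", "content": "What is llama.cpp?"},
])
```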
gpt-llama.cpp
- Attempt to run Llama on a remote server with chatbot-ui
  Hi! I really like the solution https://github.com/keldenl/gpt-llama.cpp, which helps to deploy https://github.com/mckaywrigley/chatbot-ui on a local model. I run this together with Wizard 7B or 13B locally and it works fine, but when I tried to move it to a remote server I hit an error.
- Introducing Basaran: self-hosted open-source alternative to the OpenAI text completion API
  Sounds like you're asking for exactly this? https://github.com/keldenl/gpt-llama.cpp
- LLaMA and AutoAPI?
- New big update to GPTNicheFinder: better trends analysis and scoring system, cleaned up UI and verbose output in the terminal for people who want to see what is going on and to verify the results
  I salute you, good sir. This is an amazing idea. I don't have time, but it would be interesting to use this wrapper, https://github.com/keldenl/gpt-llama.cpp, which simulates a GPT endpoint for a local llama, so basically we could have this amazing tool for completely free use. If somebody tests it, please let me know underneath my comment!
- I built an AI-powered writing tool, an AI co-author
  I would gladly buy your product to run with a local model, like Vicuna ggml; also see https://github.com/keldenl/gpt-llama.cpp/
- Serge... Just works
  Possible through fastllama in Python, or gpt-llama.cpp, an API wrapper around llama.cpp
- Embeddings?
  https://github.com/keldenl/gpt-llama.cpp supports embeddings, and it even takes in OpenAI-type requests and returns OpenAI-compatible responses!
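Because gpt-llama.cpp exposes an OpenAI-compatible surface, any client that can POST an OpenAI-style chat completion request can talk to it by swapping the base URL. A minimal stdlib-only sketch of building such a request; the base URL, port, and model path below are placeholder assumptions to adjust for your own server:

```python
import json
import urllib.request

# Sketch of addressing gpt-llama.cpp as a drop-in OpenAI endpoint.
# The base URL, port, and model path are assumptions -- point them
# at wherever your gpt-llama.cpp server and ggml model actually live.

def build_chat_request(base_url: str, model: str, messages: list[dict]):
    """Build an OpenAI-style POST to /v1/chat/completions."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request(
    "http://localhost:8000",                      # hypothetical local server
    "models/vicuna/13B/ggml-model-q4_0.bin",      # hypothetical model path
    [{"role": "user", "content": "Hello!"}],
)
# Sending it would be: urllib.request.urlopen(req)
```

Since the response shape mirrors OpenAI's, existing tooling (chatbot-ui, Auto-GPT forks, the official client libraries with a custom base URL) can consume it unchanged.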
- I built a completely local AutoGPT with the help of GPT-llama running Vicuna-13B
  https://github.com/keldenl/gpt-llama.cpp
- I built a completely local and portable AutoGPT with the help of gpt-llama, running on Vicuna-13b
- Adding Long-Term Memory to Custom LLMs: Let's Tame Vicuna Together!
  There's a (kind of) working Auto-GPT solution that uses Vicuna: https://github.com/keldenl/gpt-llama.cpp/blob/master/docs/Auto-GPT-setup-guide.md
What are some alternatives?
langchain - 🦜🔗 Build context-aware reasoning applications
semantic-kernel - Integrate cutting-edge LLM technology quickly and easily into your apps
text-generation-webui - A Gradio web UI for Large Language Models with support for multiple inference backends.
long_term_memory - A gradio web UI for running Large Language Models like GPT-J 6B, OPT, GALACTICA, LLaMA, and Pygmalion.
chatgpt-retrieval-plugin - The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
AGiXT - AGiXT is a dynamic AI Agent Automation Platform that seamlessly orchestrates instruction management and complex task execution across diverse AI providers. Combining adaptive memory, smart features, and a versatile plugin system, AGiXT delivers efficient and comprehensive AI solutions.