LongMem vs obsidian-ollama

| | LongMem | obsidian-ollama |
|---|---|---|
| Mentions | 3 | 3 |
| Stars | 736 | 679 |
| Growth | - | - |
| Activity | 5.0 | 2.7 |
| Last commit | 2 months ago | 30 days ago |
| Language | Python | TypeScript |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
LongMem
- Is anyone using self hosted LLM day to day and training it like a new employee
-
Putting Together the Pieces of Transformative AI
Long Term Memory - Voyager, MemGPT, and LongMem
-
Augmenting Language Models with Long-Term Memory
They just pushed the code 👨‍💻 https://github.com/Victorwz/LongMem
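The paper's core idea is a decoupled memory: key/value vectors from past context are cached, and for each new query the model retrieves the most similar cached keys and attends over their values. A minimal NumPy sketch of that retrieve-then-attend step (illustrative only, not the repo's actual code; function names are my own):

```python
# Minimal sketch of LongMem's decoupled-memory idea: cache key/value vectors
# from past context, then for each new query retrieve the top-k nearest cached
# keys and softmax-attend over the retrieved values.
import numpy as np

def retrieve_topk(query, mem_keys, mem_values, k=2):
    """Return the k cached (key, value) pairs most similar to the query."""
    scores = mem_keys @ query            # dot-product similarity per cached key
    idx = np.argsort(scores)[::-1][:k]   # indices of the k highest scores
    return mem_keys[idx], mem_values[idx]

def memory_attention(query, mem_keys, mem_values, k=2):
    """Attend over the retrieved values with softmax weights."""
    keys, values = retrieve_topk(query, mem_keys, mem_values, k)
    weights = np.exp(keys @ query)
    weights /= weights.sum()
    return weights @ values
```

In the actual model this memory attention is fused with local self-attention inside a residual side-network; the sketch isolates only the retrieval step.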
obsidian-ollama
-
Show HN: NotesOllama – I added local LLM support to Apple Notes (through Ollama)
This lets you talk to local LLMs in Apple Notes. I saw Obsidian Ollama (https://github.com/hinterdupfinger/obsidian-ollama) and thought it was handy, but I'm too lazy to migrate away from the Apple ecosystem, so I quickly hacked this together. I tend to use Notes as a scratchpad for prompts, so it's nice to do some quick inference without leaving the app.
Notes doesn't really support plugins so I'm using the macOS accessibility API for reading selections and then stream responses using the clipboard (not ideal but it works).
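On the streaming side: Ollama's `/api/generate` endpoint streams newline-delimited JSON, each object carrying a piece of the completion in its `response` field. A small sketch of reassembling that stream (the clipboard-pasting step NotesOllama describes is omitted; the helper name is my own):

```python
# Reassemble a streamed Ollama /api/generate payload. Each line is a JSON
# object; intermediate chunks carry text in "response", and the final object
# sets "done": true.
import json

def join_stream_chunks(ndjson: str) -> str:
    """Concatenate the `response` fields of a streamed NDJSON payload."""
    text = []
    for line in ndjson.splitlines():
        if line.strip():
            chunk = json.loads(line)
            text.append(chunk.get("response", ""))
    return "".join(text)
```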
-
Is anyone using self hosted LLM day to day and training it like a new employee
Having come from Notion also, I LOVE Obsidian for its non-proprietary file structure. Incredibly powerful plugins, too.
FWIW, it's not really what you're seeking... but there is a plugin that allows you to invoke an LLM from within Obsidian (via Ollama): https://github.com/hinterdupfinger/obsidian-ollama
In short, it allows you to set up prompts to transform selected text directly within a file, e.g. 'Summarize this selection as a markdown formatted list of key points', 'Write a PRD', 'Translate to [Language]', 'Run this as a prompt', etc.
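The "transform selected text" pattern described above can be sketched as follows: a named prompt template is filled with the selection and sent to a local Ollama server. This is a hypothetical sketch, not the plugin's actual code; the function names and the `{selection}` placeholder are illustrative assumptions:

```python
# Sketch: fill a prompt template with the selected text and send it to a
# local Ollama server's generate endpoint (non-streaming for simplicity).
import json
import urllib.request

def build_prompt(template: str, selection: str) -> str:
    """Replace the {selection} placeholder with the selected text."""
    return template.replace("{selection}", selection)

def transform_selection(template: str, selection: str,
                        model: str = "llama2") -> str:
    """POST the filled prompt to Ollama and return the completion text."""
    payload = json.dumps({
        "model": model,
        "prompt": build_prompt(template, selection),
        "stream": False,
    }).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Usage would look like `transform_selection("Summarize this selection as a markdown formatted list of key points:\n{selection}", selected_text)`, assuming an Ollama server is running locally on its default port 11434.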
-
Plugin to bring the power of local LLMs to logseq with (ollama-logseq)
Hello guys, I was jealous that Obsidian had a plugin integrating with Ollama, so I decided to make one for Logseq myself. Ollama essentially allows you to play with local LLMs like Llama 2, Orca Mini, Vicuna, and many more. Some of these LLMs perform at levels close to ChatGPT 3.5 on some tasks, and the fact that they run locally is awesome. Here is a quick demo.
What are some alternatives?
MemGPT - Create LLM agents with long-term memory and custom tools 📚🦙
ollama - Get up and running with Llama 3, Mistral, Gemma, and other large language models.
autogen - A programming framework for agentic AI. Discord: https://aka.ms/autogen-dc. Roadmap: https://aka.ms/autogen-roadmap
ollama-logseq - Logseq plugin to integrate with ollama