obsidian-copilot vs til

| | obsidian-copilot | til |
|---|---|---|
| Mentions | 5 | 4 |
| Stars | 445 | 74 |
| Growth | - | - |
| Activity | 7.3 | 7.7 |
| Latest commit | 3 months ago | 12 days ago |
| Language | Python | Python |
| License | Apache License 2.0 | Creative Commons Zero v1.0 Universal |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
obsidian-copilot
-
Ask HN: Has Anyone Trained a personal LLM using their personal notes?
I hadn't seen your repo yet [1] - adding it to my list right now.
Your blog post is really neat on top of that - thanks for sharing.
https://github.com/eugeneyan/obsidian-copilot
-
Obsidian-Copilot: A Prototype Assistant for Writing and Thinking
Um... can someone explain what this actually does?
In the video the user chooses the 'Copilot: Draft' action, and wow, it generates code...
...but, the 'draft' action [1] calls `/get_chunks` and then runs 'queryLLM' [2] which then just invokes 'https://api.openai.com/v1/chat/completions' directly.
So, generating text this way is 100% not interesting or relevant.
What's interesting here is how it builds the prompt it sends to the OpenAI API.
So... can anyone shed some light on what the actual code [3] in get_chunks() does, and why you would... hm... I guess, do a lookup and pass the results to the openai api, instead of just the raw text?
The repo says: "You write a section header and the copilot retrieves relevant notes & docs to draft that section for you.", and you can see in the linked post [4], this is basically what the OP is trying to implement here; you write 'I want X', and the plugin (a bit like copilot) does a lookup of related documents, crafts a meta-prompt and passes the prompt to the openai api.
...but, it doesn't seem to do that. It seems to ignore your actual prompt, lookup related documents by embedding similarity... and then... pass those documents in as the prompt?
I'm pretty confused as to why you would want that.
It basically requires that you write your prompt separately beforehand, so you can invoke it magically with a one-line prompt later. Did I misunderstand how this works?
[1] - https://github.com/eugeneyan/obsidian-copilot/blob/bdabdc422...
[2] - https://github.com/eugeneyan/obsidian-copilot/blob/bdabdc422...
[3] - https://github.com/eugeneyan/obsidian-copilot/blob/main/src/...
[4] - https://eugeneyan.com/writing/llm-experiments/#shortcomings-...
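The flow the commenter pieces together (look up related notes by similarity, fold them into a prompt, send that to the chat completions endpoint) can be sketched roughly as follows. This is illustrative only, not the repo's actual code: plain token overlap stands in for embedding similarity, `build_prompt` is a hypothetical helper, and no API call is made.

```python
# Sketch of a retrieve-then-draft pipeline. Similarity here is naive token
# overlap; a real implementation would use embedding vectors (e.g. via a
# vector index) and then POST the prompt to a chat completions endpoint.

def get_chunks(header: str, notes: list[str], k: int = 2) -> list[str]:
    """Rank notes by token overlap with the section header; keep the top-k."""
    header_tokens = set(header.lower().split())
    scored = sorted(
        notes, key=lambda n: -len(header_tokens & set(n.lower().split()))
    )
    return scored[:k]

def build_prompt(header: str, chunks: list[str]) -> str:
    """Assemble the meta-prompt: retrieved context plus the drafting request."""
    context = "\n".join(f"- {c}" for c in chunks)
    return (
        f"Relevant notes:\n{context}\n\n"
        f"Draft a section titled '{header}' using only the notes above."
    )

notes = [
    "retrieval augmented generation uses your notes",
    "bananas are rich in potassium",
    "embeddings rank notes by similarity",
]
header = "retrieval with embeddings notes"
prompt = build_prompt(header, get_chunks(header, notes))
```

This also suggests an answer to the commenter's question: the section header is treated as the query, so the "one-line prompt" is only a retrieval key, and the substance of the draft comes from the retrieved notes.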
til
-
Ask HN: Has Anyone Trained a personal LLM using their personal notes?
Folks at GitBook are kind enough to give me an LLM over my notes: https://til.bhupesh.me
- How to undo anything in Git
-
Generate Feed of recent files inside a Git repository
I like to log my notes & TILs in a git repository, and recently I had an idea to showcase (& automate) my most recent learnings on my GitHub profile.
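One plausible way to build such a feed is to parse the file names out of `git log --name-only`, keeping each path's first (newest) appearance. This is an illustrative sketch under that assumption, not the post's actual implementation:

```python
# Build a "recently touched files" feed from git history.
import subprocess

def parse_recent(log_output: str, limit: int = 5) -> list[str]:
    """Deduplicate file paths from `git log --name-only` output, newest first."""
    seen, recent = set(), []
    for line in log_output.splitlines():
        path = line.strip()
        if path and path not in seen:
            seen.add(path)
            recent.append(path)
            if len(recent) == limit:
                break
    return recent

def recent_files(limit: int = 5) -> list[str]:
    """Run inside a git repo; returns the most recently changed file paths."""
    out = subprocess.run(
        ["git", "log", "--pretty=format:", "--name-only"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_recent(out, limit)
```

The output of `recent_files()` could then be templated into the profile README by a scheduled GitHub Action.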
-
Keep your URLs healthy using Github Actions and Go
You can see the generated issue here, along with the Action summary.
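For a sense of what such a link check does, here is a minimal Python sketch (the actual tool is a separate Go program; treating 2xx/3xx as healthy is an assumption of this sketch):

```python
# Minimal URL health check: HEAD each URL and classify the status code.
from urllib.request import Request, urlopen

def is_healthy(status: int) -> bool:
    """Treat 2xx and 3xx responses as healthy, everything else as broken."""
    return 200 <= status < 400

def check_url(url: str, timeout: float = 5.0) -> bool:
    """Return True if the URL responds with a healthy status code."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return is_healthy(resp.status)
    except OSError:  # covers URLError, timeouts, connection failures
        return False
```

Broken URLs found this way can then be collected into the body of an automatically created issue, which appears to be what the Action described above does.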
What are some alternatives?
obsidian-smart-connections - Chat with your notes & see links to related content with AI embeddings. Use local models or 100+ via APIs like Claude, Gemini, ChatGPT & Llama 3
create-issue-from-file - A GitHub action to create an issue using content from a file
llmware - Providing enterprise-grade LLM-based development framework, tools, and fine-tuned models.
areyouok - A fast and easy-to-use URL health checker ⛑️ Keep your links healthy during tough times (out-of-the-box support for GitHub Actions)
tonic_validate - Metrics to evaluate the quality of responses of your Retrieval Augmented Generation (RAG) applications.
yournal.py - Fast (y)ournal script to make daily notes from your terminal.
chroma-langchain
ResuLLMe - Enhance your résumé with Large Language Models
markdown-embeddings-search - Obsidian notes to Pinecone embeddings, plus other files, in an effort to learn llama_index
autollm - Ship RAG based LLM web apps in seconds.