Continue
⏩ Open-source VS Code and JetBrains extensions that let you easily create your own modular AI software development system
I'm not aware of any reason we would be connecting to windows.net. I would be surprised if VS Code did not already have access to Desktop / Documents / etc., but if that is the case, then Continue reads the currently open files as context for the LLM. It would be very useful to hear more details about these two cases so we can try to reproduce and solve the problem. Would you be interested in opening a new issue? (https://github.com/continuedev/continue/issues/new)
Continue caches Python packages in ~/.continue/server/env after the first download, but something else might be causing the slower startup. We'll look into this as well!
Meilisearch is an open-source search library. We connect to meilisearch.com only to download the software, which then runs completely locally to power the search dropdown as you type. The line of code where we do this is here: https://github.com/continuedev/continue/blob/ce76e391775034c...
Is the extension not available on https://open-vsx.org/? (The marketplace for VSCodium.)
Thanks for sharing! We've been recommending something similar with the llama-cpp-python bindings (https://github.com/abetlen/llama-cpp-python#web-server), but I'll have to check that out.
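The llama-cpp-python web server mentioned above exposes an OpenAI-compatible HTTP API. As a rough sketch (assuming the server is already running locally; port 8000 is its usual default, but hedge accordingly), a completion request can be sent like this:

```python
import json
import urllib.request

def build_payload(prompt: str, max_tokens: int = 128) -> dict:
    """Build the request body for the OpenAI-compatible /v1/completions route."""
    return {"prompt": prompt, "max_tokens": max_tokens}

def complete(prompt: str, base_url: str = "http://localhost:8000") -> str:
    """Send a completion request to a locally running llama-cpp-python server."""
    req = urllib.request.Request(
        f"{base_url}/v1/completions",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The response mirrors the OpenAI completions schema.
        return json.loads(resp.read())["choices"][0]["text"]
```

Because the API shape matches OpenAI's, tools built against the OpenAI client can often be pointed at the local server by changing only the base URL.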
My open-source tool aider does this. In the chat you can use /run to launch any shell command and return its output to GPT. It's super helpful for running the code it just changed, or the related tests, and letting GPT figure out how to fix problems.
https://github.com/paul-gauthier/aider#in-chat-commands
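The /run pattern described above is easy to sketch: execute a shell command, capture its output, and hand the transcript back to the model as chat context. This is an illustrative simplification, not aider's actual implementation:

```python
import subprocess

def run_command(cmd: str) -> str:
    """Run a shell command, capturing combined stdout/stderr so the
    result can be appended to the chat as context for the LLM."""
    result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    output = result.stdout + result.stderr
    # Prefix with the command itself so the model sees what was run.
    return f"$ {cmd}\n{output}"

# The transcript can then be added to the conversation, e.g.:
# messages.append({"role": "user", "content": run_command("pytest -x")})
```

Capturing stderr as well as stdout matters here: failing tests and compiler errors are exactly what the model needs to see to propose a fix.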
One more that is highly extensible: https://github.com/ppipada/vscode-flexigpt
It supports multiple AI providers, customizable prompts, select-to-ask, comment-to-ask, history saving, Stack Overflow search, command-line exec, and more.