debugpy-run
Finds and runs debugpy for VS Code "remote attach" command line debugging. (by bulletmark)
code-llama-for-vscode
Use Code Llama with Visual Studio Code and the Continue extension. A local LLM alternative to GitHub Copilot. (by xNul)
| | debugpy-run | code-llama-for-vscode |
|---|---|---|
| Mentions | 1 | 5 |
| Stars | 63 | 511 |
| Growth | - | - |
| Activity | 4.5 | 4.6 |
| Last commit | 5 months ago | 8 months ago |
| Language | Python | Python |
| License | - | MIT License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
debugpy-run
Posts with mentions or reviews of debugpy-run.
We have used some of these posts to build our list of alternatives and similar projects.
code-llama-for-vscode
Posts with mentions or reviews of code-llama-for-vscode.
We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-01-16.
- Stable Code 3B: Coding on the Edge
How are people using codellama and this in their workflows?
I found one option: https://github.com/xNul/code-llama-for-vscode
But I'm guessing there are others, and they might differ in how they provide context to the model.
- LLMs up to 4x Faster With latest Nvidia drivers on Windows
Do you use https://github.com/xNul/code-llama-for-vscode or something else?
Haven’t found any good setup instructions for Linux or my Google skills are failing me.
- Continue with LocalAI: An alternative to GitHub's Copilot that runs locally
Ollama only works on Mac. Here is a portable option:
https://github.com/xnul/code-llama-for-vscode
- Code Llama for VS Code
- Code Llama for VSCode - A simple API which mocks llama.cpp to enable support for Code Llama with the Continue Visual Studio Code extension. Cross-platform support. No login/key/etc, 100% local.
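The idea of "mocking llama.cpp" described above can be sketched as a small local HTTP server that imitates a llama.cpp-style `/completion` endpoint, so a client such as the Continue extension can point at it instead of a real backend. This is a minimal illustrative sketch, not the project's actual implementation; the exact request and response fields (`prompt`, `content`) are assumptions for illustration.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class MockLlamaCppHandler(BaseHTTPRequestHandler):
    """Answers POST /completion with a canned response, imitating the
    shape of a llama.cpp-style completion endpoint."""

    def do_POST(self):
        if self.path != "/completion":
            self.send_error(404)
            return
        # Read the JSON request body (e.g. {"prompt": "..."}).
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        # A real bridge would forward request.get("prompt") to a locally
        # running model; here we just echo back a fixed completion.
        body = json.dumps({"content": "def hello():\n    print('hi')"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the demo quiet instead of logging every request to stderr.
        pass


def serve(port=8080):
    """Block forever, serving the mock endpoint on 127.0.0.1."""
    HTTPServer(("127.0.0.1", port), MockLlamaCppHandler).serve_forever()
```

A client would then be configured to send its completion requests to `http://127.0.0.1:8080/completion`; because everything stays on localhost, no login or API key is involved.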
What are some alternatives?
When comparing debugpy-run and code-llama-for-vscode you can also consider the following projects:
pyxirr - Rust-powered collection of financial functions.
ollama-webui - ChatGPT-Style WebUI for LLMs (Formerly Ollama WebUI) [Moved to: https://github.com/open-webui/open-webui]