code-llama-for-vscode vs tabby

| | code-llama-for-vscode | tabby |
|---|---|---|
| Mentions | 5 | 27 |
| Stars | 523 | 18,025 |
| Growth | - | 3.9% |
| Activity | 4.6 | 9.9 |
| Latest commit | 10 months ago | 5 days ago |
| Language | Python | Rust |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
code-llama-for-vscode
- Stable Code 3B: Coding on the Edge
How are people using codellama and this in their workflows?
I found one option: https://github.com/xNul/code-llama-for-vscode
But I'm guessing there are others, and they might differ in how they provide context to the model.
- LLMs up to 4x Faster With latest Nvidia drivers on Windows
Do you use https://github.com/xNul/code-llama-for-vscode or something else?
I haven't found any good setup instructions for Linux, or my Google skills are failing me.
- Continue with LocalAI: An alternative to GitHub's Copilot that runs locally
Ollama only works on Mac. Here is a portable option:
https://github.com/xnul/code-llama-for-vscode
- Code Llama for VS Code
- Code Llama for VSCode - A simple API which mocks llama.cpp to enable support for Code Llama with the Continue Visual Studio Code extension. Cross-platform support. No login/key/etc, 100% local.
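The description above captures the core trick: present llama.cpp's HTTP interface while running Code Llama behind it, so Continue connects as if it were talking to a llama.cpp server. A minimal sketch of that idea, assuming llama.cpp's `POST /completion` route with a JSON `prompt` field in and a `content` field out; `fake_generate` and the port are placeholders, not the project's actual code:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def fake_generate(prompt: str) -> str:
    # Placeholder for a real call into Code Llama inference.
    return f"# completion for: {prompt[:20]}..."

def build_completion_response(body: dict) -> dict:
    # llama.cpp's /completion endpoint returns generated text in "content".
    return {"content": fake_generate(body.get("prompt", ""))}

class MockLlamaCppHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        payload = json.dumps(build_completion_response(body)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

# To serve it locally (port is arbitrary):
# HTTPServer(("127.0.0.1", 8080), MockLlamaCppHandler).serve_forever()
```

Because the wire format matches what the client already speaks, no extension-side changes are needed; only the server's generation step differs.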
tabby
- LSP-AI: open-source language server serving as back end for AI code assistance
- 17 Best Developer Productivity Tools to Try
Two other tools emerging in this category are SuperMaven and TabbyML; both use fast, secure LLMs for code completion and recommendations.
- IBM Granite: A Family of Open Foundation Models for Code Intelligence
https://github.com/TabbyML/tabby can run self-hosted AI coding assistants. I tried it a while ago and it worked with Nvim pretty easily. There is a VS Code extension too. The extension will just sort of "read" along with you and provide suggestions from time to time. Anytime a suggestion is good you can press some key (by default) to accept it. It's basically autocomplete on steroids.
- Google CodeGemma: Open Code Models Based on Gemma [pdf]
- What AI assistants are already bundled for Linux?
NixOS just got tabbyml[1], which is built on llama.cpp. I'm working on systemd services this weekend and updating to the latest tabbyml release, which supports ROCm in addition to CUDA.
[1] https://github.com/TabbyML/tabby
[2] https://github.com/NixOS/nixpkgs/pull/291744
- FLaNK Stack Weekly 19 Feb 2024
- Show HN: Tabby back end in 20 Python lines (self-hosted AI coding assistant)
Nice implementation! It should serve as a great reference for a minimal implementation of Tabby's backend API. Thank you for sharing it!
Yeah - ultimately it won't be as performant or feature-rich as https://github.com/TabbyML/tabby, but it's still perfect for educational purposes!
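For a sense of what a tiny backend like the one discussed above involves: a hedged sketch of a Tabby-style completion endpoint, assuming a `POST /v1/completions` request that carries the code around the cursor in `segments.prefix`/`segments.suffix` and a response of `choices` entries with `text`. The field names and `complete()` are my assumptions for illustration, not the linked project's code:

```python
import json
import uuid
from http.server import BaseHTTPRequestHandler, HTTPServer

def complete(prefix: str, suffix: str) -> str:
    # Stand-in for a real model call (llama.cpp, transformers, ...).
    return "pass"

def build_response(request: dict) -> dict:
    # Echo back a single completion choice for the given cursor context.
    segments = request.get("segments", {})
    text = complete(segments.get("prefix", ""), segments.get("suffix", ""))
    return {"id": f"cmpl-{uuid.uuid4().hex}", "choices": [{"index": 0, "text": text}]}

class TabbyStyleHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/v1/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length) or b"{}")
        payload = json.dumps(build_response(request)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

# To serve it locally (port is arbitrary):
# HTTPServer(("127.0.0.1", 8080), TabbyStyleHandler).serve_forever()
```

Nearly all of the line count is HTTP plumbing, which is why the model call itself can be swapped out without touching the editor side.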
- Stable Code 3B: Coding on the Edge
- Show HN: I built local copilot alternative using Codellama
Looks interesting! What are the main differences between this and https://github.com/TabbyML/tabby ?
- Ask HN: Who is hiring? (October 2023)
TabbyML | Software Engineer (Rust) | REMOTE
Self-hosted AI coding assistant. An open-source, on-prem alternative to GitHub Copilot.
Project: https://github.com/TabbyML/tabby
Tabby is seeking a Software Engineer proficient in Rust to join our core engineering team. In this role, you will be responsible for developing the following features:
What are some alternatives?
ollama-webui - ChatGPT-Style WebUI for LLMs (Formerly Ollama WebUI) [Moved to: https://github.com/open-webui/open-webui]
fauxpilot - FauxPilot - an open-source alternative to GitHub Copilot server
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
turbopilot - Turbopilot is an open source large-language-model based code completion engine that runs locally on CPU
twinny - The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code - like GitHub Copilot but completely free and 100% private.
refact - WebUI for Fine-Tuning and Self-hosting of Open-Source Large Language Models for Coding
go-llama2 - Llama 2 inference in one file of pure Go
Finetune_LLMs - Repo for fine-tuning causal LLMs
aider - aider is AI pair programming in your terminal
AnglE - Train and Infer Powerful Sentence Embeddings with AnglE | 🔥 SOTA on STS and MTEB Leaderboard
ollama-ui - Simple HTML UI for Ollama