Code-llama-for-vscode Alternatives
Similar projects and alternatives to code-llama-for-vscode
- text-generation-webui: A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), and Llama models.
- ComfyUI: The most powerful and modular Stable Diffusion GUI, API, and backend, with a graph/nodes interface.
- sd-webui-lobe-theme: 🅰️ Lobe theme, the modern theme for Stable Diffusion WebUI: exquisite interface design, highly customizable UI, and efficiency-boosting features.
- twinny: The most no-nonsense, locally or API-hosted AI code completion plugin for Visual Studio Code; like GitHub Copilot, but completely free and 100% private.
- ollama-webui: Discontinued. ChatGPT-style web UI for LLMs (formerly Ollama WebUI). [Moved to: https://github.com/open-webui/open-webui]
- llama-gpt: A self-hosted, offline, ChatGPT-like chatbot powered by Llama 2. 100% private, with no data leaving your device. New: Code Llama support!
code-llama-for-vscode reviews and mentions
- Stable Code 3B: Coding on the Edge
  "How are people using Code Llama and this in their workflows? I found one option: https://github.com/xNul/code-llama-for-vscode. But I'm guessing there are others, and they might differ in how they provide context to the model."
- LLMs up to 4x Faster with Latest Nvidia Drivers on Windows
  "Do you use https://github.com/xNul/code-llama-for-vscode or something else? I haven't found any good setup instructions for Linux, or my Google skills are failing me."
-
Continue with LocalAI: An alternative to GitHub's Copilot that runs locally
Ollama only works on Mac. Here is a portable option:
https://github.com/xnul/code-llama-for-vscode
- Code Llama for VS Code
- Code Llama for VSCode - A simple API which mocks llama.cpp to enable support for Code Llama with the Continue Visual Studio Code extension. Cross-platform support. No login/key/etc, 100% local.
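The "mocks llama.cpp" approach can be sketched roughly as follows: a small local HTTP server that answers a llama.cpp-style `/completion` request, so a client such as the Continue extension can talk to it as if it were a llama.cpp server. This is a hypothetical, stdlib-only illustration under assumed request/response shapes (`prompt` in, `content` out), not the project's actual code; the `generate` stub stands in for a real Code Llama call.

```python
# Hypothetical sketch: a llama.cpp-style /completion endpoint (stdlib only).
# The endpoint shape and generate() stub are assumptions for illustration.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def generate(prompt: str) -> str:
    # Placeholder for the real Code Llama backend call.
    return f"# completion for: {prompt!r}"


class CompletionHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/completion":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        body = json.loads(self.rfile.read(length) or b"{}")
        reply = json.dumps({"content": generate(body.get("prompt", ""))})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(reply.encode())

    def log_message(self, *args):
        # Silence per-request logging.
        pass


if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8080), CompletionHandler).serve_forever()
```

Pointing a llama.cpp-compatible client at `http://127.0.0.1:8080` would then route its completion requests through this shim.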
Stats
xNul/code-llama-for-vscode is an open source project licensed under the MIT License, an OSI-approved license.
Its primary programming language is Python.
Popular Comparisons
- code-llama-for-vscode VS ollama-webui
- code-llama-for-vscode VS text-generation-webui
- code-llama-for-vscode VS go-llama2
- code-llama-for-vscode VS Finetune_LLMs
- code-llama-for-vscode VS twinny
- code-llama-for-vscode VS GoLLIE
- code-llama-for-vscode VS AnglE
- code-llama-for-vscode VS Fooocus
- code-llama-for-vscode VS debugpy-run
- code-llama-for-vscode VS realtime-bakllava