Show HN: Use Code Llama as Drop-In Replacement for Copilot Chat

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  1. continue

    ⏩ Create, share, and use custom AI code assistants with our open-source IDE extensions and hub of models, rules, prompts, docs, and other building blocks

    Hi HN,

    Code Llama was released, but we noticed a ton of questions in the main thread about how/where to use it — not just from an API or the terminal, but in your own codebase as a drop-in replacement for Copilot Chat. Without this, developers don't get much utility from the model.

    This concern is also important because benchmarks like HumanEval don't perfectly reflect the quality of responses. There's likely to be a flurry of improvements to coding models in the coming months, and rather than relying on the benchmarks to evaluate them, the community will get better feedback from people actually using the models. This means real usage in real, everyday workflows.

    We've worked to make this possible with Continue (https://github.com/continuedev/continue) and want to hear what you find to be the real capabilities of Code Llama. Is it on par with GPT-4, does it require fine-tuning, or does it excel at certain tasks?

    If you’d like to try Code Llama with Continue, it only takes a few steps to set up (https://continue.dev/docs/walkthroughs/codellama), either locally with Ollama, or through TogetherAI or Replicate's APIs.
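
    If you take the local route, a quick way to sanity-check that Ollama is actually serving Code Llama before pointing Continue at it is to hit Ollama's HTTP API directly. This is a minimal sketch, assuming Ollama's default port (11434) and the `codellama` model tag; the Continue-side configuration itself is covered in the walkthrough linked above.

    ```python
    # Sketch: verify a local Code Llama model responds via Ollama's HTTP API.
    # Assumes you have already run `ollama pull codellama` and the Ollama
    # daemon is listening on its default port, 11434.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "codellama",
        "prompt": "Write a Python function that reverses a string.",
        "stream": False,  # ask for a single JSON object instead of a stream
    }).encode("utf-8")

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
    ```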

  2. copilot.vim

    Neovim plugin for GitHub Copilot

    I use Copilot in Neovim[1]. It was remarkably simple to get installed. Highly recommend.

    [1]: https://github.com/github/copilot.vim

  3. llama.cpp

    LLM inference in C/C++

    You can cobble together an OpenAI-esque server locally with llama.cpp: https://github.com/ggerganov/llama.cpp/issues/2766
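
    Once such a server is running, existing OpenAI-client code only needs its base URL repointed. Below is a minimal sketch, assuming the `openai` Python package (v1+) and an OpenAI-compatible endpoint listening locally; the URL, port, and model name are placeholders to adjust to your setup.

    ```python
    # Sketch: use the standard OpenAI client against a local, llama.cpp-backed,
    # OpenAI-compatible server. base_url and model are assumptions -- point
    # them at wherever your local server actually listens.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:8080/v1",  # local server instead of api.openai.com
        api_key="not-needed-locally",         # a local server typically ignores the key
    )

    completion = client.chat.completions.create(
        model="codellama",  # whatever model name your server exposes
        messages=[{"role": "user", "content": "Explain what this does: [x*x for x in range(10)]"}],
    )
    print(completion.choices[0].message.content)
    ```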

  4. uniteai

    Your AI Stack in Your Editor

    [UniteAI](https://github.com/freckletonj/uniteai) I think fits the bill for you.

    This is my project, where the goal is to Unite your AI stack inside your editor (speech-to-text, local LLMs, ChatGPT, retrieval-augmented generation, etc.).

    It's built atop a Language Server, so, while no one has made an IntelliJ client yet, it would be simple to do. I'll help you if you open a GH issue!
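
    For a sense of what "making a client" involves, here is a generic sketch of the JSON-RPC handshake any LSP client performs against any language server; it is not UniteAI-specific code, and the launch command is a placeholder to replace with whatever starts the server in your environment.

    ```python
    # Sketch: the minimal LSP handshake an editor client speaks over stdio.
    # The launch command is a placeholder, and the header parsing is
    # simplified (it assumes only a Content-Length header is sent).
    import json
    import subprocess

    def lsp_message(payload: dict) -> bytes:
        body = json.dumps(payload).encode("utf-8")
        return f"Content-Length: {len(body)}\r\n\r\n".encode("ascii") + body

    server = subprocess.Popen(
        ["your-lsp-server-command"],  # placeholder: command that starts the server
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
    )

    # Every LSP session begins with an `initialize` request from the client.
    server.stdin.write(lsp_message({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {"processId": None, "rootUri": None, "capabilities": {}},
    }))
    server.stdin.flush()

    # Read the Content-Length header, the blank separator line, then the body.
    header = server.stdout.readline()   # e.g. b"Content-Length: 123\r\n"
    server.stdout.readline()            # the empty line after the headers
    length = int(header.split(b":")[1])
    print(json.loads(server.stdout.read(length)))

    server.terminate()
    ```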

  5. ChatGPT.nvim

    ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API

    You'd need to mod https://github.com/jackmort/chatgpt.nvim to use Copilot or Ollama.

  6. awesome-neovim

    Collections of awesome neovim plugins.

NOTE: The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives, so a higher number indicates a more popular project.


Related posts

  • Why You Should Migrate to NeoVim

    3 projects | dev.to | 25 Apr 2025
  • My Favorite Vim Oneliners for Text Manipulation

    4 projects | news.ycombinator.com | 6 Aug 2023
  • Neovim Boilerplate

    2 projects | /r/neovim | 22 Jun 2023
  • Ask HN: Vim vs. Neovim

    1 project | news.ycombinator.com | 20 Jun 2023
  • Hey everyone I recently joined. Been using vim with basic plugin for past 4 years recently switched to neovim. How should I start ?

    4 projects | /r/neovim | 19 Jun 2023
