Show HN: Use Code Llama as Drop-In Replacement for Copilot Chat

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • continue

    ⏩ Continue enables you to create your own AI code assistant inside your IDE. Keep your developers in flow with open-source VS Code and JetBrains extensions.

  • Hi HN,

    Code Llama was released, but we noticed a ton of questions in the main thread about how/where to use it — not just from an API or the terminal, but in your own codebase as a drop-in replacement for Copilot Chat. Without this, developers don't get much utility from the model.

    This concern is also important because benchmarks like HumanEval don't perfectly reflect the quality of responses. There's likely to be a flurry of improvements to coding models in the coming months, and rather than relying on the benchmarks to evaluate them, the community will get better feedback from people actually using the models. This means real usage in real, everyday workflows.

    We've worked to make this possible with Continue (https://github.com/continuedev/continue) and want to hear what you find to be the real capabilities of Code Llama. Is it on par with GPT-4, does it require fine-tuning, or does it excel at certain tasks?

    If you’d like to try Code Llama with Continue, it only takes a few steps to set up (https://continue.dev/docs/walkthroughs/codellama), either locally with Ollama, or through TogetherAI or Replicate's APIs.
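
    For a concrete picture of the local route: the Ollama commands below are the standard way to fetch and sanity-check the model, while the Continue model entry is only illustrative; its exact shape depends on your Continue version, so follow the walkthrough above for the real config.

        # fetch the Code Llama weights and confirm they run locally
        ollama pull codellama
        ollama run codellama "Write a function that reverses a string in Python"

        # illustrative Continue model entry (exact format varies by version):
        #   { "title": "Code Llama", "provider": "ollama", "model": "codellama" }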

  • copilot.vim

    Neovim plugin for GitHub Copilot

  • I use Copilot in Neovim[1]. It was remarkably simple to get installed. Highly recommend.

    [1]: https://github.com/github/copilot.vim
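
    For reference, the install really is short; as a sketch, assuming vim-plug as the plugin manager (any plugin manager works, the plugin line is the same):

        " in init.vim
        Plug 'github/copilot.vim'

        " then inside Neovim
        :PlugInstall
        :Copilot setup    " runs the GitHub authentication flow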

  • llama.cpp

    LLM inference in C/C++

  • You can cobble together an OpenAI-esque server locally with llama.cpp: https://github.com/ggerganov/llama.cpp/issues/2766
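
    In rough terms that means building the server example and pointing requests at it; a minimal sketch, assuming a quantized Code Llama GGUF file is already downloaded (model path, flags, and prompt are illustrative, and the server build step has changed across llama.cpp versions):

        # build llama.cpp including the HTTP server example
        make server

        # serve the model locally
        ./server -m ./models/codellama-7b.Q4_K_M.gguf -c 4096 --port 8080

        # query the server's /completion endpoint
        curl http://localhost:8080/completion \
          -d '{"prompt": "def fibonacci(n):", "n_predict": 128}'

    The linked issue covers the extra step of exposing this behind OpenAI-style routes so existing clients can talk to it.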

  • uniteai

    Your AI Stack in Your Editor

  • UniteAI (https://github.com/freckletonj/uniteai) I think fits the bill for you.

    This is my project, where the goal is to Unite your AI-stack inside your editor (so, Speech-to-text, Local LLMs, Chat GPT, Retrieval Augmented Gen, etc).

    It's built atop a Language Server, so while no one has made an IntelliJ client yet, it's simple to do. I'll help you do it if you make a GH Issue!

  • ChatGPT.nvim

    ChatGPT Neovim Plugin: Effortless Natural Language Generation with OpenAI's ChatGPT API

  • Need to mod https://github.com/jackmort/chatgpt.nvim to use Copilot or Ollama.
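
    The core of such a mod is swapping the OpenAI endpoint the plugin calls for Ollama's local HTTP API; as a sketch, the request a modified plugin would send looks roughly like this (the model name assumes Code Llama has already been pulled with Ollama):

        # Ollama's local generate endpoint; responses stream back as JSON lines
        curl http://localhost:11434/api/generate -d '{
          "model": "codellama",
          "prompt": "Rewrite this function without recursion"
        }'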

  • awesome-neovim

    Collections of awesome neovim plugins.

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a more popular project.


Related posts

  • My Favorite Vim Oneliners for Text Manipulation

    4 projects | news.ycombinator.com | 6 Aug 2023
  • Neovim Boilerplate

    2 projects | /r/neovim | 22 Jun 2023
  • Ask HN: Vim vs. Neovim

    1 project | news.ycombinator.com | 20 Jun 2023
  • Hey everyone I recently joined. Been using vim with basic plugin for past 4 years recently switched to neovim. How should I start ?

    4 projects | /r/neovim | 19 Jun 2023
  • VS Code Dev Container alike setup for Neovim?

    2 projects | /r/neovim | 14 Apr 2023