VSC Continue.dev with your own REST API

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • llama.cpp

    LLM inference in C/C++

  • Check out the llama.cpp project; it's pretty straightforward to install and set up (DM me if you want help). After that, spin up a llama.cpp API server using these instructions. At this point you should have an API running on your machine, and you can do a quick sanity check by visiting http://localhost:8080. Once that's all set up, you can use the VSC extension's guide on how to connect the API to your editor. Enjoy!

  • continue

    ⏩ Open-source VS Code and JetBrains extensions that enable you to easily create your own modular AI software development system

  • The request to llama.cpp is implemented in this Continue.dev file: https://github.com/continuedev/continue/blob/preview/server/continuedev/libs/llm/llamacpp.py
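For context, the route such a client talks to is llama.cpp's `/completion` endpoint, which accepts a JSON body with fields like `prompt` and `n_predict`. A minimal sketch of building that request in Python, assuming the server from the steps above is listening on localhost:8080 (the helper name and prompt below are made up for illustration):

```python
# Sketch of a request to a local llama.cpp server's /completion endpoint.
# Assumes the server is listening on http://localhost:8080.
import json
import urllib.request

def build_completion_request(prompt, n_predict=64, base_url="http://localhost:8080"):
    """Build an HTTP POST request for llama.cpp's /completion endpoint."""
    payload = {"prompt": prompt, "n_predict": n_predict}
    return urllib.request.Request(
        f"{base_url}/completion",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_completion_request("def fib(n):", n_predict=32)
print(req.full_url)  # http://localhost:8080/completion

# With the server actually running, send it and read the generated text:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["content"])
```

Visiting http://localhost:8080 in a browser (as the comment above suggests) is the simplest sanity check; the snippet just shows what a programmatic completion call looks like.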

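For the "connect the API to your editor" step, Continue's documentation describes a `config.json` model entry with a llama.cpp provider. A sketch of such an entry — the title and model name are placeholders, and the exact schema may differ between Continue versions (the linked preview branch configured models in Python instead):

```json
{
  "models": [
    {
      "title": "llama.cpp (local)",
      "provider": "llama.cpp",
      "model": "your-model",
      "apiBase": "http://localhost:8080"
    }
  ]
}
```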
