Can LLMs do static code analysis?

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • supercharger

    Supercharge Open-Source AI Models

  • Added support for 65B LLaMa model to https://github.com/catid/supercharger tonight. It runs faster than Baize 30B (maybe due to lack of adapter) and only slightly slower than Galpaca 30B. Benchmarks here: https://docs.google.com/spreadsheets/d/1TYBNr_UPJ7wCzJThuk5ysje7K1x-_62JhBeXDbmrjA8/edit?usp=sharing

  • codealpaca

  • Try https://github.com/sahil280114/codealpaca, or were you trying to stick with more generalist models? (A minimal prompting sketch follows after this list.)

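The sketch below shows one way to ask a locally hosted, Alpaca-style code model (for example, one fine-tuned on the codealpaca data) to flag bugs in a snippet, which is the kind of lightweight static-analysis query the post is asking about. The checkpoint path is a placeholder, not a published model id, and the prompt simply follows the standard Alpaca instruction template that codealpaca uses.

```python
# Minimal sketch: prompting a local instruction-tuned code model to review a
# function for bugs. MODEL_PATH is a placeholder for whatever Alpaca-style
# checkpoint you have on disk; it is not a real Hugging Face model id.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "path/to/local-codealpaca-checkpoint"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
# device_map="auto" spreads layers across available devices (needs `accelerate`).
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH, device_map="auto")

snippet = '''
def average(xs):
    return sum(xs) / len(xs)   # crashes on an empty list
'''

# Standard Alpaca instruction/input/response prompt format.
prompt = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\nReview the following Python function and list any "
    "bugs or unhandled edge cases.\n\n"
    f"### Input:\n{snippet}\n\n### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```

Whether the model reliably catches issues like the empty-list division above is exactly what the thread is debating; a sketch like this is the cheapest way to test it against your own code.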
