LLaMA-rs: Run inference of LLaMA on CPU with Rust 🦀🦙

This page summarizes the projects mentioned and recommended in the original post on /r/rust

  • llm

    An ecosystem of Rust libraries for working with large language models

  • deep-diamond

    A fast Clojure Tensor & Deep Learning library

  • I had some "classical ML" knowledge and knew a bit about the math behind DL and tensors in general, thanks to the book Deep Learning for Programmers showcased in this repo: https://github.com/uncomplicate/deep-diamond (it's not in Rust, and I'm not sure what its current state is, though!).


