Self train a super tiny model recommendations

This page summarizes the projects mentioned and recommended in the original post on /r/LocalLLaMA

  • llama.cpp

    LLM inference in C/C++

  • llama.cpp has a train-text-from-scratch example that you can run on CPU and RAM, with offloading to the GPU, although I'd imagine it'd be much less performant than a pure GPU solution.

  • transformers

    🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

  • You can train it with the code provided in the transformers repo: https://github.com/huggingface/transformers/blob/main/examples/pytorch/language-modeling/run_clm.py (a minimal training sketch in the same spirit appears after this list).

  • ml-engineering

    Machine Learning Engineering Open Book

  • This might be interesting: https://github.com/stas00/ml-engineering/blob/master/transformers/make-tiny-models.md (the second sketch after this list is a rough illustration of the same idea).
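
The run_clm.py script mentioned above is driven from the command line; the sketch below is not that script and not the poster's setup, but a compact illustration of the same idea: a deliberately tiny, randomly initialized GPT-2-style model trained from scratch with the Trainer API. The layer/head/embedding sizes, the "train.txt" data file, and the "tiny-clm" output directory are assumptions chosen only to keep the example small.

```python
# Minimal sketch (not run_clm.py itself): train a super tiny GPT-2-style
# model from scratch with the Hugging Face Trainer. All sizes and file
# names below are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    GPT2Config,
    GPT2LMHeadModel,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Reuse an existing tokenizer; only the model weights start from scratch.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token

# A deliberately tiny configuration: 2 layers, 2 heads, 128-dim embeddings.
config = GPT2Config(
    vocab_size=tokenizer.vocab_size,
    n_positions=256,
    n_embd=128,
    n_layer=2,
    n_head=2,
)
model = GPT2LMHeadModel(config)  # randomly initialized

# Any plain-text file works; "train.txt" is a placeholder.
dataset = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=256)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="tiny-clm",
        per_device_train_batch_size=8,
        num_train_epochs=1,
        logging_steps=50,
    ),
    train_dataset=tokenized["train"],
    # mlm=False gives causal-LM labels (shifted inputs), as in run_clm.py.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
trainer.save_model("tiny-clm")
```

run_clm.py covers the same steps (tokenization, grouping text into blocks, Trainer) in a more complete form, so it is the better starting point for a real run.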
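
The make-tiny-models guide is about shrinking a model configuration until the randomly initialized model has only a few million parameters, which is also a handy starting point before self-training anything larger. Below is a minimal sketch of that idea, not code from the guide; the Llama architecture, the borrowed GPT-2 tokenizer, the sizes, and the "tiny-llama-random" path are all assumptions.

```python
# Minimal sketch of the "tiny model" idea: take an existing architecture's
# config, shrink every dimension, and save the randomly initialized result.
from transformers import AutoTokenizer, LlamaConfig, LlamaForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # any small tokenizer will do

config = LlamaConfig(
    vocab_size=tokenizer.vocab_size,
    hidden_size=64,
    intermediate_size=128,
    num_hidden_layers=2,
    num_attention_heads=2,
    num_key_value_heads=2,
    max_position_embeddings=256,
)
model = LlamaForCausalLM(config)  # random weights, a few million parameters
print(f"parameters: {model.num_parameters():,}")

# Save the tiny checkpoint so it can be loaded like any other model.
model.save_pretrained("tiny-llama-random")
tokenizer.save_pretrained("tiny-llama-random")
```

A checkpoint this small loads in seconds, which is what makes it useful for testing training loops and tooling before scaling up.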

