[P] OSLO: Open Source framework for Large-scale transformer Optimization

This page summarizes the projects mentioned and recommended in the original post on /r/MachineLearning

  • transformers

    🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

  • OSLO is a framework that provides various GPU-based optimization features for large-scale modeling. As of 2021, [Hugging Face Transformers](https://github.com/huggingface/transformers) is considered the de facto standard. However, it does not yet best fit the purposes of large-scale modeling.

  • oslo

    OSLO: Open Source framework for Large-scale model Optimization (by tunib-ai) — discontinued

  • This is where OSLO comes in. OSLO is designed to make it easier to train large models with Transformers. For example, you can fine-tune [GPT-J](https://huggingface.co/EleutherAI/gpt-j-6B) from the [Hugging Face Model Hub](https://huggingface.co/models) without much extra effort using OSLO. Currently GPT2, GPTNeo, and GPTJ are supported, with more planned soon. For more information, see https://github.com/tunib-ai/oslo


