First model released by BigScience outperforms GPT-3 while being 16x smaller

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • promptsource

    Toolkit for creating, sharing and using natural language prompts.

  • We fine-tuned the model on dozens of different NLP datasets and tasks in a prompted style. You can read all the prompts in the appendix or get them all here: https://github.com/bigscience-workshop/promptsource (a minimal usage sketch follows below). Most NLP tasks are not particularly freeform, or they are naturally length-limited, like summarization (XSum targets are very short). As a consequence, the model mostly defaults to short responses. Your "trick" is not that unreasonable, though! Many of the training prompts that want long responses ask for them explicitly.

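    A minimal sketch of pulling those prompts with promptsource's documented API (assumes the promptsource and datasets packages are installed; "xsum" is one of the covered summarization datasets):

        # List every public prompt written for XSum and apply one to an example.
        from datasets import load_dataset
        from promptsource.templates import DatasetTemplates

        dataset = load_dataset("xsum", split="validation")
        xsum_templates = DatasetTemplates("xsum")
        print(xsum_templates.all_template_names)  # names of all XSum prompts

        # Applying a template yields the prompted input and the target text.
        template = xsum_templates[xsum_templates.all_template_names[0]]
        input_text, target_text = template.apply(dataset[0])
        print(input_text)   # prompted input fed to the model
        print(target_text)  # reference summary; note how short XSum targets are

    The short targets here are exactly the length limitation described in the comment above.
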
  • transformers

    🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

  • (author here)

    I don't have exact numbers for latency, but the inference widget currently runs on a TPU v3-8 (which, if I am not mistaken, is roughly comparable to a cluster of 8 V100s). That gives you a rough idea of the latency for short inputs.

    Note that a colleague just reminded me that it is possible to run inference for T5-11B (which is the size we use) on a single (big) GPU with enough CPU memory, using offloading: https://github.com/huggingface/transformers/issues/9996#issu... (a hedged sketch follows below).

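    The linked issue covers weight offloading; what follows is a hedged sketch of the same idea using the accelerate-backed device_map offloading in recent transformers releases, which may differ in detail from the issue's approach ("bigscience/T0pp" is the released 11B checkpoint, and the review prompt is just an illustration):

        # Run the 11B model on a single GPU, spilling layers that don't fit
        # into CPU RAM (and, if needed, onto disk).
        import torch
        from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

        tokenizer = AutoTokenizer.from_pretrained("bigscience/T0pp")
        model = AutoModelForSeq2SeqLM.from_pretrained(
            "bigscience/T0pp",
            device_map="auto",          # place layers on GPU, overflow to CPU
            offload_folder="offload",   # spill any remainder to disk
            torch_dtype=torch.float16,  # halve memory versus fp32
        )

        inputs = tokenizer(
            "Is this review positive or negative? "
            "Review: best cast iron skillet you will ever buy.",
            return_tensors="pt",
        )
        outputs = model.generate(**inputs, max_new_tokens=20)
        print(tokenizer.decode(outputs[0], skip_special_tokens=True))

    Expect generation to be slow whenever layers sit in CPU memory, which matches the latency caveat above.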