curated-transformers

🤖 A PyTorch library of curated Transformer models and their composable components (by explosion)

Curated-transformers Alternatives

Similar projects and alternatives to curated-transformers

NOTE: The number of mentions on this list indicates mentions in common posts plus user-suggested alternatives. Hence, a higher number means a better curated-transformers alternative or higher similarity.

curated-transformers reviews and mentions

Posts with mentions or reviews of curated-transformers. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-20.
  • Minimal implementation of Mamba, the new LLM architecture, in 1 file of PyTorch
    7 projects | news.ycombinator.com | 20 Dec 2023
    https://github.com/explosion/curated-transformers/blob/main/...

    Llama 1/2:

    https://github.com/explosion/curated-transformers/blob/main/...

    MPT:

    https://github.com/explosion/curated-transformers/blob/main/...

    With various features enabled, including support for TorchScript JIT, PyTorch flash attention, etc. (a rough sketch of the flash-attention path follows below).
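    As a rough illustration of the PyTorch flash attention path mentioned above (a minimal sketch, not code from curated-transformers; the shapes and the use of torch.nn.functional.scaled_dot_product_attention are assumptions about how such a layer is typically wired up):

    ```python
    import torch
    import torch.nn.functional as F

    # Minimal sketch of a causal self-attention forward pass that dispatches to
    # PyTorch's fused (flash) scaled dot-product attention kernel when available.
    # Illustrative only; shapes and names are not taken from curated-transformers.

    def causal_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
        # q, k, v: (batch, n_heads, seq_len, head_dim)
        # is_causal=True applies the lower-triangular mask inside the kernel,
        # so no explicit mask tensor needs to be materialized.
        return F.scaled_dot_product_attention(q, k, v, is_causal=True)

    batch, n_heads, seq_len, head_dim = 1, 8, 16, 64
    q = torch.randn(batch, n_heads, seq_len, head_dim)
    k = torch.randn(batch, n_heads, seq_len, head_dim)
    v = torch.randn(batch, n_heads, seq_len, head_dim)
    print(causal_attention(q, k, v).shape)  # torch.Size([1, 8, 16, 64])
    ```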

  • Curated Transformers: MosaicMPT LLM decoder in 90 lines
    1 project | news.ycombinator.com | 10 Aug 2023
  • Non-determinism in GPT-4 is caused by Sparse MoE
    3 projects | news.ycombinator.com | 4 Aug 2023
    Yeah. In curated transformers [1] we are seeing completely deterministic output across multiple popular transformer architectures on a single GPU (there can be variance between GPUs due to different kernels).

    One source of non-determinism we do see with a temperature of 0 is that once the weights are quantized, many predicted pieces end up with the same probability, including multiple pieces tied for the highest probability. The sampler (if you are not using a greedy decoder) will then sample from those tied pieces (sketched below).

    In other words, a temperature of 0 is a poor man’s greedy decoding. (It is totally possible that OpenAI’s implementation switches to a greedy decoder with a temperature of 0).

    [1] https://github.com/explosion/curated-transformers
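    To make the tie-breaking point concrete, here is a small illustrative sketch (the logit values are made up; this is not the curated-transformers sampler): a greedy argmax is deterministic, while sampling over the pieces tied for the highest probability is not.

    ```python
    import torch

    # Illustrative only (made-up logits, not the curated-transformers sampler).
    # After quantization, several pieces can share exactly the same highest logit.
    logits = torch.tensor([2.5, 2.5, 1.0, 0.5])  # pieces 0 and 1 are tied

    # Greedy decoding: always returns the same index here (the first maximum).
    greedy_piece = torch.argmax(logits).item()

    # Temperature -> 0: the softmax mass concentrates on the argmax set and,
    # with a tie, splits evenly over it, so a sampler can return piece 0 or 1.
    is_max = logits == logits.max()
    probs = is_max.float() / is_max.sum()
    sampled_piece = torch.multinomial(probs, num_samples=1).item()

    print(greedy_piece, sampled_piece)  # e.g. 0 0 on one run, 0 1 on another
    ```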

  • Curated Transformers: LLMs from reusable building blocks
    1 project | news.ycombinator.com | 4 Aug 2023
  • Show HN: Curated Transformers – PyTorch LLMs with less code duplication
    1 project | news.ycombinator.com | 15 Jul 2023
  • Show HN: Curated Transformers – Lightweight, composable PyTorch transformers
    1 project | news.ycombinator.com | 13 Jul 2023
  • Falcon LLM – A 40B Model
    6 projects | news.ycombinator.com | 17 Jun 2023
    Architecturally, there are no big differences compared to other LLMs. The largest differences compared to NeoX are: no biases in the linear layers, and shared heads for the key and value representations (but not for the query).

    Of course, it has 40B parameters, but there is also a 7B-parameter version. The primary issue is that the current upstream version (on Hugging Face) hasn't implemented key-value caching correctly. KV caching is needed to bring generation complexity down from O(n^3) to O(n^2). The issues are: (1) their implementation uses Torch's scaled dot-product attention, which applies an incorrect causal mask when the query and key lengths differ (which is the case when generating with a cache); (2) they don't index the rotary embeddings correctly when using the key-value cache, so the rotary embedding of the first token is used for all generated tokens. Together, this causes the model to output garbage; it only works without KV caching, which makes it very slow.

    However, this is not a property of the model, and they will probably fix this soon. E.g. the transformer library that we are currently developing supports Falcon with key-value caching, and its speed is on par with other models of the same size:

    https://github.com/explosion/curated-transformers/blob/main/...

    (This is a correct implementation of the decoder layer; a rough sketch of the cache-aware masking and rotary indexing follows below.)
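    As a rough sketch of the two fixes described in the post (simplified assumptions, not the curated-transformers implementation): with a KV cache the query covers only the new tokens, so the causal mask has to be offset by the cache length, and the rotary embeddings have to be indexed at the absolute positions of the new tokens rather than starting from 0.

    ```python
    import torch

    # Sketch of cache-aware attention bookkeeping; the helper names and rotary
    # details are simplified assumptions, not the curated-transformers code.

    def causal_mask(q_len: int, kv_len: int, device=None) -> torch.Tensor:
        # Query position i (relative to the cache) may attend to key position j
        # iff j <= i + (kv_len - q_len). With a cache, kv_len > q_len, so a plain
        # lower-triangular q_len x kv_len mask would be wrong.
        offset = kv_len - q_len
        i = torch.arange(q_len, device=device).unsqueeze(1)
        j = torch.arange(kv_len, device=device).unsqueeze(0)
        return j <= i + offset  # (q_len, kv_len) boolean mask

    def rotary_positions(q_len: int, cache_len: int, device=None) -> torch.Tensor:
        # The rotary embeddings of the new tokens must use their absolute
        # positions, not positions starting at 0.
        return torch.arange(cache_len, cache_len + q_len, device=device)

    # Example: 10 cached tokens, generating 1 new token.
    print(causal_mask(q_len=1, kv_len=11))  # new token attends to all 11 positions
    print(rotary_positions(1, 10))          # tensor([10]), not tensor([0])
    ```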

Stats

Basic curated-transformers repo stats
Mentions: 7
Stars: 838
Activity: 9.0
Last commit: 21 days ago
