transformer-lm

Transformer language model (GPT-2) with sentencepiece tokenizer (by lopuhin)
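
The project pairs a GPT-2-style transformer with a sentencepiece subword tokenizer. As a rough, hedged illustration of what that tokenizer step usually looks like (this is not code from the transformer-lm repo; the file names, vocab size, and model type below are assumptions), a sentencepiece model can be trained and applied with the sentencepiece Python package:

```python
# Minimal sketch of training and using a sentencepiece tokenizer.
# Assumptions: "corpus.txt" is a plain-text file with one document/sentence
# per line; the vocab size, model prefix, and BPE model type are illustrative.
import sentencepiece as spm

# Train a subword model on the corpus; writes sp-model.model and sp-model.vocab.
spm.SentencePieceTrainer.train(
    input="corpus.txt",
    model_prefix="sp-model",
    vocab_size=32000,
    model_type="bpe",
)

# Load the trained model and round-trip some text through token ids.
sp = spm.SentencePieceProcessor(model_file="sp-model.model")
ids = sp.encode("Transformer language models are fun.", out_type=int)
print(ids)
print(sp.decode(ids))
```

A subword vocabulary like this avoids out-of-vocabulary tokens, which is why GPT-2-style models tokenize raw text this way before training.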

Transformer-lm Alternatives

Similar projects and alternatives to transformer-lm based on common topics and language

  • transformers

    🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

  • gpt-neo

    (Discontinued) An implementation of model-parallel GPT-2 and GPT-3-style models using the mesh-tensorflow library.

  • RWKV-LM

    RWKV is an RNN with transformer-level LLM performance that can be trained directly like a GPT (parallelizable). It combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embedding.

  • LoRA

    Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"

  • xTuring

    Build, customize, and control your own LLMs. From data pre-processing to fine-tuning, xTuring provides an easy way to personalize open-source LLMs. Join our discord community: https://discord.gg/TgHXuSJEk6

  • TextRL

    Implementation of ChatGPT-style RLHF (Reinforcement Learning from Human Feedback) on any generation model in Hugging Face's transformers (bloomz-176B/bloom/gpt/bart/T5/MetaICL)

NOTE: The number of mentions in this list reflects mentions in common posts plus user-suggested alternatives; a higher count indicates a more frequently suggested or more similar transformer-lm alternative.

transformer-lm reviews and mentions

Posts with mentions or reviews of transformer-lm. We have used some of these posts to build our list of alternatives and similar projects.
  • GPT-2 Transformer Model Dataset Format
    1 project | /r/MLQuestions | 24 Mar 2021
    Today I tried to set up and use this code on my machine. Everything worked, and I was able to train and sample on some custom text. However, I didn't really understand how to use the folder structure, because it isn't explained in detail. I have to put .txt files into three different folders: train (I guess that's where my "main" dataset belongs?), valid, and test. Does somebody know what kind of text belongs in the valid and test folders?
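
The post above describes placing plain .txt files under train, valid, and test folders. As a minimal sketch, assuming the code simply reads every .txt file found in each of those subdirectories (the 90/5/5 split ratio, file names, and paths below are illustrative assumptions, not taken from the repo), a corpus could be split like this:

```python
# Sketch: split a single plain-text corpus into train/, valid/ and test/
# subfolders containing .txt files. Ratios and names are assumptions.
import random
from pathlib import Path

def split_corpus(source="corpus.txt", dest="data", ratios=(0.9, 0.05, 0.05), seed=0):
    lines = Path(source).read_text(encoding="utf-8").splitlines()
    random.Random(seed).shuffle(lines)

    n = len(lines)
    n_train = int(n * ratios[0])
    n_valid = int(n * ratios[1])
    splits = {
        "train": lines[:n_train],                   # main training data
        "valid": lines[n_train:n_train + n_valid],  # held out to monitor loss
        "test":  lines[n_train + n_valid:],         # held out for final evaluation
    }

    for name, chunk in splits.items():
        out_dir = Path(dest) / name
        out_dir.mkdir(parents=True, exist_ok=True)
        (out_dir / f"{name}.txt").write_text("\n".join(chunk), encoding="utf-8")

if __name__ == "__main__":
    split_corpus()
```

In the usual convention, the valid folder holds held-out text used to track validation loss during training, and the test folder holds text reserved for a final evaluation after training.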

Stats

Basic transformer-lm repo stats
Mentions: 1
Stars: 163
Activity: 0.0
Last commit: about 3 years ago

The primary programming language of transformer-lm is Python.

