pre-training

Top 14 pre-training Open-Source Projects

  • LLMSurvey

    The official GitHub page for the survey paper "A Survey of Large Language Models".

  • Project mention: Ask HN: Textbook Regarding LLMs | news.ycombinator.com | 2024-03-23

    Here’s another one - it’s older but has some interesting charts and graphs.

    https://arxiv.org/abs/2303.18223

  • Oscar

    Oscar (Object-Semantics Aligned Pre-training for Vision-Language Tasks) and VinVL (improved visual representations for vision-language models), from Microsoft.

  • Awesome-CLIP

    Awesome list for research on CLIP (Contrastive Language-Image Pre-Training).
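
    For context, CLIP's pre-training objective is a symmetric contrastive (InfoNCE) loss over a batch of matching image-text pairs. Below is a minimal PyTorch sketch of that objective; the embeddings are assumed to come from arbitrary image and text encoders, which are placeholders here, not CLIP's actual networks.

        import torch
        import torch.nn.functional as F

        def clip_contrastive_loss(image_emb, text_emb, temperature=0.07):
            # image_emb, text_emb: (batch, dim) outputs of placeholder
            # image/text encoders for a batch of matching pairs.
            image_emb = F.normalize(image_emb, dim=-1)
            text_emb = F.normalize(text_emb, dim=-1)

            # Cosine-similarity logits for every image-text combination.
            logits = image_emb @ text_emb.t() / temperature

            # The matching caption for image i sits at column i.
            targets = torch.arange(logits.size(0), device=logits.device)

            # Symmetric cross-entropy: image-to-text and text-to-image.
            loss_i = F.cross_entropy(logits, targets)
            loss_t = F.cross_entropy(logits.t(), targets)
            return (loss_i + loss_t) / 2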

  • TencentPretrain

    Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo

  • Awesome-TimeSeries-SpatioTemporal-LM-LLM

    A professional list on Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.

  • Project mention: Awesome-TimeSeries-AIOps-LM-LLM: NEW Data - star count:169.0 | /r/algoprojects | 2023-10-21
  • XPretrain

    Multi-modality pre-training

  • Project mention: CVPR 2024 Datasets and Benchmarks - Part 1: Datasets | dev.to | 2024-04-23

    It was created by curating 3.8 million high-resolution videos from the publicly available HD-VILA-100M dataset, following a three-step curation process.

  • uni2ts

    Unified Training of Universal Time Series Forecasting Transformers

  • Project mention: Moirai: A Time Series Foundation Model for Universal Forecasting | news.ycombinator.com | 2024-03-25

    Code is available! https://github.com/SalesforceAIResearch/uni2ts

  • conceptual-12m

    Conceptual 12M is a dataset containing (image-URL, caption) pairs collected for vision-and-language pre-training.
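
    The released data is, to our understanding, a tab-separated file of image URLs and captions; a minimal sketch of iterating over such pairs (file name and column order are assumptions) might look like:

        import csv

        def iter_image_caption_pairs(tsv_path="cc12m.tsv", limit=5):
            # Assumed layout: one pair per line, "image_url<TAB>caption".
            with open(tsv_path, newline="", encoding="utf-8") as f:
                for i, row in enumerate(csv.reader(f, delimiter="\t")):
                    if i >= limit:
                        break
                    url, caption = row[0], row[1]
                    yield url, caption

        for url, caption in iter_image_caption_pairs():
            print(f"{caption!r} -> {url}")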

  • STEP

    Code for our SIGKDD'22 paper "Pre-training-Enhanced Spatial-Temporal Graph Neural Network For Multivariate Time Series Forecasting". (by zezhishao)

  • GearNet

    GearNet and Geometric Pretraining Methods for Protein Structure Representation Learning, ICLR'2023 (https://arxiv.org/abs/2203.06125)

  • Bamboo

    Bamboo: 4 times larger than ImageNet and 2 times larger than Objects365; built by active learning. (by ZhangYuanhan-AI)

  • AMRBART

    Code for our ACL 2022 paper "Graph Pre-training for AMR Parsing and Generation"

  • TensorFlowTTS-ts

    This project implements TensorFlowTTS in TensorFlow.js using TypeScript, enabling real-time text-to-speech in the browser. With a pre-trained model for English, you can generate high-quality speech from text input.

  • tsdae

    Transformer-based Denoising AutoEncoder for unsupervised pre-training of Sentence Transformers.

  • Project mention: Transformer-based Denoising AutoEncoder for ST Unsupervised pre-training | news.ycombinator.com | 2024-02-04

    A new PyPI package for training sentence embedding models in just 2 lines.

    Obtaining good sentence embeddings usually requires a substantial volume of labeled data, which in many domains is scarce and costly to procure. This project instead uses an unsupervised method, the Transformer-based Sequential Denoising Auto-Encoder (TSDAE) introduced by the Ubiquitous Knowledge Processing (UKP) Lab at TU Darmstadt, which can reach 93.1% of the performance of in-domain supervised approaches.

    The TSDAE architecture comprises two components: an encoder and a decoder. During training, TSDAE encodes corrupted sentences into fixed-size vectors, and the decoder must reconstruct the original sentences from these sentence embeddings alone. Good reconstruction quality therefore requires the encoder to capture the sentence semantics well. At inference time, only the encoder is used to produce sentence embeddings.

    GitHub : https://github.com/louisbrulenaudet/tsdae

    Installation:
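
    The install command itself was cut off in the excerpt above; it is presumably pip-based (the package name below is inferred from the repository name, not confirmed). A minimal pre-training sketch using the sentence-transformers TSDAE recipe that this package builds on:

        # Assumed install, package name inferred from the repo: pip install tsdae
        from torch.utils.data import DataLoader
        from sentence_transformers import SentenceTransformer, models, datasets, losses

        model_name = "bert-base-uncased"  # placeholder base model

        # Encoder: transformer with CLS pooling, as the TSDAE paper recommends.
        word_embedding_model = models.Transformer(model_name)
        pooling_model = models.Pooling(
            word_embedding_model.get_word_embedding_dimension(), "cls"
        )
        model = SentenceTransformer(modules=[word_embedding_model, pooling_model])

        # Unlabeled sentences; the dataset wrapper corrupts them (token
        # deletion) so the decoder must reconstruct the originals.
        train_sentences = ["Example sentence one.", "Example sentence two."]
        train_dataset = datasets.DenoisingAutoEncoderDataset(train_sentences)
        train_dataloader = DataLoader(train_dataset, batch_size=8, shuffle=True)

        # Reconstruction loss with encoder/decoder weights tied.
        train_loss = losses.DenoisingAutoEncoderLoss(
            model, decoder_name_or_path=model_name, tie_encoder_decoder=True
        )

        model.fit(
            train_objectives=[(train_dataloader, train_loss)],
            epochs=1,
            scheduler="constantlr",
            optimizer_params={"lr": 3e-5},
            show_progress_bar=True,
        )

    After training, model.encode(...) on the encoder yields the sentence embeddings.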

NOTE: The open source projects on this list are ordered by number of GitHub stars. The number of mentions indicates repo mentions in the last 12 months or since we started tracking (Dec 2020).

Index

What are some of the best open-source pre-training projects? This list will help you:

 #  Project                                   Stars
 1  LLMSurvey                                 8,825
 2  Oscar                                     1,030
 3  Awesome-CLIP                              1,019
 4  TencentPretrain                             981
 5  Awesome-TimeSeries-SpatioTemporal-LM-LLM    742
 6  XPretrain                                   438
 7  uni2ts                                      416
 8  conceptual-12m                              305
 9  STEP                                        305
10  GearNet                                     242
11  Bamboo                                      161
12  AMRBART                                      89
13  TensorFlowTTS-ts                              6
14  tsdae                                         3
