Top 14 pre-training Open-Source Projects
-
Awesome-TimeSeries-SpatioTemporal-LM-LLM
A professionally curated list of Large (Language) Models and Foundation Models (LLM, LM, FM) for Time Series, Spatiotemporal, and Event Data.
-
conceptual-12m
Conceptual 12M is a dataset containing (image-URL, caption) pairs collected for vision-and-language pre-training.
-
STEP
Code for the SIGKDD'22 paper "Pre-training-Enhanced Spatial-Temporal Graph Neural Network for Multivariate Time Series Forecasting". (by zezhishao)
-
GearNet
GearNet and Geometric Pretraining Methods for Protein Structure Representation Learning, ICLR'2023 (https://arxiv.org/abs/2203.06125)
-
Bamboo
Bamboo: 4 times larger than ImageNet; 2 times larger than Objects365; built via active learning. (by ZhangYuanhan-AI)
-
TensorFlowTTS-ts
This project implements TensorFlowTTS in TensorFlow.js using TypeScript, enabling real-time text-to-speech in the browser. With a pre-trained model for English, you can generate high-quality speech from text input.
It was created by curating 3.8 million high-resolution videos from the publicly available HD-VILA-100M dataset.
Project mention: Moirai: A Time Series Foundation Model for Universal Forecasting | news.ycombinator.com | 2024-03-25. Code is available: https://github.com/SalesforceAIResearch/uni2ts
Project mention: Transformer-based Denoising AutoEncoder for ST Unsupervised pre-training | news.ycombinator.com | 2024-02-04. A new PyPI package for training sentence embedding models in just 2 lines.
Obtaining good sentence embeddings usually requires a substantial amount of labeled data, which in many domains is scarce and costly to collect. This project instead uses an unsupervised approach based on the Transformer-based Sequential Denoising Auto-Encoder (TSDAE), introduced by the Ubiquitous Knowledge Processing Lab in Darmstadt, which reaches 93.1% of the performance of in-domain supervised methods.
The TSDAE architecture comprises two components: an encoder and a decoder. During training, TSDAE encodes corrupted sentences into fixed-size vectors, and the decoder must reconstruct the original sentences from these sentence embeddings alone. To reconstruct well, the encoder has to capture the sentence's semantics in the embedding. At inference time, only the encoder is used to produce sentence embeddings.
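The corruption step described above can be sketched in a few lines of plain Python. Note that the 60% deletion ratio follows the TSDAE paper's recommended default, and the function name here is illustrative, not this package's actual API:

```python
import random


def delete_words(sentence, del_ratio=0.6, rng=None):
    """Corrupt a sentence by randomly deleting a fraction of its tokens.

    TSDAE trains the encoder to map this corrupted input to a single
    fixed-size vector from which the decoder must reconstruct the
    original sentence, forcing the embedding to capture sentence-level
    semantics rather than surface word order.
    """
    rng = rng or random.Random()
    tokens = sentence.split()
    kept = [t for t in tokens if rng.random() > del_ratio]
    if not kept:  # always keep at least one token
        kept = [rng.choice(tokens)]
    return " ".join(kept)


# Deterministic example with a seeded generator:
print(delete_words("the quick brown fox jumps over the lazy dog",
                   rng=random.Random(0)))  # -> "the quick the"
```

During training, each original sentence is paired with such a corrupted variant; the reconstruction loss on the original text is what pushes semantic information into the fixed-size embedding.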
GitHub: https://github.com/louisbrulenaudet/tsdae
Installation:
pre-training related posts
Index
What are some of the best open-source pre-training projects? This list will help you:
| # | Project | Stars |
|---|---------|-------|
| 1 | LLMSurvey | 8,825 |
| 2 | Oscar | 1,030 |
| 3 | Awesome-CLIP | 1,019 |
| 4 | TencentPretrain | 981 |
| 5 | Awesome-TimeSeries-SpatioTemporal-LM-LLM | 742 |
| 6 | XPretrain | 438 |
| 7 | uni2ts | 416 |
| 8 | conceptual-12m | 305 |
| 9 | STEP | 305 |
| 10 | GearNet | 242 |
| 11 | Bamboo | 161 |
| 12 | AMRBART | 89 |
| 13 | TensorFlowTTS-ts | 6 |
| 14 | tsdae | 3 |