tsdae

Transformer-based Denoising AutoEncoder for Sentence Transformers unsupervised pre-training. (by louisbrulenaudet)

Tsdae Alternatives

Similar projects and alternatives to tsdae based on common topics and language

  • HyperGAN

    2 tsdae VS HyperGAN

    Composable GAN framework with api and user interface

  • happy-transformer

    Happy Transformer makes it easy to fine-tune and perform inference with NLP Transformer models.

  • haystack

    55 tsdae VS haystack

LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.

  • llmware

    9 tsdae VS llmware

    Providing enterprise-grade LLM-based development framework, tools, and fine-tuned models.

  • tm2tb

    1 tsdae VS tm2tb

    Bilingual term extractor

  • NLTK

    64 tsdae VS NLTK

    NLTK Source

NOTE: The number of mentions on this list indicates mentions on common posts plus user-suggested alternatives. Hence, a higher number means a better tsdae alternative or higher similarity.

tsdae reviews and mentions

Posts with mentions or reviews of tsdae. We have used some of these posts to build our list of alternatives and similar projects.
  • Transformer-based Denoising AutoEncoder for ST Unsupervised pre-training
    1 project | news.ycombinator.com | 4 Feb 2024
    A new PyPI package for training sentence embedding models in just 2 lines.

    Obtaining sentence embeddings often requires a substantial volume of labeled data. In many fields, however, labeled data is scarce and costly to procure. This project employs an unsupervised approach based on the pre-trained Transformer-based Sequential Denoising Auto-Encoder (TSDAE), introduced by the Ubiquitous Knowledge Processing Lab at TU Darmstadt, which can reach 93.1% of the performance of in-domain supervised methods.

    The TSDAE architecture comprises two components: an encoder and a decoder. During training, TSDAE encodes corrupted sentences into fixed-size vectors, and the decoder must reconstruct the original sentences from these sentence embeddings alone. To achieve good reconstruction quality, the encoder must capture the sentence semantics well in its embeddings. At inference time, only the encoder is used to produce sentence embeddings.
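    The corruption step described above is typically random token deletion. As a minimal sketch (the function name `delete_noise` and its exact sampling details are illustrative assumptions; the reference TSDAE setup uses a similar deletion scheme with a default ratio of 0.6):

    ```python
    import random

    def delete_noise(words, del_ratio=0.6, rng=None):
        """Corrupt a tokenized sentence by randomly deleting tokens.

        Hypothetical sketch of TSDAE-style input corruption: each token is
        independently dropped with probability del_ratio, and at least one
        token is always kept so the encoder has non-empty input.
        """
        rng = rng or random.Random()
        if not words:
            return []
        kept = [w for w in words if rng.random() >= del_ratio]
        if not kept:
            # Fall back to keeping one random token from the original.
            kept = [words[rng.randrange(len(words))]]
        return kept

    sentence = "the quick brown fox jumps over the lazy dog".split()
    noisy = delete_noise(sentence, rng=random.Random(0))
    print(" ".join(noisy))
    ```

    The encoder sees only the corrupted tokens, so reconstructing the full sentence forces the fixed-size embedding to carry the sentence's meaning rather than its surface form.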

    GitHub : https://github.com/louisbrulenaudet/tsdae

    Installation:

Stats

Basic tsdae repo stats
Mentions: 1
Stars: 3
Activity: 5.1
Last commit: about 2 months ago

louisbrulenaudet/tsdae is an open source project licensed under the Apache License 2.0, which is an OSI-approved license.

The primary programming language of tsdae is Python.
