autoembedder
PyTorch autoencoder with additional embeddings layer for categorical data 🚘 (by chrislemke)
attention-mixed-type-clustering
Attention in Mixed-Type Clustering (by jaanisfehling)
| | autoembedder | attention-mixed-type-clustering |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 8 | 0 |
| Growth | - | - |
| Activity | 0.0 | 8.8 |
| Last commit | 15 days ago | 6 months ago |
| Language | Python | Jupyter Notebook |
| License | MIT License | MIT License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
autoembedder
Posts with mentions or reviews of autoembedder.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-06-25.
- How to learn Categorical Embeddings in Unsupervised Learning?

  Solutions I found here and here propose to save the input batch in a variable after feeding it into the embedding layer (but before the autoencoder) and to use that as the target for the loss function.
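The approach quoted above can be sketched in a few lines of PyTorch. This is a minimal illustration (not code from either project): the categorical batch is passed through an embedding layer, the result is detached and stored as the reconstruction target, and only the autoencoder is optimized. All layer sizes and names here are assumptions for the sake of the example.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the described approach: embed a categorical column,
# detach the embedded batch, and use it as the autoencoder's target.
n_categories, emb_dim = 10, 4
embedding = nn.Embedding(n_categories, emb_dim)
autoencoder = nn.Sequential(
    nn.Linear(emb_dim, 2), nn.ReLU(),   # encoder
    nn.Linear(2, emb_dim),              # decoder
)
optimizer = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

batch = torch.randint(0, n_categories, (32,))  # one categorical column, batch of 32
embedded = embedding(batch).detach()           # saved as a variable; no gradient flows back
reconstruction = autoencoder(embedded)
loss = loss_fn(reconstruction, embedded)
loss.backward()
optimizer.step()
```

Because `embedded` is detached, `embedding.weight` never receives a gradient, which is exactly the objection raised in the second post: the embeddings stay at their random initialization instead of being trained jointly with the autoencoder.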
attention-mixed-type-clustering
Posts with mentions or reviews of attention-mixed-type-clustering.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2023-06-25.
- How to learn Categorical Embeddings in Unsupervised Learning?

  I am an ML/DL beginner, but this sounds fishy to me, because the embeddings will not be trained by gradient descent. I tested this approach on a small tabular dataset against just feeding the categorical data into the AE (no embeddings) and found that the first approach (saving the embedded columns in a variable) moderately degraded clustering accuracy and NMI score (this is not representative; just a small test on a small dataset). Here is my Notebook.
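The NMI score mentioned in the post above is a standard clustering metric available in scikit-learn. The sketch below (not the author's notebook; the data is random stand-in) shows how such a comparison would typically be scored: cluster some latent representations with k-means and measure agreement with ground-truth labels.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import normalized_mutual_info_score

# Illustrative stand-ins for an autoencoder's latent codes and true labels.
rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 2))
true_labels = rng.integers(0, 3, size=100)

# Cluster the latent space, then score against the ground truth.
pred_labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(latent)
nmi = normalized_mutual_info_score(true_labels, pred_labels)
print(f"NMI: {nmi:.3f}")  # 1.0 = perfect agreement, near 0.0 = independent labelings
```

NMI is invariant to label permutations, which makes it suitable for comparing two unsupervised pipelines (embeddings vs. no embeddings) as the post describes.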
What are some alternatives?
When comparing autoembedder and attention-mixed-type-clustering you can also consider the following projects:
ds2 - Easiest way to use AI models without coding (Web UI & API support)
wysiwyh - A neural net to transform a video into audio in real time.
ALAE - [CVPR2020] Adversarial Latent Autoencoders
ludwig - Low-code framework for building custom LLMs, neural networks, and other AI models
poutyne - A simplified framework and utilities for PyTorch
pyod - A Comprehensive and Scalable Python Library for Outlier Detection (Anomaly Detection)
nni - An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.