contextualized-topic-models
OCTIS
| | contextualized-topic-models | OCTIS |
|---|---|---|
| Mentions | 7 | 7 |
| Stars | 1,157 | 681 |
| Growth | 1.2% | 1.9% |
| Activity | 5.0 | 6.0 |
| Latest commit | 3 months ago | 4 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
contextualized-topic-models
-
[Project]Topic modelling of tweets from the same user
In our experiments, CTM works well with tweets: https://github.com/MilaNLProc/contextualized-topic-models (I'm one of the authors)
-
Extract words from large data set of reviews by sentiment
Use CTM https://github.com/MilaNLProc/contextualized-topic-models with sentiment labels to build a distribution of words over labels
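The idea in the comment above - seeing which words are most characteristic of each sentiment label - can be sketched with plain relative word counts. This is only an illustration of the concept; the helper name is made up, and CTM itself models label-word associations probabilistically rather than by raw counting:

```python
from collections import Counter, defaultdict

def word_distributions_by_label(docs, labels):
    """For each label, map each word to its relative frequency
    across all documents carrying that label."""
    counts = defaultdict(Counter)
    for doc, label in zip(docs, labels):
        counts[label].update(doc.lower().split())
    distributions = {}
    for label, counter in counts.items():
        total = sum(counter.values())
        distributions[label] = {w: c / total for w, c in counter.items()}
    return distributions

reviews = ["great phone love it", "terrible battery terrible screen"]
labels = ["positive", "negative"]
dist = word_distributions_by_label(reviews, labels)
# dist["negative"]["terrible"] == 0.5 (2 of the 4 negative-label words)
```

Sorting each label's distribution then surfaces the words most associated with that sentiment.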
-
Using Transformer for Topic Modeling - what are the options?
This library from MILA seems quite neat! I haven't had the chance to play with it though: https://github.com/MilaNLProc/contextualized-topic-models
-
Categorize the Data - Topic Modelling algorithm
a bit of shameless self-promotion, but we developed a topic model (https://github.com/MilaNLProc/contextualized-topic-models) that actually supports that use case!
-
(NLP) Best practices for topic modeling and generating interesting topics?
If you use CTM, you can provide the topic model with two inputs: the preprocessed texts (used by the topic model to generate the topical words) and the unpreprocessed texts (used to generate the contextualized representations that are later concatenated with the document's bag-of-words representation). We saw that this slightly improves performance compared to feeding BERT the already-preprocessed text. This feature is supported in the original implementation of CTM, not in OCTIS. See here: https://github.com/MilaNLProc/contextualized-topic-models#combined-topic-model
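A minimal sketch of the two parallel inputs described above, in plain Python. The helper name and the tiny stop-word list are invented for illustration; the actual CTM library ships its own data-preparation utilities (see the linked README):

```python
import re

# Tiny illustrative stop-word list; a real pipeline would use a fuller one.
STOPWORDS = {"the", "a", "an", "is", "are", "and", "of", "to"}

def prepare_two_inputs(raw_docs):
    """Return the two parallel inputs the comment describes:
    - bow_texts: aggressively preprocessed, for the bag-of-words side
    - contextual_texts: left untouched, for the sentence-embedding side
    """
    bow_texts = []
    for doc in raw_docs:
        tokens = re.findall(r"[a-z]+", doc.lower())
        bow_texts.append(" ".join(t for t in tokens if t not in STOPWORDS))
    return bow_texts, list(raw_docs)

docs = ["The model IS trained on tweets!"]
bow, ctx = prepare_two_inputs(docs)
# bow[0] == "model trained on tweets"; ctx[0] keeps the original string
```

The point is simply that preprocessing is applied only to the bag-of-words input, while the embedding model sees the raw text it was pretrained on.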
-
Latest trends in topic modelling?
Cross-lingual Contextualized Topic Models with Zero-shot Learning, from a team at MilaNLP, uses bag-of-words representations in combination with multilingual embeddings from SBERT and works like a VAE (encode the input, then use the encoded representation to decode back to a bag of words as close to the input as possible). Using SBERT embeddings makes their model generalise to other languages, which may be useful. One major shortfall of this model, as I understand it, is that it can't deal with long documents very elegantly - only up to BERT's token limit (the workaround is to truncate and use the first words)
OCTIS
-
Interpretation of topic modeling results between LDA and BERTopic
OCTIS
-
(NLP) Best practices for topic modeling and generating interesting topics?
My team and I have recently released a Python library called OCTIS (https://github.com/mind-Lab/octis) that allows you to automatically optimize the hyperparameters of a topic model according to a given evaluation metric (not log-likelihood). I guess, in your case, you might be interested in topic coherence. So you will get good-quality topics with little effort spent on the choice of hyperparameters. We also included some state-of-the-art topic models, e.g. contextualized topic models (https://github.com/MilaNLProc/contextualized-topic-models).
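To show the shape of metric-driven hyperparameter search, here is a deliberately simplified stand-in: plain random search over two hyperparameters with a made-up scoring function. OCTIS itself uses Bayesian optimization and real coherence metrics; nothing below reflects its actual API:

```python
import random

def score_model(num_topics, alpha):
    """Hypothetical stand-in metric: pretends coherence peaks near 20 topics
    and small alpha. A real run would train a topic model on the corpus and
    compute e.g. NPMI coherence over its top words."""
    return -abs(num_topics - 20) / 20 - alpha

def random_search(trials=50, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        params = {"num_topics": rng.randrange(5, 55, 5),
                  "alpha": rng.uniform(0.01, 1.0)}
        score = score_model(**params)
        if best is None or score > best[0]:
            best = (score, params)
    return best

best_score, best_params = random_search()
```

Bayesian optimization improves on this loop by using past (params, score) pairs to pick promising candidates instead of sampling blindly, which matters when each trial means training a full topic model.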
-
I am working on a topic modelling paper and I need your help
I recently released a topic modeling library that also includes different evaluation measures. If you are interested, I leave here the link: https://github.com/mind-Lab/octis
-
Latest trends in topic modelling?
Silvia Terragni (a coauthor on the above) also built OCTIS, a topic modelling library presented as a demo paper, which aims to be the Hugging Face Transformers of topic modelling - it includes wrappers around the above model as well as LDA and other baselines, plus tools and frameworks for comparing them.
-
OCTIS a python framework to compare and optimize Topic Models
Links to the code and paper
- OCTIS, our new Python framework to optimize and compare topic models, has been accepted at EACL 2021!
- [p] OCTIS: Optimizing and Comparing Topic models Is Simple. Our new python framework to compare and optimize topic models using Bayesian Optimization
What are some alternatives?
BERTopic - Leveraging BERT and c-TF-IDF to create easily interpretable topics.
PolyFuzz - Fuzzy string matching, grouping, and evaluation.
auto-sklearn - Automated Machine Learning with scikit-learn
tika-python - Tika-Python is a Python binding to the Apache Tika™ REST services allowing Tika to be called natively in the Python community.
image-similarity-measures - :chart_with_upwards_trend: Implementation of eight evaluation metrics to assess the similarity between two images. The eight metrics are as follows: RMSE, PSNR, SSIM, ISSM, FSIM, SRE, SAM, and UIQ.
transformers - 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
SMAC3 - SMAC3: A Versatile Bayesian Optimization Package for Hyperparameter Optimization
Top2Vec - Top2Vec learns jointly embedded topic, document and word vectors.
TopMost - A Topic Modeling System Toolkit
Sentimentanalysis - Language independent sentiment analysis
mlconjug3 - A Python library to conjugate verbs in French, English, Spanish, Italian, Portuguese and Romanian (more soon) using Machine Learning techniques.