Research2Vec
Representing research papers as vectors / latent representations. (by Santosh-Gupta)
what_are_embeddings
A deep dive into embeddings starting from fundamentals (by veekaybee)
| | Research2Vec | what_are_embeddings |
|---|---|---|
| Mentions | 3 | 2 |
| Stars | 194 | 864 |
| Growth | - | - |
| Activity | 0.0 | 8.1 |
| Last commit | about 3 years ago | about 1 month ago |
| Language | Jupyter Notebook | Jupyter Notebook |
| License | - | - |
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Research2Vec
Posts with mentions or reviews of Research2Vec.
We have used some of these posts to build our list of alternatives
and similar projects.
- [P] 20K+ Arxiv ML Papers Vectorised, Cluster Application and Projector
- 20k+ ML Research Papers Vectorised + Clustered + Visualised! [OC]
In recent years, the number of research papers has grown tremendously. New areas pop up every day, but it is not always clear which areas are emerging or which interesting new area has just surfaced. I decided to cluster together 20k+ interesting machine learning papers that surfaced recently.

Cluster Application: https://cloud.relevance.ai/dataset/research2vec/deploy/cluster/jacky-wong/M0FQOVdINEJZQTVzdWJmNHdQaXI6M1NIMVFncm9TNENZeU1vNUNHTUVWZw/60\_dWH4Bq8SHcPzXrEpF

Embeddings Projector: https://cloud.relevance.ai/dataset/research2vec/deploy/projector/jacky-wong/NXNzdjUzNEIxczVzVVpOdUpabXE6TE92enhOZ1VTN2labDlocVZNNDlMUQ/4zQk534BY7n37LD0yk4A/old-australia-east/

I created the vectors using a fine-tuned version of Sentence Transformers' roberta-base model. What I scoped out from the problem:

- The training had to be unsupervised, because no one would know in advance what was in the dataset.
- An NLP embeddings-based approach with unsupervised clustering would be the simplest way to surface insights.

Interesting new topics I discovered: Federated Learning and Graph GANs were really interesting topics, along with the growth of Representation Learning.

Solution: To get some form of off-the-shelf domain adaptation, I used an off-the-shelf BART model for unsupervised query generation and then fine-tuned my roberta embeddings using a multiple negatives ranking loss based on SentenceTransformers. This seemed to work quite well, as the topics separated out nicely in my embeddings projector. I then trained the model on the titles and abstracts of the research papers so that it could better understand the data. Afterwards, I encoded the titles and clustered them using a simple K-Means algorithm.

Dataset: The dataset curation process was fairly straightforward. I used the arXiv API and scraped 20k papers from the query "machine learning" sometime in late 2020, before I began experimenting with the work.

I am looking for feedback on what others would like to see in this application and would be curious to hear suggestions on where I could improve. From previous research, I did find this repository: https://github.com/Santosh-Gupta/Research2Vec. However, as the dataset was different, I was unable to use the exact method provided. Disclaimer: I currently work for Relevance AI (the company behind the projector).
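The encode-then-cluster step of the pipeline above can be sketched in plain NumPy. The random vectors below are a stand-in for the paper-title embeddings (the post's fine-tuned roberta-base SentenceTransformer is not published), and the hand-rolled `kmeans` function with explicit initial centroids is a minimal, assumed implementation of the K-Means step:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for paper-title embeddings: two well-separated simulated
# "topics" of 10 papers each, in a small 8-dimensional space.
topic_a = rng.normal(loc=0.0, scale=0.1, size=(10, 8))
topic_b = rng.normal(loc=1.0, scale=0.1, size=(10, 8))
embeddings = np.vstack([topic_a, topic_b])

def kmeans(X, init_idx, iters=50):
    """Plain K-Means: assign points to the nearest centroid, recompute
    centroids as cluster means, repeat. init_idx picks starting points."""
    centroids = X[init_idx]
    for _ in range(iters):
        # Pairwise distances: shape (n_points, n_clusters).
        dists = np.linalg.norm(X[:, None] - centroids[None, :], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array(
            [X[labels == j].mean(axis=0) for j in range(len(centroids))]
        )
    return labels

# Start with one centroid from each simulated topic.
labels = kmeans(embeddings, init_idx=[0, 10])
# Papers from the same simulated topic land in the same cluster.
print(labels)
```

In practice the embeddings would come from `SentenceTransformer.encode(titles)` and the clustering from `sklearn.cluster.KMeans`; the sketch only shows the mechanics.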
- 20k+ ML Research Papers Vectorised + Clustered + Visualised!
what_are_embeddings
Posts with mentions or reviews of what_are_embeddings.
We have used some of these posts to build our list of alternatives
and similar projects. The last one was on 2024-04-19.
- The Illustrated Word2Vec
That is essentially correct. You take an object and "embed" it in a high-dimensional vector space to represent it.
For a deep dive, I highly recommend Vicki Boykis's free materials:
https://vickiboykis.com/what_are_embeddings/
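The idea of "embedding" objects so that similar ones sit near each other can be shown with a toy example. The vectors below are hand-made illustrative values, not real learned embeddings; cosine similarity is the usual closeness measure:

```python
import numpy as np

# Toy, hand-made 4-dimensional "embeddings" (values are illustrative
# only): similar objects are placed near each other in the space.
vectors = {
    "cat":   np.array([0.9, 0.8, 0.1, 0.0]),
    "dog":   np.array([0.8, 0.9, 0.2, 0.1]),
    "plane": np.array([0.0, 0.1, 0.9, 0.8]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 for parallel vectors, 0.0 for orthogonal."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(vectors["cat"], vectors["dog"]))    # near 1: similar
print(cosine(vectors["cat"], vectors["plane"]))  # near 0: dissimilar
```

Real embedding models learn these coordinates from data, but the geometry, and the way similarity becomes a dot-product computation, is the same.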
- GPT Weekly - 3rd July Edition - Adobe’s Safety Net, Open-Source AI: Expanded Context Lengths and more.
LLMs require embeddings to work. What are embeddings? And did you know embeddings also power recommendation engines? Another guide on embeddings.
What are some alternatives?
When comparing Research2Vec and what_are_embeddings, you can also consider the following projects:
gpt-migrate - Easily migrate your codebase from one framework or language to another.