GEM VS cleora

Compare GEM vs cleora and see what their differences are.

                 GEM                                        cleora
Mentions         1                                          8
Stars            1,264                                      472
Growth           -                                          0.6%
Activity         0.0                                        2.4
Latest commit    5 months ago                               6 months ago
Language         Python                                     Jupyter Notebook
License          BSD 3-clause "New" or "Revised" License    GNU General Public License v3.0 or later
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
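
The exact activity formula is not published here; purely to illustrate the description above (recent commits weigh more, and the score reads like a percentile on a 0-10 scale), a recency-weighted ranking could look roughly like the following sketch. The half-life and the percentile mapping are assumptions, not the site's actual computation.

    # Illustrative only: assumes an exponentially decayed commit weight and a
    # percentile-based 0-10 score, matching the description (recent commits weigh
    # more; 9.0 ~ top 10% of tracked projects). Not the site's real formula.
    from datetime import datetime, timezone

    def raw_activity(commit_dates, half_life_days=90):
        """Sum of decayed commit weights: a commit's weight halves every half_life_days."""
        now = datetime.now(timezone.utc)
        return sum(0.5 ** ((now - d).days / half_life_days) for d in commit_dates)

    def activity_score(project_raw, all_raw):
        """Map a project's raw score to 0-10 by its percentile among all tracked projects."""
        rank = sum(1 for r in all_raw if r <= project_raw)
        return round(10.0 * rank / len(all_raw), 1)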

GEM

Posts with mentions or reviews of GEM. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-01-04.
  • [D] Why I'm Lukewarm on Graph Neural Networks
    4 projects | /r/MachineLearning | 4 Jan 2021
    Besides, they implemented a fast C++ version of the code that works for much larger graphs. If one searches for ProNE's implementation, they would (hypothetically) find the scikit-style wrapper instead of the fully functional release. It reminds me of a situation with HOPE, when the authors of one survey "implemented" it as a naive SVD (https://github.com/palash1992/GEM/blob/master/gem/embedding/hope.py#L68) instead of the Jacobi-Davidson generalized solver described in the paper (and literally with code released!!). In the end, I would assume that poor paper was cited less because of that repackaging effort.
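
For context, the "naive SVD" criticized in that quote boils down to building the Katz proximity matrix explicitly and factorizing it with a truncated SVD, rather than using the generalized solver from the HOPE paper. A minimal sketch of that naive approach follows; the graph, beta, and dimension are made-up values, and this is an illustration of the idea, not the linked GEM code verbatim.

    # Naive HOPE: explicit Katz proximity matrix + truncated SVD. This is the
    # approach that does not scale well, which is the point being made above.
    import numpy as np
    import networkx as nx
    from scipy.sparse import identity, csc_matrix
    from scipy.sparse.linalg import inv, svds

    def hope_naive_svd(graph, d=4, beta=0.01):
        A = nx.to_scipy_sparse_array(graph, format="csc", dtype=float)
        n = A.shape[0]
        # Katz proximity: S = (I - beta*A)^-1 @ (beta*A)
        M_g = csc_matrix(identity(n) - beta * A)
        S = inv(M_g) @ csc_matrix(beta * A)
        # Truncated SVD of the (effectively dense) proximity matrix.
        u, s, vt = svds(S, k=d // 2)
        sigma_sqrt = np.sqrt(s)
        source = u * sigma_sqrt       # outgoing-proximity embeddings
        target = vt.T * sigma_sqrt    # incoming-proximity embeddings
        return np.hstack([source, target])

    emb = hope_naive_svd(nx.gnp_random_graph(100, 0.05, seed=42, directed=True))
    print(emb.shape)  # (100, 4)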

cleora

Posts with mentions or reviews of cleora. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-02-11.
  • [R] Cleora: A Simple, Strong and Scalable Graph Embedding Scheme
    2 projects | /r/MachineLearning | 11 Feb 2021
    Abstract: The area of graph embeddings is currently dominated by contrastive learning methods, which demand formulation of an explicit objective function and sampling of positive and negative examples. This creates a conceptual and computational overhead. Simple, classic unsupervised approaches like Multidimensional Scaling (MDS) or the Laplacian eigenmap skip the necessity of tedious objective optimization, directly exploiting data geometry. Unfortunately, their reliance on very costly operations such as matrix eigendecomposition makes them unable to scale to large graphs that are common in today's digital world. In this paper we present Cleora: an algorithm which gets the best of both worlds, being both unsupervised and highly scalable. We show that high quality embeddings can be produced without the popular step-wise learning framework with example sampling. An intuitive learning objective of our algorithm is that a node should be similar to its neighbors, without explicitly pushing disconnected nodes apart. The objective is achieved by iterative weighted averaging of node neighbors' embeddings, followed by normalization across dimensions. Thanks to the averaging operation the algorithm makes rapid strides across the embedding space and usually reaches optimal embeddings in just a few iterations. Cleora runs faster than other state-of-the-art CPU algorithms and produces embeddings of competitive quality as measured on downstream tasks: link prediction and node classification. We show that Cleora learns a data abstraction that is similar to contrastive methods, yet at much lower computational cost. We open-source Cleora under the MIT license allowing commercial use under this https URL.
    2 projects | /r/MachineLearning | 11 Feb 2021
    Our team at Synerise AI has open-sourced Cleora - an ultra-fast vertex embedding tool for graphs & hypergraphs. If you've ever used node2vec, DeepWalk, LINE or similar methods - it might be worth checking out.
  • [D] Why I'm Lukewarm on Graph Neural Networks
    4 projects | /r/MachineLearning | 4 Jan 2021
    Thanks for raising so many interesting points about model performance and complexity. In this context, I think our newly released graph embedding library - Cleora - might be of interest: https://github.com/Synerise/cleora Cleora has some nice performance-wise properties:
  • Rusticles #20 - Wed Nov 18 2020
    10 projects | dev.to | 17 Nov 2020
    Synerise/cleora (Rust): Cleora AI is a general-purpose model for efficient, scalable learning of stable and inductive entity embeddings for heterogeneous relational data.
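
The abstract quoted above describes Cleora's core loop: iteratively replace each node's embedding with a weighted average of its neighbors' embeddings, then normalize each vector across its dimensions. A rough NumPy sketch of that idea follows; it is a toy reading of the description (random initialization, row-normalized adjacency, per-node L2 normalization), not the Rust implementation released by Synerise.

    # Toy Cleora-style iteration: average neighbor embeddings, then L2-normalize.
    # An illustrative reading of the abstract, not the Synerise/cleora Rust code.
    import numpy as np
    import networkx as nx

    def cleora_like_embedding(graph, dim=8, n_iter=3, seed=0):
        rng = np.random.default_rng(seed)
        nodes = list(graph.nodes())
        A = nx.to_scipy_sparse_array(graph, nodelist=nodes, format="csr", dtype=float)
        deg = np.asarray(A.sum(axis=1)).ravel()
        deg[deg == 0] = 1.0
        W = A.multiply(1.0 / deg[:, None]).tocsr()   # row-normalized adjacency
        emb = rng.uniform(-1.0, 1.0, size=(len(nodes), dim))
        for _ in range(n_iter):
            emb = W @ emb                            # weighted average of neighbor embeddings
            norms = np.linalg.norm(emb, axis=1, keepdims=True)
            emb = emb / np.clip(norms, 1e-12, None)  # normalize each node's vector
        return dict(zip(nodes, emb))

    emb = cleora_like_embedding(nx.karate_club_graph())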

What are some alternatives?

When comparing GEM and cleora you can also consider the following projects:

i3status-rust - Very resource-friendly and feature-rich replacement for i3status, written in pure Rust

Owlyshield - Owlyshield is an EDR framework designed to safeguard vulnerable applications from potential exploitation (C&C, exfiltration and impact).

node2vec-c - node2vec implementation in C++

textsynth - An (unofficial) Rust wrapper for the TextSynth API.

finalfusion-rust - finalfusion embeddings in Rust

sursis - A [personal]<-[notebook]->[network]. Complete with custom numerics for constrained Gaussian gravitation physics.

yourcontrols - Shared cockpit for Microsoft Flight Simulator.

PyO3 - Rust bindings for the Python interpreter

karateclub - Karate Club: An API Oriented Open-source Python Framework for Unsupervised Learning on Graphs (CIKM 2020)

dog - A command-line DNS client.

ggez - Rust library to create a Good Game Easily

LAGraph - This is a library plus a test harness for collecting algorithms that use the GraphBLAS. For test coverage reports, see https://graphblas.org/LAGraph/ . Documentation: https://lagraph.readthedocs.org