| | coral-cnn | horovod |
|---|---|---|
| Mentions | 4 | 1 |
| Stars | 330 | 11,889 |
| Stars growth | 2.1% | - |
| Activity | 0.0 | 9.4 |
| Latest commit | about 3 years ago | over 2 years ago |
| Language | Python | Python |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
coral-cnn
-
[D] Why is Ordinal Regression so overlooked?
The most recent and usable DL attempt I have found is the CORAL/CORN frameworks (keras, pytorch) which have just a few stars, and that's it.
-
[D] can regression models be used for ranking?
To your question, there are specific types of models called ordinal regression / ordinal classification models that do not assume a metric distance between values. E.g., if you have "$20/hr, $15/hr, $0/hr", these models don't assume that the distance between 0 and 15 is 3x the distance between 20 and 15. They just assume 20 > 15 > 0. We worked on this a bit in the context of neural networks: https://www.sciencedirect.com/science/article/pii/S016786552030413X , https://raschka-research-group.github.io/coral_pytorch/
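A minimal sketch of how such ordinal models typically set up their training targets: each ordinal label is expanded into K-1 cumulative binary tasks ("is the label greater than level k?"), so only the ordering of the levels matters, not the numeric gaps between them. The function name and encoding here are illustrative, not taken from any particular library:

```python
def ordinal_to_binary_targets(y, num_classes):
    """Encode an ordinal label y in {0, ..., K-1} as K-1 cumulative
    binary targets, where target[k] = 1 if y > k, else 0.
    Only the ordering of the labels is used, not their distances."""
    return [1 if y > k else 0 for k in range(num_classes - 1)]

# Using the hourly-rate example with three ordered levels
# ($0/hr -> 0, $15/hr -> 1, $20/hr -> 2), so K = 3:
print(ordinal_to_binary_targets(0, 3))  # [0, 0]
print(ordinal_to_binary_targets(1, 3))  # [1, 0]
print(ordinal_to_binary_targets(2, 3))  # [1, 1]
```

Note that the encoding is identical whether the underlying values are 0/15/20 or 0/1/2: the metric distance between levels never enters the targets.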
-
[D] Modeling class errors
If you are interested, I recently worked on a simple ordinal regression approach for neural networks here: https://www.sciencedirect.com/science/article/pii/S016786552030413X
-
[R] [D] What machine learning methods can be used for ordinal regression?
Just took a quick look at that paper, and it sounds like a good approach. If you are interested, we recently developed an ordinal regression approach with an implementation in PyTorch (https://github.com/Raschka-research-group/coral-cnn). Someone also recently ported it to Keras: https://github.com/ck37/coral-ordinal. I haven't read the paper you mentioned in detail, but it seems our method is similar, except that we predict the rank by counting the probabilities that are >0.5, and that we have theoretical guarantees for rank consistency.
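The prediction rule described above can be sketched in a few lines: given the K-1 binary-task probabilities, the predicted rank is simply the number of them that exceed 0.5. The function name is illustrative; this only yields a sensible rank when the probabilities are rank-consistent (monotonically non-increasing), which is the guarantee the CORAL paper establishes:

```python
def probas_to_rank(probas):
    """Convert K-1 binary-task probabilities into a rank label
    by counting how many exceed 0.5. Assumes the probabilities
    are rank-consistent (non-increasing across levels)."""
    return sum(1 for p in probas if p > 0.5)

# Three levels above 0.5 -> predicted rank 3 (out of K = 5 classes):
print(probas_to_rank([0.95, 0.8, 0.6, 0.2]))  # 3
```

Without the rank-consistency guarantee, an output like [0.9, 0.2, 0.7] would be ambiguous (the tasks contradict each other); with it, the count and the first-threshold crossing always agree.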
horovod
-
[P] Cost of distributed deep learning on AWS
Code for https://arxiv.org/abs/1802.05799 found: https://github.com/uber/horovod
What are some alternatives?
coral-ordinal - Tensorflow Keras implementation of ordinal regression using consistent rank logits (CORAL) by Cao et al. (2019)
tf-encrypted - A Framework for Encrypted Machine Learning in TensorFlow
datatap-python - Focus on Algorithm Design, Not on Data Wrangling
rxray - Ray distributed computing integration for RxPY
contrastive-unpaired-translation - Contrastive unpaired image-to-image translation, faster and lighter training than cyclegan (ECCV 2020, in PyTorch)
seq2seq - A general-purpose encoder-decoder framework for Tensorflow
PyTorchZeroToAll - Simple PyTorch Tutorials Zero to ALL!
polyaxon - MLOps Tools For Managing & Orchestrating The Machine Learning LifeCycle
ludwig - Low-code framework for building custom LLMs, neural networks, and other AI models
ray_snowflake - Ray Data Connector for Snowflake
client - DagsHub client libraries
deep-significance - Enabling easy statistical significance testing for deep neural networks.