We haven't tracked posts mentioning shield yet.
Tracking mentions began in Dec 2020.
[D] Best methods for imbalanced multi-class classification with high dimensional, sparse predictors
2 projects | reddit.com/r/MachineLearning | 19 Jul 2021
The best method I've seen for dealing with this bias is to create "artificial contrasts": include (possibly many) permuted copies of each feature, then statistically test each feature's random-forest importance against that of its shuffled contrasts. The method is described here: https://www.jmlr.org/papers/volume10/tuv09a/tuv09a.pdf and there is an implementation here: https://github.com/ryanbressler/CloudForest
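The idea in the comment above can be sketched in a few lines. This is a minimal illustration, not the CloudForest or Tuv et al. implementation: it appends one shuffled ("shadow") copy of every column, fits a random forest, and counts how often each real feature's importance beats the best shadow importance across repeated fits. The dataset, the repeat count, and the 50% win-rate threshold are all illustrative choices.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=8,
                           n_informative=3, random_state=0)

n_repeats = 5
wins = np.zeros(X.shape[1])
for _ in range(n_repeats):
    # Shadow features: each column permuted independently, so any
    # association with y is destroyed but marginal distributions are kept.
    shadows = rng.permuted(X, axis=0)
    X_aug = np.hstack([X, shadows])
    rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_aug, y)
    imp = rf.feature_importances_
    real, shadow = imp[:X.shape[1]], imp[X.shape[1]:]
    # A feature "wins" this round if it outranks every shadow feature.
    wins += real > shadow.max()

# Keep features that beat all shadows in a majority of repeats.
selected = np.where(wins / n_repeats > 0.5)[0]
print(selected)
```

The paper uses a proper statistical test over the per-feature contrast rather than a simple max-of-shadows cutoff, but the structure — real columns versus permuted copies competing for importance inside the same forest — is the same.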
What are some alternatives?
gago - :four_leaf_clover: Evolutionary optimization library for Go (genetic algorithm, particle swarm optimization, differential evolution)
Gorgonia - Gorgonia is a library that helps facilitate machine learning in Go.
go-deep - Artificial Neural Network
libsvm - Go port of LIBSVM
go-galib - Genetic Algorithms library written in Go / golang
gobrain - Neural networks written in Go
goRecommend - Collaborative Filtering (CF) Algorithms in Go!
tfgo - Tensorflow + Go, the gopher way
GoLearn - Machine Learning for Go