Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
We haven't tracked posts mentioning go-galib yet.
Tracking mentions began in Dec 2020.
[D] Best methods for imbalanced multi-class classification with high dimensional, sparse predictors
2 projects | reddit.com/r/MachineLearning | 19 Jul 2021
The best method I've seen for dealing with this bias is to create "artificial contrasts": include (possibly many) permuted copies of each feature, then statistically test each feature's random-forest importance against that of its shuffled contrasts. The method is described here: https://www.jmlr.org/papers/volume10/tuv09a/tuv09a.pdf and there is an implementation here: https://github.com/ryanbressler/CloudForest
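The contrast-construction step described above can be sketched in Go. This is a minimal, illustrative sketch (function and variable names are my own, not CloudForest's API): each feature column is copied and independently shuffled across rows, which breaks any association with the target while preserving the column's marginal distribution, so real importances can later be compared against these null contrasts.

```go
package main

import (
	"fmt"
	"math/rand"
	"sort"
)

// withContrasts appends one "artificial contrast" per feature: a copy of that
// feature column with its values shuffled across rows. The returned matrix has
// the original p columns followed by p shuffled-contrast columns.
// Illustrative helper, not part of any library's API.
func withContrasts(X [][]float64, seed int64) [][]float64 {
	rng := rand.New(rand.NewSource(seed))
	n := len(X)
	if n == 0 {
		return nil
	}
	p := len(X[0])
	out := make([][]float64, n)
	for i := range out {
		out[i] = make([]float64, 2*p)
		copy(out[i], X[i]) // keep the real features in columns 0..p-1
	}
	for j := 0; j < p; j++ {
		perm := rng.Perm(n) // independent row permutation per column
		for i := 0; i < n; i++ {
			out[i][p+j] = X[perm[i]][j]
		}
	}
	return out
}

func main() {
	X := [][]float64{{1, 10}, {2, 20}, {3, 30}, {4, 40}}
	aug := withContrasts(X, 1)
	fmt.Println(len(aug), len(aug[0])) // 4 4 (2 real + 2 contrast columns)

	// Each contrast column is a permutation of its source column.
	col := []float64{aug[0][2], aug[1][2], aug[2][2], aug[3][2]}
	sort.Float64s(col)
	fmt.Println(col) // [1 2 3 4]
}
```

After training a forest on the augmented matrix, a feature would be kept only if its importance is significantly larger than the importances of the contrast columns (Tuv et al. use a statistical test across repeated shuffles).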
What are some alternatives?
gago - :four_leaf_clover: Evolutionary optimization library for Go (genetic algorithm, particle swarm optimization, differential evolution)
ddt - Golang Dynamic Decision Tree
gobrain - Neural Networks written in Go
bayesian - Naive Bayesian Classification for Golang.
Gorgonia - Gorgonia is a library that helps facilitate machine learning in Go.
GoLearn - Machine Learning for Go
goga - Golang Genetic Algorithm
shield - Bayesian text classifier with flexible tokenizers and storage backends for Go
goRecommend - Collaborative Filtering (CF) Algorithms in Go!