Gorgonia vs transformers
| | Gorgonia | transformers |
|---|---|---|
| Mentions | 21 | 173 |
| Stars | 5,326 | 124,115 |
| Growth | 1.0% | 2.4% |
| Activity | 2.8 | 10.0 |
| Latest commit | 17 days ago | 7 days ago |
| Language | Go | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Gorgonia
-
Machine Learning in Go! 🤯
GitHub - gorgonia/gorgonia: Gorgonia is a library that helps facilitate machine learning in Go.
-
Machine Learning
I did end up writing and using a custom library for Random Forest (it's also listed in Awesome Go) in one real-world project (detecting Alzheimer's and Parkinson's from speech in a mobile app): https://github.com/malaschitz/randomForest. I had better results than the team who used TensorFlow, and most importantly I didn't have to use any technology other than Go. For NNs it's probably best to use https://gorgonia.org/, but it's not exactly a user-friendly library. There is, however, a whole book on it: Hands-On Deep Learning with Go.
- Why isn’t Go used in AI/ML?
- GoLang AI/ML open source projects
-
A systematic framework for technical documentation authoring
Perhaps it's a product of French culture, but because Gorgonia[0] has a number of French contributors, this was actually the way we structured our documentation.
But this is the first time I've heard the framework's name.
[0]: https://gorgonia.org
-
[D] When was the last time you wrote a custom neural net?
Oh, it's Gorgonia.
-
Most Popular GoLang Frameworks
Website: https://gorgonia.org
-
[D] What framework are you using?
I use Gorgonia.
-
Why can't Go be popular for machine learning?
What do you think about this: https://github.com/gorgonia/gorgonia? I also recall there is something else out there, but I can't find it at the moment...
-
Neural networks in golang
Yep, all of them: https://github.com/gorgonia/gorgonia
transformers
-
AI enthusiasm #6 - Finetune any LLM you want💡
Most of this tutorial is based on the Hugging Face course about Transformers and on Niels Rogge's Transformers tutorials: make sure to check out their work and give them a star on GitHub, if you please ❤️
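As a taste of what such a fine-tuning tutorial covers, here is a minimal sketch using the transformers Trainer API. The model (distilgpt2) and dataset (a slice of wikitext-2) are stand-ins chosen for illustration, not necessarily what the linked tutorials use:

```python
# Minimal causal-LM fine-tuning sketch with the transformers Trainer API.
# Model and dataset names below are illustrative placeholders.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_name = "distilgpt2"  # small model, for illustration only
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained(model_name)

# 1% of wikitext-2 keeps the example fast to run
dataset = load_dataset("wikitext", "wikitext-2-raw-v1", split="train[:1%]")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out",
                           per_device_train_batch_size=8,
                           num_train_epochs=1),
    train_dataset=tokenized,
    # mlm=False -> labels are the inputs shifted, i.e. causal LM loss
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```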
-
Schedule-Free Learning – A New Way to Train
* Superconvergence + LR range finder + Fast AI's Ranger21 optimizer was the go-to optimizer for CNNs, and worked fabulously well, but on transformers the learning rate range finder said 1e-3 was best, whilst 1e-5 was actually better. However, the one-cycle learning rate schedule stuck. https://github.com/huggingface/transformers/issues/16013
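For reference, the one-cycle policy mentioned above is available as a built-in PyTorch scheduler. A minimal sketch with a stand-in model; the max_lr and step count are placeholders, not the commenter's settings:

```python
# One-cycle LR schedule: warm up to max_lr, then anneal back down.
import torch

model = torch.nn.Linear(10, 2)  # stand-in model
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-5)
steps = 1000

# OneCycleLR manages the learning rate itself (it overrides the
# optimizer's lr), rising to max_lr and then annealing over the cycle.
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=1e-3, total_steps=steps)

for step in range(steps):
    x = torch.randn(32, 10)
    loss = model(x).pow(2).mean()  # dummy loss for the sketch
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    scheduler.step()  # one-cycle steps once per batch, not per epoch
```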
-
Gemma doesn't suck anymore – 8 bug fixes
Thanks! :) I'm pushing them into transformers and pytorch-gemma, and collaborating with the Gemma team to resolve all the issues :)
The RoPE fix should already be in transformers 4.38.2: https://github.com/huggingface/transformers/pull/29285
My main PR for transformers which fixes most of the issues (some still left): https://github.com/huggingface/transformers/pull/29402
- HuggingFace Transformers: Qwen2
- HuggingFace Transformers Release v4.36: Mixtral, Llava/BakLlava, SeamlessM4T v2
- HuggingFace: Support for the Mixtral Moe
-
Paris-Based Startup and OpenAI Competitor Mistral AI Valued at $2B
If you want to tinker with the architecture Hugging Face has a FOSS implementation in transformers: https://github.com/huggingface/transformers/blob/main/src/tr...
If you want to reproduce the training pipeline, you couldn't do that even if you wanted to because you don't have access to thousands of A100s.
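That said, tinkering with the architecture itself needs no A100s at all: transformers exposes the model classes directly, so one can instantiate a scaled-down, randomly initialized variant from a config. A sketch with made-up tiny hyperparameters (assumes a transformers version that ships MistralConfig, 4.34+):

```python
# Build a miniature, randomly initialized Mistral to experiment with the
# architecture locally. All sizes below are illustrative, not the real ones.
import torch
from transformers import MistralConfig, MistralForCausalLM

config = MistralConfig(
    hidden_size=256,        # full model uses 4096
    intermediate_size=512,
    num_hidden_layers=4,    # full model uses 32
    num_attention_heads=8,
    num_key_value_heads=2,  # grouped-query attention
    sliding_window=128,     # Mistral's sliding-window attention
    vocab_size=32000,
)
model = MistralForCausalLM(config)  # random weights, nothing downloaded
print(sum(p.numel() for p in model.parameters()), "parameters")

ids = torch.randint(0, config.vocab_size, (1, 16))
print(model(input_ids=ids).logits.shape)  # (1, 16, 32000)
```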
-
Failing to reproduce the same evaluation metric scores during inference
I am aware that using mixed precision reduces the stability of the weights and that there will be some inconsistency, but I didn't expect it to be this much. I have attached a graph of the evaluation metrics. If someone can give me some insight into this issue, that would be great.
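For context on why mixed precision shifts metrics at all: float16 keeps roughly 11 bits of mantissa, so every cast or operation rounds, and those per-op errors compound across the layers of a deep network. A tiny illustration of the per-element rounding alone:

```python
# float16 machine epsilon is ~1e-3, vs ~1.2e-7 for float32; a simple
# fp32 -> fp16 -> fp32 round trip already loses information per element.
import torch

print(torch.finfo(torch.float16).eps)  # ~0.000977
print(torch.finfo(torch.float32).eps)  # ~1.19e-07

torch.manual_seed(0)
x = torch.randn(8)
roundtrip = x.to(torch.float16).to(torch.float32)
print((x - roundtrip).abs().max())  # nonzero rounding error per op
```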
-
[D] What is a good way to maintain code readability and code quality while scaling up complexity in libraries like Hugging Face?
In transformers, they tried really hard to have a single function or method that deals with both self- and cross-attention mechanisms, masking, positional and relative encodings, interpolation, etc. While it allows a user to use the same function/method for any model, it has led to severe parameter bloat. Just compare the original implementation of llama by FAIR with the implementation by HF to get an idea.
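To make the trade-off concrete, here is a deliberately simplified sketch (not transformers code) of one attention function stretched to cover self- and cross-attention plus optional masks and relative position biases; each new variant adds another parameter, which is exactly how such signatures bloat:

```python
# One attention function covering several variants via optional arguments.
import torch
import torch.nn.functional as F

def attention(query, key_value=None, attn_mask=None, position_bias=None):
    kv = query if key_value is None else key_value  # self vs cross
    d = query.shape[-1]
    scores = query @ kv.transpose(-2, -1) / d ** 0.5
    if position_bias is not None:                   # relative encodings
        scores = scores + position_bias
    if attn_mask is not None:                       # padding/causal mask
        scores = scores.masked_fill(attn_mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ kv

x = torch.randn(2, 5, 64)    # self-attention call
ctx = torch.randn(2, 9, 64)  # cross-attention call
print(attention(x).shape, attention(x, key_value=ctx).shape)
```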
-
Mixtral-7b-8expert working in Oobabooga (unquantized multi-gpu)
pip install git+https://github.com/huggingface/transformers.git@main
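Installing from main was needed, presumably because Mixtral support had not yet reached a tagged release at the time. Once installed, the model loads through the standard Auto classes; a sketch using the official Mistral AI checkpoint for illustration (the thread's specific mixtral-7b-8expert checkpoint may differ):

```python
# Load Mixtral with the standard Auto classes; the full model is large,
# so this assumes multiple GPUs (device_map="auto" shards the layers).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halve memory vs fp32
    device_map="auto",          # spread layers over available GPUs
)

inputs = tokenizer("The gopher said", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```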
What are some alternatives?
onnx-go - onnx-go gives the ability to import a pre-trained neural network within Go without being linked to a framework or library.
fairseq - Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
GoLearn - Machine Learning for Go
sentence-transformers - Multilingual Sentence & Image Embeddings with BERT
tfgo - Tensorflow + Go, the gopher way
llama - Inference code for Llama models
goml - On-line Machine Learning in Go (and so much more)
transformer-pytorch - Transformer: PyTorch Implementation of "Attention Is All You Need"
gosseract - Go package for OCR (Optical Character Recognition), by using Tesseract C++ library
text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.
bayesian - Naive Bayesian Classification for Golang.
huggingface_hub - The official Python client for the Huggingface Hub.