|  | PyTorch | Flux.jl |
|---|---|---|
| Mentions | 381 | 27 |
| Stars | 86,466 | 4,572 |
| Growth | 1.2% | 0.3% |
| Activity | 10.0 | 9.1 |
| Latest commit | 3 days ago | 17 days ago |
| Language | Python | Julia |
| License | BSD 3-Clause "New" or "Revised" License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
PyTorch
- Must-Know 2025 Developer’s Roadmap and Key Programming Trends
Python’s Growth in Data Work and AI: Python continues to lead because of its easy-to-read style and the huge number of libraries available for tasks from data work to artificial intelligence. Tools like TensorFlow and PyTorch make it a must-have. Whether you’re experienced or just starting, Python’s clear style makes it a good choice for diving into machine learning. Actionable Tip: If you’re new to Python, try projects that combine data with everyday problems. For example, build a simple recommendation system using Pandas and scikit-learn.
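As a sketch of that tip: the toy ratings, item names, and helper functions below are all hypothetical, and the cosine similarity is written out by hand with pandas and NumPy; scikit-learn's `cosine_similarity` would serve the same role on real data.

```python
import numpy as np
import pandas as pd

# Hypothetical user x item rating matrix (0 = not yet rated).
ratings = pd.DataFrame(
    {"alice": [5, 4, 0, 1], "bob": [4, 5, 1, 0], "carol": [0, 1, 5, 4]},
    index=["film_a", "film_b", "film_c", "film_d"],
).T  # rows: users, columns: films

def cosine_sim(a, b):
    # Cosine similarity between two rating vectors.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def most_similar_user(user):
    # Find the neighbour whose ratings point in the closest direction.
    others = ratings.drop(index=user)
    sims = others.apply(lambda row: cosine_sim(ratings.loc[user], row), axis=1)
    return sims.idxmax()

def recommend(user):
    # Recommend the neighbour's top-rated item the user hasn't rated yet.
    neighbour = most_similar_user(user)
    unseen = ratings.loc[user] == 0
    return ratings.loc[neighbour][unseen].idxmax()

print(recommend("alice"))  # alice's closest neighbour is bob -> film_c
```

The same idea scales to implicit feedback (clicks, purchases) by replacing the ratings matrix with an interaction matrix.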
- Decorator JITs: Python as a DSL
Basically this style of code (https://github.com/pytorch-labs/attention-gym/pull/84/files) has issues like these: https://github.com/pytorch/pytorch/pull/137452, https://github.com/pytorch/pytorch/issues/144511, https://github.com/pytorch/pytorch/issues/145869
For some higher level context, see https://pytorch.org/blog/flexattention/
- Building an AI-powered Financial Data Analyzer with NodeJS, Python, SvelteKit, and TailwindCSS - Part 0
The AI Service will be built using aiohttp (an asynchronous Python web server) and will integrate PyTorch, Hugging Face Transformers, numpy, pandas, and scikit-learn for financial data analysis.
- PyTorch 2.6.0 Release
- Responsible Innovation: Open Source Best Practices for Sustainable AI
Open source frameworks like PyTorch are already enabling Machine Learning breakthroughs because they’re living communities where great things happen through:
- Golang Vs. Python Performance: Which Programming Language Is Better?
- Data Science and AI: TensorFlow, PyTorch, and scikit-learn are only a few of the standard Python libraries.
- Web Development: building web-based applications is made simple by frameworks such as Flask and Django.
- Prototyping: Python's ease of use lets you quickly iterate and test concepts.
- How to resolve the dlopen problem with Nvidia and PyTorch or Tensorflow inside a virtual env
As it happens, Tensorflow or PyTorch can work with pip packages from Nvidia.
- Making VLLM work on WSL2
- 2025’s Must-Know Tech Stacks
PyTorch
- Experiments with Byte Matrix Multiplication
> It's quite common in machine learning operations to multiply a matrix of unsigned bytes by a matrix of signed bytes. Don't ask me why, but that's the case.
Overflow is the reason. Intel's vpmaddubsw takes an int8_t and a uint8_t and gives you results in int16_t. If both operands were unsigned, 255 * 255 = 65025 would be out of range for int16_t, which is likely why the instruction is designed to take one int8_t and one uint8_t. The overflow (or rather saturation, with this instruction) can still occur because it sums two adjacent multiplications. See my comment in PyTorch: https://github.com/pytorch/pytorch/blob/a37db5ae3978010e1bb7...
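The arithmetic behind that comment is easy to check in a few lines of plain Python, mirroring the mixed-signedness and pairwise-sum-with-saturation behaviour described above:

```python
# int16 range that vpmaddubsw accumulates into.
INT16_MIN, INT16_MAX = -32768, 32767

# If both inputs were uint8, a single product already overflows int16:
assert 255 * 255 == 65025 > INT16_MAX

# With one uint8 and one int8 operand, every single product fits...
assert INT16_MIN <= 255 * -128 <= INT16_MAX

# ...but the instruction sums two adjacent products, so the pair sum
# can still fall outside int16 and gets clamped (saturated):
pair = 255 * -128 + 255 * -128          # -65280
saturated = max(min(pair, INT16_MAX), INT16_MIN)
print(saturated)  # -32768
```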
Flux.jl
- Flux is fast and it's open source
- Micrograd.jl
- and the killer one: none of my coauthors used Julia, so I decided to just go with PyTorch.
PyTorch has been just fine, and it's nice to not have to reinvent the wheel for every new model architecture.
[0] https://fluxml.ai/
- The open-weight Flux text-to-image model is next level
The name is a bit unfortunate, given that Julia's main ML library is called Flux; see https://fluxml.ai.
This library is quite well known and has been around since at least 2016: https://github.com/FluxML/Flux.jl/graphs/code-frequency.
- Julia 1.10 Released
- What Apple hardware do I need for CUDA-based deep learning tasks?
If you are really committed to running on Apple hardware then take a look at Tensorflow for macOS. Another option is the Julia programming language which has very basic Metal support at a CUDA-like level. FluxML would be the ML framework in Julia. I’m not sure either option will be painless or let you do everything you could do with a Nvidia GPU.
- [D] ClosedAI license, open-source license which restricts only OpenAI, Microsoft, Google, and Meta from commercial use
Flux dominance!
- What would be your programming language of choice to implement a JIT compiler?
I’m no compiler expert, but check out Flux and Zygote: https://fluxml.ai/
- Any help or tips for Neural Networks on Computer Clusters
I would suggest looking into the Julia ecosystem instead of C++. Julia is almost identical to Python in terms of how you use it, but it's still very fast. You should look into the Flux.jl package for Julia.
- [D] Why are we stuck with Python for something that requires so much speed and parallelism (neural networks)?
Give Julia a try: https://fluxml.ai
- Deep Learning With Flux: Loss Doesn't Converge
2) Flux treats softmax a little differently than most other activation functions, such as relu and sigmoid (see here for more details). When you pass an activation function into a layer like Dense(3, 32, relu), Flux expects the function to be broadcast over the layer's output. However, softmax cannot be broadcast, as it operates over vectors rather than scalars. This means that if you want to use softmax as the final activation in your model, you need to pass it into Chain() like so:
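A minimal sketch of such a Chain, reusing the layer sizes from the excerpt (the output width of 2 is an assumption for illustration):

```julia
using Flux

# relu is a scalar function, so Flux broadcasts it over the Dense
# layer's output; softmax maps a whole vector to a probability
# vector, so it goes into Chain as its own step instead.
model = Chain(
    Dense(3, 32, relu),   # activation broadcast elementwise
    Dense(32, 2),
    softmax,              # applied to the full output vector
)

y = model(rand(Float32, 3))
sum(y)  # ≈ 1, since softmax normalizes the output
```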
What are some alternatives?
mediapipe - Cross-platform, customizable ML solutions for live and streaming media.
Lux.jl - Elegant and Performant Scientific Machine Learning in Julia
tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️
Knet.jl - Koç University deep learning framework.
Pandas - Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more
Transformers.jl - Julia Implementation of Transformer models
tensorflow - An Open Source Machine Learning Framework for Everyone
flax - Flax is a neural network library for JAX that is designed for flexibility.
Deep Java Library (DJL) - An Engine-Agnostic Deep Learning Framework in Java
Torch.jl - Sensible extensions for exposing torch in Julia.
CNTK - Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit
model-zoo - Please do not feed the models