ForwardDiff.jl
PyTorch
| | ForwardDiff.jl | PyTorch |
|---|---|---|
| Mentions | 4 | 338 |
| Stars | 854 | 78,016 |
| Growth | 1.4% | 2.7% |
| Activity | 5.7 | 10.0 |
| Latest commit | 22 days ago | about 11 hours ago |
| Language | Julia | Python |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ForwardDiff.jl
-
The Elements of Differentiable Programming
You seem somewhat obsessed with the idea that reverse-mode autodiff is not the same technique as forward-mode autodiff. It makes you... angry? Seems like such a trivial thing to act a complete fool over.
What's up with that?
Anyway, here's a forward differentiation package with a file that might interest you
https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/...
-
Excited for Julia v1.9
Just so you know, v1.9 doesn't solve the load problems. What it does is give package authors the tools to solve them, specifically precompilation to binaries and package extensions. It won't actually solve the load problems until packages are updated to make effective use of these features. This is already underway with things like https://sciml.ai/news/2022/09/21/compile_time/ and https://github.com/JuliaDiff/ForwardDiff.jl/pull/625, but it is a fairly heavy lift to ensure things aren't invalidating and that everything necessary is precompiling.
-
Looking for numerical/iterative approach for determining a value
As a quick way to do it, you can use ForwardDiff.jl to compute the partial derivative with respect to h, then use a Newton-Raphson iteration to solve for the value of h; a sketch follows below. I'm not familiar with the actual problem you're solving, so there may be more appropriate methods depending on the shape of your function, but this is my knee-jerk reaction to a problem like this. You could also derive the partial derivative analytically if you prefer.
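A minimal sketch of that recipe, written here in Python with PyTorch's autodiff for illustration (the function f below is a made-up stand-in for the actual problem; in Julia, ForwardDiff.derivative plays the same role):

```python
import torch

def f(h):
    # Hypothetical residual whose root in h we want; a stand-in for the
    # poster's actual function.
    return h**3 - 2.0 * h - 5.0

def newton(f, h0, tol=1e-10, maxiter=50):
    h = torch.tensor(h0, dtype=torch.float64, requires_grad=True)
    for _ in range(maxiter):
        y = f(h)
        if abs(y.item()) < tol:
            break
        (dy,) = torch.autograd.grad(y, h)  # df/dh via autodiff
        with torch.no_grad():
            h -= y / dy                    # Newton step: h <- h - f(h)/f'(h)
    return h.item()

print(newton(f, 2.0))  # converges to ~2.0945515 for this made-up f
```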
-
Question About Numerical Derivatives/Gradients: Why has no one yet implemented a gradient function in Julia that is similar to the gradient function in MATLAB and NumPy?
The only pertinent discussions I could find are https://github.com/JuliaDiff/ForwardDiff.jl/issues/390 and https://discourse.julialang.org/t/differentiation-without-explicit-function-np-gradient/57784, and in neither did anybody suggest FiniteDiff.jl's finite-differencing gradient for getting the numerical derivatives/gradients of an array of values. The answer is always either the diff() function or Interpolations.jl, and I already explained in the post why I want an alternative to those two options that doesn't require calling NumPy's gradient function.
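For concreteness, here is a sketch (in Python, with made-up sample data) of the kind of gradient-of-an-array function being asked for, using the stencil np.gradient documents: second-order central differences in the interior and one-sided differences at the edges. A Julia version would follow the same stencil.

```python
import numpy as np

def gradient1d(y, dx=1.0):
    # Central differences in the interior, one-sided at the two edges,
    # mirroring np.gradient's default scheme for uniformly spaced samples.
    y = np.asarray(y, dtype=float)
    g = np.empty_like(y)
    g[1:-1] = (y[2:] - y[:-2]) / (2.0 * dx)
    g[0] = (y[1] - y[0]) / dx
    g[-1] = (y[-1] - y[-2]) / dx
    return g

dx = np.pi / 99
y = np.sin(np.linspace(0.0, np.pi, 100))
print(np.allclose(gradient1d(y, dx), np.gradient(y, dx)))  # True
```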
PyTorch
-
Einsum in 40 Lines of Python
PyTorch also has some support for them, but it is quite incomplete and has enough issues that it is basically unusable, and its future development is unclear. https://github.com/pytorch/pytorch/issues/60832
-
Library for Machine learning and quantum computing
TensorFlow
-
My Favorite DevTools to Build AI/ML Applications!
TensorFlow, developed by Google, and PyTorch, developed by Facebook, are two of the most popular frameworks for building and training complex machine learning models. TensorFlow is known for its flexibility and robust scalability, making it suitable for both research prototypes and production deployments. PyTorch is praised for its ease of use, simplicity, and dynamic computational graph that allows for more intuitive coding of complex AI models. Both frameworks support a wide range of AI models, from simple linear regression to complex deep neural networks.
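To make the "dynamic computational graph" claim concrete, here is a minimal, hypothetical sketch: ordinary Python control flow picks the computation per input, and autograd differentiates whichever branch actually ran.

```python
import torch

def f(x):
    # Data-dependent branching, decided at runtime rather than at
    # graph-construction time.
    if x.sum() > 0:
        return (x * x).sum()
    return (-x).sum()

x = torch.tensor([3.0], requires_grad=True)
f(x).backward()
print(x.grad)  # tensor([6.]) because the x*x branch was taken
```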
-
penzai: JAX research toolkit for building, editing, and visualizing neural nets
> does PyTorch have a similar concept
Of course: https://github.com/pytorch/pytorch/blob/main/torch/utils/_py...
-
Tinygrad: Hacked 4090 driver to enable P2P
FYI, this should work on most 40xx cards.[1]
[1] https://github.com/pytorch/pytorch/issues/119638#issuecommen...
-
The Elements of Differentiable Programming
Sure, right here: https://github.com/pytorch/pytorch/blob/main/torch/autograd/...
Here's the documentation: https://pytorch.org/tutorials/intermediate/forward_ad_usage....
> When an input, which we call “primal”, is associated with a “direction” tensor, which we call “tangent”, the resultant new tensor object is called a “dual tensor” for its connection to dual numbers[0].
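A small sketch of the dual-tensor API that quote describes, following the linked forward-mode AD tutorial (the example values are made up):

```python
import torch
import torch.autograd.forward_ad as fwAD

primal = torch.tensor([1.0, 2.0, 3.0])   # the "primal" input
tangent = torch.tensor([1.0, 0.0, 0.0])  # the "direction" tensor

with fwAD.dual_level():
    dual = fwAD.make_dual(primal, tangent)  # the "dual tensor"
    out = (dual * dual).sum()               # f(x) = sum(x^2)
    jvp = fwAD.unpack_dual(out).tangent     # directional derivative grad(f) . tangent

print(jvp)  # tensor(2.), since grad(f) = 2x = [2, 4, 6] dotted with [1, 0, 0]
```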
-
Functions and operators for Dot and Matrix multiplication and Element-wise calculation in PyTorch
My post explains dot, matrix, and element-wise multiplication in PyTorch.
-
Dot vs Matrix vs Element-wise multiplication in PyTorch
In PyTorch with @, dot() or matmul():
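The excerpt is cut off here; a minimal sketch of what those three operations look like, with made-up example tensors:

```python
import torch

v = torch.tensor([1.0, 2.0, 3.0])
w = torch.tensor([4.0, 5.0, 6.0])
A = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
B = torch.tensor([[5.0, 6.0], [7.0, 8.0]])

print(torch.dot(v, w))  # dot product of 1-D tensors -> tensor(32.)
print(v @ w)            # @ on two 1-D tensors is also the dot product
print(A @ B)            # matrix multiplication, same as torch.matmul(A, B)
print(A * B)            # element-wise (Hadamard) product, same as torch.mul(A, B)
```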
-
Building a GPT Model from the Ground Up!
    import torch  # we use PyTorch: https://pytorch.org
    data = torch.tensor(encode(text), dtype=torch.long)
    print(data.shape, data.dtype)
    print(data[:1000])  # the 1000 characters we looked at earlier will to the GPT look like this
-
Open Source Ascendant: The Transformation of Software Development in 2024
AI's Open Embrace
Artificial intelligence (AI) and machine learning (ML) are increasingly leveraging open-source frameworks like TensorFlow [https://www.tensorflow.org/] and PyTorch [https://pytorch.org/]. This democratization of AI tools is driving innovation and lowering entry barriers across industries.
What are some alternatives?
Zygote.jl - 21st century AD
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
FiniteDiff.jl - Fast non-allocating calculations of gradients, Jacobians, and Hessians with sparsity support
mediapipe - Cross-platform, customizable ML solutions for live and streaming media.
Enzyme.jl - Julia bindings for the Enzyme automatic differentiator
Apache Spark - A unified analytics engine for large-scale data processing
ChainRules.jl - forward and reverse mode automatic differentiation primitives for Julia Base + StdLibs
flax - Flax is a neural network library for JAX that is designed for flexibility.
NBodySimulator.jl - A differentiable simulator for scientific machine learning (SciML) with N-body problems, including astrophysical and molecular dynamics
tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️ [Moved to: https://github.com/tinygrad/tinygrad]
Tullio.jl - ⅀
Pandas - Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more