ForwardDiff.jl vs julia

| | ForwardDiff.jl | julia |
|---|---|---|
| Mentions | 4 | 351 |
| Stars | 856 | 44,622 |
| Growth | 0.8% | 0.7% |
| Activity | 5.7 | 10.0 |
| Last commit | 22 days ago | 6 days ago |
| Language | Julia | Julia |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ForwardDiff.jl
-
The Elements of Differentiable Programming
You seem somewhat obsessed with the idea that reverse-mode autodiff is not the same technique as forward-mode autodiff. It makes you... angry? Seems like such a trivial thing to act a complete fool over.
What's up with that?
Anyway, here's a forward differentiation package with a file that might interest you
https://github.com/JuliaDiff/ForwardDiff.jl/blob/master/src/...
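For anyone following along, here is a minimal sketch (not ForwardDiff.jl's actual implementation) of the dual-number idea forward mode is built on: the derivative rides along with the value in a single forward pass, with no tape or backward sweep as in reverse mode.

```julia
# Toy forward-mode AD with dual numbers (illustrative, not ForwardDiff's Dual type).
struct Dual
    val::Float64   # primal value
    der::Float64   # derivative with respect to the input
end

Base.:+(a::Dual, b::Dual) = Dual(a.val + b.val, a.der + b.der)
Base.:*(a::Dual, b::Dual) = Dual(a.val * b.val, a.der * b.val + a.val * b.der)

f(x) = x * x + x
f(Dual(3.0, 1.0))   # Dual(12.0, 7.0): f(3) = 12, f'(3) = 7
```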
-
Excited for Julia v1.9
Just so you know, v1.9 doesn't solve the load problems. What it does is give package authors the tools to solve the problems, specifically precompilation as binaries and package extensions. It won't actually solve the load problems until the packages are updated to effectively make use of these features. This is already underway with things like https://sciml.ai/news/2022/09/21/compile_time/ and https://github.com/JuliaDiff/ForwardDiff.jl/pull/625, but it is a fairly heavy lift to ensure things aren't invalidating and that everything that's necessary is precompiling.
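As a rough illustration of the kind of package-side change this involves (using the precompile-workload pattern from PrecompileTools.jl; the module and function names here are made up):

```julia
# Hypothetical package using a precompile workload: the representative call
# below runs while the package precompiles, so its compiled code is cached
# (natively on 1.9+) and the first call at runtime is fast.
module MyPackage

using PrecompileTools

solve(x) = 2x + 1   # stand-in for the package's real entry point

@setup_workload begin
    @compile_workload begin
        solve(1.0)   # exercised during precompilation
    end
end

end # module
```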
-
Looking for numerical/iterative approach for determining a value
As a quick way to do it, you can use ForwardDiff.jl to determine the partial with respect to h. Then use a Newton-Raphson algorithm to solve for the value of h. I'm not familiar with the actual problem you're solving so there may be more appropriate ways to solve this based on the shape of your function, but this is my knee-jerk reaction to a problem like this. You could also calculate the partial derivative analytically if that is something that you want.
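Something like this rough sketch, say, where f is the residual as a function of h and ForwardDiff supplies the derivative (the function and variable names are just for illustration):

```julia
using ForwardDiff

# Newton-Raphson on a scalar residual f(h) = 0, with the derivative
# supplied by forward-mode AD instead of an analytic expression.
function newton_solve(f, h0; tol=1e-10, maxiter=50)
    h = h0
    for _ in 1:maxiter
        fh = f(h)
        abs(fh) < tol && return h
        dfh = ForwardDiff.derivative(f, h)   # df/dh at the current iterate
        h -= fh / dfh                        # Newton update
    end
    return h
end

# Example: solve h^3 - 2 = 0 starting from h = 1
newton_solve(h -> h^3 - 2, 1.0)   # ≈ 1.2599
```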
-
Question About Numerical Derivatives/Gradients: Why has no one yet implemented a gradient function in Julia that is similar to the gradient function in MATLAB and NumPy?
In these discussions, which are the most pertinent ones I could find, https://github.com/JuliaDiff/ForwardDiff.jl/issues/390 and https://discourse.julialang.org/t/differentiation-without-explicit-function-np-gradient/57784, nobody suggested FiniteDiff.jl's finite-differencing gradient for getting the numerical derivatives/gradients of an array of values. The answers were either the diff() function or Interpolations.jl, and I already explained in the post why I would want an alternative to those two options, without having to call NumPy's gradient function.
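For concreteness, this is roughly what a MATLAB/NumPy-style gradient of sampled values looks like if you write it by hand (central differences in the interior, one-sided at the endpoints); it is only a sketch, not FiniteDiff.jl's API:

```julia
# np.gradient-style numerical derivative of samples y taken at points x:
# one-sided differences at the endpoints, central differences in the interior.
function numgradient(y::AbstractVector, x::AbstractVector)
    n = length(y)
    g = similar(y, float(eltype(y)))
    g[1] = (y[2] - y[1]) / (x[2] - x[1])               # forward difference
    g[n] = (y[n] - y[n-1]) / (x[n] - x[n-1])           # backward difference
    for i in 2:n-1
        g[i] = (y[i+1] - y[i-1]) / (x[i+1] - x[i-1])   # central difference
    end
    return g
end

x = 0:0.1:1
numgradient(sin.(x), collect(x))   # ≈ cos.(x)
```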
-
Top Paying Programming Technologies 2024
34. Julia - $74,963
-
Optimize sgemm on RISC-V platform
I don't believe there is any official documentation on this, but https://github.com/JuliaLang/julia/pull/49430, for example, added prefetching to the GC's marking phase, which saw speedups on x86 but not on M1.
-
Dart 3.3
1. don't dispatch at all
2. dispatch on the first argument only
3. dispatch on all the arguments
the first solution is clean, but people really like dispatch.
the second makes calling functions in the function call syntax weird, because the first argument is privileged semantically but not syntactically.
the third makes calling functions in the method call syntax weird because the first argument is privileged syntactically but not semantically.
the closest things to this i can think of off the top of my head in remotely popular programming languages are: nim, lisp dialects, and julia.
nim navigates the dispatch conundrum by providing different ways to define free functions for different dispatch-ness. the tutorial gives a good overview: https://nim-lang.org/docs/tut2.html
lisps of course lack UFCS.
see here for a discussion on the lack of UFCS in julia: https://github.com/JuliaLang/julia/issues/31779
so to sum up the answer to the original question: because it's only obvious how to make it nice and tidy like you're wanting if you sacrifice function dispatch, which is ubiquitous for good reason!
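For a concrete picture of option 3, here's a tiny Julia sketch (the types and names are made up) where the method is picked by the types of all arguments, so neither argument is a privileged receiver to hang a `.` call off of:

```julia
# multiple dispatch: the call site looks like a plain function call,
# and the method is chosen by the runtime types of *all* arguments.
struct Cat end
struct Dog end

meets(::Cat, ::Cat) = "ignores"
meets(::Cat, ::Dog) = "hisses"
meets(::Dog, ::Cat) = "chases"
meets(::Dog, ::Dog) = "sniffs"

meets(Cat(), Dog())   # "hisses"
meets(Dog(), Cat())   # "chases"
# there is no Cat().meets(Dog()) spelling: no argument "owns" the method,
# which is the tension with UFCS discussed in the linked Julia issue.
```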
-
Julia 1.10 Highlights
https://github.com/JuliaLang/julia/blob/release-1.10/NEWS.md
-
Best Programming languages for Data Analysis📊
Visit official site: https://julialang.org/
-
Potential of the Julia programming language for high energy physics computing
No. It runs natively on ARM.
```julia
julia> versioninfo()
Julia Version 1.9.3
Commit bed2cd540a1 (2023-08-24 14:43 UTC)
Build Info:
  Official https://julialang.org/ release
```
-
Rust std:fs slower than Python
https://github.com/JuliaLang/julia/issues/51086#issuecomment...
So while this "fixes" the issue, it'll introduce a confusing time delay between you freeing the memory and you observing that in `htop`.
But according to https://jemalloc.net/jemalloc.3.html you can set `opt.muzzy_decay_ms = 0` to remove the delay.
Still, the musl author has some reservations against making `jemalloc` the default:
https://www.openwall.com/lists/musl/2018/04/23/2
> It's got serious bloat problems, problems with undermining ASLR, and is optimized pretty much only for being as fast as possible without caring how much memory you use.
With the above-mentioned tunables this should be mitigated to some extent, but given jemalloc's general theme of favoring performance over memory usage, it will likely still be a tradeoff, or at best no tradeoff only if you set the tunables to match what you need.
-
Eleven strategies for making reproducible research the norm
I have asked about Julia's reproducibility story on the Guix mailing list in the past, and at the time Simon Tournier didn't think it was promising. I seem to recall Julia itself didn't have a reproducible build. All I know now is that the GitHub issue is still not closed.
https://github.com/JuliaLang/julia/issues/34753
-
Julia as a unifying end-to-end workflow language on the Frontier exascale system
I don't really know what kind of rebuttal you're looking for, but I will link my HN comments from when this was first posted for some thoughts: https://news.ycombinator.com/item?id=31396861#31398796. As I said in the linked post, I'm quite skeptical of the business of trying to assess relative bugginess of programming in different systems, because that has strong dependencies on what you consider core vs packages and what exactly you're trying to do.
However, bugs in general suck and we've been thinking a fair bit about what additional tooling the language could provide to help people avoid the classes of bugs that Yuri encountered in the post.
The biggest class of problems in the blog post is that it's pretty clear that `@inbounds` (and I will extend this to `@assume_effects`, even though that wasn't around when Yuri wrote his post) is problematic, because it's too hard to write correctly. My proposal for what to do instead is at https://github.com/JuliaLang/julia/pull/50641.
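To make the failure mode concrete, here is a minimal sketch (not taken from Yuri's post verbatim; the function names are made up) of why hand-written `@inbounds` is easy to get wrong:

```julia
using OffsetArrays

# @inbounds is only sound if the loop's index range really is in bounds;
# a hard-coded 1:length(xs) assumption breaks for arrays with custom axes,
# and @inbounds removes the bounds check that would have caught the bug.
function mysum(xs)
    s = zero(eltype(xs))
    for i in 1:length(xs)      # wrong for non-1-based arrays
        @inbounds s += xs[i]   # may silently read out of bounds
    end
    return s
end

safe_sum(xs) = sum(xs[i] for i in eachindex(xs))   # eachindex respects custom axes

v = OffsetArray([1, 2, 3], -1)   # valid indices are 0:2, not 1:3
mysum(v)       # undefined behavior: skips v[0] and reads past the end
safe_sum(v)    # 6
```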
Another common theme is that while Julia is great at composition, it's not clear what's expected to work and what isn't, because the interfaces are informal and not checked. This is a hard design problem, because it's quite close to the reasons why Julia works well. My current thoughts on that are here: https://github.com/Keno/InterfaceSpecs.jl but there's other proposals also.
-
Getaddrinfo() on glibc calls getenv(), oh boy
Doesn't musl have the same issue? https://github.com/JuliaLang/julia/issues/34726#issuecomment...
I also wonder about OSX's libc. Newer versions seem to have some sort of locking https://github.com/apple-open-source-mirror/Libc/blob/master...
but older versions (from 10.9) don't have any locking: https://github.com/apple-oss-distributions/Libc/blob/Libc-99...
What are some alternatives?
Zygote.jl - 21st century AD
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
FiniteDiff.jl - Fast non-allocating calculations of gradients, Jacobians, and Hessians with sparsity support
NetworkX - Network Analysis in Python
Enzyme.jl - Julia bindings for the Enzyme automatic differentiator
Lua - Lua is a powerful, efficient, lightweight, embeddable scripting language. It supports procedural programming, object-oriented programming, functional programming, data-driven programming, and data description.
ChainRules.jl - forward and reverse mode automatic differentiation primitives for Julia Base + StdLibs
rust-numpy - PyO3-based Rust bindings of the NumPy C-API
NBodySimulator.jl - A differentiable simulator for scientific machine learning (SciML) with N-body problems, including astrophysical and molecular dynamics
Numba - NumPy aware dynamic Python compiler using LLVM
Tullio.jl - ⅀
F# - Please file issues or pull requests here: https://github.com/dotnet/fsharp