DiffEqOperators.jl vs julia

| | DiffEqOperators.jl | julia |
|---|---|---|
| Mentions | 3 | 361 |
| Stars | 281 | 46,112 |
| Growth | - | 0.5% |
| Activity | 4.6 | 10.0 |
| Latest Commit | over 1 year ago | 5 days ago |
| Language | Julia | Julia |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
DiffEqOperators.jl
-
Julia 1.7 has been released
>I hope those benchmarks are coming in hot
M1 is extremely good for PDEs because of its large cache lines.
https://github.com/SciML/DiffEqOperators.jl/issues/407#issue...
The JuliaSIMD tools, which we use internally for BLAS operations instead of OpenBLAS and MKL (because they tend to outperform standard BLAS libraries for the operations we use: https://github.com/YingboMa/RecursiveFactorization.jl/pull/2...), also generate good code for M1, so that gave us some powerful use cases right off the bat, even before the heroics that allowed C/Fortran compilers to fully work on M1.
-
Why are NonlinearSolve.jl and DiffEqOperators.jl incompatible with the latest versions of ModelingToolkit and Symbolics!!!? Symbolics and ModelingToolkit are heavily downgraded when those packages are added.
(b) DiffEqOperators.jl is being worked on https://github.com/SciML/DiffEqOperators.jl/pull/467 .
-
What's Bad about Julia?
I like that they are colored now, but what really needs to be added is type-parameter collapsing. In most cases, you want to see `::Dual{...}`, i.e. "it's a dual number", not `::Dual{typeof(ODESolution{sfjeoisjfsfsjslikj},sfsef,sefs}` (these can literally reach 3000 characters). For an example, see the stacktraces in something like https://github.com/SciML/DiffEqOperators.jl/issues/419 . The thing is, this prints more type information than the strictest dispatch uses: no function dispatches off that first 3000-character type parameter, so printing that chunk of information is not informative to any method decision. Automated type abbreviations could take that heuristic and chop out a lot of the cruft.
- I Chose Common Lisp
-
Stressify.jl Performance Testing
```
               _
   _       _ _(_)_     |  Documentation: https://docs.julialang.org
  (_)     | (_) (_)    |
   _ _   _| |_  __ _   |  Type "?" for help, "]?" for Pkg help.
  | | | | | | |/ _` |  |
  | | |_| | | | (_| |  |  Version 1.11.2 (2024-12-01)
 _/ |\__'_|_|_|\__'_|  |  Official https://julialang.org/ release
|__/                   |

julia>
```
-
Julia Emerges as Powerful New Language for Scientific Machine Learning, Rivaling Python and MATLAB
The paper examines the current state of the Julia programming language for scientific machine learning (SML). Julia is a relatively new language that is designed to be fast, easy to use, and well-suited for scientific and numerical computing.
-
A Comprehensive Guide to Training a Simple Linear Regression Model in Julia
Download and Install Julia: Head over to https://julialang.org/ and download the appropriate installer for your operating system. Follow the installation instructions.
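As a taste of where such a guide heads after installation, here is a minimal sketch (not taken from the guide itself) of fitting a line by ordinary least squares using only built-in operations:

```julia
# Minimal ordinary-least-squares fit: y ≈ β[1] + β[2]*x (illustrative sketch).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = 2 .* x .+ 1                  # synthetic data with slope 2, intercept 1
X = [ones(length(x)) x]          # design matrix: intercept column plus x
β = X \ y                        # backslash solves the least-squares problem
```

With exactly linear data like this, `β` comes out as approximately `[1.0, 2.0]` (intercept, slope).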
-
If you are starting in AI field ...
The above two steps are only for warming up; now you need to start coding in a programming language. Most of the AI community uses Python, but there are other options: Julia, which is similar to Python but faster, and R, which is used for statistical analysis and data visualization. Just try to learn one programming language along with Data Structures and Algorithms (DSA) and Object-Oriented Programming (OOP) concepts.
-
What Every Developer Should Know About GPU Computing (2023)
If you are not writing the GPU kernel, just use a high level language which wraps up the CUDA, Metal, or whatever.
https://julialang.org
-
Julia 1.11 Highlights
It also turns out that it allows a bunch more compiler optimizations to be implemented with a lot less pain. I got very nerd-sniped on this this week, leading to https://github.com/JuliaLang/julia/pull/56030 and https://github.com/JuliaLang/julia/pull/55913, which allow allocation removal in a number of cases and save ~4ns (~10ns -> ~6ns for `Memory{Int8}(undef, 4)`) when constructing Memory objects.
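For context, `Memory` is the low-level fixed-length buffer type introduced in Julia 1.11 whose constructor the PRs above speed up; a minimal sketch of using it (requires Julia ≥ 1.11):

```julia
# Memory{T}(undef, n) allocates an uninitialized fixed-length buffer.
m = Memory{Int8}(undef, 4)
fill!(m, 0)        # Memory supports the usual AbstractVector operations
length(m)          # fixed at construction time: 4
```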
-
JuliaLang: Performance Prowess or Just Smoke and Mirrors? Unveiling the Real Story
Julia, renowned for its speed and efficiency in scientific computing, has caught the eye of many in the data science world. We were eager to find out if there's real power behind the hype. Curious about whether JuliaLang lives up to its reputation as the sprinter of the programming world?
-
From Julia to Rust
> are not an issue with Julia (eg memory safety)
Note that Julia does allow memory unsafety, for example you can mark array accesses with `@inbounds` to remove bound checks, kinda like how you can use `unsafe` in Rust except it looks much less scary.
It also doesn't help that the official example of how to use it safely was actually not safe [1]. Granted, this is just a single example and it has since been fixed, but it doesn't give a good impression of their mindset when dealing with memory safety.
More generally, there doesn't seem to be a strong mindset for correctness either. See [2] for a collection of such issues.
[1]: https://github.com/JuliaLang/julia/issues/39367
[2]: https://yuri.is/not-julia/
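The `@inbounds` tradeoff described above can be sketched as follows (the function names are made up for illustration):

```julia
# Bounds-checked by default: v[i] throws BoundsError if i is out of range.
function sum_checked(v)
    s = zero(eltype(v))
    for i in eachindex(v)
        s += v[i]
    end
    return s
end

# @inbounds elides the checks; an out-of-range index here is undefined
# behavior, so the caller must guarantee every index is valid.
function sum_unchecked(v)
    s = zero(eltype(v))
    @inbounds for i in eachindex(v)
        s += v[i]
    end
    return s
end
```

Iterating with `eachindex` is the idiomatic way to make that promise actually hold; the danger arises when `@inbounds` is combined with hand-written index arithmetic.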
-
Let's Implement Overloading/Multiple-Dispatch
A couple of years ago, I came across a language called Julia. Its multiple dispatch feature was very interesting; I wanted to know how it worked under the hood, but I didn't have the knowledge to do that yet. So here I am, finally giving it a try. Now that I have an implementation, I realize there is nothing tying this algorithm to runtime dispatch; I think it could be used in a language with static dispatch as well. If you're interested in learning about multiple dispatch, I left some links at the end of the post. So I guess this post is just about selecting the most specific function for a given set of arguments in a language with subtyping. OK, let's get started.
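The specificity selection the post describes is what Julia's method tables do natively; a minimal sketch with hypothetical types:

```julia
abstract type Shape end
struct Circle <: Shape end
struct Square <: Shape end

# Three methods of one generic function; dispatch considers ALL argument types,
# not just the first, and picks the most specific applicable method.
collide(a::Shape,  b::Shape)  = "shape/shape"
collide(a::Circle, b::Shape)  = "circle/shape"
collide(a::Circle, b::Square) = "circle/square"

collide(Circle(), Square())   # selects the circle/square method
```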
What are some alternatives?
Gridap.jl - Grid-based approximation of partial differential equations in Julia
jax - Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
BoundaryValueDiffEq.jl - Boundary value problem (BVP) solvers for scientific machine learning (SciML)
NetworkX - Network Analysis in Python
FourierFlows.jl - Tools for building fast, hackable, pseudospectral partial differential equation solvers on periodic domains
Lua - Lua is a powerful, efficient, lightweight, embeddable scripting language. It supports procedural programming, object-oriented programming, functional programming, data-driven programming, and data description.
SciMLTutorials.jl - Tutorials for doing scientific machine learning (SciML) and high-performance differential equation solving with open source software.
rust-numpy - PyO3-based Rust bindings of the NumPy C-API
ApproxFun.jl - Julia package for function approximation
Numba - NumPy aware dynamic Python compiler using LLVM
ReservoirComputing.jl - Reservoir computing utilities for scientific machine learning (SciML)
Nim - Nim is a statically typed compiled systems programming language. It combines successful concepts from mature languages like Python, Ada and Modula. Its design focuses on efficiency, expressiveness, and elegance (in that order of priority).