jax vs julia

| | jax | julia |
|---|---|---|
| Mentions | 89 | 366 |
| Stars | 31,945 | 46,834 |
| Growth (stars, month over month) | 1.6% | 0.7% |
| Activity | 10.0 | 10.0 |
| Latest commit | 2 days ago | 2 days ago |
| Language | Python | Julia |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
jax
- I want a good parallel computer
- Show HN: Localscope–Limit scope of Python functions for reproducible execution
localscope is a small Python package that disassembles functions to check if they access global variables they shouldn't. I wrote this a few years ago to detect scope bugs which are common in Jupyter notebooks. It's recently come in handy writing jax code (https://github.com/jax-ml/jax) because it requires pure functions. Thought I'd share.
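The JAX side of this is easy to demonstrate. Here is a minimal sketch (plain JAX, not localscope itself; names are illustrative) of the scope bug being guarded against: a jitted function that reads a global gets that value baked in at trace time, so later changes are silently ignored.

```python
# Minimal sketch of the scope bug described above (not part of localscope):
# a jitted function silently reads a global while tracing, so later changes
# to that global never reach the compiled code.
import jax
import jax.numpy as jnp

scale = 2.0  # module-level state the function "shouldn't" touch

@jax.jit
def f(x):
    return scale * x  # impure: reads the global `scale`

x = jnp.arange(3.0)
print(f(x))  # [0. 2. 4.] -- traced with scale == 2.0

scale = 10.0
print(f(x))  # still [0. 2. 4.]; the cached trace baked in the old value
```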
- Zest
- KlongPy: High-Performance Array Programming in Python
If you like high-performance array programming a la "numpy with JIT", I suggest looking at JAX. It's very suitable for general numeric computing (not just ML) and has a very mature ecosystem.
https://github.com/jax-ml/jax
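For anyone who hasn't tried it, a tiny sketch of the "numpy with JIT" style (function and variable names are just illustrative): jax.numpy mirrors the NumPy API, and jax.jit compiles the whole function with XLA.

```python
import jax
import jax.numpy as jnp

@jax.jit
def normalize(x):
    # NumPy-style array methods, compiled as one fused XLA computation
    return (x - x.mean()) / x.std()

x = jnp.linspace(0.0, 1.0, 1_000_000)
print(normalize(x)[:5])
```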
- PyTorch is dead. Long live Jax
Nope, changing graph shape requires recompilation: https://github.com/google/jax/discussions/17191
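A small sketch of that behavior, counting traces with a Python side effect that only runs while JAX is tracing (names here are illustrative):

```python
# jax.jit specializes compiled code on input shapes, so a call with a new
# shape triggers a fresh trace and compile.
import jax
import jax.numpy as jnp

trace_count = 0

@jax.jit
def f(x):
    global trace_count
    trace_count += 1  # Python side effect: executes only during tracing
    return jnp.sum(x ** 2)

f(jnp.ones((8,)))
f(jnp.ones((8,)))    # same shape/dtype: served from the compilation cache
f(jnp.ones((16,)))   # new shape: retraced and recompiled
print(trace_count)   # 2
```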
- cuDF – GPU DataFrame Library
- Rebuilding TensorFlow 2.8.4 on Ubuntu 22.04 to patch vulnerabilities
I found a GitHub issue that seemed similar (missing ptxas) and saw a suggestion to install nvidia-cuda-toolkit. Alright: but that exploded the container size from 6.5 GB to 12.13 GB … unacceptable 😤 (Incidentally, this is too large for Cloud Shell to build on its limited persistent disk.)
- The Elements of Differentiable Programming
The dual numbers exist just as surely as the real numbers and have been used for well over 100 years
https://en.m.wikipedia.org/wiki/Dual_number
Pytorch has had them for many years.
https://pytorch.org/docs/stable/generated/torch.autograd.for...
JAX implements them and uses them exactly as stated in this thread.
https://github.com/google/jax/discussions/10157#discussionco...
As you so eloquently stated, "you shouldn't be proclaiming things you don't actually know on a public forum," and doubly so when your claimed "corrections" are so demonstrably and totally incorrect.
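For concreteness, a minimal sketch of JAX's forward-mode API, jax.jvp, which is the dual-number-style evaluation being discussed (the function f is just an illustrative example):

```python
# jax.jvp is forward-mode AD: it pushes a tangent through f alongside the
# primal value, exactly like evaluating f at a dual number a + b*eps.
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x ** 2

x = jnp.float32(1.5)
primal, tangent = jax.jvp(f, (x,), (jnp.float32(1.0),))
print(primal)   # f(1.5)
print(tangent)  # f'(1.5) = cos(1.5)*1.5**2 + 2*1.5*sin(1.5)
```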
- Julia GPU-based ODE solver 20x-100x faster than those in Jax and PyTorch
On your last point, as long as you jit the topmost level, it doesn't matter whether or not you have inner jitted functions. The end result should be the same.
Source: https://github.com/google/jax/discussions/5199#discussioncom...
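A short sketch of that claim (example names are illustrative): an inner jax.jit is simply traced into the outer compiled computation, and the result matches a single top-level jit.

```python
import jax
import jax.numpy as jnp

@jax.jit
def inner(x):
    return jnp.tanh(x) + 1.0

@jax.jit
def outer(x):
    return inner(x) * 2.0  # the inner jit is absorbed into outer's trace

def plain(x):
    return (jnp.tanh(x) + 1.0) * 2.0

x = jnp.linspace(-1.0, 1.0, 5)
print(jnp.allclose(outer(x), jax.jit(plain)(x)))  # True
```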
- Apple releases MLX for Apple Silicon
The design of MLX is inspired by frameworks like NumPy, PyTorch, Jax, and ArrayFire.
julia
- My programming Cruise
- Ask HN: What less-popular systems programming language are you using?
- A data scientist's journey building a B2B data product with Julia and Pluto
In this post, I’m exploring dev tools for data scientists, specifically Julia and Pluto.jl. I interviewed Mandar, a data scientist and software engineer, about his experience adopting Pluto, a reactive notebook environment similar to Jupyter notebooks. What’s different about Pluto is that it’s designed specifically for Julia, a programming language built for scientific computing and machine learning.
- New Horizons for Julia
https://github.com/JuliaLang/julia/issues/57483 yes, yes it should.
- What is Open-Source? Beginners Guide How to Get Started.
Julia Seasons of Contributions (JSoC)
- I Chose Common Lisp
- Stressify.jl Performance Testing
(Julia REPL startup banner: Version 1.11.2 (2024-12-01), official https://julialang.org/ release; Documentation: https://docs.julialang.org; type "?" for help, "]?" for Pkg help; followed by the julia> prompt.)
- Julia Emerges as Powerful New Language for Scientific Machine Learning, Rivaling Python and MATLAB
The paper examines the current state of the Julia programming language for scientific machine learning (SML). Julia is a relatively new language that is designed to be fast, easy to use, and well-suited for scientific and numerical computing.
- A Comprehensive Guide to Training a Simple Linear Regression Model in Julia
Download and Install Julia: Head over to https://julialang.org/ and download the appropriate installer for your operating system. Follow the installation instructions.
- If you are starting in AI field ...
The above two steps are only for warming up; now you need to start coding in a programming language. Most of the AI community uses Python, but there are other options: Julia, which is similar to Python but faster, and R, which is used for statistical analysis and data visualization. Just try to learn one programming language, along with Data Structures and Algorithms (DSA) and Object-Oriented Programming (OOP) concepts.
What are some alternatives?
Numba - NumPy aware dynamic Python compiler using LLVM
Nim - Nim is a statically typed compiled systems programming language. It combines successful concepts from mature languages like Python, Ada and Modula. Its design focuses on efficiency, expressiveness, and elegance (in that order of priority).
dex-lang - Research language for array processing in the Haskell/ML family
Lua - Lua is a powerful, efficient, lightweight, embeddable scripting language. It supports procedural programming, object-oriented programming, functional programming, data-driven programming, and data description.
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration