| | stan | jax |
|---|---|---|
| Mentions | 44 | 86 |
| Stars | 2,609 | 30,660 |
| Growth | 0.6% | 0.9% |
| Activity | 9.5 | 10.0 |
| Latest commit | 6 days ago | 4 days ago |
| Language | C++ | Python |
| License | BSD 3-clause "New" or "Revised" License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
stan
- Stan: Statistical modeling and high-performance statistical computation
- Elevate Your Python Skills: Machine Learning Packages That Transformed My Journey as an ML Engineer
Alternatives: stan and edward
- How often do you see Bayesian Statistics or Stan in the DS world? Essential skill or a nice to have?
- Rstan Package in ATPA
remove.packages(c("StanHeaders", "rstan"))  # remove any existing installations first
install.packages("rstan", repos = c("https://mc-stan.org/r-packages/", getOption("repos")))  # reinstall from the Stan repository
- [Q] Is there a method for adding random effects to an interval-censored time-to-event model?
My approach to problems like this is to write down the proposed model mathematically first, in extreme detail. I find hierarchical form to be the easiest way to break it down piece by piece. Once I have the maths, I turn it into a Stan model. The last step is to use the Stan output to answer the research questions.
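A minimal sketch of what that hierarchical form might look like, assuming an exponential hazard and a single grouping factor (all symbols here are illustrative, not the commenter's actual model):

$$
\begin{aligned}
T_{ij} &\sim \operatorname{Exponential}(\lambda_{ij}), \quad \text{observed only as } L_{ij} < T_{ij} \le U_{ij} \\
\log \lambda_{ij} &= \alpha + b_j, \quad b_j \sim \operatorname{Normal}(0, \sigma^2)
\end{aligned}
$$

Each interval-censored observation then contributes $\Pr(L_{ij} < T_{ij} \le U_{ij}) = F(U_{ij}; \lambda_{ij}) - F(L_{ij}; \lambda_{ij})$ to the likelihood, which in Stan maps onto a `log_diff_exp` of the two log-CDFs.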
- HELP Conjugate Priors in Bayesian Regression in SPSS
Here is a good breakdown of recommendations from Andrew Gelman.
- Demand Planning
For instance, my first choice in these cases is always a Bayesian inference tool like Stan. In my experience as someone who's more of a programmer than a mathematician/statistician, Bayesian tools like this make it much easier to not accidentally fool yourself with assumptions, and they can be pretty good at catching statistical mistakes.
- What do actual ML engineers think of ChatGPT?
I tend to be most impressed by tools and libraries. What has most impressed me in my time in ML are tools like PyTorch and Stan: tools that allow expression of a wide variety of statistical models (and ML and DL models, if you believe there's a distinction) and inference from those models. These are the things that have had the largest effect on my own work, not just in the sense of using these tools, but in learning from their design and emulating what makes them successful.
- ChatGPT4 writes Stan code so I don’t have to
- How to get started learning modern AI?
Oh, it's certainly used in practice. You should look into frameworks like Stan [1] and Pyro [2]. I think Bayesian models are seen as more explainable, so they get used in industries that value that sort of thing.
[1] https://mc-stan.org/
jax
- KlongPy: High-Performance Array Programming in Python
If you like high-performance array programming a la "numpy with JIT", I suggest looking at JAX. It's very suitable for general numeric computing (not just ML) and has a very mature ecosystem.
https://github.com/jax-ml/jax
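A small sketch of that "numpy with JIT" experience (the function below is illustrative, not from the comment):

```python
# Illustrative sketch: jax.numpy as a near drop-in for numpy, plus jit.
import jax
import jax.numpy as jnp

def pairwise_distances(points):
    # Ordinary numpy-style broadcasting; nothing ML-specific here.
    diffs = points[:, None, :] - points[None, :, :]
    return jnp.sqrt((diffs ** 2).sum(-1))

# jit compiles the function with XLA on first call; later calls reuse it.
fast_distances = jax.jit(pairwise_distances)

points = jnp.array([[0.0, 0.0], [3.0, 4.0], [6.0, 8.0]])
print(fast_distances(points))  # 3x3 matrix of Euclidean distances
```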
- PyTorch is dead. Long live Jax
Nope, changing graph shape requires recompilation: https://github.com/google/jax/discussions/17191
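A minimal sketch of what that means in practice, assuming current jax.jit semantics: compilation is cached per input shape, so a new shape triggers a new trace and compile.

```python
import jax
import jax.numpy as jnp

@jax.jit
def total(x):
    # This print runs only while JAX traces the function, i.e. on each
    # (re)compilation, not on cached calls.
    print("tracing for shape", x.shape)
    return x.sum()

total(jnp.ones(3))   # traces and compiles for shape (3,)
total(jnp.ones(3))   # cache hit: no tracing, no compilation
total(jnp.ones(4))   # new shape -> traced and compiled again
```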
- cuDF – GPU DataFrame Library
- Rebuilding TensorFlow 2.8.4 on Ubuntu 22.04 to patch vulnerabilities
I found a GitHub issue that seemed similar (missing ptxas) and saw a suggestion to install nvidia-cuda-toolkit. Alright: but that exploded the container size from 6.5 GB to 12.13 GB … unacceptable 😤 (Incidentally, this is too large for Cloud Shell to build on its limited persistent disk.)
- The Elements of Differentiable Programming
The dual numbers exist just as surely as the real numbers and have been in use for well over 100 years.
https://en.m.wikipedia.org/wiki/Dual_number
Pytorch has had them for many years.
https://pytorch.org/docs/stable/generated/torch.autograd.for...
JAX implements them and uses them exactly as stated in this thread.
https://github.com/google/jax/discussions/10157#discussionco...
As you so eloquently stated, "you shouldn't be proclaiming things you don't actually know on a public forum," and doubly so when your claimed "corrections" are so demonstrably and totally incorrect.
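For reference, a small sketch of JAX's forward-mode interface, jax.jvp, which propagates a (primal, tangent) pair in exactly the dual-number style discussed above (the function f is just an example):

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x**2

# Push the tangent v = 1.0 through f at x = 1.0: one forward pass yields
# both f(x) and the directional derivative, like dual-number arithmetic.
primal, tangent = jax.jvp(f, (1.0,), (1.0,))
print(primal)   # sin(1) * 1^2
print(tangent)  # cos(1) * 1^2 + 2 * 1 * sin(1)
```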
- Julia GPU-based ODE solver 20x-100x faster than those in Jax and PyTorch
On your last point, as long as you jit the topmost level, it doesn't matter whether or not you have inner jitted functions. The end result should be the same.
Source: https://github.com/google/jax/discussions/5199#discussioncom...
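A quick sketch of that point: an inner jax.jit is simply inlined when traced from a jitted caller, so the outermost jit defines the compiled unit.

```python
import jax
import jax.numpy as jnp

@jax.jit
def inner(x):
    return jnp.tanh(x) + 1.0

@jax.jit
def outer(x):
    # When outer is traced, inner is inlined into the same XLA program;
    # removing inner's decorator would not change the end result.
    return inner(x) * inner(-x)

print(outer(jnp.arange(4.0)))
```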
- Apple releases MLX for Apple Silicon
The design of MLX is inspired by frameworks like NumPy, PyTorch, Jax, and ArrayFire.
- MLPerf training tests put Nvidia ahead, Intel close, and Google well behind
I'm still not totally sure what the issue is. Jax uses program transformations to compile programs to run on a variety of hardware, for example using XLA for TPUs. It can also run CUDA ops on Nvidia GPUs without issue: https://jax.readthedocs.io/en/latest/installation.html
There is also support for custom cpp and cuda ops if that's what is needed: https://jax.readthedocs.io/en/latest/Custom_Operation_for_GP...
I haven't worked with float4, but can imagine that new numerical types would require some special handling. But I assume that's the case for any ml environment.
But really you probably mean fixed-point 4-bit integer types? It looks like at least some work has been done on that in Jax: https://github.com/google/jax/issues/8566
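For the curious, a sketch of inspecting that compilation pipeline via jax.jit's ahead-of-time API (lower/compile), assuming a recent JAX version:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.dot(x, x) + 1.0

x = jnp.arange(4.0)
lowered = jax.jit(f).lower(x)    # stage the program out to XLA's input IR
print(lowered.as_text()[:400])   # portable IR, independent of the backend
compiled = lowered.compile()     # backend-specific (CPU/GPU/TPU) compile
print(compiled(x))
```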
- MatX: Efficient C++17 GPU numerical computing library with Python-like syntax
> Are they even comparing apples to apples to claim that they see these improvements over NumPy?
> While the code complexity and length are roughly the same, the MatX version shows a 2100x over the Numpy version, and over 4x faster than the CuPy version on the same GPU.
NumPy doesn't use the GPU by default unless you use something like Jax [1] to compile NumPy code to run on GPUs. I think a more honest comparison would run MatX on the same CPU as NumPy, and focus the GPU comparison against CuPy.
[1] https://github.com/google/jax
- JAX – NumPy on the CPU, GPU, and TPU, with great automatic differentiation
Actually that never changed. The README has always had an example of differentiating through native Python control flow:
https://github.com/google/jax/commit/948a8db0adf233f333f3e5f...
The constraints on control flow expressions come from jax.jit (because Python control flow can't be staged out) and jax.vmap (because we can't take multiple branches of Python control flow, which we might need to do for different batch elements). But autodiff of Python-native control flow works fine!
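A minimal sketch of that distinction (the function is illustrative):

```python
import jax

def f(x):
    # Native Python control flow: fine under jax.grad, since the primal
    # values are concrete during differentiation.
    while x < 10.0:
        x = x * 2.0
    return x ** 2 if x > 15.0 else x

print(jax.grad(f)(3.0))  # differentiates along the branch actually taken
# jax.jit(f) would fail here: under jit, `x < 10.0` is an abstract tracer
# with no concrete boolean value, so it can't be staged out.
```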
What are some alternatives?
PyMC - Bayesian Modeling and Probabilistic Programming in Python
Numba - NumPy aware dynamic Python compiler using LLVM
rstan - RStan, the R interface to Stan
functorch - functorch is JAX-like composable function transforms for PyTorch.
brms - brms R package for Bayesian generalized multivariate non-linear multilevel models using Stan
julia - The Julia Programming Language
Elo-MMR - Skill estimation systems for multiplayer competitions
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
probability - Probabilistic reasoning and statistical analysis in TensorFlow
Cython - The most widely used Python to C compiler
rnim - A bridge between R and Nim
jax-windows-builder - A community supported Windows build for jax.