Flux.jl vs kaleidoscope

| | Flux.jl | kaleidoscope |
|---|---|---|
| Mentions | 22 | 9 |
| Stars | 4,393 | 1,017 |
| Growth | 0.4% | - |
| Activity | 8.7 | 0.0 |
| Latest commit | 3 days ago | about 4 years ago |
| Language | Julia | Haskell |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Flux.jl
- Julia 1.10 Released
- What Apple hardware do I need for CUDA-based deep learning tasks?
If you are really committed to running on Apple hardware, then take a look at TensorFlow for macOS. Another option is the Julia programming language, which has very basic Metal support at a CUDA-like level. FluxML would be the ML framework in Julia. I’m not sure either option will be painless or let you do everything you could do with an Nvidia GPU.
- [D] ClosedAI license, open-source license which restricts only OpenAI, Microsoft, Google, and Meta from commercial use
Flux dominance!
- What would be your programming language of choice to implement a JIT compiler?
I’m no compiler expert, but check out Flux and Zygote: https://fluxml.ai/
- Any help or tips for Neural Networks on Computer Clusters
I would suggest you look into the Julia ecosystem instead of C++. Julia is almost identical to Python in terms of how you use it, but it's still very fast. You should look into the Flux.jl package for Julia.
- [D] Why are we stuck with Python for something that requires so much speed and parallelism (neural networks)?
Give Julia a try: https://fluxml.ai
- Deep Learning With Flux: Loss Doesn't Converge
2) Flux treats softmax a little differently than most other activation functions, such as relu and sigmoid (see here for more details). When you pass an activation function into a layer like Dense(3, 32, relu), Flux expects the function to be broadcast over the layer's output. However, softmax cannot be broadcast, as it operates over vectors rather than scalars. This means that if you want to use softmax as the final activation in your model, you need to pass it into Chain() like so:
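The code that "like so" refers to is cut off in this excerpt; a minimal sketch of the pattern being described (layer sizes follow the Dense(3, 32, relu) example above and are otherwise illustrative) would look like:

```julia
using Flux

# softmax goes in as its own step at the end of the Chain rather than as the
# activation of the last Dense layer, because it acts on the whole output
# vector (normalizing it to probabilities) instead of elementwise.
model = Chain(
    Dense(3, 32, relu),   # elementwise activation, broadcast over the outputs
    Dense(32, 2),         # no activation here
    softmax               # vector-valued final step
)

x = rand(Float32, 3)
model(x)  # two outputs that sum to 1
```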
- “Why I still recommend Julia”
Can you point to a concrete example of one that someone would run into when using the differential equation solvers with the default and recommended Enzyme AD for vector-Jacobian products? I'd be happy to look into it, but there do not currently seem to be any open correctness issues in the Enzyme issue tracker (3 issues are open, but they all seem to be fixed, other than https://github.com/EnzymeAD/Enzyme.jl/issues/278, which is actually an activity analysis bug in LLVM). So please be more specific. The issue with Enzyme right now seems to be more about finding functional forms that compile: it throws compile-time errors in the event that it cannot fully analyze the program or if it has too much dynamic behavior (example: https://github.com/EnzymeAD/Enzyme.jl/issues/368).
Additional note: we recently did an overhaul of SciMLSensitivity (https://sensitivity.sciml.ai/dev/) and set up a system which amounts to 15 hours of direct unit tests doing a combinatoric check of arguments, with 4 hours of downstream testing (https://github.com/SciML/SciMLSensitivity.jl/actions/runs/25...). What that identified is that any remaining issues that can arise are due to the implicit parameters mechanism in Zygote (Zygote.params). To counteract this upstream issue, we (a) default to never using Zygote VJPs whenever we can avoid it (hence defaulting to Enzyme and ReverseDiff first, as previously mentioned), and (b) put in a mechanism for early error throwing if Zygote hits any not-implemented derivative case, with an explicit error message (https://github.com/SciML/SciMLSensitivity.jl/blob/v7.0.1/src...).

We have alerted the devs of the machine learning libraries, and from this there has been a lot of movement. In particular, a globals-free machine learning library, Lux.jl, was created with fully explicit parameters (https://lux.csail.mit.edu/dev/), and thus by design it cannot have this issue. In addition, the Flux.jl library itself is looking to do a redesign that eliminates implicit parameters (https://github.com/FluxML/Flux.jl/issues/1986). Which design wins out in the end is uncertain right now, but it's clear that, no matter what, the future designs of the deep learning libraries will fully cut out that part of Zygote.jl. Additionally, the other AD libraries (Enzyme and Diffractor, for example) do not have this "feature", so it's an issue that can only arise from a specific (not recommended) way of using Zygote (which now throws explicit error messages early and often if used anywhere near SciML, because I don't tolerate it).
So from this, SciML should be rather safe and if not, please share some details and I'd be happy to dig in.
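For readers unfamiliar with the implicit-parameter mechanism mentioned above, here is a rough sketch of the two styles being contrasted, assuming a Flux 0.13-era API and a toy Dense layer (names are illustrative):

```julia
using Flux  # Flux re-exports Zygote's gradient

m = Dense(3 => 2)
x = rand(Float32, 3)

# Implicit-parameter style: gradients are taken with respect to a global
# collection of tracked arrays (Flux.params, backed by Zygote.params). This is
# the mechanism the comment above says the ecosystem is moving away from.
ps = Flux.params(m)
grads_implicit = gradient(() -> sum(m(x)), ps)

# Explicit-parameter style (the Lux.jl-like design): the model is passed as an
# ordinary argument, so there is no hidden global state for the AD to miss.
grads_explicit = gradient(model -> sum(model(x)), m)
```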
- Flux: The Elegant Machine Learning Stack
- Jax vs. Julia (Vs PyTorch)
> In his item #1, he links to https://discourse.julialang.org/t/loaderror-when-using-inter... The issue is actually a bug in Zygote, a Julia package for auto-differentiation, and is not directly related to the Julia codebase (or the Flux package) itself. Furthermore, the problematic code is working fine now, because DiffEqFlux has switched to Enzyme, which doesn't have that bug. He should first confirm whether the problem he is citing is actually a problem or not.
> Item #2, again another Zygote bug.
If Flux chose a buggy package as a dependency, that's on them, and users are well justified in steering clear of Flux if it has buggy dependencies. As of today, the Project.toml for both Flux and DiffEqFlux still lists Zygote as a dependency. Neither lists Enzyme.
https://github.com/FluxML/Flux.jl/blob/master/Project.toml
kaleidoscope
- Implementing a JIT Compiled Language with Haskell and LLVM (2017)
- Should I abandon using Haskell for my compiler?
Comparing the Haskell and C++ implementations of the LLVM tutorial led me to believe it might be faster to learn Haskell and implement the compiler in Haskell than to implement it in C++.
- What would be your programming language of choice to implement a JIT compiler?
I think for writing compilers Haskell deserves to make the list. It is really excellent at creating DSLs. https://www.stephendiehl.com/llvm/
- Proposal to Merge Pyston with CPython
I'm no expert, but you might be interested in: https://llvm.org/docs/tutorial/
There's also a Haskell version if you'd prefer: https://www.stephendiehl.com/llvm/
Idk how to do this in Python as I'm not really good with it, but in C, to make your compiler a JIT, you would `mmap` a region as writable, write into it the machine code that you already know how to generate, `mprotect` it as PROT_EXEC instead of PROT_WRITE, cast the pointer to the region to a function pointer, and then call it (a C sketch follows below). These functions may be available in the Python sys package, but I don't really know.
I've implemented a "JIT" that takes machine code as hex and does this. Warning: it's complete garbage with no error checking but is a good proof of concept. https://gist.github.com/martinjacobd/3ee56f3c7b7ce621034ec3e...
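For concreteness, a minimal sketch of the mmap/mprotect approach described above, assuming Linux on x86-64 (the machine code bytes encode `mov eax, 42; ret`; no error checking, proof of concept only):

```c
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    /* x86-64 machine code for: mov eax, 42; ret */
    unsigned char code[] = {0xb8, 0x2a, 0x00, 0x00, 0x00, 0xc3};

    /* 1. Map an anonymous, writable region. */
    size_t len = 4096;
    void *mem = mmap(NULL, len, PROT_READ | PROT_WRITE,
                     MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);

    /* 2. Copy the generated machine code into it. */
    memcpy(mem, code, sizeof code);

    /* 3. Flip the page from writable to executable. */
    mprotect(mem, len, PROT_READ | PROT_EXEC);

    /* 4. Cast the region to a function pointer and call it. */
    int (*fn)(void) = (int (*)(void))mem;
    printf("%d\n", fn());  /* prints 42 */

    munmap(mem, len);
    return 0;
}
```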
- Why does Rust have parameters on impl?
- Implementing an LLVM Micro C compiler in Haskell
This is amazing. I tried following Stephen Diehl's JIT compiler in LLVM tutorial[0] a few years ago but it was already outdated (the llvm-hs library changed quite a bit), and subsequent web searches didn't turn up much.
For those interested in tutorials like this, I'd also recommend a very literate Haskell compiler from the PCF language (essentially lambda calculus with some primitives) to C[1].
[0] https://www.stephendiehl.com/llvm/
[1] https://github.com/jozefg/pcf/
- Resources for Amateur Compiler Writers
- Need some help with monad transformers
I'm currently working with llvm-hs-pure and am struggling to properly emit code for a module. I basically followed https://www.stephendiehl.com/llvm/#chapter-3-code-generation and have types like:
- Advanced books / tutorials about Haskell?
http://www.stephendiehl.com/llvm/ (Implementing a JIT Compiled Language with Haskell and LLVM). A nice tutorial. Requires knowledge of monads, applicatives, and transformers. Deep enough and more or less 'real world'.
What are some alternatives?
Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
hyper-haskell-server - The strongly hyped Haskell interpreter.
Knet.jl - Koç University deep learning framework.
dhall - Maintainable configuration files
tensorflow - An Open Source Machine Learning Framework for Everyone
unbound - Replib: generic programming & Unbound: generic treatment of binders
Transformers.jl - Julia Implementation of Transformer models
ajhc - A fork of jhc. And also a Haskell compiler.
Torch.jl - Sensible extensions for exposing torch in Julia.
pcf - A small compiler for PCF
Lux.jl - Explicitly Parameterized Neural Networks in Julia
egison - The Egison Programming Language