Flux.jl vs gleam

| | Flux.jl | gleam |
|---|---|---|
| Mentions | 22 | 96 |
| Stars | 4,393 | 15,033 |
| Growth | 0.4% | 5.1% |
| Activity | 8.7 | 9.9 |
| Latest commit | 3 days ago | 6 days ago |
| Language | Julia | Rust |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.
Flux.jl
- Julia 1.10 Released
- What Apple hardware do I need for CUDA-based deep learning tasks?
If you are really committed to running on Apple hardware, then take a look at TensorFlow for macOS. Another option is the Julia programming language, which has very basic Metal support at a CUDA-like level. FluxML would be the ML framework in Julia. I'm not sure either option will be painless or let you do everything you could do with an Nvidia GPU.
- [D] ClosedAI license, open-source license which restricts only OpenAI, Microsoft, Google, and Meta from commercial use
Flux dominance!
- What would be your programming language of choice to implement a JIT compiler?
I'm no compiler expert, but check out Flux and Zygote: https://fluxml.ai/
- Any help or tips for Neural Networks on Computer Clusters
I would suggest you look into the Julia ecosystem instead of C++. Julia is almost identical to Python in terms of how you use it, but it's still very fast. You should look into the Flux.jl package for Julia.
- [D] Why are we stuck with Python for something that requires so much speed and parallelism (neural networks)?
Give Julia a try: https://fluxml.ai
- Deep Learning With Flux: Loss Doesn't Converge
2) Flux treats softmax a little differently than most other activation functions, such as relu and sigmoid (see here for more details). When you pass an activation function into a layer like Dense(3, 32, relu), Flux expects that the function is broadcast over the layer's output. However, softmax cannot be broadcast, as it operates over vectors rather than scalars. This means that if you want to use softmax as the final activation in your model, you need to pass it into Chain() like so:
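A minimal sketch of that pattern (the layer sizes here are illustrative, not taken from the original post):

```julia
using Flux

# relu is broadcast over each scalar output of its Dense layer,
# but softmax operates on the whole output vector at once, so it
# goes into Chain() as its own step rather than inside Dense().
model = Chain(
    Dense(3, 32, relu),   # elementwise activation: fine inside the layer
    Dense(32, 10),        # final layer with no activation
    softmax,              # vector-valued activation as a separate step
)

y = model(rand(Float32, 3))
sum(y)  # ≈ 1.0f0: softmax output is a probability vector
```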
- “Why I still recommend Julia”
Can you point to a concrete example of one that someone would run into when using the differential equation solvers with the default and recommended Enzyme AD for vector-Jacobian products? I'd be happy to look into it, but there do not seem to be any current correctness issues in the Enzyme issue tracker (3 issues are open, but they all appear to be fixed, other than https://github.com/EnzymeAD/Enzyme.jl/issues/278, which is actually an activity-analysis bug in LLVM). So please be more specific. The issue with Enzyme right now seems to be more about finding functional forms that compile: it throws compile-time errors when it cannot fully analyze the program or when the program has too much dynamic behavior (example: https://github.com/EnzymeAD/Enzyme.jl/issues/368).
Additional note: we recently did an overhaul of SciMLSensitivity (https://sensitivity.sciml.ai/dev/) and set up a system that amounts to 15 hours of direct unit tests doing a combinatoric check of arguments, plus 4 hours of downstream testing (https://github.com/SciML/SciMLSensitivity.jl/actions/runs/25...). What that identified is that any remaining issues that can arise are due to the implicit-parameters mechanism in Zygote (Zygote.params). To counteract this upstream issue, we (a) default to never using Zygote VJPs whenever we can avoid it (hence defaulting to Enzyme and ReverseDiff first, as previously mentioned), and (b) put in a mechanism that throws an early, explicit error message if Zygote hits any unimplemented derivative case (https://github.com/SciML/SciMLSensitivity.jl/blob/v7.0.1/src...).

We have alerted the devs of the machine learning libraries, and from this there has been a lot of movement. In particular, a globals-free machine learning library, Lux.jl, was created with fully explicit parameters (https://lux.csail.mit.edu/dev/), and thus by design it cannot have this issue. In addition, the Flux.jl library itself is looking to do a redesign that eliminates implicit parameters (https://github.com/FluxML/Flux.jl/issues/1986). Which design wins out in the end is uncertain right now, but it's clear that, no matter what, the future designs of the deep learning libraries will fully cut out that part of Zygote.jl. Additionally, the other AD libraries (Enzyme and Diffractor, for example) do not have this "feature", so it's an issue that can only arise from a specific (not recommended) way of using Zygote (which now throws explicit error messages early and often if used anywhere near SciML, because I don't tolerate it).
So from this, SciML should be rather safe; if not, please share some details and I'd be happy to dig in.
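For context, a minimal sketch of the implicit- vs explicit-parameter styles being discussed, assuming Zygote as the AD (the arrays and loss functions are illustrative, not taken from the libraries above):

```julia
using Zygote

x = ones(3)

# Implicit style: the loss closes over global arrays, and Zygote is
# told which of them are trainable via Params (Zygote.params in Flux).
W = rand(2, 3)
b = rand(2)
loss_implicit() = sum(W * x .+ b)
gs = gradient(loss_implicit, Params([W, b]))
gs[W]  # gradient w.r.t. W, looked up by object identity

# Explicit style: everything being differentiated is a function
# argument, so the AD sees the full data flow (the Lux.jl design).
ps = (W = rand(2, 3), b = rand(2))
loss_explicit(p) = sum(p.W * x .+ p.b)
grads, = gradient(loss_explicit, ps)
grads.W  # gradient w.r.t. W, as a plain NamedTuple field
```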
- Flux: The Elegant Machine Learning Stack
- Jax vs. Julia (Vs PyTorch)
> In his item #1, he links to https://discourse.julialang.org/t/loaderror-when-using-inter... The issue is actually a bug in Zygote, a Julia package for auto-differentiation, and is not directly related to the Julia codebase (or the Flux package) itself. Furthermore, the problematic code is working fine now, because DiffEqFlux has switched to Enzyme, which doesn't have that bug. He should first confirm whether the problem he is citing is still a problem.
> Item #2, again another Zygote bug.
If Flux chose a buggy package as a dependency, that's on them, and users are well justified in steering clear of Flux if it has buggy dependencies. As of today, the Project.toml for both Flux and DiffEqFlux still lists Zygote as a dependency. Neither lists Enzyme.
https://github.com/FluxML/Flux.jl/blob/master/Project.toml
gleam
- Borgo is a statically typed language that compiles to Go
I haven't had time to really try to write anything in it, but https://gleam.run/ looks really good too. Like Elm for backend + frontend!
- Release Radar • March 2024 Edition
Want a friendly language for building safe systems at scale? Gleam is here for you. It features a modern and familiar syntax and is designed to be reliable and scalable. Gleam runs on the Erlang virtual machine and can run plenty of concurrent tasks. It comes with a compiler, build tool, formatter, editor integrations, and package manager all built in, so you can get started right away. Congrats to the team on shipping your first major version 🙌.
- The Current State of Clojure's Machine Learning Ecosystem
While I love Clojure, I have to agree about tooling. I recently started using Gleam* and was impressed at how easy it was to get up and running with the CLI tool. I think this is an important part of getting people to adopt a language.
* https://gleam.run/
- Show HN: I open-sourced the in-memory PostgreSQL I built at work for E2E tests
If you use languages that compile to WASM (such as Gleam, https://gleam.run) and can also run Postgres via WASM, it opens up very interesting offline scenarios, with similar codebases on both the client and the server, for instance.
- Why is the number of Gleam programmers growing so fast?
Recently, Gleam has gained more popularity, and a lot of developers (including me) are learning it. At the time of this writing, it has exceeded 14k stars on GitHub; it has grown really fast over the last month.
- Cranelift code generation comes to Rust
- Gleam v1.0.0
- Gleam has a 1.0 release candidate
- Welcome to the Gleam Language Tour
Oh, strange that GitHub had a date of 2016 on this one: https://github.com/gleam-lang/gleam/issues/2
I was just going by that, though I do remember checking out Gleam 5 years ago or so.
Re: macros, I really do think they're a big deal, and all the other newer languages I've used, such as Rust, have some kind of macros or powerful metaprogramming features.
For older languages, a few, like Ruby, have enough metaprogrammability to make nice DSLs, but many others don't. Given the choice, I'd much rather have Elixir/Clojure-style macros than the other metaprogramming facilities I've seen so far.
- Inko Programming Language
I had only been following this language with some interest. I guess it was born at GitLab; I'm not sure if the creator(s) still work there. This is what I'd have wanted golang to be (albeit with GC when you do not have clear lifetimes).
But how would you differentiate yourself from https://gleam.run, which can leverage OTP? I'd be more interested if we could adapt Gleam to GraalVM isolates so we can leverage the JVM ecosystem.
What are some alternatives?
- Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
- are-we-fast-yet - Are We Fast Yet? Comparing Language Implementations with Objects, Closures, and Arrays
- Knet.jl - Koç University deep learning framework.
- web3.js - Collection of comprehensive TypeScript libraries for Interaction with the Ethereum JSON RPC API and utility functions.
- tensorflow - An Open Source Machine Learning Framework for Everyone
- Rustler - Safe Rust bridge for creating Erlang NIF functions
- Transformers.jl - Julia Implementation of Transformer models
- ponyc - Pony is an open-source, actor-model, capabilities-secure, high performance programming language
- Torch.jl - Sensible extensions for exposing torch in Julia.
- nx - Multi-dimensional arrays (tensors) and numerical definitions for Elixir
- Lux.jl - Explicitly Parameterized Neural Networks in Julia
- hamler - Haskell-style functional programming language running on Erlang VM.