Torch.jl vs JuliaInterpreter.jl

| | Torch.jl | JuliaInterpreter.jl |
|---|---|---|
| Mentions | 6 | 5 |
| Stars | 205 | 157 |
| Growth | 2.0% | 0.6% |
| Activity | 4.2 | 7.6 |
| Last commit | 12 days ago | about 1 month ago |
| Language | Julia | Julia |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Torch.jl
- Julia 1.10 Released
- Julia 1.9: A New Era of Performance and Flexibility
- How usable is Julia for Natural Language Processing Machine learning?
- Does Julia Have a Chance to Overthrow Python in the Machine Learning Industry?
For frontends Python has quite a head start. In principle it would be possible to write Julia front-ends to existing ML libraries (written e.g. in C), for example https://github.com/FluxML/Torch.jl , but the advantages over Python frontends would be very limited. Only a front-to-back Julia implementation leverages most of the language's advantages, like composability and flexibility.
- Julia: faster than Fortran, cleaner than Numpy
PyTorch for example is a C++ library with a Python user interface, see e.g. the language shares in GitHub (https://github.com/pytorch/pytorch ). There is also a Julia binding for Torch (https://github.com/FluxML/Torch.jl), but I do not know how up-to-date it is.
JuliaInterpreter.jl
- Do you use Julia for general purpose tasks?
The projects page is a list of suggestions of projects that someone has already said they want to run. If you can find a mentor, you can submit a project for anything. For potential performance improvements, I'd look at https://github.com/JuliaDebug/JuliaInterpreter.jl/issues/206, https://github.com/JuliaDebug/JuliaInterpreter.jl/issues/312, and https://github.com/JuliaDebug/JuliaInterpreter.jl/issues/314. I'm not sure if Tim Holy or Kristoffer have time to mentor a project, but if you're interested in doing a gsoc, ask around in the Julia slack/zulip, and you might be able to find a mentor.
- Julia 1.7 has been released
I would not go as far as calling it very naive; there has certainly been some work put into optimizing performance within the current design.
There are probably some gains to be had by using a different storage format for the IR, though, as proposed in [1], but it is difficult to say how much of a difference that will make in practice.
[1] https://github.com/JuliaDebug/JuliaInterpreter.jl/pull/309
- What's Bad about Julia?
You're right; I've done some more research, and there does seem to be an interpreter in the compiler: https://github.com/JuliaDebug/JuliaInterpreter.jl. It's only enabled by an annotation, and is mainly used for the debugger, but it's still there.
Still, it seems to execute the internal SSA IR in its raw form (which is geared more towards compiling than towards dynamic execution in a VM). I was talking more about a conventional bytecode interpreter (which you can optimize the hell out of, like LuaJIT did). A bytecode format that is carefully designed for fast execution (in either a stack-based or register-based VM) would be much better for interpreters, but I'm not sure if Julia's language semantics / object model can allow it. Maybe some intelligent people out there can make the whole thing work, is what I was trying to say.
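For illustration, the annotation mentioned above is presumably JuliaInterpreter's exported `@interpret` macro, which runs a single call through the interpreter instead of compiling it. A minimal sketch, assuming the package is installed:

```julia
using JuliaInterpreter

# Run this call in the interpreter rather than through the compiler.
# Execution is much slower, but no native code is generated for it.
result = @interpret sum(1:10)
@assert result == 55
```

This is the same entry point the Julia debuggers build on; ordinary code outside the annotated call still goes through the normal compiler.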
- Julia: faster than Fortran, cleaner than Numpy
It could, but that is a lot more work than it sounds. It might be easier to make it possible to swap out the compiler for one that is much faster (LLVM is slow but does good optimisations, other compilers like cranelift are faster but produce slower code). There is a Julia interpreter but it was written in Julia itself (it was written to support debuggers), so it doesn't really solve the latency issues.
- Julia: Faster than Fortran, cleaner than Numpy
If you need to run small scripts and can't switch to a persistent-REPL-based workflow, you might consider starting Julia with the `--compile=min` option. You can also reduce startup times dramatically by building a sysimage with PackageCompiler.jl.
There is also technically an interpreter if you want to go that way [1], so in principle it might be possible to do the same trick JavaScript does, but someone would have to implement that.
[1] https://github.com/JuliaDebug/JuliaInterpreter.jl
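The two startup tricks above can be sketched as shell commands (the script name `myscript.jl` and the packages baked into the sysimage are hypothetical; this assumes Julia and PackageCompiler.jl are installed):

```shell
# Run a small script with minimal compilation (mostly interpreted,
# so startup is fast but steady-state execution is slower):
julia --compile=min myscript.jl

# Or build a custom sysimage with PackageCompiler.jl once...
julia -e 'using PackageCompiler; create_sysimage([:Example]; sysimage_path="custom_sysimage.so")'

# ...and start Julia with it, so the baked-in packages load precompiled:
julia --sysimage custom_sysimage.so myscript.jl
```

Note that the sysimage file extension is platform-dependent (`.so` on Linux, `.dylib` on macOS, `.dll` on Windows), and the sysimage must be rebuilt when the baked-in packages change.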
What are some alternatives?
- Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
- Diffractor.jl - Next-generation AD
- Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration
- DaemonMode.jl - Client-Daemon workflow to run faster scripts in Julia
- gluon-nlp - NLP made easy
- Tullio.jl - ⅀
- SciPyDiffEq.jl - Wrappers for the SciPy differential equation solvers for the SciML Scientific Machine Learning organization
- julia-numpy-fortran-test - Comparing Julia vs Numpy vs Fortran for performance and code simplicity
- JuliaTorch - Using PyTorch in Julia Language
- Infiltrator.jl - No-overhead breakpoints in Julia
- threads - Threads for Lua and LuaJIT. Transparent exchange of data between threads is allowed thanks to torch serialization.
- rust - Empowering everyone to build reliable and efficient software.