Transformers.jl vs RCall.jl

| | Transformers.jl | RCall.jl |
|---|---|---|
| Mentions | 7 | 8 |
| Stars | 504 | 311 |
| Growth | - | 0.6% |
| Activity | 6.9 | 5.5 |
| Latest commit | 3 months ago | about 1 month ago |
| Language | Julia | Julia |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Transformers.jl
- Julia 1.10 Released
Flux is quite a nice lower-level library:
https://github.com/FluxML/Flux.jl
On top of that there are many higher-level libraries, such as Transformers.jl:
https://github.com/chengchingwen/Transformers.jl
- How is Julia Performance with GPUs (for LLMs)?
- Load a transformer model with Julia
Check out Transformers.jl. It's a library that implements transformer-based models in Julia using Flux.jl. It has support for some of the Hugging Face transformers.
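A minimal sketch of what loading one of the supported Hugging Face models looks like; the `hgf"..."` string macro and `encode` come from Transformers.jl's HuggingFace and TextEncoders interfaces, but exact names have shifted between versions, so treat this as illustrative rather than authoritative:

```julia
using Transformers
using Transformers.HuggingFace   # Hugging Face model hub integration
using Transformers.TextEncoders  # tokenization / encoding utilities

# Download (and cache) the tokenizer and pretrained weights from the hub
textenc = hgf"bert-base-uncased:tokenizer"
model   = hgf"bert-base-uncased:model"

# Tokenize a sentence and run the forward pass (Flux under the hood)
input  = encode(textenc, "Julia is fast")
output = model(input)
```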
- Ask HN: Why hasn't the Deep Learning community embraced Julia yet?
There is https://github.com/chengchingwen/Transformers.jl, but I have not had any personal experience with it.
All of this is built by the community and your mileage may vary.
In my rather biased opinion, the strength of Julia is that the various ML libraries can share implementations; e.g., PyTorch and TensorFlow each contain a separate NumPy derivative. One could say that you can write an ML framework in Julia, instead of writing a DSL in Python as part of your C++ ML library. As an example, Julia has a GPU compiler, so you can write your own layer directly in Julia and integrate it into your pipeline.
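As a sketch of that point: a custom layer in Flux is just a plain Julia callable struct, and the AD machinery differentiates through ordinary Julia code with no framework-specific kernel language. The `Scale` layer below is a made-up example, assuming a recent Flux version:

```julia
using Flux

# A custom layer is an ordinary struct with a call method; no DSL required.
struct Scale{T}
    w::T
end
Flux.@functor Scale            # register `w` as a trainable parameter

(s::Scale)(x) = s.w .* x       # forward pass: elementwise scaling

# Drop the custom layer straight into a standard Flux pipeline
model = Chain(Dense(8 => 4, relu), Scale(ones(Float32, 4)))
x = rand(Float32, 8)
loss, grads = Flux.withgradient(m -> sum(m(x)), model)
```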
- Help on Differentiable Programming
I think you might have some luck looking at a transformers implementation in Flux, e.g.: https://github.com/chengchingwen/Transformers.jl/tree/master/src/basic
- Fastai.jl: Fastai for Julia
Having tried fastai for a "serious" research project and helped (just a bit) towards FastAI.jl development, here's my take:
> motivation behind this is unclear.
Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower level (think just below PyTorch/around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning and Keras demonstrate that there's a desire for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.
> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.
This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.
> What is the timeline for FastAI.jl to achieve parity?
- Julia Update: Adoption Keeps Climbing; Is It a Python Challenger?
If NLP primitives are all that's keeping you from testing the waters, have a look at https://github.com/chengchingwen/Transformers.jl.
RCall.jl
- Makie, a modern and fast plotting library for Julia
I don't use it personally, but RCall.jl[1] is the main R interop package in Julia. With it you could call R libraries that have no equivalent in Julia and write your own analyses in Julia instead.
[1] https://github.com/JuliaInterop/RCall.jl
- Making Python 100x faster with less than 100 lines of Rust
You can have your cake and eat it with the likes of
* PythonCall.jl - https://github.com/cjdoris/PythonCall.jl
* NodeCall.jl - https://github.com/sunoru/NodeCall.jl
* RCall.jl - https://github.com/JuliaInterop/RCall.jl
I tend to use Julia for most things and then just dip into another language’s ecosystem if I can’t find something to do the job and it’s too complex to build myself.
- Interoperability in Julia
To interoperate between Julia and the R language, the RCall package is used. Run the following commands at the Julia REPL.
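The commands themselves were not quoted; a minimal session along these lines uses RCall's `R"..."` string macro and the `@rput`/`@rget` macros, which are part of RCall's documented API (the variable names are illustrative):

```julia
using RCall

# Evaluate R code in-process with the R"" string macro
R"x <- c(1, 2, 3)"

# Move variables between the Julia and R sessions
y = [10.0, 20.0, 30.0]
@rput y                 # copy Julia's `y` into the R session
R"m <- mean(y)"
@rget m                 # copy R's `m` back into Julia
```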
- Convert Random Forest from Julia to R
https://github.com/JuliaInterop/RCall.jl may help
- I'm considering Rust, Go, or Julia for my next language and I'd like to hear your thoughts on these
If you need bindings to your existing R packages, then Julia is the way. Check out RCall.jl.
- translate R code to Julia code
I have no experience with R, but maybe this will be of use: https://github.com/JuliaInterop/RCall.jl
- Julia 1.6: what has changed since Julia 1.0?
You can use RCall to use R from Julia: https://github.com/JuliaInterop/RCall.jl
- Julia Update: Adoption Keeps Climbing; Is It a Python Challenger?
I worked with R and Python during the last 3 years, but have been learning and dabbling with Julia since 0.6. Since the availability of [PyCall.jl] and [RCall.jl], the transition to Julia has become easier for Python/R users.
I agree that most of the time data wrangling is super comfortable in R due to the syntax flexibility exploited by the big packages (tidyverse/data.table/etc.). At the same time, Julia and R share a bigger Lisp heritage than Python does, because R is also a Lisp-ish language (see [Advanced R, Metaprogramming]). My main gripe with the R ecosystem is not that most of the performance-sensitive packages are written in C/C++/Fortran, but that they are so deeply interconnected with the R environment that porting them to Julia (which also provides an easy and good interface to C/C++/Fortran; see the [Julia Interop] repo) seems impossible for some of them.
I also think that Julia reaches a broader scientific-programming public than R: it sometimes overlaps with Python, but it also offers the Matlab/Octave public a better alternative. I don't expect to see all the habits from those communities merge into the Julia ecosystem. On the other hand, I think Julia's bigger reach will help it avoid falling into the "base" vs "tidyverse" vs "something else in-between" split that R is in now.
[PyCall.jl]: https://github.com/JuliaPy/PyCall.jl
[RCall.jl]: https://github.com/JuliaInterop/RCall.jl
[Julia Interop]: https://github.com/JuliaInterop
[Advanced R, Metaprogramming] by Hadley Wickham: https://adv-r.hadley.nz/metaprogramming.html
What are some alternatives?
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
Makie.jl - Interactive data visualizations and plotting in Julia
PackageCompiler.jl - Compile your Julia Package
org-mode - This is a MIRROR only, do not send PR.
model-zoo - Please do not feed the models
Chain.jl - A Julia package for piping a value through a series of transformation expressions using a more convenient syntax than Julia's native piping functionality.
DataLoaders.jl - A parallel iterator for large machine learning datasets that don't fit into memory inspired by PyTorch's `DataLoader` class.
Revise.jl - Automatically update function definitions in a running Julia session
cmssw - CMS Offline Software
StatsPlots.jl - Statistical plotting recipes for Plots.jl
PyCall.jl - Package to call Python functions from the Julia language