Transformers.jl vs AlgebraOfGraphics.jl

| | Transformers.jl | AlgebraOfGraphics.jl |
|---|---|---|
| Mentions | 7 | 4 |
| Stars | 504 | 393 |
| Growth | - | 1.3% |
| Activity | 6.9 | 5.0 |
| Latest commit | 3 months ago | 7 days ago |
| Language | Julia | Julia |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Transformers.jl
- Julia 1.10 Released
Flux is quite a nice lower-level library:
https://github.com/FluxML/Flux.jl
On top of that there are many higher level libraries such as Transformers.jl
https://github.com/chengchingwen/Transformers.jl
- How is Julia Performance with GPUs (for LLMs)?
- Load a transformer model with Julia
Check out Transformers.jl. It’s a library that implements transformer based models in Julia using Flux.jl. They have support for some of the huggingface transformers.
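Loading one of the supported HuggingFace models is short in practice. The sketch below follows the pattern in the Transformers.jl README; the exact macro form and submodule names may differ between package versions, so treat it as illustrative rather than definitive.

```julia
# Sketch based on the Transformers.jl README; API details (the hgf"" string
# macro, encoder submodules) may vary across Transformers.jl versions.
using Transformers
using Transformers.HuggingFace

# Download the tokenizer and pretrained weights from the HuggingFace hub.
textenc = hgf"bert-base-uncased:tokenizer"
model   = hgf"bert-base-uncased:model"

input  = encode(textenc, "Julia is fast.")  # tokenize into model-ready arrays
output = model(input)                       # forward pass over the encoded input
```

Because the model is built on Flux.jl, the returned `model` composes with the rest of the Flux ecosystem (optimisers, GPU transfer, etc.).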
- Ask HN: Why hasn't the Deep Learning community embraced Julia yet?
There is https://github.com/chengchingwen/Transformers.jl, but I have not had any personal experience with it.
All of this is built by the community, and your mileage may vary.
In my rather biased opinion, the strength of Julia is that the various ML libraries can share implementations, whereas e.g. PyTorch and TensorFlow each contain their own separate NumPy-like derivative. One could say that you can write an ML framework in Julia, instead of writing a DSL in Python as part of your C++ ML library. As an example, Julia has a GPU compiler, so you can write your own layer directly in Julia and integrate it into your pipeline.
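The "write your own layer directly in Julia" point can be sketched with Flux. The layer name and its form below are invented for illustration; the mechanism (a plain struct made trainable and composed into a `Chain`) is standard Flux usage.

```julia
# Minimal sketch of a custom layer in plain Julia/Flux. `ScaledDense` is a
# hypothetical layer (a dense transform with a learnable output scale), not
# something from the thread above.
using Flux

struct ScaledDense
    W
    b
    s
end
Flux.@functor ScaledDense  # register the fields as trainable parameters

ScaledDense(in::Int, out::Int) =
    ScaledDense(randn(Float32, out, in), zeros(Float32, out), ones(Float32, 1))

# Calling the struct runs the forward pass -- ordinary Julia code throughout.
(m::ScaledDense)(x) = m.s .* tanh.(m.W * x .+ m.b)

# The custom layer composes with built-in layers like any other:
model = Chain(ScaledDense(4, 8), Dense(8, 2))
y = model(rand(Float32, 4))  # 2-element output
```

With CUDA.jl installed, the same model moves to the GPU via `model |> gpu`, and the forward pass compiles for the device without a separate kernel language.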
- Help on Differentiable Programming
I think you might have some luck looking at a transformers implementation in Flux, e.g.: https://github.com/chengchingwen/Transformers.jl/tree/master/src/basic
- Fastai.jl: Fastai for Julia
Having tried fastai for a "serious" research project and helped (just a bit) towards FastAI.jl development, here's my take:
> motivation behind this is unclear.
Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower level (think just below PyTorch, around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning, and Keras demonstrate that there's a desire for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.
> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.
This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.
> What is the timeline for FastAI.jl to achieve parity?
- Julia Update: Adoption Keeps Climbing; Is It a Python Challenger?
If NLP primitives are all that's keeping you from testing the waters, have a look at https://github.com/chengchingwen/Transformers.jl.
AlgebraOfGraphics.jl
- Makie, a modern and fast plotting library for Julia
- Tidyverse 2.0.0
This illustrates the point perfectly. Julia is attempting this and has a beachhead with DataFrames.jl. Confusingly, though, Tidier.jl isn't really analogous to R's Tidyverse; it's more like one of a handful of meta-packages around DataFrames.jl.
Then there are the Grammar of Graphics style plotting libraries Julia has been building (ggplot was the Tidyverse's first star). I'm probably most excited about AlgebraOfGraphics (https://github.com/MakieOrg/AlgebraOfGraphics.jl/) as part of the Makie plotting ecosystem. It does still feel a bit like the Julia community can't decide between following Matplotlib or R's Grid/ggplot approach.
The seeds of a Tidyverse for Julia are there, but it'll take some time to achieve the consistency and maturity of the original Tidyverse.
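The Grammar of Graphics flavour of AlgebraOfGraphics can be sketched briefly: plots are built by multiplying algebraic layers (`data * mapping * visual`) and rendered with a Makie backend. The dataset below is made up for illustration; the layer-algebra pattern itself is the package's documented core API.

```julia
# Sketch of AlgebraOfGraphics' layer algebra with an invented dataset.
using AlgebraOfGraphics, CairoMakie

# Any Tables.jl-compatible table works; a NamedTuple of vectors is enough here.
df = (x = randn(100), y = randn(100), group = rand(["a", "b"], 100))

# Multiply layers: the data source, the aesthetic mapping, and the visual mark.
plt = data(df) * mapping(:x, :y, color = :group) * visual(Scatter)
draw(plt)  # renders with the active Makie backend (CairoMakie here)
```

Adding a facet or a second mark is a matter of multiplying or adding further layers, which is what gives the approach its ggplot-like composability.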
- What Julia plotting library do you use/think will be the standard going forward?
Did you maybe overlook something in https://github.com/JuliaPlots/AlgebraOfGraphics.jl or another package? I looked up "grid" and it seems to have something. I realize R and ggplot2 were considered best by many (and Gadfly.jl is similar; AoG seems to be its replacement?), but I didn't realize ggplot2 had extensions (which you clarify below). At least you can call R from Julia and thus use its plotting (and I assume its extensions too; can you confirm or deny?). For some reason you got downvoted: might you be unaware of new developments in Julia (also Makie, which seemed excellent to me; I thought Julia had caught up on plotting, with more options than other languages), or are people simply very opinionated about plotting? It's about features, but also speed/latency/time-to-first-plot, which is getting better.
- Julia Update: Adoption Keeps Climbing; Is It a Python Challenger?
Julia has plenty of plotting solutions that are better for stats than Matplotlib:
https://github.com/JuliaPlots/AlgebraOfGraphics.jl
What are some alternatives?
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
Genie.jl - 🧞The highly productive Julia web framework
PackageCompiler.jl - Compile your Julia Package
StatsPlots.jl - Statistical plotting recipes for Plots.jl
model-zoo - Please do not feed the models
Chain.jl - A Julia package for piping a value through a series of transformation expressions using a more convenient syntax than Julia's native piping functionality.
DataLoaders.jl - A parallel iterator for large machine learning datasets that don't fit into memory inspired by PyTorch's `DataLoader` class.
VegaLite.jl - Julia bindings to Vega-Lite
RCall.jl - Call R from Julia
Revise.jl - Automatically update function definitions in a running Julia session