Transformers.jl Alternatives
Similar projects and alternatives to Transformers.jl
-
Chain.jl
A Julia package for piping a value through a series of transformation expressions using a more convenient syntax than Julia's native piping functionality.
-
DataLoaders.jl
A parallel iterator for large machine learning datasets that don't fit into memory inspired by PyTorch's `DataLoader` class.
-
Dash.jl
Dash for Julia - A Julia interface to the Dash ecosystem for creating analytic web applications in Julia. No JavaScript required.
-
Oceananigans.jl
🌊 Julia software for fast, friendly, flexible, ocean-flavored fluid dynamics on CPUs and GPUs
Transformers.jl reviews and mentions
-
Julia 1.10 Released
Flux is quite a nice lower level library:
https://github.com/FluxML/Flux.jl
On top of that there are many higher level libraries such as Transformers.jl
https://github.com/chengchingwen/Transformers.jl
-
How is Julia Performance with GPUs (for LLMs)?
-
Load a transformer model with Julia
Check out Transformers.jl. It's a library that implements transformer-based models in Julia using Flux.jl. They have support for some of the Hugging Face transformers.
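As a rough sketch of what that looks like, assuming the `hgf"..."` string macro from Transformers.jl's HuggingFace submodule (the model name and the exact encode call are illustrative, not verified against a specific release):

```julia
using Transformers
using Transformers.HuggingFace  # Hugging Face hub integration

# Load a pretrained tokenizer and model by name
# (weights are downloaded on first use).
tokenizer = hgf"bert-base-uncased:tokenizer"
model     = hgf"bert-base-uncased:model"

# Encode a sentence and run it through the model.
input  = encode(tokenizer, "Julia is fast.")
output = model(input)
```

Since the model is an ordinary Flux.jl model, the output can be fed into further Flux layers or moved to the GPU like any other Julia array-based computation.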
-
Ask HN: Why hasn't the Deep Learning community embraced Julia yet?
https://github.com/chengchingwen/Transformers.jl but I have no personal experience with it.
All of this is built by the community and your mileage may vary.
In my rather biased opinion, the strength of Julia is that the various ML libraries can share implementations, whereas e.g. PyTorch and TensorFlow each contain their own separate NumPy derivatives. One could say that you can write an ML framework in Julia, instead of writing a DSL in Python as part of your C++ ML library. As an example, Julia has a GPU compiler, so you can write your own layer directly in Julia and integrate it into your pipeline.
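To illustrate that last point, here is a minimal sketch of a custom Flux layer written in plain Julia (the `Scale` layer is invented for illustration; `Flux.@functor` is how Flux is told which fields are trainable):

```julia
using Flux

# A hypothetical elementwise-scaling layer defined in ordinary Julia.
struct Scale{T}
    w::T
end
Scale(n::Integer) = Scale(ones(Float32, n))

Flux.@functor Scale            # expose `w` to Flux's optimisers

(s::Scale)(x) = s.w .* x       # the forward pass is just Julia code

# It composes with built-in layers, and the same code can run on
# GPU arrays (e.g. CUDA.jl's CuArray) via Julia's GPU compiler.
model = Chain(Dense(4 => 8, relu), Scale(8))
y = model(rand(Float32, 4))
```

No C++ kernel or framework-specific DSL is needed; automatic differentiation and GPU execution come from the same Julia code.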
-
Help on Differentiable Programming
I think you might have some luck with looking at a transformers implementation in flux, e.g: https://github.com/chengchingwen/Transformers.jl/tree/master/src/basic
-
Fastai.jl: Fastai for Julia
Having tried fastai for a "serious" research project and helped (just a bit) towards FastAI.jl development, here's my take:
> motivation behind this is unclear.
Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower level (think just below PyTorch, around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning, and Keras demonstrate that there's a demand for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.
> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.
This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.
> What is the timeline for FastAI.jl to achieve parity?
-
Julia Update: Adoption Keeps Climbing; Is It a Python Challenger?
If NLP primitives are all that's keeping you from testing the waters, have a look at https://github.com/chengchingwen/Transformers.jl.
-
Stats
chengchingwen/Transformers.jl is an open source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of Transformers.jl is Julia.