Having tried fastai for a "serious" research project and helped (just a bit) with FastAI.jl development, here's my take:
> motivation behind this is unclear.
Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower-level (think just below PyTorch, around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning, and Keras demonstrate that there's demand for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.
> Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.
This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.
> What is the timeline for FastAI.jl to achieve parity?
> When should I choose FastAI.jl vs fastai?
This depends on your use cases and how comfortable you are with a) Julia and b) rolling some of your own code. For the former, I'd recommend poking around with the language beforehand, as well as using the dev channel linked in TFA, to form an informed opinion.
FastAI.jl itself is composed of multiple constituent packages that can be, and are, used independently, so there's also the option of mixing and matching. For example, https://github.com/lorenzoh/DataLoaders.jl is completely framework-agnostic.
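To illustrate the mix-and-match point, here's a minimal sketch of using DataLoaders.jl on its own, outside FastAI.jl. This assumes the `DataLoader(data, batchsize)` constructor from DataLoaders.jl and the convention (from the LearnBase/MLDataPattern data-access interface it builds on) that plain arrays treat the last dimension as the observation dimension; treat the exact shapes as an assumption, not gospel.

```julia
using DataLoaders  # third-party: https://github.com/lorenzoh/DataLoaders.jl

# 10_000 observations of 128 features; any container implementing the
# getobs/nobs interface works, not just arrays.
data = rand(Float32, 128, 10_000)

# Iterate mini-batches of 16 observations, collated on background threads.
for batch in DataLoader(data, 16)
    # `batch` should be a 128×16 slice here; hand it to Flux, Knet,
    # or anything else -- the loader doesn't care which framework you use.
end
```

Because the loader only depends on the data-access interface, the same loop works unchanged whether the batches feed a Flux model, a Knet model, or custom code.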
Transformers.jl and TextAnalysis.jl already provide quite a bit of functionality for NLP, though to my knowledge neither makes use of RNNs. You may be interested in commenting on https://github.com/FluxML/Flux.jl/issues/1678.