Oceananigans.jl VS Transformers.jl

Compare Oceananigans.jl vs Transformers.jl and see what their differences are.

Oceananigans.jl

🌊 Julia software for fast, friendly, flexible, ocean-flavored fluid dynamics on CPUs and GPUs (by CliMA)
                Oceananigans.jl   Transformers.jl
Mentions        4                 7
Stars           875               503
Growth          1.6%              -
Activity        9.5               6.9
Last commit     5 days ago        2 months ago
Language        Julia             Julia
License         MIT License       MIT License
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

Oceananigans.jl

Posts with mentions or reviews of Oceananigans.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-27.
  • Julia 1.10 Released
    15 projects | news.ycombinator.com | 27 Dec 2023
    I think it’s also the design philosophy. JuMP and ForwardDiff are great success stories and are packages very light on dependencies. I like those.

    The DiffEq library seems to pull you towards the SciML ecosystem and that might not be agreeable to everyone.

    For instance, a known Julia project that simulates differential equations seems to have implemented its own solver:

    https://github.com/CliMA/Oceananigans.jl

  • GPU vendor-agnostic fluid dynamics solver in Julia
    11 projects | news.ycombinator.com | 8 May 2023
    I'm currently playing around with Oceananigans.jl (https://github.com/CliMA/Oceananigans.jl). Do you know how the two are similar or different?

    Oceananigans.jl has really intuitive step-by-step examples and a great discussion page on GitHub.
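
For a flavor of those step-by-step examples, here is a minimal sketch of a two-dimensional simulation with Oceananigans.jl, loosely following the package's 2D turbulence example; grid sizes and keyword names are illustrative and may differ between package versions:

```julia
using Oceananigans

# Doubly periodic two-dimensional grid; the Flat topology drops the third dimension.
grid = RectilinearGrid(size = (128, 128), x = (0, 2π), y = (0, 2π),
                       topology = (Periodic, Periodic, Flat))

# Incompressible model with a high-order WENO advection scheme.
model = NonhydrostaticModel(; grid, advection = WENO())

# Seed the velocity field with noise to spin up two-dimensional turbulence.
u, v, w = model.velocities
set!(model, u = rand(size(u)...), v = rand(size(v)...))

# Time-step the simulation; passing GPU() to the grid constructor moves it to a GPU.
simulation = Simulation(model, Δt = 0.01, stop_time = 10)
run!(simulation)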

  • Supercharged high-resolution ocean simulation with Jax
    5 projects | news.ycombinator.com | 5 Dec 2021

Transformers.jl

Posts with mentions or reviews of Transformers.jl. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-27.
  • Julia 1.10 Released
    15 projects | news.ycombinator.com | 27 Dec 2023
    Flux is quite a nice lower level library:

    https://github.com/FluxML/Flux.jl

    On top of that there are many higher level libraries such as Transformers.jl

    https://github.com/chengchingwen/Transformers.jl
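
To illustrate what "lower level" means here, a small Flux model is just layers composed into a Chain plus an explicit gradient step; this is a generic sketch using the explicit-gradient API of recent Flux releases, not code from either project:

```julia
using Flux

# A small multi-layer perceptron: layers are plain callable structs composed in a Chain.
model = Chain(Dense(784 => 128, relu), Dense(128 => 10))

# Dummy batch: 784 features × 32 samples, with one-hot class labels.
x = rand(Float32, 784, 32)
y = Flux.onehotbatch(rand(0:9, 32), 0:9)

# One explicit gradient step; higher-level libraries mostly wrap loops like this one.
opt_state = Flux.setup(Adam(1e-3), model)
grads = Flux.gradient(m -> Flux.logitcrossentropy(m(x), y), model)
Flux.update!(opt_state, model, grads[1])
```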

  • How is Julia Performance with GPUs (for LLMs)?
    2 projects | /r/Julia | 7 Apr 2023
  • Load a transformer model with julia
    2 projects | /r/Julia | 17 Oct 2022
    Check out Transformers.jl. It's a library that implements transformer-based models in Julia using Flux.jl. They have support for some of the Hugging Face transformers.
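
As a rough sketch of what loading one of those pretrained models looks like (the `hgf"..."` macro and `encode` call follow the Transformers.jl README at the time of writing; module and macro names may differ across releases):

```julia
using Transformers
using Transformers.TextEncoders
using Transformers.HuggingFace

# Download a pretrained tokenizer and model from the Hugging Face hub.
textenc = hgf"bert-base-uncased:tokenizer"
model   = hgf"bert-base-uncased:model"

# Tokenize a sentence and run it through the model.
input  = encode(textenc, "Julia is fast.")
output = model(input)
```
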
  • Ask HN: Why hasn't the Deep Learning community embraced Julia yet?
    3 projects | news.ycombinator.com | 11 Sep 2022
    https://github.com/chengchingwen/Transformers.jl but I have not had any personal experience with it.

    All of this is built by the community, and your mileage may vary.

    In my rather biased opinion, the strength of Julia is that the various ML libraries can share implementations, whereas PyTorch and TensorFlow each contain their own separate NumPy derivative. One could say that you can write an ML framework in Julia, instead of writing a DSL in Python as part of your C++ ML library. As an example, Julia has a GPU compiler, so you can write your own layer directly in Julia and integrate it into your pipeline.
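
The point about writing your own layer directly in Julia can be made concrete with a few lines of Flux; this is a generic sketch, not code from Transformers.jl, and the hypothetical FeatureScale layer is only for illustration:

```julia
using Flux

# A custom per-feature scaling layer: just a struct and a call method, in plain Julia.
struct FeatureScale{T}
    w::T
end
FeatureScale(n::Integer) = FeatureScale(ones(Float32, n))

# The forward pass is ordinary Julia code; with CUDA.jl it runs unchanged on GPU arrays.
(s::FeatureScale)(x) = s.w .* x

# Register the fields as trainable parameters (newer Flux versions also offer Flux.@layer).
Flux.@functor FeatureScale

# Drop the custom layer into a Chain like any built-in one.
model = Chain(Dense(4 => 8, relu), FeatureScale(8), Dense(8 => 2))
y = model(rand(Float32, 4, 16))
```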

  • Help on Differentiable Programming
    1 project | /r/Julia | 5 Jan 2022
    I think you might have some luck with looking at a transformers implementation in flux, e.g: https://github.com/chengchingwen/Transformers.jl/tree/master/src/basic
  • Fastai.jl: Fastai for Julia
    6 projects | news.ycombinator.com | 27 Jul 2021
    Having tried fastai for a "serious" research project and contributed (just a bit) to FastAI.jl development, here's my take:

    > motivation behind this is unclear.

    Julia currently has two main DL libraries: Flux, which sits somewhere between PyTorch and (tf.)Keras abstraction-wise, and Knet, which is a little lower level (think just below PyTorch, around where MXNet Gluon sits). Frameworks like fastai, PyTorch Lightning, and Keras demonstrate that there's a desire for higher-level, more batteries-included libraries. FastAI.jl is looking to fill that gap in Julia.

    > Since FastAI.jl uses Flux, and not PyTorch, functionality has to be reimplemented. FastAI.jl has vision support but no text support yet.

    This is correct. That said, FastAI.jl is not and does not plan to be a copy of the Python API (hence "inspired by"). One consequence of this is that integration with other libraries is much easier, e.g. https://github.com/chengchingwen/Transformers.jl for NLP tasks.

    > What is the timeline for FastAI.jl to achieve parity?

  • Julia Update: Adoption Keeps Climbing; Is It a Python Challenger?
    17 projects | news.ycombinator.com | 18 Jan 2021
    If NLP primitives are all that's keeping you from testing the waters, have a look at https://github.com/chengchingwen/Transformers.jl.

What are some alternatives?

When comparing Oceananigans.jl and Transformers.jl, you can also consider the following projects:

MATDaemon.jl

Flux.jl - Relax! Flux is the ML library that doesn't make you tensor

FiniteDiff.jl - Fast non-allocating calculations of gradients, Jacobians, and Hessians with sparsity support

PackageCompiler.jl - Compile your Julia Package

MITgcm - M.I.T General Circulation Model master code and documentation repository

model-zoo - Please do not feed the models

Metal.jl - Metal programming in Julia

DataLoaders.jl - A parallel iterator for large machine learning datasets that don't fit into memory inspired by PyTorch's `DataLoader` class.

opendylan - Open Dylan compiler and IDE

Chain.jl - A Julia package for piping a value through a series of transformation expressions using a more convenient syntax than Julia's native piping functionality.

julia-ml-from-scratch - Machine learning from scratch in Julia

StatsPlots.jl - Statistical plotting recipes for Plots.jl