Oceananigans.jl vs MITgcm

Compare Oceananigans.jl vs MITgcm and see what their differences are.

Oceananigans.jl

🌊 Julia software for fast, friendly, flexible, ocean-flavored fluid dynamics on CPUs and GPUs (by CliMA)
                Oceananigans.jl    MITgcm
Mentions        4                  1
Stars           875                313
Growth          1.6%               3.5%
Activity        9.5                8.7
Latest commit   5 days ago         6 days ago
Language        Julia              Fortran
License         MIT License        MIT License
Mentions - the total number of mentions we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed; recent commits are weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
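
The exact activity formula isn't published here. Purely as an illustrative sketch (not the site's actual algorithm), a recency-weighted score could let each commit count exponentially less as it ages and then rank projects by percentile; the Julia helpers raw_score and activity below are hypothetical names.

    # Hypothetical illustration only -- not the site's actual scoring algorithm.
    # Each commit contributes exp(-age/τ), so recent commits count for more than
    # older ones; τ (in days) sets how quickly past work stops mattering.
    raw_score(ages_days; τ = 30.0) = sum(exp(-age / τ) for age in ages_days)

    # Map a raw score to a 0-10 activity value via its percentile rank among all
    # tracked projects, so that 9.0 lands in roughly the top 10%.
    activity(raw, all_raw) = 10 * count(<=(raw), all_raw) / length(all_raw)

    # Example: mostly recent commits vs. mostly old commits (ages in days).
    busy  = raw_score([1, 2, 3, 5, 8, 13])
    quiet = raw_score([90, 120, 200, 365])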

Oceananigans.jl

Posts with mentions or reviews of Oceananigans.jl. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2023-12-27.
  • Julia 1.10 Released
    15 projects | news.ycombinator.com | 27 Dec 2023
    I think it’s also the design philosophy. JuMP and ForwardDiff are great success stories and are packages very light on dependencies. I like those.

    The DiffEq library seems to pull you towards the SciML ecosystem and that might not be agreeable to everyone.

    For instance, a well-known Julia project that simulates differential equations seems to have implemented its own solvers:

    https://github.com/CliMA/Oceananigans.jl

  • GPU vendor-agnostic fluid dynamics solver in Julia
    11 projects | news.ycombinator.com | 8 May 2023
    I'm currently playing around with Oceananigans.jl (https://github.com/CliMA/Oceananigans.jl). Do you know how the two are similar or different?

    Oceananigans.jl has really intuitive, step-by-step examples and a great Discussions page on GitHub (a minimal usage sketch follows this list of posts).

  • Supercharged high-resolution ocean simulation with Jax
    5 projects | news.ycombinator.com | 5 Dec 2021
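
To ground the two Oceananigans.jl threads above, here is a minimal usage sketch in the spirit of the package's documented two-dimensional turbulence example, assuming a recent Oceananigans.jl release; grid size, extent, and time step are arbitrary illustrations. It shows the two points raised in the comments: the model advances its own equations without pulling in DifferentialEquations.jl, and moving from CPU to GPU execution is a one-argument change to the grid constructor.

    using Oceananigans

    # Choose the architecture: CPU() here; swap in GPU() to target a GPU.
    arch = CPU()

    # A doubly periodic two-dimensional grid (size and extent are illustrative).
    grid = RectilinearGrid(arch, size=(128, 128), x=(0, 2π), y=(0, 2π),
                           topology=(Periodic, Periodic, Flat))

    # A nonhydrostatic model with WENO advection; time stepping is built in.
    model = NonhydrostaticModel(; grid, advection=WENO())

    # Seed the velocity field with random noise to spin up 2D turbulence.
    u, v, w = model.velocities
    set!(model, u=rand(size(u)...), v=rand(size(v)...))

    # Build and run a short simulation with the package's own time stepper.
    simulation = Simulation(model, Δt=0.01, stop_time=1)
    run!(simulation)

Swapping CPU() for GPU() runs the same script on a GPU (historically via CUDA, with other back ends depending on the release), which is the point of comparison raised in the "GPU vendor-agnostic" thread above.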

MITgcm

Posts with mentions or reviews of MITgcm. We have used some of these posts to build our list of alternatives and similar projects. The most recent mention was on 2022-12-21.
  • Greenland's glaciers are melting 100 times faster than estimated
    3 projects | /r/Futurology | 21 Dec 2022
    Observational data used in this study are available at Sutherland et al. (2019b) for LeConte Bay, Straneo (2022) for Helheim Glacier/Sermilik Fjord, and Rignot and Schulz (2022) for Store Fjord. The plume model used in this study builds on code developed by Cowton et al. (2015) and is distributed by Tom Cowton through his publicly available open-source GitHub repository https://github.com/tcowton/iceplume and the open-source MITgcm checkpoint 65m archive https://github.com/MITgcm/MITgcm/archive/checkpoint65m.zip. Our modifications to the iceplume package are available at https://github.com/KikiSchulz/iceplume_mod.

What are some alternatives?

When comparing Oceananigans.jl and MITgcm, you can also consider the following projects:

MATDaemon.jl

E3SM - Energy Exascale Earth System Model source code. NOTE: use "maint" branches for your work. Head of master is not validated.

FiniteDiff.jl - Fast non-allocating calculations of gradients, Jacobians, and Hessians with sparsity support

lightcurve-of-the-day - Animated transit lightcurve posted once a day to twitter

Metal.jl - Metal programming in Julia

JuliaComputation - Repository for Common Ground C25

opendylan - Open Dylan compiler and IDE

iceplume

julia-ml-from-scratch - Machine learning from scratch in Julia

iceplume_mod - Modification to the iceplume package by Tom Cowton, see GRL paper "An improved and observationally-constrained melt rate parameterization for vertical ice fronts of marine terminating glaciers" by Schulz, Nguyen, Pillar

XLA.jl - "Maybe we have our own magic."

ClimateMachine.jl - Climate Machine: an Earth System Model that automatically learns from data