keras-cv VS returnn

Compare keras-cv vs returnn and see what their differences are.

keras-cv

Industry-strength Computer Vision workflows with Keras (by keras-team)

returnn

The RWTH extensible training framework for universal recurrent neural networks (by rwth-i6)
                 keras-cv                                   returnn
Mentions         4                                          4
Stars            946                                        349
Stars growth     3.5%                                       0.6%
Activity         9.3                                        9.8
Latest commit    3 days ago                                 23 days ago
Language         Python                                     Python
License          GNU General Public License v3.0 or later   GNU General Public License v3.0 or later
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have a higher weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects that we are tracking.

keras-cv

Posts with mentions or reviews of keras-cv. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-07-11.

returnn

Posts with mentions or reviews of returnn. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-07-11.
  • Keras Core: Keras for TensorFlow, Jax, and PyTorch
    5 projects | news.ycombinator.com | 11 Jul 2023
    That looks very interesting.

    I have actually developed (and am still developing) something very similar: what we call the RETURNN frontend, a new frontend plus new backends for our RETURNN framework. The new frontend supports very similar Python code to define models as you see in PyTorch or Keras, i.e. a core Tensor class, a base Module class you can derive from, a Parameter class, and then a core functional API to perform all the computations. It supports multiple backends, currently mostly TensorFlow (graph-based) and PyTorch, but JAX is also planned. Some details here: https://github.com/rwth-i6/returnn/issues/1120

    (Note that we went a bit further and made named dimensions a core principle of the framework.)

    (Example beam search implementation: https://github.com/rwth-i6/i6_experiments/blob/14b66c4dc74c0...)
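
    To illustrate the named-dimension idea, here is a minimal self-contained sketch; the Dim and NamedTensor classes below are hypothetical stand-ins for illustration, not RETURNN's actual API:

    ```python
    from dataclasses import dataclass
    import numpy as np

    @dataclass(eq=False)  # eq=False: dims compare by object identity, not by value
    class Dim:
        name: str
        size: int

    @dataclass
    class NamedTensor:
        dims: tuple       # tuple of Dim objects, one per axis
        data: np.ndarray

    def reduce_sum(x: NamedTensor, axis: Dim) -> NamedTensor:
        """Reduce over a dimension selected by identity, not by position."""
        i = x.dims.index(axis)
        return NamedTensor(x.dims[:i] + x.dims[i + 1:], x.data.sum(axis=i))

    batch, time, feat = Dim("batch", 8), Dim("time", 20), Dim("feature", 64)
    x = NamedTensor((batch, time, feat), np.zeros((8, 20, 64)))
    pooled = reduce_sum(x, axis=time)  # no fragile positional axis numbers
    assert pooled.dims == (batch, feat)
    ```

    The point is that code written this way keeps working no matter how the axes happen to be ordered in memory.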

    One difficulty I found was how to design the API in a way that works well both for eager-mode frameworks (PyTorch, TF eager mode) and graph-based frameworks (TF graph mode, JAX). That mostly involves anything with state, or code that should not just execute in the inner training loop but e.g. only at initialization, after each epoch, or similar. So for example:

    - Parameter initialization.

    - Anything involving buffers, e.g. batch normalization.

    - Other custom training loops? Or e.g. an outer loop and an inner loop (e.g. like GAN training)?

    - How to implement something like weight normalization? In PyTorch, the module.param is renamed, and then there is a pre-forward hook which recalculates module.param on the fly for each call to forward (see the sketch after this list). So, just follow the same logic for both eager mode and graph mode?

    - How to deal with control-flow contexts, accessing values outside the loop which came from inside, etc.? Those things are naturally possible in eager mode, where you would get the most recent value and there is no real control-flow context.

    - Device logic: have the device defined explicitly for each tensor (like PyTorch), or automatically move tensors eagerly to the GPU (like TensorFlow)? Should moving from one device to another (or to CPU) be automatic or explicit?
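
    On the weight-normalization point, here is a condensed sketch of the PyTorch mechanism described above, mirroring what torch.nn.utils.weight_norm does but simplified to a single scalar norm; the WeightNormHook class is my own illustration, not library code:

    ```python
    import torch

    class WeightNormHook:
        """Reparameterize weight as g * v / ||v||, recomputed before each forward."""

        def __init__(self, name: str = "weight"):
            self.name = name

        @staticmethod
        def apply(module: torch.nn.Module, name: str = "weight"):
            hook = WeightNormHook(name)
            weight = getattr(module, name)
            del module._parameters[name]  # 'weight' becomes a derived attribute
            module.register_parameter(name + "_g",
                                      torch.nn.Parameter(weight.norm().detach()))
            module.register_parameter(name + "_v",
                                      torch.nn.Parameter(weight.detach().clone()))
            module.register_forward_pre_hook(hook)  # runs before every forward
            return module

        def __call__(self, module, inputs):
            g = getattr(module, self.name + "_g")
            v = getattr(module, self.name + "_v")
            setattr(module, self.name, g * v / v.norm())

    lin = WeightNormHook.apply(torch.nn.Linear(4, 4))
    out = lin(torch.randn(2, 4))  # weight is recomputed inside the hook
    ```

    In eager mode the hook simply runs as Python; the open question above is what the equivalent should look like when the whole computation is staged into a graph.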

    I see that you have keras_core.callbacks.LambdaCallback which is maybe similar, but can you effectively update the logic of the module in there?
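
    For reference, usage of such a callback might look like this minimal sketch; the toy model, the layer name "dense", and the weight-rescaling step are made up for illustration, and whether this counts as effectively updating the module's logic is exactly the question:

    ```python
    import numpy as np
    import keras_core as keras  # the package discussed in the post

    model = keras.Sequential([keras.layers.Dense(1, name="dense")])
    model.compile(optimizer="sgd", loss="mse")

    def on_epoch_end(epoch, logs):
        # Illustrative state update between epochs: rescale the kernel.
        layer = model.get_layer("dense")
        w, b = layer.get_weights()
        layer.set_weights([w * 0.99, b])

    x, y = np.random.randn(32, 4), np.random.randn(32, 1)
    model.fit(x, y, epochs=2,
              callbacks=[keras.callbacks.LambdaCallback(on_epoch_end=on_epoch_end)])
    ```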

  • Python’s “Type Hints” are a bit of a disappointment to me
    15 projects | news.ycombinator.com | 21 Apr 2022
    > warnings of IDEs are simple to ignore

    This is unusual. In my experience with codebases I have worked on or seen, when there are type hints, they are almost always perfectly correct.

    Also, you can set up the CI to also check for IDE warnings. For example, we use this script for PyCharm: https://github.com/rwth-i6/returnn/blob/master/tests/pycharm...

    The test for PyCharm inspections only passes when there are no warnings.

    Although, I have to admit, we explicitly exclude type warnings because we have a couple of false positives there. So in this respect, it actually agrees with the article.

    But then we also do code review and there we are strict about having it all correct.

    Yes, I see the article's argument that typing in Python is not perfect and you can easily fool it if you want, so you cannot trust the types 100%. But with good standard practice, it will only rarely happen that a type is not as expected, and typing helps a lot. IDE type warnings and mypy checks are still useful tools that catch bugs for you; maybe not 100% of all typing bugs, but perhaps 80% of them.

    > Isn’t it better to detect at least some errors than to detect none at all?
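
    A tiny illustration of the kind of bug such checks catch statically, before the code ever runs (the mean function is made up for the example):

    ```python
    def mean(xs: list[float]) -> float:
        return sum(xs) / len(xs)

    mean("0123")  # flagged by mypy / PyCharm: str is not list[float]
    ```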

  • How to cleanup a branch (PR) with huge number of commits
    1 project | dev.to | 1 Sep 2021
    I was trying to implement a new feature in a larger, somewhat messy project (RETURNN, but that's not so relevant).
    1 project | /r/learnprogramming | 1 Sep 2021
    So I created a new branch, also made a GitHub draft PR (here), and started working on it.

What are some alternatives?

When comparing keras-cv and returnn you can also consider the following projects:

stable-diffusion-tensorflow - Stable Diffusion in TensorFlow / Keras

punctuator2 - A bidirectional recurrent neural network model with attention mechanism for restoring missing punctuation in unsegmented text

keras-core - A multi-backend implementation of the Keras API, with support for TensorFlow, JAX, and PyTorch.

enforce - Python 3.5+ runtime type checking for integration testing and data validation

i6_experiments

keras-nlp - Modular Natural Language Processing workflows with Keras

recurrent-fwp - Official repository for the paper "Going Beyond Linear Transformers with Recurrent Fast Weight Programmers" (NeurIPS 2021)

typeguard - Run-time type checker for Python