mation-spec VS trax

Compare mation-spec and trax to see how they differ.

                mation-spec         trax
Mentions        4                   7
Stars           6                   7,962
Growth          -                   0.4%
Activity        4.4                 4.7
Latest commit   4 months ago        3 months ago
Language        JavaScript          Python
License         Apache License 2.0  Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
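The site does not publish how the activity score is computed, only that recent commits carry more weight than older ones. As a purely hypothetical illustration of that idea (the half-life parameter and exponential-decay form are assumptions, not the site's actual formula):

```python
import math

def activity_score(commit_ages_days, half_life_days=90.0):
    """Hypothetical recency-weighted activity: each commit contributes a
    weight that halves every `half_life_days`, so recent commits count
    more than older ones."""
    return sum(math.exp(-math.log(2) * age / half_life_days)
               for age in commit_ages_days)

# Three recent commits outscore three much older ones.
recent = activity_score([1, 5, 10])
stale = activity_score([300, 400, 500])
```

This captures only the stated property (recency weighting); the real score is presumably also normalized across all tracked projects to produce the 0-10 scale above.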

mation-spec

Posts with mentions or reviews of mation-spec. We have used some of these posts to build our list of alternatives and similar projects. The most recent was on 2024-02-14.
  • Show HN: FoldMation – An Interactive Origami Learning and Creation Application
    1 project | news.ycombinator.com | 14 Feb 2024
    Hi, I've created an application where you can follow step-by-step origami fold instructions, and a Creator where you can build these interactive folds yourself.

    Compared to video instructions, you can quickly skip or rewind steps and replay a complicated step as many times as you need.

    On the creation side, there have been one or two attempts at this before, but those solutions relied on mouse drags as the user interface, which greatly limited the kinds of folds possible. The foldMation Creator instead uses commands, keywords, and values to compose each step in a domain-specific language, and provides a (relatively speaking) easy-to-use interface for composing the steps.

    For those interested in using the Creator, please go through the tutorial at the top of the create page.

    By the way, the DSL for foldMation uses https://github.com/mationai/mation-spec. I created it because I couldn't find anything similar that would let me specify well-structured data with an English-like, readable syntax.

    Let me know what you think!

    1 project | news.ycombinator.com | 12 Jan 2024
    The DSL for foldMation uses https://github.com/mationai/mation-spec. I created it because I couldn't find anything similar that would let me specify well-structured data with an English-like, readable syntax.
  • Ohm: A library and language for building parsers, interpreters, compilers, etc.
    6 projects | news.ycombinator.com | 31 Oct 2023
    Ohm is a wonderful tool. I used it to create mation-spec [0], a readable structured configuration and specification format for automating and running code. I looked hard for something like it before giving up and creating one myself with the help of Ohm. mation-spec is the basis of an origami fold simulation language that describes and simulates origami folds. PM me if you'd like to see it before I post the simulator on HN.

    [0] https://github.com/mationai/mation-spec
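The posts above describe mation-spec only at a high level (a well-structured, English-like specification format built with Ohm). The snippet below is a purely hypothetical sketch of what such a format could look like; every keyword and name in it is invented for illustration and is not taken from the actual mation-spec grammar:

```
// Hypothetical, illustrative syntax only -- not actual mation-spec.
fold squareBase [
  crease from cornerA to cornerC
  fold cornerB onto cornerD
  repeat 2 times [
    flip model
    crease along centerLine
  ]
]
```

The point of such a design is that the file reads like step-by-step English instructions while still parsing into structured data a simulator can execute.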

  • Replit's new Code LLM was trained in 1 week
    12 projects | news.ycombinator.com | 3 May 2023
    Have you thought of finding or creating something like this [0]?

    I created this as the basis for my origami folding descriptive language. I tried to find something similar - my requirements being that it be both well structured and English-like - but couldn't find anything, so I created it.

    The origami folding app will hopefully be out in 2 weeks, so you can see how it's used.

    [0] https://github.com/fuzzthink/mation-spec

trax

Posts with mentions or reviews of trax. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-23.
  • Maxtext: A simple, performant and scalable Jax LLM
    10 projects | news.ycombinator.com | 23 Apr 2024
    Is t5x an encoder/decoder architecture?

    Some more general options.

    The Flax ecosystem

    https://github.com/google/flax?tab=readme-ov-file

    or dm-haiku

    https://github.com/google-deepmind/dm-haiku

    are some of the best-developed communities in the JAX AI field

    Perhaps the “trax” repo? https://github.com/google/trax

    Some HF examples https://github.com/huggingface/transformers/tree/main/exampl...

    Sadly it seems much of the work is proprietary these days, but one example could be Grok-1, if you customize the details. https://github.com/xai-org/grok-1/blob/main/run.py

  • Replit's new Code LLM was trained in 1 week
    12 projects | news.ycombinator.com | 3 May 2023
    and the implementation https://github.com/google/trax/blob/master/trax/models/resea... if you are interested.

    Hope you get to look into this!

  • RedPajama: Reproduction of Llama with Friendly License
    4 projects | news.ycombinator.com | 17 Apr 2023
    Thank you for developing the pipeline and amassing considerable compute for gathering and preprocessing this dataset!

    I'm not sure if this is the right place to ask about this, but could you consider training an LLM using a more advanced, sparse transformer architecture (specifically, "Terraformer" from this paper https://arxiv.org/abs/2111.12763 and this codebase https://github.com/google/trax/blob/master/trax/models/resea... by Google Brain and OpenAI)? I understand the pressure to focus on training a straightforward LLaMA replication, but of course you see that it's a legacy dense architecture which limits its inference performance. This new architecture is not just an academic curiosity but is already validated at scale by Google, providing 10x+ inference performance boost on the same hardware.

    Frankly, the community's compute budget - for training and for inference - isn't infinite, and neither is the public's interest in models that do not have an advantage (at least in convenience) over closed-source ones; so we should use both of those resources as efficiently as possible. It could be a big step forward if you trained at least LLaMA-Terraformer-7B and 13B foundation models on the whole dataset.
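The claimed inference speedup in the comment above comes from activating only a fraction of the feed-forward units per token. A back-of-the-envelope sketch of that arithmetic (the dimensions and active fraction below are illustrative assumptions, not numbers from the Terraformer paper):

```python
def ffn_flops(d_model, d_ff, active_fraction=1.0):
    """Approximate multiply-accumulate count for one token passing
    through a two-matrix feed-forward block (d_model -> d_ff -> d_model)
    when only a fraction of the d_ff units is actually computed."""
    active_ff = int(d_ff * active_fraction)
    # Two matmuls: in-projection (d_model x active_ff) and
    # out-projection (active_ff x d_model).
    return 2 * d_model * active_ff

dense = ffn_flops(d_model=4096, d_ff=16384)                           # standard dense FFN
sparse = ffn_flops(d_model=4096, d_ff=16384, active_fraction=0.0625)  # 1/16 of units active
speedup = dense / sparse
```

With 1/16 of the units active, the FFN multiply-accumulate count drops 16x; the paper's measured end-to-end gains are smaller and also depend on the controller overhead, attention sparsity, and hardware.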

  • The founder of Gmail claims that ChatGPT can “kill” Google in two years.
    1 project | /r/Futurology | 31 Jan 2023
    But a couple years later they came out with open source implementations yeah: https://github.com/google/trax/tree/master/trax/models/reformer
  • [D] Paper Explained - Sparse is Enough in Scaling Transformers (aka Terraformer) | Video Walkthrough
    1 project | /r/MachineLearning | 1 Dec 2021
    Code: https://github.com/google/trax/blob/master/trax/examples/Terraformer_from_scratch.ipynb
  • Why would I want to develop yet another deep learning framework?
    4 projects | /r/learnmachinelearning | 16 Sep 2021
  • How to train large models on a normal laptop?
    1 project | /r/LanguageTechnology | 14 Feb 2021
    Training language models is expensive. Train the biggest model you can afford. I assume you've tried the colab from the reformer GitHub: https://github.com/google/trax/tree/master/trax/models/reformer

What are some alternatives?

When comparing mation-spec and trax, you can also consider the following projects:

IF

flax - Flax is a neural network library for JAX that is designed for flexibility.

ReplitLM - Inference code and configs for the ReplitLM model family

dm-haiku - JAX-based neural network library

stat4701 - Final Project

muzero-general - MuZero

code-align-evals-data

ML-Optimizers-JAX - Toy implementations of some popular ML optimizers using Python/JAX

fauxpilot - FauxPilot - an open-source alternative to GitHub Copilot server

extending-jax - Extending JAX with custom C++ and CUDA code

text-generation-webui - A Gradio web UI for Large Language Models. Supports transformers, GPTQ, AWQ, EXL2, llama.cpp (GGUF), Llama models.

objax