onnx

Open standard for machine learning interoperability (by onnx)

Onnx Alternatives

Similar projects and alternatives to onnx

NOTE: The number of mentions on this list reflects how often a project is mentioned in posts that also discuss onnx, plus user-suggested alternatives. Hence, a higher count suggests a better onnx alternative or greater similarity.

onnx reviews and mentions

Posts with mentions or reviews of onnx. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-06-13.
  • Using Google Magika to build an AI-powered file type detector
    4 projects | dev.to | 13 Jun 2024
    To perform fast inference at runtime, Magika uses the cross-platform Open Neural Network Exchange (ONNX) runtime. ONNX provides a method to optimize, accelerate, and deploy models built using any of the popular frameworks consistently, even across different hardware platforms or instruction set architectures.
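    As a rough illustration of what such runtime inference looks like, here is a minimal ONNX Runtime sketch in Python; the model path, input name, and feature shape are placeholders, not Magika's actual files or tensors.

    ```python
    # Minimal ONNX Runtime inference sketch ("model.onnx" and the feature
    # shape are illustrative placeholders, not Magika's actual model).
    import numpy as np
    import onnxruntime as ort

    session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
    input_name = session.get_inputs()[0].name        # discover the graph's input tensor name

    features = np.zeros((1, 1536), dtype=np.float32)  # dummy feature vector with an assumed shape
    outputs = session.run(None, {input_name: features})
    print(outputs[0])                                  # raw model output (e.g. per-class scores)
    ```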
  • Nvidia and Salesforce double down on AI startup Cohere in $450M round
    1 project | news.ycombinator.com | 4 Jun 2024
    Right; but you can't cross-compile everything. This is really common in AI libraries, especially multi-target projects like ONNX: https://onnx.ai/

    The math probably adds up in Google's favor with the TPUs, even if they end up being less efficient and slower per-unit than Nvidia hardware. They don't need to pay for the margins, and they can run them 24/7 for their intended purpose. The previous-generation TPUs can't be reused or resold for other purposes though, and if/when AI blows over as a trend you probably can't easily start mining crypto or doing HPC calculations like an Nvidia cluster would.

  • HuggingFace hacked – Space secrets leak disclosure
    1 project | news.ycombinator.com | 1 Jun 2024
    > I had assumed model files were big matrices of numbers and some metadata perhaps

    ONNX [1] is more or less this, but the challenge you immediately run into is models with custom layers/operators with their own inference logic - you either have to implement those operators in terms of the supported ops (not necessarily practical or viable) or provide the implementation of the operator to the runtime, putting you back at square one.

    [1] https://onnx.ai/
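    To check whether a model depends on such custom operators, one hedged approach is to walk the graph with the core onnx Python API and flag nodes whose domain is not a standard ONNX operator set; the file name below is a placeholder.

    ```python
    # Sketch: list operators in an ONNX graph and flag non-standard domains
    # ("model.onnx" is a placeholder path).
    import onnx

    model = onnx.load("model.onnx")            # an ONNX model is a protobuf (ModelProto)
    for node in model.graph.node:
        domain = node.domain or "ai.onnx"      # an empty domain means the default ONNX op set
        if domain not in ("ai.onnx", "ai.onnx.ml"):
            print(f"custom operator: {node.op_type} (domain={domain})")
    ```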

  • Onyx, a new programming language powered by WebAssembly
    4 projects | news.ycombinator.com | 1 Dec 2023
  • From Lab to Live: Implementing Open-Source AI Models for Real-Time Unsupervised Anomaly Detection in Images
    4 projects | dev.to | 15 Oct 2023
    Once your model has been trained and validated using Anomalib, the next step is to prepare it for real-time implementation. This is where ONNX (Open Neural Network Exchange) or OpenVINO (Open Visual Inference and Neural network Optimization) comes into play.
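    As a generic illustration of that step (not Anomalib's actual export code), exporting a trained PyTorch model to ONNX is usually a single call to torch.onnx.export; the model and shapes below are stand-ins.

    ```python
    # Generic PyTorch-to-ONNX export sketch (model and input shape are
    # placeholders, not Anomalib's actual export pipeline).
    import torch
    import torchvision

    model = torchvision.models.resnet18(weights=None).eval()
    dummy_input = torch.randn(1, 3, 224, 224)        # example input that fixes the graph shapes

    torch.onnx.export(
        model, dummy_input, "anomaly_model.onnx",
        input_names=["input"], output_names=["output"],
        dynamic_axes={"input": {0: "batch"}},        # allow a variable batch dimension
        opset_version=17,
    )
    ```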
  • Object detection with ONNX, Pipeless and a YOLO model
    2 projects | dev.to | 20 Sep 2023
    ONNX is an open format, hosted by the Linux Foundation, for representing machine learning models. It is widely adopted in the machine learning community and is compatible with most major frameworks, such as PyTorch and TensorFlow. Converting a model from any of those frameworks to ONNX is straightforward and can usually be done with a single command, as sketched below.
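    For example, on the TensorFlow side the tf2onnx converter does the conversion in one call (or via `python -m tf2onnx.convert` on the command line); the tiny Keras model below is a stand-in, not one from the post.

    ```python
    # Sketch: converting a Keras model to ONNX with tf2onnx
    # (the model and shapes are illustrative placeholders).
    import tensorflow as tf
    import tf2onnx

    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(224, 224, 3), name="input"),
        tf.keras.layers.Conv2D(8, 3, activation="relu"),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    spec = (tf.TensorSpec((None, 224, 224, 3), tf.float32, name="input"),)
    onnx_model, _ = tf2onnx.convert.from_keras(model, input_signature=spec,
                                               opset=13, output_path="model.onnx")
    ```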
  • 38TB of data accidentally exposed by Microsoft AI researchers
    3 projects | news.ycombinator.com | 18 Sep 2023
    ONNX[0], model-as-protobufs, continuing to gain adoption will hopefully solve this issue.

    [0] https://github.com/onnx/onnx
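    Part of the appeal of model-as-protobufs is that a .onnx file can be parsed and validated as structured data; a brief sketch using the core onnx API, with a placeholder path:

    ```python
    # Sketch: load and validate an ONNX file as structured protobuf data
    # ("model.onnx" is a placeholder path).
    import onnx

    model = onnx.load("model.onnx")        # parses the ModelProto protobuf
    onnx.checker.check_model(model)        # validates the graph against the ONNX spec
    print(model.ir_version, model.producer_name, model.opset_import)
    ```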

  • Reddit’s LLM text model for Ads Safety
    1 project | /r/RedditEng | 13 Sep 2023
    Running inference for large models on CPU is not a new problem and fortunately there has been great development in many different optimization frameworks for speeding up matrix and tensor computations on CPU. We explored multiple optimization frameworks and methods to improve latency, namely TorchScript, BetterTransformer and ONNX.
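    One common way to chase CPU latency with ONNX Runtime is to tune its session options (thread counts and graph optimization level); a hedged sketch with illustrative values, not the configuration described in the post:

    ```python
    # Sketch: CPU-oriented ONNX Runtime session tuning (values and model path
    # are illustrative, not the setup from the Reddit post).
    import onnxruntime as ort

    opts = ort.SessionOptions()
    opts.intra_op_num_threads = 4          # threads used inside individual operators
    opts.graph_optimization_level = ort.GraphOptimizationLevel.ORT_ENABLE_ALL

    session = ort.InferenceSession("classifier.onnx", sess_options=opts,
                                   providers=["CPUExecutionProvider"])
    ```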
  • Operationalize TensorFlow Models With ML.NET
    5 projects | dev.to | 17 Aug 2023
    ONNX is a format for representing machine learning models in a portable way. Additionally, ONNX models can be easily optimized and thus become smaller and faster.
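    One concrete example of "smaller and faster" is post-training dynamic quantization with ONNX Runtime's quantization tools; the file names below are placeholders.

    ```python
    # Sketch: dynamic (post-training) quantization of an ONNX model
    # (file names are placeholders).
    from onnxruntime.quantization import QuantType, quantize_dynamic

    quantize_dynamic(
        model_input="model.onnx",
        model_output="model.int8.onnx",
        weight_type=QuantType.QInt8,       # store weights as 8-bit integers
    )
    ```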
  • Onnx Runtime: “Cross-Platform Accelerated Machine Learning”
    5 projects | news.ycombinator.com | 25 Jul 2023
    I would say onnx.ai [0] provides more information about ONNX for those who aren’t working with ML/DL.

    [0] https://onnx.ai

Stats

Basic onnx repo stats
Mentions: 41
Stars: 17,117
Activity: 9.4
Last commit: 2 days ago
