I would say onnx.ai [0] provides more information about ONNX for those who aren’t working with ML/DL.
[0] https://onnx.ai
There's also a third-party WebGPU implementation: https://github.com/webonnx/wonnx
Would it be better to use https://github.com/tinygrad/tinygrad as an intermediary framework?
We wanted to use ONNX Runtime for a "model driver" for MD simulations, where any ML model can be used for molecular dynamics. The problem was that it was way too immature; for example, the ceiling function only works with single precision in ONNX. But the biggest issue was that we could not take derivatives in ONNX Runtime, so any complicated model that uses derivatives internally was a no-go. Does that limitation still exist? Do you know if it can take derivatives in training mode now?
Eventually we went with PyTorch-only support for the time being, while still exploring OpenXLA in place of ONNX as a universal adapter: https://github.com/ipcamit/colabfit-model-driver
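For context on the derivative limitation: since ONNX Runtime's standard inference API exposes no autograd, one workaround for force evaluation in MD (F = -dE/dx) is a central finite difference on the model's energy output. This is a minimal sketch, not the colabfit driver's actual approach; `energy` here is a hypothetical stand-in for an ONNX session call.

```python
# Hedged sketch: approximating forces by finite differences when the
# inference engine (e.g. ONNX Runtime) cannot compute derivatives.

def energy(x: float) -> float:
    # Placeholder potential E(x) = 0.5 * x^2; in practice this would
    # wrap something like session.run(...) on an exported ONNX model.
    return 0.5 * x * x

def force(x: float, h: float = 1e-5) -> float:
    # F = -dE/dx, approximated by a central difference.
    return -(energy(x + h) - energy(x - h)) / (2.0 * h)
```

For E(x) = 0.5 x^2 the exact force is -x, so `force(1.0)` should be close to -1.0. Finite differences cost one extra model evaluation per degree of freedom, which is why native autograd support (as in PyTorch or OpenXLA) matters for MD workloads.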