| | run-wasm | onnxjs |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 470 | 1,565 |
| Growth | 0.0% | - |
| Activity | 3.6 | 4.9 |
| Latest commit | almost 3 years ago | over 3 years ago |
| Language | TypeScript | TypeScript |
| License | Apache License 2.0 | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Run WASM, a client-side Python runtime
TensorFlow (and by extension Keras) offloads most of the actual work to C++ or C, so compiling those to WebAssembly would (I imagine) be a herculean effort.
Instead, the TF team maintains TFJS, which can run on WebAssembly[0].
There are also tractjs[1] and onnxjs[2], both of which allow you to run (most) ONNX models (ONNX being an open standard for specifying ML models) using WebAssembly; onnxjs additionally supports WebGL. A bunch of frameworks (Caffe, PyTorch, TF) support exporting to/importing from ONNX.
[0] https://blog.tensorflow.org/2020/03/introducing-webassembly-...
[1] https://github.com/bminixhofer/tractjs
[2] https://github.com/microsoft/onnxjs
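To make the onnxjs workflow above concrete, here is a minimal sketch of running an ONNX model in the browser with its WebAssembly backend. The model path `./model.onnx` and the `[1, 3, 224, 224]` input shape are assumptions standing in for a typical image classifier; the `InferenceSession`/`Tensor` usage follows onnxjs's documented API.

```javascript
// Pure helper: index of the largest score in the model's output vector.
function argmax(scores) {
  let best = 0;
  for (let i = 1; i < scores.length; i++) {
    if (scores[i] > scores[best]) best = i;
  }
  return best;
}

// Hedged sketch: classify one preprocessed image with onnxjs.
// Assumptions: "./model.onnx" is a hypothetical model served next to the
// page, and `pixels` is a Float32Array already normalized to the model's
// expected [1, 3, 224, 224] layout.
async function classify(pixels) {
  // Dynamic import so this file loads even where onnxjs isn't installed.
  const { InferenceSession, Tensor } = await import('onnxjs');

  // backendHint: 'wasm' picks the WebAssembly backend; 'webgl' is the
  // GPU-accelerated alternative mentioned in the comment above.
  const session = new InferenceSession({ backendHint: 'wasm' });
  await session.loadModel('./model.onnx');

  const input = new Tensor(pixels, 'float32', [1, 3, 224, 224]);
  const outputMap = await session.run([input]); // Map of output name -> Tensor
  const scores = outputMap.values().next().value.data;
  return argmax(scores); // predicted class index
}
```

Swapping `backendHint` to `'webgl'` is the only change needed to try the GPU path; the rest of the session API is identical.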
- AppleNeuralHash2ONNX: Convert Apple NeuralHash Model for CSAM Detection to ONNX
What are some alternatives?
video-transcoder - Android app for video and audio transcoding, based on FFmpeg
AppleNeuralHash2ONNX - Convert Apple NeuralHash model for CSAM Detection to ONNX.
wasup - A zero-dependency, isomorphic library for emitting WebAssembly
tractjs - Run ONNX and TensorFlow inference in the browser.
onnxruntime - ONNX Runtime: cross-platform, high performance ML inferencing and training accelerator