XNNPACK Alternatives
Similar projects and alternatives to XNNPACK
- doesitarm: 🦾 A list of reported app support for Apple Silicon as well as Apple M4 and M3 Ultra Macs
- AITemplate: a Python framework that renders neural networks into high-performance CUDA/HIP C++ code, specialized for FP16 TensorCore (NVIDIA GPU) and MatrixCore (AMD GPU) inference
- conda: a system-level, binary package and environment manager running on all major operating systems and platforms
- ncnn: a high-performance neural network inference framework optimized for mobile platforms
- Awesome-Rust-MachineLearning: a list of machine learning libraries written in Rust, compiling GitHub repositories, blogs, books, videos, discussions, papers, etc. 🦀
XNNPACK discussion
XNNPACK reviews and mentions
- Ask HN: If you are a Machine Learning engineer, what do you do at work?
A lot do. Personally, every single time I try to go back to conda/mamba or whatever, I get some extremely weird C/C++-related linking bug. Just recently, I ran into an issue where the environment was _almost_ completely isolated from the OS distro's C/C++ build infra, except for LD, which was apparently so old it was missing the vpdpbusd instruction (https://github.com/google/XNNPACK/issues/6389). Except the thing was, that wouldn't happen when building outside of the conda environment. Very confusing. Standard virtualenvs are boring but nearly always work as expected in comparison.
- Xnnpack: High-efficiency floating-point neural network inference operators
- Can a NPU be used for vectors?
- Performance critical ML: How viable is Rust as an alternative to C++
Why are you writing your own inference code in C++ or Rust instead of using some kind of established framework like XNNPACK?
- [P] Pure C/C++ port of OpenAI's Whisper
- [Discussion] Is XNNPACK a part of mediapipe? Or should it be additionally configured with mediapipe?
XNNPACK - https://github.com/google/XNNPACK
- WebAssembly Techniques to Speed Up Matrix Multiplication by 120x
- Prediction: Macs won't see many new games, no matter how powerful their hardware is
Ok, concrete example time! At work, we're going to be using some software which includes XNNPACK, which is a library of highly-optimised operations for doing neural-network inference. This is the sort of thing where people have gone in and specifically tuned for performance, and nope, there's no attempt at all made to have code which is different for Intel/AMD or Apple/Other ARM. What they target is elements of the ISA, like NEON (i.e. ARM SIMD) and SSE, AVX etc. on x86(-64). And Wasm SIMD for Wasm.
- Where are Nvidia's DLSS models stored and how big are they?
It's quite simple. https://github.com/google/XNNPACK for example.
Stats
google/XNNPACK is an open source project licensed under the BSD 3-Clause License, which is an OSI-approved license.
The primary programming language of XNNPACK is C.