Matrixmultiply Alternatives
Similar projects and alternatives to matrixmultiply
-
Graal
GraalVM compiles Java applications into native executables that start instantly, scale fast, and use fewer compute resources 🚀
-
rust-ndarray
ndarray: an N-dimensional array with array views, multidimensional slicing, and efficient operations
-
weave
A state-of-the-art multithreading runtime: message-passing based, fast, scalable, ultra-low overhead (by mratsim)
matrixmultiply reviews and mentions
-
Help understanding the state of ndarrays and linalg in Rust.
The matrixmultiply crate from the ndarray author (https://github.com/bluss/matrixmultiply) is one such implementation. It uses the same algorithm as the BLIS project (https://www.cs.utexas.edu/users/flame/pubs/TOMS-BLIS-Analytical.pdf) to partition the problem and exploit the cache hierarchy. It isn't as well tuned as e.g. Intel MKL or BLIS, but the results are very respectable.
-
faer 0.8.0 release
Do you plan to support integers as native types? I know the matrixmultiply crate has an open issue for that; it seems integer support can be problematic because of overflow.
-
Faster `matrixmultiply` ?
There's a famous crate [matrixmultiply](https://github.com/bluss/matrixmultiply) for matrix-matrix multiplication in Rust. But it's a bit slow for me.
-
Nim vs Rust Benchmarks
In my benchmarks, Nim is faster than Rust:
- Multithreading runtime (i.e. Rayon vs Weave: https://github.com/mratsim/weave)
- Cryptography: https://hackmd.io/@gnark/eccbench#Pairing
- Scientific computing / matrix multiplication: https://github.com/bluss/matrixmultiply/issues/34#issuecomme...
There is no inherent reason why a Nim program would be slower than Rust.
Stats
bluss/matrixmultiply is an open source project licensed under the Apache License 2.0, which is an OSI-approved license.
The primary programming language of matrixmultiply is Rust.
Popular Comparisons
- matrixmultiply VS weave
- matrixmultiply VS rust-ndarray
- matrixmultiply VS Programming-Language-Benchmarks
- matrixmultiply VS Programming-Language-Benchmark
- matrixmultiply VS Graal
- matrixmultiply VS faer-rs
- matrixmultiply VS matrixmultiply_mt
- matrixmultiply VS cblas-sys
- matrixmultiply VS rust
- matrixmultiply VS nalgebra