I just released my own auto-diff library called niura (it's unstable and unsafe at the moment), and I've been looking for a simple, Rust-compatible way to do GPU-accelerated matrix multiplication. Could you recommend something in that regard?
Take a look at arrayfire-rust! :)
How does it compare with https://github.com/spearow/juice and https://github.com/neuronika/neuronika?
What is the benefit of this compared to using bindings or a wrapper around TensorFlow, or other ML libraries written in C/C++, such as the community-hosted Rust project on TensorFlow's GitHub? If it's just for fun, that's a valid enough reason imo. I'm just curious, since you describe it as a better TensorFlow because of the typing versus the Python wrapper, when there already exist ways to interact with TensorFlow from Rust and other statically typed languages, including C++ (officially supported), C#, Haskell, and Scala, and probably bindings for more niche languages not mentioned in the documentation.
Related posts
- Intel CEO: 'The entire industry is motivated to eliminate the CUDA market'
- Do you consider making a physics engine (for RL) worth it?
- Where to Learn Vulkan for parallel computation (with references to porting from CUDA)
- Any role that Rust could have in the Data world (Big Data, Data Science, Machine learning, etc.)?
- Announcing neuronika 0.1.0, a deep learning framework in Rust