burn
AECforWebAssembly
| | burn | AECforWebAssembly |
|---|---|---|
| Mentions | 34 | 51 |
| Stars | 4,845 | 32 |
| Growth | - | - |
| Activity | 8.9 | 8.0 |
| Latest commit | 6 months ago | 6 days ago |
| Language | Rust | C++ |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
burn
-
Burn 0.10.0 Released 🔥 (Deep Learning Framework)
Release Note: https://github.com/burn-rs/burn/releases/tag/v0.10.0
- Deep Learning Framework in Rust: Burn 0.10.0 Released
-
Why Rust Is the Optimal Choice for Deep Learning, and How to Start Your Journey with the Burn Deep Learning Framework
The comprehensive, open-source deep learning framework in Rust, Burn, has recently undergone significant advancements in its latest release, highlighted by the addition of The Burn Book 🔥. There has never been a better moment to embark on your deep learning journey with Rust, as this book will guide you through your initial project, providing extensive explanations and links to relevant resources.
-
Candle: Torch Replacement in Rust
Burn (deep learning framework in rust) has WGPU backend (WebGPU) already. Check it out https://github.com/burn-rs/burn. It was released recently.
- Burn – A Flexible and Comprehensive Deep Learning Framework in Rust
-
Announcing Burn-Wgpu: New Deep Learning Cross-Platform GPU Backend
For more details about the latest release see the release notes: https://github.com/burn-rs/burn/releases/tag/v0.8.0.
-
Are there any ML crates that would compile to WASM?
Tract is the most well known ML crate in Rust, which I believe can compile to WASM - https://github.com/sonos/tract/. Burn may also be useful - https://github.com/burn-rs/burn.
-
Any working wgpu compute example that would run in a browser?
We, the burn team, are working on the wgpu backend (WebGPU) for Burn deep learning framework. You can check out the current state: https://github.com/burn-rs/burn/tree/main/burn-wgpu
-
I've fallen in love with rust so now what?
Here is the project: https://github.com/burn-rs/burn
-
Is anyone doing Machine Learning in Rust?
Disclaimer, I'm the main author of Burn https://burn-rs.github.io.
AECforWebAssembly
-
Gren 0.3: Source maps
Great! I have not yet made source maps for my programming language that compiles to WebAssembly, and I probably never will.
- Do you think people around you have a completely wrong idea of those who work in IT?
- What is the most absurd error message one of your programs ever printed?
-
What is the most absurd error message your compiler/interpreter was once outputting?
Up until today, my AEC-to-WebAssembly compiler did that if somebody tried to use two structures of different types as the second and the third operand of the ?: (ternary conditional) operator, as in this example: `Structure First Consists Of Nothing; EndStructure`
- Difficulties finding a job
-
Good languages for writing compilers in?
Well, I have written the first compiler for my programming language, targeting x86, in IE6-compatible JavaScript, and the second compiler, targeting WebAssembly, is written in C++11. I think that, to choose a language to write a compiler in, you need to look at at least two things:
-
Why does GCC run in Docker produce around 30% smaller statically linked C++ executables than GCC run on Linux? AECforWebAssembly is 1.08 MB large if compiled using GCC 13.1 in Docker, but it is 1.59 MB if compiled using GCC 13.1 on Debian.
You can see the releases v2.5.3 and v2.5.2 of AECforWebAssembly on GitHub. They are produced with the same version of GCC, the only difference (as far as I know) is that v2.5.2 was produced directly on Debian, whereas v2.5.3 was cross-compiled from Windows to Linux using Docker.
-
Let's Make Sure Github Doesn't Become the only Option
That could be true. I host my AEC-to-WebAssembly compiler on GitHub, GitLab and SourceForge, and it's only on GitHub that it has 21 stars and 2 forks. On GitLab and SourceForge, it has zero of both.
- How hard was it for you to find a job in programming?
-
Does the JVM / CLR even make sense nowadays?
Well, the main compiler for my programming language targets the JavaScript virtual machine by outputting WebAssembly. I think that is even better than targeting the Java Virtual Machine because, for one thing, your executables can run in any modern browser if you output WebAssembly; if you target the Java Virtual Machine, users need to actually download your app. Furthermore, there is an official assembler for WebAssembly, part of the WebAssembly Binary Toolkit (WABT), so your compiler can output assembly text and not have to deal with binary files. There is nothing equivalent for the Java Virtual Machine.
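To illustrate what such a compiler actually emits, here is a minimal hand-written WebAssembly text module (a sketch; the exported name `add` is arbitrary) that WABT's `wat2wasm` assembler can turn into a binary `.wasm` file:

```wat
(module
  ;; Export a function "add" that returns the sum of two 32-bit integers.
  (func (export "add") (param i32 i32) (result i32)
    local.get 0
    local.get 1
    i32.add))
```

Running `wat2wasm add.wat -o add.wasm` produces the binary module, which any modern browser can then load with `WebAssembly.instantiate` — so the compiler itself only ever has to emit readable text.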
What are some alternatives?
candle - Minimalist ML framework for Rust
Lark - Lark is a parsing toolkit for Python, built with a focus on ergonomics, performance and modularity.
dfdx - Deep learning in Rust, with shape checked tensors and neural networks
wasm-fizzbuzz - WebAssembly from Scratch: From FizzBuzz to DooM.
tch-rs - Rust bindings for the C++ api of PyTorch.
mal - mal - Make a Lisp
Graphite - 2D raster & vector editor that melds traditional layers & tools with a modern node-based, non-destructive, procedural workflow.
Drogon-torch-serve - Serve pytorch / torch models using Drogon
tract - Tiny, no-nonsense, self-contained, Tensorflow and ONNX inference [Moved to: https://github.com/sonos/tract]
libCat - 🐈‍⬛ A runtime for C++26 w/out libC or POSIX. Smaller binaries, only arena allocators, SIMD, stronger type safety than STL, and value-based errors!
L2 - l2 is a fast, Pytorch-style Tensor+Autograd library written in Rust
gdal-js - This is an Emscripten port of GDAL, an open source X/MIT licensed translator library for raster and vector geospatial data formats.