mastermind vs neural-network-from-scratch

| | mastermind | neural-network-from-scratch |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 5 | 114 |
| Growth | - | - |
| Activity | 4.1 | 0.0 |
| Last Commit | 2 months ago | about 2 years ago |
| Language | Rust | Rust |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
mastermind

Mastermind Solver
Mastermind intrigued me some time ago, much as it did the author, and I've used it as a standard problem when trying out new computational frameworks and methods ever since.
Here is my Rust version with multi-threading, SIMD, and WASM, running on your device inside a web app: https://0xbe7a.github.io/mastermind/
Repo: https://github.com/0xbe7a/mastermind
It is quite fast (1,839,202,304 position pairs evaluated in 1652 ms on my device) and can also exploit some symmetries in the solution space.
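The "position pairs" being evaluated are (guess, secret) combinations, and the hot inner loop of any such solver is the peg-scoring routine. As a rough sketch (not the repo's actual code; the 4-peg/6-color sizes and all names are illustrative assumptions), it might look like this:

```rust
// Hypothetical sketch of the core scoring routine a Mastermind solver
// evaluates for every (guess, secret) position pair: count exact
// matches (black pegs) and color-only matches (white pegs).
// Assumes 4 pegs and 6 colors encoded as 0..6.

const PEGS: usize = 4;
const COLORS: usize = 6;

fn score(guess: &[u8; PEGS], secret: &[u8; PEGS]) -> (u8, u8) {
    let mut black = 0u8;
    let mut guess_counts = [0u8; COLORS];
    let mut secret_counts = [0u8; COLORS];
    for i in 0..PEGS {
        if guess[i] == secret[i] {
            black += 1;
        } else {
            guess_counts[guess[i] as usize] += 1;
            secret_counts[secret[i] as usize] += 1;
        }
    }
    // White pegs: per-color overlap among the non-exact positions.
    let white: u8 = (0..COLORS)
        .map(|c| guess_counts[c].min(secret_counts[c]))
        .sum();
    (black, white)
}

fn main() {
    // Two exact matches in front, two colors swapped at the back.
    let (b, w) = score(&[0, 1, 2, 3], &[0, 1, 3, 2]);
    println!("{} black, {} white", b, w);
}
```

Because this function is branch-light and operates on small fixed-size arrays, it is a natural candidate for the SIMD and multi-threading the solver uses to churn through billions of pairs.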
neural-network-from-scratch

Examine individual neurons of a small neural network in the browser

Language models can explain neurons in language models
Language models can explain neurons in language models
I built a toy neural network that runs in the browser[1] to model 2D functions, with the goal of doing something similar to this research (in a much more limited manner, of course). Since the input space is so much smaller than that of a language model, it's possible to examine each neuron's output for all possible inputs, and in a continuous manner.
In some cases, you can clearly see neurons that specialize to different areas of the function being modeled, like this one: https://i.ameo.link/b0p.png
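The key property making this feasible is that the input is just (x, y), so a neuron's entire response surface can be sampled on a dense grid. A minimal sketch of the idea (the weights and function names here are made up for illustration, not taken from the linked project):

```rust
// Probe one hidden neuron of a tiny 2-input network exhaustively:
// because the input space is only (x, y), evaluating the neuron on a
// dense grid recovers its complete response surface, which a UI can
// render as a heatmap like the one linked above.

fn neuron_activation(x: f32, y: f32) -> f32 {
    // One hidden neuron: weighted sum of the two inputs plus bias, then ReLU.
    let (w_x, w_y, bias) = (1.5, -0.75, 0.1); // hypothetical learned weights
    (w_x * x + w_y * y + bias).max(0.0)
}

fn main() {
    // Sample the neuron over [-1, 1] x [-1, 1].
    let n = 5;
    for i in 0..n {
        let y = -1.0 + 2.0 * i as f32 / (n - 1) as f32;
        let row: Vec<String> = (0..n)
            .map(|j| {
                let x = -1.0 + 2.0 * j as f32 / (n - 1) as f32;
                format!("{:5.2}", neuron_activation(x, y))
            })
            .collect();
        println!("{}", row.join(" "));
    }
}
```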
This OpenAI research seems to be feeding lots of varied input text into the models they're examining and keeping track of the activations of different neurons along the way. Another method I remember seeing used in the past involves using an optimizer to generate inputs that maximally activate particular neurons in vision models[2].
I'm sure that's much more difficult, or even impossible, for transformers, which operate on sequences of tokens/embeddings rather than single static input vectors, but maybe there's a way to generate input embeddings and then use some method to convert them back into tokens.
[1] https://nn.ameo.dev/
[2] https://www.tensorflow.org/tutorials/generative/deepdream
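The optimizer-based technique mentioned above can be illustrated on a toy scale. This sketch (my own example, not from either linked project) does gradient ascent on the input of a single made-up neuron, using finite differences in place of the backpropagation a real vision-model version would use:

```rust
// Sketch of "optimize an input to maximally activate a neuron":
// climb the neuron's activation surface by gradient ascent on the
// input. The neuron below is a toy smooth bump peaked near (0.5, -0.5);
// all weights and names are invented for illustration.

fn neuron(x: f32, y: f32) -> f32 {
    let (dx, dy) = (x - 0.5, y + 0.5);
    (-(dx * dx + dy * dy)).exp()
}

fn maximize(mut x: f32, mut y: f32, steps: usize, lr: f32) -> (f32, f32) {
    let eps = 1e-3;
    for _ in 0..steps {
        // Numerical gradient of the activation w.r.t. the input
        // (a real implementation would backprop through the network).
        let gx = (neuron(x + eps, y) - neuron(x - eps, y)) / (2.0 * eps);
        let gy = (neuron(x, y + eps) - neuron(x, y - eps)) / (2.0 * eps);
        x += lr * gx;
        y += lr * gy;
    }
    (x, y)
}

fn main() {
    let (x, y) = maximize(0.0, 0.0, 500, 0.5);
    println!("input that maximizes the neuron: ({:.3}, {:.3})", x, y);
}
```

Starting from (0.0, 0.0), the ascent converges to roughly (0.5, -0.5), the input this toy neuron responds to most strongly; DeepDream-style feature visualization applies the same idea to image pixels.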
Browser-based neural network sandbox built with Rust + WebAssembly
Full source code is on Github: https://github.com/ameobea/neural-network-from-scratch
What are some alternatives?
glam-rs - A simple and fast linear algebra library for games and graphics
shorelark - Simulation of life & evolution
stdarch - Rust's standard library vendor-specific APIs and run-time feature detection
wasm-pdf - Generate PDF files with JavaScript and WASM (WebAssembly)
simd-json - Rust port of simdjson
bhtsne - Parallel Barnes-Hut t-SNE implementation written in Rust.
hora - 🚀 efficient approximate nearest neighbor search algorithm collections library written in Rust 🦀.
wasm-learning - Building Rust functions for Node.js to take advantage of Rust's performance, WebAssembly's security and portability, and JavaScript's ease-of-use. Demo code and recipes.
cgmath-rs - A linear algebra and mathematics library for computer graphics.
Seed - A Rust framework for creating web apps
minesweeper - Minesweeper game developed with Rust, WebAssembly (Wasm), and Canvas