neural-network-from-scratch
bhtsne
| | neural-network-from-scratch | bhtsne |
|---|---|---|
| Mentions | 3 | 2 |
| Stars | 114 | 57 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Last commit | about 2 years ago | over 1 year ago |
| Language | Rust | Rust |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
neural-network-from-scratch
- Examine individual neurons of a small neural network in the browser
- Language models can explain neurons in language models
I built a toy neural network that runs in the browser[1] to model 2D functions, with the goal of doing something similar to this research (in a much more limited manner, of course). Since the input space is so much smaller than that of language models or similar systems, it's possible to examine each neuron's output for all possible inputs, and in a continuous manner.
In some cases, you can clearly see neurons that specialize to different areas of the function being modeled, like this one: https://i.ameo.link/b0p.png
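As a rough sketch of what "examining a neuron over all possible inputs" means here (using a hypothetical toy numpy network, not the project's actual Rust/WASM code): because each input is just a 2D point, you can evaluate one hidden neuron's activation over a dense grid covering the whole input space and render the result as an image.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny MLP layer: 2 inputs -> 4 tanh hidden neurons.
W1 = rng.normal(size=(2, 4))
b1 = rng.normal(size=4)

def hidden_activations(xy):
    """Hidden-layer activations for a batch of 2D points, shape (n, 2)."""
    return np.tanh(xy @ W1 + b1)

# Sample the entire 2D input domain on a coarse grid.
xs = np.linspace(-1.0, 1.0, 50)
grid = np.array([[x, y] for y in xs for x in xs])  # shape (2500, 2)
acts = hidden_activations(grid)                    # shape (2500, 4)

# The full response map of neuron 0 over the input space,
# ready to be plotted as a 50x50 image.
neuron0_map = acts[:, 0].reshape(50, 50)
print(neuron0_map.shape)  # (50, 50)
```

With only two input dimensions this exhaustive sampling is cheap; it is exactly what becomes infeasible for models whose inputs are token sequences.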
This OpenAI research seems to be feeding lots of varied input text into the models they're examining and keeping track of the activations of different neurons along the way. Another method I remember seeing used in the past involves using an optimizer to generate inputs that maximally activate particular neurons in vision models[2].
I'm sure that's much more difficult, or even impossible, for transformers, which operate on sequences of tokens/embeddings rather than single static input vectors, but maybe there's a way to generate input embeddings and then use some method to convert them back into tokens.
[1] https://nn.ameo.dev/
[2] https://www.tensorflow.org/tutorials/generative/deepdream
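The second method mentioned above can be sketched in a few lines (a hypothetical single tanh layer with a hand-derived gradient, not the actual DeepDream or OpenAI code): start from some input and repeatedly step along the gradient of one neuron's activation with respect to that input.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fixed layer: 2 inputs -> 4 tanh hidden neurons.
W = rng.normal(size=(2, 4))
b = rng.normal(size=4)

def neuron_act(x, j):
    """Activation of hidden neuron j for input x (shape (2,))."""
    return np.tanh(x @ W + b)[j]

def neuron_grad(x, j):
    # d/dx tanh(w_j . x + b_j) = (1 - tanh^2(pre)) * w_j
    pre = x @ W[:, j] + b[j]
    return (1.0 - np.tanh(pre) ** 2) * W[:, j]

# Gradient ascent on the *input* to maximally activate neuron j.
j = 2
x = np.zeros(2)
for _ in range(200):
    x += 0.1 * neuron_grad(x, j)
    x = np.clip(x, -1.0, 1.0)  # keep the input inside the modeled domain

before = neuron_act(np.zeros(2), j)
after = neuron_act(x, j)
print(before, after)
```

In vision models the same loop runs over an image tensor with autodiff instead of a hand-written gradient; the difficulty raised above is that for transformers the "input" is a discrete token sequence, so the ascent would have to happen in embedding space.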
- Browser-based neural network sandbox built with Rust + WebAssembly
Full source code is on Github: https://github.com/ameobea/neural-network-from-scratch
bhtsne
- bhtsne 0.5.0, now 5.6x faster on a 4 core machine, plus a summary of my Rust journey (so far)
bhtsne now supports parallelism, custom data types, and custom user-defined metrics.
- Barnes-Hut t-SNE, a tree-accelerated algorithm for data visualization, reaches 0.3.0
What are some alternatives?
shorelark - Simulation of life & evolution
neuronika - Tensors and dynamic neural networks in pure Rust.
wasm-pdf - Generate PDF files with JavaScript and WASM (WebAssembly)
tangram - Tangram makes it easy for programmers to train, deploy, and monitor machine learning models.
wasm-learning - Building Rust functions for Node.js to take advantage of Rust's performance, WebAssembly's security and portability, and JavaScript's ease-of-use. Demo code and recipes.
tch-rs - Rust bindings for the C++ api of PyTorch.
minesweeper - Minesweeper game developed with Rust, WebAssembly (Wasm), and Canvas
tidy-viewer - 📺(tv) Tidy Viewer is a cross-platform CLI csv pretty printer that uses column styling to maximize viewer enjoyment.
Seed - A Rust framework for creating web apps
linfa - A Rust machine learning framework.
bitque - A simplified Jira clone built with seed.rs and actix
rust - Rust language bindings for TensorFlow