neural-network-from-scratch
A neural network library written from scratch in Rust, along with a web-based application for building and training neural networks and visualizing their outputs
I built a toy neural network that runs in the browser[1] to model 2D functions, with the goal of doing something similar to this research (in a much more limited manner, of course). Since the input space is so much smaller than that of language models and the like, it's possible to examine the outputs of each neuron for all possible inputs, and in a continuous manner.
In some cases, you can clearly see neurons that specialize to different areas of the function being modeled, like this one: https://i.ameo.link/b0p.png
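Since the inputs are just 2D points, building one of those per-neuron response maps amounts to evaluating the network over a dense grid. A minimal sketch in numpy (random weights here stand in for a trained network):

```python
import numpy as np

# Tiny hidden layer with stand-in random weights; in the real app these
# would come from training, but random ones illustrate the mechanics.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 8))  # 2D input -> 8 hidden neurons
b1 = rng.normal(size=8)

# Dense grid covering the whole input domain [-1, 1] x [-1, 1]
xs = np.linspace(-1.0, 1.0, 100)
grid = np.stack(np.meshgrid(xs, xs), axis=-1).reshape(-1, 2)  # (10000, 2)

# Activation of every hidden neuron at every input point
hidden = np.tanh(grid @ W1 + b1)  # (10000, 8)

# Each column is one neuron's response over the entire input space,
# ready to reshape to (100, 100) and render as an image.
neuron_0_map = hidden[:, 0].reshape(100, 100)
```

With only two input dimensions the full map is cheap to compute exactly, which is what makes the continuous visualization possible.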
This OpenAI research seems to work by feeding lots of varied input text into the models being examined and tracking the activations of individual neurons along the way. Another method I remember seeing used in the past involves using an optimizer to generate inputs that maximally activate particular neurons in vision models[2].
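That optimizer-based approach (activation maximization) can be sketched very roughly: run gradient ascent on the input itself until a chosen neuron fires as strongly as possible. A toy numpy version using finite-difference gradients, not any real vision model:

```python
import numpy as np

rng = np.random.default_rng(1)
W1 = rng.normal(size=(2, 8))  # toy stand-in network, 2D input -> 8 neurons
b1 = rng.normal(size=8)

def neuron_activation(x, idx):
    """Activation of hidden neuron `idx` for input x."""
    return np.tanh(x @ W1 + b1)[idx]

def grad_fd(x, idx, eps=1e-5):
    """Finite-difference gradient of the neuron's activation w.r.t. x.
    (Real implementations use autodiff; this keeps the sketch dependency-free.)"""
    g = np.zeros_like(x)
    for i in range(x.size):
        step = np.zeros_like(x)
        step[i] = eps
        g[i] = (neuron_activation(x + step, idx)
                - neuron_activation(x - step, idx)) / (2 * eps)
    return g

# Gradient ascent: nudge the input toward whatever excites neuron 3 most
x = np.zeros(2)
for _ in range(200):
    x = x + 0.1 * grad_fd(x, 3)
```

The end result `x` is a synthetic input tuned to that one neuron, which is the core trick behind things like DeepDream.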
I imagine that approach is much more difficult, or even impossible, for transformers, which operate on sequences of tokens/embeddings rather than single static input vectors, but maybe there's a way to generate input embeddings and then use some method to convert them back into tokens.
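One crude version of that conversion step would be a nearest-neighbor lookup against the model's embedding table: snap each optimized continuous vector to the closest real token. A hypothetical sketch (toy vocabulary and random embeddings; real detokenization of optimized embeddings is much messier):

```python
import numpy as np

rng = np.random.default_rng(2)
vocab = ["the", "cat", "sat", "mat"]      # toy vocabulary (hypothetical)
E = rng.normal(size=(len(vocab), 4))      # token embedding table, dim 4

def nearest_token(v):
    """Map a continuous embedding vector back to the closest real token
    by cosine similarity."""
    sims = (E @ v) / (np.linalg.norm(E, axis=1) * np.linalg.norm(v))
    return vocab[int(np.argmax(sims))]

# An optimizer-produced embedding that landed near "cat"'s row
# should snap back to "cat".
v = E[1] + 0.01 * rng.normal(size=4)
```

The obvious weakness is that the optimizer's ideal embedding usually sits between token rows, so the snapped sequence may activate the neuron far less than the continuous one did.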
[1] https://nn.ameo.dev/
[2] https://www.tensorflow.org/tutorials/generative/deepdream