| | nn_vis | EyeTrackVR |
|---|---|---|
| Mentions | 3 | 7 |
| Stars | 1,045 | 633 |
| Growth | - | 4.4% |
| Activity | 5.9 | 4.0 |
| Latest commit | 4 months ago | 16 days ago |
| Language | Python | Python |
| License | MIT License | GNU General Public License v3.0 only |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
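The site does not publish its activity formula, but the idea of weighting recent commits more heavily can be illustrated with a simple exponential-decay score. Everything below (the function name, the half-life parameter, the dates) is an illustrative assumption, not the actual metric:

```python
from datetime import date, timedelta

# Illustrative sketch only: one way a recency-weighted activity score could
# work. Each commit contributes less the older it is, halving in weight
# every `half_life_days` days.
def activity_score(commit_dates, today, half_life_days=90):
    return sum(0.5 ** ((today - d).days / half_life_days) for d in commit_dates)

today = date(2023, 6, 1)
recent = [today - timedelta(days=n) for n in (1, 3, 10)]
old = [today - timedelta(days=n) for n in (300, 320, 400)]

# Three recent commits outscore three old ones under this weighting.
print(activity_score(recent, today) > activity_score(old, today))  # prints True
```

Under such a scheme, two projects with the same total commit count can have very different activity scores depending on how recent those commits are.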
nn_vis
-
[D] Is there a tool to visualise my neural network in real time?
In my master's thesis I did some work on visualizing a neural network. It is not trivial to show the weights in a meaningful/understandable way. https://github.com/julrog/nn_vis You need a portion of training data, and it currently supports only fully connected layers. The visualization is 3D and in real time, but it needs some preprocessing, so I'm not sure if it fits your needs at all.
-
Finding important connections
I had some success pruning weights by adding a batch-normalization layer between existing layers, freezing the existing layers, and then retraining the model with the batch-normalization layer (training can be much shorter because there are far fewer weights to train). Then, using the magnitude of the original weights together with the weights from the batch normalization, you can prune the original model. You can see an example for a fully connected layer in my code: https://github.com/julrog/nn_vis
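The final scoring step described above can be sketched in a few lines of NumPy. This is a minimal illustration of the idea, not the actual nn_vis code: the shapes, the random weights, and the 50% pruning threshold are all assumptions made for the example.

```python
import numpy as np

# Hypothetical setup: a frozen fully connected layer with weights W
# (out_features x in_features) and a learned batch-norm scale gamma,
# one value per output unit.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 6))               # frozen original weights
gamma = np.array([1.5, 0.1, 0.9, 0.02])   # BN scales learned during retraining

# Importance score: magnitude of each original weight scaled by the BN
# factor of the unit it feeds. A small gamma means the whole unit
# contributed little, so all of its incoming weights score low.
score = np.abs(W) * np.abs(gamma)[:, None]

# Prune the weakest 50% of connections by zeroing them out.
threshold = np.quantile(score, 0.5)
mask = score >= threshold
W_pruned = W * mask

print(f"kept {mask.sum()} of {mask.size} connections")
```

In a real training loop the gamma values would come from the inserted batch-normalization layers after retraining, and the pruned weights would be written back into the frozen model.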
-
[D] Convolution Neural Network Visualization - Made with Unity 3D and lots of Code / source - stefsietz (IG)
I just made my project public on GitHub, which seems similar to yours https://github.com/julrog/nn_vis
EyeTrackVR
- accessory for eye tracking quest 2?
-
Vive Pro Eye edition -> Vive Pro 2
My only issue is the lack of eye tracking. So, I am building my own eye tracking module for it. (https://github.com/RedHawk989/EyeTrackVR)
-
Index X PSVR2 Eye Tracking Possibility?
yes. there's a diy eye tracker for any headset on github. https://github.com/RedHawk989/EyeTrackVR/releases/tag/EyeTrackApp-0.1.8 is the current release. the hardware is fairly trivial to build.
- Should i swap my index for pro eye, or mod it?
-
I bought a used HTC Vive for 40 dollars. God, I'm loving it! It does PCVR much, much, MUCH better than my Quest, and with the right mods it's a perfect headset. Thank y'all for pushing me towards the purchase. I won't use my Oculus for anything but flight sims or standalone console games (or wireless stuff)
It's an HTC product. The main game I play with it is VRChat. I'm also building some custom eye trackers https://github.com/RedHawk989/EyeTrackVR
-
Eye tracking with OSC
Here is the one I am following: https://twitter.com/Prohurtz_ https://github.com/RedHawk989/EyeTrackVR. It is very early, but ESP32-CAM cameras are very small and easily fit inside at least the Quest 2.
What are some alternatives?
DeepFaceLab - DeepFaceLab is the leading software for creating deepfakes.
VRCEyeTracking - OSC App to allow VRChat avatars to interact with eye and facial tracking hardware
pyomyo - PyoMyo - Python open-source Myo armband library
dogfight-sandbox-hg2 - Air to air combat sandbox, created in Python 3 using the HARFANG 3D 2 framework.
nn-visualizer
NeosWCFaceTrack - OpenSeeFace fork allowing for Face Tracking in Neos VR through a single RGB webcam
OpenXR-SDK-Source - Sources for OpenXR loader, basic API layers, and example code.
rf2_video_settings - Create presets of your rFactor 2 settings and quickly change between performance focused VR setup or an eye-candy favoured Replay setup.
Longhand - Text corpora in virtual reality
WeightVis - Visualize neural network weights from different kind of libraries