OpenXR-SDK-Source vs nn_vis

| | OpenXR-SDK-Source | nn_vis |
|---|---|---|
| Mentions | 5 | 3 |
| Stars | 624 | 1,045 |
| Growth | 2.1% | - |
| Activity | 6.9 | 5.9 |
| Last Commit | 10 days ago | 4 months ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
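To make the "recent commits have higher weight" idea concrete, here is a minimal sketch of how such a recency-weighted activity score could be computed. The exponential decay and the half-life constant are illustrative assumptions, not the site's actual formula:

```python
import math
from datetime import date

def activity_score(commit_dates, today, half_life_days=30.0):
    """Toy recency-weighted activity score: each commit contributes a
    weight that decays exponentially with its age, so recent commits
    count for more than older ones."""
    score = 0.0
    for d in commit_dates:
        age = (today - d).days
        score += math.exp(-math.log(2) * age / half_life_days)
    return score

today = date(2023, 5, 15)
recent = [date(2023, 5, 1), date(2023, 5, 10)]
old = [date(2022, 5, 1), date(2022, 5, 10)]

# The same number of commits scores higher when they are recent.
print(activity_score(recent, today) > activity_score(old, today))
```

The ranking is then a percentile over all tracked projects, so only the relative ordering of scores matters, not their absolute values.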
OpenXR-SDK-Source
- Stereoscopic Rendering for VR from Scratch?
  I'd recommend taking a look at the hello_xr sample in the OpenXR repository here: https://github.com/KhronosGroup/OpenXR-SDK-Source/tree/master/src/tests/hello_xr
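For the "from scratch" part, the core geometric idea behind stereoscopic rendering is drawing the scene once per eye with view matrices offset by half the interpupillary distance. A rough numpy sketch of that idea (the IPD value and matrix conventions here are illustrative assumptions; a real OpenXR app like hello_xr would instead use the per-eye poses and FOVs returned by xrLocateViews):

```python
import numpy as np

def translation(tx, ty, tz):
    """4x4 column-vector translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [tx, ty, tz]
    return m

def eye_view_matrices(head_view, ipd=0.064):
    """Derive per-eye view matrices from a head-centered view matrix by
    shifting the camera half the interpupillary distance left/right.
    A camera moved to -x corresponds to a view translation of +x."""
    half = ipd / 2.0
    left = translation(+half, 0.0, 0.0) @ head_view   # left eye camera
    right = translation(-half, 0.0, 0.0) @ head_view  # right eye camera
    return left, right

head = np.eye(4)  # head at the origin, looking down -z
left, right = eye_view_matrices(head)
print(left[0, 3], right[0, 3])
```

Each eye's image is then rendered with its own view matrix into its half of the swapchain, which is what the hello_xr render loop does per view.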
- Beginner-friendly tutorial of OpenXR
- Most efficient way to figure out what is creating uninitialised values?
  I've dug into it some more, and the file is from Monado, which I had compiled with debug symbols because similar issues have happened before. xr_generated_loader.cpp:152 is from OpenXR-SDK-Source, also compiled with debug symbols for the same reason.
- Interfacing OpenXR Extensions
  There is only one mainstream OpenXR loader implementation, the one from Khronos. Older OpenXR loader versions did export extension functions "by accident", but that was fixed quite a while ago: https://github.com/KhronosGroup/OpenXR-SDK-Source/commit/d14bd22ae3a64d6bcb3a79a6119f35ad0a2d3110
- openxr initialization/application detecting VR
  OpenXR API layers are like Vulkan API layers. Two API layers are provided by default: one to dump API calls and one to validate them. Those layers are now included in the openxr_loader_windows-*.zip on the OpenXR-SDK-Source releases page, but you probably have to set the XR_API_LAYER_PATH environment variable (or a registry key on Windows) for the loader to find them. Documentation: https://github.com/KhronosGroup/OpenXR-SDK-Source/blob/master/specification/loader/api_layer.adoc
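As a quick sketch of the environment-variable route: setting XR_API_LAYER_PATH tells the loader where to look for layer manifests, and XR_ENABLE_API_LAYERS force-enables explicit layers by name. The directory path below is a placeholder; the layer name follows the naming used in the OpenXR-SDK-Source layers, but check the linked loader documentation for your setup:

```python
import os

# Directory containing the API layer manifest (.json) files unpacked
# from openxr_loader_windows-*.zip; this path is just a placeholder.
os.environ["XR_API_LAYER_PATH"] = r"C:\openxr\api_layers"

# Force-enable the validation layer by name. Explicit layers are not
# loaded automatically; they must be requested by the app or via this
# environment variable.
os.environ["XR_ENABLE_API_LAYERS"] = "XR_APILAYER_LUNARG_core_validation"

# Any OpenXR application launched from this process (e.g. via
# subprocess.run) inherits these variables, so its loader can find
# and enable the layer.
print(os.environ["XR_API_LAYER_PATH"])
```

The registry-key alternative mentioned above registers the same manifest paths machine-wide instead of per-process; the api_layer.adoc document linked above describes both mechanisms.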
nn_vis
- [D] Is there a tool to visualise my neural network in real time?
  In my master's thesis I did some work on visualizing a neural network. It is not trivial to show the weights in a meaningful/understandable way. https://github.com/julrog/nn_vis You need a portion of the training data, and it currently works only for fully connected layers. The visualization is 3D and in real time, but it needs some preprocessing, so I'm not sure whether it fits your needs at all.
- Finding important connections
  I had some success pruning weights by adding a batch normalization layer between existing layers, freezing the existing layers, and then retraining the model with the batch normalization layer (training can be much shorter because there are far fewer weights to train). Then, using the magnitudes of the original weights together with the weights from the batch normalization layer, you can prune the original model. You can see an example for fully connected layers in my code: https://github.com/julrog/nn_vis
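The magnitude-based pruning step described above can be sketched in plain numpy. This is a paraphrase of the idea rather than the actual nn_vis code: score each connection by combining the weight's magnitude with the learned batch-norm scale of its output unit, then zero out the weakest connections:

```python
import numpy as np

def prune_by_bn_magnitude(weights, bn_gamma, keep_ratio=0.5):
    """Score each connection by |w_ij| * |gamma_j|, where gamma_j is
    the batch-norm scale learned for output unit j after retraining,
    then zero out the lowest-scoring connections."""
    importance = np.abs(weights) * np.abs(bn_gamma)  # broadcast per output
    k = int(importance.size * keep_ratio)
    threshold = np.sort(importance, axis=None)[::-1][k - 1]
    mask = importance >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3))        # fully connected layer: 4 in, 3 out
gamma = np.array([1.0, 0.1, 2.0])  # per-output batch-norm scales
pruned, mask = prune_by_bn_magnitude(w, gamma, keep_ratio=0.5)
print(mask.sum())  # 6 of 12 connections kept
```

Because gamma multiplies every weight feeding the same output unit, a unit whose batch-norm scale shrank toward zero during retraining drags all of its incoming connections down the importance ranking, which is the signal used for pruning.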
- [D] Convolution Neural Network Visualization - Made with Unity 3D and lots of Code / source - stefsietz (IG)
  I just made my project public on GitHub; it seems similar to yours: https://github.com/julrog/nn_vis
What are some alternatives?
openvr_fsr_app - Management Gui for OpenVR FSR PlugIn
DeepFaceLab - DeepFaceLab is the leading software for creating deepfakes.
unity-webxr-export - Develop and export WebXR experiences using Unity WebGL
pyomyo - PyoMyo - Python Opensource Myo armband library
StereoKit - An easy-to-use XR engine for building AR and VR applications with C# and OpenXR!
nn-visualizer
lovr - Lua Virtual Reality Framework
dogfight-sandbox-hg2 - Air to air combat sandbox, created in Python 3 using the HARFANG 3D 2 framework.
LinusTrinus - TrinusVR streaming server for Linux
EyeTrackVR - Open Source and Affordable, Virtual Reality Eye Tracking Platform.
OpenXR-MixedReality - OpenXR samples and preview headers for HoloLens and Windows Mixed Reality developers familiar with Visual Studio
rf2_video_settings - Create presets of your rFactor 2 settings and quickly change between performance focused VR setup or an eye-candy favoured Replay setup.