How to transmit a custom signal to a VRChat world (AI-guided 3D keypoints)

This page summarizes the projects mentioned and recommended in the original post on /r/VRchat.

  • UNet

    Network system for VRChat UDON (by Xytabich)

    Summary: I've implemented real-time full-body tracking from a single webcam, but I'm having trouble getting this data into VRChat. I originally built this in Unity and used socket communication there. Is there a socket communication method that works with UdonSharp or VRChat? Here are the solutions I reviewed. https://github.com/Xytabich/UNet I think UNet is very close to what I'm looking for, but its connection IDs are tied to the player objects that have joined the world, whereas I want to receive data from outside the world.
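    For context, a minimal sketch of the sending side described above: streaming per-frame 3D keypoints out of the tracking process over a plain UDP socket. The host, port, and JSON layout are illustrative assumptions, not something defined by UNet or VRChat.

      import json
      import socket
      import time

      import numpy as np

      # Illustrative endpoint only; a real receiver (such as the Unity
      # prototype mentioned above) would define its own host, port and format.
      HOST, PORT = "127.0.0.1", 9999

      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

      def send_keypoints(frame_id: int, keypoints: np.ndarray) -> None:
          """Send one frame of (J, 3) keypoints as a single JSON datagram."""
          payload = {
              "frame": frame_id,
              "joints": keypoints.round(4).tolist(),  # J x 3 list of [x, y, z]
          }
          sock.sendto(json.dumps(payload).encode("utf-8"), (HOST, PORT))

      # Example: stream dummy 17-joint poses at roughly 30 FPS.
      for i in range(300):
          send_keypoints(i, np.random.rand(17, 3))
          time.sleep(1 / 30)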

  • midi-websocket

    Send midi through websockets

    https://github.com/fa-m/midi-websocket Sending MIDI over WebSockets raises a fundamental question for me: this solution is difficult to analyze, and it underlines how important security is here.
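    For what it's worth, the general idea of forwarding MIDI over a WebSocket looks roughly like the sketch below, using the mido and websockets packages. The server URI and the raw-bytes payload are assumptions made for illustration; they are not taken from the fa-m/midi-websocket repo.

      import asyncio

      import mido        # MIDI message construction
      import websockets  # WebSocket client

      # Illustrative endpoint; the actual midi-websocket server defines its own.
      WS_URI = "ws://localhost:8765"

      async def send_note(note: int, velocity: int = 64) -> None:
          """Send a note-on as raw MIDI bytes, wait briefly, then send note-off."""
          async with websockets.connect(WS_URI) as ws:
              note_on = mido.Message("note_on", note=note, velocity=velocity)
              note_off = mido.Message("note_off", note=note)
              await ws.send(bytes(note_on.bytes()))
              await asyncio.sleep(0.5)
              await ws.send(bytes(note_off.bytes()))

      asyncio.run(send_note(60))  # middle C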

  • This only works for your own avatar though. If you want to make something that is displayed in a single world, you can look into ShaderMotion, which is a system for bringing humanoid poses into VRC worlds. The data is sent via a video feed (as blinking pixels), and a shader then reads those pixels back out and "animates" the model accordingly.
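    This is not ShaderMotion's actual codec, but the underlying idea of packing numbers into pixels of a video frame so a shader can sample them back out can be sketched roughly as follows; the block size, layout, and 8-bit quantization are assumptions.

      import numpy as np

      BLOCK = 8                 # pixel size of each data block (assumed)
      WIDTH, HEIGHT = 256, 256  # frame size of the data strip (assumed)

      def encode_pose(values: np.ndarray) -> np.ndarray:
          """Quantize values in [0, 1] to 8 bits and paint each one as a solid
          grayscale block; a shader on the receiving end samples the block
          centers and rescales back to [0, 1]."""
          frame = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)
          cols = WIDTH // BLOCK
          for i, v in enumerate(np.clip(values, 0.0, 1.0)):
              q = int(round(float(v) * 255))
              r, c = divmod(i, cols)
              frame[r * BLOCK:(r + 1) * BLOCK, c * BLOCK:(c + 1) * BLOCK] = q
          return frame

      # Example: 17 joints x 3 normalized coordinates -> 51 blocks in one frame.
      frame = encode_pose(np.random.rand(17, 3).ravel())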

  • MotionBERT

    [ICCV 2023] PyTorch Implementation of "MotionBERT: A Unified Perspective on Learning Human Motion Representations"

    Thank you for the answer. The method I am using comes from the monocular 3D pose estimation literature; it is the following: https://github.com/Walter0807/MotionBERT
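    A small post-processing sketch that often applies to this kind of estimator output: making the camera-space pose root-relative and roughly unit-scaled before sending it anywhere. The 17-joint, root-at-index-0 skeleton is an assumed convention here, not a statement about MotionBERT's exact output format.

      import numpy as np

      ROOT = 0  # assumed pelvis/root index in a 17-joint H3.6M-style skeleton

      def to_root_relative(joints_3d: np.ndarray) -> np.ndarray:
          """Shift a (17, 3) camera-space pose so the root joint sits at the
          origin, then scale by the pose's overall extent so values land in a
          small, transmission-friendly range."""
          rel = joints_3d - joints_3d[ROOT]
          scale = max(float(np.abs(rel).max()), 1e-6)
          return rel / scale

      pose_cam = np.random.rand(17, 3)       # stand-in for one estimator frame
      pose_rel = to_root_relative(pose_cam)  # ready to serialize and send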

  • OSCMotion

    Lox also made OSCMotion, which lets you send the joint positions via OSC instead. This is pretty scuffed, since the update rate is far too low for it to look good. It is possible in theory, though.
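    Sending values to VRChat over OSC is straightforward with the python-osc package; VRChat listens for OSC input on local UDP port 9000 by default. The address pattern below is only an illustration and is not necessarily the scheme OSCMotion uses.

      from pythonosc.udp_client import SimpleUDPClient

      # VRChat's default local OSC input port is 9000.
      client = SimpleUDPClient("127.0.0.1", 9000)

      def send_joint(index: int, position: tuple[float, float, float]) -> None:
          """Send one joint position as three floats. The address here is
          illustrative; OSCMotion defines its own address layout."""
          client.send_message(f"/tracking/trackers/{index}/position", list(position))

      # Example: push a single head-height point on each update tick.
      send_joint(1, (0.0, 1.6, 0.0))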
