iOS 15.4 AVFoundation LiDAR depth capture + MIDI note events = depth-based augmented reality music visualization, rendered as the music is played
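The formula above pairs two inputs: LiDAR depth frames from AVFoundation and incoming MIDI notes. A minimal sketch of the depth-capture half is below, assuming the `.builtInLiDARDepthCamera` device type introduced alongside iOS 15.4's LiDAR APIs; the class name `DepthCapture` and the rendering hook are illustrative, not taken from the Reality-Synthesizer source.

```swift
import AVFoundation

// Hedged sketch: stream LiDAR depth frames that a renderer (not shown)
// could combine with active MIDI notes to place visuals in depth.
final class DepthCapture: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // Requires iOS 15.4+ and LiDAR hardware (e.g. iPhone Pro models).
        guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera,
                                                   for: .video,
                                                   position: .back) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        session.beginConfiguration()
        if session.canAddInput(input) { session.addInput(input) }
        if session.canAddOutput(depthOutput) { session.addOutput(depthOutput) }
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.commitConfiguration()
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Each frame's depth map could anchor a visual for each sounding
        // MIDI note at its measured distance from the camera.
        let map = depthData.depthDataMap
        _ = CVPixelBufferGetWidth(map) // hand off to a Metal/RealityKit renderer
    }
}
```

The MIDI half would typically come from CoreMIDI (a `MIDIClient` receiving note-on/note-off events), with each event triggering or retiring a depth-anchored visual.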
Why do you think https://github.com/chriswebb09/ARKitNavigationDemo is a good alternative to Reality-Synthesizer?