tdk-demo vs PeopleSansPeople

| | tdk-demo | PeopleSansPeople |
|---|---|---|
| Mentions | 3 | 5 |
| Stars | 16 | 294 |
| Growth | - | 2.4% |
| Activity | 8.4 | 3.0 |
| Latest commit | 14 days ago | 2 months ago |
| Language | YAML | C# |
| License | BSD 3-Clause "New" or "Revised" License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed, with recent commits weighted more heavily than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
- PI wants me to make a synthetic dataset. Also, check this Unity repo out
- Generating human motion synthetic data?
  I was trying to train a model that sits on top of one of the pose estimation models (PoseNet, MoveNet, MediaPipe) and detects the action performed (waving, swiping right, etc.), and I was planning to generate synthetic data for it. I saw that there's a Unity project, PeopleSansPeople, but it isn't suited to training a model for action recognition. I'd like something that simulates a human performing a simple action, to which I could add randomness. I was thinking of either using Unity or writing something that models the human keypoints (the output of pose estimation) and simulates them directly. Does anything like this already exist?
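A minimal sketch of the keypoint-simulation idea the poster describes: procedurally generate a 2D wrist trajectory for a "waving" action and inject randomness per sample and per frame. The motion model, coordinate convention (normalized image coordinates), and all numeric constants below are invented for illustration, not taken from any real pose dataset.

```python
import numpy as np

def simulate_wave(num_frames=30, noise_std=0.02, rng=None):
    """Simulate 2D wrist keypoint positions for a 'waving' action.

    Returns an array of shape (num_frames, 2) with normalized (x, y)
    coordinates. Amplitude, offset, and per-frame jitter are randomized
    so each call yields a slightly different sample.
    """
    rng = rng or np.random.default_rng()
    t = np.linspace(0, 2 * np.pi, num_frames)
    # Wrist oscillates horizontally around the shoulder while raised.
    x = 0.5 + 0.15 * np.sin(3 * t)   # side-to-side motion
    y = 0.3 + 0.02 * np.cos(6 * t)   # slight vertical bob
    traj = np.stack([x, y], axis=1)
    # Domain randomization: random amplitude scale and offset per sample,
    # then independent Gaussian jitter per frame.
    traj = traj * rng.uniform(0.9, 1.1) + rng.normal(0, 0.01, size=2)
    traj += rng.normal(0, noise_std, size=traj.shape)
    return traj

sample = simulate_wave()
print(sample.shape)  # (30, 2)
```

Stacking many such trajectories (with one label per action type) would give a toy training set for a classifier that consumes pose-estimation output, without rendering any images.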
- [P] Can't finish my master's thesis. What to do?
- [R] PeopleSansPeople: Unity's Human-Centric Synthetic Data Generator. GitHub link in comments.
  Source code: https://github.com/Unity-Technologies/PeopleSansPeople
- [R] PeopleSansPeople: Unity's Human-Centric Synthetic Data Generator
  Webpage: https://unity-technologies.github.io/PeopleSansPeople/
  Paper: https://arxiv.org/abs/2112.09290
  Source code: https://github.com/Unity-Technologies/PeopleSansPeople
  Papers with Code: https://paperswithcode.com/paper/peoplesanspeople-a-synthetic-data-generator and https://paperswithcode.com/dataset/peoplesanspeople
  Demo video: https://youtu.be/mQ_DUdB70dc

  Summary: PeopleSansPeople is a human-centric data generator from Unity Technologies that contains highly parametric, simulation-ready 3D human assets, a parameterized lighting and camera system, parameterized environment generators, and fully manipulable, extensible domain randomizers. PeopleSansPeople can generate RGB images with sub-pixel-perfect 2D/3D bounding boxes, COCO-compliant human keypoints, and semantic/instance segmentation masks in JSON annotation files, all packaged in macOS and Linux executable binaries capable of generating datasets of 1M+ images. In addition, we release a template Unity environment that lowers the barrier to entry and helps you get started creating your own highly parameterized human-centric synthetic data generator. We affectionately named our generator PeopleSansPeople because it targets human-centric computer vision without using human data, which carries serious privacy, safety, ethical, bias, and legal concerns.

  Benchmarks: The domain randomization we used for our benchmarks is a naïve, brute-force sweep through pre-chosen parameter ranges; as a result we end up generating psychedelic-looking scenes, which nevertheless turned out to train more performant models for human-centric computer vision. Using PeopleSansPeople, we benchmarked a Detectron2 Keypoint R-CNN variant. Results indicate that synthetic pre-training with our data outperforms training on real data alone or pre-training with ImageNet, in both limited- and abundant-data regimes. We envisage that this freely available data generator will enable a wide range of research into the emerging field of simulation-to-real transfer learning in the critical area of human-centric computer vision.
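The JSON annotations described above use COCO's keypoint convention, which can be consumed with plain JSON handling. The sample payload below is invented for illustration (real generator output contains many more fields), but the flat `[x1, y1, v1, x2, y2, v2, ...]` triple layout is the standard COCO keypoints format.

```python
import json

# Hypothetical minimal COCO-style annotation file contents; only the
# fields needed to demonstrate keypoint parsing are included.
coco_json = """
{
  "annotations": [
    {"image_id": 1, "bbox": [10, 20, 100, 200],
     "keypoints": [50, 60, 2, 55, 70, 2],
     "num_keypoints": 2}
  ]
}
"""

data = json.loads(coco_json)
for ann in data["annotations"]:
    # COCO keypoints are flat [x, y, v] triples, where v is a visibility
    # flag: 0 = not labeled, 1 = labeled but occluded, 2 = visible.
    kps = ann["keypoints"]
    triples = [tuple(kps[i:i + 3]) for i in range(0, len(kps), 3)]
    visible = [(x, y) for x, y, v in triples if v == 2]
    print(ann["image_id"], ann["bbox"], visible)
    # 1 [10, 20, 100, 200] [(50, 60), (55, 70)]
```

Because the format is COCO-compliant, tooling such as pycocotools or Detectron2's dataset loaders should also be able to read these files directly.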
What are some alternatives?
- DoppelGANger - [IMC 2020 (Best Paper Finalist)] Using GANs for Sharing Networked Time Series Data: Challenges, Initial Promise, and Open Questions
- Robotics-Object-Pose-Estimation - A complete end-to-end demonstration in which we collect training data in Unity and use that data to train a deep neural network to predict the pose of a cube. The model is then deployed in a simulated robotic pick-and-place task.
- discus - A data-centric AI package for ML/AI. Get the best high-quality data for the best results. Discord: https://discord.gg/t6ADqBKrdZ
- com.unity.perception - Perception toolkit for sim2real training and validation in Unity
- nist-crc-2023 - NIST Collaborative Research Cycle on Synthetic Data. Learn about synthetic data week by week!
- VirtualHumanBatchProcessing
- generatedata - A powerful, feature-rich, random test data generator.
- ml-agents - The Unity Machine Learning Agents Toolkit (ML-Agents) is an open-source project that enables games and simulations to serve as environments for training intelligent agents using deep reinforcement learning and imitation learning.
- SynthDet - An end-to-end object detection pipeline using synthetic data
- REaLTabFormer - A suite of auto-regressive and Seq2Seq (sequence-to-sequence) transformer models for tabular and relational synthetic data generation.