IsaacGymEnvs
Unity-Robotics-Hub
| | IsaacGymEnvs | Unity-Robotics-Hub |
|---|---|---|
| Mentions | 8 | 12 |
| Stars | 1,599 | 1,864 |
| Growth | 8.2% | 2.9% |
| Activity | 4.6 | 0.0 |
| Latest commit | 3 days ago | 16 days ago |
| Language | Python | C# |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
IsaacGymEnvs
-
What is the limit on parallel environments?
Although Gym/Gymnasium lets you create vectorized parallel environments, if you want to train in hundreds or thousands of environments you will need to use the NVIDIA simulator family (Isaac Gym, Isaac Orbit, or Omniverse Isaac Gym).
-
How to optimize custom gym environment for GPU
Otherwise, I'd suggest checking out the Isaac Gym paper and the Isaac Gym Envs repo.
-
Showing the "good" values does not help the PPO algorithm?
In the given environment (https://github.com/NVIDIA-Omniverse/IsaacGymEnvs/blob/main/isaacgymenvs/tasks/franka_cabinet.py), the task for the robot is to open a cabinet. The action values, which are the output of the agent, are the target velocity values for the robot's joints.
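As a rough illustration of that action convention (a generic sketch, not the repo's actual code, and with hypothetical velocity limits), normalized policy outputs in [-1, 1] are typically scaled per joint to produce velocity targets:

```python
# Sketch: mapping normalized policy actions to joint velocity targets.
# The limits below are illustrative, not the Franka's real ones.
import torch

num_envs, num_dofs = 4, 9
max_joint_vel = torch.full((num_dofs,), 2.0)  # rad/s, hypothetical limit

# Policy output squashed into [-1, 1], one row per parallel environment
actions = torch.tanh(torch.randn(num_envs, num_dofs))

# Broadcasts the per-joint limit across all environments
vel_targets = actions * max_joint_vel
```

Because the policy only ever emits normalized values, the same network architecture works regardless of each joint's physical limits.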
-
Has anyone experience using/implementing "masking action" in Isaac Gym?
Can it be implemented in the task-level scripts (i.e. ant.py, FrankaCabinet.py, etc.) like this?
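Action masking itself is simulator-agnostic, so one common pattern (a generic sketch, not an Isaac Gym API) is to push the logits of invalid actions to negative infinity before sampling, which gives them exactly zero probability:

```python
# Generic invalid-action masking for a discrete policy.
import torch

logits = torch.tensor([[1.0, 2.0, 0.5, -1.0]])    # raw policy output
mask = torch.tensor([[True, False, True, True]])  # False = invalid action

masked_logits = logits.masked_fill(~mask, float("-inf"))
probs = torch.softmax(masked_logits, dim=-1)
action = torch.distributions.Categorical(probs=probs).sample()
```

In a task script this masking would sit between the network's forward pass and the action-sampling step; the rest of the training loop is unchanged.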
-
[Material advice] Learn reinforcement learning
IsaacGymEnvs
-
Simulating robotic arm for object manipulation
And here are some reinforcement learning examples.
-
What Happened to OpenAI + RL?
Gym has been great at standardizing the API and providing a baseline set of environments. However, parallelizing environments with the original Gym interface is cumbersome, and new simulators are being introduced with their own ways of doing things. It's not clear to me that Gym is still useful today, from a research perspective.
-
[D] MuJoCo vs PyBullet? (esp. for custom environment)
If you already have experience with PyBullet, then it's probably not worth switching to MuJoCo for creating custom environments. However, if you have the GPU compute for it, I'd recommend checking out Isaac Gym. GPU acceleration is great for spawning a bunch of envs for domain randomization, and it has already been used in recent research to get great results that previously took a ridiculous amount of CPU compute.
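The domain-randomization point can be made concrete: with thousands of parallel environments, each one samples its own physics parameters so the trained policy must work across the whole range. A minimal sketch (the parameter names and ranges below are hypothetical):

```python
# Sketch: per-environment domain randomization across a large batch.
# Real tasks randomize things like mass, friction, and motor gains.
import numpy as np

rng = np.random.default_rng(0)
num_envs = 1024

# One independently sampled value per parallel environment
friction = rng.uniform(0.5, 1.5, size=num_envs)
mass_scale = rng.uniform(0.8, 1.2, size=num_envs)
```

On a GPU simulator these arrays are applied once at reset, and every environment then runs its own slightly different physics at no extra cost.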
Unity-Robotics-Hub
-
Create a VR-Controlled Robot in Unity
So far I have cloned the Universal_Robots_ROS2_Description and Robotiq_2f_140 (supported for ROS2) into my workspace. Since I want to implement my robot in Unity, I am following the Unity Integration tutorial and the MoveIt tutorial.
- Unity vs. Unreal: which one do you recommend for ROS? (I'm a beginner with all three)
-
ROS 2 and simulation in Unity3D
Hi. We have used the ROS-TCP connector as well to establish a connection between ROS2 and Unity, and have done the ros_unity_integration tutorial.
-
HoloLens 2 AR-Interface for Medical Assistance Robots in Unity
It's for my thesis, so it's not a library and it's quite unstructured since there will be no further use. I used the MRTK Interface Elements in Unity and the Unity Robotics Hub (https://github.com/Unity-Technologies/Unity-Robotics-Hub). Then I customized everything for the KUKA iiwa Robotic Arm.
- Simulating robotic arm for object manipulation
-
Unity Pick And Place with Hololens 2 #hololens2 #hololens #robotics #robot #pickandplace #kitchen #microsoft #augmentedreality #ar #xr #unity3d #unity #niryo
Actually, there is no code on GitHub yet. It's a Unity project sitting on my dropbox at the moment. I took the project in the Unity Robotics Hub repo (https://github.com/Unity-Technologies/Unity-Robotics-Hub.git) and added multiple things iteratively to make it work with the Hololens.
- Best suite for robotics tasks
- I'm Addicted to Assets
-
Can this happen
Are you using ROS for the robot by any chance? If so, have you seen this? It might do just what you want, including a way to connect the robot and Unity: * https://github.com/Unity-Technologies/Unity-Robotics-Hub
-
We've created an open source Gazebo alternative that makes developing simulations much easier and more efficient.
How is this different from the official Unity for robotics? https://github.com/Unity-Technologies/Unity-Robotics-Hub
What are some alternatives?
MuJoCo_RL_UR5 - A MuJoCo/Gym environment for robot control using Reinforcement Learning. The task of agents in this environment is pixel-wise prediction of grasp success chances.
gazebo-classic - Gazebo classic. For the latest version, see https://github.com/gazebosim/gz-sim
dm_control - Google DeepMind's software stack for physics-based simulation and Reinforcement Learning environments, using MuJoCo.
ros-sharp - ROS# is a set of open source software libraries and tools in C# for communicating with ROS from .NET applications, in particular Unity3D
robo-gym - An open source toolkit for Distributed Deep Reinforcement Learning on real and simulated robots.
mediapipe - Cross-platform, customizable ML solutions for live and streaming media.
gym3 - Vectorized interface for reinforcement learning environments
ROS-TCP-Connector
OmniIsaacGymEnvs - Reinforcement Learning Environments for Omniverse Isaac Gym
URDF-Importer - URDF importer
skrl - Modular reinforcement learning library (on PyTorch and JAX) with support for NVIDIA Isaac Gym, Isaac Orbit and Omniverse Isaac Gym
ZeroSimROSUnity - Robotic simulation in Unity with ROS integration.