Robotics-Object-Pose-Estimation VS ROS-TCP-Endpoint

Compare Robotics-Object-Pose-Estimation vs ROS-TCP-Endpoint and see how they differ.

Robotics-Object-Pose-Estimation

A complete end-to-end demonstration in which we collect training data in Unity and use that data to train a deep neural network to predict the pose of a cube. This model is then deployed in a simulated robotic pick-and-place task. (by Unity-Technologies)

ROS-TCP-Endpoint

ROS package used to create an endpoint to accept ROS messages sent from a Unity scene using the ROS TCP Connector scripts (by Unity-Technologies)
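For context on how this package is used: the endpoint is typically started as a small ROS node that wraps the package's TcpServer class and then waits for the Unity-side ROS-TCP-Connector to open a connection. The sketch below is a minimal ROS 1 (rospy) example modeled on that pattern; the import path, parameter name, and constructor/start() signature are assumptions based on recent releases and have changed between versions, so treat it as illustrative rather than canonical.

```python
#!/usr/bin/env python
# Minimal ROS 1 launcher for the Unity TCP endpoint (illustrative sketch).
# Assumes the `ros_tcp_endpoint` package is on the ROS package path; the
# TcpServer constructor and start() signature have varied across releases,
# so check the version you actually have installed.
import rospy
from ros_tcp_endpoint import TcpServer


def main():
    # The node name is configurable; "TCPServer" mirrors the package default.
    ros_node_name = rospy.get_param("/TCP_NODE_NAME", "TCPServer")
    tcp_server = TcpServer(ros_node_name)

    rospy.init_node(ros_node_name, anonymous=True)
    tcp_server.start()  # begin listening for the Unity-side ROS-TCP-Connector
    rospy.spin()


if __name__ == "__main__":
    main()
```
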
               Robotics-Object-Pose-Estimation   ROS-TCP-Endpoint
Mentions       2                                 2
Stars          263                               161
Growth         4.2%                              6.2%
Activity       0.0                               0.0
Last commit    about 2 years ago                 about 2 months ago
Language       Python                            Python
License        Apache License 2.0                Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

Robotics-Object-Pose-Estimation

Posts with mentions or reviews of Robotics-Object-Pose-Estimation. We have used some of these posts to build our list of alternatives and similar projects.

ROS-TCP-Endpoint

Posts with mentions or reviews of ROS-TCP-Endpoint. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-07-30.
  • Mixed reality game using ROS and Unity [Update 1]
    3 projects | /r/robotics | 30 Jul 2022
    Hey everyone! I wanted to share a project I'm working on to make an FPV drone racing game using Unity and ROS. I'm still pretty early in the process, but the goal I'm working towards is a PC game (maybe VR) that lets you build custom virtual racetracks in an indoor environment and then race actual physical drones (and eventually autonomous ones) on them.

    Why I'm building this

    I like working on robotics projects in my spare time, and one project I've wanted to do for a while has been building my own autonomous drone. I've worked on some systems like that in the past and they've been really cool to see in person. Along the way, I also started getting into flying FPV drones and realized that flying them manually is just as fun as seeing them fly themselves, so I wanted to see if I could combine the two in some way by possibly making a game out of it. Btw, definitely check out the work done at the University of Zurich if you're interested in [high-speed autonomous drones](https://rpg.ifi.uzh.ch/aggressive_flight.html).

    How does it work

    I put together a quick demo video just to document the current state of my prototype: [Update Video](https://youtu.be/zxoPFM5ol7o). I'm very early in the process, and honestly, I've kind of cheated a bunch just to get something up and running and feel out the concept. Most of what I've done has just been connecting pieces together using off-the-shelf hardware/software. Right now, the prototype basically just proves out the concept of rendering the realtime position of a drone inside of a Unity game and getting all the "piping" set up to get data into the right place. Currently, the information flow is all one-directional from the drone to the PC. On the hardware side, I'm using Bitcraze's Crazyflie drone with its Lighthouse positioning deck and SteamVR's base stations for estimating the drone's 3D position. State estimation is pretty hard, but thanks to all the hard work done by the Crazyflie open-source community, this just kind of works out of the box and in realtime (i.e. one of the big reasons why it kind of feels like cheating lol). Communication between the Crazyflie and the PC is done using the Crazyflie radio dongle. On the software side, I'm using ROS to handle all the intermediate messaging and obviously Unity for the user interface, game logic and visualization.

    Challenges I've run into so far

    Getting the state estimate data from the Crazyflie into Unity was somewhat interesting to figure out. Basically, the Crazyflie computes its 6DoF pose (position and orientation) onboard, then transmits this telemetry over radio to the PC. On the PC, I wrote a simple ROS publisher node that listens for these messages and then publishes them onto a ROS network (a rough sketch of this kind of publisher appears after these posts). To get the data into Unity, I'm using Unity's ROS-TCP-Connector package (and ROS-TCP-Endpoint), which essentially just forwards the messages from the ROS network into Unity. Inside Unity, I wrote a simple script tied to a GameObject representing the drone that takes the data, transforms it into Unity's coordinate frame and uses it to set the GameObject's position. Overall, it's just a lot of forwarding of information (with some annoying coordinate frame transforms along the way). Another important piece of the puzzle (as far as rendering the drone inside a 3D virtual replica of my room) was building the room model and calibrating it to my actual room. I can go into more detail for sure, but at a high level I basically just picked a point in my room to be the origin in both the physical and virtual room, put the Crazyflie there (aligned with the axes I picked for the origin) and used the Crazyflie cfclient tool to center the base station position estimates there. My process was pretty rough as a first pass, and it will very likely have to improve, especially as I move in the mixed reality direction and start rendering virtual objects on a live camera feed.

    What's next?

    Tactically, the next few steps would be to add the FPV view into the game (streaming video data from the drone and rendering it into Unity), which involves more data forwarding (and calibration). In addition, I need to add input controls so you can actually fly the drone. The bigger goals in store would be around building out proper gameplay, integrating in autonomy (and figuring out where it makes sense), and maybe exploring what VR functionality might look like as opposed to just using a flat display on a PC monitor.

    Thanks for reading through this whole update! If you made it this far, I would really love to hear any feedback or questions on this or anything else. Most likely, it would help me figure out what some additional next steps would be, and I'd be super interested to learn if there are other cool directions I could take this project!
  • Help! Meet problems about connecting the ros with unity based on ros sharp
    5 projects | /r/ROS | 9 Apr 2022
    From my experience, ROS# is not the recommended way to connect Unity and ROS anymore. You probably want to use the ROS-TCP-Connector, which is officially supported by Unity, along with its corresponding ROS package (ROS-TCP-Endpoint). That page has quite a few tutorials to help you get the connection set up. I currently maintain a plugin that implements a set of commonly used sensors here, which might be of some use to you. Another useful resource maintained by Unity is their URDF importer, but it is still a bit buggy. Let me know if you need any other resources or help; I'm trying to make Unity a bit more livable for ROS developers.
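To make the data flow described in the first post above a bit more concrete, here is a minimal sketch of the kind of ROS 1 publisher node it mentions: it republishes drone telemetry as geometry_msgs/PoseStamped so the Unity-side ROS-TCP-Connector can subscribe to it through the endpoint. The read_drone_pose() helper, topic name, and frame id are all hypothetical stand-ins, not part of either project.

```python
#!/usr/bin/env python
# Sketch of a ROS 1 node that republishes drone telemetry as PoseStamped.
# read_drone_pose() is a hypothetical stand-in for the radio/telemetry code
# (e.g. cflib logging on a Crazyflie); everything downstream is plain rospy.
import rospy
from geometry_msgs.msg import PoseStamped


def read_drone_pose():
    """Return (x, y, z, qx, qy, qz, qw) in the ROS world frame (placeholder)."""
    return (0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 1.0)


def main():
    rospy.init_node("drone_pose_publisher")
    pub = rospy.Publisher("drone/pose", PoseStamped, queue_size=10)
    rate = rospy.Rate(50)  # telemetry rate in Hz

    while not rospy.is_shutdown():
        x, y, z, qx, qy, qz, qw = read_drone_pose()

        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = "map"  # e.g. the calibrated room origin
        msg.pose.position.x = x
        msg.pose.position.y = y
        msg.pose.position.z = z
        msg.pose.orientation.x = qx
        msg.pose.orientation.y = qy
        msg.pose.orientation.z = qz
        msg.pose.orientation.w = qw

        # The Unity script on the other end maps this right-handed ROS pose
        # into Unity's left-handed, y-up frame; ROS-TCP-Connector provides
        # coordinate-conversion helpers for that step.
        pub.publish(msg)
        rate.sleep()


if __name__ == "__main__":
    main()
```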

What are some alternatives?

When comparing Robotics-Object-Pose-Estimation and ROS-TCP-Endpoint you can also consider the following projects:

Unity-Robotics-Hub - Central repository for tools, tutorials, resources, and documentation for robotics simulation in Unity.

ros-sharp - ROS# is a set of open source software libraries and tools in C# for communicating with ROS from .NET applications, in particular Unity3D

openpifpaf - Official implementation of "OpenPifPaf: Composite Fields for Semantic Keypoint Detection and Spatio-Temporal Association" in PyTorch.

AIKIDO - Artificial Intelligence for Kinematics, Dynamics, and Optimization

ROS-TCP-Connector - Unity-side scripts for sending ROS messages to and from a Unity scene over TCP (the Unity counterpart to ROS-TCP-Endpoint)

champ_setup_assistant - CHAMP Package Config Generator

URDF-Importer - URDF importer

PeopleSansPeople - Unity's privacy-preserving human-centric synthetic data generator

ydata-synthetic - Synthetic data generators for tabular and time-series data

rbdl-orb - RBDL - Rigid Body Dynamics Library - ORB Version - The two main differences to the original rbdl is that this version has error handling and uses polymorphism for constraints

awesome-robotics-libraries - :sunglasses: A curated list of robotics libraries and software