pykitti
habitat-lab
| | pykitti | habitat-lab |
|---|---|---|
| Mentions | 2 | 3 |
| Stars | 1,114 | 1,710 |
| Growth (monthly) | 1.3% | 6.2% |
| Activity | 5.3 | 9.2 |
| Latest commit | 6 months ago | about 17 hours ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
pykitti
- Projecting 3D LiDAR points to camera
- Converting a disparity map to a depth map given a calibration file
That is the calibration file from the KITTI dataset. The README explains how to interpret those values. You might also take a look at https://github.com/utiasSTARS/pykitti, which lets you parse these files in Python very quickly.
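The two operations mentioned above can be sketched in plain Python. This is an illustrative sketch, not pykitti's API: the focal length, baseline, and projection matrix below are placeholder values in the style of KITTI calibration, and the math is the standard stereo relation `depth = f * B / d` plus a pinhole projection with a 3x4 matrix.

```python
# Hedged sketch (not pykitti's API): stereo disparity-to-depth conversion
# and projection of a 3D point with a 3x4 projection matrix like KITTI's P2.

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Metric depth for one pixel; non-positive disparity is invalid."""
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px

def project_point(P, xyz):
    """Project a 3D point (camera frame) to pixel (u, v) using 3x4 matrix P."""
    x, y, z = xyz
    homo = [x, y, z, 1.0]
    u, v, w = (sum(P[r][c] * homo[c] for c in range(4)) for r in range(3))
    return u / w, v / w

# Illustrative KITTI-style numbers (focal ~721.5 px, baseline ~0.54 m):
depth_m = disparity_to_depth(30.0, 721.5, 0.54)  # roughly 13 m
```

Projecting LiDAR points additionally requires transforming them from the velodyne frame into the camera frame first (KITTI's `Tr_velo_to_cam` and `R0_rect`); pykitti exposes these parsed matrices on its calibration objects.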
habitat-lab
- [D] Looking for open source projects to contribute
There are plenty of them out there. I spend a lot of time contributing to open source projects like Habitat-Sim https://github.com/facebookresearch/habitat-sim and Habitat-Lab https://github.com/facebookresearch/habitat-lab, which have a ton of open issues and code maintenance work for which we would welcome contributions.
- Accelerate PPO training
- Facebook AI Introduces Habitat 2.0: Next-Generation Simulation Platform Provides Faster Training For AI Agents With Tactile Perception
Github: https://github.com/facebookresearch/habitat-lab
What are some alternatives?
ai-robot-hand-with-raspberry-pi - A robotics hand that mimics human hands using Computer Vision.
ai2thor-rearrangement - 🔀 Visual Room Rearrangement
stereoDepth - single and stereo calibration, disparity calculation.
Autonomous-Ai-drone-scripts - State of the art autonomous navigation scripts using Ai, Computer Vision, Lidar and GPS to control an arducopter based quad copter.
robo-vln - Pytorch code for ICRA'21 paper: "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
mini-cheetah-tmotor-python-can - Python Motor Driver for Mini-Cheetah based Actuators: T-Motor AK80-6/AK80-9 using SocketCAN Interface
drl_grasping - Deep Reinforcement Learning for Robotic Grasping from Octrees
procthor - 🏘️ Scaling Embodied AI by Procedurally Generating Interactive 3D Houses
carla - Open-source simulator for autonomous driving research.
habitat-api - A modular high-level library to train embodied AI agents across a variety of tasks, environments, and simulators. [Moved to: https://github.com/facebookresearch/habitat-lab]
habitat-sim - A flexible, high-performance 3D simulator for Embodied AI research.