Monocular-MiniSLAM vs pykitti
| | Monocular-MiniSLAM | pykitti |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 25 | 1,107 |
| Stars growth (monthly) | - | 1.9% |
| Activity | 4.3 | 5.3 |
| Last commit | about 3 years ago | 5 months ago |
| Language | Python | Python |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Monocular-MiniSLAM
SLAM backend correction with G2O
Also, I am trying to build a mini SLAM code base here: https://github.com/sakshamjindal/Monocular-MiniSLAM
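The "backend correction" mentioned above is pose-graph optimization: odometry edges accumulate drift, and a loop-closure edge lets the optimizer redistribute that error across the trajectory. The repo uses G2O for this; as a hedged illustration of the idea only (not the repo's actual code, and not the g2o API), here is a toy 1-D pose graph solved with plain NumPy least squares:

```python
import numpy as np

# Toy 1-D pose graph: poses x0..x3 connected by noisy odometry edges plus
# one loop-closure edge. Edge (i, j, z) measures the displacement x_j - x_i.
# Odometry sums to 3.0, but the loop closure says 2.7, so there is drift
# for the backend to correct. All values here are made up for illustration.
edges = [(0, 1, 1.1), (1, 2, 1.0), (2, 3, 0.9), (0, 3, 2.7)]
n = 4  # number of poses

# Build the overdetermined linear system A x = b, with one extra row
# anchoring x0 = 0 to fix the gauge (absolute position is unobservable).
A = np.zeros((len(edges) + 1, n))
b = np.zeros(len(edges) + 1)
for k, (i, j, z) in enumerate(edges):
    A[k, i], A[k, j], b[k] = -1.0, 1.0, z
A[-1, 0] = 1.0  # anchor row: x0 = 0

# Least-squares solve spreads the 0.3 loop inconsistency over all edges.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)
```

Real SLAM backends do the same thing over SE(3) poses, where the residuals are nonlinear in the pose parameters, which is why g2o iterates Gauss-Newton/Levenberg-Marquardt steps instead of a single linear solve.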
pykitti
Converting a disparity map to a depth map given a calibration file
That is the calibration file from the KITTI dataset. The README explains how to interpret those values. Maybe take a look at https://github.com/utiasSTARS/pykitti as well; it lets you parse these files in Python very quickly.
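Once the focal length and stereo baseline are extracted from the calibration file (pykitti exposes them via its calibration objects; the exact field names depend on its API), the conversion itself is just Z = f·B/d per pixel. A minimal sketch, with the calibration numbers hard-coded as assumptions rather than read through pykitti:

```python
import numpy as np

# Assumed calibration values for illustration only; the real numbers come
# from KITTI's calib_cam_to_cam.txt (e.g. parsed with pykitti).
focal_px = 721.5   # focal length in pixels (assumed)
baseline_m = 0.54  # stereo baseline in metres (typical KITTI order of magnitude)

def disparity_to_depth(disparity, f=focal_px, B=baseline_m):
    """Convert a disparity map (pixels) to a depth map (metres): Z = f*B/d."""
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.zeros_like(disparity)
    valid = disparity > 0           # disparity 0 conventionally means "no match"
    depth[valid] = f * B / disparity[valid]
    return depth

# Small disparities correspond to far points, large disparities to near ones.
disp = np.array([[38.961, 19.4805],
                 [77.922, 0.0]])
print(disparity_to_depth(disp))
```

Invalid pixels (zero disparity) are kept at depth 0 here; masking them as NaN is an equally common convention.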
What are some alternatives?
- robo-vln - PyTorch code for the ICRA'21 paper "Hierarchical Cross-Modal Agent for Robotics Vision-and-Language Navigation"
- stereoDepth - Single and stereo calibration, disparity calculation
- ai-robot-hand-with-raspberry-pi - A robotic hand that mimics human hands using computer vision
- simple-slam
- habitat-api - A modular high-level library to train embodied AI agents across a variety of tasks, environments, and simulators [moved to: https://github.com/facebookresearch/habitat-lab]
- procthor - 🏘️ Scaling Embodied AI by Procedurally Generating Interactive 3D Houses
- ai2thor-rearrangement - 🔀 Visual Room Rearrangement
- habitat-lab - A modular high-level library to train embodied AI agents across a variety of tasks and environments
- calibrated-backprojection-network - PyTorch implementation of Unsupervised Depth Completion with Calibrated Backprojection Layers (oral, ICCV 2021)