OK-Robot Alternatives
Similar projects and alternatives to ok-robot
-
InfluxDB
Power Real-Time Data Analytics at Scale. Get real-time insights from all types of time series data with InfluxDB. Ingest, query, and analyze billions of data points in real-time with unbounded cardinality.
-
obsninja
VDO.Ninja is a powerful tool that lets you bring remote video feeds into OBS or other studio software via WebRTC.
-
prql
PRQL is a modern language for transforming data — a simple, powerful, pipelined SQL replacement
-
FLiPStackWeekly
FLaNK AI Weekly covering Apache NiFi, Apache Flink, Apache Kafka, Apache Spark, Apache Iceberg, Apache Ozone, Apache Pulsar, and more...
-
SaaSHub
SaaSHub - Software Alternatives and Reviews. SaaSHub helps you find the best software and product alternatives
-
gpt-neox
An implementation of model parallel autoregressive transformers on GPUs, based on the DeepSpeed library.
-
readyset
Readyset is a MySQL and Postgres wire-compatible caching layer that sits in front of existing databases to speed up queries and horizontally scale read throughput. Under the hood, ReadySet caches the results of cached select statements and incrementally updates these results over time as the underlying data changes.
-
FinGPT
FinGPT: Open-Source Financial Large Language Models! 🔥 We release the trained model on HuggingFace.
ok-robot reviews and mentions
- Apple Explores Home Robotics as Potential 'Next Big Thing'
-
Low Cost Robot Arm
That's it, isn't it? The question is not how far away from that we are, but when you and I can actually afford it. Because, as the other commenter snarkily replies, human maids already exist. The lifestyle of the singularity is already here for the rich. What AI robots will enable is trickling that kind of lifestyle down to the rest of us (with some amount of social upheaval).
Let's say the robot that can do that comes out next year for $15 million. Could you afford one? I certainly can't. So pretend that it does: what changes for you and me? Nothing. So the robots that can do that won't be used as robot maids until the price comes down. Which it will. Open-source robotics and model-available AI will force things to be affordable sooner rather than later, because we'd all like a robot to do that for us.
The industrial versions will be used to do hideously dangerous things: underwater welding, chainsaw helicoptering, manual nuclear reactor rod removal. We already use machines for a lot of those difficult or impossible tasks; it's just a matter of programming the robots.
Which takes us back to today. How far away from that are we? The pieces are already here. Between https://ok-robot.github.io/ and https://mobile-aloha.github.io/, the building blocks exist. It's just a matter of time before someone puts the existing pieces together to make said robot; the only questions are who will be first to make it, who will be first to open-source it, and who will make it not just possible but affordable.
-
GPT-4, without specialized training, beat a GPT-3.5 class model that cost $10B
Thanks! Appreciate the kind words. In the next month or so (I'm interviewing and finishing my Master's, so there have been delays) I should have a follow-up covering more advancements in router-style VLAs, sensorimotor VLMs, and embedding-enriched vision models in general.
If you want a great overview of what a modern robotics stack looks like with all this, https://ok-robot.github.io/ was really good and will likely make it into the article. It's a VLA combined with existing RL methods to demonstrate multi-tasking robots, and it serves as a great glimpse into what a lot of researchers are working on. You won't see these techniques in robots in industrial or commercial settings yet; we're still too new at this for them to be reliable or capable enough to deploy on real tasks.
-
Figure robotics demos its OpenAI integration
The OK-Robot demo shows that the technology for it to be fairly general is there, though I have no idea whether Figure is using their technology or not. Simply being able to command a robot, instead of moving a turtle with G-code, is nothing short of astounding to those who aren't deeply involved and tracking the state-of-the-art progress in this area.
https://ok-robot.github.io/
- FLaNK Stack 26 February 2024
-
Show HN: OK-Robot: open, modular home robot framework for pick-and-drop anywhere
Disclaimer: I'm not one of the authors, but I work in this area.
You basically hit the nail on the head with these questions. This work is super cool, but you named a lot of the limitations with contemporary robot learning systems.
1. It's using an object classifier. It's described here (https://github.com/ok-robot/ok-robot/tree/main/ok-robot-navi...), but if I understand it correctly, they are using a ViT model (basically an image-classification model) to label images and project those labels onto a voxel grid. Then they use language embeddings from CLIP to pair the language query with the voxel grid. The limitation is that if they want this to run on the robot, they can't use the super huge versions of these models. While they could use a huge model in the cloud, that would introduce a lot of latency.
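As a hedged sketch (not the authors' actual code), the query side of such a semantic voxel map can be illustrated in a few lines: each voxel stores a CLIP-style feature vector, and a text query is answered by cosine similarity. The coordinates, the 4-d toy vectors, and the `query_voxel_map` helper below are all invented for illustration; real CLIP embeddings are 512- or 768-dimensional and come from the model itself.

```python
import math

def cosine_sim(a, b):
    # cosine similarity between two equal-length vectors
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def query_voxel_map(voxel_feats, text_feat):
    # return the (coord, feature) pair whose stored visual feature
    # best matches the text embedding
    return max(voxel_feats.items(), key=lambda kv: cosine_sim(kv[1], text_feat))

# Toy 4-d "embeddings"; real CLIP features are 512/768-d.
voxel_feats = {
    (0, 0, 1): [1.0, 0.0, 0.0, 0.0],  # patch the ViT labeled, say, "mug"
    (2, 1, 0): [0.0, 1.0, 0.0, 0.0],  # patch labeled, say, "sofa"
}
text_feat = [0.9, 0.1, 0.0, 0.0]      # stand-in for CLIP("a mug")
coord, feat = query_voxel_map(voxel_feats, text_feat)
```

On a real robot the navigation module would then plan a path toward the returned voxel; the latency concern above comes from running the embedding models, not from this lookup.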
Stats
ok-robot/ok-robot is an open-source project licensed under the MIT License, which is an OSI-approved license.
The primary programming language of ok-robot is Python.