NOTE:
The number of mentions on this list reflects mentions in common posts plus user-suggested alternatives.
Hence, a higher number indicates a more popular project.
Related posts
- [P] InferenceDB - Makes it easy to store predictions of real-time ML models in S3
- InferenceDB: Stream inferences of real-time ML models to S3 using Kafka
- InferenceDB: Stream inferences of real-time ML models in production to any data lake 🚀
- InferenceDB – Stream ML inferences to S3 or any data lake with CRDs
- InferenceDB – Stream predictions from KServe to S3 or any data lakes