Also, MLflow's serving path is not as optimized because it doesn't micro-batch incoming requests the way TorchServe, TensorFlow Serving, and BentoML do. https://github.com/bentoml/BentoML/tree/master/benchmark
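To make the micro-batching point concrete, here is a minimal sketch of the idea: individual requests are queued, and a background worker drains the queue into one batched model call, amortizing per-call overhead. This is an illustration only, not BentoML's, TorchServe's, or TF Serving's actual implementation; `MicroBatcher`, `predict_batch`, `max_batch_size`, and `max_latency_s` are hypothetical names.

```python
import queue
import threading
import time


class MicroBatcher:
    """Illustrative micro-batcher: groups concurrent single-item requests
    into one batched inference call (a sketch, not a real framework API)."""

    def __init__(self, predict_batch, max_batch_size=8, max_latency_s=0.01):
        self._predict_batch = predict_batch      # fn: list of inputs -> list of outputs
        self._max_batch_size = max_batch_size    # flush when the batch is this full
        self._max_latency_s = max_latency_s      # ...or when this much time has passed
        self._queue = queue.Queue()
        threading.Thread(target=self._loop, daemon=True).start()

    def predict(self, x):
        # Each caller enqueues its input with an Event to wait on.
        slot = {"input": x, "output": None, "done": threading.Event()}
        self._queue.put(slot)
        slot["done"].wait()
        return slot["output"]

    def _loop(self):
        while True:
            # Block for the first request, then drain more until the batch
            # is full or the latency budget runs out.
            batch = [self._queue.get()]
            deadline = time.monotonic() + self._max_latency_s
            while len(batch) < self._max_batch_size:
                remaining = deadline - time.monotonic()
                if remaining <= 0:
                    break
                try:
                    batch.append(self._queue.get(timeout=remaining))
                except queue.Empty:
                    break
            # One batched call serves every queued request.
            outputs = self._predict_batch([s["input"] for s in batch])
            for slot, out in zip(batch, outputs):
                slot["output"] = out
                slot["done"].set()
```

For example, with a toy "model" that doubles its inputs, `MicroBatcher(lambda xs: [x * 2 for x in xs]).predict(3)` returns `6`, and concurrent callers within the latency window share a single batched call. Without this kind of batching, every request pays the full per-call overhead on its own, which is the gap the benchmark above measures.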