BentoML Alternatives
Similar projects and alternatives to BentoML
- fastapi: FastAPI framework, high performance, easy to learn, fast to code, ready for production
- seldon-core: An MLOps framework to package, deploy, monitor and manage thousands of production machine learning models
- haystack: Haystack is an open source NLP framework for interacting with your data using Transformer models and LLMs (GPT-4, ChatGPT, and the like). It offers production-ready tools for quickly building complex question answering, semantic search, text generation applications, and more.
- Kedro: A Python framework for creating reproducible, maintainable and modular data science code.
- clearml: ClearML - auto-magical CI/CD to streamline your ML workflow: experiment manager, MLOps, and data management
- metaflow: Build and manage real-life data science projects with ease. (by zillow)
- postgresml: PostgresML is an AI application database. Download open source models from Huggingface, or train your own, to create and index LLM embeddings, generate text, or make online predictions using only SQL.
- Zulip: Zulip server and web application. Open-source team chat that helps teams stay productive and focused.
- Gravitational Teleport: The easiest, most secure way to access infrastructure.
- metamask-extension: The MetaMask browser extension enables browsing Ethereum blockchain enabled websites
BentoML reviews and mentions
- Ask HN: Who is hiring? (November 2022)
- [D] How to get the fastest PyTorch inference and what is the "best" model serving framework?
For 2), I am aware of a few options. Triton Inference Server is an obvious one, as is the 'transformer-deploy' version from LDS. My only reservation here is that they require model compilation or are architecture-specific. I am aware of others like BentoML, Ray Serve and TorchServe. Ideally I would have something that allows any PyTorch model to be used without the extra compilation effort (or at least makes it optional) and has some conveniences: ease of use, easy deployment, easy hosting of multiple models, and dynamic batching. I am really interested to hear people's experience here, as I know there are now quite a few options. Any help is appreciated! Disclaimer: I have no affiliation with, nor am I connected to, any of the libraries or companies listed here; these are just the ones I know of. Thanks in advance.
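For context on that last point, BentoML can expose an already-trained PyTorch model over HTTP without a compilation step, and its runners take care of the dynamic batching mentioned above. A minimal sketch, assuming BentoML 1.x's Service/runner API and a hypothetical saved model tagged `demo_model:latest`:

```python
import bentoml
from bentoml.io import NumpyNdarray

# Load a previously saved PyTorch model (hypothetical tag) and wrap it in a
# runner; the runner schedules inference and can batch concurrent requests.
runner = bentoml.pytorch.get("demo_model:latest").to_runner()

# A Service groups one or more runners behind HTTP endpoints, so several
# models can be hosted side by side.
svc = bentoml.Service("demo_service", runners=[runner])

@svc.api(input=NumpyNdarray(), output=NumpyNdarray())
async def predict(input_array):
    # async_run lets the runner merge concurrent requests into one batch.
    return await runner.async_run(input_array)
```

In BentoML 1.x a service like this is typically started with `bentoml serve`, and additional runners can be added to the same Service to host multiple models behind one process.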
- PostgresML is 8-40x faster than Python HTTP microservices
- Show HN: Truss – serve any ML model, anywhere, without boilerplate code
In this category I’m a big fan of https://github.com/bentoml/BentoML
What I like about it is their idiomatic developer experience. It reminds me of other Pythonic frameworks like Flask and Django in a good way.
I have no affiliation with them whatsoever, just an admirer.
- [P] Introducing BentoML 1.0 - A faster way to ship your models to production
GitHub page: https://github.com/bentoml/BentoML
- Show HN: Bentoctl – An open-source Terraform deployment tool for ML
Elastic License 2.0: https://github.com/bentoml/bentoctl/blob/v0.3.1/LICENSE.md, which also applies to their Yatai Kubernetes project, but strangely not (yet?) to the similarly named main repo, which is Apache-2.0: https://github.com/bentoml/BentoML/blob/main/LICENSE
- How to Build a Machine Learning Demo in 2022
Using a general-purpose framework such as FastAPI involves writing a lot of boilerplate code just to get your API endpoint up and running. If deploying a model for a demo is the only thing you are interested in and you do not mind losing some flexibility, you might want to use a specialized serving framework instead. One example is BentoML, which will allow you to get an optimized serving endpoint for your model up and running much faster and with less overhead than a generic web framework. Framework-specific serving solutions such as Tensorflow Serving and TorchServe typically offer optimized performance but can only be used to serve models trained using Tensorflow or PyTorch, respectively.
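To make the "boilerplate" comparison concrete, here is roughly what a hand-rolled FastAPI endpoint for a demo looks like, with model loading, request/response schemas, and tensor conversion all written by hand; a minimal sketch with a hypothetical model path and field names:

```python
from typing import List

import torch
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

# Everything below is hand-written plumbing: loading the model, defining the
# request/response schemas, and converting between JSON lists and tensors.
model = torch.jit.load("model.pt")  # hypothetical TorchScript model path
model.eval()

class PredictRequest(BaseModel):
    features: List[float]

class PredictResponse(BaseModel):
    prediction: List[float]

@app.post("/predict", response_model=PredictResponse)
def predict(req: PredictRequest) -> PredictResponse:
    with torch.no_grad():
        output = model(torch.tensor([req.features]))
    return PredictResponse(prediction=output.squeeze(0).tolist())
```

A specialized serving framework typically generates the endpoint, input validation, and API documentation from a much shorter service definition, which is the trade-off the article describes.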
- MLH, Open Source, Mapillary & Me
BentoML is a flexible, high-performance framework for serving, managing, and deploying machine learning models.
- Why do so many people think Python is easier to productionize than R?
Also, MLflow is not that optimized because it doesn't micro-batch like TorchServe, TF Serving, or BentoML: https://github.com/bentoml/BentoML/tree/master/benchmark
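For reference, BentoML's micro-batching (adaptive batching) is opted into when a model is saved, by marking a method signature as batchable; a minimal sketch, assuming BentoML 1.x's save_model API and a stand-in PyTorch module:

```python
import bentoml
import torch.nn as nn

# Stand-in model; in practice this would be your trained network.
model = nn.Linear(4, 2)

# Marking __call__ as batchable tells the runner created from this model that
# concurrent requests may be merged into one batch along dimension 0.
bentoml.pytorch.save_model(
    "demo_model",  # hypothetical model name
    model,
    signatures={"__call__": {"batchable": True, "batch_dim": 0}},
)
```

The maximum batch size and latency window used for this micro-batching are then tunable in the serving configuration.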
- Ask HN: Who is hiring? (April 2021)
BentoML.ai | ML Engineer, Backend Engineer | Full-time | Bay Area or Remote | Python, Kubernetes, MLOps platform, data infrastructure, TensorFlow, PyTorch, etc.
BentoML is an open-source framework for machine learning model serving and deployment: https://github.com/bentoml/BentoML
We are a venture-backed startup behind the BentoML open source project, and we are looking for engineers who are passionate about building open source, MLOps, ML platform, or developer tools. Email [ chaoyu at bentoml.ai] if you are interested.
Job descriptions: https://angel.co/company/bentoml
Stats
bentoml/BentoML is an open source project licensed under the Apache License 2.0, an OSI-approved license.
The primary programming language of BentoML is Python.