BentoML vs Poetry

| | BentoML | Poetry |
| --- | --- | --- |
| Mentions | 16 | 377 |
| Stars | 6,586 | 29,631 |
| Growth | 2.2% | 1.6% |
| Activity | 9.8 | 9.7 |
| Last commit | 1 day ago | 1 day ago |
| Language | Python | Python |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity - a relative measure of how actively a project is being developed, where recent commits carry more weight than older ones. For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
BentoML
- Who's hiring developer advocates? (December 2023)
- project ideas/advice for entry-level grad jobs?
There are a few tools you can use as "cheat mode" shortcuts to give you a leg up as you're getting started. Here's one: https://github.com/bentoml/BentoML
- Two high schoolers trying to use Azure/GCP/AWS - need help!
Then you can look into BentoML (https://github.com/bentoml/BentoML), which is used to deploy ML models, with many more benefits.
- Ask HN: Who is hiring? (November 2022)
- [D] How to get the fastest PyTorch inference and what is the "best" model serving framework?
For 2), I am aware of a few options. Triton Inference Server is an obvious one, as is the 'transformer-deploy' version from LDS. My only reservation here is that they require model compilation or are architecture-specific. I am aware of others like BentoML, Ray Serve and TorchServe. Ideally I would have something that allows any PyTorch model to be used without the extra compilation effort (or at least makes it optional) and has some conveniences: ease of use, easy deployment, easy hosting of multiple models, and dynamic batching. Anyway, I am really interested to hear people's experience here, as I know there are now quite a few options. Any help is appreciated! Disclaimer: I have no affiliation with, and am not connected in any way to, the libraries or companies listed here. These are just the ones I know of. Thanks in advance.
- PostgresML is 8-40x faster than Python HTTP microservices
- Congratulations on v1.0, BentoML 🍱 ! You are r/mlops OSS of the month!
- Show HN: Truss – serve any ML model, anywhere, without boilerplate code
In this category I’m a big fan of https://github.com/bentoml/BentoML
What I like about it is their idiomatic developer experience. It reminds me of other Pythonic frameworks like Flask and Django in a good way.
I have no affiliation with them whatsoever, just an admirer.
- [P] Introducing BentoML 1.0 - A faster way to ship your models to production
GitHub page: https://github.com/bentoml/BentoML
- Show HN: BentoML goes 1.0 – A faster way to ship your models to production
Poetry
- Understanding Dependencies in Programming
You can manage dependencies in Python with the package manager pip, which comes bundled with Python. pip lets you install and uninstall Python packages, and it uses a requirements.txt file to keep track of which packages your project depends on. However, pip does not offer robust dependency resolution or isolate dependencies between projects; this is where tools like pipenv and Poetry come in. These tools create a virtual environment for each project, separating the project's dependencies from the system-wide Python environment and from other projects.
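The per-project isolation these tools automate can be sketched with the standard-library `venv` module; the environment path used here is illustrative:

```shell
# Create an isolated environment for one project (path is illustrative).
python3 -m venv /tmp/poetry-demo-venv

# This pip installs into the environment, not the system site-packages.
/tmp/poetry-demo-venv/bin/pip --version

# Every virtual environment carries a pyvenv.cfg marker file.
test -f /tmp/poetry-demo-venv/pyvenv.cfg && echo "isolated environment ready"
```

pipenv and Poetry build on this same mechanism, creating and tracking one such environment per project automatically.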
- Implementing semantic image search with Amazon Titan and Supabase Vector
Poetry provides packaging and dependency management for Python. If you haven't already, install Poetry via pip (`pip install poetry`).
- From Kotlin Scripting to Python
- How to Enhance Content with Semantify
The Semantify repository provides an example Astro.js project. Ensure you have Poetry installed, then build the project from the root of the repository.
- Uv: Python Packaging in Rust
Has anyone else been paying attention to how hilariously hard it is to package PyTorch in poetry?
https://github.com/python-poetry/poetry/issues/6409
- Boring Python: dependency management (2022)
Based on this comment from 5 days ago [0], it's working? I'm not sure, I didn't dig in too far, but based on that comment it seems fair to say that it's not fully Poetry's fault: torch removed hashes (which Poetry needs to be effective) for a while, only recently adding them back.
Not sure where I would stand if I fully investigated it, though.
[0] https://github.com/python-poetry/poetry/issues/6409#issuecom...
- Fun with Avatars: Crafting the core engine | Part 1
We will run this project in Python 3.10 on Mac/Linux, and we will use Poetry to manage our dependencies. Later, we will bundle our app into a container using Docker for deployment.
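A minimal sketch of that setup, assuming a `pyproject.toml` and `poetry.lock` at the repository root and Poetry 1.2+; the base image and entrypoint are illustrative:

```dockerfile
# Python 3.10 base image, as used in the post
FROM python:3.10-slim

# Install Poetry, then the project's locked dependencies
RUN pip install poetry
WORKDIR /app
COPY pyproject.toml poetry.lock ./
# Install into the image's interpreter instead of a nested virtualenv
RUN poetry config virtualenvs.create false \
 && poetry install --no-root --only main

COPY . .
CMD ["python", "main.py"]  # entrypoint is a placeholder
```

Copying the lock files before the rest of the source lets Docker cache the dependency layer across code-only rebuilds.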
- Python Packaging, One Year Later: A Look Back at 2023 in Python Packaging
Here are the two main packaging issues I run into, specifically when using Poetry:
1) Lack of support for building extension modules (as mentioned by the article). There is a workaround using an undocumented feature [0], which I've tried, but ultimately decided it was not the right approach. I still use Poetry, but build the extension as a separate step in CI, rather than kludging it into Poetry.
2) Lack of support for offline installs [1], e.g. being able to download the dependencies, copy them to another machine, and perform the install from the downloaded dependencies (similar to using "pip --no-index --find-links=."). Again, you can work around this (by using "poetry export --with-credentials" and "pip download" for fetching the dependencies, then firing up pypiserver [2] to run a local PyPI server on the offline machine), but ideally this would all be a first class feature of Poetry, similar to how it is in pip.
I don't have the capacity to create Pull Requests for addressing these issues with Poetry, and I'm very grateful for the maintainers and those who do contribute. Instead, on the linked issues I share my notes on the matter, in the hope that it may at least help others and potentially get us closer to a solution.
Regardless, I'm sticking with Poetry for now. Though to be fair, the only other Python packaging tools I've used extensively are Pipenv and pip/setuptools. It's time consuming to thoroughly try out these other packaging tools, and is generally lower priority than developing features/fixing bugs, so it's helpful to read about the author's experience with these other tools, such as PDM and Hatch.
[0] https://github.com/python-poetry/poetry/issues/2740
[1] https://github.com/python-poetry/poetry/issues/2184
[2] https://pypi.org/project/pypiserver/
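The offline-install workaround described in (2) can be sketched as the following commands; the paths are illustrative:

```shell
# On a machine with network access: pin and fetch everything
poetry export --with-credentials -o requirements.txt
pip download -r requirements.txt -d ./wheels

# Copy requirements.txt and ./wheels to the offline machine, then:
pip install --no-index --find-links=./wheels -r requirements.txt
```

With plain `pip install --no-index` the local PyPI server is unnecessary for simple cases; pypiserver becomes useful when other tools on the offline machine expect a real index URL.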
- Introducing Flama for Robust Machine Learning APIs
We believe that Poetry is currently the best tool for this purpose, besides being the most popular one at the moment. This is why we will use Poetry to manage the dependencies of our project throughout this series of posts. Poetry allows you to declare the libraries your project depends on, and it will manage (install/update) them for you. Poetry also allows you to package your project into a distributable format and publish it to a repository such as PyPI. We strongly recommend learning more about this tool by reading the official documentation.
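As a sketch of what that declaration looks like, here is a minimal `pyproject.toml`; the project name, versions, and author are illustrative:

```toml
[tool.poetry]
name = "my-api"
version = "0.1.0"
description = "Example service managed by Poetry"
authors = ["Your Name <you@example.com>"]

[tool.poetry.dependencies]
python = "^3.10"
flama = "*"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```

`poetry install` resolves and installs these into the project's virtual environment; `poetry build` and `poetry publish` produce the distributable package and upload it to a repository such as PyPI.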
- How do you resolve dependency conflicts?
I started using Poetry. The problem is that Poetry will not install if there is a dependency conflict, and there is no way to ignore it: github
What are some alternatives?
- fastapi - FastAPI framework, high performance, easy to learn, fast to code, ready for production
- Pipenv - Python Development Workflow for Humans.
- seldon-core - An MLOps framework to package, deploy, monitor and manage thousands of production machine learning models
- PDM - A modern Python package and dependency manager supporting the latest PEP standards
- haystack - LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
- hatch - Modern, extensible Python project management
- clearml - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data Management, Pipeline, Orchestration, Scheduling & Serving in one MLOps/LLMOps solution
- pyenv - Simple Python version management
- Kedro - A toolbox for production-ready data science. It uses software engineering best practices to help you create data engineering and data science pipelines that are reproducible, maintainable, and modular.
- pip-tools - A set of tools to keep your pinned Python dependencies fresh.
- kubeflow - Machine Learning Toolkit for Kubernetes
- virtualenv - Virtual Python Environment builder