Media-Recommendation-Engine vs mlrun
| | Media-Recommendation-Engine | mlrun |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 12 | 1,287 |
| Growth | - | 5.5% |
| Latest commit | about 1 year ago | 2 days ago |
| Activity | 0.0 | 9.9 |
| Language | Python | Python |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones.
For example, an activity of 9.0 indicates that a project is among the top 10% of the most actively developed projects we track.
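The key property of the activity metric above is that recent commits weigh more than old ones. As a rough illustration of how such a score could be computed (a hypothetical half-life weighting; the site's actual formula is not published here):

```python
from datetime import datetime, timedelta, timezone

def activity_score(commit_dates, now=None, half_life_days=30.0):
    """Recency-weighted commit score: each commit contributes a weight
    that halves every `half_life_days`, so recent commits count more.
    Illustrative only -- not the real formula used by the site."""
    if now is None:
        now = datetime.now(timezone.utc)
    score = 0.0
    for d in commit_dates:
        age_days = (now - d).total_seconds() / 86400.0
        score += 0.5 ** (age_days / half_life_days)
    return score

now = datetime(2024, 1, 1, tzinfo=timezone.utc)
fresh = [now - timedelta(days=2)]    # committed 2 days ago
stale = [now - timedelta(days=365)]  # committed about a year ago
print(activity_score(fresh, now=now) > activity_score(stale, now=now))  # True
```

Under this sketch, a project whose last commit was "2 days ago" scores far higher than one whose last commit was "about 1 year ago", matching the 9.9 vs 0.0 contrast in the table.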
Media-Recommendation-Engine

mlrun

- Discussion on Need of Feature Stores
- I reviewed 50+ open-source MLOps tools. Here's the result
  > You should also add MLRun: https://github.com/mlrun/mlrun
- Has anyone here been able to deploy MLRun successfully on a Kubernetes cluster?
What are some alternatives?
mnist-mlops-learning - In this project I played with MLflow, Streamlit and FastAPI to create a training and prediction app for digit recognition
feast - Feature Store for Machine Learning
reco-model-monitoring - FastAPI + Prometheus + Grafana
dagster-example-pipeline - Template Dagster repo using Poetry and a single Docker container; works well with CI/CD
api-gateway - Kong API Gateway
flyte - Scalable and flexible workflow orchestration platform that seamlessly unifies data, ML and analytics stacks.
Jokes_api - JokesAPI is a REST API that serves two-part jokes.
SmartSim - SmartSim Infrastructure Library.
energy-forecasting - The Full Stack 7-Steps MLOps Framework | Learn ML & MLOps for free by designing, building and deploying an end-to-end ML batch system ~ source code + 2.5 hours of reading & video materials
phidata - Build AI Assistants with function calling and connect LLMs to external tools.
loopquest - A Production Tool for Embodied AI
mosec - A high-performance ML model serving framework that offers dynamic batching and CPU/GPU pipelines to fully utilize your compute resources