blackjack-basic-strategy
orchest
| | blackjack-basic-strategy | orchest |
|---|---|---|
| Mentions | 23 | 44 |
| Stars | 26 | 4,022 |
| Growth | - | 0.1% |
| Activity | 2.0 | 4.5 |
| Latest Commit | about 1 year ago | 11 months ago |
| Language | JavaScript | TypeScript |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
blackjack-basic-strategy
-
Show HN: Pip install inference, open source computer vision deployment
It's an easy-to-use inference server for computer vision models.
The end result is a Docker container that serves a standardized API as a microservice that your application uses to get predictions from computer vision models (though there is also a native Python interface).
It's backed by a bunch of component pieces:
* a server (so you don't have to reimplement things like image processing & prediction visualization on every project)
* standardized APIs for computer vision tasks (so switching out the model weights and architecture can be done independently of your application code)
* model architecture implementations (which implement the tensor parsing glue between images & predictions) for supervised models that you've fine-tuned to perform custom tasks
* foundation model implementations (like CLIP & SAM) that tend to chain well with fine-tuned models
* reusable utils to make adding support for new models easier
* a model registry (so your code can be independent from your model weights & you don't have to re-build and re-deploy every time you want to iterate on your model weights)
* data management integrations (so you can collect more images of edge cases to improve your dataset & model the more it sees in the wild)
* ecosystem (there are tens of thousands of fine-tuned models shared by users that you can use off the shelf via Roboflow Universe[1])
Additionally, since it's focused specifically on computer vision, it has specific CV-focused features (like direct camera stream input) and makes some different tradeoffs than other more general ML solutions (namely, optimized for small-fast models that run at the edge & need support for running on many different devices like NVIDIA Jetsons and Raspberry Pis in addition to beefy cloud servers).
[1] https://universe.roboflow.com
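As a rough sketch of the microservice pattern described above — a client sending an image to a containerized inference server and getting predictions back — here is a minimal example. The endpoint URL and payload shape are hypothetical illustrations, not Roboflow's actual API:

```python
import base64
import json
from urllib import request


def build_payload(image_bytes: bytes, confidence: float = 0.5) -> dict:
    # Base64-encode the image so it can travel inside a JSON body.
    return {
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "confidence": confidence,
    }


def infer(server_url: str, image_bytes: bytes) -> dict:
    # POST the payload to the inference microservice and parse the
    # JSON predictions it returns.
    body = json.dumps(build_payload(image_bytes)).encode("utf-8")
    req = request.Request(
        server_url, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Fake bytes stand in for a real image file here.
    payload = build_payload(b"\x89PNG-fake-image-bytes")
    print(sorted(payload))  # ['confidence', 'image']
```

Because the server exposes a standardized HTTP API, the application code above stays the same when the model weights or architecture behind the endpoint change.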
-
Open discussion and useful links for people trying to do Object Detection
* Most of the time I find Roboflow extremely handy; I've used it to merge datasets, augment, read tutorials, and that kind of thing. Basically you just create your dataset with Roboflow and focus on other aspects.
-
TensorFlow Datasets (TFDS): a collection of ready-to-use datasets
For computer vision, there are 100k+ open source classification, object detection, and segmentation datasets available on Roboflow Universe: https://universe.roboflow.com
-
Please suggest resources to learn how to work with pre-trained CV models
Solid website and app overall for learning more about computer vision, discovering datasets, and keeping up with advancements in the field:
* https://roboflow.com/learn
* https://universe.roboflow.com (datasets) | https://blog.roboflow.com/computer-vision-datasets-and-apis/
* https://blog.roboflow.com
-
Suggestion for identification problem with shipping labels?
If you're lacking training images, you can also use [Roboflow Universe](https://universe.roboflow.com) to obtain them (over 100 million labeled images available)
-
Ask HN: Who is hiring? (November 2022)
Roboflow | Multiple Roles | Full-time (Remote) | https://roboflow.com/careers
Roboflow is the fastest way to use computer vision in production. We help developers give their software the sense of sight. Our end-to-end platform[1] provides tooling for image collection, annotation, dataset exploration and curation, training, and deployment.
Over 100k engineers (including engineers from two-thirds of the Fortune 100 companies) build with Roboflow. And we now host the largest collection[2] of open source computer vision datasets and pre-trained models[3].
We have several openings available, but are primarily looking for strong technical generalists who want to help us democratize computer vision and like to wear many hats and have an outsized impact. (We especially love hiring past and future founders.)
We're hiring 3 full-stack engineers this quarter and we're also looking for an infrastructure engineer with Elasticsearch experience.
[1]: https://docs.roboflow.com
[2]: https://blog.roboflow.com/computer-vision-datasets-and-apis/
[3]: https://universe.roboflow.com
-
When annotating an image, if a collection of an entity changes the nature of the entity, do you label them collectively or separately?
Based on what I do/use when I prepare models: A good framework for creating and improving this dataset faster is to use Roboflow Universe and search "flowers" and "bouquets of flowers" in the search bar (it's like Google Images for CV datasets). You can search images by subject, or metadata, and clone them directly into a free public workspace (they house up to 10k images without charge).
* https://universe.roboflow.com/
* https://universe.roboflow.com/search?q=flowers
* https://universe.roboflow.com/search?q=bouqets
-
Need help finding an area where machine learning is applicable in day-to-day life but not implemented already
Lots of ideas will come to mind if you look and search through open source datasets: https://universe.roboflow.com/
-
Ask HN: Any good self-hosted image recognition software?
-
SAAS for object detection?
Open source datasets: https://universe.roboflow.com/
Model training: https://docs.roboflow.com/train
Model deployment: https://docs.roboflow.com/inference/hosted-api
orchest
-
Decent low code options for orchestration and building data flows?
You can check out our OSS https://github.com/orchest/orchest
-
Build ML workflows with Jupyter notebooks
-
Building container images in Kubernetes, how would you approach it?
The code example is part of our ELT/data pipeline tool called Orchest: https://github.com/orchest/orchest/
-
Launch HN: Patterns (YC S21) â A much faster way to build and deploy data apps
First want to say congrats to the Patterns team for creating a gorgeous looking tool. Very minimal and approachable. Massive kudos!
Disclaimer: we're building something very similar and I'm curious about a couple of things.
One of the questions our users have asked us often is how to minimize the dependence on "product specific" components/nodes/steps. For example, if you write CI for GitHub Actions you may use a bunch of GitHub Action references.
Looking at the `graph.yml` in some of the examples you shared you use a similar approach (e.g. patterns/openai-completion@v4). That means that whenever you depend on such components your automation/data pipeline becomes more tied to the specific tool (GitHub Actions/Patterns), effectively locking in users.
How are you helping users feel comfortable with that problem (I don't want to invest in something that's not portable)? It's something we've struggled with ourselves as we're expanding the "out of the box" capabilities you get.
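One common way to soften that lock-in concern (a generic sketch, not how Patterns or Orchest actually implement anything) is to keep each pipeline step as a plain function and inject the product-specific component behind a thin adapter, so the pipeline logic itself stays portable:

```python
from typing import Callable, Dict

# Portable pipeline steps: plain functions with no platform references.


def clean(text: str) -> str:
    return text.strip().lower()


def summarize(text: str, complete: Callable[[str], str]) -> str:
    # The platform-specific capability (e.g. an openai-completion
    # component) is injected rather than hard-coded, so this step
    # runs unchanged on any orchestrator.
    return complete(f"Summarize: {text}")


# Adapters bound at the edge; migrating platforms means swapping
# this mapping, not rewriting the steps. "local-echo" is a stand-in
# for a real vendor component.
ADAPTERS: Dict[str, Callable[[str], str]] = {
    "local-echo": lambda prompt: prompt.upper(),
}


if __name__ == "__main__":
    step_input = clean("  Hello World  ")
    print(summarize(step_input, ADAPTERS["local-echo"]))
    # SUMMARIZE: HELLO WORLD
```

The trade-off is extra indirection: the out-of-the-box convenience of named components like `patterns/openai-completion@v4` comes precisely from skipping this adapter layer.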
Furthermore, would have loved to see this as an open source project. But I guess the second best thing to open source is some open source contributions and `dcp` and `common-model` look quite interesting!
For those who are curious, I'm one of the authors of https://github.com/orchest/orchest
-
Argo became a graduated CNCF project
Haven't tried it. In its favor, Argo is vendor neutral and is really easy to set up in a local k8s environment like Docker Desktop or minikube. If you already use k8s for configuration, service discovery, secret management, etc., it's dead simple to set up and use (avoiding having to learn a whole new workflow configuration language in addition to k8s). The big downside is that it doesn't have a visual DAG editor (although that might be a positive for engineers having to fix workflows written by non-programmers), but the relatively bare-metal nature of Argo means that it's fairly easy to use it as an underlying engine for a more opinionated or lower-code framework (Orchest is a notable one out now).
-
Ideas for infrastructure and tooling to use for frequent model retraining?
-
Looking for a mentor in MLOps. I am a lead developer.
If youâd like to try something for you data workflows thatâs vendor agnostic (k8s based) and open source you can check out our project: https://github.com/orchest/orchest
-
Is there a good way to trigger data pipelines by event instead of cron?
You can find it here: https://github.com/orchest/orchest
Convenience install script: https://github.com/orchest/orchest#installation
-
How do you deal with parallelising parts of an ML pipeline especially on Python?
We automatically provide container level parallelism in Orchest: https://github.com/orchest/orchest
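Outside of a tool like Orchest, independent CPU-bound steps can be parallelized at the process level with just the standard library. This is a generic sketch of that approach, not Orchest's container-level mechanism:

```python
from concurrent.futures import ProcessPoolExecutor


def preprocess(item: int) -> int:
    # Stand-in for a CPU-bound pipeline step such as feature
    # extraction on one shard of the dataset.
    return item * item


if __name__ == "__main__":
    items = [1, 2, 3, 4]
    # Each item is handled in a separate process, sidestepping the
    # GIL for CPU-bound work; results come back in input order.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(preprocess, items))
    print(results)  # [1, 4, 9, 16]
```

Process pools help when steps are CPU-bound and independent; container-level parallelism (as in Orchest) additionally isolates each step's dependencies, at the cost of heavier infrastructure.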
-
Launch HN: Sematic (YC S22) â Open-source framework to build ML pipelines faster
For people in this thread interested in what this tool is an alternative to: Airflow, Luigi, Kubeflow, Kedro, Flyte, Metaflow, Sagemaker Pipelines, GCP Vertex Workbench, Azure Data Factory, Azure ML, Dagster, DVC, ClearML, Prefect, Pachyderm, and Orchest.
Disclaimer: author of Orchest https://github.com/orchest/orchest
What are some alternatives?
uxp-photoshop-plugin-samples - UXP Plugin samples for Photoshop 22 and higher.
docker-airflow - Docker Apache Airflow
wallet - The official repository for the Valora mobile cryptocurrency wallet.
hookdeck-cli - Receive events (e.g. webhooks) in your development environment
process-google-dataset - Process Google Dataset is a tool to download and process images for neural networks from a Google Image Search using a Chrome extension and a simple Python code.
ploomber - The fastest way to build data pipelines. Develop iteratively, deploy anywhere.
rollup-react-example - An example React application using Rollup with ES modules, dynamic imports, Service Workers, and Flow.
n8n - Free and source-available fair-code licensed workflow automation tool. Easily automate tasks across different services.
edenai-javascript - The best AI engines in one API: vision, text, speech, translation, OCR, machine learning, etc. SDK and examples for JavaScript developers.
label-studio - Label Studio is a multi-type data labeling and annotation tool with standardized output format
Speed-Coding-Games-in-JavaScript - Games Repository from Speed Coding channel
Node RED - Low-code programming for event-driven applications