| | scipipe | pytkml |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 1,054 | 5 |
| Growth | 0.2% | - |
| Activity | 3.0 | 0.0 |
| Latest commit | 10 months ago | almost 3 years ago |
| Language | Go | Python |
| License | MIT License | MIT License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
scipipe
-
Ask HN: What have you created that deserves a second chance on HN?
https://scipipe.org - A pipeline tool for shell commands, with a declarative flow-based API, in Go
GitHub link: https://github.com/scipipe/scipipe
There are many pipeline tools for shell commands, but most of them have one or more API limitations that make certain complex pipelines impossible, or very hard, to write.
We were pushing the limits of all the tools we tried, so we developed our own and implemented it in Go, with a declarative API for defining the data-flow dependencies instead of inventing yet another DSL. This has given us great flexibility in developing even complex pipelines, e.g. parameter sweeps nested with cross-validation, implemented as workflow constructs.
SciPipe is also unique in providing an audit report for every single output of the workflow, in a structured JSON format. A helper tool can convert these reports to an HTML report, a PDF, or a Bash script that will regenerate the accompanying output file from scratch.
An extra cool thing is that, because the audit reports live alongside the output files, if you run a SciPipe workflow that uses files generated by another SciPipe workflow, it will also pick up the full history of the input files generated by that earlier workflow. This means you get a 100% complete audit report, even if your analysis spans multiple workflows!
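To give a feel for the declarative API, here is a minimal sketch of a two-step workflow, adapted from memory of the hello-world example in the SciPipe docs. The method names (`NewWorkflow`, `NewProc`, `SetOut`, `From`) should be checked against the current documentation at https://scipipe.org, as the API may have changed between releases:

```go
package main

// Illustrative sketch only: a two-step SciPipe workflow where the
// second process declares a data-flow dependency on the first.
import sp "github.com/scipipe/scipipe"

func main() {
	// A workflow running at most 4 concurrent tasks.
	wf := sp.NewWorkflow("hello_world", 4)

	// {o:out} declares an output port named "out" inside the shell command.
	hello := wf.NewProc("hello", "echo 'Hello' > {o:out}")
	hello.SetOut("out", "hello.txt")

	// {i:in} declares an input port; the output path is derived
	// from the input path by the pattern below.
	world := wf.NewProc("world", "echo $(cat {i:in}) World > {o:out}")
	world.SetOut("out", "{i:in|%.txt}.world.txt")

	// Wire the dependency declaratively: world reads what hello wrote.
	world.In("in").From(hello.Out("out"))

	wf.Run()
}
```

Because dependencies are expressed as port connections in plain Go rather than in a DSL, ordinary language constructs (loops, functions, conditionals) can be used to build parameter sweeps and other complex topologies.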
pytkml
-
Ask HN: What have you created that deserves a second chance on HN?
Code: https://github.com/rbitr/pytkml
I didn't explain it well; this is an area that's becoming increasingly important
-
Ask HN: What is something you built but never marketed?
I worked on (though wouldn't say I completed) a testing framework for ML models, where you can specify a series of tests for a model to pass. The unique part is the emphasis on support from the training data for the inferences you expect the model to make, i.e. checking that similar training data exists for some test cases and is influential in predicting them. It's pretty niche, and I don't explain it very well, but I remain convinced that, with the right framing, it represents a more rigorous way of making sure machine learning models are built and used "responsibly".
https://github.com/rbitr/pytkml
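To make the "training-data support" idea concrete, here is a minimal, library-free sketch of one such check: for a given test case, verify that a majority of its nearest training examples carry the label we expect the model to predict. This is only an illustration of the concept (shown in Go for consistency with the SciPipe example above), not pytkml's actual API; all names here are hypothetical:

```go
package main

import (
	"fmt"
	"math"
	"sort"
)

// nearestTrainingSupport reports whether a majority of the k nearest
// training points to x carry the label we expect for x. The function
// name and signature are illustrative, not pytkml's API.
func nearestTrainingSupport(x []float64, xTrain [][]float64, yTrain []int, expected, k int) bool {
	type neighbor struct {
		dist  float64
		label int
	}
	neighbors := make([]neighbor, len(xTrain))
	for i, xt := range xTrain {
		// Euclidean distance from the test case to each training point.
		var sum float64
		for j := range xt {
			d := xt[j] - x[j]
			sum += d * d
		}
		neighbors[i] = neighbor{math.Sqrt(sum), yTrain[i]}
	}
	// Sort training points by distance to the test case.
	sort.Slice(neighbors, func(a, b int) bool { return neighbors[a].dist < neighbors[b].dist })
	// Count how many of the k nearest share the expected label.
	support := 0
	for _, n := range neighbors[:k] {
		if n.label == expected {
			support++
		}
	}
	return support > k/2
}

func main() {
	// Toy data: two well-separated clusters, labels 0 and 1.
	xTrain := [][]float64{{0, 0}, {0.1, 0.1}, {0.2, 0}, {5, 5}, {5.1, 4.9}, {4.9, 5.1}}
	yTrain := []int{0, 0, 0, 1, 1, 1}

	// A test case near cluster 0 is supported for label 0...
	fmt.Println(nearestTrainingSupport([]float64{0.05, 0.05}, xTrain, yTrain, 0, 3)) // true
	// ...but a point near cluster 1 is not.
	fmt.Println(nearestTrainingSupport([]float64{5, 5}, xTrain, yTrain, 0, 3)) // false
}
```

A test suite in this spirit would flag model predictions that lack nearby, influential training evidence, rather than only checking the predictions themselves.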
- Show HN: Writing tests for machine learning models
What are some alternatives?
codebase-visualizer-action - Visualize your codebase during CI.
pdfcomments
tripods-web - A puzzle game.
drummachine
UrlChecker - Android app by TrianguloY: URLCheck
india-pincode-regex - A simple regex based exhaustive validator for PIN codes in India
dotfile - Simple version control made for tracking single files
reddit-playlists
osxphotos - Python app to work with pictures and associated metadata from Apple Photos on macOS. Also includes a package to provide programmatic access to the Photos library, pictures, and metadata.
youtube2Anki - Convert YouTube transcripts to Anki cards
quart - An async Python micro framework for building web applications.
KaithemAutomation - Pure Python, GUI-focused home automation/consumer grade SCADA