| | torchtyping | star-history |
|---|---|---|
| Mentions | 7 | 37 |
| Stars | 1,337 | 5,884 |
| Growth | - | 2.0% |
| Activity | 3.2 | 8.8 |
| Latest Commit | 11 months ago | 2 days ago |
| Language | Python | TypeScript |
| License | Apache License 2.0 | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
torchtyping
-
[D] Have there been any attempts to create a programming language specifically for machine learning?
Not really an answer to your question, but there are Python packages that try to solve the problem of tensor shapes that you mentioned, e.g. https://github.com/patrick-kidger/torchtyping or https://github.com/deepmind/tensor_annotations
-
What's New in Python 3.11?
I disagree. I've had a serious attempt at array typing using variadic generics and I'm not impressed. Python's type system has numerous issues... and now they just apply to any "ArrayWithNDimensions" type as well as any "ArrayWith2Dimensions" type.
Variadic protocols don't exist; many operations like stacking are inexpressible; the syntax is awful and verbose; etc. etc.
I've written more about this here as part of my TorchTyping project: [0]
[0] https://github.com/patrick-kidger/torchtyping/issues/37#issu...
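The "ArrayWithNDimensions" complaint above can be made concrete with a small sketch. Before PEP 646's `TypeVarTuple` (added in Python 3.11), `typing.Generic` takes a fixed number of parameters, so each array rank needs its own class, and rank-changing operations like stacking cannot be expressed generically. The class names here are hypothetical, chosen only to illustrate the pattern:

```python
from typing import Generic, TypeVar

DType = TypeVar("DType")

# With fixed-arity generics, every rank needs a separate class:
class Array1(Generic[DType]):
    """A rank-1 array of DType elements."""

class Array2(Generic[DType]):
    """A rank-2 array of DType elements."""

# A function over "arrays of any rank" must be overloaded or typed
# as a union, and an operation that changes rank, such as stacking
# two rank-1 arrays into a rank-2 array, has no generic spelling:
def stack2(a: "Array1[DType]", b: "Array1[DType]") -> "Array2[DType]":
    """Hypothetical stack: the rank change must be written out by hand."""
    ...
```

PEP 646 lets a single class carry a variable number of shape parameters, but as the comment notes, variadic *protocols* and rank-manipulating operations remain hard to express.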
-
Can anyone point out the mistakes in my input layer or dimension?
also https://github.com/patrick-kidger/torchtyping
-
[D] Anyone using named tensors or a tensor annotation lib productively?
FWIW I'm the author of torchtyping so happy to answer any questions about that. :) I think people are using it!
-
[D] Ideal deep learning library
The one thing I really *really* wish got more attention was named tensors and the tensor type system. Tensor misalignment errors are a constant source of silently-failing bugs. While 3rd party libraries have attempted to fill this gap, it really needs better native support. In particular it seems like bad form to me for programmers to have to remember the specific alignment and broadcasting rules, and then have to apply them to an often poorly documented order of tensor indices. I'd really like to see something like tsalib's warp operator made part of the main library and generalized to arbitrary function application, like a named-tensor version of fold. But preferably using notation closer to that of torchtyping.
-
[P] torchtyping -- documentation + runtime type checking of tensor shapes (and dtypes, ...)
Yes it does work with numerical literals! It supports using integers to specify an absolute size, strings to specify names for dimensions that should all be consistently sized (and optionally also checks named tensors), "..." to indicate batch dimensions, and so on. See the full list here.
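The scheme described above (integer literals for exact sizes, strings for named dimensions that must agree across arguments) can be sketched in plain Python. This is *not* torchtyping's implementation or API; it is a stdlib-only illustration of the mechanism, with a hypothetical `shape_checked` decorator and a `FakeTensor` stand-in for a real tensor:

```python
import functools
import inspect

def shape_checked(fn):
    """Check tuple-annotated shapes at call time (hypothetical helper).

    Annotations are tuples mixing ints (exact sizes) and strings
    (named dimensions that must be consistent across all arguments).
    """
    sig = inspect.signature(fn)

    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        bound = sig.bind(*args, **kwargs)
        named = {}  # named dimension -> first observed size
        for pname, value in bound.arguments.items():
            spec = fn.__annotations__.get(pname)
            if spec is None:
                continue
            shape = value.shape
            if len(shape) != len(spec):
                raise TypeError(f"{pname}: expected rank {len(spec)}, got {len(shape)}")
            for actual, expected in zip(shape, spec):
                if isinstance(expected, int):
                    if actual != expected:
                        raise TypeError(f"{pname}: expected size {expected}, got {actual}")
                elif named.setdefault(expected, actual) != actual:
                    raise TypeError(f"{pname}: dim '{expected}' is inconsistent")
        return fn(*args, **kwargs)
    return wrapper

class FakeTensor:
    """Minimal stand-in for a tensor: only a .shape attribute."""
    def __init__(self, shape):
        self.shape = shape

@shape_checked
def matmul(a: ("batch", 3), b: (3, "out")):
    return "ok"
```

Calling `matmul(FakeTensor((2, 3)), FakeTensor((3, 4)))` passes, while `matmul(FakeTensor((2, 3)), FakeTensor((4, 5)))` raises a `TypeError` at call time rather than failing silently downstream.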
star-history
-
Stirling PDF: Self-hosted, web-based PDF manipulation tool
I have some questions about the Github star history; it's very unusual to see a ~1 year old repo with 20k+ stars.
It went from 6k to 15k+ stars in a few days around 2023 Christmas when global internet traffic is usually lowest, and I couldn't find any major social media posts or announcements around that time. If you're gonna buy stars don't buy 10k+ stars on one day, spread it out a bit!
https://star-history.com/#Stirling-Tools/Stirling-PDF&Date
https://www.google.com/search?q=%22stirling%22+%22PDF%22&sca...
-
Show HN: I've built a locally running perplexity clone
That’s a great project you pulled off. From the time I starred it (10-12h ago I think), and upon re-checking this post, you gained 500+ stars lol.
Visualized in a chart with star-history: https://star-history.com/#nilsherzig/LLocalSearch
-
What I learned from looking at 900 most popular open source AI tools
You can actively see a fresh "hype curve" in the transformer-debugger repo that was posted a couple days ago (https://github.com/openai/transformer-debugger) (star history https://star-history.com/#openai/transformer-debugger&Date).
Regardless of the repo's stars or how valuable it really is, at the time I saw it posted to HN, it had 1.6k stars in 16 hours. What channel are people listening to, to star it so quickly? I'm not implying any nefariousness, mind you; I'm only wondering where all the stargazers were referred from so fast and in such volume.
-
What I learned in 6 months of working on a CodeGen dev tool GPT Pilot
I’ve been releasing open-source projects for years now, and I’ve always wanted to see how fast my Github repo is growing compared to other successful repositories on https://star-history.com/. The problem is that on Star History, I’m unable to zoom into the graph, so a new repo that has 1,000 stars cannot be compared with a big repo that has 50,000 because you can’t see how the bigger repo does in its beginning. So, I asked GPT Pilot to build this functionality. It scrapes Github repos for stargazers, saves them into the database, plots them on a graph, and enables the graph to be zoomed in and out.
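The pipeline described above (scrape stargazers, store them, plot cumulative counts) hinges on one detail: GitHub's `GET /repos/{owner}/{repo}/stargazers` endpoint only includes `starred_at` timestamps when the request sends the `application/vnd.github.star+json` media type in the `Accept` header. Given those timestamps, turning them into a plottable cumulative series is a few lines; a minimal sketch of that aggregation step, with no network access assumed:

```python
from collections import Counter
from datetime import datetime

def cumulative_stars(starred_at):
    """Bin ISO-8601 'starred_at' timestamps into (day, cumulative_count)
    pairs, suitable for a star-history-style chart.

    The input format matches GitHub's stargazers endpoint when the
    'application/vnd.github.star+json' media type is requested.
    """
    # Count stars per calendar day ('Z' suffix normalized for fromisoformat).
    per_day = Counter(
        datetime.fromisoformat(ts.replace("Z", "+00:00")).date()
        for ts in starred_at
    )
    total, points = 0, []
    for day in sorted(per_day):
        total += per_day[day]
        points.append((day, total))
    return points
```

Plotting `points` directly gives the cumulative curve; because the raw timestamps are kept, the same data can be re-binned at any zoom level, which is the comparison capability the comment asks for.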
-
Htmx is a great front-end library, but its x account is full of memes
i'm a one man shop in montana, competing w/ Google, Vercel & Facebook for dev mindshare
if i did what everyone else does you never would have heard of htmx
https://star-history.com/#bigskysoftware/htmx&Date
-
Htmx and Web Components: A Perfect Match
also: https://star-history.com/#bigskysoftware/htmx&facebook/react...
-
Show HN: Like-History.ai
Similar to http://star-history.com for GitHub repos, http://like-history.ai is a tool to help generate the like history of projects on HuggingFace.co
More details: https://twitter.com/Tim_Qian/status/1730245069259575485
- Star History: the missing GitHub star history graph of GitHub repos
-
Htmx is part of the GitHub Accelerator
yeah, he was the one that really started the madness:
https://star-history.com/#bigskysoftware/htmx&bigskysoftware...
his video posted on july 7th
-
Startups are in first batch of GitHub OS Accelerator
Github star history graph of the batch:
https://star-history.com/#trpc/trpc&termux/termux-app&respon...
What are some alternatives?
jaxtyping - Type annotations and runtime checking for shape and dtype of JAX/NumPy/PyTorch/etc. arrays. https://docs.kidger.site/jaxtyping/
receiptline - Markdown for receipts. Printable digital receipts. Generate receipt printer commands and images.
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
starred - creating your own Awesome List by GitHub stars!
tsalib - Tensor Shape Annotation Library (numpy, tensorflow, pytorch, ...)
redux-undo - :recycle: higher order reducer to add undo/redo functionality to redux state containers
mypy - Optional static typing for Python
timeonsite - Timeonsitetracker.js - Modern & accurate "Time on site" tracking for web and mobile browsers
functorch - functorch is JAX-like composable function transforms for PyTorch.
robusta - Kubernetes observability and automation, with an awesome Prometheus integration
tensor_annotations - Annotating tensor shapes using Python types
nix-prisma-example - An example Prisma project using nix