Pytorch
Theano
| | Pytorch | Theano |
|---|---|---|
| Mentions | 335 | - |
| Stars | 77,783 | 9,852 |
| Growth | 2.4% | 0.1% |
| Activity | 10.0 | 5.0 |
| Latest commit | 1 day ago | 3 months ago |
| Language | Python | Python |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Pytorch
-
penzai: JAX research toolkit for building, editing, and visualizing neural nets
> does PyTorch have a similar concept
of course https://github.com/pytorch/pytorch/blob/main/torch/utils/_py...
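The linked file lives under `torch.utils._pytree`, PyTorch's counterpart to JAX's pytree registry. The core idea — flatten an arbitrarily nested container into a list of leaves plus a structure spec, transform the leaves, then rebuild — can be sketched in plain Python (function names here are illustrative, not PyTorch's actual API):

```python
# Minimal sketch of the "pytree" idea: flatten nested containers into
# (leaves, spec), then rebuild the same structure from new leaves.

def tree_flatten(tree):
    """Return (leaves, spec) for a nested dict/list/tuple; anything else is a leaf."""
    if isinstance(tree, (list, tuple)):
        leaves, specs = [], []
        for child in tree:
            ls, spec = tree_flatten(child)
            leaves.extend(ls)
            specs.append(spec)
        return leaves, (type(tree), specs)
    if isinstance(tree, dict):
        leaves, specs = [], []
        keys = sorted(tree)
        for k in keys:
            ls, spec = tree_flatten(tree[k])
            leaves.extend(ls)
            specs.append(spec)
        return leaves, (dict, (keys, specs))
    return [tree], None  # a leaf

def tree_unflatten(leaves, spec):
    """Rebuild the container described by spec from a flat list of leaves."""
    it = iter(leaves)
    def build(spec):
        if spec is None:
            return next(it)
        ctor, payload = spec
        if ctor is dict:
            keys, specs = payload
            return {k: build(s) for k, s in zip(keys, specs)}
        return ctor(build(s) for s in payload)
    return build(spec)

nested = {"w": [1, 2], "b": 3}
leaves, spec = tree_flatten(nested)        # leaves: [3, 1, 2] (keys sorted)
doubled = tree_unflatten([x * 2 for x in leaves], spec)
print(doubled)  # {'b': 6, 'w': [2, 4]}
```

This flatten/map/unflatten pattern is what lets frameworks apply a function uniformly to every tensor in a nested parameter structure.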
-
Tinygrad: Hacked 4090 driver to enable P2P
fyi should work on most 40xx[1]
[1] https://github.com/pytorch/pytorch/issues/119638#issuecommen...
-
The Elements of Differentiable Programming
Sure, right here: https://github.com/pytorch/pytorch/blob/main/torch/autograd/...
Here's the documentation: https://pytorch.org/tutorials/intermediate/forward_ad_usage....
> When an input, which we call “primal”, is associated with a “direction” tensor, which we call “tangent”, the resultant new tensor object is called a “dual tensor” for its connection to dual numbers[0].
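The primal/tangent pairing quoted above is the dual-number construction behind forward-mode AD. A toy scalar version can be written in plain Python (this is a sketch of the concept, not PyTorch's `forward_ad` API):

```python
# Dual numbers for forward-mode autodiff: carry a value (primal) and a
# directional derivative (tangent) through every operation.

class Dual:
    def __init__(self, primal, tangent=0.0):
        self.primal = primal    # the value
        self.tangent = tangent  # the derivative carried alongside it

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.primal + other.primal, self.tangent + other.tangent)

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.primal * other.primal,
                    self.tangent * other.primal + self.primal * other.tangent)

def f(x):
    return x * x + x  # f'(x) = 2x + 1

x = Dual(3.0, 1.0)  # seed the tangent with 1.0 to get df/dx
y = f(x)
print(y.primal, y.tangent)  # 12.0 7.0
```

Evaluating `f` once on the dual input yields both `f(3) = 12` and `f'(3) = 7`, with no separate backward pass — the same mechanism PyTorch's dual tensors apply to whole tensors.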
-
Functions and operators for Dot and Matrix multiplication and Element-wise calculation in PyTorch
My post explains Dot, Matrix, and Element-wise multiplication in PyTorch.
-
Dot vs Matrix vs Element-wise multiplication in PyTorch
In PyTorch with @, dot() or matmul():
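For readers without a PyTorch install handy, what `torch.dot` (1-D only) and `@` / `torch.matmul` (matrix product) compute can be sketched in plain Python; these helpers mirror the semantics but are not PyTorch code:

```python
# Plain-Python equivalents of torch.dot and torch.matmul on 2-D inputs.

def dot(a, b):
    """1-D . 1-D -> scalar, like torch.dot(a, b)."""
    assert len(a) == len(b), "operands must have the same length"
    return sum(x * y for x, y in zip(a, b))

def matmul(A, B):
    """2-D @ 2-D -> 2-D, like A @ B or torch.matmul(A, B)."""
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[dot(row, col) for col in zip(*B)] for row in A]

print(dot([1, 2], [3, 4]))           # 1*3 + 2*4 = 11
print(matmul([[1, 2]], [[3], [4]]))  # [[11]]
```

The distinction the post draws: `dot` contracts two vectors to a scalar, while `matmul`/`@` contracts the inner dimension of two matrices and keeps the outer ones.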
-
Building a GPT Model from the Ground Up!
import torch  # we use PyTorch: https://pytorch.org
data = torch.tensor(encode(text), dtype=torch.long)
print(data.shape, data.dtype)
print(data[:1000])  # to the GPT, the 1000 characters we looked at earlier will look like this
-
Open Source Ascendant: The Transformation of Software Development in 2024
AI's Open Embrace
Artificial intelligence (AI) and machine learning (ML) are increasingly leveraging open-source frameworks like TensorFlow [https://www.tensorflow.org/] and PyTorch [https://pytorch.org/]. This democratization of AI tools is driving innovation and lowering entry barriers across industries.
-
Best AI Tools for Students Learning Development and Engineering
Which label applies to a tool sometimes depends on what you do with it. For example, PyTorch or TensorFlow can be called a library, a toolkit, or a machine-learning framework.
-
Element-wise vs Matrix vs Dot multiplication
In PyTorch, `*` or `mul()` can multiply 0-D or higher-dimensional tensors by element-wise multiplication:
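Element-wise multiplication pairs up corresponding entries of same-shape operands (PyTorch additionally broadcasts mismatched shapes, which this sketch omits). A plain-Python illustration of what `*` / `torch.mul` compute, not PyTorch code:

```python
# Element-wise product of two equal-length sequences, like a * b
# or torch.mul(a, b) on same-shape 1-D tensors.

def mul(a, b):
    assert len(a) == len(b), "operands must have the same shape"
    return [x * y for x, y in zip(a, b)]

print(mul([1, 2, 3], [4, 5, 6]))  # [4, 10, 18]
```

Contrast with the dot product above: element-wise multiplication keeps the shape, while `dot` sums the products down to a scalar.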
-
Bash Debugging
When I was at Facebook, I wrote a Python script to extract shell scripts from GitHub Actions workflows, so we could run them all through ShellCheck: https://github.com/pytorch/pytorch/blob/69e0bda9996865e319db...
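The linked script's approach — pull the shell out of each `run:` step so ShellCheck can lint it — can be sketched as follows. A real tool would use a YAML parser and handle multi-line `run: |` blocks; this hypothetical toy only matches single-line `run:` entries:

```python
import re

def extract_run_lines(workflow_text):
    """Collect the shell command from each single-line `run:` step."""
    scripts = []
    for line in workflow_text.splitlines():
        m = re.match(r"\s*(?:-\s*)?run:\s*(\S.*)$", line)
        if m and m.group(1) != "|":  # skip the opener of a block scalar
            scripts.append(m.group(1))
    return scripts

workflow = """
jobs:
  build:
    steps:
      - run: make lint
      - name: test
        run: ./scripts/test.sh
"""
print(extract_run_lines(workflow))  # ['make lint', './scripts/test.sh']
```

Each extracted string could then be written to a temp file and passed to `shellcheck`, which is the point of centralizing the extraction.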
Theano
We haven't tracked posts mentioning Theano yet.
Tracking mentions began in Dec 2020.
What are some alternatives?
Flux.jl - Relax! Flux is the ML library that doesn't make you tensor
CNTK - Microsoft Cognitive Toolkit (CNTK), an open source deep-learning toolkit
mediapipe - Cross-platform, customizable ML solutions for live and streaming media.
mxnet - Lightweight, Portable, Flexible Distributed/Mobile Deep Learning with Dynamic, Mutation-aware Dataflow Dep Scheduler; for Python, R, Julia, Scala, Go, Javascript and more
Apache Spark - Apache Spark - A unified analytics engine for large-scale data processing
Caffe - Caffe: a fast open framework for deep learning.
flax - Flax is a neural network library for JAX that is designed for flexibility.
Keras - Deep Learning for humans
tinygrad - You like pytorch? You like micrograd? You love tinygrad! ❤️ [Moved to: https://github.com/tinygrad/tinygrad]
Caffe2
Pandas - Flexible and powerful data analysis / manipulation library for Python, providing labeled data structures similar to R data.frame objects, statistical functions, and much more
silero-models - Silero Models: pre-trained speech-to-text, text-to-speech and text-enhancement models made embarrassingly simple