torchsde vs tsalib

| | torchsde | tsalib |
|---|---|---|
| Mentions | 5 | 1 |
| Stars | 1,473 | 253 |
| Growth | 2.0% | 0.8% |
| Activity | 4.8 | 10.0 |
| Last commit | 7 months ago | almost 4 years ago |
| Language | Python | Python |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
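The exact formula behind the activity number isn't published on the page, but a recency-weighted commit score along the following lines captures the stated idea. Everything in this sketch (the half-life, the weighting scheme) is a hypothetical illustration, not the site's actual computation.

```python
from datetime import datetime, timedelta, timezone

def activity_score(commit_dates, half_life_days=90.0):
    """Hypothetical recency-weighted activity score: each commit
    contributes 0.5 ** (age_in_days / half_life_days), so a commit
    from last week counts far more than one from last year.
    commit_dates must be timezone-aware datetimes."""
    now = datetime.now(timezone.utc)
    return sum(0.5 ** ((now - d).days / half_life_days) for d in commit_dates)

# Three recent commits score close to their raw count...
recent = [datetime.now(timezone.utc) - timedelta(days=k) for k in (1, 5, 30)]
print(activity_score(recent))   # ~2.7

# ...while the same number of commits from years ago scores near zero.
stale = [datetime.now(timezone.utc) - timedelta(days=k) for k in (1400, 1405, 1430)]
print(activity_score(stale))    # ~0.00006
```

The relative 0-to-10 figure would then come from ranking such a raw score against every tracked project, e.g. a project at the 90th percentile landing at 9.0.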
torchsde
-
Google Research • Differentiable SDE solvers with GPU support and efficient sensitivity analysis in PyTorch, for stochastic differential equations in your deep learning models.
GitHub: https://github.com/google-research/torchsde
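For a sense of what the library looks like in use, here is a minimal sketch along the lines of the repository's quickstart: you define the drift f and diffusion g of an SDE as methods on a module, then integrate with torchsde.sdeint. The sizes and solver choice below are illustrative.

```python
import torch
import torchsde

batch_size, state_size, brownian_size, t_size = 32, 3, 2, 20

class SDE(torch.nn.Module):
    noise_type = 'general'  # diffusion output is (batch, state, brownian)
    sde_type = 'ito'

    def __init__(self):
        super().__init__()
        self.mu = torch.nn.Linear(state_size, state_size)
        self.sigma = torch.nn.Linear(state_size, state_size * brownian_size)

    def f(self, t, y):  # drift: returns (batch_size, state_size)
        return self.mu(y)

    def g(self, t, y):  # diffusion: returns (batch_size, state_size, brownian_size)
        return self.sigma(y).view(batch_size, state_size, brownian_size)

sde = SDE()
y0 = torch.full((batch_size, state_size), 0.1)
ts = torch.linspace(0, 1, t_size)
# Solutions at the requested times; shape (t_size, batch_size, state_size).
# The solve is differentiable, so gradients flow back into mu and sigma.
ys = torchsde.sdeint(sde, y0, ts, method='euler')
```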
-
[D] Ideal deep learning library
So not just that paper: our follow-up papers on the same topic, "Neural SDEs as Infinite-Dimensional GANs" and "Efficient and Accurate Gradients for Neural SDEs", are in fact implemented in PyTorch, specifically in the torchsde library. (Disclaimer: I am one of its developers.)
-
[D] Is there any way for GAN to generate arbitrary length of time series signal?
Code: SDE-GAN example in torchsde.
-
[P] Final Year Computer Science Project Suggestions
If you're interested in finance, then I'd recommend Neural SDEs: https://arxiv.org/abs/2102.03657 https://arxiv.org/abs/2105.13493 https://github.com/google-research/torchsde/blob/master/examples/sde_gan.py
-
Simple & Fast GAN Training [D]
This may or may not fit what you're after.
tsalib
-
[D] Ideal deep learning library
The one thing I really *really* wish got more attention was named tensors and the tensor type system. Tensor misalignment errors are a constant source of silently failing bugs. While third-party libraries have attempted to fill this gap, it really needs better native support. In particular, it seems like bad form to me for programmers to have to remember the specific alignment and broadcasting rules, and then apply them to an often poorly documented order of tensor indices. I'd really like to see something like tsalib's warp operator made part of the main library and generalized to arbitrary function application, like a named-tensor version of fold, but preferably using notation closer to that of torchtyping.
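To make the named-tensor point concrete, here is a hedged sketch of tsalib's dimension variables, assuming the dim_vars API shown in its README (exact details may differ). Declared once, the named dimensions act like plain integers, so shapes and reshapes stay self-documenting instead of being anonymous index positions.

```python
import numpy as np
from tsalib import dim_vars

# Declare named dimension variables once, with default sizes
# (assumed README syntax: 'Name(shorthand):default_size').
B, T, D = dim_vars('Batch(b):32 SeqLen(t):128 EmbedDim(d):512')

x = np.random.randn(B, T, D)    # dim vars behave like plain ints here

# The shape change is readable from the names alone:
# (B, T, D) -> (B, T, 2, D//2)
y = x.reshape(B, T, 2, D // 2)
```

tsalib's warp operator, the one the quote above asks to see generalized, chains such view and permute transformations from a single shorthand specification.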
What are some alternatives?
torchdyn - A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods
torchtyping - Type annotations and dynamic checking for a tensor's shape, dtype, names, etc.
pysindy - A package for the sparse identification of nonlinear dynamical systems from data
hasktorch - Tensors and neural networks in Haskell
tabnet - PyTorch implementation of the TabNet paper: https://arxiv.org/pdf/1908.07442.pdf
equinox - Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/
SSD-pytorch - SSD: Single Shot MultiBox Detector pytorch implementation focusing on simplicity
dex-lang - Research language for array processing in the Haskell/ML family
NeuralCDE - Code for "Neural Controlled Differential Equations for Irregular Time Series" (Neurips 2020 Spotlight)
functorch - JAX-like composable function transforms for PyTorch.
pix2pixHD - Synthesizing and manipulating 2048x1024 images with conditional GANs