| | delve | pytea |
|---|---|---|
| Mentions | 1 | 3 |
| Stars | 77 | 310 |
| Growth | - | 0.3% |
| Activity | 4.0 | 1.8 |
| Last commit | about 1 year ago | about 2 years ago |
| Language | Python | TypeScript |
| License | MIT License | GNU General Public License v3.0 or later |
- **Stars** - the number of stars a project has on GitHub.
- **Growth** - month-over-month growth in stars.
- **Activity** - a relative measure of how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects being tracked.
What are some alternatives?
- **cleverhans** - An adversarial example library for constructing attacks, building defenses, and benchmarking both.
- **examples** - A set of examples around PyTorch in Vision, Text, Reinforcement Learning, etc.
- **only_train_once** - OTOv1-v3, NeurIPS, ICLR, TMLR, DNN Training, Compression, Structured Pruning, Erasing Operators, CNN, Diffusion, LLM.
- **explainerdashboard** - Quickly build Explainable AI dashboards that show the inner workings of so-called "blackbox" machine learning models.
- **uncertainty-toolbox** - Uncertainty Toolbox: a Python toolbox for predictive uncertainty quantification, calibration, metrics, and visualization.
- **TorchDrift** - Drift detection for your PyTorch models.
- **WeightWatcher** - The WeightWatcher tool for predicting the accuracy of deep neural networks.
- **backpack** - BackPACK: a backpropagation package built on top of PyTorch that efficiently computes quantities other than the gradient.
- **vivit** - [TMLR 2022] Curvature access through the generalized Gauss-Newton's low-rank structure: eigenvalues, eigenvectors, directional derivatives & Newton steps.
- **Transformer-MM-Explainability** - [ICCV 2021 Oral] Official PyTorch implementation of Generic Attention-model Explainability for Interpreting Bi-Modal and Encoder-Decoder Transformers, a novel method to visualize any Transformer-based network, including examples for DETR and VQA.