open_lth

A repository in preparation for open-sourcing lottery ticket hypothesis code. (by facebookresearch)

Open_lth Alternatives

Similar projects and alternatives to open_lth

  • PyTorch

    340 open_lth VS PyTorch

    Tensors and Dynamic neural networks in Python with strong GPU acceleration

  • composer

    Supercharge Your Model Training (by mosaicml)

  • pytorch-lightning

    Discontinued. Build high-performance AI models with PyTorch Lightning (organized PyTorch). Deploy models with Lightning Apps (organized Python to build end-to-end ML systems). [Moved to: https://github.com/Lightning-AI/lightning] (by PyTorchLightning)

  • ffcv

    8 open_lth VS ffcv

    FFCV: Fast Forward Computer Vision (and other ML workloads!)

  • apex

    5 open_lth VS apex

    A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch (by NVIDIA)

NOTE: The number of mentions on this list reflects mentions in shared posts plus user-suggested alternatives. Hence, a higher number indicates a better open_lth alternative or higher similarity.

open_lth reviews and mentions

Posts with mentions or reviews of open_lth. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-06-04.
  • [D] Where do we currently stand at in lottery ticket hypothesis research?
    2 projects | /r/MachineLearning | 4 Jun 2022
    Here https://github.com/facebookresearch/open_lth
  • [P] Composer: a new PyTorch library to train models ~2-4x faster with better algorithms
    7 projects | /r/MachineLearning | 16 Mar 2022
    The way I see it, what we're working on is really a completely new layer in the stack: speeding up the algorithm itself by changing the math. We've still taken great pains to make sure everything else in Composer runs as efficiently as it can, but, as long as you're running the same set of mathematical operations in the same order, there isn't much room to distinguish one trainer from another, and I'd guess that there isn't much of a raw speed difference between Composer and PTL in that sense. For that reason, we aren't very focused on inter-trainer speed comparisons: 10% or 20% here or there is a rounding error on the 4x or more that you can expect in the long run by changing the math. (I will say, though, that the engineers at MosaicML are really good at what they do, and Composer is performance tuned. It absolutely wipes the floor with the OpenLTH trainer I tried to write for my PhD, even without the algorithmic speedups.)
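The lottery ticket hypothesis research that open_lth supports centers on iterative magnitude pruning: train, prune the smallest-magnitude weights, rewind, and retrain. A minimal NumPy sketch of the pruning step (the `magnitude_prune` helper is hypothetical, not part of the open_lth API):

```python
import numpy as np

def magnitude_prune(weights, mask, fraction):
    """Prune `fraction` of the currently surviving weights by magnitude.

    Weights whose mask entry is already 0 stay pruned; among the rest,
    the smallest-|w| weights are zeroed out in the returned mask.
    """
    surviving = np.abs(weights[mask == 1])
    k = int(len(surviving) * fraction)
    if k == 0:
        return mask
    threshold = np.sort(surviving)[k - 1]  # k-th smallest surviving magnitude
    new_mask = mask.copy()
    new_mask[np.abs(weights) <= threshold] = 0
    return new_mask

# Toy example: two rounds of 50% pruning on an 8-weight "layer".
rng = np.random.default_rng(0)
w = rng.normal(size=8)
mask = np.ones_like(w)
mask = magnitude_prune(w, mask, 0.5)  # 4 weights survive
mask = magnitude_prune(w, mask, 0.5)  # 2 weights survive
print(int(mask.sum()))  # → 2
```

In the full procedure, the surviving weights would be rewound to their values from early in training and the masked network retrained; repeating this for several rounds is what produces the sparse "winning tickets" studied in the papers behind this repo.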

Stats

Basic open_lth repo stats
Mentions: 2
Stars: 618
Activity: 0.0
Last commit: over 1 year ago

facebookresearch/open_lth is an open source project licensed under the MIT License, which is an OSI-approved license.

The primary programming language of open_lth is Python.

