nvidia-docker VS git-lfs

Compare nvidia-docker vs git-lfs and see what their differences are.

nvidia-docker

Build and run Docker containers leveraging NVIDIA GPUs (by NVIDIA)

git-lfs

Git extension for versioning large files (by git-lfs)
                nvidia-docker         git-lfs
Mentions        53                    159
Stars           16,998                12,492
Growth          -                     0.9%
Activity        0.0                   9.0
Last commit     5 months ago          4 days ago
Language        Makefile              Go
License         Apache License 2.0    GNU General Public License v3.0 or later
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

nvidia-docker

Posts with mentions or reviews of nvidia-docker. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-08.
  • What are the best AI tools you've ACTUALLY used?
    30 projects | /r/artificial | 8 Jun 2023
    Nvidia Docker on GitHub
  • Plex setup through Docker + Nvidia card, but hardware acceleration stops working after some time
    2 projects | /r/PleX | 3 Jun 2023
    Here's where I found discussion regarding this https://github.com/NVIDIA/nvidia-docker/issues/1671
  • Seeking Guidance on Leveraging Local Models and Optimizing GPU Utilization in containerized packages
    1 project | /r/LocalLLaMA | 21 May 2023
    I found the FAQ; it looks like Windows isn't supported, which might explain why I had this problem earlier. I might need to dual-boot my machine if it won't work with WSL, which I don't see mentioned on either page. I found the WSL CUDA instructions, so I'll give them a try.
  • Which GPU for HW transcoding in PMS: Intel Arc or Nvidia?
    1 project | /r/PleX | 20 Apr 2023
    Arc has Linux kernel support from 6.0; I've been using an A770 with tdarr for a few months. Super solid, with no issues like the NVIDIA Docker toolkit just losing the GPU. The workaround doesn't hold for long - https://github.com/NVIDIA/nvidia-docker/issues/1730 It is one of the reasons I went with Intel over waiting for a low-end 40 series. The other was that I basically stole it for $199. So far Plex is the only thing that doesn't work with Arc, and all the HW transcoding falls on the iGPU. Knowing how they prioritize things nobody wants, Arc support and AV1 transcoding will be added when 16th-gen Intel CPUs are released.
  • [D] Would a Tesla M40 provide cheap inference acceleration for self-hosted LLMs?
    8 projects | /r/MachineLearning | 11 Apr 2023
  • Help! Accelerated-GPU with Cuda and CuPy
    1 project | /r/wsl2 | 8 Apr 2023
    I use Ubuntu with WSL2 under Windows 11. I want to learn how to get GPU-accelerated computing working. My ASUS laptop has an NVIDIA graphics card as well as a built-in Intel GPU. Please ask me any needed questions and then tell me the steps I need to get this working.

    SOFTWARE DEVELOPER: Thanks for sharing that information. Can you please confirm whether you have already installed NVIDIA drivers on your Windows 11 system? If not, you'll need to install the appropriate NVIDIA drivers for your GPU in order to enable GPU acceleration. Additionally, what is your end goal with GPU-accelerated computing?

    Yes, I have the NVIDIA control panel installed.

    SOFTWARE DEVELOPER: Great. To enable GPU acceleration in WSL2, you'll need to follow these steps:

    1. Confirm you are running Windows 11 build 20145 or later, and have WSL2 installed and configured to use Ubuntu.
    2. Download the NVIDIA Linux driver for your GPU from the official NVIDIA download page https://www.nvidia.com/Download/index.aspx.
    3. Install the latest NVIDIA driver, including the NVIDIA Docker plugin. You can follow the instructions on the NVIDIA Docker plugin GitHub page (https://github.com/NVIDIA/nvidia-docker).
    4. Install the CUDA toolkit and cuDNN library. Once you've installed the NVIDIA driver successfully, you can download the CUDA toolkit and cuDNN library (if applicable) from the official NVIDIA CUDA download page https://developer.nvidia.com/cuda-downloads.
    5. Install the NVIDIA Container Toolkit, a package that provides a set of tools for GPU-accelerated container deployments. You can find installation instructions and a getting-started guide on the NVIDIA Container Toolkit GitHub page https://github.com/NVIDIA/nvidia-docker.
    6. Test the installation by running a TensorFlow or PyTorch script that uses the GPU for accelerated computing, and check whether the GPU is recognized and being used in the script's output.

    Once you've successfully followed these steps, you should have a GPU-accelerated computing environment in WSL2. Hope that helps! Let me know if you have any other questions.

    Yes to question #1. On #2, do I download the Game Ready driver or the Studio Driver? I don't play games, as I'm a software developer, but I want accelerated graphics for development or data science purposes.

    SOFTWARE DEVELOPER: Great to hear that you have WSL2 set up on your system. Regarding your question, you should download the Studio Driver version, as it's optimized for professional workloads, including data science and software development. The Game Ready version is optimized for gaming and might not always provide the best performance for non-gaming workloads. Once you have downloaded the Studio Driver for your GPU, follow the instructions provided by the driver installer to complete the installation process. Let me know if you have any further questions!

    Would you clarify #3? Is the NVIDIA Docker plugin included with the NVIDIA driver install, or is it a separate install? I have Docker for Windows installed and working in WSL2.

    SOFTWARE DEVELOPER: Sure, I'd be happy to clarify. The NVIDIA Docker plugin is an additional package that needs to be installed separately from the NVIDIA drivers. The plugin provides a convenient way to create and run GPU-accelerated Docker containers on your system. To install it, follow the instructions on the NVIDIA Docker plugin GitHub page (https://github.com/NVIDIA/nvidia-docker).
    Note that the plugin is only compatible with Docker CE (Community Edition) or Docker EE (Enterprise Edition) on Linux-based systems, and does not work with Docker for Windows. Once you've installed the plugin, you can use it to run GPU-accelerated containers with Docker. It's worth noting that the plugin requires access to your host system's NVIDIA GPU, so you'll need to ensure that your GPU has been correctly installed and recognized by your system. Hope that answers your question. Let me know if you have any additional questions!
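    The thread above boils down to: install the Windows NVIDIA driver, the CUDA toolkit inside WSL2, and the NVIDIA Container Toolkit, then test. A minimal sketch of how that final test step typically looks from an Ubuntu shell in WSL2, assuming the driver and toolkit are already installed (the CUDA image tag is illustrative):

      # Confirm the Windows host driver is visible from inside WSL2
      nvidia-smi

      # Confirm Docker can hand the GPU to a container via the NVIDIA Container Toolkit
      docker run --rm --gpus all nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi

      # Optionally confirm a framework such as PyTorch sees the GPU
      python3 -c "import torch; print(torch.cuda.is_available())"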
  • Plex Transcode (VC1 (HW) 1080p H264 (HW) 1080p) on Pixel 7 Pro
    1 project | /r/PleX | 3 Apr 2023
    I'm trying to determine how to troubleshoot and resolve the HW transcoding, but based on my testing I'm assuming it's some change to the NVIDIA toolkit https://github.com/NVIDIA/nvidia-docker
  • Jellyfin with NVIDIA acceleration stopped working
    1 project | /r/jellyfin | 8 Mar 2023
  • Dockerize CUDA-Accelerated Applications
    1 project | dev.to | 3 Mar 2023
    NVIDIA Container Toolkit
  • Setting up a new unraid server with vgpu and plex docker transcodes
    1 project | /r/unRAID | 26 Jan 2023
    So I am in the initial planning stages of setting up a new Unraid server. Looking at picking up an SC846 24-bay 4U chassis. I've got a Gigabyte Aorus motherboard with an AMD 5950X, 32 GB of DDR4 (adding more as needed) and an NVIDIA 3070 Ti. I plan on getting an LSI 8i for the drives, which leaves room for expansion in future server plans.

    My goal is to have Plex set up via Docker and utilize GPU transcoding to offload the CPU work. I also want to set up VMs, or a VM server, to essentially also have a "gaming server", mainly for me and the kids. This means down the road I would be adding another GPU to split up with other users. I'm trying to allow for a max of 4 people while also still allowing Plex to transcode as needed.

    Now I know there are other ways to do this, but I don't feel like splitting this up into multiple systems unless I have to. So really just trying to see if this might be possible. My worry is that in order to make the GPU available to the Plex Docker container I have to set up an NVIDIA container. https://github.com/NVIDIA/nvidia-docker
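    A recurring pattern across these posts is handing an NVIDIA GPU to a container such as Plex through the NVIDIA Container Toolkit. A minimal sketch of what that usually looks like once the toolkit and host driver are installed (the image name and host paths are illustrative placeholders):

      # Run Plex with the GPU exposed for hardware transcoding
      docker run -d --name plex \
        --gpus all \
        -e NVIDIA_DRIVER_CAPABILITIES=all \
        -v /path/to/plex/config:/config \
        -v /path/to/media:/data \
        -p 32400:32400 \
        plexinc/pms-docker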

git-lfs

Posts with mentions or reviews of git-lfs. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-03-04.
  • Git-annex: manage large files in Git without storing the contents in Git
    1 project | news.ycombinator.com | 16 Apr 2024
    What's the difference between this and Git-LFS?

    https://git-lfs.com/

  • Twenty Years Is Nothing
    4 projects | news.ycombinator.com | 4 Mar 2024
  • Aho – a Git implementation in Awk
    8 projects | news.ycombinator.com | 10 Feb 2024
    It doesn't, since Git's data model has to be changed to content-defined chunks to solve the issue.

    You should look at git-lfs[1] instead.

    [1] https://git-lfs.com

  • Launch HN: Diversion (YC S22) – Cloud-Native Git Alternative
    5 projects | news.ycombinator.com | 22 Jan 2024
    Congrats on the HN launch. How does this improve on, expand on, or blow git-lfs[1] out of the water? If I needed large blob file support, git-lfs is what I would use instead. It gives the hosted Git repo pointers to the big files instead of pushing around the binaries itself -- though I am speculating, since I've not used it myself, just read about it online.

    [1] https://git-lfs.com/
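    That speculation is roughly how Git LFS behaves: tracked files are replaced in the repository by small text pointers, while the actual binary content is uploaded to an LFS store. A minimal sketch, assuming git-lfs is installed (the *.psd pattern and file name are illustrative):

      # Tell LFS which paths to manage; this records the pattern in .gitattributes
      git lfs install
      git lfs track "*.psd"
      git add .gitattributes

      # Commit a large file as usual; what Git actually stores is a small pointer like:
      #   version https://git-lfs.github.com/spec/v1
      #   oid sha256:<64-hex-digit hash of the file contents>
      #   size 104857600
      git add artwork.psd
      git commit -m "Add artwork via LFS"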

  • Ask HN: How do you keep your documentation, how-to, examples and blogs updated?
    1 project | news.ycombinator.com | 15 Dec 2023
    Specifics depend on project types, but literate programming[0] and using/enforcing coding/git/versioning standards help. Re: outdated responses -- an email list for "new/updated version available" announcements with the errata/changelog location.

    [0] : https://blog.bitsrc.io/literate-programming-a-radical-approa...

    [1] : https://blog.codacy.com/coding-standards

    [2] : https://github.com/git-lfs/git-lfs/blob/main/.github/workflo...

  • Ask HN: Can we do better than Git for version control?
    17 projects | news.ycombinator.com | 10 Dec 2023
    fine with layers: e.g., large binary files via git-lfs (https://git-lfs.com) and merge conflicts in non-textual files by custom merge resolvers like Unity’s (https://flashg.github.io/GitMerge-for-Unity/).

    Perhaps in the future, almost everyone will keep using Git at the core, but have so many layers to make it more intuitive and provide better merges, that what they’re using barely resembles Git at all. This flexibility and the fact that nearly everything is designed for Git and integrates with Git, are why I doubt it’s ever going away.

    Some alternatives for thought:

    - pijul (https://pijul.org), a completely different VCS which allegedly has better merges/rebases. It's in beta, but I rarely hear about it nowadays and have heard more bad than good. I don't think we can implement these alternate rebases in Git, but maybe we don't need to; even after reading the website, I don't understand why pijul's merges are better, and in particular I can't think of a concrete example, nor does pijul provide one.

    - Unison (https://www.unison-lang.org). This isn’t a VCS, but a language with a radical approach to code representation: instead of code being text stored in files, code is ASTs referenced by hash and stored in essentially a database. Among other advantages, the main one is that you can rename symbols and they will automatically propagate to dependencies, because the symbols are referenced by their hash instead of their name. I believe this automatic renaming will be common in the future, whether it’s implemented by a layer on top of Git or alternate code representation like Unison (to be clear, Unison’s codebases are designed to work with Git, and the Unison project itself is stored in Git repos).

    - SVN, the other widespread VCS. Google or ask ChatGPT “Git vs SVN” and you’ll get answers like this (https://www.linode.com/docs/guides/svn-vs-git/, https://stackoverflow.com/a/875). Basically, SVN is easier to understand and handles large files better, Git is decentralized and more popular. But what about the differences which can’t be resolved by layers, like lazygit for intuition and git-lfs for large files? It seems to me like even companies with centralized private repositories use Git, meaning Git will probably win in the long term, but I don’t work at those companies so I don’t really know.

    - Mercurial and Fossil, the other widespread VCSs. It seems these are more similar to Git and the main differences are in the low-level implementation (https://stackoverflow.com/a/892688, https://fossil-scm.org/home/doc/trunk/www/fossil-v-git.wiki#....). It actually seems like most people prefer Mercurial and Fossil over Git and would use them if they had the same popularity, or at least if they had Git's popularity and Git had Mercurial's or Fossil's. But again, these VCSs are so similar that with layers, you can probably create a Git experience which has their advantages and almost copies their UI.

  • We Put Half a Million Files in One Git Repository, Here's What We Learned (2022)
    4 projects | news.ycombinator.com | 28 Aug 2023
  • Show HN: Gogit – Just enough Git (in Go) to push itself to GitHub
    8 projects | news.ycombinator.com | 29 Jul 2023
    > I don’t know what that is

    It's standard output from `go doc`, rendered as HTML. If you don't recognize that, then you aren't really in a position to be commenting on the topic. Nothing is stopping anyone from pinning to a tag:

    https://github.com/git-lfs/git-lfs/tags

    or even to a commit, and relying on a specific version of the software. Yes, upgrades might be painful, but a module IS available.

  • Unable to push because of large file deleted in the past
    2 projects | /r/git | 3 Jul 2023
    # git push origin feature-branch
    /usr/bin/gh auth git-credential get: 1: /usr/bin/gh auth git-credential get: /usr/bin/gh: not found
    /usr/bin/gh auth git-credential store: 1: /usr/bin/gh auth git-credential store: /usr/bin/gh: not found
    Enumerating objects: 9228, done.
    Counting objects: 100% (7495/7495), done.
    Delta compression using up to 8 threads
    Compressing objects: 100% (2090/2090), done.
    Writing objects: 100% (6033/6033), 72.77 MiB | 7.39 MiB/s, done.
    Total 6033 (delta 4402), reused 5194 (delta 3616)
    remote: Resolving deltas: 100% (4402/4402), completed with 477 local objects.
    remote: error: Trace: c1c90b47a5483929dcdd8c974a6c7d0695e86f67f680d8b88b80ef1c1bce74a
    remote: error: See https://gh.io/lfs for more information.
    remote: error: File deployment_20200220.sql is 872.78 MB; this exceeds GitHub's file size limit of 100.00 MB
    remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
    To https://github.com/my-org/my-project.git
     ! [remote rejected] rest-logging -> rest-logging (pre-receive hook declined)
    error: failed to push some refs to 'https://github.com/my-org/my-project.git'
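    When the oversized file only exists in earlier commits, as in this transcript, tracking it with LFS going forward is not enough; the old blobs have to be rewritten out of history. A hedged sketch of the usual fix with git lfs migrate, assuming the branch can be force-pushed (the file name is taken from the error message above):

      # Rewrite history so the large file becomes an LFS object instead of a Git blob
      git lfs migrate import --include="deployment_20200220.sql" --everything

      # History was rewritten, so the branch must be force-pushed
      git push --force-with-lease origin feature-branch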
  • What and Why, Git LFS?
    3 projects | dev.to | 12 Jun 2023

What are some alternatives?

When comparing nvidia-docker and git-lfs you can also consider the following projects:

Pytorch - Tensors and Dynamic neural networks in Python with strong GPU acceleration

onedrive - OneDrive Client for Linux

nvidia-container-runtime - NVIDIA container runtime

git-fat - Simple way to handle fat files without committing them to git, supports synchronization using rsync

Entware - Ultimate repo for embedded devices

Gitea - Git with a cup of tea! Painless self-hosted all-in-one software development service, including Git hosting, code review, team collaboration, package registry and CI/CD

container-images

git - A fork of Git containing Windows-specific patches.

Whisparr

nixpkgs - Nix Packages collection & NixOS

docker-to-linux - Make bootable Linux disk image abusing Docker

scalar - Scalar: A set of tools and extensions for Git to allow very large monorepos to run on Git without a virtualization layer