ROCm-docker vs ROCm

| | ROCm-docker | ROCm |
|---|---|---|
| Mentions | 3 | 11 |
| Stars | 392 | 21 |
| Growth | 1.0% | - |
| Activity | 5.1 | 10.0 |
| Last commit | 24 days ago | over 3 years ago |
| Language | Shell | HTML |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
ROCm-docker
-
AMD Funded a Drop-In CUDA Implementation Built on ROCm: It's Open-Source
https://rocm.docs.amd.com/projects/install-on-linux/en/lates... links to ROCm/ROCm-docker: https://github.com/ROCm/ROCm-docker which is the source of docker.io/rocm/rocm-terminal: https://hub.docker.com/r/rocm/rocm-terminal :
docker run -it --device=/dev/kfd --device=/dev/dri --group-add video rocm/rocm-terminal
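The `docker run` line above passes the ROCm kernel driver nodes through to the container. A minimal pre-flight sketch (the `check_devices` helper name is mine, not from ROCm-docker) can confirm those device nodes exist on the host before launching:

```shell
#!/bin/sh
# Sketch: verify the device nodes that the docker run command above
# passes through (--device=/dev/kfd --device=/dev/dri) exist on the host.
check_devices() {
  for dev in "$@"; do
    # -e: the path exists (device node present on the host)
    [ -e "$dev" ] || { echo "missing: $dev"; return 1; }
  done
  echo "ok"
}

# Usage (assumes a ROCm-capable host):
# check_devices /dev/kfd /dev/dri && \
#   docker run -it --device=/dev/kfd --device=/dev/dri --group-add video rocm/rocm-terminal
```

If `/dev/kfd` is missing, the amdgpu/ROCm kernel driver is likely not loaded, and the container will start but see no GPU.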
-
Stable Diffusion PR optimizes VRAM, generate 576x1280 images with 6 GB VRAM
Not sure about the 6600, but there is a guide for Linux at least:
https://m.youtube.com/watch?v=d_CgaHyA_n4&feature=emb_logo
And this is somehow relevant (possibly), as I kept the link open.
https://github.com/RadeonOpenCompute/ROCm-docker/issues/38
-
It's working perfectly under Linux
As for the Docker image, I suppose you could build the image (https://hub.docker.com/r/rocm/pytorch) yourself from the sources (https://github.com/RadeonOpenCompute/ROCm-docker#building-images), which seems to be quite a bit of work. Alternatively, you could just use an older tag of the upstream image, e.g. rocm4.1.1_ubuntu18.04_py3.6_pytorch instead of rocm4.2_ubuntu18.04_py3.6_caffe2 or latest. Just make sure your container version matches your host ROCm version.
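Since the advice is to match the container tag to the host's ROCm version, a small sketch can compose the tag string. The helper name is hypothetical; the tag pattern follows the examples visible on Docker Hub (e.g. rocm4.1.1_ubuntu18.04_py3.6_pytorch), and other tags may use different Ubuntu/Python suffixes:

```shell
#!/bin/sh
# Hypothetical helper: build a rocm/pytorch tag for a given ROCm version,
# following the naming pattern of the tags mentioned above.
rocm_pytorch_tag() {
  # $1: host ROCm version, e.g. "4.1.1"
  echo "rocm${1}_ubuntu18.04_py3.6_pytorch"
}

# Usage (sketch): pull the image matching the host's ROCm version.
# docker pull "rocm/pytorch:$(rocm_pytorch_tag 4.1.1)"
```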
ROCm
- ROCm 6.1.0
-
AMD Funded a Drop-In CUDA Implementation Built on ROCm: It's Open-Source
ROCm is not spelled out anywhere in their documentation, and the best answers in a search come from GitHub issues rather than official AMD documents.
"Radeon Open Compute Platform"
https://github.com/ROCm/ROCm/issues/1628
And they wonder why they are losing. Branding absolutely matters.
-
AMD Instinct MI300X Accelerators
https://github.com/ROCm/ROCm/issues/1353
Bought in 2020. Stopped working in 2020. Not the latest, but in-production, advertised ROCm-capable, and what I could find during the Great GPU Shortage of 2020.
-
AMD leaps after launching AI chip that could challenge Nvidia dominance
Maybe so. But it isn't confidence inspiring when I go to see which cards are supported and I see this issue:
https://github.com/ROCm/ROCm/issues/1714
With Nvidia cards, I know that if I buy any Nvidia card made in the last 10 years, CUDA code will run on it. Period. (Yes, different language levels require newer hardware, but Nvidia docs are quite clear about which CUDA versions require which silicon.)
The will-they-won't-they and the rapidly dropped support are hurting the otherwise excellent ROCm and HIP projects. There is a huge API surface to implement, and it looks like they're making rapid gains.
-
GCN2, GCN3: What is the Technical, Non-Business Reason for Limited Support in Linux (OpenSYCL/HIP/ROCm)? [Exasperated client]
Like, there is https://github.com/ROCm/ROCm.github.io/blob/master/hardware.md, but I'm pretty sure that's very outdated, maybe from 4.x?
-
AMD’s Best GPU has some problems — Radeon RX 7900XTX VR Performance Review
Fair enough, I'll give you that. Although it is listed as officially supported here, other documentation says it works but is not officially supported.
-
Finally, ROCm packages in [community]!
Do you have a source? The 580 and several older cards are listed as officially supported here, and even some 2xx/3xx cards are listed as unofficially supported.
-
[D] What’s the word on AMD gpus these days?
Some of the GPUs listed in your link are for consumers. For a more extensive list, see https://github.com/ROCm/ROCm.github.io/blob/master/hardware.md
-
Told an AI to generate Linux. Looks about right
Very conveniently, your linked page (and the pages linked therein) does not say which GPUs actually support ROCm. This is probably because AMD's newest cards do not support ROCm in any way, and I would guess they don't want the sales impact this missing feature could cause. Please evaluate for yourself, here: https://github.com/ROCm/ROCm.github.io/blob/master/hardware.md
What are some alternatives?
awesome-kubernetes - A curated list for awesome kubernetes sources :ship::tada:
rocm-arch - A collection of Arch Linux PKGBUILDS for the ROCm platform
AiDungeon2-Docker-ROCm - Runs an AIDungeon2 fork in Docker on AMD ROCm hardware.
ROCR-Runtime - ROCm Platform Runtime: ROCr a HPC market enhanced HSA based runtime
ZLUDA - CUDA on AMD GPUs
deep-daze - Simple command line tool for text to image generation using OpenAI's CLIP and Siren (Implicit neural representation network). Technique was originally created by https://twitter.com/advadnoun
stable-diffusion - Go to lstein/stable-diffusion for all the best stuff and a stable release. This repository is my testing ground and it's very likely that I've done something that will break it.
ROCm - AMD ROCm™ Software - GitHub Home [Moved to: https://github.com/ROCm/ROCm]
docker-elk - The Elastic stack (ELK) powered by Docker and Compose.
stable-diffusion-webui - Stable Diffusion web UI
Dokku - A docker-powered PaaS that helps you build and manage the lifecycle of applications