m1_huggingface_diffusers_demo
conda
| | m1_huggingface_diffusers_demo | conda |
|---|---|---|
| Mentions | 5 | 30 |
| Stars | 15 | 6,092 |
| Growth | - | 0.6% |
| Activity | 10.0 | 9.8 |
| Latest commit | over 1 year ago | about 24 hours ago |
| Language | Jupyter Notebook | Python |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
m1_huggingface_diffusers_demo
-
JupyterLab 4.0
The trick is that you have to deactivate the virtual environment and then re-source it after adding Jupyter to that virtual environment.
Most shells cache executable paths, so `jupyter` will keep resolving to the global path, not the one in your virtual environment. This is unfortunately not at all obvious and leads to hard-to-track-down bugs that seem to disappear and reappear if you aren't familiar with the issue.
I have a recipe here which always works: https://github.com/nlothian/m1_huggingface_diffusers_demo#se...
If you don't have a requirements.txt, then run `pip3 install jupyter` for that line, then `deactivate` and `source ./venv/bin/activate`.
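Spelled out end-to-end, the recipe looks like this (a sketch assuming a `./venv` layout like the linked repo's; in bash, `hash -r`, or `rehash` in zsh, also clears the cached lookup without deactivating):

```shell
python3 -m venv ./venv
source ./venv/bin/activate
pip3 install jupyter            # or: pip3 install -r requirements.txt
# The shell may still resolve `jupyter` to a previously cached global copy,
# so deactivate and re-source to pick up ./venv/bin/jupyter:
deactivate
source ./venv/bin/activate
which jupyter                   # should now point at ./venv/bin/jupyter
```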
-
Bunny AI
This is how I did it on an M1 in September: https://github.com/nlothian/m1_huggingface_diffusers_demo
I think it probably needs updating now, but it should give you something to start with.
-
One-Click Install Stable Diffusion GUI App for M1 Mac. No Dependencies Needed
On my M1 Max with 32 GB I'm getting 1.5 iterations/second (i.e., ~30 seconds for the standard 50 iterations) using this example: https://github.com/nlothian/m1_huggingface_diffusers_demo
-
Nvidia Hopper Sweeps AI Inference Benchmarks in MLPerf Debut
Out of interest I've been running a bunch of the Hugging Face version of Stable Diffusion using the M1-accelerated branch on my M1 Max[1]. I'm getting 1.54 it/s compared to 2.0 it/s for an Nvidia Tesla T4 on Google Colab.
The Tesla T4 gets 21,691 queries/second for ResNet, compared to 81,292 q/s for the new H100, 41,893 q/s for the A100, and 6,164 q/s for the new Jetson.
So you can expect maybe 15,000 q/s on an M1 Max. But some tests seem to indicate a lot less[2] - not sure what is happening there.
[1] Setup like this: https://github.com/nlothian/m1_huggingface_diffusers_demo
[2] https://tlkh.dev/benchmarking-the-apple-m1-max#heading-resne...
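The ~15,000 q/s figure follows from scaling the T4's ResNet throughput by the measured Stable Diffusion speed ratio. A back-of-envelope sketch using only the numbers quoted above:

```shell
awk 'BEGIN {
  m1_it_s = 1.54; t4_it_s = 2.0     # Stable Diffusion iterations/second
  t4_resnet_qps = 21691             # Tesla T4 ResNet queries/second (MLPerf)
  printf "%.0f q/s\n", t4_resnet_qps * (m1_it_s / t4_it_s)
}'
```

This prints roughly 16,700 q/s, which the comment rounds down to ~15,000.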
conda
-
How to Create Virtual Environments in Python
Python's venv module, which is officially recommended for creating virtual environments, has come packaged with your Python installation since Python 3.5. While there are still older tools available, such as conda and virtualenv, if you are new to virtual environments it is best to use venv.
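For example, a minimal venv workflow on macOS or Linux (a sketch; the installed package is just an illustration):

```shell
python3 -m venv .venv              # create the environment (stdlib, nothing to install)
source .venv/bin/activate          # activate it; the prompt gains a (.venv) prefix
python -m pip install requests     # installs into .venv, not the system Python
deactivate                         # leave the environment
```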
- Why does creating my conda environment use so much memory?
- Installing Anaconda on ChromeOS using Linux
-
PSA: conda-libmamba-solver can cut two hours off of your Anaconda install, but has only 47 GitHub stars. It deserves more praise.
conda's dependency solver solves a harder problem than pip's; this quote alludes to it: "Conda will never be as fast as pip, so long as we're doing real environment solves and pip satisfies itself only for the current operation" (from https://github.com/conda/conda/issues/7239). Thus mamba was created to improve performance, and now conda is bringing that performance boost in.
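On later conda releases (this is my understanding of the conda-libmamba-solver setup, not something stated in the thread), the plugin is installed with `conda install -n base conda-libmamba-solver` and enabled with `conda config --set solver libmamba`, which writes this entry to your `.condarc`:

```yaml
# ~/.condarc
solver: libmamba
```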
- Is Anaconda still open source?
-
How to get the best Conda environment experience in Codespaces
The other challenge I sometimes ran into was that on a lower-memory/storage Codespace instance, when I tried to use Conda from the command line to modify environments, the process would be killed after a few seconds. This turns out to be related to performance issues that make Conda consume a lot of memory when working with the conda-forge channel. You can always increase the size of the Codespace you are working with (just go to your Codespaces list and use the triple-dot menu to change the settings for a Codespace).
-
What is the status of Python 3.11?
It's worth noting that [ana]conda isn't even fully compatible with 3.11 yet (you can use it to create 3.11 environments -- and you really should, rather than relying on the system Python -- but conda itself can only run on 3.10).
-
Miniconda finally released for Python 3.10
It took some time, but as a great Christmas present Miniconda was finally released with Python 3.10!
-
TW: ZSH (and BASH?) does not show current working dir etc anymore
The September update broke it.
-
Python 3.11.0 is now available
According to this, this issue is high on their priority list (whatever that means).
What are some alternatives?
ai-notes - notes for software engineers getting up to speed on new AI developments. Serves as datastore for https://latent.space writing, and product brainstorming, but has cleaned up canonical references under the /Resources folder.
mamba - The Fast Cross-Platform Package Manager
diffusionbee-stable-diffusion-ui - Diffusion Bee is the easiest way to run Stable Diffusion locally on your M1 Mac. Comes with a one-click installer. No dependencies or technical knowledge needed.
Poetry - Python packaging and dependency management made easy
sd-buddy - Companion desktop app for the self-hosted M1 Mac version of Stable Diffusion
miniforge - A conda-forge distribution.
diffusers - 🤗 Diffusers: State-of-the-art diffusion models for image and audio generation in PyTorch and FLAX.
PDM - A modern Python package and dependency manager supporting the latest PEP standards
stable-diffusion - A latent text-to-image diffusion model
pip-tools - A set of tools to keep your pinned Python dependencies fresh.
jupyter-collaboration - A Jupyter Server Extension Providing Support for Y Documents
pip - The Python package installer