| | Stable-Diffusion-NCNN | ncnn |
|---|---|---|
| Mentions | 8 | 12 |
| Stars | 940 | 19,310 |
| Stars growth (monthly) | - | 1.4% |
| Activity | 4.9 | 9.4 |
| Latest commit | 11 months ago | 4 days ago |
| Language | C++ | C++ |
| License | BSD 3-clause "New" or "Revised" License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Stable-Diffusion-NCNN
- Stable Diffusion implemented with the ncnn framework in C++, supporting txt2img and img2img!
- Is there an open source app for generating AI images?
  It is possible, though a bit impractical: Stable-Diffusion-NCNN
- Stable Diffusion running on a Snapdragon mobile device. Processing on device, not cloud (phone's in Flight Mode).
  Remember, this project needs 8 GB of RAM to do 512x512 on a phone: https://github.com/EdVince/Stable-Diffusion-NCNN
- Qualcomm demos fastest local AI image generation with Stable Diffusion on mobile
  "World's first on-device demonstration of Stable Diffusion on an Android phone" — not really.
- Qualcomm Demos AI Art Generator Stable Diffusion Running on an Android Phone Processor First In The World
  Some already did it (https://github.com/EdVince/Stable-Diffusion-NCNN), so not first in the world.
- World's first on-device demonstration of Stable Diffusion on an Android phone
- Stable Diffusion port to .NET C#
  https://github.com/andreae293/Stable-Diffusion.NET-NCNN — this project is a simple C# port of the already existing C++ Stable Diffusion, using the NcnnDotNet library. Visual Studio 2022 is required to open it.
ncnn
- AMD Funded a Drop-In CUDA Implementation Built on ROCm: It's Open-Source
  ncnn uses Vulkan for GPU acceleration; I've seen it used in a few projects to get AMD hardware support. https://github.com/Tencent/ncnn
- [D] Best way to package PyTorch models as a standalone application
  They're using ncnn to package the model. Have a look: https://github.com/Tencent/NCNN
- Realtime object detection android app
  Hi. Here is my preferred Android app for realtime object detection: https://github.com/nihui/ncnn-android-nanodet ; https://github.com/Tencent/ncnn contains a lot of Android demo apps for a lot of models.
- ncnn: High-performance neural network inference framework optimized for mobile
- Esp32 tensorflow lite
  ncnn home page: https://github.com/Tencent/ncnn
- MMDeploy: Deploy All the Algorithms of OpenMMLab
  ncnn
- Draw Things, Stable Diffusion in your pocket, 100% offline and free
  Yes, Android devices tend to have more RAM, making running 1024x1024 possible (this is not possible at all on iPhones, which peak around 5 GiB of memory with my current implementation; some serious engineering is required to bring that down on iPhone devices). The problem is I am not sure about speed. I would likely switch to ncnn (https://github.com/Tencent/ncnn) as the backend, which has decent Vulkan compute kernel support. It is definitely a possibility and there is a path to do that.
- What’s New in TensorFlow 2.10?
- [Technical Article] OCR Upgrade
  As the leading open-source inference framework in China and worldwide, what we like are its almost-zero-cost cross-platform capability, high inference speed, and minimal deployment footprint. (Project address: https://github.com/Tencent/ncnn)
- Is there a functioning neural network or backbone written in pure C language only?
  If you're not planning on training the neural net on an embedded device and just doing inference, this might interest you: https://github.com/Tencent/ncnn
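Several of the mentions above boil down to the same workflow: load a converted model and run inference through ncnn's `Net`/`Extractor` API. A minimal C++ sketch of that workflow looks roughly like this (the filenames `model.param`/`model.bin`, the input size, and the blob names `"input"`/`"output"` are placeholders — they depend on the network you export, and building this requires linking against the ncnn library):

```cpp
#include <net.h>  // ncnn's main header; requires the ncnn library

int main()
{
    ncnn::Net net;
    // Load the network description and weights produced by ncnn's model
    // converters (placeholder filenames; both calls return 0 on success).
    if (net.load_param("model.param") || net.load_model("model.bin"))
        return -1;

    // Wrap the input data in an ncnn::Mat (here a dummy 224x224x3 tensor).
    ncnn::Mat in(224, 224, 3);
    in.fill(0.5f);

    ncnn::Extractor ex = net.create_extractor();
    ex.input("input", in);      // blob name depends on the exported model

    ncnn::Mat out;
    ex.extract("output", out);  // runs the graph up to this output blob

    return 0;
}
```

This is CPU inference by default; enabling ncnn's Vulkan backend (the AMD/mobile GPU path mentioned above) is an additional build-time and `Net` option, not a code rewrite.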
What are some alternatives?
TNN - Developed by Tencent Youtu Lab and Guangying Lab, a uniform deep learning inference framework for mobile, desktop and server. TNN is distinguished by several outstanding features, including its cross-platform capability, high performance, model compression and code pruning. Based on ncnn and Rapidnet, TNN further strengthens support and performance optimization for mobile devices, and also draws on the extensibility and high performance of existing open-source efforts. TNN has been deployed in multiple apps from Tencent, such as Mobile QQ, Weishi and Pitu. Contributions are welcome to help make TNN a better framework.
XNNPACK - High-efficiency floating-point neural network inference operators for mobile, server, and Web
NcnnDotNet - ncnn wrapper written in C++ and C# for Windows, MacOS, Linux, iOS and Android
rife-ncnn-vulkan - RIFE, Real-Time Intermediate Flow Estimation for Video Frame Interpolation implemented with ncnn library
Stable-Diffusion.NET-NCNN
deepdetect - Deep Learning API and Server in C++14, with support for Caffe, PyTorch, TensorRT, Dlib, NCNN, TensorFlow, XGBoost and TSNE
intel-extension-for-transformers - ⚡ Build your chatbot within minutes on your favorite device; offer SOTA compression techniques for LLMs; run LLMs efficiently on Intel Platforms⚡
netron - Visualizer for neural network, deep learning and machine learning models
civitai - A repository of models, textual inversions, and more
darknet - Convolutional Neural Networks
RPi_64-bit_Zero-2-image - Raspberry Pi Zero 2 W 64-bit OS image with OpenCV, TensorFlow Lite and ncnn Framework.
torch-mlir - The Torch-MLIR project aims to provide first class support from the PyTorch ecosystem to the MLIR ecosystem.