Neural-compressor Alternatives
Similar projects and alternatives to neural-compressor
- TensorRT: NVIDIA® TensorRT™ is an SDK for high-performance deep learning inference on NVIDIA GPUs. This repository contains the open source components of TensorRT.
- sparseml: Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models.
- tflite-micro: Infrastructure to enable deployment of ML models to low-power, resource-constrained embedded targets (including microcontrollers and digital signal processors).
- Lion: Code for "Lion: Adversarial Distillation of Proprietary Large Language Models" (EMNLP 2023), by YJiangcm.
- UPop: [ICML 2023] UPop: Unified and Progressive Pruning for Compressing Vision-Language Transformers.
neural-compressor reviews and mentions
- Intel Textual Inversion Training on Hugging Face
- An open-source library for optimizing deep learning inference: (1) you select the target optimization, (2) nebullvm searches for the best optimization techniques for your model-hardware configuration, and (3) it serves an optimized model that runs much faster at inference time. (A rough usage sketch of this workflow appears after this list.)
  Open-source projects leveraged by nebullvm include OpenVINO, TensorRT, Intel Neural Compressor, SparseML and DeepSparse, Apache TVM, ONNX Runtime, TFLite and XLA. A huge thank you to the open-source community for developing and maintaining these amazing projects.
- Meet Intel® Neural Compressor: An Open-Source Python Library for Model Compression that Reduces the Model Size and Increases the Speed of Deep Learning Inference for Deployment on CPUs or GPUs
  The GitHub repo for the library is intel/neural-compressor.
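To make the three-step flow in the nebullvm excerpt above concrete, here is a minimal sketch. The `optimize_model` import path and its `input_data`/`optimization_time` arguments are assumptions based on older nebullvm releases and may differ in the version you install; treat this as an outline rather than the library's definitive API.

```python
# Hedged sketch of the select -> search -> serve flow described in the
# nebullvm excerpt above. Import path and argument names are assumptions
# from older nebullvm releases and may vary in yours.
import torch
from torchvision.models import resnet18
from nebullvm.api.functions import optimize_model  # path may differ by version

# (1) The model you want to accelerate.
model = resnet18(weights=None).eval()

# A handful of sample inputs lets the library benchmark candidate backends
# (TensorRT, OpenVINO, ONNX Runtime, ...) on this exact hardware.
input_data = [((torch.randn(1, 3, 224, 224),), torch.tensor([0])) for _ in range(10)]

# (2) nebullvm searches the available compilers/quantizers and
# (3) returns the fastest model it found for this model-hardware pair.
optimized_model = optimize_model(
    model,
    input_data=input_data,
    optimization_time="constrained",  # bound how long the search may run
)

# The returned object is used like the original model.
with torch.no_grad():
    out = optimized_model(torch.randn(1, 3, 224, 224))
```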
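The last mention above introduces Intel Neural Compressor itself, so here is a minimal post-training quantization sketch. It assumes the library's 2.x Python API (`PostTrainingQuantConfig` plus `quantization.fit`); the ResNet-18 model and random calibration batches are placeholders for illustration, and entry points may differ in other releases.

```python
# Minimal post-training static quantization sketch with Intel Neural Compressor,
# assuming the 2.x API (PostTrainingQuantConfig + quantization.fit). The model
# and the random calibration data below are placeholders only.
import torch
from torch.utils.data import DataLoader, TensorDataset
from torchvision.models import resnet18
from neural_compressor import PostTrainingQuantConfig
from neural_compressor.quantization import fit

# FP32 model to compress.
fp32_model = resnet18(weights=None).eval()

# Small calibration set; in practice, use a sample of real inputs.
calib_set = TensorDataset(torch.randn(32, 3, 224, 224),
                          torch.zeros(32, dtype=torch.long))
calib_loader = DataLoader(calib_set, batch_size=8)

# Post-training static INT8 quantization.
config = PostTrainingQuantConfig(approach="static")
int8_model = fit(model=fp32_model, conf=config, calib_dataloader=calib_loader)

# Save the compressed model for deployment.
int8_model.save("./int8_model")
```

Static quantization calibrates activation ranges from the sample data ahead of time; a dynamic approach would instead compute those ranges at runtime and skip the calibration loader.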
Stats
intel/neural-compressor is an open-source project licensed under the Apache License 2.0, which is an OSI-approved license.
The primary programming language of neural-compressor is Python.