BigDL Alternatives
Similar projects and alternatives to BigDL
- ExpansionCards
  Reference designs and documentation to create Expansion Cards for the Framework Laptop
- ollama
  Get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1 and other large language models.
- intel-extension-for-pytorch
  A Python package that extends the official PyTorch to deliver improved performance on Intel platforms.
- Zeppelin
  Web-based notebook that enables data-driven, interactive data analytics and collaborative documents with SQL, Scala and more.
BigDL discussion
BigDL reviews and mentions
- FlashMoE: DeepSeek-R1 671B and Qwen3MoE 235B with 1~2 Intel B580 GPUs in IPEX-LLM
- DeepSeek R1 671B Q4_K_M with 1~2 Arc A770 on Xeon
  That's because the OP is linking to the quickstart guide. There are benchmark numbers on the GitHub repo's root page, but they do not appear to include the new DeepSeek models yet:
  https://github.com/intel/ipex-llm/tree/main?tab=readme-ov-fi...
- MacBook Air M4
  You have the choice to go with the AMD Ryzen AI Max processor, which rivals the M4 and gives similar battery life and performance, or the Intel Lunar Lake processor. Both have extremely good laptop options; the Lenovo Yoga Aura edition is pretty much MacBook quality, and it runs LLMs (https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Over...).
- IPEX-LLM Portable Zip for Ollama on Intel GPU
- Llama.cpp supports Vulkan. Why doesn't Ollama?
  There is ipex-llm support for Ollama on Intel GPUs (https://github.com/intel/ipex-llm/blob/main/docs/mddocs/Quic...).
- Intel Announces Arc B-Series "Battlemage" Discrete Graphics with Linux Support
- Run Ollama on Intel Arc GPU (IPEX)
  You can find helpful tips here.
- PyTorch Library for Running LLM on Intel CPU and GPU
- LLaMA Now Goes Faster on CPUs
  Any performance benchmarks against Intel's IPEX-LLM [0] or others?
  [0] https://github.com/intel-analytics/ipex-llm
- BigDL-LLM: running LLM on your laptop using INT4
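The INT4 approach mentioned above relies on weight quantization: storing model weights as 4-bit integers plus a floating-point scale, which cuts memory use enough to fit large models on a laptop. Below is a minimal sketch of symmetric INT4 round-trip quantization in plain Python; `quantize_int4` and the sample weights are illustrative only, not BigDL-LLM's actual API.

```python
def quantize_int4(weights):
    """Symmetric INT4 quantization: map each float to an integer in [-8, 7]."""
    scale = max(abs(w) for w in weights) / 7.0  # largest magnitude maps to +/-7
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize_int4(q, scale):
    """Recover approximate float weights from the 4-bit integers."""
    return [v * scale for v in q]

weights = [0.12, -0.5, 0.33, 0.9, -0.07]
q, scale = quantize_int4(weights)
approx = dequantize_int4(q, scale)
# Each reconstructed weight lies within half a quantization step of the original.
assert all(abs(w - a) <= scale / 2 + 1e-9 for w, a in zip(weights, approx))
```

Real implementations quantize per-group rather than per-tensor and pack two 4-bit values per byte, but the accuracy/size trade-off is the same idea.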
Stats
intel/ipex-llm is an open-source project licensed under the Apache License 2.0, an OSI-approved license. The primary programming language of BigDL is Python.