Candle Alternatives
Similar projects and alternatives to candle
learn-you-a-haskell
“Learn You a Haskell for Great Good!” by Miran Lipovača
burn
Discontinued: Burn is a comprehensive dynamic Deep Learning Framework built using Rust, with extreme flexibility, compute efficiency, and portability as its primary goals. [Moved to: https://github.com/Tracel-AI/burn] (by burn-rs)
Universal-G-Code-Sender
A cross-platform G-Code sender for GRBL, Smoothieware, TinyG and G2core.
nitro
An inference server on top of llama.cpp. OpenAI-compatible API, queue, & scaling. Embed a prod-ready, local inference engine in your apps. Powers Jan (by janhq)
cncjs
A web-based interface for CNC milling controllers running Grbl, Marlin, Smoothieware, or TinyG.
hidet
An open-source, efficient deep-learning framework/compiler written in Python.
blind_chat
A fully in-browser privacy solution to make Conversational AI privacy-friendly
candle reviews and mentions
karpathy/llm.c
Candle already exists[1], and it runs pretty well. Can use both CUDA and Metal backends (or just plain-old CPU).
- Best alternative for Python
Is there any LLM that can be installed without Python
Check out Candle! It's a Deep Learning framework for Rust. You can run LLMs in binaries.
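Candle's pitch here is that the whole inference loop compiles into one self-contained binary. As a framework-agnostic illustration of what such a binary does at its core (this is not candle's API; `logits_for` is a hypothetical stub standing in for a real model's forward pass), a minimal greedy decoding loop looks like this:

```rust
// Framework-agnostic sketch of the greedy decoding loop an LLM binary runs.
// `logits_for` is a hypothetical stub for a real model's forward pass.

fn logits_for(context: &[u32], vocab: usize) -> Vec<f32> {
    // Toy "model": always scores token (last_id + 1) highest.
    let mut logits = vec![0.0; vocab];
    if let Some(&last) = context.last() {
        logits[(last as usize + 1) % vocab] = 1.0;
    }
    logits
}

fn argmax(xs: &[f32]) -> u32 {
    xs.iter()
        .enumerate()
        .max_by(|a, b| a.1.partial_cmp(b.1).unwrap())
        .map(|(i, _)| i as u32)
        .unwrap()
}

fn generate(prompt: &[u32], steps: usize, vocab: usize) -> Vec<u32> {
    let mut tokens = prompt.to_vec();
    for _ in 0..steps {
        let logits = logits_for(&tokens, vocab);
        tokens.push(argmax(&logits)); // greedy: take the highest-scoring token
    }
    tokens
}

fn main() {
    println!("{:?}", generate(&[0], 3, 5)); // [0, 1, 2, 3]
}
```

In a real candle binary the stub is replaced by a model loaded from weights, and the loop additionally handles tokenization, sampling temperature, and a stop condition; the shape of the loop stays the same.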
Announcing Kalosm - a local-first AI meta-framework for Rust
Kalosm is a meta-framework for AI written in Rust using candle. Kalosm supports local quantized large language models like Llama, Mistral, Phi-1.5, and Zephyr. It also supports other quantized models like Wuerstchen, Segment Anything, and Whisper. In addition to local models, Kalosm supports remote models like GPT-4 and ada embeddings.
RFC: candle-lora
I have been working on a machine learning library called candle-lora for Candle. It implements a technique called LoRA (low-rank adaptation), which allows you to reduce a model's trainable parameter count by wrapping and freezing old layers.
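The idea behind LoRA is that the frozen weight W stays untouched while a trainable low-rank update (alpha/r)·B·A is added to the layer's output, so only the small A and B matrices need gradients. A minimal dependency-free sketch of that forward pass (illustrative only, not candle-lora's actual API):

```rust
// Minimal sketch of the LoRA forward pass: y = W x + (alpha/r) * B (A x).
// W is the frozen weight; A (r x d_in) and B (d_out x r) are the small
// trainable adapter matrices. Names and shapes here are illustrative.

fn matvec(m: &[Vec<f32>], x: &[f32]) -> Vec<f32> {
    m.iter()
        .map(|row| row.iter().zip(x).map(|(w, xi)| w * xi).sum())
        .collect()
}

fn lora_forward(
    w: &[Vec<f32>], // frozen d_out x d_in weight
    a: &[Vec<f32>], // trainable r x d_in down-projection
    b: &[Vec<f32>], // trainable d_out x r up-projection
    alpha: f32,
    x: &[f32],
) -> Vec<f32> {
    let r = a.len() as f32;
    let base = matvec(w, x);            // frozen path: W x
    let low = matvec(b, &matvec(a, x)); // low-rank path: B (A x)
    base.iter()
        .zip(&low)
        .map(|(y, d)| y + (alpha / r) * d)
        .collect()
}

fn main() {
    // 2x2 frozen identity weight, rank-1 adapter, alpha = 1.
    let w = vec![vec![1.0, 0.0], vec![0.0, 1.0]];
    let a = vec![vec![1.0, 1.0]];       // 1 x 2
    let b = vec![vec![0.5], vec![0.5]]; // 2 x 1
    let y = lora_forward(&w, &a, &b, 1.0, &[2.0, 3.0]);
    println!("{y:?}"); // [4.5, 5.5]
}
```

The parameter saving comes from the shapes: a d_out x d_in layer normally has d_out·d_in trainable weights, while the adapter trains only r·(d_in + d_out) values, which is far smaller when r is low.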
- ExecuTorch: Enabling On-Device Inference for embedded devices
[P] Open-source project to run LLMs locally in the browser, such as Phi-1.5, for fully private inference
We provide full local inference in the browser, using libraries from Hugging Face like transformers.js or candle for WASM inference.
Update on the Candle ML framework.
We first announced Candle, a minimalist ML framework in Rust, six weeks ago. Since then we've focused on adding various recent models and improving the framework to support the necessary features efficiently. You can check out a gallery of the examples; supported models include:
Should I Haskell or OCaml?
How did you select those two as your options?
I'm just a hobbyist who enjoys programming, and I eventually wanted to expand beyond Python. I looked at Haskell, read Learn You a Haskell, and did some Exercism exercises, but never got anywhere close to being able to use it for real projects. I've been trying to learn about Lisp lately and feel like I've come to a similar dead end.
On the other hand, both Go and Rust have felt fulfilling and practical, with static typing and solid tooling, cross-compilation, static binaries, and dependency management that is a huge breath of fresh air coming from Python.
The ML / data science scene is nowhere near as developed as in Python, and I still lean on jupyter/polars/PyTorch here, but I think the candle project[0] seems very interesting. Compiling Whisper down to a single CUDA-leveraging binary for fast local transcription is pretty cool!
- Minimalist ML framework for Rust
Stats
huggingface/candle is an open-source project licensed under the Apache License 2.0, an OSI-approved license.
The primary programming language of candle is Rust.