LLMCompiler
[ICML 2024] LLMCompiler: An LLM Compiler for Parallel Function Calling (by SqueezeAILab)
mlx-examples
Examples in the MLX framework (by ml-explore)
| | LLMCompiler | mlx-examples |
|---|---|---|
| Mentions | 2 | 31 |
| Stars | 1,118 | 5,287 |
| Growth | 4.4% | 7.0% |
| Activity | 7.6 | 9.7 |
| Latest commit | about 2 months ago | 3 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
The number of mentions indicates the total number of mentions we've tracked, plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
LLMCompiler
Posts with mentions or reviews of LLMCompiler. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-12-18.
mlx-examples
Posts with mentions or reviews of mlx-examples. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2024-04-29.
- MLX-Whisper
- FLaNK AI Weekly for 29 April 2024
- DBRX on Apple MLX
- Why the M2 is more advanced than it seemed
- MLX: Speculative Decoding
- Mixtral on MLX
- Qwen on MLX
- FLaNK Weekly 18 Dec 2023
- MLX: Fine-tune Llama 7B or Mistral 7B with 32GB
- Whisper: Nvidia RTX 4090 vs. M1 Pro with MLX
I was able to get it running on MLX on my M2 Max machine within a couple of minutes using their example: https://github.com/ml-explore/mlx-examples/tree/main/whisper
What are some alternatives?
When comparing LLMCompiler and mlx-examples you can also consider the following projects:
GoLLIE - Guideline following Large Language Model for Information Extraction
llama-cpp-python - Python bindings for llama.cpp
SqueezeLLM - [ICML 2024] SqueezeLLM: Dense-and-Sparse Quantization
cog-whisper-diarization - Cog implementation of transcribing + diarization pipeline with Whisper & Pyannote
FLaNK-Ice - Apache Iceberg - Cloud Data Lakehouse
FLaNK-OpenAi - Chat
uHTTP - Pythonic Web Development
MemGPT - Create LLM agents with long-term memory and custom tools 📚🦙
kani - kani (カニ) is a highly hackable microframework for chat-based language models with tool use/function calling. (NLP-OSS @ EMNLP 2023)
furnace - a multi-system chiptune tracker compatible with DefleMask modules
FLaNK-ContinuousSQL
LLMCompiler vs GoLLIE
mlx-examples vs llama-cpp-python
LLMCompiler vs SqueezeLLM
mlx-examples vs cog-whisper-diarization
LLMCompiler vs FLaNK-Ice
mlx-examples vs FLaNK-OpenAi
LLMCompiler vs uHTTP
mlx-examples vs MemGPT
LLMCompiler vs kani
mlx-examples vs furnace
LLMCompiler vs MemGPT
mlx-examples vs FLaNK-ContinuousSQL