automq vs FLiPStackWeekly

| | automq | FLiPStackWeekly |
|---|---|---|
| Mentions | 8 | 82 |
| Stars | 1,421 | 14 |
| Growth | 50.4% | - |
| Activity | 9.9 | 9.9 |
| Latest commit | 3 days ago | 3 days ago |
| Language | Java | - |
| License | GNU General Public License v3.0 or later | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
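The idea of weighting recent commits more heavily can be sketched as a simple exponential decay over commit age. This is a minimal illustration of the general technique, not the site's actual formula; the half-life and the lack of normalization are assumptions:

```python
def activity_score(commit_ages_days, half_life_days=30.0):
    """Score commits so that recent ones count more than old ones.

    commit_ages_days: age of each commit in days (0 = today).
    Each commit loses half its weight every `half_life_days` days,
    so a commit from today contributes 1.0 and one from ~10 months
    ago contributes almost nothing.
    """
    return sum(0.5 ** (age / half_life_days) for age in commit_ages_days)

# A few fresh commits outweigh a larger pile of old ones:
recent = activity_score([1, 2, 3])          # three commits this week
old = activity_score([300, 310, 320, 330])  # four commits ~10 months ago
```

With this kind of weighting, two projects with the same total commit count can have very different activity scores depending on when the commits happened.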
automq
- Tiered storage won't fix Kafka
I agree with your viewpoint. The crux of the matter is not whether tiered storage is used, but what trade-offs a specific storage architecture makes and what benefits it gains. Here (https://github.com/AutoMQ/automq?tab=readme-ov-file#-automq-...) is a qualitative comparison chart of streaming systems, including Kafka/Confluent/Redpanda/WarpStream/AutoMQ. The chart has no specific numerical comparisons, but purely in terms of their trade-offs at the storage level, I think it will be of some use to you.
- Streaming Platform Comparison: Kafka/Confluent/Pulsar/AutoMQ/Redpanda/WarpStream
- Show HN: AutoMQ – A Cost-Effective Kafka distro that can autoscale in seconds
Yes, thank you for the clarification. AutoMQ has replaced topic-partition storage with the cloud-native S3Stream (https://github.com/AutoMQ/automq/tree/main/s3stream) library, thereby harnessing the benefits of cloud EBS and S3.
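The storage split described above — low-latency writes acknowledged from a fast write-ahead log (EBS), with data later offloaded in batches to cheap object storage (S3) — can be sketched roughly as follows. This is a hypothetical toy model of the pattern, not the S3Stream API; the class and method names are invented:

```python
class StreamStore:
    """Toy model of a WAL-plus-object-store layout.

    Hypothetical illustration only — not S3Stream's real API.
    `wal` stands in for a low-latency EBS write-ahead log;
    `s3_objects` stands in for immutable objects uploaded to S3.
    """

    def __init__(self, wal_flush_threshold=4):
        self.wal = []
        self.s3_objects = []
        self.threshold = wal_flush_threshold

    def append(self, record):
        # Writes land in the WAL first, so the producer's ack
        # does not have to wait on S3's higher latency.
        self.wal.append(record)
        if len(self.wal) >= self.threshold:
            self._upload()

    def _upload(self):
        # Batch the WAL contents into one immutable object,
        # then truncate the WAL.
        self.s3_objects.append(tuple(self.wal))
        self.wal.clear()

    def read_all(self):
        # A read merges sealed S3 objects with the
        # not-yet-uploaded WAL tail, preserving order.
        sealed = [r for obj in self.s3_objects for r in obj]
        return sealed + list(self.wal)
```

The point of the split is that durability and latency come from the log while cost and elasticity come from the object store: brokers hold little state locally, which is what makes fast scaling possible.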
- FLaNK Stack Weekly for 20 Nov 2023
FLiPStackWeekly
What are some alternatives?
TinyLlama - The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
gorilla-cli - LLMs for your CLI
memq - MemQ is an efficient, scalable, cloud-native PubSub system
awk-raycaster - Pseudo-3D shooter written completely in gawk using raycasting technique
depthai-python - DepthAI Python Library
litellm - Call all LLM APIs using the OpenAI format. Use Bedrock, Azure, OpenAI, Cohere, Anthropic, Ollama, Sagemaker, HuggingFace, Replicate (100+ LLMs)
FLaNK-SaoPauloBrazil - FLaNK-SaoPauloBrazil
modelscope - ModelScope: bring the notion of Model-as-a-Service to life.
trip - Elegant middleware functions for your HTTP clients.
create-nifi-pulsar-flink-apps - How to create a real-time scalable streaming app using Apache NiFi, Apache Pulsar and Apache Flink SQL
ML-For-Beginners - 12 weeks, 26 lessons, 52 quizzes, classic Machine Learning for all
FLiP-PulsarSummit2022Asia - FLiP-PulsarSummit2022Asia: Pulsar Summit Asia 2022