stretto VS concache

Compare stretto vs concache and see what their differences are.

stretto

Stretto is a Rust implementation of Dgraph's Ristretto (https://github.com/dgraph-io/ristretto): a high-performance, memory-bound cache for Rust. (by al8n)

concache

A linked-list-based, lock-free concurrent hashmap in Rust. (by saligrama)
                   stretto              concache
Mentions           6                    1
Stars              388                  155
Growth             -                    -
Activity           5.7                  10.0
Latest commit      9 days ago           about 4 years ago
Language           Rust                 Rust
License            Apache License 2.0   Apache License 2.0
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

stretto

Posts with mentions or reviews of stretto. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-07-07.
  • Stretto 0.5.0 release: Support runtime agnostic AsyncCache
    2 projects | /r/rust | 7 Jul 2022
    Hi, I think this link is a good explanation https://github.com/al8n/stretto/pull/7
  • Writing a concurrent LRU cache
    11 projects | /r/rust | 10 Dec 2021
    Ya, I saw concache, but I looked into it and it doesn't implement what is needed. Each bucket has its own linked-list backing (hence "lock-free linked list buckets"). An LRU needs each value in every bucket to be part of one linked list, I believe. After posting this I realized my line of research was failing because it was the state of the art five years ago. Caffeine replaced `concurrentlinkedhashmap` in the Java world (by the same author). A Rust version of that is Moka. These are much more complicated than a concurrent LRU but faster (i.e. more state of the art). Another Rust crate is Stretto, which is a port of Dgraph's Ristretto (written in Go). The question becomes whether it's worth essentially porting `concurrentlinkedhashmap` to get a great concurrent LRU when there are more state-of-the-art caches out there.
  • Stretto - a thread-safe, high-performance, high hit-ratio cache.
    5 projects | /r/rust | 17 Oct 2021
    For the case in the benches folder (a very rough bench case), stretto is around 20-30 ms (the sync version is around 30-40 ms) faster than moka for 120,000+ operations. I set stretto to collect metrics when benchmarking; collecting metrics adds around 10% overhead. Moka does not seem to provide a configuration option for collecting metrics, so the hit ratio is not compared.
    (A basic usage sketch follows this list.)
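
As a concrete anchor for the posts above, here is a minimal, hedged sketch of basic stretto usage. It assumes the API shape shown in stretto's README (Cache::new(num_counters, max_cost), insert with an explicit per-entry cost, wait(), and get() returning a guard-like ValueRef); exact names and signatures may differ between versions, and the 0.5.0 release discussed above additionally exposes a runtime-agnostic AsyncCache.

```rust
// Hedged sketch of basic stretto usage, assuming the API shape from its README;
// exact signatures may differ across versions.
use stretto::Cache;

fn main() {
    // num_counters is typically ~10x the expected number of live items;
    // max_cost bounds the total "cost" the cache is allowed to hold.
    let cache = Cache::new(12_960, 1_000_000).unwrap();

    // Every entry carries an explicit cost; the admission policy may
    // reject entries it judges not worth keeping.
    cache.insert("answer", "forty-two".to_string(), 1);

    // Writes pass through internal buffers; wait() flushes them so the
    // get() below can observe the entry.
    cache.wait().unwrap();

    if let Some(v) = cache.get(&"answer") {
        println!("{}", v.value()); // v is a guard-like ValueRef
        v.release();               // release the read guard when done
    }
}
```

The explicit cost on insert is what makes the cache memory-bound: admission and eviction are driven by total cost against max_cost rather than by entry count alone.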

concache

Posts with mentions or reviews of concache. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-12-10.
  • Writing a concurrent LRU cache
    11 projects | /r/rust | 10 Dec 2021
    Concache (https://github.com/saligrama/concache) is the only crate that came up easily in a search, but I don't know whether it would be easily adaptable if it needed tweaks. You could try forking it anyway.
    (A coarse-locked sketch of the single-recency-list idea follows below.)
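
The "Writing a concurrent LRU cache" thread quoted in both sections hinges on one point: per-bucket linked lists (concache's design) give you a lock-free map, but LRU eviction needs a single recency order spanning all buckets. The sketch below illustrates that invariant with a deliberately simple coarse-locked cache built only on the standard library; it is not concache's, moka's, or stretto's API, and a global Mutex is exactly what the lock-free designs above try to avoid.

```rust
use std::collections::{HashMap, VecDeque};
use std::hash::Hash;
use std::sync::Mutex;

// One global recency queue shared by all entries: front = least recently used.
struct LruInner<K, V> {
    map: HashMap<K, V>,
    order: VecDeque<K>,
}

pub struct LruCache<K, V> {
    inner: Mutex<LruInner<K, V>>,
    capacity: usize,
}

impl<K: Hash + Eq + Clone, V: Clone> LruCache<K, V> {
    pub fn new(capacity: usize) -> Self {
        Self {
            inner: Mutex::new(LruInner { map: HashMap::new(), order: VecDeque::new() }),
            capacity,
        }
    }

    pub fn insert(&self, key: K, value: V) {
        let mut inner = self.inner.lock().unwrap();
        // Move (or add) the key to the most-recently-used end of the single queue.
        inner.order.retain(|k| k != &key); // O(n): fine for a sketch, not for production
        inner.order.push_back(key.clone());
        inner.map.insert(key, value);
        // Evict the *globally* least-recently-used entry, whatever bucket it hashes to.
        if inner.map.len() > self.capacity {
            if let Some(oldest) = inner.order.pop_front() {
                inner.map.remove(&oldest);
            }
        }
    }

    pub fn get(&self, key: &K) -> Option<V> {
        let mut inner = self.inner.lock().unwrap();
        if !inner.map.contains_key(key) {
            return None;
        }
        // A hit refreshes recency in the shared queue.
        inner.order.retain(|k| k != key);
        inner.order.push_back(key.clone());
        inner.map.get(key).cloned()
    }
}

fn main() {
    let cache = LruCache::new(2);
    cache.insert("a", 1);
    cache.insert("b", 2);
    cache.get(&"a");      // refresh "a"
    cache.insert("c", 3); // evicts "b", the least recently used overall
    assert_eq!(cache.get(&"b"), None);
    assert_eq!(cache.get(&"a"), Some(1));
}
```

Making that shared recency structure concurrent without a global lock is what pushes real implementations toward the more elaborate designs of Caffeine, Moka, and Ristretto mentioned above.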

What are some alternatives?

When comparing stretto and concache you can also consider the following projects:

ristretto - A high performance memory-bound Go cache

moka - A high performance concurrent caching library for Rust

rust-memcache - memcache client for rust

dashmap - Blazing fast concurrent HashMap for Rust.

bitsock - Safe Rust crate for creating socket servers and clients with ease.

ttl_cache

bmemcached-rs - Rust binary memcached implementation

concurrentlinkedhashmap - A ConcurrentLinkedHashMap for Java

hitbox - A high-performance caching framework suitable for single-machine and for distributed applications in Rust

evlru - An eventually consistent LRU designed for lock-free concurrent reads