evlru VS moka

Compare evlru vs moka and see what their differences are.

evlru

An eventually consistent LRU designed for lock-free concurrent reads (by Bajix)

moka

A high performance concurrent caching library for Rust (by moka-rs)
                 evlru               moka
Mentions         2                   4
Stars            20                  1,323
Growth           -                   6.1%
Activity         0.7                 9.3
Last Commit      about 1 year ago    8 days ago
Language         Rust                Rust
License          MIT License         Apache License 2.0
  • Mentions - the total number of mentions of a project that we've tracked, plus the number of user-suggested alternatives.
  • Stars - the number of stars a project has on GitHub.
  • Growth - month-over-month growth in stars.
  • Activity - a relative measure of how actively a project is being developed, with recent commits weighted more heavily than older ones. For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects we track.

evlru

Posts with mentions or reviews of evlru. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-12-10.

moka

Posts with mentions or reviews of moka. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-02-08.
  • Python deserves a good in-memory cache library!
    7 projects | /r/Python | 8 Feb 2023
    If you know Caffeine(Java)/Ristretto(Go)/Moka(Rust), you know what Theine is. Python deserves a good in-memory cache library.
  • caching, asynchronous, request deduplication - deduplicate 0.3.1
    2 projects | /r/rust | 12 Nov 2022
    Thanks for the feedback. I am aware of Moka. We used it on a work project but stopped because we had a couple of issues with it. The main one was: https://github.com/moka-rs/moka/issues/154 which I think is fixed now and a smaller issue which was caused by the Quanta crate crashing on AMD chips (also fixed).
  • Writing a concurrent LRU cache
    11 projects | /r/rust | 10 Dec 2021
    Ya, I saw concache but I looked into it and it doesn't implement what is needed. Each bucket has its own linked-list backing (hence "lock-free linked list buckets"). An LRU needs each value in each bucket to be part of one linked list I believe. After posting this I realized my line of research was failing because it was state of the art five years ago. Caffeine replaced `concurrentlinkedhashmap` in the java world (by the same author). A rust version of that is Moka. These are much more complicated than a concurrent LRU but faster (aka more state of the art). Another rust crate is Stretto which is a port of dgraph's Ristretto (in go). The question becomes is it worth it to essentially port `concurrentlinkedhashmap` to have a great concurrent LRU when there are more state of the art caches out there.
  • Stretto - a thread-safe, high-performance, high hit-ratio cache.
    5 projects | /r/rust | 17 Oct 2021
    How does it compare to https://github.com/moka-rs/moka ?

What are some alternatives?

When comparing evlru and moka you can also consider the following projects:

stretto - A Rust implementation of Dgraph's Ristretto (https://github.com/dgraph-io/ristretto): a high-performance, memory-bound cache.

left-right - A lock-free, read-optimized, concurrency primitive.

ristretto - A high performance memory-bound Go cache

concurrentlinkedhashmap - A ConcurrentLinkedHashMap for Java

dashmap - Blazing fast concurrent HashMap for Rust.

rust-memcache - memcache client for rust

Stretto - Beautiful web-based music player

cached - Rust cache structures and easy function memoization

cacache-rs - A high-performance, concurrent, content-addressable disk cache, with support for both sync and async APIs. 💩💵 but for your 🦀