| | cacheme | moka |
|---|---|---|
| Mentions | 2 | 4 |
| Stars | 41 | 1,345 |
| Growth | - | 3.0% |
| Activity | 3.6 | 9.3 |
| Latest commit | 11 months ago | 17 days ago |
| Language | Python | Rust |
| License | BSD 3-Clause "New" or "Revised" License | Apache License 2.0 |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
cacheme
- Python deserves a good in-memory cache library!
  Also, if you are looking for a good cache framework, maybe Cacheme can help you.
- Cacheme: Asyncio cache framework with multiple storages and thundering herd protection
  See https://github.com/Yiling-J/cacheme
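The post above highlights thundering herd protection: when many coroutines miss on the same key at once, only one should hit the backend while the rest wait for its result. Below is a minimal asyncio sketch of that single-flight pattern under generic assumptions — it is not Cacheme's actual API, and the `SingleFlight` class and `loader` parameter are names invented for illustration:

```python
import asyncio


class SingleFlight:
    """Deduplicate concurrent loads of the same key: the first caller
    runs the loader, everyone else awaits the same in-flight future."""

    def __init__(self):
        self._inflight: dict = {}

    async def get(self, key, loader):
        if key in self._inflight:
            # Another coroutine is already loading this key; wait for it.
            return await self._inflight[key]
        fut = asyncio.get_running_loop().create_future()
        self._inflight[key] = fut
        try:
            value = await loader(key)
            fut.set_result(value)
            return value
        except Exception as exc:
            fut.set_exception(exc)
            raise
        finally:
            del self._inflight[key]


async def main():
    calls = 0

    async def slow_load(key):
        nonlocal calls
        calls += 1
        await asyncio.sleep(0.01)  # simulate a slow backend read
        return f"value-for-{key}"

    sf = SingleFlight()
    # Ten concurrent callers, but the loader runs only once.
    results = await asyncio.gather(*(sf.get("k", slow_load) for _ in range(10)))
    print(calls, results[0])


asyncio.run(main())
```

Since asyncio is single-threaded, the check-then-insert on `_inflight` needs no lock; a threaded version of the same pattern would need one.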
moka
- Python deserves a good in-memory cache library!
  If you know Caffeine (Java), Ristretto (Go), or Moka (Rust), you know what Theine is. Python deserves a good in-memory cache library.
- caching, asynchronous, request deduplication - deduplicate 0.3.1
  Thanks for the feedback. I am aware of Moka. We used it on a work project but stopped because we had a couple of issues with it. The main one was https://github.com/moka-rs/moka/issues/154, which I think is fixed now, and a smaller issue caused by the Quanta crate crashing on AMD chips (also fixed).
- Writing a concurrent LRU cache
  Ya, I saw concache, but I looked into it and it doesn't implement what is needed. Each bucket has its own linked-list backing (hence "lock-free linked list buckets"), whereas an LRU needs every value in every bucket to be part of one linked list, I believe. After posting this I realized my line of research was failing because it was state of the art five years ago. Caffeine replaced `concurrentlinkedhashmap` in the Java world (by the same author), and Moka is a Rust version of that design. These are much more complicated than a plain concurrent LRU, but faster (i.e., more state of the art). Another Rust crate is Stretto, a port of Dgraph's Ristretto (in Go). The question becomes: is it worth it to essentially port `concurrentlinkedhashmap` to get a great concurrent LRU when more state-of-the-art caches are out there?
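The post describes the core LRU shape: a hash table whose entries are also threaded onto one shared linked list, so the least recently used entry is always at one end. Python's `OrderedDict` is exactly that structure, which makes for a minimal single-threaded sketch of the policy being discussed — this is not Moka's or Caffeine's implementation, which layer concurrent buffers and TinyLFU admission on top:

```python
from collections import OrderedDict


class LRUCache:
    """Minimal single-threaded LRU. OrderedDict is a hash table whose
    entries are also linked into one doubly linked list -- the shape
    the post describes, minus the concurrency machinery."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used


cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # touch "a" so it becomes most recently used
cache.put("c", 3)  # evicts "b", the least recently used entry
print(cache.get("b"), cache.get("a"))  # None 1
```

The hard part the thread is really about is making `move_to_end` safe under concurrent access without a global lock — that is what Caffeine/Moka solve by buffering accesses and applying them to the list in batches.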
- Stretto - a thread-safe, high-performance, high hit-ratio cache
  How does it compare to https://github.com/moka-rs/moka ?
What are some alternatives?
- cachetools - Extensible memoizing collections and decorators
- stretto - A Rust implementation of Dgraph's Ristretto (https://github.com/dgraph-io/ristretto); a high-performance, memory-bound Rust cache
- memoize - Caching library for asynchronous Python applications
- ristretto - A high-performance, memory-bound Go cache
- theine - High-performance in-memory cache
- dashmap - Blazing fast concurrent HashMap for Rust
- chatgpt-memory - Scales the ChatGPT API to multiple simultaneous sessions with infinite contextual and adaptive memory, powered by GPT and a Redis datastore
- rust-memcache - Memcache client for Rust
- gocache - ☔️ A complete Go cache library that brings you multiple ways of managing your caches
- Stretto - Beautiful web-based music player
- Caffeine - A high-performance caching library for Java
- concurrentlinkedhashmap - A ConcurrentLinkedHashMap for Java