| | memoize | cacheme |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 64 | 41 |
| Growth | - | - |
| Activity | 6.2 | 3.6 |
| Latest commit | about 10 hours ago | 11 months ago |
| Language | Python | Python |
| License | Apache License 2.0 | BSD 3-clause "New" or "Revised" License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
memoize
-
Good and Bad Elixir
I totally agree, though I think those articles are a lot harder (i.e. requiring more skill) to write well, because you need to quickly ramp your readers on all of whatever context is necessary to actually appreciate the nuance of the design decisions under discussion. You're basically by definition going to be out of the realm of "just follow best practice X" or "apply pattern Y or you're doing it wrong."
As a small example, I've been working on a small asyncio-based web service (Python) oriented around an expensive process that generates a result, where the result is stashed in sqlite and returned. I knew upfront that I needed a way to track when a particular result was already being prepared, so that if I got a second request for it, it would be collapsed into the first one and the work would only be done once. I wrote this as a twenty-line memoizing decorator, but it turns out this issue has a name: cache stampeding. Once I realized that, I discovered that there are existing (and much more complicated/tunable) solutions to this problem, such as https://github.com/DreamLab/memoize/, but the article pitching that solution spends quite a bit of time getting to it; enough so that if I'd discovered it before building my own, I'm not sure I would even have appreciated its applicability:
https://tech.ringieraxelspringer.com/blog/open-source/cachin...
cacheme
-
Python deserves a good in-memory cache library!
Also if you are looking for a good cache framework, maybe Cacheme can help you.
-
Cacheme: Asyncio cache framework with multiple storages and thundering herd protection
See https://github.com/Yiling-J/cacheme
What are some alternatives?
httpx-cache - Simple caching transport for httpx
cachetools - Extensible memoizing collections and decorators
Tornado-SQLAlchemy - SQLAlchemy support for Tornado
theine - high performance in-memory cache
webssh - :seedling: Web based ssh client
chatgpt-memory - Scales the ChatGPT API to multiple simultaneous sessions with infinite contextual and adaptive memory powered by GPT and a Redis datastore.
pottery - Redis for humans.
gocache - A complete Go cache library that brings you multiple ways of managing your caches
turbo - A framework based on tornado for easier development, scaling up and maintenance
moka - A high performance concurrent caching library for Rust