| | think-async | arq |
|---|---|---|
| Mentions | 4 | 4 |
| Stars | 222 | 1,934 |
| Growth | - | - |
| Activity | 7.8 | 6.9 |
| Last commit | 3 months ago | 4 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
- Stars: the number of stars a project has on GitHub.
- Growth: month-over-month growth in stars.
- Activity: a relative number indicating how actively a project is being developed; recent commits carry more weight than older ones. For example, an activity of 9.0 places a project among the top 10% of the most actively developed projects we track.
think-async
Think Async: Resources for Exploring Different Concurrency Paradigms in Python
Lately, at my workplace, I've been doing a lot of asynchronous I/O programming in Python. Ironically, I picked up Golang faster than Python's async paradigm, despite Python being my primary language. Coroutine-driven asynchronous programming demanded a substantial shift in how I used to compose solutions in synchronous Python.
However, after successfully writing two services using asyncio, SQS, DynamoDB, and aiobotocore, I'm convinced that it's worth it. Here are a few resources that I've found helpful along the way. Pull requests are very much welcome.
https://github.com/rednafi/think-async
- Think Async in Python
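To illustrate the paradigm shift described above, here is a minimal, hypothetical sketch contrasting blocking calls with their asyncio counterpart. The `fetch` coroutine and its delay are invented for illustration; `asyncio.sleep` stands in for a real network call:

```python
import asyncio
import time


async def fetch(item: str) -> str:
    # Hypothetical I/O-bound task; asyncio.sleep stands in for a network call.
    await asyncio.sleep(0.1)
    return f"done:{item}"


async def main() -> list[str]:
    # The coroutines run concurrently, so three 0.1 s "calls"
    # take roughly 0.1 s in total instead of 0.3 s back to back.
    return await asyncio.gather(*(fetch(i) for i in ("a", "b", "c")))


if __name__ == "__main__":
    start = time.perf_counter()
    results = asyncio.run(main())
    print(results, f"{time.perf_counter() - start:.2f}s")
```

The shift the paragraph describes is visible here: instead of calling functions in sequence, you define coroutines and hand them to the event loop, which interleaves them while each one awaits I/O.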
arq
- Future Plan for Arq
- The Many Problems with Celery
I made a simple async queueing framework called SAQ! It includes a built-in web UI for managing jobs.
I need to process a lot of long-running, I/O-heavy jobs with background workers. I've been using ARQ for a while but decided to take a crack at writing my own distributed queue.
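The workload described here, many long-running I/O-heavy jobs, is what queues like arq and SAQ manage. As a rough, dependency-free sketch of the underlying pattern (the job body and the concurrency limit are arbitrary, and this is not arq's actual API), a semaphore can bound how many jobs touch the I/O at once:

```python
import asyncio


async def process(job_id: int, sem: asyncio.Semaphore) -> int:
    # Bound concurrency the way a worker pool would: only a fixed
    # number of jobs run the (simulated) I/O section at a time.
    async with sem:
        await asyncio.sleep(0.05)  # stand-in for a slow I/O call
        return job_id


async def run_jobs(n_jobs: int, max_workers: int) -> list[int]:
    # gather preserves submission order in its result list.
    sem = asyncio.Semaphore(max_workers)
    return await asyncio.gather(*(process(i, sem) for i in range(n_jobs)))


if __name__ == "__main__":
    print(asyncio.run(run_jobs(n_jobs=10, max_workers=3)))
```

A real distributed queue adds what this sketch lacks: persistence of jobs in Redis, retries, and workers in separate processes.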
Boilerplate for integration services that need to sync API resources or databases
Lately I've been writing asynchronous Python code, and the resource-integration problem has come up again. Since SQLAlchemy became asynchronous in version 1.4, a new boilerplate was created. This time I chose the fully asynchronous Arq as the scheduler. Given the nature of the service, long I/O operations, it seems to run more efficiently asynchronously. I haven't measured the performance yet, but I plan to write another post about it.
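The scheduler role Arq plays in that boilerplate can be sketched with the stdlib alone. This is a hedged illustration, not Arq's cron API: the `sync_resources` job and the interval are invented, and the job merely records a tick where a real service would pull from an API and upsert into a database:

```python
import asyncio


async def sync_resources(state: list[int]) -> None:
    # Hypothetical integration job: in a real service this would
    # fetch API resources and write them to the database.
    state.append(len(state))


async def schedule(interval: float, runs: int) -> list[int]:
    # Minimal fixed-interval scheduler loop, standing in for
    # the cron-style jobs a queue like Arq provides.
    state: list[int] = []
    for _ in range(runs):
        await sync_resources(state)
        await asyncio.sleep(interval)
    return state


if __name__ == "__main__":
    print(asyncio.run(schedule(interval=0.01, runs=3)))
```

Because the job itself is a coroutine, long I/O operations inside it don't block the loop, which is the property the post credits for the service running more efficiently.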
What are some alternatives?
Joblib - Computing with Python functions.
celery - Distributed Task Queue (development branch)
aiomultiprocess - Take a modern Python codebase to the next level of performance.
saq - Simple Async Queues
vermin - Concurrently detect the minimum Python versions needed to run code
celery-sqlalchemy-boilerplate - Boilerplate for services with Celery, SQLAlchemy, Docker, Alembic and Pytest
gevent - Coroutine-based concurrency library for Python
faust - Python Stream Processing. A Faust fork
mnqueues - Monitored Multiprocessing Queues
Flask-RQ2 - A Flask extension for RQ.
regta - 📅 Production-ready scheduler with async, multithreading and multiprocessing support for Python
SQLAlchemy - The Database Toolkit for Python