Flask-RQ2
arq
| | Flask-RQ2 | arq |
|---|---|---|
| Mentions | 3 | 4 |
| Stars | 224 | 1,912 |
| Growth | 0.9% | - |
| Activity | 0.0 | 6.9 |
| Latest Commit | 2 days ago | 2 days ago |
| Language | Python | Python |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
Flask-RQ2
- Wondering if I should use Celery vs threads for what I want to do
From experience I would not use threads for this or any background jobs. I would use Celery or Flask-RQ2 for your workers; you will probably also end up using them to run other tasks as you encounter the need for other jobs. They both use Redis as a broker and job store, and you can use Redis for other things like caching and many other useful features. I like RQ2 a bit more than Celery because it's a little simpler, but Celery has more to offer, more features. RQ2 has rq-dashboard for monitoring jobs and Celery has Flower.
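The broker-and-worker pattern recommended above can be sketched with the standard library alone. In the sketch below, a `queue.Queue` and a worker thread stand in for Redis and an RQ/Celery worker process; the real libraries differ in that jobs are persisted in Redis and workers run as separate processes, but the enqueue/consume shape is the same.

```python
import queue
import threading

jobs = queue.Queue()   # stand-in for the Redis broker
results = []

def worker():
    # Pull jobs until a None sentinel arrives, like a worker loop.
    while True:
        job = jobs.get()
        if job is None:
            break
        func, args = job
        results.append(func(*args))
        jobs.task_done()

def add(x, y):
    return x + y

t = threading.Thread(target=worker)
t.start()
jobs.put((add, (1, 2)))   # "enqueue" a job for the worker
jobs.put((add, (3, 4)))
jobs.put(None)            # shut the worker down
t.join()
print(results)            # [3, 7]
```

With Flask-RQ2 or Celery, the enqueue call crosses process boundaries via Redis, so the web process and the workers can be scaled independently.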
- Tutorials on how to build a flask extension?
However, you might need to access the app’s context like how you’d do so in the Flask-RQ2 extension by using ScriptInfo from flask.cli:
- Application structure for CLI and API in same
For the CLI part, I recommend going with Click. It's great, already part of Flask, and easy to use. Other CLI libraries work too, but why add more libraries? For scheduling jobs you might want to look at Flask-APScheduler. I used it in one of my projects at work for a while but ended up needing something that could scale to many more workers, so I had to rip it out; I didn't have a problem with it otherwise. You might want to fork Flask-APScheduler so you can update some of its libraries, because it hasn't been touched in 2 years now, or you can just use APScheduler alone. Flask-RQ2 and Celery are pretty good too, but the workers need to run in separate processes and need Redis. You could use Redislite, I think, and with Celery you could use a database as your broker; I have done it but don't recommend it.
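To illustrate the in-process scheduling trade-off described above, here is a minimal sketch using the standard library's `sched` module as a stand-in for APScheduler. APScheduler's real API is much richer (background schedulers, cron and interval triggers, persistent job stores); this sketch only shows delayed job execution inside a single process, which is exactly the property that stops this approach from scaling to many workers.

```python
import sched
import time

scheduler = sched.scheduler(time.monotonic, time.sleep)
fired = []

def job(name):
    fired.append(name)

# Schedule two jobs with different delays; they run in delay order.
scheduler.enter(0.01, 1, job, argument=("first",))
scheduler.enter(0.02, 1, job, argument=("second",))
scheduler.run()   # blocks until both jobs have run
print(fired)      # ['first', 'second']
```

Because the scheduler lives inside one process, scaling beyond it means moving to a broker-backed setup (Flask-RQ2, Celery) where separate worker processes pull jobs from Redis.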
arq
- Future Plan for Arq
- The Many Problems with Celery
- I made a simple async queueing framework called SAQ! It includes a built-in web UI to manage jobs.
I need to process a lot of long running IO heavy jobs with background workers. I've been using ARQ for a while but decided to take a crack at writing my own distributed queue.
- Boilerplates for integration services when you need to sync API resources or databases
Lately I've been writing asynchronous Python code, and yes, the resource-integration problem has come up again. Because SQLAlchemy has supported asynchronous execution since version 1.4, a new boilerplate was created. This time, as the scheduler, I took the completely asynchronous arq. Given the specifics of the service, long I/O operations, it seems the service turned out to be more optimal in asynchronous execution. I haven't measured the performance yet, but I think I'll write another post about it.
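The benefit claimed above for long I/O operations can be sketched with plain `asyncio`. arq jobs are async functions that receive a context dict as their first argument; the sketch below mirrors that shape with a stdlib stand-in (no Redis pool or `WorkerSettings`, which real arq workers need), using `asyncio.sleep` in place of a slow HTTP call or database query so the jobs' waits overlap instead of running back to back.

```python
import asyncio

async def fetch(ctx, url):
    # Stand-in for a long I/O operation (HTTP call, DB query, ...).
    await asyncio.sleep(0.01)
    return f"done: {url}"

async def main():
    ctx = {}  # arq passes a shared context dict to each job
    urls = ["a", "b", "c"]
    # All three jobs wait concurrently, so total time is ~one sleep,
    # not three; this is the win for I/O-heavy workloads.
    return await asyncio.gather(*(fetch(ctx, u) for u in urls))

results = asyncio.run(main())
print(results)   # ['done: a', 'done: b', 'done: c']
```

With a real arq worker, the same job functions would be listed in a `WorkerSettings` class and pulled from a Redis queue, but the concurrency model inside the worker is this one.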
What are some alternatives?
rq - Simple job queues for Python
celery - Distributed Task Queue (development branch)
rq-scheduler - A lightweight library that adds job scheduling capabilities to RQ (Redis Queue)
saq - Simple Async Queues
rq-dashboard - Flask-based web front-end for monitoring RQ queues
celery-sqlalchemy-boilerplate - Boilerplate for services with Celery, SQLAlchemy, Docker, Alembic and Pytest
flask-apscheduler - Adds APScheduler support to Flask
faust - Python Stream Processing. A Faust fork
django-todo - A multi-user, multi-group todo/ticketing system for Django projects. Includes CSV import and integrated mail tracking.
SQLAlchemy - The Database Toolkit for Python
flower - Real-time monitor and web admin for Celery distributed task queue
think-async - 🌿 Exploring cooperative concurrency primitives in Python