scrapy-redis
Redis-based components for Scrapy. (by rmax)
scrapyd
A service daemon to run Scrapy spiders (by scrapy)
| | scrapy-redis | scrapyd |
|---|---|---|
| Mentions | 4 | 6 |
| Stars | 5,590 | 3,022 |
| Growth (stars, month over month) | 0.3% | 0.6% |
| Activity | 6.0 | 9.1 |
| Latest commit | 10 months ago | 6 days ago |
| Language | Python | Python |
| License | MIT License | BSD 3-clause "New" or "Revised" License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
scrapy-redis
Posts with mentions or reviews of scrapy-redis. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-26.
- How to make scrapy run multiple times on the same URLs?
- Ask HN: What are the best tools for web scraping in 2022?

  With some work, you can use Scrapy for distributed projects that are scraping thousands (millions) of domains. We are using https://github.com/rmax/scrapy-redis.
- How can I clone a github project to offline machine?

      git clone https://github.com/darkrho/scrapy-redis.git
      cd scrapy-redis
      python setup.py install
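The distributed setup mentioned above works by pointing Scrapy's scheduler and duplicate filter at scrapy-redis, so all workers share one request queue in Redis. A minimal sketch of the relevant settings (component paths follow the scrapy-redis README; the Redis URL is a placeholder for a local instance):

```python
# A sketch of the Scrapy settings that enable scrapy-redis, shown as a
# plain dict. Component paths follow the scrapy-redis README; the Redis
# URL below is an assumed local instance.
SCRAPY_REDIS_SETTINGS = {
    # Queue requests in Redis so multiple workers share one crawl frontier.
    "SCHEDULER": "scrapy_redis.scheduler.Scheduler",
    # Shared duplicate filter: every worker sees the same seen-request set.
    "DUPEFILTER_CLASS": "scrapy_redis.dupefilter.RFPDupeFilter",
    # Keep the Redis queues between runs, allowing pause/resume.
    "SCHEDULER_PERSIST": True,
    # Where the workers find Redis (placeholder address).
    "REDIS_URL": "redis://localhost:6379/0",
}
```

With these settings, each machine runs the same spider and pulls work from the shared queue, which is what makes crawling thousands of domains across several workers practical.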
scrapyd
Posts with mentions or reviews of scrapyd. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2022-08-10.
- Multiple scrapy spiders automation? Executing batch scraping manually now

  Scrapyd is a good option to run your scrapers remotely in the cloud. Adding a Scrapyd dashboard makes the experience better.
- Ask HN: What are the best tools for web scraping in 2022?

  If you decide to have your own infrastructure, you can use https://github.com/scrapy/scrapyd.
- The Complete Scrapyd Guide - Deploy, Schedule & Run Your Scrapy Spiders

  Scrapyd is one of the most popular options. Created by the developers of Scrapy itself, Scrapyd is a tool for running Scrapy spiders in production on remote servers, so you don't need to run them on a local machine.
- The Complete Guide To ScrapydWeb, Get Setup In 3 Minutes!

  ScrapydWeb is one of the most popular open-source Scrapyd admin dashboards. Boasting 2,400 GitHub stars, ScrapydWeb has been fully embraced by the Scrapy community.
- Any paid services for hosting scrapy spiders?

  or scrapyd -> https://github.com/scrapy/scrapyd
- Daily Share Price Notifications using Python, SQL and Africas Talking - Part Two

  While I am aware that you could use Scrapyd to host your spiders and actually send requests, along with ScrapydWeb, I personally prefer to keep my scraper deployment simple, quick, and free. If you are interested in this alternative instead, check out this post written by Harry Wang.
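The "run your scrapers remotely" workflow the posts above describe boils down to Scrapyd's small HTTP JSON API: scheduling a job is a single POST to its `schedule.json` endpoint. A sketch that builds such a request (the host, project, and spider names are placeholder assumptions):

```python
def schedule_request(host, project, spider, **spider_args):
    """Build the URL and form payload for Scrapyd's schedule.json endpoint."""
    url = f"http://{host}/schedule.json"
    # Any extra keyword arguments are passed through as spider arguments.
    payload = {"project": project, "spider": spider, **spider_args}
    return url, payload

url, payload = schedule_request("localhost:6800", "myproject", "myspider")
# POST it with any HTTP client, e.g. requests.post(url, data=payload);
# on success Scrapyd responds with JSON containing a job id.
```

Companion endpoints such as `listjobs.json` and `cancel.json` follow the same pattern, which is what dashboards like ScrapydWeb build on.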
What are some alternatives?
When comparing scrapy-redis and scrapyd you can also consider the following projects:
powerpage-web-crawler - a portable, lightweight web crawler using Powerpage.
Gerapy - Distributed Crawler Management Framework Based on Scrapy, Scrapyd, Django and Vue.js
polite - Be nice on the web
Webscraping Open Project - The web scraping open project repository aims to share knowledge and experiences about web scraping with Python [Moved to: https://github.com/TheWebScrapingClub/webscraping-from-0-to-hero]
wi-page - Rank Wikipedia Article's Contributors by Byte Counts.
SpiderKeeper - admin ui for scrapy/open source scrapinghub