scrapy-crawl-once
Scrapy middleware that allows crawling only new content (by TeamHG-Memex)
scrapy-redis
Redis-based components for Scrapy. (by rmax)
| | scrapy-crawl-once | scrapy-redis |
|---|---|---|
| Mentions | 1 | 4 |
| Stars | 80 | 5,585 |
| Growth (stars, month over month) | - | 0.5% |
| Activity | 0.0 | 6.0 |
| Latest commit | over 2 years ago | 9 months ago |
| Language | Python | Python |
| License | MIT License | MIT License |
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
scrapy-crawl-once
Posts with mentions or reviews of scrapy-crawl-once. We have used some of these posts to build our list of alternatives and similar projects.
- Skip Seen URLs: "You should use https://github.com/TeamHG-Memex/scrapy-crawl-once or even adapt it to your DB."
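For context, the middleware from that suggestion is enabled with a small settings change plus a per-request meta flag. The sketch below follows the scrapy-crawl-once README; the spider, start URL, and CSS selectors are hypothetical placeholders.

```python
# settings.py -- enable scrapy-crawl-once (middleware orders suggested by its README)
SPIDER_MIDDLEWARES = {
    'scrapy_crawl_once.CrawlOnceMiddleware': 100,
}
DOWNLOADER_MIDDLEWARES = {
    'scrapy_crawl_once.CrawlOnceMiddleware': 50,
}

# spider.py -- mark requests whose URLs should be skipped on later runs
import scrapy

class ArticlesSpider(scrapy.Spider):
    name = 'articles'                          # hypothetical spider
    start_urls = ['https://example.com/feed']  # hypothetical start URL

    def parse(self, response):
        for href in response.css('a.article::attr(href)').getall():
            # crawl_once=True asks the middleware to record this request
            # (by default in an SQLite file under .scrapy/crawl_once/) and
            # drop it on later runs if it has already been seen.
            yield response.follow(href, callback=self.parse_article,
                                  meta={'crawl_once': True})

    def parse_article(self, response):
        yield {'url': response.url, 'title': response.css('title::text').get()}
```

Only requests explicitly marked with `crawl_once` are deduplicated across runs, which is also what makes it straightforward to adapt the idea to your own database backend, as the comment suggests.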
scrapy-redis
Posts with mentions or reviews of scrapy-redis. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-26.
- How to make scrapy run multiple times on the same URLs?
- Ask HN: What are the best tools for web scraping in 2022? "With some work, you can use Scrapy for distributed projects that are scraping thousands (millions) of domains. We are using https://github.com/rmax/scrapy-redis."
- How can I clone a GitHub project to an offline machine?
  git clone https://github.com/darkrho/scrapy-redis.git
  cd scrapy-redis
  python setup.py install
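To make the distributed setup from the Ask HN comment concrete, here is a minimal sketch of how scrapy-redis is typically wired in, using the settings and spider base class from its README; the Redis URL, spider name, and redis_key are assumptions, and `pip install scrapy-redis` is the more common alternative to the setup.py install shown above.

```python
# settings.py -- route scheduling and deduplication through Redis
# (setting names from the scrapy-redis README)
SCHEDULER = "scrapy_redis.scheduler.Scheduler"
DUPEFILTER_CLASS = "scrapy_redis.dupefilter.RFPDupeFilter"
SCHEDULER_PERSIST = True                # keep the queue and dupefilter between runs
REDIS_URL = "redis://localhost:6379"    # assumption: a local Redis instance

# spider.py -- a spider that pulls its start URLs from a Redis list,
# so any number of worker processes can share one crawl frontier
from scrapy_redis.spiders import RedisSpider

class DistributedSpider(RedisSpider):
    name = "distributed"                     # hypothetical spider name
    redis_key = "distributed:start_urls"     # hypothetical key; seed it with LPUSH

    def parse(self, response):
        yield {"url": response.url, "title": response.css("title::text").get()}
```

Run the same spider on as many machines as you like with `scrapy crawl distributed`, point them all at the same Redis, and seed work with `redis-cli lpush distributed:start_urls https://example.com`; the shared queue and Redis-backed dupefilter keep the workers from repeating each other's requests.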
What are some alternatives?
When comparing scrapy-crawl-once and scrapy-redis, you can also consider the following projects:
Gerapy - Distributed Crawler Management Framework Based on Scrapy, Scrapyd, Django and Vue.js
powerpage-web-crawler - a portable, lightweight web crawler using Powerpage.
scrapy-rotating-proxies - use multiple proxies with Scrapy
polite - Be nice on the web
scrapydweb - Web app for Scrapyd cluster management, Scrapy log analysis & visualization, auto packaging, timer tasks, monitor & alert, and mobile UI
scrapyd - A service daemon to run Scrapy spiders