| | stock-price-scraper | scrapyd |
|---|---|---|
| Mentions | 3 | 6 |
| Stars | 4 | 2,852 |
| Growth | - | 0.8% |
| Activity | 0.0 | 5.9 |
| Latest commit | over 1 year ago | 3 months ago |
| Language | Python | Python |
| License | MIT License | BSD 3-Clause "New" or "Revised" License |
Stars - the number of stars a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
stock-price-scraper
- Nairobi Stock Exchange Web Scraper (MongoDB Atlas Hackathon 2022 on DEV)
- Daily Share Price Notifications using Python, SQL and Africas Talking - Part Two
  "The full code for this tutorial is on GitHub."
- Daily Share Price Notifications using Python, SQL and Africas Talking - Part One
  "Alternatively, check the finished code on GitHub."
scrapyd
- Multiple scrapy spiders automation? Executing batch scraping manually now
  "Scrapyd is a good option to run your scrapers remotely in the cloud. Adding a Scrapyd dashboard makes the experience better."
- Ask HN: What are the best tools for web scraping in 2022?
  "If you decide to have your own infrastructure, you can use https://github.com/scrapy/scrapyd."
- The Complete Scrapyd Guide - Deploy, Schedule & Run Your Scrapy Spiders
  "Scrapyd is one of the most popular options. Created by the same developers behind Scrapy itself, Scrapyd is a tool for running Scrapy spiders in production on remote servers, so you don't need to run them on a local machine."
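As a minimal sketch of what running a spider through Scrapyd looks like: once a Scrapy project is deployed to a Scrapyd server, a run can be queued over its HTTP JSON API (`schedule.json`). The server address, project name, and spider name below are placeholders, and this assumes a Scrapyd instance on the default port 6800.

```python
import json
from urllib import parse, request

# Placeholder address; Scrapyd listens on port 6800 by default.
SCRAPYD_URL = "http://localhost:6800"

def build_schedule_payload(project: str, spider: str, **spider_args) -> dict:
    """Form parameters expected by Scrapyd's schedule.json endpoint."""
    return {"project": project, "spider": spider, **spider_args}

def schedule_spider(project: str, spider: str, **spider_args) -> str:
    """Queue a spider run and return the job id Scrapyd assigns to it."""
    payload = build_schedule_payload(project, spider, **spider_args)
    data = parse.urlencode(payload).encode()
    with request.urlopen(f"{SCRAPYD_URL}/schedule.json", data=data) as resp:
        body = json.load(resp)
    if body.get("status") != "ok":
        raise RuntimeError(f"Scrapyd rejected the job: {body}")
    return body["jobid"]

# Example usage (requires a running Scrapyd server with a deployed
# project; "myproject" and "prices" are hypothetical names):
# job_id = schedule_spider("myproject", "prices")
```

Because the API is plain HTTP, the same call works from cron, a CI job, or a dashboard such as ScrapydWeb, which is what makes Scrapyd convenient for remote scheduling.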
- The Complete Guide To ScrapydWeb, Get Setup In 3 Minutes!
  "ScrapydWeb is the most popular open-source Scrapyd admin dashboard. Boasting 2,400 GitHub stars, ScrapydWeb has been fully embraced by the Scrapy community."
- Any paid services for hosting scrapy spiders?
  "Or scrapyd -> https://github.com/scrapy/scrapyd"
- Daily Share Price Notifications using Python, SQL and Africas Talking - Part Two
  "While I am aware that we could use Scrapyd to host our spiders and actually send requests, along with ScrapydWeb, I personally prefer to keep my scraper deployment simple, quick, and free. If you are interested in this alternative, check out this post written by Harry Wang."
What are some alternatives?
epg-grabber - Another web scraper built in Python; it produces a rich electronic program guide (EPG) data set from supported sites.
Gerapy - Distributed Crawler Management Framework Based on Scrapy, Scrapyd, Django and Vue.js
scrapydweb - Web app for Scrapyd cluster management, Scrapy log analysis & visualization, auto packaging, timer tasks, monitor & alert, and mobile UI.
scrapyrt - HTTP API for Scrapy spiders
SpiderKeeper - admin ui for scrapy/open source scrapinghub
open-source-badges - Open Source & Licence Badges
polite - Be nice on the web
puppeteer - Node.js API for Chrome
estela - An elastic web scraping cluster
Webscraping Open Project - The web scraping open project repository aims to share knowledge and experiences about web scraping with Python [Moved to: https://github.com/TheWebScrapingClub/webscraping-from-0-to-hero]
chrome-aws-lambda - Chromium Binary for AWS Lambda and Google Cloud Functions