gh-action-data-scraping vs Geo-IP-Database

| | gh-action-data-scraping | Geo-IP-Database |
|---|---|---|
| Mentions | 1 | 1 |
| Stars | 228 | 8 |
| Growth | 0.0% | - |
| Activity | 0.0 | 8.2 |
| Latest Commit | 6 days ago | 5 days ago |
| Language | JavaScript | - |
| License | MIT License | - |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
gh-action-data-scraping - Git scraping: track changes over time by scraping to a Git repository
I do this as a demo: https://github.com/swyxio/gh-action-data-scraping
Conveniently, it also serves as a way to track the downtime of GitHub Actions, which used to be bad but seems to have been fine over the last couple of months: https://github.com/swyxio/gh-action-data-scraping/assets/676...
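The core of the git-scraping pattern is: fetch some data on a schedule, overwrite a file in the repo, and commit only when the content actually changed, so `git log` becomes the change history. A minimal sketch in Python (the helper names `save_snapshot` and `commit_if_changed` are hypothetical, not from the linked repo, and the commit step assumes a `git` binary on PATH):

```python
import pathlib
import subprocess

def save_snapshot(data: str, path: pathlib.Path) -> bool:
    """Write the scraped payload to `path`; return True only if it changed."""
    old = path.read_text() if path.exists() else None
    if old == data:
        return False
    path.write_text(data)
    return True

def commit_if_changed(repo: pathlib.Path, filename: str, data: str) -> None:
    """Hypothetical wrapper: commit the snapshot only when it differs,
    so each commit in the history marks a real change in the data."""
    if save_snapshot(data, repo / filename):
        subprocess.run(["git", "-C", str(repo), "add", filename], check=True)
        subprocess.run(
            ["git", "-C", str(repo), "commit", "-m", "scrape: update data"],
            check=True,
        )
```

In a GitHub Actions setup, a scheduled (cron) workflow would call something like `commit_if_changed` and then push; the no-change case produces no commit, which keeps the history readable.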
Geo-IP-Database - Git scraping: track changes over time by scraping to a Git repository
I have a couple of similar scrapers as well. One is a private repo in which I collect visa information from Wikipedia (for Visalogy.com); another collects GeoIP information from the MaxMind database (used with their permission).
https://github.com/Ayesh/Geo-IP-Database/
It downloads the repo, splits the data by the first 8 bytes of the IP address, and saves it to individual JSON files. On every scraper run it creates a new tag and pushes it as a package, so dependents can simply update through their dependency manager.
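The split-and-dump step described above can be sketched as follows. This is an assumption-laden illustration, not the repo's actual code: it buckets IPv4 entries by their first octet, and the record shape and function names (`bucket_by_prefix`, `dump_buckets`) are hypothetical.

```python
import json
import pathlib
from collections import defaultdict

def bucket_by_prefix(records):
    """Group (ip, country) pairs by the first octet of the IPv4 address.
    Hypothetical record shape, chosen for illustration."""
    buckets = defaultdict(list)
    for ip, country in records:
        prefix = ip.split(".", 1)[0]
        buckets[prefix].append({"ip": ip, "country": country})
    return dict(buckets)

def dump_buckets(buckets, out_dir: pathlib.Path) -> None:
    """Write each bucket to its own JSON file, e.g. out_dir/8.json."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for prefix, rows in buckets.items():
        (out_dir / f"{prefix}.json").write_text(json.dumps(rows, indent=2))
```

Splitting into per-prefix files keeps each diff small, so a scraper run that only changes a few ranges touches only a few files in the Git history.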
What are some alternatives?
bchydro-outages - Track BCHydro Outages via Git history
scrape-san-mateo-fire-dispatch
bbcrss - Scrapes the headlines from BBC News indexes every five minutes
carbon-intensity-forecast-tracking - The reliability of the National Grid's Carbon Intensity forecast
github-actions - Information and tips regarding GitHub Actions
metrobus-timetrack-history - Tracking Metrobus location data