scrape-hacker-news-by-domain
ugit
| | scrape-hacker-news-by-domain | ugit |
|---|---|---|
| Mentions | 4 | 16 |
| Stars | 35 | 1,349 |
| Growth | - | - |
| Activity | 9.9 | 7.1 |
| Last commit | 1 day ago | 3 months ago |
| Language | JavaScript | Shell |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
scrape-hacker-news-by-domain
-
London Street Trees
Yeah I have a bunch of these using pretty-printed JSON - here's one that scrapes Hacker News for mentions of my site, for example: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
-
Git scraping: track changes over time by scraping to a Git repository
Git is a key technology in this approach, because the value you get out of this form of scraping is the commit history - it's a way of turning a static source of information into a record of how that information changed over time.
I think it's fine to use the term "scraping" to refer to downloading a JSON file.
These days an increasing number of websites work by serving up JSON which is then turned into HTML by a client-side JavaScript app. The JSON often isn't a formally documented API, but you can grab it directly to avoid the extra step of processing the HTML.
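The core of the pattern can be sketched in a few lines of shell. This is a hedged illustration, not any of the linked repos' actual code: the `snapshot` function stands in for whatever `curl` or scraper command fetches the data, and committing only when the content changed is what turns repeated snapshots into a browsable history.

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

# Stand-in for the real fetch, e.g.: curl -s "$URL" > data.json
snapshot() {
  echo "$1" > data.json
  git add data.json
  # Commit only if the staged file differs from the last commit.
  if ! git diff --quiet --cached 2>/dev/null; then
    git commit -q -m "Latest data"
  fi
}

snapshot '{"points": 10}'   # first run: commits
snapshot '{"points": 10}'   # unchanged: no new commit
snapshot '{"points": 12}'   # changed: commits

git rev-list --count HEAD   # → 2
```

Running `git log -p data.json` in such a repo then replays every change the source ever published.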
I do run Git scrapers that process HTML as well. A couple of examples:
scrape-san-mateo-fire-dispatch https://github.com/simonw/scrape-san-mateo-fire-dispatch scrapes the HTML from http://www.firedispatch.com/iPhoneActiveIncident.asp?Agency=... and records both the original HTML and converted JSON in the repository.
scrape-hacker-news-by-domain https://github.com/simonw/scrape-hacker-news-by-domain uses my https://shot-scraper.datasette.io/ browser automation tool to convert an HTML page on Hacker News into JSON and save that to the repo. I wrote more about how that works here: https://simonwillison.net/2022/Dec/2/datasette-write-api/
-
Ask HN: Small scripts, hacks and automations you're proud of?
I have a neat Hacker News scraping setup that I'm really pleased with.
The problem: I want to know when content from one of my sites is submitted to Hacker News, and keep track of the points and comments over time. I also want to be alerted when it happens.
Solution: https://github.com/simonw/scrape-hacker-news-by-domain/
This repo does a LOT of things.
It's an implementation of my Git scraping pattern - https://simonwillison.net/2020/Oct/9/git-scraping/ - in that it runs a script once an hour to check for more content.
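The hourly run in a Git scraper is typically a scheduled GitHub Actions workflow. A minimal sketch of one follows; the file names and the scrape command are illustrative assumptions, not the repo's actual workflow:

```yaml
# .github/workflows/scrape.yml (illustrative sketch)
name: Scrape latest data
on:
  schedule:
    - cron: "0 * * * *"   # once an hour
  workflow_dispatch:       # allow manual runs
jobs:
  scrape:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Fetch latest data
        run: ./scrape.sh > hacker-news.json   # hypothetical scrape step
      - name: Commit if changed
        run: |
          git config user.name "github-actions"
          git config user.email "actions@users.noreply.github.com"
          git add -A
          git diff --quiet --cached || git commit -m "Latest data"
          git push
```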
It scrapes https://news.ycombinator.com/from?site=simonwillison.net (scraping the HTML because this particular feature isn't supported by the Hacker News API) using shot-scraper - a tool I built for command-line browser automation: https://shot-scraper.datasette.io/
The scraper works by running this JavaScript against the page and recording the resulting JSON to the Git repository: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
That solves the "monitor and record any changes" bit.
But... I want alerts when my content shows up.
I solve that using three more tools I built: https://datasette.io/ and https://datasette.io/plugins/datasette-atom and https://datasette.cloud/
This script pushes the latest scraped JSON to my SQLite database, hosted on Datasette Cloud, my in-development SaaS platform: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
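Datasette's 1.0 alpha write API accepts JSON rows via an authenticated POST. A hedged sketch of what such a push can look like - the table name, token variable, and row shape are illustrative assumptions, and the network call is shown commented out so the sketch runs offline:

```shell
# Hypothetical payload of scraped rows (shape assumed for illustration).
payload='{"rows": [{"id": 34567890, "title": "Example post", "points": 128}]}'

# Sanity-check the payload is valid JSON before sending.
echo "$payload" | python3 -c 'import json, sys; json.load(sys.stdin)'

# The actual push, following the /<database>/<table>/-/insert endpoint shape:
# curl -s -X POST "https://simon.datasette.cloud/data/hacker_news_posts/-/insert" \
#   -H "Authorization: Bearer $DATASETTE_API_TOKEN" \
#   -H "Content-Type: application/json" \
#   -d "$payload"
```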
I defined this SQL view https://simon.datasette.cloud/data/hacker_news_posts_atom which shows the latest data in the format required by the datasette-atom plugin.
Which means I can subscribe to the resulting Atom feed (add .atom to that URL) in NetNewsWire and get alerted when my content shows up on Hacker News!
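The datasette-atom plugin builds a feed from any query that returns `atom_id`, `atom_updated` and `atom_title` columns (with optional extras such as `atom_link`). A sketch of what such a view might look like - the table and column names here are assumptions, not the real schema:

```sql
CREATE VIEW hacker_news_posts_atom AS
SELECT
  'https://news.ycombinator.com/item?id=' || id AS atom_id,
  last_updated AS atom_updated,
  title AS atom_title,
  'https://news.ycombinator.com/item?id=' || id AS atom_link
FROM hacker_news_posts
ORDER BY last_updated DESC
LIMIT 100;
```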
I wrote a bit more about how this all works here: https://simonwillison.net/2022/Dec/2/datasette-write-api/
-
Datasette’s new JSON write API: The first alpha of Datasette 1.0
I'm really pleased with the Hacker News scraping demo in this - it's an extension of the scraper I wrote back in March, using shot-scraper to execute JavaScript in headless Chrome and write the resulting JSON back to a Git repo: https://simonwillison.net/2022/Mar/14/scraping-web-pages-sho...
My new demo then pipes that data up to Datasette using curl -X POST - this script here: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
ugit
-
You screwed your git history? Don't panic!
View on GitHub
-
Ask HN: Small scripts, hacks and automations you're proud of?
Ugit (undo git): https://github.com/Bhupesh-V/ugit/blob/master/ugit
I got the idea from a Twitter discussion; the project eventually blew up a year after its initial release.
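ugit itself is an interactive shell script covering many operations; much of this kind of recovery leans on git's reflog, which records everywhere HEAD has pointed. A minimal sketch (not ugit's actual code) of undoing an accidental `git reset --hard`:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email demo@example.com
git config user.name demo

echo one > file.txt; git add file.txt; git commit -qm "first"
echo two > file.txt; git add file.txt; git commit -qm "second"

git reset -q --hard HEAD~1   # the mistake: "second" appears lost

# The reflog still has it: entry 2 is where HEAD pointed before the reset.
lost=$(git reflog --format=%H | sed -n 2p)
git reset -q --hard "$lost"

git log --format=%s -1       # → second
```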
-
Fig AI: Translate natural language to bash
Looks nice. A bit of a self-plug: if anyone needs a git undo, I built one recently - it can undo almost 20 operations so far. Link: https://github.com/bhupesh-v/ugit
-
What cool apps are you working on and what problems have you solved?
Nothing big, but I like building devtools, so I built a git undo: https://github.com/Bhupesh-V/ugit
-
Undo your last git mistake with ugit
-
Ugit
-
Hey devs and fellow students. Flex Time.
My recent one is ugit - undo git commands
-
Any beginner friendly repos for hacktoberfest?
self plug: https://github.com/bhupesh-v/ugit
What are some alternatives?
scrape-san-mateo-fire-dispatch
fifa-api - React + Node.js + socket.io -- Multiplayer Game server
shot-scraper - A command-line utility for taking automated screenshots of websites
ofbiz-framework - Apache OFBiz is an open source product for the automation of enterprise processes. It includes framework components and business applications for ERP, CRM, E-Business/E-Commerce, Supply Chain Management and Manufacturing Resource Planning. OFBiz provides a foundation and starting point for reliable, secure and scalable enterprise solutions.
zettelkasten - Creating notes with the zettelkasten note taking method and storing all notes on github
SauceKudasai - Get anime info by image or URL (uses trace.moe and Anilist for anime info)
hun_law_rs - Tool for parsing hungarian laws (Rust version)
dragon - Drag and drop source/target for X
sf-tree-history - Tracking the history of trees in San Francisco
SmartCommit - Automatically generate concise and meaningful Git commit messages from your staged changes using AI.
queensland-traffic-conditions - A scraper that tracks changes to the published Queensland traffic incidents data
Periculum-API - A working API for accessing/scraping KickassTorrents torrent data.