scrape-hacker-news-by-domain vs oxipng

| | scrape-hacker-news-by-domain | oxipng |
|---|---|---|
| Mentions | 4 | 14 |
| Stars | 35 | 2,648 |
| Growth | - | - |
| Activity | 9.9 | 8.6 |
| Latest commit | 2 days ago | 20 days ago |
| Language | JavaScript | Rust |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
scrape-hacker-news-by-domain
-
London Street Trees
Yeah I have a bunch of these using pretty-printed JSON - here's one that scrapes Hacker News for mentions of my site, for example: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
-
Git scraping: track changes over time by scraping to a Git repository
Git is a key technology in this approach, because the value you get out of this form of scraping is the commit history - it's a way of turning a static source of information into a record of how that information changed over time.
I think it's fine to use the term "scraping" to refer to downloading a JSON file.
These days an increasing number of websites work by serving up JSON which is then turned into HTML by a client-side JavaScript app. The JSON often isn't a formally documented API, but you can grab it directly to avoid the extra step of processing the HTML.
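The mechanics of that are small enough to sketch in a couple of shell lines. Here an inline document stands in for the `curl -s <json-endpoint>` fetch (the endpoint varies per site and is usually undocumented); the point is that pretty-printing the JSON before committing keeps the Git diffs readable:

```shell
# Pretty-printing the JSON before committing is what makes Git scraping
# useful: one changed value becomes a one-line diff. The inline document
# below is a placeholder for `curl -s <json-endpoint>` against a real site.
printf '{"title": "Example post", "points": 128, "comments": 45}' \
  | python3 -m json.tool > snapshot.json
cat snapshot.json
```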
I do run Git scrapers that process HTML as well. A couple of examples:
scrape-san-mateo-fire-dispatch https://github.com/simonw/scrape-san-mateo-fire-dispatch scrapes the HTML from http://www.firedispatch.com/iPhoneActiveIncident.asp?Agency=... and records both the original HTML and converted JSON in the repository.
scrape-hacker-news-by-domain https://github.com/simonw/scrape-hacker-news-by-domain uses my https://shot-scraper.datasette.io/ browser automation tool to convert an HTML page on Hacker News into JSON and save that to the repo. I wrote more about how that works here: https://simonwillison.net/2022/Dec/2/datasette-write-api/
-
Ask HN: Small scripts, hacks and automations you're proud of?
I have a neat Hacker News scraping setup that I'm really pleased with.
The problem: I want to know when content from one of my sites is submitted to Hacker News, and keep track of the points and comments over time. I also want to be alerted when it happens.
Solution: https://github.com/simonw/scrape-hacker-news-by-domain/
This repo does a LOT of things.
It's an implementation of my Git scraping pattern - https://simonwillison.net/2020/Oct/9/git-scraping/ - in that it runs a script once an hour to check for new content.
It scrapes https://news.ycombinator.com/from?site=simonwillison.net (scraping the HTML because this particular feature isn't supported by the Hacker News API) using shot-scraper - a tool I built for command-line browser automation: https://shot-scraper.datasette.io/
The scraper works by running this JavaScript against the page and recording the resulting JSON to the Git repository: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
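Reduced to a shell sketch, the loop looks roughly like this. The `scrape` stub below stands in for the real `shot-scraper javascript <url> <script>` call, and the file names and commit message format are my assumptions for illustration, not the repo's actual workflow:

```shell
# Minimal sketch of the Git scraping loop. In the real repo this runs as a
# scheduled GitHub Actions workflow; scrape() here is a stub for:
#   shot-scraper javascript 'https://news.ycombinator.com/from?site=simonwillison.net' "$(cat scrape.js)"
scrape() {
  printf '{"posts": [{"title": "Example post", "points": 12}]}' | python3 -m json.tool
}
git init -q scraper-demo
git -C scraper-demo config user.email demo@example.com
git -C scraper-demo config user.name demo
scrape > scraper-demo/hacker-news.json
git -C scraper-demo add hacker-news.json
# Only commit when the scraped data actually changed since the last run:
git -C scraper-demo diff --quiet --cached || \
  git -C scraper-demo commit -q -m "Latest data: $(date -u +%Y-%m-%dT%H:%MZ)"
git -C scraper-demo log --oneline
```

The `git diff --quiet --cached` guard is what gives the commit history its meaning: every commit corresponds to a real change in the scraped data.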
That solves the "monitor and record any changes" bit.
But... I want alerts when my content shows up.
I solve that using three more tools I built: https://datasette.io/ and https://datasette.io/plugins/datasette-atom and https://datasette.cloud/
This script here runs to push the latest scraped JSON to my SQLite database hosted using my in-development SaaS platform, Datasette Cloud: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
I defined this SQL view https://simon.datasette.cloud/data/hacker_news_posts_atom which shows the latest data in the format required by the datasette-atom plugin.
Which means I can subscribe to the resulting Atom feed (add .atom to that URL) in NetNewsWire and get alerted when my content shows up on Hacker News!
I wrote a bit more about how this all works here: https://simonwillison.net/2022/Dec/2/datasette-write-api/
-
Datasette’s new JSON write API: The first alpha of Datasette 1.0
I'm really pleased with the Hacker News scraping demo in this - it's an extension of the scraper I wrote back in March, using shot-scraper to execute JavaScript in headless Chrome and write the resulting JSON back to a Git repo: https://simonwillison.net/2022/Mar/14/scraping-web-pages-sho...
My new demo also then pipes that data up to Datasette using curl -X POST - this script here: https://github.com/simonw/scrape-hacker-news-by-domain/blob/...
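The shape of that POST is roughly the following sketch. The table name, token variable, and row fields are placeholders I've invented, and the `/-/insert` endpoint shape follows the Datasette 1.0 alpha write API; the live call is left commented out:

```shell
# Build and sanity-check the payload Datasette's write API expects: a JSON
# object with a "rows" list. The id/title/points fields are invented
# examples, not the repo's actual schema.
payload='{"rows": [{"id": 34258239, "title": "Example post", "points": 100}]}'
echo "$payload" | python3 -m json.tool >/dev/null && echo "payload OK"
# The actual push would look something like (endpoint and token assumed):
# curl -X POST 'https://simon.datasette.cloud/data/hacker_news_posts/-/insert' \
#   -H "Authorization: Bearer $DATASETTE_API_TOKEN" \
#   -H 'Content-Type: application/json' \
#   -d "$payload"
```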
oxipng
- OxiPNG: Multithreaded PNG optimizer written in Rust
-
screen capture/snapshot utility with image optimization support/configurability
I have had good experiences with https://github.com/shssoichiro/oxipng, although I suspect it wouldn't give you nearly the space savings that JPEG would.
- Ask HN: Small scripts, hacks and automations you're proud of?
-
Exported png image with color-to-alpha edit is huge
If you do want the file as a PNG (for transparency and a common format that's well supported), but don't want it so huge, consider something like oxipng. https://github.com/shssoichiro/oxipng
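For reference, a typical lossless pass might look like the sketch below. `exported.png` is a placeholder file name; the `-o` (optimization level) and `--strip safe` (drop metadata that doesn't affect rendering) flags are from oxipng's help output. It's guarded so the sketch degrades gracefully when the tool isn't installed:

```shell
# Hypothetical lossless optimization pass with oxipng (file name assumed).
if command -v oxipng >/dev/null && [ -f exported.png ]; then
  oxipng -o 4 --strip safe exported.png
else
  echo "skipping: oxipng or exported.png not present"
fi
```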
-
Name a program that doesn't get enough love!
oxipng, pngquant and svgcleaner — optimizing images
-
Losslessly Optimising Images
I wonder how `pngcrush` compares to `oxipng` (https://github.com/shssoichiro/oxipng).
Personally, I use `oxipng` if I want lossless compression. However, most of the time I use `pngquant` instead, since it gives a significant size reduction even at `99%` quality (I can't distinguish the original from the reduced image):
`pngquant --quality=99 --ext=.png --force file.png`
-
Adobe plans to make Photoshop on the web free to everyone
Depending on your workflow it might make sense to export PNGs directly from Affinity and then reduce their size with a utility like Oxipng, which uses all your cores to find the best algorithm for each particular image.
- OptiPNG vs. PNGcrush vs. Gimp to Reduce PNG Size
-
Help processing massive videos (16k resolution)
Assuming your frames are PNG files, you could use a lossless optimizer like optipng to see whether their size can be reduced. I prefer oxipng, which is faster, multithreaded, and seems to be more actively developed.
-
(Urgent) Best Image Compressor Sites That Barely Compress?
Not sure what image formats you use, but if they're PNG you could use oxipng: https://github.com/shssoichiro/oxipng
What are some alternatives?
scrape-san-mateo-fire-dispatch
squoosh - Make images smaller using best-in-class codecs, right in the browser.
shot-scraper - A command-line utility for taking automated screenshots of websites
ImageOptim - GUI image optimizer for Mac
zettelkasten - Creating notes with the zettelkasten note taking method and storing all notes on github
opencv-rust - Rust bindings for OpenCV 3 & 4
hun_law_rs - Tool for parsing hungarian laws (Rust version)
sharp - High performance Node.js image processing, the fastest module to resize JPEG, PNG, WebP, AVIF and TIFF images. Uses the libvips library.
sf-tree-history - Tracking the history of trees in San Francisco
image - Encoding and decoding images in Rust
queensland-traffic-conditions - A scraper that tracks changes to the published Queensland traffic incidents data
imageproc (PistonDevelopers) - Image processing operations