HN-Paywall-Archiver
capture-website
| | HN-Paywall-Archiver | capture-website |
|---|---|---|
| Mentions | 1 | 2 |
| Stars | 5 | 1,868 |
| Growth | - | - |
| Activity | 10.0 | 6.0 |
| Latest commit | almost 2 years ago | 3 months ago |
| Language | JavaScript | JavaScript |
| License | - | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
HN-Paywall-Archiver
Show HN: A userscript that adds archive URLs below the paywalled HN submissions
Source code: https://github.com/MostlyEmre/hn-anti-paywall
Now let me overexplain.
-Why?-
I never liked paywalled articles. I understand where they come from, but I don't like where our paths cross.
This is why I don't use major news aggregators anymore. Instead, I spend my "catching-up-with-the-world" time on Hacker News. However, Hacker News (HN) also has its fair share of paywalled articles. (Around 11.6%, according to my short-lived, half-assed attempt at measuring it. See my super old data at https://hpa.emre.ca/; I tell the story below.)
-First try-
Around a year ago, when I ran the above experiment, running an experiment wasn't even my goal. During my self-teaching and career-changing process, I decided to build a React HN clone. To make it stand out from the bunch, I added a paywall feature: it would detect paywalled articles and add an archive URL to the metadata.
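The detection-plus-archive-link idea can be sketched in a few lines. This is a minimal, hypothetical version: the domain list is a placeholder, and the `archive.ph/newest/` URL shape is an assumption about how archive.today resolves the latest snapshot, not something taken from the project's actual source.

```javascript
// Hand-maintained list of known paywalled outlets (placeholder; the
// real project's list may differ).
const PAYWALLED_DOMAINS = new Set([
  "nytimes.com",
  "wsj.com",
  "ft.com",
  "bloomberg.com",
  "economist.com",
]);

function isPaywalled(url) {
  // Strip a leading "www." so both forms of the hostname match.
  const host = new URL(url).hostname.replace(/^www\./, "");
  return PAYWALLED_DOMAINS.has(host);
}

function archiveUrl(url) {
  // Assumption: archive.today's /newest/ path redirects to the most
  // recent snapshot of the given URL.
  return `https://archive.ph/newest/${url}`;
}
```

A domain allowlist is crude (it misses soft paywalls and metered sites), but it is cheap and has no false positives beyond list staleness, which fits a hobby project.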
The issue with archiving is that unless someone has already archived the link on the {archiving-project}, the link most likely isn't archived. So me sending people to those projects meant nothing. It kind of meant something to me from an ideological standpoint, but I assume you are not me.
This rubbed me the wrong way. I decided to build a backend (see https://github.com/MostlyEmre/HN-Paywall-Archiver) that would scan the links, automatically detect paywalls in close to real time, and submit the paywalled ones to archive.is for archival. I used Node.js, Firebase, and React. I was -still am- really proud, because I believed it was doing a public good in terms of digital preservation: only one person needed to run this script for everyone to benefit. As an extra, I was curious about how many paywalled articles were being shared, by whom, and at what time, so I also created some analytics functionality to gather the data, and later a UI to present it.
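The core loop of such a backend can be sketched as: poll HN for new stories, filter by paywalled domain, submit the rest for archiving. The Hacker News Firebase endpoints below are the official public API; the archive.today submit URL is an assumption and may need adjusting, and the domain list is again a placeholder.

```javascript
// Official Hacker News API base (Firebase-hosted, no auth needed).
const HN_API = "https://hacker-news.firebaseio.com/v0";
const PAYWALLED = new Set(["nytimes.com", "wsj.com", "ft.com"]);

const hostOf = (url) => new URL(url).hostname.replace(/^www\./, "");

// Fetch the newest story IDs, resolve each item, and keep only
// stories whose domain is on the paywall list.
async function newPaywalledStories(limit = 30) {
  const ids = await (await fetch(`${HN_API}/newstories.json`)).json();
  const stories = await Promise.all(
    ids.slice(0, limit).map((id) =>
      fetch(`${HN_API}/item/${id}.json`).then((r) => r.json())
    )
  );
  return stories.filter((s) => s?.url && PAYWALLED.has(hostOf(s.url)));
}

async function submitToArchive(url) {
  // Assumption: archive.today accepts a submit request of this shape;
  // it is not a documented, stable API.
  await fetch(`https://archive.ph/submit/?url=${encodeURIComponent(url)}`);
}
```

Run on an interval (or a scheduled worker), this is the "only one person needs to run it" design: the archive itself is the shared state, so there is no coordination between users.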
HN-Paywall-Archiver was great, but at some point I stopped running the backend: I couldn't find a way to run my backend code continuously on some platform for cheap -or I didn't try hard enough.
P.S. Recently I've been thinking of remaking this version with Cloudflare Workers.
-Hacker News Paywall Archiver Userscript-
After almost a year, I got into userscripts. Super great, super awesome concept. People seem to hate JavaScript unless it is presented as a userscript. So I decided to get my hands dirty and create a simple solution that solves the paywall issue on HN without breaking any hearts.
My solution is not perfect, as it had to be simple. But here's the rundown.
Pros:
- Does not beg for attention.
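The userscript approach boils down to walking HN's front-page rows and appending an archive link next to paywalled titles. A hedged sketch follows; the `tr.athing` / `span.titleline` selectors match HN's current markup but may drift, the domain list is a placeholder, and the archive.ph URL shape is an assumption rather than the script's actual source.

```javascript
const PAYWALLED = new Set(["nytimes.com", "wsj.com", "ft.com"]);

function archiveLinkFor(url) {
  // Assumption: archive.ph/newest/ redirects to the latest snapshot.
  return `https://archive.ph/newest/${url}`;
}

// Walk every story row and append a small "[archive]" link after the
// title of each paywalled submission.
function annotate(doc) {
  for (const row of doc.querySelectorAll("tr.athing")) {
    const link = row.querySelector("span.titleline > a");
    if (!link) continue;
    const host = new URL(link.href).hostname.replace(/^www\./, "");
    if (!PAYWALLED.has(host)) continue;
    const a = doc.createElement("a");
    a.href = archiveLinkFor(link.href);
    a.textContent = " [archive]";
    link.parentElement.append(a);
  }
}

// In the installed userscript this runs once per page load
// (guarded so the sketch is also loadable outside a browser).
if (typeof document !== "undefined") annotate(document);
```

Because the script only appends a link and never rewrites or hides anything, it stays unobtrusive - which is the "does not beg for attention" point above.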
capture-website
- Sindresorhus/capture-website: Capture screenshots of websites
Automating inclusion of calendar screenshot in GitHub repo after pushing daily solutions
You can pass the cookie to the screenshot action (reference)
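For the calendar-screenshot use case, capture-website's `cookies` option lets you supply a login cookie so the page renders as an authenticated user. A hedged sketch, assuming the package's documented `captureWebsite.file(url, output, options)` shape; the URL, output path, and cookie name/value are placeholders.

```javascript
// Build a cookie string in the browser "name=value; Domain=..." form
// that the `cookies` option accepts.
function sessionCookie(name, value, domain) {
  return `${name}=${value}; Domain=${domain}`;
}

async function screenshotWithLogin(url, output, cookie) {
  // Dynamic import so the sketch only loads capture-website when
  // actually run (the package must be installed: npm i capture-website).
  const { default: captureWebsite } = await import("capture-website");
  await captureWebsite.file(url, output, { cookies: [cookie] });
}

// Example invocation (placeholders throughout):
// await screenshotWithLogin(
//   "https://example.com/calendar",
//   "calendar.png",
//   sessionCookie("session_id", "YOUR_SESSION_VALUE", "example.com")
// );
```

In a GitHub Actions workflow, the session value would come from a repository secret rather than being hard-coded.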