newsit VS hackernews-button

Compare newsit vs hackernews-button and see what their differences are.

newsit

Chrome Extension for Hacker News and Reddit Links (by benwinding)

hackernews-button

Privacy-preserving Firefox extension linking to Hacker News discussion; built with Bloom filters and WebAssembly (by jstrieb)
                   newsit                hackernews-button
Mentions           6                     8
Stars              23                    83
Growth             -                     -
Activity           1.6                   2.8
Latest commit      about 1 year ago      5 months ago
Language           TypeScript            C
License            MIT License           GNU General Public License v3.0 only
The number of mentions indicates the total number of mentions that we've tracked plus the number of user-suggested alternatives.
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.

newsit

Posts with mentions or reviews of newsit. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2023-06-09.
  • Update 4: RedReader granted non-commercial accessibility exemption
    5 projects | /r/RedReader | 9 Jun 2023
    On Firefox there is Newsit & Reddit Checker.
  • Finally found some Epiverse alternative extensions that find reddit threads for any URL!
    1 project | /r/Epiverse | 1 Jul 2022
    Newsit
  • Nyxt browser annotations beat pen and paper, believe me
    3 projects | news.ycombinator.com | 2 Feb 2022
    Something I've always wanted to work on was a browser extension that allowed things like this to happen collaboratively. There's stuff like PeerLibrary[0] that lets you annotate things as a group, but it's limited to publications and things you upload. Nyxt seems to meet what I want but lacks the collaborative aspect I'm looking for.

    There are a couple of browser extensions that have sort of tried to accomplish this before. Epiverse[1] seems to be the most polished one so far. It was originally intended to allow any user to comment on any webpage, but without existing content few people found it useful. So eventually the creator just checked whether the webpage had been posted on Reddit or Hacker News. The original feature ended up being too expensive to host, so the extension ended up just becoming an HN/Reddit parser. Which, tbh, is basically what I want to build at this point. I'd love to contribute to the project, but I don't have much time and it's closed source.

    The other similar extensions also just parse HN/Reddit, like Newsit[2] (which is open source) and Thredd[3] (which only parses Reddit). My only real addition to this is that I'd like to include the ability to parse more than just Reddit and HN – I want to create a discussion aggregator. There are similar sites like Lobste.rs and Lemmy.ml that could also be parsed, but obviously that's not the full extent of where discussions happen around a webpage.

    I don't think I have it figured out, and I don't know if anyone ever will, but I think there's a lot to gain if someone can someday harness that feeling you get when you read something really good or find something really cool and want to see how others responded.

    [0] https://peerlibrary.org/

    [1] https://epiverse.co/

    [2] https://newsit.benwinding.com/

    [3] https://thredd.io/
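
The lookup these extensions perform can be approximated with public APIs. Below is a minimal Python sketch that asks Hacker News (via the public Algolia HN Search API) and Reddit whether a given URL has been submitted. The endpoints are real public APIs, but the field handling here is a rough illustration and is not code from any of the extensions above.

```python
# Rough sketch of a "where was this discussed?" lookup in the spirit of
# Newsit/Epiverse/Thredd. Uses the public Algolia HN Search API and Reddit's
# /api/info endpoint; URL canonicalization and error handling are omitted.
import requests

def hn_discussions(url: str) -> list[dict]:
    """Return HN submissions whose URL matches `url`."""
    resp = requests.get(
        "https://hn.algolia.com/api/v1/search",
        params={"query": url, "restrictSearchableAttributes": "url"},
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {
            "title": hit.get("title"),
            "points": hit.get("points"),
            "comments": hit.get("num_comments"),
            "link": f"https://news.ycombinator.com/item?id={hit['objectID']}",
        }
        for hit in resp.json().get("hits", [])
    ]

def reddit_discussions(url: str) -> list[dict]:
    """Return Reddit submissions of `url`."""
    resp = requests.get(
        "https://www.reddit.com/api/info.json",
        params={"url": url},
        headers={"User-Agent": "discussion-lookup-sketch/0.1"},
        timeout=10,
    )
    resp.raise_for_status()
    return [
        {
            "subreddit": child["data"]["subreddit"],
            "comments": child["data"]["num_comments"],
            "link": "https://www.reddit.com" + child["data"]["permalink"],
        }
        for child in resp.json()["data"]["children"]
    ]

if __name__ == "__main__":
    page = "https://peerlibrary.org/"
    print(hn_discussions(page))
    print(reddit_discussions(page))
```

Adding further sources (Lobste.rs, Lemmy, etc.) would mean finding an equivalent search endpoint per site, which is the "aggregator" work the comment describes.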

  • Here's an article that might be of interest
    2 projects | /r/freebsd | 22 Jul 2021
    I often use Newsit to tell whether something has been posted.
  • Show HN: Privacy-preserving browser extension linking to HN discussion
    4 projects | news.ycombinator.com | 1 Mar 2021
    Nice – so it periodically retrieves a list of HN posts and queries that list locally, meaning you're not telling Algolia any specific site details.

    There are millions of posts on HN – how many submissions does it retrieve? Surely if you find an obscure site, it might not be in the local list.

    Maybe you could just always query many URLs together, with only one of them being the real URL you want. That would make it hard to track too.

    Also I made a similar extension, but it queries sources on every page load.

    (extension) https://newsit.benwinding.com/

    (source) https://github.com/benwinding/newsit
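
The "query many URLs together" suggestion is essentially k-anonymity for lookups: the server sees a batch of candidate URLs and can't tell which one the user is actually visiting. Here is a toy Python sketch of the client side – `lookup_batch` is a hypothetical stand-in for whatever backend would be queried, not an API of either extension:

```python
# Toy sketch of the decoy-query idea: mix the page actually being visited with
# random decoy URLs so the lookup service only learns that one of N URLs was
# visited. `lookup_batch` is a placeholder for the real backend call.
import random

DECOY_POOL = [
    "https://example.com/",
    "https://en.wikipedia.org/wiki/Bloom_filter",
    "https://lobste.rs/",
    "https://news.ycombinator.com/",
    # ...ideally a large pool of plausible, frequently visited URLs
]

def private_lookup(real_url: str, lookup_batch, num_decoys: int = 4) -> bool:
    """Query `real_url` mixed with decoys; return whether it has a discussion."""
    batch = random.sample(DECOY_POOL, k=min(num_decoys, len(DECOY_POOL)))
    batch.append(real_url)
    random.shuffle(batch)            # don't leak the real URL by position
    results = lookup_batch(batch)    # expected shape: {url: bool} from the server
    return results.get(real_url, False)
```

The obvious weakness is that decoys have to be statistically plausible; that is part of why shipping the whole set to the client, as hackernews-button does with a Bloom filter, gives a stronger privacy guarantee.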

  • Ask HN: Is anyone still using Gulp?
    2 projects | news.ycombinator.com | 8 Feb 2021
    Yeah, that's true. I use gulp for browser extension builds, as there are a few random tasks (image compression, copying files, Babel) that need to be done in order to build cross-platform. It would be a pain in webpack.

    My example: https://github.com/benwinding/newsit/blob/master/gulpfile.js

hackernews-button

Posts with mentions or reviews of hackernews-button. We have used some of these posts to build our list of alternatives and similar projects. The last one was on 2021-06-13.
  • GitHub - jstrieb/hackernews-button: Privacy-preserving Firefox extension linking to Hacker News discussions; built with Bloom filters and WebAssembly
    1 project | /r/firefox | 15 Jan 2022
  • Ask HN: I curate HN stories which didn't reach the front page. Feedback please
    3 projects | news.ycombinator.com | 13 Jun 2021
    It's worth noting that my extension is far from perfect – it turns out that determining whether a specific page has been submitted to Hacker News is not a trivial problem to solve. In general, this is because multiple URLs can map to the same page.

    Direct string comparison of the current URL to previously submitted ones doesn't work because there are many ways for two identical web pages to have different URLs. For example, the URL fragments can differ (the part after the "#", which may or may not be present). There can also be tracking parameters (often, but not necessarily, prefixed with "utm_"), which don't change anything about the page. But URL parameters can't be entirely disregarded, because some sites, forums in particular, rely on them – consider forums that use an "?id=..." parameter to distinguish pages. Thus some parameters should be removed, but some shouldn't. The same website having different domains (or domains that change over time) further complicates the situation.

    My solution was to "canonicalize" URLs by transforming them into a simplified form using some pretty rough heuristics for common sources of noise. The Python code to do that is here: https://github.com/jstrieb/hackernews-button/blob/master/can...

    All of this to say that even though I've used my extension for months and have been quite happy, there will inevitably be false negatives.
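
The post links the author's actual Python script; as a rough, much-simplified illustration of the heuristics it describes (drop the fragment, strip tracking parameters, keep meaningful query parameters, normalize the host), here is a sketch – not the author's code, and the tracking-prefix list is only an example:

```python
# Rough sketch of URL canonicalization as described above: strip fragments and
# tracking parameters, normalize the host, keep meaningful query parameters.
# The real canonicalization script in the repository uses more heuristics.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PREFIXES = ("utm_", "fbclid", "gclid")  # illustrative, not exhaustive

def canonicalize(url: str) -> str:
    parts = urlsplit(url)
    host = parts.netloc.lower()
    if host.startswith("www."):
        host = host[len("www."):]
    # Keep query parameters (forums rely on e.g. ?id=...), but drop obvious trackers.
    query = [
        (k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
        if not k.lower().startswith(TRACKING_PREFIXES)
    ]
    path = parts.path.rstrip("/") or "/"
    # Drop the scheme distinction and the fragment entirely.
    return urlunsplit(("https", host, path, urlencode(query), ""))

print(canonicalize("http://www.example.com/post/?id=42&utm_source=hn#comments"))
# -> https://example.com/post?id=42
```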

  • Privacy-preserving Firefox extension linking to Hacker News discussion; built with Bloom filters and WebAssembly
    1 project | /r/coolgithubprojects | 1 Mar 2021
    1 project | /r/github | 1 Mar 2021
    1 project | /r/programming | 1 Mar 2021
  • Show HN: Privacy-preserving browser extension linking to HN discussion
    4 projects | news.ycombinator.com | 1 Mar 2021
    Thanks for clearing that up. Yes, I'm not that familiar with Bloom filters – they seem like an interesting and useful concept. They could probably (pun intended) be applied to many applications to increase privacy.

    I like the workflow file [1] you've made; the comments really help with reading shell code. I'm also amazed you can query 4M entries every day with BigQuery – I thought that might be fairly expensive to do, right? Or is this below the free tier?

    [1] https://github.com/jstrieb/hackernews-button/actions/runs/61...
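
For readers who, like the commenter, are new to Bloom filters: the extension's privacy property comes from shipping a compact, probabilistic set of every submitted (canonicalized) URL to the browser, so membership checks never touch the network. False positives are possible; false negatives only arise from canonicalization mismatches, not from the filter itself. Here is a toy Python sketch of the idea – the real extension implements this in C compiled to WebAssembly:

```python
# Toy Bloom filter illustrating the approach: build the filter from all
# submitted URLs offline, ship it to the browser, and check membership locally
# so no visited URL is ever sent to a server.
import hashlib

class BloomFilter:
    def __init__(self, size_bits: int = 1 << 20, num_hashes: int = 7):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: str):
        # Derive several bit positions per item from independent hashes.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item: str) -> bool:
        # May return a false positive, never a false negative.
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

# Offline: build from the full list of submitted URLs (e.g. a daily export).
bf = BloomFilter()
bf.add("example.com/post?id=42")

# In the browser (locally, no network request): check the current page.
print("example.com/post?id=42" in bf)   # True
print("example.com/other" in bf)        # False (with high probability)
```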

What are some alternatives?

When comparing newsit and hackernews-button you can also consider the following projects:

RedditRepostSleuth - A high performance repost detection and administration bot for Reddit.

minwiz - Minimal starter kit for under 2 KB sites

xorfilter - Go library implementing binary fuse and xor filters

thredd - Collaborative Browsing

promnesia - Another piece of your extended mind