My First Impressions of Web3

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

Our great sponsors
  • SurveyJS - Open-Source JSON Form Builder to Create Dynamic Forms Right in Your App
  • WorkOS - The modern identity platform for B2B SaaS
  • InfluxDB - Power Real-Time Data Analytics at Scale
  • arweave

    The Arweave server and App Developer Toolkit.

    There is Arweave, which is trying to bring permanent storage. You could store the NFT's data on the Arweave chain and mint the NFT against it.

    https://www.arweave.org/

    Though, I'm not sure how it will "scale".

  • Joplin

    Joplin - the secure note taking and to-do app with synchronisation capabilities for Windows, macOS, Linux, Android and iOS.

    I love the idea(ls) of cryptocurrencies and yet I hate "web3" because it's a misnomer that led to a series of misconstructions:

    Web3 is futile because it attempts to rebuild the Web (1) on an abysmally resource-constrained global computer which (2) uses a bunch of protocols that make it impossible to interact with using web browsers, thus requiring a series of intermediary parties whom participants have to rely on. It is not even the fact that I need to trust those intermediaries, as I trust a bunch of Web 2 corporations for some of the most critical services anyway, but the fact that we end up where we started, except it is now more expensive and much slower.

    It is easy to dismiss Web3 as such, but that would not be fruitful. Beyond all the financial incentives, I (would like to) believe that there is a group of people who are sincerely interested in a more decentralized web, or rather, a web that is decentralized in a fundamentally different way than Web 2 and Web 1 are and were. To make it more concrete, there is an interest in decoupling the authoring and hosting of web services; Linux distributions had mirrors all over the world for the efficient distribution of data years before BitTorrent, so the magic of BitTorrent was not just its efficiency promises, but in bringing content-addressed data to the masses and thus decoupling the authoring (torrent creation) and the hosting (seeding) of content. Instead of having to ask Debian's permission to set up a mirror, I could now simply seed its torrent. It thus mattered that this decoupling was implemented not at a social level (mirrors) but at a protocol level (peers).

    You may be familiar with the concept of cardinality in databases: one-to-one, one-to-many, many-to-many. Indeed, it can be just as useful for describing the access patterns of databases:

    (A) A one-for-one database is where a single writer is storing data for themselves. In the world of decentralized apps (not necessarily crypto-ridden web3), a good example is draw.io (and Zero Data Apps[0] in general) which allows you to "bring your own storage". On desktop, you have Joplin[1] for note-keeping that can synchronize to various cloud services.

    (B) A one-for-many database is where a single writer is distributing content to many. BitTorrent and IPFS are prime examples of this.

    (C) On the other hand, a many-for-many database is one in which multiple writers store data for multiple readers. Centralized examples of this are Hacker News, Twitter, Reddit, and so on... This is what web3 attempts to be. There are a couple of application-level attempts[2] at this, but, apart from blockchains, not much at a lower level that could enable arbitrary many-for-many use cases.

    Sadly, the critics of web3 do not acknowledge that there are legitimate use cases for decentralized many-to-many databases that would, for instance, allow members of Hacker News to host it in the same way that they are able to seed an existing torrent, and there are currently no application-agnostic solutions other than blockchains. Sadly, again, the proponents of web3 do not realize that the consistency guarantees of a financial ledger are unnecessarily strict for many use cases.

    I am working on a many-for-many database with much weaker consistency guarantees, based on CRDTs and using SQLite, designed to be used in browsers from day one (hence, as an example, using P-256[3] for public-key cryptography rather than Bitcoin's and Ethereum's secp256k1, as the former is readily available in WebCrypto). This is something I do in my spare time, 100% for experimentation and fun, without any financial motives or elements; let me know if you are interested in collaborating or following, email in the bio.
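The weaker-than-a-ledger consistency mentioned above can be sketched with a last-writer-wins map, one of the simplest CRDTs. This is purely an illustrative toy, not the actual design of the project: the point is that concurrent writers merge deterministically without any global ordering or consensus.

```python
# Illustrative sketch: a last-writer-wins (LWW) map CRDT. Each value is a
# (timestamp, writer_id, payload) tuple; the highest (timestamp, writer_id)
# pair wins per key, so merges need no coordination between peers.

def merge(a: dict, b: dict) -> dict:
    """Merge two LWW maps deterministically."""
    out = dict(a)
    for key, entry in b.items():
        if key not in out or entry[:2] > out[key][:2]:
            out[key] = entry
    return out

# Two peers edit the same note concurrently...
peer1 = {"note/1": (5, "alice", "hello")}
peer2 = {"note/1": (7, "bob", "hola"), "note/2": (3, "bob", "todo")}

# ...and merging in either order yields the same state (commutativity).
assert merge(peer1, peer2) == merge(peer2, peer1)
```

Because the merge is commutative, associative, and idempotent, peers can exchange states in any order over any transport and still converge, which is why nothing like a blockchain's total ordering is required.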

    ----

    [0] https://0data.app/

    [1] https://joplinapp.org/

    [2] https://getaether.net/

    [3] https://developer.mozilla.org/en-US/docs/Web/API/EcKeyGenPar...

  • aether

    Aether client app with bundled front-end and P2P back-end

  • nimbus-eth2

    Nim implementation of the Ethereum Beacon Chain

    > If you care about the environment even a little bit (like turning off lights in rooms you're not occupying) then you will reject Web3. Even the most efficient blockchains use more energy than the status quo unnecessarily.

    On an Intel NUC (Core i3, low power mode) I'm running a non-mining Ethereum 1 full node[1] plus a staking Ethereum 2 node[2] (comprising two active validators) on mainnet. Measured with a Kill A Watt[3] since the genesis of the beacon chain, it's using approximately 140 kWh of electricity per year (about USD 15/year where I live), and makes use of the Internet connection that I use for everything else, personal and work related. The Ethereum 1 node also acts as my personal gateway to Ethereum vs. say my needing to connect through Infura.

    There are today 279,235 active validators[4] on Ethereum mainnet. Now, I know that Ethereum hasn't made the switch over to Proof of Stake yet (that's what Eth 2 is all about) but it's coming this year. Let's ignore the kWh usage of my non-mining full Eth 1 node and assume the 140 kWh is split evenly by the validators (it's not even close, the Eth 1 node is a pig in comparison, but for the sake of argument), then round each one up to 100 kWh per year and assume that's the average per validator going forward, and let's grow the beacon chain to 1 million active validators. So that's 100k MWh per year. Amazon reported[5] that they consumed 24 million MWh in 2020. I'm not sure how many combined MWh are consumed by the data centers for VISA, traditional banks, etc., but I'm guessing it's nothing to sneeze at.

    According to Statista[6], it costs about 150 kWh for VISA to process 100k transactions. According to VISA[7] they processed about 206 billion transactions over 12 months. So that's about 309k MWh.
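The arithmetic above can be checked in a few lines, under the same assumptions as the comment (100 kWh/year per validator, a hypothetical 1 million validators, VISA at 150 kWh per 100k transactions and roughly 206 billion transactions per year):

```python
# Back-of-the-envelope check of the figures quoted above.

KWH_PER_VALIDATOR_YEAR = 100          # rounded-up per-validator figure
VALIDATORS = 1_000_000                # hypothetical future validator count
eth2_mwh = KWH_PER_VALIDATOR_YEAR * VALIDATORS / 1000   # kWh -> MWh

visa_transactions = 206e9             # ~206 billion transactions / year
visa_kwh_per_100k = 150               # kWh per 100k transactions
visa_mwh = visa_transactions / 100_000 * visa_kwh_per_100k / 1000

print(f"Eth2 (projected): {eth2_mwh:,.0f} MWh/year")   # 100,000 MWh
print(f"VISA:             {visa_mwh:,.0f} MWh/year")   # 309,000 MWh
```

So, under these assumptions, even a much larger beacon chain would land at roughly a third of VISA's data-center energy use.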

    A couple of other things to consider: Ethereum devs are concerned about energy consumption, and there are active efforts by the various projects (nimbus, teku, etc.) to drive down the energy cost per validator. Also, my Core i3 Intel NUC is pretty heavy-duty compared to the lower-end hardware capable of running a validator node. So I expect the energy cost per year of Eth 2 to improve in the coming years.

    [1] https://geth.ethereum.org/

    [2] https://github.com/status-im/nimbus-eth2#readme

    [3] https://en.wikipedia.org/wiki/Kill_A_Watt

    [4] https://beaconscan.com/

    [5] https://sustainability.aboutamazon.com/environment/sustainab...

    [6] https://www.statista.com/statistics/1265891/ethereum-energy-...

    [7] https://usa.visa.com/dam/VCOM/global/about-visa/documents/ab...

  • go-ethereum

    Official Go implementation of the Ethereum protocol

  • iroha

    Iroha - A simple, enterprise-grade decentralized ledger

    > Blockchains are designed to be a network of peers, but not designed such that it’s really possible for your mobile device...

    If I am not mistaken, Hyperledger Iroha[0] has (or had?) that as one of its goals.

    [0] https://github.com/hyperledger/iroha

  • portal-network-specs

    Official repository for specifications for the Portal Network

    From the post:

    > People don’t want to run their own servers, and never will...

    Fair enough, but there are active efforts to develop ultra-light clients for Ethereum together with the concept of "portal network":

    https://github.com/ethereum/portal-network-specs/

    https://our.status.im/nimbus-fluffly/

    > there’s not even a word for an actual untrusted client/server interface that will have to exist somewhere, and no acknowledgement that if successful there will ultimately be billions (!) more clients than servers.

    I would not say there's "no acknowledgement" of this; depending on how deep you are in the space, it's pretty obvious that the goal is to have layered networks and mission-specific networks (storage vs. messaging vs. consensus), all economically incentivized, that are p2p through and through, from the resource-constrained devices of end consumers to the staking nodes that secure the networks. That's the hope, the goal, and the focus of ongoing efforts.

    The opposite of the missing word is "a node in a p2p network".

    The points made about the difficulty in evolving protocols quickly are not lost on me, but I guess I'm more optimistic than the author that it will happen relatively quickly in coming years, including this one. In the process, there will be opportunities seized where the protocols fall short and half-measures or worse (with respect to decentralization) will generate excitement for a time. That seems like "growing pains" to me.

  • solana

    Web-Scale Blockchain for fast, secure, scalable, decentralized apps and marketplaces.

    > Instead of storing the data on-chain, NFTs instead contain a URL that points to the data. What surprised me about the standards was that there’s no hash commitment for the data located at the URL.

    I've recently been exploring the Solana[0] NFT ecosystem. The situation is similar there, and I admit it took me by surprise at first. However, upon further inspection, there's more to the story.

    As others here have mentioned, most serious ETH collections address this problem using IPFS, but on Solana, Arweave[1] is a popular solution. I had never heard of Arweave before, and it's a seriously cool concept. In a nutshell, it's a system that allows you to pay for 200+ (potentially many more) years of storage _up front_. I won't pretend to understand it all, but it effectively pays the network of miners to host your assets indefinitely. The up-front payment, which is steep compared to traditional hosting, provides a "sustainable endowment" for these mining rewards. This allows you to guarantee that the asset will be available without counting on some random hosted storage system.

    It seems that NFTs are the main use case for such a system at the moment. However, I can imagine other use cases emerging as an answer to a question I never really thought to ask: "How can I ensure that an asset is hosted forever?" It's an interesting problem, and an interesting solution that a network like this, with its marriage of decentralized technology and economic incentives, is uniquely poised to address.
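The "sustainable endowment" idea can be sketched as a geometric series: if the cost of storing a fixed amount of data falls by a constant fraction every year, the total cost of storing it forever is finite. The decline rate below is purely illustrative and is not Arweave's actual parameter.

```python
# Toy model of a storage endowment: pay once, fund storage indefinitely,
# assuming storage costs fall by `annual_decline` (a fraction) each year.
# The sum of cost_0 * (1 - d)^t over t = 0, 1, 2, ... converges to
# cost_0 / d, so the endowment is a finite multiple of one year's cost.

def endowment_multiple(annual_decline: float) -> float:
    """How many multiples of this year's storage cost fund it forever."""
    assert 0 < annual_decline < 1
    return 1 / annual_decline

# If storage gets 30% cheaper every year, ~3.33x one year's cost suffices.
print(round(endowment_multiple(0.30), 2))  # 3.33
```

This is why the up-front price is steep relative to one year of traditional hosting: you are prepaying the whole (convergent) series, not a single year.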

    [0] https://solana.com/

  • nft.storage

    😋 Free decentralized storage and bandwidth for NFTs on IPFS and Filecoin.

    It's better to think of IPFS as a protocol similar to HTTP...instead of being name-addressed and requiring the name owner to provide the infrastructure to serve the data, data are content-addressed so that anyone can serve the data. That's the "decentralized" nature of IPFS.

    Many NFTs are hosted by NFT platforms, and also by services such as https://nft.storage/ (backed by IPFS & Filecoin). It's quite trivial though to take the IPFS CID and pin it somewhere else (local computer, a pinning service like Pinata, etc.), and anyone can do it at any time. If all you want to do is be able to prove ownership at some point in the future, you don't really need to host the content indefinitely on IPFS...just host it when you need to.
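The "anyone can serve the data" property follows directly from content addressing. A toy sketch of the underlying idea (this is not a real IPFS CID, which adds multihash/CID encoding on top of the hash):

```python
# Toy illustration of content addressing: the address is derived from the
# bytes themselves, so any peer serving matching bytes is, by construction,
# serving the right data. Trust moves from the host to the hash.
import hashlib

def address(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

store = {}                      # any peer can hold such a mapping
blob = b"nft image bytes"
cid_like = address(blob)
store[cid_like] = blob          # "pinning" the content on this peer

# A reader verifies integrity locally, no matter who served the bytes.
assert address(store[cid_like]) == cid_like
```

This is why re-pinning elsewhere is trivial: the identifier never changes when the hosting does.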

  • moonworm

    codegen for crypto degens and other ethereum smart contract toolkit for python

    We have built an open source tool that you can connect to any node (on an Ethereum-based blockchain) and instantly start building datasets about contracts that you care about. All you need is their ABI.

    https://github.com/bugout-dev/moonworm

    We are committed to keeping this code free. Our policy is only to charge for our operational expertise, but all the code that we use is open source. We are in the process of opening our platform up for decentralization (so anyone can contribute node time, storage, etc.).

    Intellectual property is theft.

  • annotated-spec

    Vitalik's annotated eth2 spec. Not intended to be "the" annotated spec; other documents like Ben Edgington's https://benjaminion.xyz/eth2-annotated-spec/ also exist. This one is intended to focus more on design rationale.

    The crux of the article is that the front-ends are all routing calls through centralized APIs to get their messages included on the blockchain. Infura and Alchemy don't do much: they just pass a JSON-RPC message to an Ethereum node running on their servers. There are some additional indexing services they provide, but there are many open, decentralized alternatives for that, such as The Graph protocol. And it's not infeasible for an application to run its own Postgres instance to index data from the ETH blockchain.
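To make the "just pass a JSON-RPC message" point concrete, here is roughly what such a message looks like. `eth_blockNumber` is a standard Ethereum JSON-RPC method; the local endpoint in the comment is an assumption about a default geth setup:

```python
# The payload a dapp front-end sends through Infura or Alchemy is a plain
# JSON-RPC 2.0 request that any self-hosted Ethereum node accepts on its
# HTTP endpoint.
import json

request = {
    "jsonrpc": "2.0",
    "method": "eth_blockNumber",   # standard Ethereum JSON-RPC method
    "params": [],
    "id": 1,
}

# POSTing this body to http://localhost:8545 (geth's default HTTP port)
# or to a hosted provider's URL is interchangeable; the provider adds no
# protocol magic of its own.
print(json.dumps(request))
```

Swapping a provider URL for your own node is therefore a configuration change, not an architectural one.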

    As for full-fat clients on normal mobile devices, the main issue is the data requirement: running a full node can take hundreds of gigabytes. It is possible on light hardware (people are running Beacon Chain nodes on Raspberry Pis), but you do need the storage, and that tends to be scarce on mobile.

    Meanwhile, the Ethereum core devs are aware of this issue and are actively working on it. They shipped the Altair hard fork this year, which adds sync committees that make it possible to follow the chain without needing the whole chain history (using Merkle trees): https://github.com/ethereum/annotated-spec/blob/master/altai...
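As background on the Merkle-tree machinery that makes such light clients possible, here is a minimal, generic sketch (not the Altair sync-committee protocol itself): a verifier holding only a root hash can check membership with a logarithmic-size proof instead of downloading the whole data set.

```python
# Minimal Merkle tree: build a root, produce an inclusion proof for one
# leaf, and verify it against the root alone. Odd levels duplicate their
# last node, a common convention for non-power-of-two leaf counts.
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def proof(leaves: list[bytes], index: int) -> list[tuple[bytes, bool]]:
    """Sibling hashes from leaf to root; bool marks 'sibling is on right'."""
    level = [h(leaf) for leaf in leaves]
    path = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        path.append((level[sib], sib > index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return path

def verify(leaf: bytes, path: list[tuple[bytes, bool]], root: bytes) -> bool:
    node = h(leaf)
    for sibling, right in path:
        node = h(node + sibling) if right else h(sibling + node)
    return node == root

leaves = [b"block header %d" % i for i in range(5)]
root = merkle_root(leaves)
assert verify(leaves[3], proof(leaves, 3), root)
```

Swap the toy leaves for block headers and you have the gist: a light client stores roots, requests small proofs, and never fetches the full history.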

    The light client to follow from those improvements is forthcoming:

NOTE: The number of mentions on this list indicates mentions on common posts plus user suggested alternatives. Hence, a higher number means a more popular project.
