Show HN: Postgres.js – Fastest Full-Featured PostgreSQL Client for Node and Deno

This page summarizes the projects mentioned and recommended in the original post on news.ycombinator.com

  • postgres

    Postgres.js - The Fastest full featured PostgreSQL client for Node.js, Deno, Bun and CloudFlare (by porsager)

    It greatly outperformed the alternatives[1] by using pipelining and prepared statements, while providing a much better development experience that is safe from SQL injections. Since then I've been busy building things with it, now running in production, and although quite delayed, I'm so happy to release a new major version today with some really exciting new features:

    1. Realtime subscribe to changes through Logical Replication [2]

    It's now possible to use logical replication to subscribe in realtime to any changes in your database with a simple API like `sql.subscribe('insert:events', row => ...)`. Inspired by Supabase Realtime, you can now have it yourself in Node.
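
    As a minimal sketch of how that API reads in practice (the connection string and the `events` table are placeholders, and logical replication must be enabled on the server):

        import postgres from 'postgres'

        const sql = postgres('postgres://user:pass@localhost:5432/mydb') // placeholder connection string

        // Call the handler for every INSERT into the "events" table.
        await sql.subscribe('insert:events', row => {
          console.log('new event', row)
        })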

    2. A Safe Dynamic Query Builder

    Nesting the sql`` tagged template literal function lets you build highly dynamic queries while staying safe, because everything is still sent as parameterized queries.
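
    As a rough sketch of that nesting pattern (the table and columns here are invented for illustration), conditions can be composed from sql`` fragments while each interpolated value is still sent as a bound parameter:

        const activeOnly = true
        const minAge = 21

        // Nested sql`` fragments are merged into the outer query text;
        // ${minAge} remains a query parameter, never string-concatenated.
        const users = await sql`
          select * from users
          where true
          ${activeOnly ? sql`and active = true` : sql``}
          ${minAge ? sql`and age >= ${minAge}` : sql``}
        `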

    3. Multi-host connection URLs for High Availability support

    It's really nice to be able to quickly spin up a High Availability Postgres setup using pg_auto_failover[3] and connect using Postgres.js with automatic failover and almost 0 downtime.
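
    A quick sketch of such a connection (hosts and credentials are placeholders): listing several hosts in one URL lets the client move on to the next host when the current primary becomes unavailable.

        // If host1 goes down, the client retries against host2 and then host3.
        const sql = postgres('postgres://user:pass@host1:5432,host2:5432,host3:5432/mydb')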

    4. Deno support

    It also works with Deno now, passing all tests except a few SSL-specific ones which require fixes in Deno.

    5. And much more

    Large object support, efficient connection handling for large-scale use, cancellation of requests, TypeScript support, and async cursors.
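
    For instance, the async cursor support means a large result set can be consumed in batches instead of being buffered in memory all at once. A small sketch, assuming a big_table with many rows:

        // Fetch and process 100 rows at a time via an async iterator.
        for await (const rows of sql`select * from big_table`.cursor(100)) {
          for (const row of rows) {
            // process row
          }
        }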

    [1] https://github.com/porsager/postgres-benchmarks#results

  • postgres-benchmarks

    A set of benchmarks focusing on the performance of Postgres client libraries for Node.js

  • npmgraph

    A tool for exploring NPM modules and dependencies

    There is quite a big difference, but I'll highlight some of the main points here. Note that I'm the author of Postgres.js, so I might be biased.

    Slonik is a wrapper around another Node driver (pg), so its performance is the same as pg's, whereas Postgres.js is significantly faster (2-5x)[1].

    Postgres.js is also a zero-dependency module, whereas Slonik has quite the dependency graph - compare https://npmgraph.js.org/?q=slonik with https://npmgraph.js.org/?q=postgres. That makes it more difficult to audit the code and to prevent supply chain attacks, etc.

    Slonik also doesn't have the same lean, straightforward developer experience as Postgres.js - again, I'm biased saying that ;)

    Postgres.js also does things that will make your queries perform better out of the box by implicitly creating prepared statements and using pipelining.
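
    As a rough illustration of what that means in practice (the tables are invented, and the exact batching is up to the driver): firing independent queries without awaiting each one lets them share connections and be pipelined, and each distinct query text is reused as a prepared statement on later calls.

        // Both queries can be in flight on the same connection at once.
        const [user, orders] = await Promise.all([
          sql`select * from users where id = ${id}`,
          sql`select * from orders where user_id = ${id}`
        ])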

    I suppose you could build Slonik on top of Postgres.js instead of pg as well, but that would probably only make sense if migrating to Postgres.js.

    [1] http://github.com/porsager/postgres-benchmarks#results

  • imba

    🐤 The friendly full-stack language

    What a coincidence seeing this on HN. Great lib! I actually experimented with replacing our use of node-postgres with this today. Your lib was the reason I finally got around to shipping support for tagged template literals in Imba today (https://github.com/imba/imba/commit/ae8c329d1bb72eec6720108d...) :) Are you open to a PR exposing the option to return rows as arrays? It's pretty crucial for queries joining multiple tables with duplicate column names.

  • React

    The library for web and native user interfaces.

    Any reason you can't use MIT?

    For hobbyist projects the Unlicense is great, but it may create issues with bigger companies. I think the main sticking point is that public domain doesn't really exist in countries like Germany, so the Unlicense's status there is unclear. Companies don't like unclear things.

    Facebook uses MIT, and their tools like React and Jest are cornerstones of the Node.js ecosystem.

    https://github.com/facebook/react/blob/main/LICENSE

  • node-redis

    Redis Node.js client

    > Sure, the c++ is going to require you to do some sanitizing as you force your data into v8

    it's not just sanitizing, there's a lot more to the object creation inside v8 itself. but, even if it were just sanitizing, that mechanism has become a lot more complicated than it ever was in v8 3.1 (timeframe around node 0.4) or 3.6 (timeframe around node 0.6). when interacting with c++, v8 makes no assumptions, whereas when interacting with javascript, a large number of assumptions can be made (e.g. which context and isolate it is being executed in, etc).

    > but as we noted that's inevitable no matter how you slice it.

    yes, from c++ to javascript and back, but when you need to make that trip multiple times, instead of once, that interchange adds up to quite a bit of extra code executed, values transformed, values checked, etc. sure, banging your head against a wall might not hurt once, but do it 40 times in a row and you're bound to be bloodied.

    > Now maybe in some cases the v8 internals offer some advantages the generic c++ api can't access

    by a fairly large margin, as it turns out, especially as v8 has evolved from the early 3.1 days to the current 9.8: 11 years. over that time, javascript dealing with javascript objects has sped up significantly compared to c++ dealing with javascript objects. see below.

    > My memories of the redis client is different than yours so I'd be quite interested to see those conversations / benchmarks.

    super easy to find, all of that was done in public: https://github.com/redis/node-redis/pull/242 - there are multiple benchmarks done by multiple people, and the initial findings were a 15-20% speedup, which was later improved upon. the speedup came from decoding the binary packet, which was passed as a single buffer, as opposed to parsing it externally and passing each object in through the membrane.
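
    for a sense of what decoding that single buffer in javascript looks like, here is a toy sketch (not the node-redis parser) of reading one RESP bulk string: the packet crosses the c++/javascript boundary once as a Buffer, and everything after that stays in javascript.

        // e.g. parseBulkString(Buffer.from('$5\r\nhello\r\n')) -> { value: 'hello', next: 11 }
        function parseBulkString(buf, offset = 0) {
          if (buf[offset] !== 0x24) throw new Error('expected $')  // '$' marks a bulk string
          const lenEnd = buf.indexOf('\r\n', offset)
          const len = parseInt(buf.toString('ascii', offset + 1, lenEnd), 10)
          if (len === -1) return { value: null, next: lenEnd + 2 } // RESP null bulk string
          const start = lenEnd + 2
          return { value: buf.toString('utf8', start, start + len), next: start + len + 2 }
        }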

    > As a simple thought experiment, in the scenario you're describing we should see a javascript implementation of a JSON parser to beat the pants off the v8 engine implementation, but this doesn't seem to the case.

    that's a bit of a straw man argument, especially given that JSON.parse() is a single call and does not require any additional tooling/isolates/contexts to execute; it's just straight c++ code with very fast access into the v8 core:

        Local<Value> result = Local<Value>::New(isolate, JSON::Parse(jsonString));

  • plv8

    V8 Engine Javascript Procedural Language add-on for PostgreSQL

    but, let's take your straw man a little further. let's suppose that all of the actual parsing is done for you already, and all you're doing is iterating through the data structure, creating objects through the c++ api, and calling it good. that should be faster than calling the c++ JSON.parse(), shouldn't it? since we don't have to actually parse anything, right? no, it's actually much slower. you can see this in action at https://github.com/plv8/plv8/blob/r3.1/plv8_type.cc#L173-L60...

    again, we're not talking about whether javascript in an interpreter is faster than c++, we're talking about whether v8's api causes enough slowdown that some workloads that require a lot of data between c++ and javascript are slower than the same workload that requires very little data between c++ and javascript ... because passing through v8's c++/javascript membrane is slow.
