fast-sqlite3-inserts
A collection of test scripts for generating a SQLite DB with 1B rows in the fastest possible way
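For context, the standard levers for fast bulk inserts into SQLite are relaxed durability pragmas, one large transaction, and a prepared statement reused for every row. A minimal sketch in Python's stdlib `sqlite3` (the `user` schema and row values here are invented for illustration, not taken from the repo):

```python
import random
import sqlite3

def bulk_insert(db_path, n_rows):
    """Insert n_rows as fast as plain sqlite3 allows:
    relaxed durability pragmas, a single big transaction,
    and executemany over a prepared statement."""
    conn = sqlite3.connect(db_path)
    # Trade crash-safety for speed -- acceptable for throwaway test data.
    conn.execute("PRAGMA journal_mode = OFF")
    conn.execute("PRAGMA synchronous = OFF")
    conn.execute("PRAGMA cache_size = -64000")  # ~64 MB page cache
    conn.execute(
        "CREATE TABLE IF NOT EXISTS user "
        "(id INTEGER PRIMARY KEY, area TEXT, age INTEGER)"
    )
    rows = (
        (str(random.randint(100000, 999999)), random.choice((5, 10, 15)))
        for _ in range(n_rows)
    )
    with conn:  # one transaction around all inserts
        conn.executemany("INSERT INTO user (area, age) VALUES (?, ?)", rows)
    count = conn.execute("SELECT COUNT(*) FROM user").fetchone()[0]
    conn.close()
    return count
```

The single transaction matters most: without it, each `INSERT` becomes its own transaction and throughput collapses to the fsync rate of the disk.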
You mention wanting to do a Go version. Not sure if it's useful, but this is a SQLite "bulk data generation" util I threw together ages ago in Go:
https://github.com/sqlitebrowser/sqlitedatagen
There's some initial work to parallelise it with goroutines here:
https://github.com/sqlitebrowser/sqlitedatagen/blob/multi_ro...
Didn't go very far down that track though, as the mostly single-threaded nature of writes in SQLite seemed to prevent it from achieving much. Well, I _think_ that's why it didn't really help. ;)
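Since SQLite serializes writes behind a single write lock, the usual workaround is the shape hinted at above: generate rows in parallel, then funnel them through a channel/queue to a single writer. A rough Python sketch of that pattern (in Go this would be goroutines feeding a channel; all names and the schema are invented, and under CPython's GIL the producers illustrate the structure rather than true parallelism):

```python
import queue
import sqlite3
import threading

def generate_with_single_writer(db_path, n_rows, n_workers=4):
    """Producers build row batches concurrently; one writer thread owns
    the SQLite connection and does all INSERTs, since SQLite allows
    only one writer at a time anyway."""
    q = queue.Queue(maxsize=64)
    result = {}

    def producer(start, count):
        # In a real generator, the expensive row-building (random data,
        # formatting) happens here, off the writer's critical path.
        q.put([(i, i * 2) for i in range(start, start + count)])
        q.put(None)  # sentinel: this producer is done

    def writer():
        conn = sqlite3.connect(db_path)
        conn.execute("CREATE TABLE IF NOT EXISTS t (a INTEGER, b INTEGER)")
        done = 0
        while done < n_workers:
            batch = q.get()
            if batch is None:
                done += 1
                continue
            with conn:  # one transaction per batch
                conn.executemany("INSERT INTO t VALUES (?, ?)", batch)
        result["count"] = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
        conn.close()

    per = n_rows // n_workers
    producers = [
        threading.Thread(target=producer, args=(i * per, per))
        for i in range(n_workers)
    ]
    w = threading.Thread(target=writer)
    w.start()
    for p in producers:
        p.start()
    for p in producers:
        p.join()
    w.join()
    return result["count"]
```

This keeps the write lock uncontended while still parallelising whatever row generation costs; it won't beat the single-writer ceiling, but it stops the generation work from sitting on the critical path.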
Very nice! I shall give that a try. I always get excited by SQLite and Go projects.
I maintain a similar Go tool for work, which I use to stuff around 1TB into MariaDB at a time:
Potentially interesting:
https://github.com/siara-cc/sqlite_micro_logger_arduino
https://github.com/siara-cc/sqlite_micro_logger_arduino/blob...
This is a heavily subsetted implementation of SQLite3 that can read/write databases (presumably on SD cards) from very small microcontrollers.
It presumably doesn't have the same ACID compliance guarantees, but as a single source file of under 1.5k lines, it may be a particularly efficient way to learn the internals quickly.
Now I'm thinking it could actually be interesting to see what drh thinks of this implementation (and any gotchas in it), given its small size and accessibility.
Related posts
- Are entity framework tools typically avoided with MySQL & Go and are there alternatives for migration script tooling that version control the entire schema like SSDT?
- How to create a link between two spans in OpenTelemetry
- Setting up PostgreSQL for running integration tests
- Building a Playful File Locker with GoFr
- Using migrations with Golang