databases-intuition
Building an intuition for latency and throughput of basic operations across SQL databases.
I'm a huge fan of SQLite! My org's apps use it heavily, often via this simple key-value interface built on SQLite: https://github.com/aaviator42/StorX
Handles tens of thousands of requests a day very smoothly! :)
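To illustrate the key-value-on-SQLite idea, here is a minimal sketch in Python using only the stdlib sqlite3 module. This is not StorX's actual API (StorX is a PHP library); the class and method names are hypothetical, just showing how little code the pattern needs.

```python
import sqlite3

class KVStore:
    """Minimal key-value interface over a single SQLite table.
    A sketch of the general idea, not StorX's actual API."""

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS kv (key TEXT PRIMARY KEY, value TEXT)"
        )

    def set(self, key, value):
        # Upsert: overwrite the value if the key already exists.
        self.db.execute(
            "INSERT INTO kv (key, value) VALUES (?, ?) "
            "ON CONFLICT(key) DO UPDATE SET value = excluded.value",
            (key, value),
        )
        self.db.commit()

    def get(self, key, default=None):
        row = self.db.execute(
            "SELECT value FROM kv WHERE key = ?", (key,)
        ).fetchone()
        return row[0] if row else default

store = KVStore()
store.set("greeting", "hello")
print(store.get("greeting"))
```

The upsert syntax requires SQLite 3.24+, which any recent Python ships with.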
I’ve been using Pocketbase[0] on my projects and I can’t recommend it enough.
It is built on a SQLite db and has real-time pub/sub capabilities. Its JS SDK is incredibly easy to set up and use for CRUD as well. For side projects and some medium-sized tasks, I'd say SQLite/Pocketbase has been super easy to work with.
[0] https://pocketbase.io
One possible strategy is to have one SQLite file per customer, each in its own directory. But then, when a user logs in, you first have to look up which database they should be connected to,
or somehow derive it from the user ID/username. Keep all the customer databases on a single directory/disk and continuously replicate them to S3 with Litestream[0].
Because each user is isolated, they'll only ever write to their own database. But migrations would be a pain: they have to be rolled out to each database separately.
One upside is that you can give users the ability to take their data with them at any time. It is just a single file.
[0]. https://litestream.io/
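The derive-from-user-ID variant is simple enough to sketch. This is a hypothetical example (the `customers` directory name and `db_path_for` helper are made up), assuming user IDs are trusted values, not raw user input:

```python
import os
import sqlite3

# Hypothetical directory holding one SQLite file per customer.
DATA_DIR = "customers"

def db_path_for(user_id: int) -> str:
    # Derive the database file directly from the user ID,
    # so no lookup table is needed at login time.
    return os.path.join(DATA_DIR, f"user_{user_id}.db")

def connect(user_id: int) -> sqlite3.Connection:
    os.makedirs(DATA_DIR, exist_ok=True)
    # Each connection only ever sees this one customer's data.
    return sqlite3.connect(db_path_for(user_id))

conn = connect(42)
conn.execute("CREATE TABLE IF NOT EXISTS notes (body TEXT)")
```

The per-user file is also exactly what you would hand to a customer exporting their data.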
Sounds like your problem is with SQLAlchemy, not with SQLite.
My https://sqlite-utils.datasette.io library might be a better fit for you. It's a much thinner abstraction than SQLAlchemy.
Without too much difficulty you can get into the hundreds of thousands of inserts per second, and even approach 1M inserts per second.
https://github.com/eatonphil/databases-intuition#go-mattngo-...
- Use clonefile to duplicate the cached data directory to give to individual tests.
One thing I'd like to pursue is to store the Postgres data dir in SQLite [1]. Then, I can reset the "file system" using SQL after each test instead of copying the entire datadir.
[1]: https://github.com/guardianproject/libsqlfs
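The copy-a-cached-datadir pattern itself is easy to sketch. The helper below is hypothetical (not from libsqlfs or any library); it uses portable shutil.copytree, whereas on APFS a clonefile-based copy (e.g. cp -c) makes the duplication nearly free:

```python
import os
import shutil
import tempfile

def fresh_datadir(template: str) -> str:
    """Give each test its own throwaway copy of a cached template
    data directory, so tests never have to reset shared state."""
    target = tempfile.mkdtemp(prefix="pgtest_")
    dest = os.path.join(target, "data")
    shutil.copytree(template, dest)
    return dest
```

Each test mutates its private copy freely; cleanup is just deleting the temp directory.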
It's of little use if you're not on the JVM, but I've had great success with Embedded Postgres:
https://github.com/zonkyio/embedded-postgres
Each test just copies a template database so it’s ultra fast and avoids the need for complicated reset logic.