| | py-caskdb | fast-sqlite3-inserts |
|---|---|---|
| Mentions | 14 | 11 |
| Stars | 1,122 | 363 |
| Growth | - | - |
| Activity | 3.3 | 0.0 |
| Last commit | 2 months ago | about 1 year ago |
| Language | Python | Rust |
| License | MIT License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
py-caskdb
- Ask HN: What are some good resources for learning about low level disk/file IO?
- Ask HN: Similar Books like “Raytracing in one Weekend”
self plug: I wanted to learn how databases work internally, like how they store and retrieve data, build indexes, etc.
I built an educational KV store that teaches you to write a database from scratch. The project is set up in TDD fashion: you start with simple functions, pass the tests, and the difficulty gradually increases. There are hints if you get stuck. Once all the tests pass, you will have written a persistent key-value store.
link: https://github.com/avinassh/py-caskdb
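The store you end up with follows an append-only, Bitcask-style design: every write appends a record to a data file, and an in-memory dict maps each key to that record's byte offset. A minimal sketch of the idea follows; the record layout (fixed header of key length and value length) and the `TinyCask` class are my own illustrative assumptions, not py-caskdb's exact format.

```python
# Append-only KV store sketch: set() appends a length-prefixed record
# to the data file; an in-memory index maps key -> byte offset, so
# get() is one seek plus two reads. Header layout is assumed.
import os
import struct

HEADER = struct.Struct("<II")  # (key length, value length), assumed layout

class TinyCask:
    def __init__(self, path):
        self.index = {}                 # key -> offset of record header
        self.f = open(path, "a+b")      # append for writes, seekable reads

    def set(self, key, value):
        k, v = key.encode(), value.encode()
        self.f.seek(0, os.SEEK_END)
        offset = self.f.tell()
        self.f.write(HEADER.pack(len(k), len(v)) + k + v)
        self.f.flush()                  # persist before updating the index
        self.index[key] = offset

    def get(self, key):
        offset = self.index.get(key)
        if offset is None:
            return None
        self.f.seek(offset)
        klen, vlen = HEADER.unpack(self.f.read(HEADER.size))
        self.f.seek(klen, os.SEEK_CUR)  # skip the key bytes
        return self.f.read(vlen).decode()

store = TinyCask("tiny.cask")
store.set("name", "avinassh")
print(store.get("name"))  # -> avinassh
```

A real Bitcask adds CRCs, timestamps, hint files, and compaction of stale records, which is roughly the progression the project's tests walk you through.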
- Resource for making database from scratch
link: https://github.com/avinassh/py-caskdb
- What are some recent papers to read on KV stores?
- Ask HN: As a senior engineer content with programming chops, what to learn next?
If you like databases, start exploring the internals and try writing one! There is no going back once you dig deep into storage internals, KV stores, and distributed systems.
plug: I made an educational project which can help you write a database in Python, from scratch - https://github.com/avinassh/py-caskdb
- Ask HN: Which personal projects got you hired?
- Getting started with database development
Build your own disk based KV store
- GitHub - avinassh/py-caskdb: (educational) build your own disk based KV store
- Weekly Coders, Hackers & All Tech related thread - 07/05/2022
link - https://github.com/avinassh/py-caskdb
- Show HN: CaskDB – project to teach you building a key value store
fast-sqlite3-inserts
- SQLite performance tuning: concurrent reads, multiple GBs and 100k SELECTs/s
I am experimenting with SQLite, trying to insert 1B rows in under a minute. The current best is 100M rows in 23s. I cut many corners to get that performance, but some of the tweaks might suit your workload.
I have explained my rationale and approach here - https://avi.im/blag/2021/fast-sqlite-inserts/
the repo link - https://github.com/avinassh/fast-sqlite3-inserts
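The "cut corners" the post alludes to typically come down to a handful of PRAGMAs plus batched, prepared inserts inside a single transaction. A hedged sketch of that recipe, assuming an illustrative schema and batch size (not necessarily the repo's exact settings):

```python
# Common corner-cutting knobs for bulk SQLite inserts: disable the
# rollback journal and fsyncs, enlarge the page cache, and run one
# executemany inside a single transaction. Unsafe for durability --
# fine for a throwaway benchmark database.
import sqlite3

con = sqlite3.connect("bulk.db")
con.execute("PRAGMA journal_mode = OFF")    # no rollback journal
con.execute("PRAGMA synchronous = 0")       # don't fsync after writes
con.execute("PRAGMA cache_size = 1000000")  # large page cache
con.execute("PRAGMA temp_store = MEMORY")

con.execute("DROP TABLE IF EXISTS user")
con.execute("CREATE TABLE user (id INTEGER, area TEXT)")

rows = [(i, str(100 + i % 900)) for i in range(100_000)]
with con:                                   # one transaction, one commit
    con.executemany("INSERT INTO user VALUES (?, ?)", rows)

count = con.execute("SELECT COUNT(*) FROM user").fetchone()[0]
print(count)  # 100000
con.close()
```

With journaling and syncing off, a crash mid-run can corrupt the database file, which is exactly the trade-off "cutting corners" implies.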
- I/O is no longer the bottleneck
I am working on a project [0] to generate 1 billion rows in SQLite in under a minute; it currently inserts 100M rows in 33 seconds. First, I generate the rows and insert them into an in-memory database, then flush everything to disk at the end. The flush takes only 2 seconds, so 99% of the time is spent generating and adding rows to the in-memory B-tree.
For Python optimisation, have you tried PyPy? I ran the same code (zero changes) under PyPy and got a 3.5x speedup.
I published my findings here [1].
[0] - https://github.com/avinassh/fast-sqlite3-inserts
[1] - https://avi.im/blag/2021/fast-sqlite-inserts/
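The build-in-RAM-then-flush pattern the comment describes maps directly onto sqlite3's backup API, which copies the whole in-memory database to a disk file in one pass. A minimal sketch, with table name and row counts as illustrative assumptions:

```python
# Generate rows into an in-memory SQLite database, then flush the
# entire database to disk in one shot via the backup API -- generation
# dominates the runtime; the flush itself is cheap.
import sqlite3

mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE user (id INTEGER, age INTEGER)")
with mem:
    mem.executemany(
        "INSERT INTO user VALUES (?, ?)",
        ((i, i % 90) for i in range(50_000)),
    )

disk = sqlite3.connect("flushed.db")
mem.backup(disk)               # copy every page to the disk file
flushed = disk.execute("SELECT COUNT(*) FROM user").fetchone()[0]
print(flushed)  # 50000
disk.close()
mem.close()
```

`VACUUM INTO 'flushed.db'` is another way to do the same flush from SQL (SQLite 3.27+).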
- Ask HN: Which personal projects got you hired?
- Is there any language that is as similar as possible to Python in syntax, readability, and features, but is statically typed?
I have a side project where I try to insert one billion rows in SQLite. I was able to insert 100 million rows using Python in under 210 seconds; the same code with PyPy took 120 seconds. I am wondering what kind of speed boost I would get with Cython.
- Ask for benchmark. The owner can’t verify an 18% perf gain, could you?
- Inserting One Billion Rows in SQLite Under A Minute
Measure, measure, measure! There is a PR that made really minor changes but got a 2x speed boost with the CPython version.
- Inserting One Billion Rows in SQLite Under a Minute
- Weekly Coders, Hackers & All Tech related thread - 17/07/2021
- How we achieved write speeds of 1.4 million rows per second
[somewhat related] Recently, I was benchmarking SQLite inserts and managed to insert 3.3M records per second (100M in 33-ish seconds) on my local machine - https://github.com/avinassh/fast-sqlite3-inserts. Of course, the comparison is not apples to apples, but sharing here in case anyone finds it interesting.
What are some alternatives?
helindb
tsbs - Time Series Benchmark Suite, a tool for comparing and evaluating databases for time series data
db_tutorial - Writing a sqlite clone from scratch in C
julia - The Julia Programming Language
awesome-dbdev - Awesome materials about database development.
plum - Multiple dispatch in Python
go-caskdb - (educational) build your own disk based KV store in Go
sqlite_micro_logger_arduino - Fast and Lean Sqlite database logger for Arduino UNO and above
fio - Flexible I/O Tester
remixdb - RemixDB: A read- and write-optimized concurrent KV store. Fast point and range queries. Extremely low write-amplification.
WebGL-Fluid-Simulation - Play with fluids in your browser (works even on mobile)
dynamic-dns - An automated dynamic DNS solution for Docker and DigitalOcean