multipart-stream-rs vs bonsaidb

| | multipart-stream-rs | bonsaidb |
| --- | --- | --- |
| Mentions | 2 | 25 |
| Stars | 6 | 979 |
| Growth | - | 0.6% |
| Activity | 0.0 | 7.9 |
| Latest commit | over 1 year ago | about 2 months ago |
| Language | Rust | Rust |
| License | Apache License 2.0 | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
multipart-stream-rs
-
Introduction to HTTP Multipart
The article talks about multipart/form-data in particular.
Another thing one might run across is multipart/x-mixed-replace. I wrote a crate for that. [1] I didn't see a spec for it, but someone since pointed out to me that it's probably identical to multipart/x-mixed, and now seeing an example in the multer README it clicks that I should have looked at RFC 1341, which says this:
> All subtypes of "multipart" share a common syntax, defined in this section.
...and written a crate general enough for all of them. Maybe I'll update my crate for that sometime. My crate currently assumes there's a Content-Length: for each part, which isn't specified there but makes sense in the context I use it. It wouldn't be hard to also support just the boundary delimiters. And then maybe add a form-data parser on top of that.
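The boundary-delimiter approach the comment describes can be sketched in a few lines. This is an illustrative toy, not the multipart-stream crate's actual API: it splits a complete body on `--boundary` markers per RFC 1341's common syntax, whereas a real parser would work incrementally on a byte stream.

```rust
/// Split a complete multipart body into its parts using only the boundary
/// delimiters (no per-part Content-Length needed). Illustrative sketch only.
fn split_parts(body: &str, boundary: &str) -> Vec<String> {
    // RFC 1341 requires that the boundary never occur inside a part,
    // so splitting on the delimiter is safe for a complete body.
    let delim = format!("--{boundary}");
    body.split(delim.as_str())
        .skip(1) // ignore any preamble before the first boundary
        .map(|chunk| chunk.trim_start_matches("\r\n"))
        // a chunk starting with "--" is the tail of the closing "--boundary--"
        .take_while(|chunk| !chunk.starts_with("--"))
        .map(|chunk| chunk.trim_end_matches("\r\n").to_string())
        .collect()
}
```

Each returned part still contains its own headers and body separated by a blank line; a fuller parser would split those out and, for form-data, read the `Content-Disposition` header.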
btw, the article also talks specifically about proxying the body. I don't get why they're parsing the multipart data at all. I presume they have a reason, but I don't see it explained. I'd expect that a body is a body is a body: you can stream it along, and perhaps also buffer it in case you want to support retrying the backhaul request. You'd probably stop buffering at some byte limit, past which you give up on the possibility of retries, because keeping arbitrarily large bodies around (in RAM, or even spilling to SSD/disk) doesn't sound fun.
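That bounded-buffer idea can be sketched as follows. All names here are hypothetical, and the wiring to an actual HTTP stream is omitted: the point is just "tee each chunk into a buffer, and drop the buffer for good once a byte limit is exceeded".

```rust
/// Tee a streamed body into a bounded buffer so the upstream request can be
/// retried; give up on buffering (and hence on retries) past a byte limit.
/// Hypothetical sketch, not tied to any particular HTTP library.
struct RetryBuffer {
    buf: Vec<u8>,
    limit: usize,
    gave_up: bool,
}

impl RetryBuffer {
    fn new(limit: usize) -> Self {
        Self { buf: Vec::new(), limit, gave_up: false }
    }

    /// Called for each chunk as it is streamed onward to the client.
    fn observe(&mut self, chunk: &[u8]) {
        if self.gave_up {
            return;
        }
        if self.buf.len() + chunk.len() > self.limit {
            // Too big to keep around: free the buffer and stop retrying.
            self.buf = Vec::new();
            self.gave_up = true;
        } else {
            self.buf.extend_from_slice(chunk);
        }
    }

    /// The full body for a retry, if we still have it.
    fn replay(&self) -> Option<&[u8]> {
        if self.gave_up { None } else { Some(&self.buf) }
    }
}
```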
[1] https://crates.io/crates/multipart-stream
-
What's everyone working on this week (17/2021)?
I find my implementation in parser.rs kind of gross, but at least it seems to work. If anyone happens to look, I'd appreciate tips for cleaning up this code.
bonsaidb
-
Two Years of BonsaiDb: A retrospective and looking to the future
I do have ideas in the issue tracker on some of the next steps towards an actual migration system.
-
Some key-value storage engines in Rust
What about https://github.com/khonsulabs/bonsaidb? Progress seems stalled since last summer, but it's a very cool project.
-
Is there demand for a management system for embedded storage engines like RocksDB? I plan to build one in Rust, as the language is becoming a core of many popular databases, but I wonder if there's demand. I can't find any similar project, even in other languages.
There is Nebari, which is the KV part of BonsaiDb. I've used both successfully (and that is currently in production).
-
Is `inlining` a function essentially the same thing as writing a macro?
In BonsaiDb, I define entire test suites as macros. This crate has a common trait that has multiple implementations in different crates. Each implementation needs to be tested thoroughly. For cargo test to be able to work in each crate independently, I needed to have the #[test]-annotated functions in the crate being built. By using a macro, I can define the functions in one location and invoke the macro in each crate to import the test suite into that crate.
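The pattern described above can be sketched like this. The trait, macro, and store names are all made up for illustration and are not BonsaiDb's actual code: the suite is written once as a macro, and each implementation crate invokes it to stamp out its own `#[test]`-annotated functions locally, which is what lets `cargo test` work in each crate independently.

```rust
use std::collections::HashMap;

// A stand-in for the shared trait that every backend crate implements.
pub trait KeyValue {
    fn set(&mut self, key: &str, value: &str);
    fn get(&self, key: &str) -> Option<String>;
}

// The test suite is defined once; with #[macro_export] it could be
// invoked from the other crates that implement the trait.
macro_rules! define_kv_suite {
    ($ty:ty, $ctor:expr) => {
        #[test]
        fn set_then_get() {
            let mut store: $ty = $ctor;
            store.set("greeting", "hello");
            assert_eq!(store.get("greeting").as_deref(), Some("hello"));
        }

        #[test]
        fn missing_key_is_none() {
            let store: $ty = $ctor;
            assert!(store.get("absent").is_none());
        }
    };
}

// A toy in-memory backend standing in for one implementation crate.
#[derive(Default)]
struct MemoryStore(HashMap<String, String>);

impl KeyValue for MemoryStore {
    fn set(&mut self, key: &str, value: &str) {
        self.0.insert(key.to_owned(), value.to_owned());
    }
    fn get(&self, key: &str) -> Option<String> {
        self.0.get(key).cloned()
    }
}

// One macro invocation imports the whole suite into this crate.
define_kv_suite!(MemoryStore, MemoryStore::default());
```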
-
bonsai-bt: A Behavior Tree library in Rust for creating complex AI logic https://github.com/Sollimann/bonsai
Hey, just letting you know that there's already a project called bonsai-db, and some people might confuse bonsai-bt as being part of that project.
-
What's everyone working on this week (12/2022)?
I'm finishing up a large refactor of BonsaiDb which will add support for using BonsaiDb in non-async code.
-
BonsaiDB: Document database that grows with you, written in Rust
-
What's everyone working on this week (10/2022)?
I'm working on a major refactoring of BonsaiDb, aiming to improve the design of several interrelated features. While it started by aiming to enable a non-async interface for BonsaiDb, I realized mid-refactor that another major refactor would be better to do simultaneously rather than separately. Thank goodness that refactoring in Rust is such a wonderful experience!
-
Announcing BonsaiDb v0.1.0: A Rust NoSQL database that grows with you
It depends on what you mean by "support graphs". If you mean the ability to build a GraphQL interface in front of it, yes, that is already possible in a limited fashion, although there are no first-class relationship types yet.
-
What's everyone working on this week (5/2022)?
I'm trying to release the first alpha of BonsaiDb. I'm wrapping up replacing OPAQUE with Argon2, in an effort to make upgrading less likely to cause issues in the future (given that OPAQUE is still a draft protocol). I still love OPAQUE and will bring it back in the future.
What are some alternatives?
fullstack-rust - Reference implementation of a full-stack Rust application
sled - the champagne of beta embedded databases
paperoni - An article extractor in Rust
cosmicverge - A systematic, sandbox MMO still in the concept phase. Will be built with Rust atop BonsaiDb and Gooey
nym - Manipulate files en masse using patterns.
tokei - Count your code, quickly.
milli - Search engine library for Meilisearch ⚡️
cpp-from-the-sky-down
roaring-rs - A better compressed bitset in Rust
tusd - Reference server implementation in Go of tus: the open protocol for resumable file uploads
cherrybomb - Stop half-done APIs! Cherrybomb is a CLI tool that helps you avoid undefined user behaviour by auditing your API specifications, validating them and running API security tests.