paperoni
multipart-stream-rs
| | paperoni | multipart-stream-rs |
|---|---|---|
| Mentions | 3 | 2 |
| Stars | 126 | 6 |
| Growth | - | - |
| Activity | 0.0 | 0.0 |
| Last commit | about 2 years ago | over 1 year ago |
| Language | Rust | Rust |
| License | MIT License | Apache License 2.0 |
Stars - the number of stars that a project has on GitHub. Growth - month-over-month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
paperoni
Paperoni 0.6.0 release
Hello r/rust, I've released v0.6.0-alpha1 of Paperoni today. Paperoni is an article downloader that can save web articles as EPUB files. This release also allows you to export articles as HTML files, which opens up the possibility of exporting to PDF. This was a feature requested about 3 months ago when I first posted about this project. Feel free to check it out and give any feedback. Thanks!
- paperoni: An article downloader written in Rust
What's everyone working on this week (17/2021)?
Planning for a Friday minor release of paperoni along with its roadmap for the coming year.
multipart-stream-rs
Introduction to HTTP Multipart
The article talks about multipart/form-data in particular.
Another thing one might run across is multipart/x-mixed-replace. I wrote a crate for that. [1] I didn't see a spec for it, but someone has since pointed out to me that it's probably identical to multipart/x-mixed, and after seeing an example in the multer README it clicked that I should have looked at RFC 1341, which says this:
> All subtypes of "multipart" share a common syntax, defined in this section.
...and written a crate general enough for all of them. Maybe I'll update my crate for that sometime. My crate currently assumes there's a Content-Length: for each part, which isn't specified there but makes sense in the context I use it. It wouldn't be hard to also support just the boundary delimiters. And then maybe add a form-data parser on top of that.
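The "common syntax" the comment refers to is the same for every multipart subtype: parts are separated by a `--boundary` delimiter line and the body ends at `--boundary--`. A minimal sketch of boundary-based splitting might look like the following. This is an illustration only, not the multipart-stream crate's API; it assumes the whole body is already in memory, treats it as text, and ignores part headers entirely.

```rust
/// Split a complete multipart body on its boundary delimiters, per the
/// common syntax described in RFC 1341. Everything before the first
/// delimiter is a preamble and is discarded; "--boundary--" ends the body.
fn split_parts<'a>(body: &'a str, boundary: &str) -> Vec<&'a str> {
    let delim = format!("--{boundary}");
    body.split(delim.as_str())
        .skip(1) // drop the preamble before the first delimiter
        .take_while(|part| !part.starts_with("--")) // stop at the closing "--"
        .map(|part| part.trim_matches(|c: char| c == '\r' || c == '\n'))
        .collect()
}

fn main() {
    let body = "preamble\r\n--b\r\nfirst part\r\n--b\r\nsecond part\r\n--b--\r\n";
    assert_eq!(split_parts(body, "b"), vec!["first part", "second part"]);
    println!("{:?}", split_parts(body, "b"));
}
```

A real parser would work incrementally on a byte stream and parse each part's headers (the Content-Length assumption mentioned above), but the delimiter handling is the piece shared by all multipart subtypes.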
btw, the article also talks specifically about proxying the body. I don't get why they're parsing the multipart data at all. I presume they have a reason, but I don't see it explained. I'd expect that a body is a body is a body. You can stream it along, and perhaps also buffer it in case you want to support retrying the backhaul request, probably stopping the buffering at some byte limit at which you give up on the possibility of retries, because keeping arbitrarily large bodies around (in RAM or even spilling to SSD/disk) doesn't sound fun.
[1] https://crates.io/crates/multipart-stream
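The buffering idea in the proxying comment above can be sketched as a small replay buffer: copy the streamed body aside so the upstream request could be retried, but drop the copy (and give up on retries) once the body exceeds a byte limit. The type and method names here are made up for illustration; they are not from any of the crates discussed.

```rust
/// Hypothetical replay buffer for a proxied request body: buffer chunks
/// as they stream through, up to a byte limit, to allow retrying the
/// upstream request. Past the limit, buffering stops and retry is off.
struct ReplayBuffer {
    limit: usize,
    buf: Option<Vec<u8>>, // None once we've given up on retries
}

impl ReplayBuffer {
    fn new(limit: usize) -> Self {
        ReplayBuffer { limit, buf: Some(Vec::new()) }
    }

    /// Record a chunk as it streams through; drop the buffer if the
    /// accumulated body would exceed the limit.
    fn push(&mut self, chunk: &[u8]) {
        if let Some(buf) = &mut self.buf {
            if buf.len() + chunk.len() > self.limit {
                self.buf = None; // too big: stop buffering, no retry possible
            } else {
                buf.extend_from_slice(chunk);
            }
        }
    }

    /// The full body for a retry, if we managed to keep all of it.
    fn replay(&self) -> Option<&[u8]> {
        self.buf.as_deref()
    }
}

fn main() {
    let mut rb = ReplayBuffer::new(8);
    rb.push(b"abcd");
    assert_eq!(rb.replay(), Some(&b"abcd"[..]));
    rb.push(b"efghij"); // 4 + 6 bytes exceeds the 8-byte limit
    assert!(rb.replay().is_none());
}
```

This matches the comment's point: the proxy never needs to understand the multipart structure to do this, since it only counts bytes.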
What's everyone working on this week (17/2021)?
I find my implementation in parser.rs kind of gross, but at least it seems to work. If anyone happens to look, I'd appreciate tips for cleaning up this code.
What are some alternatives?
bonsaidb - A developer-friendly document database that grows with you, written in Rust
fullstack-rust - Reference implementation of a full-stack Rust application
nym - Manipulate files en masse using patterns.
roaring-rs - A better compressed bitset in Rust
milli - Search engine library for Meilisearch ⚡️
readability - A standalone version of the readability lib
tusd - Reference server implementation in Go of tus: the open protocol for resumable file uploads
roux-stream - Streaming API for the Rust Reddit Client roux