|  | streams | proposal-arraybuffer-base64 |
|---|---|---|
| Mentions | 5 | 5 |
| Stars | 1,331 | 220 |
| Growth | 0.6% | 6.4% |
| Activity | 6.0 | 7.6 |
| Latest commit | 4 days ago | 23 days ago |
| Language | HTML | HTML |
| License | GNU General Public License v3.0 or later | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
streams
-
Backpressure explained – the resisted flow of data through software
Yup, this is what WHATWG's Streams spec[0] (linked in the article) says. It defines backpressure as a "process of normalizing flow from the original source according to how fast the chain can process chunks" where the reader "propagates a signal backwards through the pipe chain".
Mozilla's documentation[1] similarly defines backpressure as "the process by which a single stream or a pipe chain regulates the speed of reading/writing".
The article confuses backpressure (the signal used to regulate the flow) with the reason backpressure is needed (producers and consumers working at different speeds). It should be fairly clear from the metaphor, I would have thought: with a pipe of unbounded size there is no pressure. The pressure builds up when the consumer is slower than the producer, which in turn slows down the producer. (Or the pipe explodes, or springs a leak and has to drop data on the ground.)
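The metaphor maps directly onto the WHATWG API: a bounded queue on the writable side creates the "pressure", and `writer.ready` is how the signal propagates back to the producer. A minimal sketch (illustrative, not from the thread):

```javascript
// A fast producer is slowed to the consumer's pace: the writable side's
// queue is bounded, and writer.ready only resolves when there is room.
const { readable, writable } = new TransformStream(
  undefined,
  { highWaterMark: 2 } // allow at most 2 queued chunks on the writable side
);

const producer = (async () => {
  const writer = writable.getWriter();
  for (let i = 0; i < 5; i++) {
    await writer.ready; // backpressure: waits until the queue has room
    writer.write(i);
  }
  await writer.close();
})();

const consumed = (async () => {
  const chunks = [];
  for await (const chunk of readable) chunks.push(chunk); // the (slow) consumer
  return chunks;
})();

consumed.then(chunks => console.log(chunks)); // logs [0, 1, 2, 3, 4]
```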
[0] https://streams.spec.whatwg.org/#pipe-chains
[1] https://developer.mozilla.org/en-US/docs/Web/API/Streams_API...
- Streams Standard
-
Streams and React Server Components
```javascript
// https://streams.spec.whatwg.org/#example-transform-identity
const { writable, readable } = new TransformStream();
fetch("...", { body: readable }).then(response => /* ... */);

const writer = writable.getWriter();
writer.write(new Uint8Array([0x73, 0x74, 0x72, 0x65, 0x61, 0x6D, 0x73, 0x21])); // "streams!"
writer.close();
```
-
Goodbye, Node.js Buffer
Yeah, in your case I think most of the complexity is actually on the ReadableStream side, not the base64 side.
The thing that I'd actually want for your case is either a TransformStream for byte stream <-> base64 stream (which I expect will come eventually, once the simple case gets done), or something which would let you read the entire stream into Uint8Array or ArrayBuffer, which is a long-standing suggestion [1].
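For reference, the hand-rolled version of "read the entire stream" looks something like this (a sketch; `streamToUint8Array` is a made-up name, not a platform API):

```javascript
// Collect every chunk of a ReadableStream of Uint8Arrays, then copy them
// into one contiguous Uint8Array. This is the boilerplate the long-standing
// suggestion would eliminate.
async function streamToUint8Array(stream) {
  const chunks = [];
  let total = 0;
  for await (const chunk of stream) {
    chunks.push(chunk);
    total += chunk.byteLength;
  }
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return out;
}

// Example: a two-chunk byte stream
const rs = new ReadableStream({
  start(controller) {
    controller.enqueue(new Uint8Array([0x73, 0x74]));
    controller.enqueue(new Uint8Array([0x72, 0x21]));
    controller.close();
  },
});
streamToUint8Array(rs).then(u8 => console.log(u8)); // Uint8Array [115, 116, 114, 33]
```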
---
> Why does de-chunking a byte array need to be complicated
Keep in mind the concat proposal is _very_ early. If you think it would be useful to be able to concat Uint8Arrays and have that implicitly concatenate the underlying buffers, [2] is the place to open an issue.
---
> You have made me realize I don't even know what the right venue is to vote on stuff. How should I signal to TC39 that e.g. Array.fromAsync is a good idea?
Unfortunately, it's different places for different things. Streams are not TC39 at all; the right place for suggestions there is in the WHATWG streams repo [3]. Usually there's already an existing issue and you can add your use case as a comment in the relevant issue. TC39 proposals all have their own Github repositories, and you can open a new issue with your use case.
Concrete use cases are much more helpful than just "this is a good idea". That said, everyone agrees `fromAsync` in particular is good; it mostly just needs implementations, which are ongoing - see e.g. [4]. If you _really_ want to advance a stage 3 proposal, you can contribute a PR to Chrome or Firefox with an implementation - but for nontrivial proposals that's usually hard. For TC39 in particular, use cases are only really valuable for pre-stage-3 proposals.
[1] https://github.com/whatwg/streams/issues/1019
[2] https://github.com/jasnell/proposal-zero-copy-arraybuffer-li...
[3] https://github.com/whatwg/streams
[4] https://bugs.chromium.org/p/v8/issues/detail?id=13321
-
Are you using generators?
```javascript
// AudioWorkletStream
// Stream audio from Worker to AudioWorklet
// guest271314 2-24-2020
let port;
onmessage = async e => {
  'use strict';
  if (!port) {
    [port] = e.ports;
    port.onmessage = event => postMessage(event.data);
  }
  const { urls } = e.data;
  // https://github.com/whatwg/streams/blob/master/transferable-streams-explainer.md
  const { readable, writable } = new TransformStream();
  (async _ => {
    for await (const _ of (async function* stream() {
      while (urls.length) {
        yield (await fetch(urls.shift(), { cache: 'no-store' })).body.pipeTo(writable, {
          preventClose: !!urls.length,
        });
      }
    })());
  })();
  port.postMessage({ readable }, [readable]);
};
```
proposal-arraybuffer-base64
-
Updates from the 100th TC39 meeting
Uint8Array to/from Base64: Uint8Array <-> base64/hex.
-
Goodbye, Node.js Buffer
The proposal for native base64 support for Uint8Arrays is mine. I'm glad to see people are interested in using it. (So am I!)
For a status update, for the last year or two the main blocker has been a conflict between a desire to have streaming support and a desire to keep the API small and simple. That's now resolved [1] by dropping streaming support, assuming I can demonstrate a reasonably efficient streaming implementation on top of the one-shot implementation, which won't be hard unless "reasonably efficient" means "with zero copies", in which case we'll need to keep arguing about it.
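To make the "streaming on top of one-shot" idea concrete, here is a rough sketch (my illustration, not the proposal's actual design): buffer the 1-2 leftover bytes that don't complete a 3-byte group, so every one-shot call encodes a multiple of 3 bytes and emits no mid-stream padding. `Buffer.from(...).toString("base64")` stands in for the proposed one-shot method.

```javascript
// A chunked base64 encoder built on a one-shot encoder. Each push() encodes
// only whole 3-byte groups; the remainder is carried over, and flush()
// encodes whatever is left (producing the final padding, if any).
function makeBase64Encoder() {
  let leftover = new Uint8Array(0);
  return {
    push(chunk) {
      const data = new Uint8Array(leftover.length + chunk.length);
      data.set(leftover);
      data.set(chunk, leftover.length);
      const usable = data.length - (data.length % 3); // largest multiple of 3
      leftover = data.slice(usable);                  // carry 0-2 bytes forward
      return Buffer.from(data.subarray(0, usable)).toString("base64");
    },
    flush() {
      const out = Buffer.from(leftover).toString("base64");
      leftover = new Uint8Array(0);
      return out;
    },
  };
}

const enc = makeBase64Encoder();
const parts =
  enc.push(new Uint8Array([0x73, 0x74, 0x72, 0x65])) + // "stre"
  enc.push(new Uint8Array([0x61, 0x6d, 0x73, 0x21])) + // "ams!"
  enc.flush();
console.log(parts); // "c3RyZWFtcyE=" - the base64 of "streams!"
```

Note this copies the leftover bytes on every push; whether that counts as "reasonably efficient" is exactly the zero-copies question mentioned above.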
I've also been working on documenting [2] the differences between various base64 implementations in other languages and in JS libraries to ensure we have a decent picture of the landscape when designing this.
With luck, I hope to advance the proposal to stage 3 ("ready for implementations") within the next two meetings of TC39 - so either next month or January. Realistically it will probably take a little longer than that, and of course implementations take a while. But it's moving along.
[1] https://github.com/tc39/proposal-arraybuffer-base64/issues/1...
[2] https://gist.github.com/bakkot/16cae276209da91b652c2cb3f612a...
-
Base64 Encoding, Explained
There are some additional interesting details, and a surprising amount of variation in those details, once you start really digging into things.
If the length of your input data isn't exactly a multiple of 3 bytes, then encoding it will use either 2 or 3 base64 characters to encode the final 1 or 2 bytes. Since each base64 character is 6 bits, this means you'll be using either 12 or 18 bits to represent 8 or 16 bits of data. Which means you have an extra 4 or 2 bits which don't encode anything.
In the RFC, encoders are required to set those bits to 0, but decoders only "MAY" choose to reject input which does not have those set to 0. In practice, nothing rejects those by default, and as far as I know only Ruby, Rust, and Go allow you to fail on such inputs - Python has a "validate" option, but it doesn't validate those bits.
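You can see that laxness in JavaScript's own `atob`, which implements the WHATWG forgiving-base64 algorithm and silently discards the extra bits (illustration mine, not from the comment):

```javascript
// "QQ==" and "QR==" differ only in the final 4 unused bits (0000 vs 0001):
// Q = 010000, R = 010001. A forgiving decoder maps both to the byte 0x41.
console.log(atob("QQ==")); // "A"
console.log(atob("QR==")); // also "A" - the non-zero trailing bits are ignored
```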
The other major difference is in handling of whitespace and other non-base64 characters. A surprising number of implementations, including Python, allow arbitrary characters in the input, and silently ignore them. That's a problem if you get the alphabet wrong - for example, in Python `base64.standard_b64decode(base64.urlsafe_b64encode(b'\xFF\xFE\xFD\xFC'))` will silently give you the wrong output, rather than an error. Ouch!
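JavaScript's `atob`, by contrast, restricts input to the standard alphabet, so the equivalent mix-up fails loudly instead of silently (illustration mine; `"__v9_A=="` is the url-safe encoding of those same four bytes `0xFF 0xFE 0xFD 0xFC`):

```javascript
// Feeding url-safe base64 to a standard-alphabet decoder: atob throws on
// the "_" characters rather than skipping them the way Python's does.
let threw = false;
try {
  atob("__v9_A==");
} catch (e) {
  threw = true;
  console.log("rejected:", e.name); // an InvalidCharacterError DOMException
}
```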
Another fun fact is that Ruby's base64 encoder will put linebreaks every 60 characters, which is a wild choice because no standard encoding requires lines that short except PEM, but PEM requires _exactly_ 64 characters per line.
I have a writeup of some of the differences among programming languages and some JavaScript libraries here [1], because I'm working on getting a better base64 added to JS [2].
[1] https://gist.github.com/bakkot/16cae276209da91b652c2cb3f612a...
[2] https://github.com/tc39/proposal-arraybuffer-base64
-
Updates from the 96th TC39 meeting
Base64 for Uint8Array: ArrayBuffer to/from Base64
-
Updates from the 84th meeting of TC39
ArrayBuffer to/from base64: ArrayBuffer <-> base64 string functions.
What are some alternatives?
AudioWorkletStream - fetch() => ReadableStream => AudioWorklet
nodejs-polars - nodejs front-end of polars
encoding - Encoding Standard
proposal-intl-numberformat-v3 - Additional features for Intl.NumberFormat to solve key pain points.
console - Console Standard
proposal-array-from-async - Draft specification for a proposed Array.fromAsync method in JavaScript.
proposal-async-iterator-helpers - Methods for working with async iterators in ECMAScript
url - URL Standard
excel_97_egg - A web port of the magic carpet simulator hidden within Microsoft Excel 97
proposal-regexp-atomic-operators