| | streams | encoding |
|---|---|---|
| Mentions | 5 | 2 |
| Stars | 1,331 | 266 |
| Growth | 0.6% | 0.4% |
| Activity | 6.0 | 4.2 |
| Latest commit | 4 days ago | about 2 months ago |
| Language | HTML | HTML |
| License | GNU General Public License v3.0 or later | GNU General Public License v3.0 or later |
Stars: the number of stars a project has on GitHub. Growth: month-over-month growth in stars. Activity: a relative number indicating how actively a project is being developed, with recent commits weighted more heavily than older ones; for example, an activity of 9.0 means a project is among the top 10% of the most actively developed projects tracked here.
streams
-
Backpressure explained – the resisted flow of data through software
Yup, this is what WHATWG's Streams spec[0] (linked in the article) says. It defines backpressure as a "process of normalizing flow from the original source according to how fast the chain can process chunks" where the reader "propagates a signal backwards through the pipe chain".
Mozilla's documentation[1] similarly defines backpressure as "the process by which a single stream or a pipe chain regulates the speed of reading/writing".
The article confuses backpressure (the signal used to regulate the flow) with the reason backpressure is needed (producers and consumers working at different speeds). It should be fairly clear from the metaphor, I would have thought: with a pipe of unbounded size there is no pressure. Pressure builds up when the consumer is slower than the producer, which in turn slows the producer down. (Or the pipe explodes, or springs a leak and has to drop data on the ground.)
[0] https://streams.spec.whatwg.org/#pipe-chains
[1] https://developer.mozilla.org/en-US/docs/Web/API/Streams_API...
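The regulation the spec describes is visible directly in the API: a bounded queuing strategy stops `pull()` calls once the internal queue is full, and `desiredSize` is the signal that propagates back to the producer. A minimal sketch (names and the high-water mark are mine):

```js
// Minimal backpressure sketch: the queue is bounded at 2 chunks, so the
// stream stops calling pull() once desiredSize drops to 0 -- the producer
// is throttled until the consumer reads and makes room again.
(async () => {
  let produced = 0;
  const readable = new ReadableStream(
    {
      pull(controller) {
        // Only invoked while controller.desiredSize > 0, i.e. while the
        // bounded queue has room. That is the backpressure signal.
        produced += 1;
        controller.enqueue(`chunk ${produced}`);
      },
    },
    new CountQueuingStrategy({ highWaterMark: 2 })
  );

  const reader = readable.getReader();
  const { value } = await reader.read(); // consuming frees a slot, so pull() runs again
  console.log(value); // "chunk 1"
})();
```

With an unbounded queue (`highWaterMark: Infinity`) `desiredSize` never reaches 0, and the producer is never slowed down, which is exactly the "pipe of unbounded size" from the metaphor.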
- Streams Standard
-
Streams and React Server Components
```js
// https://streams.spec.whatwg.org/#example-transform-identity
const { writable, readable } = new TransformStream();
fetch("...", { body: readable }).then(response => /* ... */);

const writer = writable.getWriter();
writer.write(new Uint8Array([0x73, 0x74, 0x72, 0x65, 0x61, 0x6D, 0x73, 0x21])); // "streams!"
writer.close();
```
-
Goodbye, Node.js Buffer
Yeah, in your case I think most of the complexity is actually on the ReadableStream side, not the base64 side.
The thing that I'd actually want for your case is either a TransformStream for byte stream <-> base64 stream (which I expect will come eventually, once the simple case gets done), or something which would let you read the entire stream into Uint8Array or ArrayBuffer, which is a long-standing suggestion [1].
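In the meantime, that missing "read the entire stream into a Uint8Array" primitive is straightforward to hand-roll. A sketch (the helper name is mine, not part of any spec):

```js
// Drain a ReadableStream of Uint8Array chunks into a single Uint8Array.
// Collects the chunks first, then copies them into one buffer.
async function streamToUint8Array(stream) {
  const reader = stream.getReader();
  const chunks = [];
  let total = 0;
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    total += value.byteLength;
  }
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.byteLength;
  }
  return out;
}
```

This is exactly the chunk-concatenation boilerplate the suggestion in [1] would make unnecessary.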
---
> Why does de-chunking a byte array need to be complicated
Keep in mind the concat proposal is _very_ early. If you think it would be useful to be able to concat Uint8Arrays and have that implicitly concatenate the underlying buffers, [2] is the place to open an issue.
---
> You have made me realize I don't even know what the right venue is to vote on stuff. How should I signal to TC39 that e.g. Array.fromAsync is a good idea?
Unfortunately, it's different places for different things. Streams are not TC39 at all; the right place for suggestions there is the WHATWG streams repo [3]. Usually there's already an existing issue and you can add your use case as a comment on the relevant issue. TC39 proposals all have their own GitHub repositories, and you can open a new issue with your use case.
Concrete use cases are much more helpful than just "this is a good idea". Though `fromAsync` in particular everyone agrees is good, and it mostly just needs implementations, which are ongoing; see e.g. [4]. If you _really_ want to advance a stage 3 proposal, you can contribute a PR to Chrome or Firefox with an implementation - but for nontrivial proposals that's usually hard. For TC39 in particular, use cases are only really valuable for pre-stage-3 proposals.
[1] https://github.com/whatwg/streams/issues/1019
[2] https://github.com/jasnell/proposal-zero-copy-arraybuffer-li...
[3] https://github.com/whatwg/streams
[4] https://bugs.chromium.org/p/v8/issues/detail?id=13321
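For reference, the core of what `Array.fromAsync` provides is small but handy; a polyfill-style sketch for engines that don't ship it yet (`fromAsync` here is my local name, and the real method additionally handles sync iterables, array-likes, and a mapping callback):

```js
// Fall back to a hand-written version when Array.fromAsync is missing:
// collect an async iterable into a plain array.
const fromAsync = Array.fromAsync ?? (async (iterable) => {
  const out = [];
  for await (const item of iterable) out.push(item);
  return out;
});

async function* numbers() { yield 1; yield 2; yield 3; }

(async () => {
  const arr = await fromAsync(numbers());
  console.log(arr); // [1, 2, 3]
})();
```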
-
Are you using generators?
```js
// AudioWorkletStream
// Stream audio from Worker to AudioWorklet
// guest271314 2-24-2020
let port;
onmessage = async e => {
  'use strict';
  if (!port) {
    [port] = e.ports;
    port.onmessage = event => postMessage(event.data);
  }
  const { urls } = e.data;
  // https://github.com/whatwg/streams/blob/master/transferable-streams-explainer.md
  const { readable, writable } = new TransformStream();
  (async _ => {
    for await (const _ of (async function* stream() {
      while (urls.length) {
        yield (await fetch(urls.shift(), { cache: 'no-store' })).body.pipeTo(writable, {
          preventClose: !!urls.length,
        });
      }
    })());
  })();
  port.postMessage({ readable }, [readable]);
};
```
encoding
-
Transcoding Latin 1 strings to UTF-8 strings at 12 GB/s using AVX-512
Be aware that the WHATWG Encoding specification [1] says that latin1, ISO-8859-1, etc. are labels for the windows-1252 encoding, not the proper Latin-1 encoding. As a result, browsers and operating systems will display those files differently! It also maps the ASCII label to windows-1252.
[1] https://encoding.spec.whatwg.org/#names-and-labels
-
[AskJS] Why are TextEncoder and TextDecoder classes?
TextEncoder used to support UTF-16 (both LE and BE) before that support got removed, because no one uses UTF-16 on the encoding side: https://github.com/whatwg/encoding/issues/18
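Concretely, today TextEncoder always produces UTF-8, while TextDecoder still accepts UTF-16 labels for decoding:

```js
// TextEncoder is UTF-8 only (the encoding argument was dropped);
// decoding UTF-16 is still possible via TextDecoder labels.
console.log(new TextEncoder().encoding); // "utf-8"

const utf16le = new Uint8Array([0x68, 0x00, 0x69, 0x00]); // "hi" in UTF-16LE
console.log(new TextDecoder('utf-16le').decode(utf16le)); // "hi"
```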
What are some alternatives?
AudioWorkletStream - fetch() => ReadableStream => AudioWorklet
text-encoding - Polyfill for the Encoding Living Standard's API
console - Console Standard
fetch - Fetch Standard
proposal-array-from-async - Draft specification for a proposed Array.fromAsync method in JavaScript.
WHATWG HTML Standard - HTML Standard
url - URL Standard
WebCodecsOpusRecorder - WebCodecs Opus Recorder/Media Source Extensions Opus EncodedAudioChunk Player
proposal-async-iterator-helpers - Methods for working with async iterators in ECMAScript
falcon - Brushing and linking for big data
dom - DOM Standard