Working with Node.js streams
1 project | dev.to | 24 Sep 2021
Knowledge of Node.js streams is essential because they are a great tool to rely on when handling large sets of data. Check out the Node.js API docs for more information about streams.
How I Built An Incomplete CMS
4 projects | dev.to | 23 Jun 2021
AWS S3 buckets can hold a wide variety of objects in storage and can be accessed through Node APIs. This seemed like the best option: I would be able to scale the number of posts I make indefinitely without affecting my file size, and I could keep the already existing and tested markdownToHtml conversion. It is possible to store markdown files in AWS S3 buckets and then stream their contents into my existing functions. Before going any further, here are the dependencies from my package.json.
NodeJS pipeline to c#?
1 project | reddit.com/r/node | 25 May 2021
Get streaming data and send it to different clients
2 projects | reddit.com/r/node | 10 May 2021
A Transform stream is a Duplex stream where the output is computed in some way from the input. Examples include zlib streams or crypto streams that compress, encrypt, or decrypt data.
The Stream API is only relevant if OP has an actual stream-based connection to each client (in which case the .pipe() method is the way to go for one-to-many stream broadcasting). But they are probably better off using a library like Socket.IO or ws, which uses WebSockets under the hood and thus exposes a higher-level abstraction.
Encoding in Stream Transform
1 project | reddit.com/r/node | 29 Apr 2021
Fun problem, took me a while to figure out. It's buried in the Node.js documentation: https://nodejs.org/api/stream.html#stream_duplex_and_transform_streams
Newbie graphql file upload question! (Need help)
1 project | reddit.com/r/graphql | 20 Mar 2021
How I made my own file compressor using Node.js
1 project | dev.to | 16 Mar 2021
The definition above is from Node.js Documentation.
Protip to anybody using Gulp: switch from .pipe() to .pipeline()
1 project | reddit.com/r/webdev | 3 Mar 2021
It's not. It's part of the stream module, in the standard library since Node 10. Here's the documentation.
Unzip large files in AWS using Lambda and Node.js
1 project | dev.to | 1 Mar 2021
The unzipper package, on the other hand, works on top of Node.js streams. In short, streams allow us to process (read/write) data in chunks, keeping the memory footprint, and often the execution time, low.
We haven't tracked posts mentioning Highland yet.
Tracking mentions began in Dec 2020.
What are some alternatives?
scramjet - Simple yet powerful live data computation framework
through2 - Tiny wrapper around Node streams2 Transform to avoid explicit subclassing noise
through2-concurrent - Simple Node.JS stream (streams2) Transform that runs the transform functions concurrently (with a set max concurrency)
pumpify - Combine an array of streams into a single duplex stream using pump and duplexify
ffmpeg.wasm - FFmpeg for browser and node, powered by WebAssembly
get-stream - Get a stream as a string, buffer, or array
Most.js - Ultra-high performance reactive programming
into-stream - Convert a string/promise/array/iterable/asynciterable/buffer/typedarray/arraybuffer/object into a stream
concat-stream - writable stream that concatenates strings or data and calls a callback with the result