WebCodecsOpusRecorder vs WebRTC

| | WebCodecsOpusRecorder | WebRTC |
|---|---|---|
| Mentions | 19 | 6 |
| Stars | 10 | 1,252 |
| Activity | 2.8 | 8.5 |
| Latest commit | about 1 month ago | 7 days ago |
| Language | JavaScript | JavaScript |
| License | Do What The F*ck You Want To Public License | MIT License |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
WebCodecsOpusRecorder
-
[AskJS] Do you think we need an Automatic Code Documentation Generator, especially after Github Co-pilot?
Take for example https://github.com/guest271314/WebCodecsOpusRecorder. There was no roadmap anywhere in the wild for how to write Opus-encoded packets produced by the WebCodecs AudioEncoder to a single file - without a media container - including the capability to include media metadata such as artist, album, and artwork in the file for use with the Media Session API, and play back the file in the browser. So how would the documentation be automatically generated?
-
Sleekiest JavaScript Trick you know?
We can write a Uint32Array, JSON, and ArrayBuffers adjacent to one another in a Blob. That means we can write our own algorithm for storing arbitrary data and reading the data back. E.g., we can write the length of JSON containing configuration metadata - for example image artwork and the offsets of the ArrayBuffers - before the JSON, then after the JSON write a series of ArrayBuffers next to each other. To read the data back, we read the length of the JSON stored in the first 4 bytes as a Uint32Array, read the variable-length JSON following the Uint32Array, read the offsets array in the JSON configuration, then read each ArrayBuffer stored at those offsets in the file. This is kind of how the Native Messaging protocol works, extended with the capability to write arbitrary data to a file with the decoding instruction set encoded within the file itself. So we can, for example, write Opus-encoded audio from WebCodecs, image artwork, and artist, title, and album data to a file, then read the file, display the images and artist/album data written therein using the Media Session API, and stream the audio using Media Source Extensions, or decode the audio from Opus compression to a WAV file in the browser. E.g., https://github.com/guest271314/WebCodecsOpusRecorder. Bonus: the resulting file size, excluding the images serialized in the file, audio for audio, is less than Opus encoded in a WebM file, the default container for MediaRecorder output on Chrome on Linux.
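The length-prefixed layout described above can be sketched as follows. This is a minimal, hypothetical illustration of the scheme - the function names are illustrative, not the actual WebCodecsOpusRecorder API:

```javascript
// Layout: [Uint32 JSON byte length][JSON metadata][chunk0][chunk1]...
// The offsets of each chunk are stored inside the JSON itself.

function serialize(metadata, chunks) {
  const offsets = [];
  let pos = 0;
  for (const c of chunks) {
    offsets.push([pos, c.byteLength]); // [offset, length] per chunk
    pos += c.byteLength;
  }
  const json = new TextEncoder().encode(
    JSON.stringify({ ...metadata, offsets })
  );
  const out = new Uint8Array(4 + json.byteLength + pos);
  // First 4 bytes: length of the JSON, little-endian
  new DataView(out.buffer).setUint32(0, json.byteLength, true);
  out.set(json, 4);
  let at = 4 + json.byteLength;
  for (const c of chunks) {
    out.set(c, at);
    at += c.byteLength;
  }
  return out;
}

function deserialize(bytes) {
  const jsonLength = new DataView(bytes.buffer, bytes.byteOffset)
    .getUint32(0, true);
  const meta = JSON.parse(
    new TextDecoder().decode(bytes.subarray(4, 4 + jsonLength))
  );
  const base = 4 + jsonLength;
  const chunks = meta.offsets.map(
    ([off, len]) => bytes.subarray(base + off, base + off + len)
  );
  return { meta, chunks };
}
```

Because the offsets live inside the JSON, a reader only needs the fixed 4-byte prefix to bootstrap: read the Uint32, slice out the JSON, then slice each chunk.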
-
MP4 File and the Range Request Header
Not at all. Here I encode Opus audio output by the WebCodecs AudioEncoder, write the encoded chunks to a single file preceded by JSON configuration and indexes of the discrete encoded chunks, and optionally include media metadata such as artist, album, and artwork. We can fetch the first 4 bytes to read the Uint32Array at the beginning of the file to get the offset information, then make separate range requests for the given timeslice(s) of media and play back that media https://github.com/guest271314/WebCodecsOpusRecorder.
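A minimal sketch of the range-request arithmetic implied above, assuming the file layout is a 4-byte Uint32 JSON-length prefix followed by the JSON and then the chunks (the helper name is mine, not the repository's API):

```javascript
// The first request fetches only the 4-byte Uint32 prefix:
//   fetch(url, { headers: { Range: "bytes=0-3" } })
// A second request fetches the JSON configuration:
//   Range: `bytes=4-${4 + jsonLength - 1}`
// Then, for a chunk whose [offset, length] entry was read from the JSON,
// compute the Range header for fetching just that chunk:
function chunkRangeHeader(jsonLength, offset, length) {
  const start = 4 + jsonLength + offset; // skip prefix + JSON
  return `bytes=${start}-${start + length - 1}`; // Range is end-inclusive
}
```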
-
JSON with multiline strings
As long as the encoder and decoder are on the same page, and you keep track of offsets, you can do whatever you want, particularly using a Blob. Here https://github.com/guest271314/WebCodecsOpusRecorder/blob/main/WebCodecsOpusRecorder.js I write a Uint32Array, JSON, and ArrayBuffers containing WebCodecs Opus-encoded audio, and optionally images and metadata for the Media Session API, to the same file, and play the file back in the browser, in pertinent part
-
Have some basic python, time to turn up the heat and learn web app development on JavaScript
Another fun project was encoding Opus packets output by the WebCodecs AudioEncoder to a single file, and playing the file back in the browser https://github.com/guest271314/WebCodecsOpusRecorder. There was no roadmap to do that.
-
[AskJS] Why are TextEncoder and TextDecoder classes?
I never had an issue encoding and decoding Opus packets using the above approaches https://github.com/guest271314/WebCodecsOpusRecorder.
-
Yo - instead of making fun of people's ideas - HELP THEM OUT and give them feedback!
I carried on and developed a way to do just that: save all packets to a single file and play back that file several ways. The resulting file winds up being more compact than Opus encoded in a WebM container. I then added a way to include images in the file to support Media Session metadata https://github.com/guest271314/WebCodecsOpusRecorder.
-
How do I append to an array inside a json file in node?
Recording raw Opus packets produced by the WebCodecs AudioEncoder to a single file - without a media container such as Matroska, WebM, MP3, AAC, etc. - then playing back the file. You can test it for yourself on Chrome or Chromium here https://guest271314.github.io/WebCodecsOpusRecorder/webcodecs-opus-recorder-mse-wav-player.html. Record your microphone or other device remapped as a microphone, save the file, then upload the file and play it back. I included the ability to also store an image in the file for Media Session metadata support, so we get to see the same or a similar image you see in global media controls when playing, for example, a YouTube video.
-
At what point in your programming journey do you step back and learn Data Structures and Algorithms?
There was no roadmap for how to write Opus packets produced by Chrome's WebCodecs AudioEncoder to a single file - without writing the Opus packets to a media container such as Matroska or WebM. I just knew it could be done, and used my experience testing Native Messaging to apply the concept of preceding the data with a Uint32Array containing its length - in this case, the length of the JSON configuration, which includes the offsets of each packet in a JSON array - then writing the algorithm to extract that data for playback https://github.com/guest271314/WebCodecsOpusRecorder.
-
Trying to record off a canvas, but bitrate is very low; high values are ignored.
This is how I write Opus packets to a file without a container and play them back using Media Source Extensions or as a WAV file https://github.com/guest271314/WebCodecsOpusRecorder
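For the WAV playback path mentioned in several of the comments above, a standard 44-byte RIFF/PCM header has to be prepended to the decoded samples. A generic sketch of such a header, assuming 16-bit PCM (this is not the repository's actual code):

```javascript
// Build a minimal 44-byte WAV header for 16-bit PCM audio.
function wavHeader(sampleRate, numChannels, dataByteLength) {
  const buf = new ArrayBuffer(44);
  const view = new DataView(buf);
  const writeStr = (off, s) =>
    [...s].forEach((ch, i) => view.setUint8(off + i, ch.charCodeAt(0)));
  writeStr(0, "RIFF");
  view.setUint32(4, 36 + dataByteLength, true); // RIFF chunk size
  writeStr(8, "WAVE");
  writeStr(12, "fmt ");
  view.setUint32(16, 16, true);                 // fmt chunk size (PCM)
  view.setUint16(20, 1, true);                  // audio format: PCM
  view.setUint16(22, numChannels, true);
  view.setUint32(24, sampleRate, true);
  view.setUint32(28, sampleRate * numChannels * 2, true); // byte rate (16-bit)
  view.setUint16(32, numChannels * 2, true);    // block align
  view.setUint16(34, 16, true);                 // bits per sample
  writeStr(36, "data");
  view.setUint32(40, dataByteLength, true);     // data chunk size
  return new Uint8Array(buf);
}
```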
WebRTC
-
A popular Bluetooth car battery monitor that siphons up all your location data
It's worth a try, though, to push back. We are not talking about holding off a bunch of murderous Russian troops. It's not a good look trying to recall "... and they came for me" either.
Just push back. Here's an example: Reolink are a Chinese company who make cameras - nothing intrinsically wrong with that, but you should expect them to be required to comply with any requirements the CCP might ... require. Reolink are also quite a savvy bunch and have gradually ensured that their products don't actually require an internet connection at all. They do offer an app, and the requirements of using the app are that the cams need to see the interwebs and be gatewayed by systems that are eventually subject to the CCP.
Now this isn't quite yet perfect. Reolink cams have offered ONVIF for at least five years, so Zoneminder, Frigate, and all the rest can be your NVR. The camera's VLAN can be firewalled off from the internet. Mine is called THINGS and it is next door to SEWER for stuff I really worry about!
Their doorbell offering is pretty decent but two way comms needs some handling. At the moment their app is the best bet for functionality but there are signs that Home Assistant with webrtc - https://github.com/AlexxIT/WebRTC should be OK.
It is not impossible to live without prop software. At least care and try.
-
New addition to the hallway (echo show 15)
For Home Assistant, I found WebRTC to work pretty well, particularly on substreams, but honestly I just ended up using iframes of Blue Iris instead as I found that to be more reliable and more versatile (I can easily access stored videos from the tablet, etc)
-
Options for casting a live feed to chromecast/nest hubs with HA?
-
What's in your WFH video call set-up?
I stream live cams using this https://github.com/AlexxIT/WebRTC
-
ESP32_CAM RTSP Home Assistant
Not sure if it's exactly the same issue, but I was having big problems with RTSP framerate and latency (about 2 seconds) - and solved it by using this WebRTC addon, which works super well: https://github.com/AlexxIT/WebRTC
-
help understanding timing delays: hikvision, zoneminder, home assistant
What are some alternatives?
webm-writer-js - JavaScript-based WebM video encoder for Google Chrome
homebridge-camera-ui - Homebridge plugin for RTSP Cameras with HSV, motion detection support, Image Rekognition, Web UI to manage/watch streams and WebApp support
worker-dom - The same DOM API and Frameworks you know, but in a Web Worker.
Ant-Media-Server - Ant Media Server is a live streaming engine software that provides adaptive, ultra low latency streaming by using WebRTC technology with ~0.5 seconds latency. Ant Media Server is auto-scalable and it can run on-premise or on-cloud.
AudioWorkletStream - fetch() => ReadableStream => AudioWorklet
Shinobi - :peace_symbol: :palestinian_territories: Shinobi CE - The Free Open Source CCTV platform written in Node.JS (Camera Recorder - Security Surveillance Software - Restreamer)
text-encoding - Polyfill for the Encoding Living Standard's API
lovelace-fan-xiaomi - Xiaomi Smartmi Fan Lovelace card with CSS fan animation
encoding - Encoding Standard
purifier-card - Air Purifier card for Home Assistant Lovelace UI
noctis - 🐵 Dark Blue Theme for Home Assistant
vacuum-card - Vacuum cleaner card for Home Assistant Lovelace UI