nuxt-medusa vs aifiles

| | nuxt-medusa | aifiles |
|---|---|---|
| Mentions | 6 | 10 |
| Stars | 120 | 129 |
| Growth | - | - |
| Activity | 5.5 | 5.0 |
| Last commit | about 1 month ago | about 1 month ago |
| Language | TypeScript | TypeScript |
| License | MIT License | GNU General Public License v3.0 or later |
Stars - the number of stars that a project has on GitHub. Growth - month over month growth in stars.
Activity is a relative number indicating how actively a project is being developed. Recent commits have higher weight than older ones.
For example, an activity of 9.0 indicates that a project is amongst the top 10% of the most actively developed projects that we are tracking.
nuxt-medusa
- Nuxt Modules Crash Course
- Nuxt + Supabase = Technology Stack of Dreams 🚀
  Interesting fact: in this video, one of the modules I created for Nuxt is featured: https://nuxt-medusa.vercel.app/
- Open source repo for building Ecommerce with Nuxt and Medusa
- Nuxt-Medusa Module: Integrate Medusa with your Nuxt.js application
  Access the module here, including a tutorial, documentation, video, and release notes. You can also try it directly in your browser using the module sandbox on Stackblitz.
- JSTools Weekly — ✨2023#8: TS-Reset: A ‘CSS reset’ For TS, Improving JS Types
  nuxt-medusa: 🛍️ Medusa module for Nuxt
- Nuxt, Medusa, TailwindCSS Crash Course
  This article will also showcase a Nuxt Module that I have recently created: https://github.com/baroshem/nuxt-medusa. Make sure to star it on GitHub 🚀, as it motivates me to make the module even better!
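The module mentioned above is registered like any other Nuxt 3 module, via the `modules` array in `nuxt.config.ts`. A minimal sketch of that setup follows; the environment-variable name and the exact config keys are assumptions for illustration, so check the nuxt-medusa docs for the real options:

```typescript
// nuxt.config.ts — minimal sketch of wiring up the nuxt-medusa module.
export default defineNuxtConfig({
  // Registering the module makes its composables available app-wide.
  modules: ['nuxt-medusa'],
  runtimeConfig: {
    public: {
      medusa: {
        // Assumed variable name for this sketch: point the client at your
        // Medusa server (Medusa's default local port is 9000).
        baseUrl: process.env.MEDUSA_URL || 'http://localhost:9000',
      },
    },
  },
})
```

Components can then query the store through the composable the module exposes (e.g. `useMedusaClient()`, per the module README) instead of hand-rolling fetch calls to the Medusa API.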
aifiles
- Cleaning up my 200GB iCloud with some JavaScript
  So yes, those 10TB archives may end up being 5TB if someone spent the time to really comb through, understand, make good decisions about, and organize that data. But I have not yet seen anything that can scratch the surface of that, other than perhaps https://github.com/jjuliano/aifiles - but I won't use it until it's local-only and has guarantees not to destroy data without explicit permission. An overlay filesystem that shows compression/deduplication with LLM capability, like aifiles, is probably the best option here.
  However, I wouldn't imagine that most people's life data is more than 2TB even without all of this - the limit is mostly an artificial constraint imposed by these companies.
- Dropbox axes 16%
  I imagine things like this are underway: https://github.com/jjuliano/aifiles
  Honestly, I think it's actually a pretty fantastic development if that's the direction. I don't use Dropbox much, nor have I used aifiles (since it sends all of your data to OpenAI) - but the idea of not having to manually and tediously look over terabytes of files to completely reorganize them into a better directory hierarchy, tag each file with meaningful labels, etc., sounds phenomenal.
  Obviously there are some implementation details needed for this not to be awful, for example: 1) only local models for local data; 2) making the changes on e.g. ZFS (to allow rollback), or as some type of optional 'overlay' view so you can switch back and forth between the original and the AI-organized hierarchy; and 3) thresholds and logic for what may be considered 'duplicates' to be removed, and for how to better compress data.
  As for the de-duplication and processing: this could be very good for Dropbox in that, e.g., if a person wants to completely re-encode all of their image or video files with AV1, the resulting data could be cut in half or more - which saves Dropbox storage space. After that, neural perceptual hashing could be run on all of the files, and a similarity threshold could drive de-duplication on a perceptual basis (for example, keep the larger file that is 99% similar to a 2x downsized version, and re-encode it). A user preference to keep things like TIFF files completely intact, or in any other lossless encoding of their choosing, would be a good option as well.
  There's definitely a strange disparity between the computation cost of deploying a decent model to do this and the storage cost - but if a small (perhaps even non-LLM!) model that can plow through data at fast rates were created and deployed, it may make sense.
  Or perhaps the kind of semantic compression that LLMs do is of interest for building a new type of lossy compression algorithm that Dropbox might want.
- Digital clutter: Learning to let go and stop hoarding terabytes
- Big media is gearing up for battle with Google and Microsoft over AI chatbots using their articles for training: 'We are actively considering our options'
- JSTools Weekly — ✨2023#8: TS-Reset: A ‘CSS reset’ For TS, Improving JS Types
  aifiles: a CLI that organizes and manages your files using AI
- AI Files: organize and manage your files using AI
- Show HN: AI Files – manage and organize your files with AI
  Output from the language model is also being injected into a script that is then executed: https://github.com/jjuliano/aifiles/blob/ef529fd6281eaf8d373...
  He argued below that he is not vulnerable to indirect prompt injection attacks (https://github.com/greshake/llm-security), but I think he is wrong.
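The risk flagged above is generic, not specific to the linked file: interpolating raw LLM output into a shell command lets a malicious suggestion execute as code. A hedged sketch of the pattern and one mitigation, with hypothetical names for illustration:

```typescript
// If an LLM suggests a destination directory for a file, never splice its raw
// output into a shell string. A suggestion like "photos'; rm -rf ~; echo '"
// would run as shell code inside something like:
//   execSync(`mv '${file}' '${llmSuggestedDir}/${file}'`);   // UNSAFE
//
// Safer: validate the model's output, then pass it as a separate argv element
// (e.g. execFileSync("mv", [file, dest])) so no shell ever parses it.
function safeDestination(llmSuggestedDir: string): string {
  // Allow only plain relative directory names (an assumption for this sketch),
  // and reject parent-directory traversal outright.
  const ok = /^[\w][\w\- /.]*$/.test(llmSuggestedDir);
  if (!ok || llmSuggestedDir.includes("..")) {
    throw new Error(`refusing suspicious path from model: ${llmSuggestedDir}`);
  }
  return llmSuggestedDir;
}
```

Even then, model output read from untrusted files remains untrusted input, which is the indirect prompt-injection point the commenter is making; an allow-list is defense in depth, not a guarantee.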
What are some alternatives?
ts-async-kit - the easiest API to deal with promises in TypeScript. Currently: ↩️ retrying, 🏃‍♂️ looping & 😴 sleeping
chatgpt-api - Node.js client for the official ChatGPT API. 🔥
Tailwind CSS - A utility-first CSS framework for rapid UI development.
nuxt-scheduler - Create scheduled jobs with human readable time settings
suspense - Utilities for working with React Suspense
algolia - 🔎 Algolia module for Nuxt
ts-reset - A 'CSS reset' for TypeScript, improving types for common JavaScript API's
sonner - An opinionated toast component for React.
llm-client - LLMClient - JS/TS Use prompt signatures, Agents, Reasoning, Function calling, RAG and more. Based on the Stanford DSP Paper
Superforms - Superforms is a SvelteKit library that helps you with server-side validation and client-side display of forms.
concurrent.js - Non-blocking Concurrent Computation for JavaScript RTEs (Web Browsers, Node.js & Deno & Bun)