transformers.js-examples alternatives
Similar projects and alternatives to transformers.js-examples
- gallery: A gallery that showcases on-device ML/GenAI use cases and allows people to try and use models locally.
- WebGPT: Run a GPT model in the browser with WebGPU. An implementation of GPT inference in fewer than ~1,500 lines of vanilla JavaScript.
- entity-db: EntityDB is an in-browser vector database that wraps IndexedDB and Transformers.js over WebAssembly.
- amazon-ivs-webgpu-captions-demo: An experimental demo application that shows how to add client-side auto-generated captions to Amazon IVS Real-Time and Low-Latency streams using transformers.js and WebGPU.
- dalle-playground: A playground to generate images from any text prompt using Stable Diffusion (previously: DALL-E Mini).
transformers.js-examples reviews and mentions
- Ch.at – a lightweight LLM chat service accessible through HTTP, SSH, DNS and API
Something like this? https://github.com/huggingface/transformers.js-examples/tree...
- Google AI Edge – on-device cross-platform AI deployment
Really happy to see additional solutions for on-device ML.
That said, I probably wouldn't use this unless mine was one of the specific use cases supported[0]. I have no idea how hard it would be to add a new model supporting arbitrary inputs and outputs.
For running inference cross-device I have used ONNX, which is low-level enough to support whatever weights I need. For a good number of tasks you can also use transformers.js, which wraps ONNX and handles things like decoding (unless you really enjoy implementing beam search on your own). I believe an equivalent link to the above would be [1], which is just much more comprehensive.
[0] https://ai.google.dev/edge/mediapipe/solutions/guide
[1] https://github.com/huggingface/transformers.js-examples
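To make the point above concrete, here is a minimal sketch of how transformers.js hides tokenization, ONNX execution, and decoding behind a single pipeline call. The package name ("@huggingface/transformers", the Transformers.js v3 package) and the "Xenova/distilbart-cnn-6-6" summarization checkpoint are illustrative choices, not taken from the comment.

```js
// Minimal sketch: transformers.js handles tokenization, ONNX inference, and decoding
// internally, so no hand-rolled beam search or detokenization is needed.
import { pipeline } from "@huggingface/transformers";

// Downloads the ONNX weights on first use (cached afterwards) and builds the pipeline.
const summarizer = await pipeline("summarization", "Xenova/distilbart-cnn-6-6");

const text =
  "Transformers.js lets you run pretrained models in the browser or in Node.js " +
  "by wrapping ONNX Runtime behind a high-level pipeline API.";

// The pipeline tokenizes the input, runs the model, and decodes the output ids.
const output = await summarizer(text, { max_new_tokens: 60 });
console.log(output[0].summary_text);
```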
- Running DeepSeek Janus-Pro-1B in the Browser: A Step-by-Step Guide
The guide demonstrates how to load and run DeepSeek Janus-Pro-1B in a Web Worker for non-blocking inference; the full code is available in the GitHub repository.
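The Web Worker pattern the excerpt refers to keeps model loading and inference off the main thread. The sketch below shows that pattern in its simplest form; the package name, the message shape, and the placeholder text-generation checkpoint are assumptions for illustration only (Janus-Pro-1B itself is a multimodal model, and the guide's actual setup lives in the linked repository).

```js
// worker.js – a minimal sketch of non-blocking inference in a Web Worker.
// Assumption: "onnx-community/Qwen2.5-0.5B-Instruct" is only a stand-in checkpoint;
// it is NOT the Janus-Pro-1B setup described in the guide.
import { pipeline } from "@huggingface/transformers";

let generatorPromise = null;

// Load the model once, on first use, and reuse it for every subsequent prompt.
function getGenerator() {
  generatorPromise ??= pipeline(
    "text-generation",
    "onnx-community/Qwen2.5-0.5B-Instruct",
    { device: "webgpu" } // assumption: a WebGPU-capable browser; omit for the default backend
  );
  return generatorPromise;
}

self.addEventListener("message", async (event) => {
  const generator = await getGenerator();
  const output = await generator(event.data.prompt, { max_new_tokens: 128 });
  // Send the result back to the page; the UI thread never blocks on inference.
  self.postMessage(output[0].generated_text);
});
```

On the page, the worker would be created as a module worker and driven via postMessage, for example:

```js
// main.js – hypothetical usage of the worker sketched above.
const worker = new Worker(new URL("./worker.js", import.meta.url), { type: "module" });
worker.onmessage = (event) => console.log("Generated:", event.data);
worker.postMessage({ prompt: "Why run models directly in the browser?" });
```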
- Running DeepSeek-R1 in the Browser: A Comprehensive Guide
As AI technology evolves, running sophisticated machine learning models directly within the browser is becoming increasingly feasible. This guide will walk you through how to load and use the DeepSeek-R1 model in a browser using JavaScript. We'll also cover the implementation details based on the example found here.
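A browser-side R1 setup typically streams tokens as they are generated. The following is a minimal sketch, assuming the "@huggingface/transformers" package, a WebGPU-capable browser, and the "onnx-community/DeepSeek-R1-Distill-Qwen-1.5B-ONNX" distilled checkpoint commonly used for in-browser demos; the guide's full implementation remains the authoritative version.

```js
// A minimal sketch of in-browser text generation with a distilled DeepSeek-R1 checkpoint.
// Assumptions: WebGPU is available and the quantization setting (dtype) suits your device.
import { pipeline, TextStreamer } from "@huggingface/transformers";

const generator = await pipeline(
  "text-generation",
  "onnx-community/DeepSeek-R1-Distill-Qwen-1.5B-ONNX",
  { dtype: "q4f16", device: "webgpu" }
);

// Stream tokens to the console as they are produced instead of waiting for the full reply.
const streamer = new TextStreamer(generator.tokenizer, {
  skip_prompt: true,
  callback_function: (text) => console.log(text),
});

const messages = [
  { role: "user", content: "Explain why WebGPU speeds up in-browser inference." },
];

const output = await generator(messages, { max_new_tokens: 256, streamer });

// With chat-style input, the last message in generated_text is the model's reply.
console.log(output[0].generated_text.at(-1).content);
```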
Stats
huggingface/transformers.js-examples is an open-source project licensed under the Apache License 2.0, an OSI-approved license.
The primary programming language of transformers.js-examples is JavaScript.
Popular Comparisons
- transformers.js-examples VS SemanticFinder
- transformers.js-examples VS WebGPT
- transformers.js-examples VS entity-db
- transformers.js-examples VS gpt-tfjs
- transformers.js-examples VS transformers-js
- transformers.js-examples VS gallery
- transformers.js-examples VS amazon-ivs-webgpu-captions-demo
- transformers.js-examples VS dalle-playground
- transformers.js-examples VS kalidoface-3d
- transformers.js-examples VS executorch